A survey on wireless sensor network technologies in pest management applications


Animal pests notoriously cause billions of dollars of damage by spoiling crops, damaging infrastructure and spreading disease. Pest control companies try to mitigate this damage by implementing pest management approaches to respond to and prevent infestations. However, these approaches are labour intensive, as pest control technicians must regularly visit the affected areas for monitoring and evaluation. Current remote sensing technologies can allow for decision making based on real-time data remotely uploaded from pest traps. This reduces the frequency of field visits and improves the pest management process. In this paper, we survey a variety of modern data-driven pest management approaches. We also evaluate wireless communication infrastructures which can be used to facilitate data transfer between pest traps and cloud servers.


Animal pests are a major problem globally, as they can spread disease and cause damage to crops and infrastructure. Rats, for example, carry a number of diseases, including plague (the Black Death [1]), murine typhus, rickettsialpox, salmonellosis, rat-bite fever, and leptospirosis, all of which can spread to humans [2]. Plant-eating pests such as codling moths burrow into pome fruit, which can result in severe crop losses [3]. In terms of infrastructure damage, termites are capable of destroying buildings by eating into the timber [4]. To mitigate some of these problems, pest control strategies need to be implemented to treat and prevent infestations.

Animal pest control involves the eradication of select invertebrate and vertebrate creatures from homes, office buildings, farms, and other areas frequented by people. Insects, rodents, and birds are the most common animal pests present around the world. Prevention and elimination of smaller pests is usually achieved through the use of pesticides, while larger pests are normally eradicated using pest traps [2].

There are five stages involved in the traditional pest management process [2]:

  • Inspection—a pest control technician carries out an inspection in the area where pest activity is suspected.

  • Identification—the type of pest (or pests) in the area are identified.

  • Recommendation—based on the pest type, the technician will recommend a treatment strategy.

  • Treatment—after the recommended strategy has been agreed on, the treatment stage begins. This usually involves exposing the infested area to pesticides or rodenticides.

  • Evaluation—at this stage the technician periodically visits the area to check the effectiveness of the treatment strategy.

The treatment and evaluation stages are usually ongoing, as long term monitoring is the most effective way of preventing a re-infestation [2]. While the traditional pest management approach mentioned above is effective, it has three distinct weaknesses. Firstly, the pest management strategy is based on old information (the state of the site the last time it was visited). This means that if there is a need to change the pest management strategy between site visits, perhaps as a result of increased pest activity since the last visit, the change can only be effected during the next site visit. Depending on the type of pest and level of infestation, site visits can range from weekly to annually [5]. Secondly, there is an ongoing high labour cost involved with sending technicians out for site visits periodically. Finally, despite the effectiveness of pesticides in eliminating pests, they have a negative impact on crops and other non-target species when used in excess. For example, experimental studies have shown that wild bumble bees exposed to Neonicotinoid insecticides experienced a reduced growth rate [6]. Another study also detailed the toxic effects of Fenitrothion on dunnarts (a native Australian marsupial) [7]. Farmers therefore have to constantly monitor the amount and types of pesticide used on their crops.

One way of informing this monitoring is to estimate the population of pests in a given area through the use of pheromone based traps. These traps contain lures placed on an adhesive card, as shown in Fig. 1. When a pest enters the trap it remains stuck on the card. The adhesive cards are then periodically removed and inspected by entomologists, who count the number and types of pests found on them. These counts give an estimate of the effectiveness of the pest treatment strategy, and aid farmers in determining the right amount and types of pesticide to use. However, a major drawback of this method is the amount of labour involved in manually removing and inspecting the cards from the various pheromone traps.

Fig. 1

A basic pheromone trap (card exposed for illustration) [8]

For management of larger pests, such as rodents, bait stations or instant kill traps are placed around areas of suspected pest activity. Instant kill traps are designed to kill the rodent when it comes into contact with the trap (through electric shock, suffocation, or trauma). This is in contrast to bait stations, where poison infused into wax blocks or pellets is used. The rodents consume the poisoned bait, and usually die a few days after. Figure 2 shows a typical bait station with wax blocks used as the bait.

Fig. 2

A rodent bait station with wax blocks [9]

One problem involved with using bait is poisoning of non-target species, either through direct consumption or through secondary poisoning (where the non-target species consumes the body of a poisoned pest) [2]. Bait blocks are also harmful to humans if ingested, with children being the most vulnerable. This is especially true when the traps are easily accessible to children (for example when the bait station is placed at ground level). In 2012, the Poison Control Centre in Australia received 15,000 calls concerning children under six years being exposed to rodent bait [10].

New technological innovations have resulted in novel ways of approaching the pest management paradigm, particularly in terms of pest identification. For example, advances in image processing and computational power now make it possible to automate the classification and counting of pests in pheromone-based traps [11]. Some commercial pest management systems allow farmers to remotely monitor pest activity in their fields [12, 13]. Also, the use of passive infra-red (PIR) sensing in pest traps makes it possible to identify target pests based on the amount of heat they emit. By restricting bait access to only the target pests and using bait types that have a low risk of secondary poisoning, the number of accidental deaths due to non-target species ingesting the bait is reduced [14].

The Internet of Things (IoT) has been instrumental in enabling pest traps which are capable of wireless communication with cloud servers. This allows for information specific to each trap (such as pest activity, bait levels, and ambient environmental conditions) to be sent to a cloud server for real-time analysis. The analysed information can then be used to make data driven decisions, resulting in a more optimized pest management strategy.
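As a rough sketch, the per-trap information described above might be serialized into a small payload before being uploaded to the cloud server. The schema and field names below are purely illustrative assumptions, not those of any particular commercial system:

```python
import json
import time

def build_trap_report(trap_id, pest_count, bait_level_pct, temp_c, humidity_pct):
    """Assemble one trap's telemetry (pest activity, bait level, ambient
    conditions) as a JSON payload. All field names are hypothetical."""
    return json.dumps({
        "trap_id": trap_id,
        "timestamp": int(time.time()),
        "pest_count": pest_count,
        "bait_level_pct": bait_level_pct,
        "ambient": {"temperature_c": temp_c, "humidity_pct": humidity_pct},
    })

report = build_trap_report("trap-07", pest_count=3, bait_level_pct=62,
                           temp_c=21.5, humidity_pct=48)
```

A payload this small suits the low-bandwidth IoT links discussed later in the paper.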

By employing technological solutions to animal pest management, we can increase the quality and timeliness of data, hence facilitating better decision making. Additionally, we can automate many of the labour-intensive pest management jobs, thereby reducing cost and increasing efficacy.

The purpose of this paper is to give an overview of modern data-driven approaches to pest management. The paper is structured as follows: Sect. 2 discusses a number of commercially available systems capable of detecting the presence of pests in a pest trap, while Sect. 3 details current systems capable of both detection and classification of pests, and discusses the image processing methods used for pest classification. A discussion of IoT communication infrastructures available for pest control systems (such as LoRa, WiFi, and NB-IoT) is given in Sect. 4. Section 5 gives a critical analysis of the systems discussed in this paper, with an overall conclusion given in Sect. 6.

Pest detection systems

This section provides an overview of the various pest detection systems available, as well as their principles of operation. By adding a pest detection system to traditional pest traps, the pest management process can be significantly improved, as real-time data facilitates a better response to infestations and fewer site visits. Significant cost savings can therefore be realized by this addition. Pest detection systems also allow for more optimized baiting strategies, since the amount of bait can be adjusted according to pest activity in near-real time.

One of the most successful smart pest detection systems is Trapview, a pest management system which came on to the market in 2013. The standard Trapview system uses four cameras to capture images of the inside of a pheromone based trap. These images are then uploaded to a cloud server for processing. Each trap has an internet enabled SIM card which enables wireless communication with the cloud server. A GPS sensor in the trap provides information on the location of the trap for mapping purposes. The pests in the image are then classified and counted through the use of image processing algorithms, and this information is presented to the user through a web based application, or alternatively, a mobile application. Some of the pests which can be classified by this system include codling moths, plum fruit moths, tomato leafminers, cotton bollworm, grape berry moths, as well as diamondback moths. Users also have the option of purchasing temperature and humidity sensors for the traps [12].

Other Trapview systems available include Trapview AURA, which uses polarized UV light instead of a pheromone lure; and Trapview FLY, which uses higher resolution cameras for monitoring smaller pests such as fruit flies. All Trapview systems come with a solar panel and Lithium-Ion battery for power management [12].

Another smart pest trap targeting insects is the Spensa Z-Trap. Inside this trap is a pheromone lure as well as a series of electrified rods. When an insect makes contact with the rods, it is electrocuted and falls into a collection bucket. Bioimpedance sensors measure the electrical impedance of the insect, and this measurement is used as a means of classifying the insect. Data from the traps is sent to a cloud based server, and is presented to the user via a web application [13].

The same company also offers an image based classifier trap, the Spensa Sentinel, where insects in pheromone based adhesive traps are identified and counted through deep learning algorithms in the cloud [13].

One particular pest management system has proved to be successful in determining bedbug infestations when used in hotel rooms. The GUPSY (Global Urban Positioning and Sensor sYstem) bedbug monitoring system attracts bedbugs by emitting carbon dioxide from its trap [15]. Images of the base of the trap are captured periodically, and computer vision algorithms are used to determine the number of bedbugs in the trap, as well as their type (juvenile or adult). The bedbug count is then sent from the traps in the rooms to a gateway device, using LoRaWAN (a low power wireless network commonly used with Internet of Things systems) [16]. The gateway device then sends this information to a cloud based server. Users can then be sent alerts whenever an infestation is detected.

The conventional method of checking for bedbugs in commercial settings is to use specially trained dogs to sniff around rooms for a bedbug scent. If bedbugs are found, heat treatment is used to eradicate them, as the use of pesticides for bedbug eradication is illegal in many jurisdictions. The presence of bedbugs needs to be checked for frequently, especially in places like hotel rooms, where they can be introduced through luggage. The GUPSY pest management system placed third in the 2016 LoRa Alliance Global IoT challenge [17].

Smart pest management systems are not only limited to insects. Rentokil, a multinational company in the pest control industry has a number of pest management systems targeted at rats and mice. One such system, RADAR Connect (Rodent Activated Detection and Riddance), is an electronic mouse trap which uses carbon dioxide to humanely kill mice. The trap has a tube shape with two entrances (one on either side). When a mouse enters the trap, it triggers break beam sensors in the trap. Actuators then close the entrances of the trap. Carbon dioxide is subsequently released into the trap, thereby killing the mouse within a minute. A signal is then sent from the trap to a Rentokil server, indicating activity in the trap, as well as the location of the trap. A technician can then be dispatched to remove the dead mouse and reset the trap [18].

Autogate is another Rentokil pest management system used for rats. The Autogate system is added to an existing rat bait station, and acts to only allow access to the bait if a rat has been detected in the bait station. For a rat to be detected, the break beam sensors in the bait station must be triggered at least three times over a short time interval. When this happens, an electronic gate opens in the bait station to allow the rat access to the bait. An SMS is also sent to the customer informing them of the rodent activity [14].
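The Autogate-style detection rule (open the gate only after repeated beam breaks within a short interval) can be sketched as follows. Rentokil does not publish the exact trigger count and time window, so the values below are assumptions:

```python
def rat_detected(trigger_times, window_s=30.0, min_triggers=3):
    """Return True if at least `min_triggers` beam-break events fall
    within some sliding window of `window_s` seconds.

    `window_s` and `min_triggers` are illustrative values, not the
    ones used by the commercial product.
    """
    times = sorted(trigger_times)
    for i in range(len(times) - min_triggers + 1):
        # If the i-th and (i+min_triggers-1)-th events are close enough,
        # all events between them also fall inside the window.
        if times[i + min_triggers - 1] - times[i] <= window_s:
            return True
    return False
```

Requiring several triggers in quick succession filters out one-off sensor noise, so the gate only opens for sustained activity.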

Another electronic trap targeted at rodents is the Goodnature E2 Automated Rodent trap, from the Australian company Protect-us. This cylindrical-shaped trap contains a miniature gas canister connected to a movable piston. A lure which emits olfactory notes attractive to the rodents is placed in front of the piston. When a rodent enters the trap, it triggers a sensor, which then results in the piston striking a fatal blow to the head of the rodent. The canister holds enough gas to fire the piston 24 times before needing replacement [19].

Another company, Anticimex, offers an integrated solution to smart pest control. In terms of their rodent control, they offer a number of smart traps. The Smart Pipe for example, is a trap placed in underground sewers and is designed so that rats and mice can pass through the trap as they move in the sewer. A proximity sensor detects the rodent moving through the trap, which subsequently triggers a metal bar to drop onto the rodent thereby killing it. The rodent is then washed along with the sewer water [20].

Anticimex also sells the Smart Box, which can be placed along building walls. This trap uses electricity to kill rodents as they enter it. The bodies are then moved into a collection bin (part of the trap) by a motor mechanism. The bin needs to be manually emptied once dead bodies accumulate. Smart Catch is another type of rodent trap offered by Anticimex. It kills a rodent by striking a blow to the neck when the rodent is detected in the trap. This trap can also be placed along building walls. A non-lethal unit, Smart Eye, is used to gather data about rodent activity. It is designed to be placed in small, narrow areas where rodent activity is suspected but conventional traps cannot be used. It has no killing mechanism; instead, it reports rodent activity based on PIR proximity sensors. All information from the above-mentioned traps is sent to Smart Connect, a hub which relays the information to a central server over a cellular network. Data from these units is then used for effective pest management [20].

BoxSense, from IoT Box Systems, is an add-on to traditional rodent bait stations. The system uses both motion and touch sensors to detect rodent activity in the bait station. The information from these sensors is sent to a central server, where it is analysed and used as a guide for baiting strategies, based on occupancy. This product has a battery life of one year [21]. The company also offers a solution which does not require batteries, the BoxSense eMitter Snap Trap. This spring-loaded trap uses EnOcean wireless sensors [22] to detect when a rodent has been trapped and to upload the information to a cloud server. The EnOcean sensors harvest their energy from the kinetic energy released when the spring-loaded trap is triggered. More recently, [23] developed a low power system capable of remotely monitoring rodenticide levels in bait stations. The system uses an infra-red proximity sensor placed behind the bait rod to measure the distance between the sensor and the bait blocks. As the bait is consumed, the measured distance will increase. The sensor output is then sent to a cloud server for processing.
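The bait-level measurement in [23] maps an increasing proximity reading to a decreasing bait level. A minimal sketch, with hypothetical calibration distances:

```python
def bait_remaining_pct(measured_mm, sensor_to_full_mm, sensor_to_empty_mm):
    """Estimate remaining bait from an IR proximity reading.

    The sensor sits behind the bait rod, so the measured distance grows
    as bait is consumed. The two calibration distances (reading with a
    full station, reading with an empty one) are assumed to be taken
    at installation time; all values here are illustrative.
    """
    span = sensor_to_empty_mm - sensor_to_full_mm
    consumed = (measured_mm - sensor_to_full_mm) / span
    consumed = min(max(consumed, 0.0), 1.0)  # clamp sensor noise
    return round(100.0 * (1.0 - consumed), 1)
```

Reporting a percentage rather than a raw distance lets the server apply a single refill threshold across stations with different geometries.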

Computer vision based pest classification

In contrast to some of the simpler systems from the previous section, which only detect the presence of something in a trap (e.g. BoxSense, Smart Eye and RADAR Connect), computer vision systems (where algorithms are applied to images) provide a rich source of information which also facilitates pest classification. Several of the computer vision implementations described here rely on manually captured images, which is labour intensive and effectively only reduces the labour in the expert classification step. Integrating a camera into the detection system (as in Trapview) provides the greatest gains in labour efficiency while providing the most up-to-date data from the traps.

Advances in digital camera technology have resulted in low-cost camera sensors that have both a small form factor and high resolution. The Omnivision OV5647 camera for example, is a 5 mega-pixel camera with dimensions of 20 mm × 25 mm × 9 mm. This low cost camera has successfully been used in detecting pests as small as fruit flies [24]. These types of cameras, coupled with powerful computer vision algorithms, make up the core of vision enabled smart pest traps. This section reviews some of the computer vision methods used to detect and/or classify pests from a given image. The methods are further categorized into three broad techniques, namely Segmentation based techniques, Feature-based techniques, and Deep learning based techniques.

Segmentation based pest detection

Segmentation based pest detection techniques use simple algorithms and require low processing power. The principle behind these techniques is to remove the background pixels in the image, thereby resulting in an image containing only pixels belonging to the target pest [25].

One such segmentation algorithm was used to detect pests from images of pheromone based trap cards [26]. Wireless cameras placed in the traps facilitated the image acquisition. The presence of pests in the traps was determined by comparing the pixels in successive grayscale images (captured at one minute intervals). The equation used for this comparison is given in (1).

$$O(x,y) = \begin{cases} 255 & \text{if } C(x,y) = L(x,y)\\ 2 & \text{if } C(x,y) \ne L(x,y) \end{cases}$$

where O(x,y), C(x,y) and L(x,y) are the two-dimensional matrices which represent the output image, current image, and last received image for each trap, respectively.

It is evident from equation (1) that the output image will show areas where the two successive images differ as non-white pixels on a white background. To reduce the effect of varied illumination conditions on the output, a median filter was applied to the output image. The location of each pest in the output image was then determined by examining the pixel intensities in the filtered output image. While this system provides a computationally inexpensive way of successfully detecting and localizing the pests in the images, it is unable to determine the type of pest, or to distinguish between a pest and any other moving object in the traps, as it relies only on changes in pixel intensities between successive images.
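The comparison in equation (1) amounts to simple frame differencing, which can be sketched as follows (any non-white output value, here 2 as in the equation, flags a changed pixel):

```python
import numpy as np

def difference_image(current, last):
    """Apply equation (1): pixels unchanged between the current and last
    grayscale frames become white (255); changed pixels become non-white."""
    out = np.full(current.shape, 255, dtype=np.uint8)
    out[current != last] = 2  # non-white marks a change
    return out

def pest_locations(output, white=255):
    """(row, col) coordinates of the non-white (changed) pixels."""
    return np.argwhere(output != white)
```

In practice a median filter would be applied to the output, as described above, before reading off pest locations.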

In another implementation [27], the authors focused on segmenting footprints on tracking cards for pest classification, as track analysis gives information regarding the type and maturity of pests. The target pests were rats and mice, specifically the Norway rat, the Ship rat, the House mouse, and the Pacific rat. The tracks were captured in tracking tunnels, which are rectangular polyethylene boxes containing a white cardboard doused with non-drying ink (the tracking card). Lures were placed in the tracking tunnels to attract the rodents, which then leave ink footprints on the card. These tracking cards were manually checked, and images of them were subsequently captured. The identification method used thresholding (where pixels with intensities above a given threshold value are set to white, and those below the threshold are set to black) followed by a form of template matching [28] to identify the tracks on the card. The footprints of rats and mice have a circular shape, with the toes encompassing a central pad. This feature allowed the authors to create a number of templates for identifying the footprints. Each template contained a number of circles which represented the possible locations of the central pad and toes for each of the four rodents (with different templates for front and back feet). Figure 3 shows an image of a rat footprint made on a tracking card, as well as the template used for classifying this footprint.

Fig. 3

Rat footprint on tracking card (left) and template used for classification (right) [28]

After thresholding, the algorithm searched for candidate central pads using blob analysis. The area around a candidate central pad was then searched for possible toes. Only those (blobs) within a predefined distance from the candidate central pad were considered to be toes. Footprints were then identified by comparing the central pad and toes to the different templates. The detected footprints were also used to identify strides, which can be used as a means of gender classification and size estimation. This algorithm proved more successful in identifying the rat tracks than the mice tracks, yielding a false positive rate of 11% for rats compared to 38% for mice.

The authors in [29] improved on this method by providing a better way of thresholding the tracking card images. Instead of the hard threshold used in [27] which eliminated faint footprints on the tracking cards, their method employed an adaptive thresholding technique. This technique differentiated between a background pixel and a footprint pixel by comparing the pixel’s intensity against the mean of its surrounding pixels (relative to some local tolerance value). The tracking cards used in [29] contained footprints from Polynesian rats, Norway rats, and Ship rats. The algorithm proved successful in identifying the footprints of these species, yielding a best result of 78.4% true positives for Norway rat footprints, as well as 90.1% true negatives for the same species.
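The adaptive thresholding idea can be sketched as a comparison of each pixel against its local mean; the block size and tolerance below are illustrative values, not those used in [29]:

```python
import numpy as np

def adaptive_threshold(gray, block=15, tolerance=10):
    """Mark a pixel as footprint (ink) when it is darker than the mean of
    its block x block neighbourhood by more than `tolerance`.

    This is a direct (unoptimized) sketch; block size and tolerance are
    assumptions.
    """
    pad = block // 2
    padded = np.pad(gray.astype(float), pad, mode="edge")
    out = np.zeros(gray.shape, dtype=bool)
    h, w = gray.shape
    for y in range(h):
        for x in range(w):
            local_mean = padded[y:y + block, x:x + block].mean()
            out[y, x] = gray[y, x] < local_mean - tolerance
    return out
```

Because the threshold adapts to the local mean, faint footprints on a dim part of the card survive, which is exactly where a single global threshold fails.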

Hand-crafted feature based pest detection

While segmentation methods are effective in detecting whether or not a pest is present in an image, they tend to fail when multiple pests are present in the image, or for classification of different pest species. This is particularly true when analysing the images from pheromone based adhesive traps, where there will typically be a number of different pest species in the same image. To help solve this problem, machine learning algorithms can be incorporated into the detection process. Unique image features of each pest species (such as colour, shape and size) can be incorporated into the algorithm, so that multiple species can be detected and classified in the same image.

The authors in [30] successfully implemented such a system to classify Lobesia botrana moths using a K-means classifier. These moths are considered to be the major pest for the grapevine industry. A total of 50 pheromone based traps were deployed across various locations. Each trap was visited once a week and an image of the adhesive surface of the trap was captured using a mobile phone. An android based application on the phone was used to send the image to a web server, along with the GPS location of the phone, and a date and time stamp. The image was then preprocessed by applying a median filter to it so as to remove some dust particles that may have fallen on the adhesive surface. Following this the image was enhanced using the Contrast Limited Adaptive Histogram Equalization method (CLAHE). This process enhances features in the image which have poor contrast.

To account for the fact that images may be captured at different distances from the traps, the image was scaled by factors of 0.75, 0.5, and 0.25. The next step in the processing pipeline was to apply K-means clustering to the image. Once the image was split into segments, two descriptors were generated for each segment. The first descriptor was a 10-dimensional vector generated by taking the histogram of grayscale values of the pixels in the segment. The second descriptor was a 5-dimensional vector generated by taking the histogram of gradients of the pixels in the segment. These descriptors were then used to train a Support Vector Machine (SVM) to distinguish segments containing Lobesia botrana moths from those without. A total of 360 images were used for the training set, and from these, 2136 segments containing moths and 5018 segments containing no moths were used to train the SVM. This method yielded an average specificity of 95.1%. As with any classical machine learning approach, all parameters had to be tuned manually in order to find the segmentation size and classifier which yielded the most favourable results. Figure 4 shows the results of segmenting the image using K-means clustering, as well as the SVM classification. The yellow outline around each moth in Fig. 4 indicates a successful classification and localization.

Fig. 4

Segmented image using K-means clustering (left) and result after SVM classification (right) [30]
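The two per-segment descriptors used in [30] can be sketched as below; the bin edges and normalization are assumptions, as the paper does not fully specify them:

```python
import numpy as np

def segment_descriptors(segment):
    """Build a 15-dimensional descriptor for one K-means segment:
    a 10-bin histogram of grayscale intensities concatenated with a
    5-bin histogram of gradient magnitudes. Bin ranges are assumptions.
    """
    seg = segment.astype(float)
    # Descriptor 1: 10-bin grayscale histogram
    intensity_hist, _ = np.histogram(seg, bins=10, range=(0, 256), density=True)
    # Descriptor 2: 5-bin histogram of gradient magnitudes
    gy, gx = np.gradient(seg)
    magnitude = np.hypot(gx, gy)
    gradient_hist, _ = np.histogram(
        magnitude, bins=5, range=(0, magnitude.max() + 1e-9), density=True)
    return np.concatenate([intensity_hist, gradient_hist])
```

Each segment's descriptor would then be fed to the SVM described above, labelled as moth or non-moth.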

In another application, a support vector machine was used to correctly classify Thrips (Thysanoptera) on the flowers of strawberry plants [31]. The main focus of this system was to provide a real-time pest identification system, which was to be run on a mobile robot. The images used for training the system were captured in an orchard under natural lighting conditions. A total of 100 images of strawberry flowers were captured by the camera on the mobile robot. Each image was captured at a fixed distance from the flower.

The relatively small size of Thrips posed a problem for the identification process, and therefore a number of preprocessing steps were taken to maximize the chances of classification. The first of these steps was to remove the non-flower parts of the image. To do this, the RGB image was split into its individual colour channels, and the gamma operator was applied to the blue channel to enhance the contrast of the flower regions relative to the non-flower regions. With the contrast of the flower regions enhanced, the non-flower regions were removed by thresholding (using Otsu's binarization method). Morphological opening and closing operations were then applied to the thresholded image in order to remove noise. The resulting image then contained only the flower regions.

The next step involved treating the flower regions as background pixels, with the reasoning that any remaining pixels in the image would belong to a pest. This was done by inverting the image, which had the effect of making all pixels belonging to the white petals of the strawberry flower black, and making the pixels belonging to the middle region of the flower, as well as those belonging to the pest, white. Large regions of white pixels were then removed by considering their area, leaving only small regions, which were assumed to be target pixels of pests. This image was then used as a mask for the original colour image, so as to extract the pixels belonging to the pest in the original image. Figure 5 illustrates this process.

Fig. 5

Original image (top-left), result after thresholding and inverting (top-right) and result after region elimination by area (bottom) [31]
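The chain described above (gamma enhancement, thresholding, inversion, and area-based region elimination) can be sketched as follows. The gamma value, threshold, and area limit are illustrative, Otsu's automatic threshold is replaced by a fixed value, and the morphological cleanup step is omitted for brevity:

```python
import numpy as np

def regions(binary):
    """Label 4-connected regions of True pixels with a simple flood fill."""
    labels = np.zeros(binary.shape, dtype=int)
    n = 0
    h, w = binary.shape
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and labels[sy, sx] == 0:
                n += 1
                stack = [(sy, sx)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and binary[y, x] and labels[y, x] == 0:
                        labels[y, x] = n
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, n

def pest_mask(blue_channel, gamma=0.5, threshold=128, max_area=50):
    """Sketch of the preprocessing in [31]; all numeric values are assumptions."""
    enhanced = 255.0 * (blue_channel / 255.0) ** gamma  # gamma operator
    flower = enhanced > threshold                       # bright flower regions
    inverted = ~flower                                  # petals -> black
    labels, n = regions(inverted)
    mask = np.zeros(inverted.shape, dtype=bool)
    for i in range(1, n + 1):
        region = labels == i
        if region.sum() <= max_area:                    # keep pest-sized blobs only
            mask |= region
    return mask
```

The resulting mask can then be applied to the original colour image to extract only the pixels belonging to candidate pests.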

Considering that other pests such as houseflies and ants can also be found on the petals of strawberry flowers, a Support Vector Machine was implemented for the correct classification of Thrips. Two features were used to train the SVM. The first feature was the region index, obtained by taking the ratio of the major diameter to the minor diameter of the insect. The second feature was the colour index, found by converting the image to its HSI equivalent and extracting the intensity values. A radial basis function kernel was used for the SVM. Of the 100 images taken, 80 were used for training the SVM, and 20 were used for testing. The system was successful in classifying Thrips (in both the adult and larval stages), with a low mean percent error of 2.25%.
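The two SVM features can be sketched as follows, under the assumption that the region index is computed from the blob's bounding extents and the colour index from the mean HSI intensity, I = (R + G + B)/3:

```python
import numpy as np

def thrips_features(pest_pixels, rgb_values):
    """Compute the two features used to train the SVM in [31].

    `pest_pixels` is an (N, 2) array of (row, col) coordinates of the
    masked pest blob; `rgb_values` is an (N, 3) array of its colours.
    Using bounding extents for the diameters is an assumption.
    """
    ys, xs = pest_pixels[:, 0], pest_pixels[:, 1]
    extent_y = ys.max() - ys.min() + 1
    extent_x = xs.max() - xs.min() + 1
    # Region index: major diameter over minor diameter
    region_index = max(extent_y, extent_x) / min(extent_y, extent_x)
    # Colour index: mean HSI intensity, I = (R + G + B) / 3
    colour_index = rgb_values.mean()
    return region_index, colour_index
```

An elongated blob (high region index) with Thrips-like intensity would score differently from a rounder, darker housefly, which is what lets the RBF-kernel SVM separate the classes.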

Deep learning based pest detection

One major drawback of using feature based detection methods such as SVMs is that the features for each pest species must be manually tuned (hand crafted) in order to produce favourable results. This step is labour intensive, and limits the types of pests which can be identified to only those whose features have previously been hand crafted into the machine learning algorithm. To reduce the reliance on hand crafted features, deep learning approaches have been implemented for pest classification. This machine learning approach lets the algorithm automatically learn the best features with which to train the classifier, meaning that new species can be identified by simply retraining the algorithm with images containing the new species.

Ding and Taylor [11], for example, implemented an identification and detection system for codling moths using a Convolutional Neural Network (CNN). The training images used in this system were acquired from pheromone based moth traps. A low cost camera placed in the traps was used to capture the images under artificial lighting conditions. The captured images were then sent remotely to a server, where entomologists marked the locations of moths in the images to create training and validation data sets. As pheromone based traps can also attract other species, these images contained pests other than moths, hence the need for expert labelling of the moths prior to training the CNN. To achieve illumination invariance, the images were first preprocessed for white balance. This reduces the effects of ambient lighting on images captured at different times of the day. A total of 177 images were used to train a LeNet-like CNN. Data augmentation was used to increase the training data by rotating, translating, and flipping the moth images. The detection pipeline consisted of splitting the input image into a number of patches, and passing each patch to the trained CNN for classification. The CNN then mapped the input patch to a probability representing the likelihood of a codling moth being in the patch. Overlapping patches with high probabilities were eliminated through non-maximum suppression. The remaining candidate patches were compared against a threshold, and those above this threshold were taken to be the detected regions. The system was successful in classifying the codling moths, with a precision recall accuracy of 93.1% reported in the paper. The authors suggest that since the features were learned by the CNN, this method can be used to identify other types of pests simply by re-training the CNN with images containing those pests.
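The patch-based detection pipeline of [11] can be sketched as below, with any scoring callable standing in for the trained CNN; the stride, probability threshold, and IoU limit for non-maximum suppression are assumptions:

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two (y1, x1, y2, x2) boxes."""
    yy1, xx1 = max(a[0], b[0]), max(a[1], b[1])
    yy2, xx2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, yy2 - yy1) * max(0, xx2 - xx1)
    area = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / area

def detect(image, patch_size, stride, classify, threshold=0.5, iou_max=0.3):
    """Slide a window over the image, score each patch with `classify`
    (standing in for the trained CNN), keep high-probability patches,
    and suppress overlaps with greedy non-maximum suppression.
    """
    boxes, scores = [], []
    h, w = image.shape[:2]
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            p = classify(image[y:y + patch_size, x:x + patch_size])
            if p >= threshold:
                boxes.append((y, x, y + patch_size, x + patch_size))
                scores.append(p)
    keep = []
    for i in np.argsort(scores)[::-1]:  # highest-scoring boxes first
        if all(iou(boxes[i], boxes[j]) <= iou_max for j in keep):
            keep.append(i)
    return [boxes[i] for i in keep]
```

Non-maximum suppression is what turns many overlapping high-probability patches around one moth into a single detected region.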

IoT Communications Infrastructure

As shown in the previous sections, one way of improving the pest management process is by communicating sensor data from the pest traps to the pest management team. This information gives them a better picture of what is happening in the field. The Internet of Things has been instrumental in making this communication possible on a large scale.

The Internet of Things (IoT) refers to a network of interconnected devices (“things”, or nodes), which exchange data with servers over the internet, through a communications infrastructure. The data received by the servers is typically processed using cloud computing, and then presented to the user in the form of a web based application [32]. Figure 6 shows a standard IoT architecture, showing the layout and interconnections between the “things” (gas monitor, trash container etc) and the application servers.

Fig. 6

A typical IoT architecture [33]

In the context of smart pest traps, the traps are the “things” and the data sent from them is usually sensor data (such as temperature, humidity, proximity, optical, intruder detection etc). This data is either sent from the “things” to an internet enabled device, known as a gateway [32], or directly to the cloud if internet connectivity is provided on-board. The gateway device then uploads the data to a cloud based server for processing. Various communication infrastructures can be used to send data between the “things” and the gateway. The choice of communication infrastructure is influenced by the following:

  • The distance between the “things” and the gateway device

  • The environmental surroundings (rural vs built up)

  • The quantity of data to be sent or received between the devices

  • The rate at which the data is to be sent or received

  • The power source of the devices (battery, mains, solar etc)

In a typical communications system, the data to be transmitted is converted into an electromagnetic wave by an antenna and propagated into the air. As the electromagnetic wave propagates from a transmitter to a receiver, it loses power to the medium it travels through. The amount of power lost is known as the path loss, and it is dependent on factors such as the environment (buildings, trees, terrain) as well as the distance between the transmitter and receiver: the greater the separation, the weaker the signal arriving at the receiver. Path loss is lowest when there is a clear line-of-sight path between the transmitter and receiver antennae. When obstacles are present (such as building walls), the energy loss increases significantly; for example, a wave propagating through an office window can lose up to half of its energy. The effect of path loss means that the signal power at the receiver will be much less than when it left the transmitter [34].
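The distance dependence of path loss can be illustrated with the standard free-space (Friis) model. The survey does not state this formula explicitly; it is included here only as a reference point, since real environments with obstacles lose considerably more power than free space.

```python
import math

def free_space_path_loss_db(distance_m, frequency_hz):
    # Free-space path loss: FSPL(dB) = 20 * log10(4 * pi * d * f / c),
    # where d is the transmitter-receiver distance and f the carrier frequency.
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * frequency_hz / c)
```

At 915MHz over 1 km the free-space loss is roughly 92 dB, and every doubling of distance adds about 6 dB more.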

Noise (any unwanted signal) is also superimposed onto the wave as it propagates. Examples of noise sources are other transmitted signals in nearby frequency bands, electromagnetic waves produced by power lines, and noise generated within the transmitter or receiver itself. For the data to be recovered reliably, the signal power must be significantly higher than the noise power. The ratio of the signal power to the noise power, as seen at the receiver, is known as the signal-to-noise ratio (SNR) [34].

There are two ways of reducing the effect of these losses on the communications infrastructure. The first is to transmit the signal with more power; however, regulations limit the maximum transmission power over different frequency bands to a few milliwatts. Another problem with this approach is that in battery powered applications, the extra energy required to transmit at higher power would significantly reduce the battery life.

The other method for reducing losses is to increase the sensitivity of the receiver so that it can pick up heavily attenuated signals. The sensitivity of a receiver is defined as the minimum power which a signal must have in order for it to be correctly detected and demodulated by the receiver at a given bit-error-rate [35]. A receiver with a high sensitivity can be placed further away from a transmitter than one with a lower sensitivity, thus extending the range. As an example, a receiver with a sensitivity of -40dBm can typically decode signals as weak as 0.1 microwatts [34].
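Sensitivity figures quoted in dBm relate to absolute power logarithmically; a short conversion sketch makes the scale concrete:

```python
def dbm_to_milliwatts(dbm):
    # dBm is power in decibels relative to 1 mW: P(mW) = 10 ** (dBm / 10).
    # 0 dBm is exactly 1 mW; every -10 dB divides the power by ten.
    return 10 ** (dbm / 10)
```

So a -40dBm signal carries 10^-4 mW, i.e. 0.1 microwatts of power.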

There is a trade-off between data rate (the speed at which data is exchanged between a transmitter and receiver) and range. According to the Shannon-Hartley theorem [34], the slower the data rate, the further the range can be. This is because the achievable data rate depends directly on the SNR. Since a carrier signal loses power as it travels away from the transmitter, the SNR at the receiver falls with distance. To maintain the same bit-error-rate at longer range, the data rate must therefore be reduced.
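The Shannon-Hartley relation underlying this trade-off can be written out directly; the 125 kHz bandwidth in the example below is an illustrative value, not one taken from the survey.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    # Shannon-Hartley theorem: C = B * log2(1 + SNR) is the maximum
    # error-free data rate achievable over a channel of bandwidth B.
    return bandwidth_hz * math.log2(1 + snr_linear)
```

For a fixed bandwidth, the achievable rate falls as the SNR falls, which is why long-range low-SNR links must accept low data rates.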

The choice of frequency also determines the range of communication, with lower frequency carrier signals able to travel further than higher frequency ones. Higher frequency carriers also require more transmitter power to generate. Some frequency bands exhibit special characteristics; for example, sub-gigahertz carrier signals lose less power to obstacles in their environment than higher frequency signals do.

Battery management is crucial in practical IoT applications. A system optimized for low power results in longer intervals between battery replacements, effectively reducing the running costs of the system. In most IoT applications, the largest battery drain occurs when information is being transmitted over the wireless communication network. This has motivated research into protocols which reduce the power demand for wireless sensor networks.

One such protocol is node cooperation with network coding. The basic premise behind this method is that nodes in a wireless network combine their own data with data received from other nodes into a single network-coded message, which is then sent to a common destination node (such as a gateway device). This is in contrast to traditional wireless communication protocols, where each node sends only its own data to the destination node. By combining information from several nodes into a single transmission, the total time required for information from all nodes to reach the destination is reduced. The wireless radios are therefore active for less time, resulting in significant energy savings. Node cooperation with network coding also reduces the bandwidth required to transmit the data, while increasing the wireless communications range [36, 37].
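The core idea of network coding can be shown with a toy XOR example: a relay combines two packets into one transmission, and a destination that already holds one packet recovers the other from the coded message. The packet contents are hypothetical, and real coding schemes are considerably more sophisticated than a single XOR.

```python
def xor_packets(a: bytes, b: bytes) -> bytes:
    # Bytewise XOR of two equal-length packets; XOR is its own inverse,
    # so xor(coded, a) == b and xor(coded, b) == a.
    return bytes(x ^ y for x, y in zip(a, b))

packet_a = b"node-A-reading"
packet_b = b"node-B-reading"

coded = xor_packets(packet_a, packet_b)     # one relayed transmission
recovered_b = xor_packets(coded, packet_a)  # destination already knows packet_a
```

One coded transmission thus delivers information about both packets, which is the source of the time and energy savings described above.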

The main wireless technologies used to establish communications infrastructures in IoT systems are given in Table 1. These technologies can be grouped into three main categories, namely:

  • Short range Local Area Networks (LANs) in unlicensed frequency bands (e.g. Bluetooth, WiFi, RFID).

  • Long range cellular communications in licensed frequency bands (e.g. 3G/4G).

  • Low Power Wide Area Networks in unlicensed frequency bands (e.g. LoRa, SigFox).

Table 1 Comparison of wireless communication standards for IoT systems [35, 38,39,40]

The most commonly used short range communication protocols in unlicensed frequency bands are RFID, Bluetooth, Zigbee, and WiFi [41]. These protocols are characterized by high data rates over short ranges (less than 100 m).

The distance between the “things” and the gateway device can also be extended by using repeaters to amplify the carrier signal at fixed distances from the transmitter, or by using a mesh network. In a mesh network, each node can communicate with all other nodes in range. A node requiring data to be sent to the gateway device sends that data to all nodes in range, and the data is then passed (or hopped) from node to node until it reaches the gateway device [35]. The mesh approach offers good coverage and fault tolerance; however, it can introduce significant latency, as the routing algorithm dynamically changes the number of hops. Another drawback is that it is poorly suited to low-power sleeping applications, since all nodes must effectively remain active whenever data is being sent or received across the network.
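Multi-hop delivery in a mesh can be sketched as a shortest-path search over the node adjacency; the trap and gateway names below are hypothetical, and real mesh protocols add routing tables, retries, and duty cycling on top of this idea.

```python
from collections import deque

def hops_to_gateway(links, start, gateway):
    # Breadth-first search over node adjacency: returns the minimum number
    # of hops for a message to reach the gateway, or None if unreachable.
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if node == gateway:
            return hops
        for neighbour in links.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, hops + 1))
    return None

# Hypothetical chain topology: trap1 - trap2 - trap3 - gateway.
links = {
    "trap1": ["trap2"],
    "trap2": ["trap1", "trap3"],
    "trap3": ["trap2", "gateway"],
    "gateway": ["trap3"],
}
```

Each extra hop adds forwarding delay, which is the latency cost mentioned above.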

In applications where ubiquitous coverage is required, or where the distance between “things” is too large for a multi-hop mesh network to be feasible, a cellular communications infrastructure is often used. In such a system, each node would require a SIM card to send data over a cellular network such as 3G or 4G. This data can be sent in the form of an SMS, or packet data, to the nearest base station. One major disadvantage of using cellular networks is the ongoing costs required to keep the SIM cards active as well as to send data over the network.

Low Power Wide Area Networks (LPWANs) are a more recent development in IoT communications. These networks operate in the sub-gigahertz unlicensed bands while offering a much wider range than traditional short range LANs. They are characterized by low power consumption and low data rates over long ranges. LPWAN receivers have much higher sensitivities than traditional receivers, giving them longer ranges. For example, typical Bluetooth receiver sensitivities are around -90dBm, while an LPWAN receiver sensitivity can reach -150dBm. This means that an LPWAN receiver is capable of successfully decoding a signal one million times weaker than the weakest signal detectable by a Bluetooth receiver (at the same bit-error-rate). LPWANs are configured in a star topology, where each node is directly connected to the gateway device. Only one hop is therefore required for data to be sent from a node to the gateway device, eliminating the latency problems associated with multi-hop mesh networks [35].
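The "one million times weaker" figure follows directly from the decibel scale, as a quick check shows:

```python
def db_to_power_ratio(delta_db):
    # A power difference of delta_db decibels corresponds to a linear
    # power factor of 10 ** (delta_db / 10).
    return 10 ** (delta_db / 10)

# -90 dBm (typical Bluetooth sensitivity) vs -150 dBm (LPWAN): 60 dB apart.
factor = db_to_power_ratio(-90 - (-150))
```

A 60 dB sensitivity gap is a factor of 10^6 in power, confirming the comparison above.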

One of the most commonly used LPWAN technologies for IoT systems is LoRa (Long Range). LoRa devices use a Chirp Spread Spectrum (CSS) modulation scheme to send data between the “things” and the gateway device. CSS spreads the carrier signal over a wider band of frequencies at lower signal strengths. The SNR of the received signal is thus very low; however, the high sensitivity of the receiver allows for accurate decoding at specified bit-error-rates. In Australia, LoRa uses the 915MHz - 928MHz frequency range, with each channel occupying a maximum of 500kHz. The low bandwidth means that LoRa does not support high data rates; however, since many IoT nodes only send small amounts of data (typically 50 bytes per message), it matches such applications well. Typical ranges for LoRa communication infrastructures are 5 to 15 kilometres, depending on the environment (built up, rural, forested etc) [40].
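The CSS trade-off between range and data rate can be illustrated with the commonly cited LoRa bit-rate relation. This formula comes from LoRa modem documentation rather than the survey itself, and the parameter values used below (spreading factors 7 and 12, 125 kHz bandwidth, 4/5 coding rate) are illustrative.

```python
def lora_bit_rate_bps(spreading_factor, bandwidth_hz, coding_rate=4/5):
    # Commonly cited LoRa (CSS) bit-rate relation:
    # Rb = SF * (BW / 2**SF) * CR.
    # Higher spreading factors trade data rate for sensitivity and range.
    return spreading_factor * (bandwidth_hz / 2 ** spreading_factor) * coding_rate
```

With these assumed parameters, SF7 over a 125 kHz channel gives roughly 5.5 kbps, while SF12 drops to under 300 bps in exchange for much longer range.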

More recently, SigFox LPWAN systems have become a popular IoT connectivity solution. SigFox systems use a Differential Binary Phase Shift Keying (D-BPSK) modulation scheme (where the phase of the carrier signal is modulated by the data to be sent), together with Ultra-Narrowband carrier signals, to exchange small packets of data over its network. The data is transmitted at a low bit rate of either 100 bps or 600 bps. The low bit rate coupled with the D-BPSK modulation results in long range communication in the order of several kilometres. In a similar manner to legacy cellular systems, the SigFox network architecture consists of widespread base stations, with multiple radios in range of a base station able to exchange data with it simultaneously. Each SigFox radio is permitted to send 140 messages per day (12 bytes of data per message). Data received by the base stations is sent to the SigFox cloud, and subsequently forwarded to customer servers. SigFox communication occurs in the unlicensed ISM band, resulting in significantly reduced service charges to the customer (starting at 10 USD per device per year) [42, 43].

There are also LPWANs which rely on cellular systems as their backbone. While traditional cellular systems offer much wider coverage and pre-existing infrastructure, they consume too much power for many IoT applications. To address this problem, cellular-IoT systems were developed which offer the coverage and infrastructure of legacy cellular networks at a fraction of the power. Narrowband IoT (NB-IoT) is one such cellular-IoT system. It takes advantage of the licensed cellular sub-gigahertz bands to achieve low power wireless communication. It is based on the LTE standard for cellular communication; however, its bandwidth is limited to 200kHz. NB-IoT can also transmit data at much faster speeds than LoRa, with typical rates of 200 kbps [40].


When considering all of the aforementioned smart pest traps, it is evident that technology is having a disruptive effect on pest management. Traditional manual pest trap inspections are being replaced with smart traps capable of constant real time monitoring. These traps have the advantage of minimizing the amount of pesticide used, by focusing on areas where pest activity is high (as compared to a blanket use of pesticides).

There is a significant reduction in the resources needed to monitor for and identify pest infestations, as activity can be seen at a glance through the web applications which accompany these smart traps. Aside from increasing efficiency, smart traps can provide a wealth of data for better understanding pest behaviour, prevalence, and risks.

Another important observation is that smart pest management systems do not offer an end to end solution which eliminates the need for professional pest technicians. Rather, these systems can be used as a tool for the professionals when planning and evaluating their pest eradication methods. Technicians are also required to replace the bait in lure-based traps, as well as remove the bodies of dead pests from the traps.

Considering costs, the initial setup cost of smart pest traps is clearly much higher than that of traditional traps; however, the savings accrue over the lifetime of the system, as manual inspections (and their associated costs) are minimized. Currently, the recommended interval between technician visits to check bait stations is 4 to 8 weeks [44]. Remote monitoring using smart pest traps would significantly reduce the labour costs involved in these periodic site visits. Smart traps do, however, incur ongoing maintenance costs, mainly battery replacement and the cost of sending information over a wireless network.

Another notable observation is that, for larger pests (such as rodents), most smart traps only report when a trap was visited or triggered, not which type of pest visited it. This classification information could serve as an additional aid when planning a baiting strategy. Given the potential of machine vision classification algorithms (as discussed in Sect. 3), automated image capture and analysis in smart traps shows great promise.


In this paper, a survey of smart pest traps capable of pest identification and classification was presented. These traps were grouped into two main categories: those capable of pest detection only, and those capable of both pest detection and pest classification. The latter group classifies pests by applying image processing algorithms to images captured in the pest traps.

Three categories of image processing algorithms were discussed, namely segmentation based algorithms, hand-crafted feature based algorithms, and deep learning based algorithms. It was noted that deep learning based algorithms produced the most favourable results; however, the scarcity of labelled pest image data limits their use in smart pest traps.

Various IoT communication infrastructures for transferring data between smart pest traps and cloud servers were also discussed, with their individual trade-offs making the choice of infrastructure application dependent. Smart pest traps coupled with cloud based data analysis will be key to reducing the labour costs involved in traditional pest management, and to producing more optimised pest management approaches based on real-time information.


  1. (2005) The black death: The greatest catastrophe ever. History Today. https://www.historytoday.com/ole-j-benedictow/black-death-greatest-catastrophe-ever

  2. Bennett G, Owens J, Corrigan R, Truman L (1977) Truman’s scientific guide to pest control operations. Adv Commun. https://books.google.com.au/books?id=MNwnAQAAMAAJ

  3. (2018) Codling moth. RHS Gardening. https://www.rhs.org.uk/advice/profile?PID=489

  4. (2017) Two workers escaped death as house collapsed due to severe termites infestation. RentoKil. https://www.rentokil.com.my/blog/two-workers-escaped-death-as-house-collapsed-due-to-severe-termites-infestation/

  5. (2016) Safe food australia—appendix 7: Pest management. http://www.foodstandards.gov.au/publications/Documents/SafeFoodAustralia/Appendix7-Pestmanagement.pdf

  6. Whitehorn PR, O’connor S, Wackers FL, Goulson D (2012) Neonicotinoid pesticide reduces bumble bee colony growth and queen production. Science 336(6079):351–352

  7. Story P, Hooper MJ, Astheimer LB, Buttemer WA (2011) Acute oral toxicity of the organophosphorus pesticide fenitrothion to fat-tailed and stripe-faced dunnarts and its relevance for pesticide risk assessments in Australia. Environ Toxicol Chem 30(5):1163–1169

  8. Farmers Market Kenya (2016) Tutrack pheromone trap. https://www.fmk.co.ke/product/tutrack

  9. DoMyOwn (2014) Protecta lp rat bait station. https://www.domyown.com/protecta-lp-rat-bait-station-p-1291.html

  10. Anderson M (2012) Keeping city children safe from rat poisons | national poison prevention month. https://blog.epa.gov/blog/2012/03/keeping-city-children-safe-from-rat-poisons/

  11. Ding W, Taylor G (2016) Automatic moth detection from trap images for pest management. Comput Electron Agric 123:17–28

  12. (2018) Trapview—automated pest monitoring. TrapView. http://www.trapview.com/en/

  13. (2018) Spensa technologies. https://spensatech.com

  14. Nixon N (2017) Toxic baiting on demand. https://cleaningmag.com/news/toxic-baiting-on-demand

  15. O’Connor MC (2016) Inventor develops iot solution to bedbug problem. IOT J. http://www.iotjournal.com/articles/view?14165/

  16. (2015) lora-alliance | technology. www.lora-alliance.org/technology

  17. Hopkins T (2016) Leroy merlin wins the LoRa® alliance global iot challenge. http://www.marketwired.com/press-release/leroy-merlin-wins-the-lorar-alliance-global-iot-challenge-2089511.htm

  18. Rentokil (2016) Radar connect. https://www.rentokil.com/products/connected-pest-control/radar-connect/

  19. Goodnature (2014) Goodnature e2 rodent control system. http://www.goodnature.com.au/

  20. (2017) Smart—a more intelligent solution to pest control—anticimex. www.anticimex.com/smart-pest-control

  21. (2014) Boxsense—wireless transmitting bait stations for rodents with cloud and mobile applications. www.iotboxsys.com/products/bait-stations

  22. (2017) Self-powered wireless technology for sustainable buildings. www.enocean-alliance.org/what-is-enocean/self-powered-wireless-technology

  23. Nibali A, Ross R, Parsons L (2019) Remote monitoring of rodenticide depletion. IEEE Internet Things J

  24. Shaked B, Amore A, Ioannou C, Valdés F, Alorda B, Papanastasiou S, Goldshtein E, Shenderey C, Leza M, Pontikakos C et al (2018) Electronic traps for detection and population monitoring of adult fruit flies (diptera: Tephritidae). J Appl Entomol 142(1–2):43–51

  25. Piccardi M (2004) Background subtraction techniques: a review. In: IEEE international conference on systems, man and cybernetics, vol 4, pp 3099–3104

  26. Miranda JL, Gerardo BD, Tanguilig BT III (2014) Pest detection and extraction using image processing techniques. Int J Comput Commun Eng 3(3):189

  27. Hasler N, Klette R, Agnew W (2004) Footprint recognition of rodents and insects. CITR, The University of Auckland, New Zealand, Tech. Rep.

  28. Brunelli R (2009) Template matching techniques in computer vision: theory and practice. Wiley. https://books.google.com.au/books?id=AowB9dRNTqYC

  29. Russell JC, Hasler N, Klette R, Rosenhahn B (2009) Automatic track recognition of footprints for identifying cryptic species. Ecology 90(7):2007–2013

  30. García J, Pope C, Altimiras F (2017) A distributed K-means segmentation algorithm applied to lobesia botrana recognition. Complexity 2017

  31. Ebrahimi M, Khoshtaghaza M, Minaei S, Jamshidi B (2017) Vision-based pest detection based on svm classification method. Comput Electron Agric 137:52–58

  32. Gubbi J, Buyya R, Marusic S, Palaniswami M (2013) Internet of things (iot): a vision, architectural elements, and future directions. Future Gener Comput Syst 29(7):1645–1660

  33. Navada V (2018) Lora. https://www.devopedia.org/lora

  34. Goldsmith A (2005) Wireless communications. Cambridge University Press. https://books.google.com.au/books?id=ZtFVAgAAQBAJ

  35. (2016) A comprehensive look at low power wide area networks. www.link-labs.com

  36. Attar H, Vukobratovic D, Stankovic L, Stankovic V (2011) Performance analysis of node cooperation with network coding in wireless sensor networks. In: 2011 4th IFIP international conference on new technologies, mobility and security, pp 1–4

  37. Attar H, Stankovic L, Stankovic V (2012) Cooperative network-coding system for wireless sensor networks. IET Commun 6(3):344–352

  38. Yaqoob I, Hashem IAT, Mehmood Y, Gani A, Mokhtar S, Guizani S (2017) Enabling communication technologies for smart cities. IEEE Commun Mag 55(1):112–120

  39. Frenzel L (2017) Long-range iot on the road to success. https://www.electronicdesign.com/embedded-revolution/long-range-iot-road-success

  40. Raza U, Kulkarni P, Sooriyabandara M (2017) Low power wide area networks: an overview. IEEE Commun Surv Tutor 19(2):855–873

  41. Centenaro M, Vangelista L, Zanella A, Zorzi M (2016) Long-range communications in unlicensed bands: the rising stars in the iot and smart city scenarios. IEEE Wirel Commun 23(5):60–67

  42. SigFox (2017) Sigfox technology overview. https://www.sigfox.com/en/sigfox-iot-technology-overview

  43. Sayer P (2017) Sigfox shows 20-cent iot wireless module. https://www.computerworld.com.au/article/627814/sigfox-shows-20-cent-iot-wireless-module/

  44. Berney P, Esther A, Jacob J, Prescott C (2014) Risk mitigation measures for anticoagulant rodenticides as biocidal products



This work was funded with a $30,000 grant from the Securing Food, Water and Environment research focus area from La Trobe University.

Author information



Corresponding author

Correspondence to Lyle Parsons.

Ethics declarations

Conflict of Interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.



Cite this article

Parsons, L., Ross, R. & Robert, K. A survey on wireless sensor network technologies in pest management applications. SN Appl. Sci. 2, 28 (2020). https://doi.org/10.1007/s42452-019-1834-0



Keywords

  • Pest management
  • Internet of things
  • Wireless sensors
  • Image processing