
1 Introduction

As the weight of a pig at shipment is an indicator of its price, it is very important to ship pigs at the proper weight. For economic reasons, the best weight for shipping pigs is approximately 115 kg. However, in order to reduce labor costs, many farmers ship pigs without measuring their weight. As a result, the price obtained for the pigs decreases, which is a significant management problem.

In order to deal with this problem, automatic sorting systems for selecting pigs of appropriate weight have been developed, primarily in Europe and the United States [1, 2]. Although load cells are generally used as weight sensors, pigs must remain still during measurement in order to avoid errors, which increases the measurement time. Sawdust is often used as a matting material in pig houses. Although sawdust is comfortable for the pigs, it often gets under the load-cell baseboard, which causes mechanical errors.

In order to cope with these issues, a weight measurement method using a camera is required. Several systems have been developed that measure parts of a pig body with a camera and estimate the pig weight from 2D information. Kashiha et al. [3] estimated pig weight by measuring the top-view body area, and Schofield et al. [4] identified differences in the area-to-weight relationship among three strains. Wang et al. [5] developed a walk-through weighing system for pigs using computer vision. Traditionally, the weight of a pig can be accurately estimated by manually measuring specific parts, such as the body length and chest circumference [6, 7]. The purpose of the present study is to measure these parts of a pig body using a camera and thereby estimate the weight of the pig. In order to simulate manual measurement on a computer, these parts must be extracted in three dimensions. Moreover, since it is difficult to keep a pig still during measurement, instantaneous measurement is required.

Although color-coded structured light projection methods [8,9,10,11] have been introduced to perform instantaneous measurement, under practical conditions these methods are easily influenced by ambient light. Recently, RGB-D cameras such as the KINECT and Xtion sensors have been used in a number of applications to obtain instantaneous three-dimensional (3D) data for motion capture [12,13,14,15]. The KINECT sensor has been applied to cattle measurement, and satisfactory results have been obtained [16, 17]. The Xtion sensor has also been used for pig measurement [18]. Although RGB-D cameras have been proposed for obtaining instantaneous 3D shape information, there is a limit to the maximum length of the USB cable: a USB3 cable should be less than 5 m in length. As such, the computer used to control the KINECT sensor must be nearby. In general, the environment of a pig house is not good for computers; temperature, humidity, and dust cause computer trouble within a couple of days, and over 1 mm of dust accumulates in a day. Therefore, the use of RGB-D cameras in a pig house is not practical.

In the proposed method, a Gig-Ethernet camera and a multiple-slit laser with random dots are used to obtain instantaneous shape data. Using this camera and laser system, remote control is possible, and instantaneous 3D shape data can be acquired without the need to sweep the laser. Many cross sections along the multiple laser slits can be captured simultaneously. In order to convert the cross-sectional shapes into global coordinates, the direction of each slit must be recognized. In this method, the pattern of random dots around each slit is used to identify the slit. The random dots also allow the pig body to be extracted robustly from the background, because the dot pattern differs greatly between images with and without a pig under natural light. This method is appropriate for use in a pig sorting system because it enables robust measurement under the poor environmental conditions in a pig house. In the present paper, a method for extracting the pig body shape from a captured image and reconstructing the 3D surface to estimate the weight of the pig is introduced.

2 Pig Sorting System

2.1 Sorting Process

Figure 1 shows a schematic illustration of the pig sorting system. A pig passes through (a) the entrance and enters (b) the camera and sorting area, which is located in the center of the figure. The camera system estimates the weight of the pig. If the pig has not reached the proper weight (around 115 kg), then the pig is guided to (c) the food zone on the left-hand side of the figure. In the food zone, there is (d) a one-way path to (e) the relaxation zone. The pig moves from the food zone to the relaxation zone after eating. When the pig becomes hungry, it reenters (b) the camera and sorting area through (a) the entrance. The pigs repeat these actions (from (a) to (e)), and when a pig has reached the proper weight for shipment, it is guided to (f) the shipping zone located on the right-hand side of the figure. Figure 2 shows a photograph of the developed sorting system. In this system, a rotary mechanism is used to separate the pigs, which is an effective way of preventing multiple pigs from becoming jammed at the exit.

Fig. 1. Pig sorting system. Pigs repeatedly move from (a) to (e), and when a pig has reached the proper weight for shipment, the pig is automatically guided to (f) the shipping zone.

Fig. 2. Photograph of the pig sorting system (2018). A computer is placed outside of the pig room, and the system is controlled by a LAN connection. A Gig-Ethernet camera system is placed at the top of the system and observes the movement of the pigs.

2.2 Camera Settings for Weight Estimation

Generally, the weight of an animal is measured using a load cell. However, load cells are not appropriate for automatic animal measurement because pigs do not remain still during measurement. Sawdust is often used for matting in pig houses, and it can get under the load-cell baseboard, causing mechanical errors. Therefore, a computer vision system is more appropriate for pig weight measurement. For a practical implementation, quick measurement is also required. The proposed system uses a computer and laser projectors that project a pattern of multiple slits and random dots onto the same area. In a practical situation, the use of a laser projector with a specific wavelength is recommended for robustness. The random dot projector is effective for distinguishing the laser light from ambient light because it forms distinguishable structured light over the entire surface of a pig. Figure 3 shows an image of the measurement area. Multiple slits with random dots (wavelength: 660 nm) are projected from the top of the system. A Gig-Ethernet camera with a bandpass filter is also placed at the top of the system. A magnified image captured by the camera is shown in the figure. The random dots form specific patterns along each slit. These patterns are used to identify each slit, and the direction of each slit is estimated. Both the slits and the random dots are generated using diffractive optical elements placed at the laser projectors. The projectors are placed vertically on the holder. The separation between the two projectors causes a slight displacement on the surface of the target, as shown in Fig. 4. However, identification of the slits is not influenced by this shift, because the displacement is along the laser slit direction.

Fig. 3. Multiple slits with random dots are projected onto the surface of the pig from the top of the sorting system. The random dots are used to identify the direction and number of slits.

Fig. 4. Shift of the random dot pattern along the slit direction.

2.3 Identification of Slits

The direction of each slit from the projector has to be identified in order to estimate the global data by triangulation. A reference image that includes the multiple slits, their slit numbers, and the random dot pattern is recorded first as an initial setting. An example of the reference image is shown in Fig. 5. This reference image is stored in computer memory and is used to allocate the slit address numbers during the measurement process.

First, the epipolar equation [19, 20] is determined between the measurement image and the reference image. The relation between a point (u', v') in the measurement image and a point (u, v) in the reference image is given by Eq. (1).

$$\begin{aligned} \left[ \begin{array}{ccc} u' & v' & 1 \end{array} \right] \left[ \begin{array}{ccc} m_{11} & m_{12} & m_{13} \\ m_{21} & m_{22} & m_{23} \\ m_{31} & m_{32} & m_{33} \end{array} \right] \left[ \begin{array}{c} u \\ v \\ 1 \end{array} \right] = 0 \end{aligned}$$
(1)

where \(m_{11}\)–\(m_{33}\) constitute a rotational and translational matrix. These parameters can be determined from eight or more corresponding pairs of points between the measurement image and the reference image (the matrix is normalized so that \(m_{33} = 1\), as reflected in Eq. (2)). By selecting a point (u', v') in the measurement image, the epipolar line in the reference image is determined as follows:

$$\begin{aligned} (u'm_{11}+v'm_{21}+m_{31})u+(u'm_{12}+v'm_{22}+m_{32})v+u'm_{13}+v'm_{23}+1 =0 \end{aligned}$$
(2)
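As an illustration, the parameters \(m_{11}\)–\(m_{33}\) in Eq. (1) can be estimated from eight or more dot correspondences with a standard least-squares (eight-point) procedure, and Eq. (2) then gives the epipolar line. The following sketch is a minimal numpy implementation; the listed correspondences are hypothetical placeholder values, not measured data.

import numpy as np

# Hypothetical corresponding dot centers: (u', v') in the measurement image
# and (u, v) in the reference image (eight or more pairs are required).
meas = np.array([[102, 55], [310, 62], [515, 58], [98, 243],
                 [305, 250], [512, 246], [101, 431], [509, 438]], dtype=float)
ref = np.array([[115, 50], [305, 70], [530, 49], [90, 255],
                [320, 240], [500, 260], [120, 420], [495, 450]], dtype=float)

# One row per correspondence of the linear system implied by Eq. (1):
# [u'u, u'v, u', v'u, v'v, v', u, v, 1] . [m11, m12, ..., m33] = 0
A = np.array([[up * u, up * v, up, vp * u, vp * v, vp, u, v, 1.0]
              for (up, vp), (u, v) in zip(meas, ref)])

# Least-squares solution: right singular vector of the smallest singular
# value, normalized so that m33 = 1 as in Eq. (2).
m = np.linalg.svd(A)[2][-1]
M = m.reshape(3, 3) / m[-1]

def epipolar_line(M, up, vp):
    """Coefficients (a, b, c) of the epipolar line a*u + b*v + c = 0 in the
    reference image for a point (u', v') in the measurement image, Eq. (2)."""
    return np.array([up, vp, 1.0]) @ M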

After the measurement image is obtained, a small template such as the one in Fig. 3 is selected, and the same arrangement of random dots is searched for in the reference image. The slit address number in the measurement image is thus determined by finding the slit with the same random dot pattern in the reference image. Once the slit address number is determined, the global coordinates can be calculated by triangulation, as in the ordinary slit-ray projection method. Equation (3) is the relationship between image coordinates (u, v) and global coordinates (x, y, z), where \(c_{11}\)–\(c_{33}\) are camera parameters that include the rotation and translation, and \( \rho \) is a scale factor [21].

$$\begin{aligned} \rho \left[ \begin{array}{c} u \\ v \\ 1 \end{array} \right] = \left[ \begin{array}{cccc} c_{11} & c_{12} & c_{13} & c_{14} \\ c_{21} & c_{22} & c_{23} & c_{24} \\ c_{31} & c_{32} & c_{33} & 1 \end{array} \right] \left[ \begin{array}{c} x \\ y \\ z \\ 1 \end{array} \right] \end{aligned}$$
(3)
Fig. 5. Example of the reference image, which includes the slit numbers and the random dot pattern. The slit number is allocated in the measurement image by finding the same random dot pattern in this image.

Figure 6 shows an example of a slit segment. Several points are randomly selected on each slit segment, and a small template centered on each selected point is defined. The epipolar equation is used for quick processing. Figure 7 shows the epipolar line calculated for a selected point. The computer calculates the cross points between the epipolar line and the slit lines in the reference image, and the candidate matching points are limited to these cross points. The same arrangement of random dots as in the template is searched for in the reference image, and the matching evaluation value is recorded. Finally, the slit address number is allocated by selecting the address with the maximum matching evaluation value. Restricting the search to the cross points enables quick processing.
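A minimal sketch of this candidate restriction and matching step is given below, using OpenCV for the normalized correlation. The helper names, the data structure mapping each slit address to its detected pixels, the 1.5-pixel distance tolerance, and the 31 x 31 template size are assumptions made for illustration, not details of the original implementation.

import numpy as np
import cv2

def candidate_cross_points(line_abc, slit_points_by_address):
    """Approximate the cross points between the epipolar line a*u + b*v + c = 0
    and each slit by the slit pixel closest to the line.
    slit_points_by_address maps a slit address to an Nx2 array of (u, v) pixels."""
    a, b, c = line_abc
    candidates = {}
    for address, pts in slit_points_by_address.items():
        d = np.abs(pts @ np.array([a, b]) + c) / np.hypot(a, b)
        i = int(np.argmin(d))
        if d[i] < 1.5:            # keep only slits that the epipolar line crosses
            candidates[address] = pts[i]
    return candidates

def allocate_slit_address(template, reference_img, candidates, half=15):
    """Match the random dot template against a patch around each candidate cross
    point and return the slit address with the maximum evaluation value."""
    best_addr, best_score = None, -1.0
    for address, (u, v) in candidates.items():
        u, v = int(u), int(v)
        patch = reference_img[v - half:v + half + 1, u - half:u + half + 1]
        if patch.shape != template.shape:
            continue              # candidate too close to the image border
        score = cv2.matchTemplate(patch, template, cv2.TM_CCOEFF_NORMED)[0, 0]
        if score > best_score:
            best_addr, best_score = address, score
    return best_addr, best_score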

Fig. 6. Template setting. The dot pattern around a segment is used for identification of each slit.

The relation between the global coordinates and a slit with address number n is as follows [21]:

$$\begin{aligned} \alpha \left[ \begin{array}{c} n \\ 1 \end{array} \right] = \left[ \begin{array}{cccc} p_{11} & p_{12} & p_{13} & p_{14} \\ p_{21} & p_{22} & p_{23} & 1 \end{array} \right] \left[ \begin{array}{c} x \\ y \\ z \\ 1 \end{array} \right] \end{aligned}$$
(4)

where \(p_{11}\)–\(p_{23}\) are projector parameters that include the rotation and translation, and \( \alpha \) is a scale factor between the global coordinates and the slit number. Finally, the global coordinates can be calculated by Eq. (5), which is obtained from Eqs. (3) and (4).

$$\begin{aligned} \left[ \begin{array}{c} x \\ y \\ z \end{array} \right] = \left[ \begin{array}{ccc} c_{11}-c_{31}u & c_{12}-c_{32}u & c_{13}-c_{33}u \\ c_{21}-c_{31}v & c_{22}-c_{32}v & c_{23}-c_{33}v \\ p_{11}-p_{21}n & p_{12}-p_{22}n & p_{13}-p_{23}n \end{array} \right] ^{-1} \left[ \begin{array}{c} u-c_{14} \\ v-c_{24} \\ n-p_{14} \end{array} \right] \end{aligned}$$
(5)
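Once an image point (u, v) has been assigned a slit address n, Eq. (5) is simply a 3 x 3 linear system. The following minimal sketch solves it directly, assuming the camera matrix c (3 x 4, with its last element equal to 1 as in Eq. (3)) and the projector matrix p (2 x 4, with its last element equal to 1 as in Eq. (4)) come from a prior calibration.

import numpy as np

def triangulate(u, v, n, c, p):
    """Solve Eq. (5) for the global coordinates (x, y, z).
    c is the 3x4 camera matrix of Eq. (3) and p is the 2x4 projector
    matrix of Eq. (4); both are assumed to come from calibration."""
    A = np.array([
        [c[0, 0] - c[2, 0] * u, c[0, 1] - c[2, 1] * u, c[0, 2] - c[2, 2] * u],
        [c[1, 0] - c[2, 0] * v, c[1, 1] - c[2, 1] * v, c[1, 2] - c[2, 2] * v],
        [p[0, 0] - p[1, 0] * n, p[0, 1] - p[1, 1] * n, p[0, 2] - p[1, 2] * n],
    ])
    b = np.array([u - c[0, 3], v - c[1, 3], n - p[0, 3]])
    return np.linalg.solve(A, b)   # applies the matrix inverse written in Eq. (5)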
Fig. 7. Cross points between the slits and the epipolar line are extracted, and template matching is executed among them.

2.4 Extraction of Pig Images

For practical use, robust extraction of the target from a captured image is required. Image subtraction is a basic approach to extraction. The use of a specific laser wavelength and a bandpass filter on the camera is effective at preventing interference from ambient light. In the present system, the random dots (wavelength: 660 nm) are the main light source for the camera. The random dots are effective for extracting the pig body robustly from the background, because the dot pattern differs greatly between images with and without a pig under natural light. The random dots are dispersed over the measurement area, and they are displaced depending on the depth. These displacements are used to extract the pig body from the image without being affected by natural light. In order to reduce noise, a median filter is applied to construct the filled area of the pig image.
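A minimal sketch of this subtraction and filtering step with OpenCV is shown below; the file names, the threshold of 30, and the 15-pixel median kernel are illustrative assumptions rather than the values used in the actual system.

import cv2

# Background image (dots projected onto the empty floor) and measurement image
# (dots projected onto a pig), both captured through the 660 nm bandpass filter.
background = cv2.imread("background_dots.png", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("pig_dots.png", cv2.IMREAD_GRAYSCALE)

# The dots on the pig are displaced by its height, so the absolute difference
# is large over the pig body and small on the unchanged floor.
diff = cv2.absdiff(frame, background)
_, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)

# A median filter removes isolated noise and fills the dotted pig region.
mask = cv2.medianBlur(mask, 15)

# Keep only the pixels belonging to the pig body for the 3D processing step.
pig_only = cv2.bitwise_and(frame, frame, mask=mask)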

3 Experiment and Discussion

Figure 8 shows the experimental system in the pig house. When a pig enters the measurement area, an image of the pig is automatically captured and sent to a computer. The sorting direction of the pig is determined by the result of the weight estimation.

Figure 9 shows an example of the extraction of a pig from a captured image. As the random dots are displaced according to the depth, the pig body is easily extracted from the background image by the subtraction procedure. This single extracted image is used for the 3D processing.

Fig. 8. Experimental setup in the pig house.

Fig. 9. Extraction process of a pig in the system. (a) Original image. (b) Extracted pig image using the proposed subtraction method.

The specific points to be measured on the body of a pig are shown in Fig. 10. The body length from ears to tail and the girth are highly correlated with the weight of the pig [6, 7]. The proposed measurement system has been developed to measure the body length and girth automatically.

Fig. 10. Specific points to be measured. These points have a high correlation with the weight of the pig.

The laser has a wavelength of 660 nm, and the bandpass filter of the camera is centered on this wavelength. A multiple-slit laser projector and a random dot projector are arranged vertically on one side of a rod, and the camera (Basler acA1300, 60 fps) is placed on the other side of the rod. When the entire body of the pig is captured in the camera image, the computer starts processing. The software was developed using Visual Studio 2015 and OpenCV 3.1.0.

3.1 Conversion of Slit Images to Three Dimensional Data

In order to obtain three-dimensional information from the image, an address has to be allocated to each slit for the triangulation. For slit address allocation, points on each slit in the measurement image are randomly selected, and a template around each point is extracted. Figure 11 shows the candidate points corresponding to a selected point, taken from among the intersection points between the epipolar line and the slits in the reference image. Pattern matching is performed among these intersection points. In order to make the allocation reliable, 10 points were selected from each segment for pattern matching, and the address with the highest degree of matching among the 10 randomly selected points was allocated to the segment. Table 1 shows the allocation result for the image. There are 25 segments in the measurement image to which slit addresses must be allocated. In the allocation process, the evaluation values from template matching are used, and the slit address with the highest evaluation value is selected as the final address of the segment. The processing time to allocate line numbers to these 25 segments was 1.5 s. After the slit numbers are allocated, the global coordinates of the points on each slit segment are calculated by Eq. (5). These data with global coordinates are used to extract the specific parameters for estimating the pig weight.
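The voting over 10 sampled points per segment can be sketched as follows. It reuses the hypothetical helpers epipolar_line, candidate_cross_points, and allocate_slit_address from the sketches in Sect. 2.3, and the segment data structure (an Nx2 numpy array of slit pixels) is likewise an assumption.

import numpy as np

def allocate_segment(segment_points, measurement_img, reference_img,
                     M, slit_points_by_address, samples=10, half=15):
    """Allocate one slit address to a measured segment: sample up to 10 points,
    match the dot template of each along its epipolar line, and keep the
    address with the maximum evaluation value."""
    rng = np.random.default_rng(0)
    idx = rng.choice(len(segment_points),
                     size=min(samples, len(segment_points)), replace=False)
    best_addr, best_score = None, -1.0
    for u, v in segment_points[idx]:
        u, v = int(u), int(v)
        template = measurement_img[v - half:v + half + 1, u - half:u + half + 1]
        line = epipolar_line(M, u, v)                  # Eq. (2)
        cands = candidate_cross_points(line, slit_points_by_address)
        addr, score = allocate_slit_address(template, reference_img, cands, half)
        if addr is not None and score > best_score:
            best_addr, best_score = addr, score
    return best_addr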

Table 1. Allocation result of slit address
Fig. 11. Pig image extracted from the measured and reference images. The selected point generates the epipolar line in the reference image, and the corresponding point is selected from the candidates at the intersection points between the epipolar line and the slit lines. In this image, colors indicate corresponding slit numbers between images. (Color figure online)

3.2 Estimation of Body Length

The length of the pig body is calculated after the global coordinates on the slits are obtained. Figure 12 shows the length estimation procedure. The highest point in each segment is selected, and a curved line along these selected points is generated by connecting them. The length of the body is calculated along this curved line.
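A minimal sketch of this length computation is given below, assuming spine_points is an Nx3 array of the selected per-segment highest points (x, y, z) in millimetres, already ordered from head to tail; connecting them with straight segments approximates the curved line.

import numpy as np

def body_length(spine_points):
    """Length of the polyline through the highest point of each slit segment."""
    diffs = np.diff(spine_points, axis=0)              # successive 3D segments
    return float(np.sum(np.linalg.norm(diffs, axis=1)))

# Illustrative values only: a gently curved spine roughly 900 mm long.
spine = np.array([[i * 60.0, 0.0, 950.0 + 10.0 * np.sin(i / 3.0)]
                  for i in range(16)])
print(body_length(spine))        # slightly more than 900 mm due to curvature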

Fig. 12. Estimation of body length. The point with the maximum z value is selected from each slit segment, and a curved line along the spine is generated by connecting these selected points. The length of the body is calculated using this curved line.

3.3 Estimation of Pig Girths

The girth, as well as the body length, is an important parameter for estimating the weight of a pig. The image captured from above the pig is used in our system, since the back shape of a pig reflects the carcass shape. Figure 13 shows the girth estimation procedure. As the cross sections of the pig are close to circles, each segment on the body is approximated by a circle. These estimated circles are arranged along the pig body, and the pig body is reconstructed as shown in Fig. 13(c). Around the ears of the pig, the approximation fails because of image noise, but the data can be smoothed by averaging the circles. The reconstructed pig body has a shape similar to the carcass data in Fig. 13(d). As the final price of a pig is determined by the weight of the carcass, this similarity is an extra feature of the proposed method. Although more data and more discussion from the viewpoint of animal science are needed to establish this assumption, the price of the pig could possibly be estimated before shipping.
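The circle approximation of one cross section can be sketched with an algebraic least-squares fit to the visible arc. The paper does not specify the fitting method, so this is one common choice (the Kåsa fit), and the example arc values are hypothetical.

import numpy as np

def fit_circle(y, z):
    """Least-squares circle fit to the visible arc of one slit cross section,
    given the (y, z) coordinates of its points; returns centre and radius."""
    A = np.column_stack([y, z, np.ones_like(y)])
    b = y**2 + z**2
    # Solve y^2 + z^2 = 2*cy*y + 2*cz*z + (r^2 - cy^2 - cz^2) in least squares.
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cy, cz = sol[0] / 2.0, sol[1] / 2.0
    r = np.sqrt(sol[2] + cy**2 + cz**2)
    return cy, cz, r

# Illustrative arc: the upper 90-degree arc of a circle of radius 170 mm,
# as seen by the camera looking down on the back of the pig.
theta = np.linspace(np.pi / 4, 3 * np.pi / 4, 40)
y = 170.0 * np.cos(theta)
z = 600.0 + 170.0 * np.sin(theta)
_, _, r = fit_circle(y, z)
girth = 2.0 * np.pi * r          # girth of this cross section, about 1068 mm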

Fig. 13. Estimation of girths from the upper-side image. Girths are estimated by circle approximation using the captured arcs. The reconstructed pig body has a shape similar to the carcass data (d).

3.4 Accuracy Evaluation

To evaluate the accuracy of the proposed system, the depth measurement accuracy was checked first using a flat board. The board was placed at a distance of 1 m in front of the measurement system, and the deviation was checked. The average deviation was 0.7 mm.

The accuracy of the weight estimation was evaluated using real pigs. Five pigs passed through the measurement area at random, and 60 pig images were compared against ground-truth weights measured by a load cell. The weights ranged from 95.0 kg to 124.5 kg. The weight was estimated from the measured body length and girth. The equation developed to estimate the weight was as follows:

$$\begin{aligned} W = 0.103\,L + 0.122\,G - 101.4 \end{aligned}$$
(6)

where L [mm] is the body length, G [mm] is the girth of the pig, and W [kg] is the estimated weight of the pig. The correlation coefficient between the estimated weight and the ground-truth weight was 0.92.
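As an illustrative check with hypothetical measurements (not values taken from the experiment), a pig with L = 900 mm and G = 1000 mm would be estimated at

$$\begin{aligned} W = 0.103 \times 900 + 0.122 \times 1000 - 101.4 = 113.3~\mathrm{kg}, \end{aligned}$$

which is close to the target shipping weight of approximately 115 kg.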

4 Conclusion

A 3D measurement method for a pig sorting system that uses a single image to estimate the weight of a pig was introduced. The projection of multiple slits enables the simultaneous measurement of multiple cross sections of the pig body. The proposed method is adequate for extracting the parameters used to estimate the weight of a pig, and subtraction of the background image using random dots enables robust practical measurement on a pig farm. Since pigs take various postures during measurement, a measurement method that is not affected by posture is needed; an image processing method that realizes stable measurement for various postures was introduced.

In this paper, the performance of the proposed method was shown from an engineering viewpoint. The relationship between the results and the measured data may change depending on the pig house and strain. Therefore, the method should also be discussed from an animal science viewpoint as the next step.