A high throughput robot system for machine vision based plant phenotype studies
This work demonstrates how a high throughput robotic machine vision system can quantify seedling development with high spatial and temporal resolution. The throughput that the system provides is high enough to match the needs of functional genomics research. Analyzing images of plant seedlings growing and responding to stimuli is a proven approach to determining the effects of an affected gene. However, with on the order of 10⁴ genes in a typical plant genome, comprehensive studies will require high throughput methodologies. To increase throughput without sacrificing spatial or temporal resolution, a 3-axis robotic gantry system utilizing visual servoing was developed. The gantry consists of direct drive linear servo motors that can move the cameras at a speed of 1 m/s with an accuracy of 1 μm and a repeatability of 0.1 μm. Perpendicular to the optical axis of the cameras, a 1 m² sample fixture holds 36 Petri plates in which 144 Arabidopsis thaliana seedlings (4 per Petri plate) grow vertically along the surface of an agar gel. A probabilistic image analysis algorithm was used to locate the roots of the seedlings, and a normalized gray-scale variance measure was used to achieve focus by servoing along the optical axis. Rotation of the sample holder induced a gravitropic bending response in the roots, which are approximately 45 μm wide and several millimeters in length. The custom hardware and software described here accurately quantified the gravitropic responses of the seedlings in parallel at approximately 3-min intervals over an 8-h period. Here we present an overview of our system and describe some of the capabilities and challenges involved in automating plant phenotype studies.
Keywords: High throughput · Visual servoing · Focusing · Plant phenotyping · Image segmentation
This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.
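The autofocus described in the abstract servos the camera along the optical axis while maximizing a normalized gray-scale variance measure. As a rough illustrative sketch only (not the authors' implementation, whose details appear in the full text), this metric divides the intensity variance of the image by its mean intensity, which makes the score less sensitive to overall illumination changes than raw variance:

```python
import numpy as np

def normalized_variance(image):
    """Normalized gray-scale variance focus measure.

    Computes the variance of pixel intensities divided by the mean
    intensity. Sharper (better focused) images of a textured scene
    yield higher scores; a uniform image scores zero.
    """
    img = np.asarray(image, dtype=np.float64)
    mu = img.mean()
    if mu <= 0.0:
        return 0.0
    return img.var() / mu
```

In an axial focus servo, this score would be evaluated at successive camera positions and the stage driven toward the position with the maximum score; a defocused frame (e.g. a box-blurred copy of a sharp frame) scores strictly lower than the sharp original.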