
Print registration for automated visual inspection of transparent pharmaceutical capsules

  • Original Paper
  • Published in Machine Vision and Applications

Abstract

This paper addresses the challenging problem of visual inspection of transparent pharmaceutical capsules, where print registration is used to determine the capsule’s print region. Determining the print region allows reliable detection of defects both on the print region and on the rest of the capsule’s surface. On transparent capsules, the print on the front and the print on the back of a capsule are concurrently visible. Moreover, the print on the back may be partially or entirely occluded by the powder inside the capsule. As a result, print registration methods designed for opaque capsules do not achieve adequate performance. In this paper, we present a novel registration method designed specifically for transparent capsules. The method uses a template matching technique with a new similarity measure that exploits the specific properties of transparent capsules to increase registration robustness. Additionally, we present a registration refinement step that reduces the effect of possible print deformations and image distortions. The performance of the method was evaluated in terms of robustness, accuracy and speed on large image sets of four different radial prints. The new method shows greatly improved robustness (>98.6 %) compared to a method based on normalized cross-correlation (>72 %) and a method based on feature matching (>80 %). Furthermore, the additional refinement step improves the registration accuracy. Although the execution time increases from 3 to 11 ms, it still meets the usual speed requirements.


References

  1. Berman, A.: Reducing medication errors through naming, labeling, and packaging. J. Med. Syst. 28, 9–29 (2004)

  2. FDA: 21 CFR 206, Imprinting of solid oral dosage form drug products for human use. Available at: http://www.accessdata.fda.gov/ (2015)

  3. Vasudevan, P., DelGianni, T., Robertson, W.O.: Avoiding medication mixups - Identifiable imprint codes. West. J. Med. 165, 352–354 (1996)

  4. Podczeck, F., Jones, B.E.: Pharmaceutical Capsules. Pharmaceutical Press, London (2004)

  5. Karloff, A.C., Scott, N.E., Muscedere, R.: A flexible design for a cost effective, high throughput inspection system for pharmaceutical capsules. In: IEEE International Conference on Industrial Technology (ICIT 2008), pp. 1–4 (2008)

  6. Bukovec, M., Špiclin, Ž., Pernuš, F., Likar, B.: Automated visual inspection of imprinted pharmaceutical tablets. Meas. Sci. Technol. 18, 2921–2930 (2007)

  7. Islam, M.J., Ahmadi, M., Sid-Ahmed, M.A.: Image processing techniques for quality inspection of gelatin capsules in pharmaceutical applications. In: 10th International Conference on Control, Automation, Robotics and Vision (ICARCV 2008), pp. 862–867 (2008)

  8. Špiclin, Ž., Likar, B., Pernuš, F.: Real-time print localization on pharmaceutical capsules for automatic visual inspection. In: 2010 IEEE International Conference on Industrial Technology (ICIT), pp. 279–284 (2010)

  9. Tsai, D.M., Lin, C.T.: Fast normalized cross correlation for defect detection. Pattern Recognit. Lett. 24, 2625–2631 (2003)

  10. Grosso, E., Lagorio, A., Tistarelli, M.: Automated quality control of printed flasks and bottles. Mach. Vis. Appl. 22, 269–281 (2011)

  11. Edwards, D.: Applications of capsule dosing techniques for use in dry powder inhalers. Ther. Deliv. 1, 195–201 (2010)

  12. Možina, M., Tomaževič, D., Pernuš, F., Likar, B.: Real-time image segmentation for visual inspection of pharmaceutical tablets. Mach. Vis. Appl. 22, 145–156 (2011)

  13. Derganc, J., Likar, B., Bernard, R., Tomaževič, D., Pernuš, F.: Real-time automated visual inspection of color tablets in pharmaceutical blisters. Real-Time Imaging 9, 113–124 (2003)

  14. Cheng, Y.: Mean shift, mode seeking, and clustering. IEEE Trans. Pattern Anal. Mach. Intell. 17, 790–799 (1995)

  15. Kuglin, C., Hines, D.: The phase correlation image alignment method. In: Proceedings of the IEEE International Conference on Cybernetics and Society, pp. 163–165 (1975)

  16. Bracewell, R.N.: The Fourier Transform and Its Applications. McGraw-Hill Higher Education, New York (2000)

  17. Marquardt, D.: An algorithm for least-squares estimation of nonlinear parameters. J. Soc. Ind. Appl. Math. 11, 431–441 (1963)

  18. Druckmüller, M.: Phase correlation method for the alignment of total Solar eclipse images. Astrophys. J. 706, 1605 (2009)

  19. Padfield, D.: Masked object registration in the Fourier domain. IEEE Trans. Image Process. 21, 2706–2718 (2012)

  20. Lowe, D.G.: Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 60, 91–110 (2004)

  21. Rosten, E., Drummond, T.: Machine learning for high-speed corner detection. In: Leonardis, A., Bischof, H., Pinz, A. (eds.) Computer Vision - ECCV 2006, pp. 430–443. Springer, Berlin (2006)

  22. Rosten, E., Drummond, T.: Fusing points and lines for high performance tracking. In: Tenth IEEE International Conference on Computer Vision (ICCV’05), vol. 2, pp. 1508–1515 (2005)

  23. Fischler, M.A., Bolles, R.C.: Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM. 24, 381–395 (1981)

  24. Lohmann, A.W., Weigelt, G., Wirnitzer, B.: Speckle masking in astronomy: triple correlation theory and applications. Appl. Opt. 22, 4028 (1983)


Acknowledgments

This work was supported by Sensum, Computer Vision Systems, and by the European Union, European Social Fund.

Author information

Corresponding author

Correspondence to Andraž Mehle.

Appendix: Efficient calculation of the similarity measures FO and BO in the Fourier domain

The calculation of \(\mathrm{FO}\) and \(\mathrm{BO}\) in the Fourier domain requires the following Fourier transforms:

$$F_{\mathrm{fg}}(\xi,\eta) = \mathcal{F}\{f_{\mathrm{fg}}(x,y)\} \tag{18}$$
$$H(\xi,\eta) = \mathcal{F}\{h(x,y)\} \tag{19}$$
$$M_{\mathrm{fg}}(\xi,\eta) = \mathcal{F}\{m_{\mathrm{fg}}(x,y)\} \tag{20}$$
$$H'(\xi,\eta) = \mathcal{F}\{1 - h(x,y)\} \tag{21}$$

The equation for \(\mathrm{FO}\) (6) can be rephrased as:

$$\mathrm{FO}(u,v) = \frac{\left[\sum_{x,y} f_{\mathrm{fg}}(x,y)\, h(x-u,\,y-v)\right]^{2}}{\sum_{x,y} m_{\mathrm{fg}}(x,y)\, h(x-u,\,y-v)}. \tag{22}$$
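For reference, a direct per-shift evaluation of (22) can be sketched as below. This is only an illustration that makes the two sums explicit; the function name fo_direct, the [x, y] array layout, and the circular handling of shifts via np.roll (standing in for zero-padded shifts) are assumptions of this sketch, not part of the paper.

```python
import numpy as np

def fo_direct(f_fg, m_fg, h, u, v):
    """Direct evaluation of FO(u, v) from eq. (22) for a single shift (u, v).

    f_fg : foreground (masked) input image, array indexed as [x, y]
    m_fg : foreground mask, same shape
    h    : print template, same shape; np.roll gives circular shifts
    """
    h_shift = np.roll(np.roll(h, u, axis=0), v, axis=1)  # h(x-u, y-v)
    num = np.sum(f_fg * h_shift) ** 2                    # squared foreground overlap
    den = np.sum(m_fg * h_shift)                         # normalization by mask overlap
    return num / den if den > 0 else 0.0
```

Evaluating (22) this way for every shift is quadratic in the number of shifts, which motivates the Fourier-domain formulation below.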

Each individual sum in \(\mathrm{FO}\) (6), (22) can be translated to the Fourier domain using the Cross-Correlation Theorem:

$$c(x,y) = (a \star b)(x,y) \;\Rightarrow\; C(\xi,\eta) = A^{*}(\xi,\eta) \cdot B(\xi,\eta), \tag{23}$$

where \(\star\) designates the cross-correlation operator. Applying (23) to the sums in the numerator and denominator of (22) yields:

$$\mathrm{FO}(u,v) = \frac{\left[\mathcal{F}^{-1}\{F_{\mathrm{fg}}(\xi,\eta) \cdot H^{*}(\xi,\eta)\}(u,v)\right]^{2}}{\mathcal{F}^{-1}\{M_{\mathrm{fg}}(\xi,\eta) \cdot H^{*}(\xi,\eta)\}(u,v)} \tag{24}$$
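A minimal NumPy sketch of the Fourier-domain evaluation (24) follows. The names fo_map and eps are illustrative; arrays are assumed to be indexed as [x, y] and zero-padded to a common size, and since np.fft computes circular correlations, adequate padding is assumed as well.

```python
import numpy as np

def fo_map(f_fg, m_fg, h, eps=1e-12):
    """FO(u, v) for all shifts at once via the Cross-Correlation Theorem, eq. (24)."""
    F_fg = np.fft.fft2(f_fg)   # eq. (18)
    H    = np.fft.fft2(h)      # eq. (19)
    M_fg = np.fft.fft2(m_fg)   # eq. (20)

    # eq. (23): a circular cross-correlation is the inverse FFT of a conjugate product
    num = np.real(np.fft.ifft2(F_fg * np.conj(H)))   # sum_{x,y} f_fg(x,y) h(x-u, y-v)
    den = np.real(np.fft.ifft2(M_fg * np.conj(H)))   # sum_{x,y} m_fg(x,y) h(x-u, y-v)
    return num ** 2 / (den + eps)                    # eps guards shifts with no mask overlap
```

Up to the boundary handling, fo_map agrees with the per-shift fo_direct above while evaluating all \(M\times N\) shifts in \(O(MN\log MN)\) operations.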

To reduce the total number of Fourier transforms required, the equation for \(\mathrm{BO}\) can be expressed with the images \(f_{\mathrm{fg}}\), \(h\), and \(m_{\mathrm{fg}}\) already used in the definition of \(\mathrm{FO}\). This is possible because the extended input image \(f(x,y)\) is horizontally symmetric (\(f^{\mathrm{F}}(x,y)=f(x,y)\)), the flipped background mask equals the foreground mask (\(m_{\mathrm{bg}}^{\mathrm{F}}(x,y)=m_{\mathrm{fg}}(x,y)\)), and because of the theorems (25)–(27) that hold for flipping an arbitrary image \(i(x,y)\). Following the derivations (28)–(30), the equation for \(\mathrm{BO}\) can be rephrased as shown in (31).

Using the Cross-Correlation Theorem (23), the denominator of \(\mathrm{BO}\) is easily translated to the Fourier domain (32). In contrast, the numerator of \(\mathrm{BO}\) (\(\mathrm{BO}_{\mathrm{num}}\)) is a special case of the cross triple correlation [24] of the signals \(f_{\mathrm{fg}}(x,y)\), \((1-h(x-u,y-v))^{2}\), and \(h^{\mathrm{F}}(x,y,u,v)\); thus, it cannot be translated to the Fourier domain using the Cross-Correlation Theorem (23) alone. Because \(\mathrm{BO}_{\mathrm{num}}\) (31) includes shifts in three different directions (\(u\), \(v\) and \(-u\)), its calculation in the Fourier domain would require a 3D Fourier transform of dimensions \(M\times N\times M\) (\(M\): image width, \(N\): image height), i.e., \(O(M^{2}N\log _2 M^{2}N)\) multiplications. It is therefore more efficient to calculate it partially in the time domain. Since the horizontal flip in \(h^{\mathrm{F}}\) affects only the sums in the \(x\) direction, the correlation in the \(y\) direction can be calculated with the 1D Cross-Correlation Theorem for each pixel column at each horizontal shift \(u\), while the sums over \(x\) are calculated in the time domain (33). Here \(\mathcal{F}_y\) denotes a column-wise 1D Fourier transform in the \(y\) direction. With this formulation, the computational complexity is reduced to \(O(M^{2}N\log _2 N)\). Note that the 1D Fourier transform \(\mathcal{F}_y\{(1-h(x-u,y))^{2}\,h^{\mathrm{F}}(x+u,y)\}\) can be pre-calculated offline for all possible shifts \(u\).

$$\sum_{x,y} i(x,y) = \sum_{x,y} i^{\mathrm{F}}(x,y) \tag{25}$$
$$\left(i_1(x,y) \cdot i_2(x,y)\right)^{\mathrm{F}} = i_1^{\mathrm{F}}(x,y) \cdot i_2^{\mathrm{F}}(x,y) \tag{26}$$
$$\left(i_1(x,y) + i_2(x,y)\right)^{\mathrm{F}} = i_1^{\mathrm{F}}(x,y) + i_2^{\mathrm{F}}(x,y) \tag{27}$$
$$f^{\mathrm{F}}(x,y)=f(x,y),\ m_{\mathrm{bg}}^{\mathrm{F}}(x,y)=m_{\mathrm{fg}}(x,y) \;\Rightarrow\; \left(f(x,y)\,m_{\mathrm{bg}}(x,y)\right)^{\mathrm{F}} = f(x,y)\,m_{\mathrm{fg}}(x,y) = f_{\mathrm{fg}}(x,y) \tag{28}$$
$$f_{\mathrm{bg}}^{\mathrm{F}}(x,y) = \left(f(x,y)\,m_{\mathrm{bg}}(x,y)\left(1-h^{\mathrm{F}}(x-u,\,y-v)\right)\right)^{\mathrm{F}} = f_{\mathrm{fg}}(x,y)\left(1-h(x-u,\,y-v)\right) \tag{29}$$
$$\begin{aligned} h_{\mathrm{bg}}^{\mathrm{F}}(x,y,u,v) &= \left(h(x-u,\,y-v)\left(1-h^{\mathrm{F}}(x,y,u,v)\right) m_{\mathrm{bg}}(x,y)\right)^{\mathrm{F}} \\ &= h^{\mathrm{F}}(x,y,u,v)\left(1-h(x-u,\,y-v)\right) m_{\mathrm{fg}}(x,y) \end{aligned} \tag{30}$$
$$\begin{aligned} \mathrm{BO}(u,v) &= \frac{\left[\sum_{x,y} f_{\mathrm{bg}}^{\mathrm{F}}(x,y)\, h_{\mathrm{bg}}^{\mathrm{F}}(x,y,u,v)\right]^{2}}{\sum_{x,y} f_{\mathrm{bg}}^{\mathrm{F}}(x,y)} \\ &= \frac{\left[\sum_{x,y} f_{\mathrm{fg}}(x,y)\left(1-h(x-u,\,y-v)\right)^{2} h^{\mathrm{F}}(x,y,u,v)\right]^{2}}{\sum_{x,y} f_{\mathrm{fg}}(x,y)\left(1-h(x-u,\,y-v)\right)} \\ &= \frac{\left[\mathrm{BO}_{\mathrm{num}}(u,v)\right]^{2}}{\mathrm{BO}_{\mathrm{den}}(u,v)} \end{aligned} \tag{31}$$
$$\begin{aligned} \mathrm{BO}_{\mathrm{den}}(u,v) &= \sum_{x,y} f_{\mathrm{fg}}(x,y)\left(1-h(x-u,\,y-v)\right) \\ &= \mathcal{F}^{-1}\left\{ F_{\mathrm{fg}}(\xi,\eta) \cdot H'^{*}(\xi,\eta) \right\}(u,v) \end{aligned} \tag{32}$$
$$\begin{aligned} \mathrm{BO}_{\mathrm{num}}(u,v) &= \sum_{x,y} f_{\mathrm{fg}}(x,y)\left(1-h(x-u,\,y-v)\right)^{2} h^{\mathrm{F}}(x,y,u,v) \\ &= \sum_{x} \mathcal{F}_y^{-1}\Big\{ \mathcal{F}_y\{f_{\mathrm{fg}}(x,y)\} \cdot \mathcal{F}_y\big\{\left(1-h(x-u,y)\right)^{2} h^{\mathrm{F}}(x+u,y)\big\} \Big\}(u,v) \end{aligned} \tag{33}$$
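The column-wise evaluation of (33) might be sketched as follows. The [x, y] array layout, the circular np.roll shifts (standing in for zero-padded shifts), and the placement of the conjugate (taken from the 1D form of the Cross-Correlation Theorem (23)) are assumptions of this illustration; h_flip denotes the horizontally flipped template \(h^{\mathrm{F}}\), and the function name bo_num_map is hypothetical.

```python
import numpy as np

def bo_num_map(f_fg, h, h_flip):
    """BO_num(u, v) of eq. (31), evaluated column-wise as in eq. (33).

    For each horizontal shift u the factor q_u(x, y) = (1 - h(x-u, y))^2 * h_flip(x+u, y)
    is formed (it can be precomputed offline for every u); the correlation along y is
    done with per-column 1-D FFTs and the result is summed over the columns x.
    """
    M, N = f_fg.shape                          # arrays indexed as [x, y]
    F_cols = np.fft.fft(f_fg, axis=1)          # F_y{ f_fg(x, y) }, one 1-D FFT per column
    bo_num = np.zeros((M, N))
    for u in range(M):                         # horizontal shifts handled in the time domain
        q_u = (1.0 - np.roll(h, u, axis=0)) ** 2 * np.roll(h_flip, -u, axis=0)
        Q_cols = np.fft.fft(q_u, axis=1)       # may be precomputed offline for all u
        # 1-D Cross-Correlation Theorem along y, then sum over x
        corr = np.real(np.fft.ifft(F_cols * np.conj(Q_cols), axis=1))
        bo_num[u, :] = corr.sum(axis=0)
    return bo_num
```

\(\mathrm{BO}_{\mathrm{den}}\) from (32) is obtained exactly like the denominator of \(\mathrm{FO}\), with \(H\) replaced by the transform \(H'\) of the complemented template; \(\mathrm{BO}\) is then \(\mathrm{BO}_{\mathrm{num}}^{2}/\mathrm{BO}_{\mathrm{den}}\).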

Cite this article

Mehle, A., Bukovec, M., Likar, B. et al. Print registration for automated visual inspection of transparent pharmaceutical capsules. Machine Vision and Applications 27, 1087–1102 (2016). https://doi.org/10.1007/s00138-016-0797-z
