Color invariant state estimator to predict the object trajectory and catch using dexterous multi-fingered delta robot architecture

Published in Multimedia Tools and Applications

Abstract

This paper proposes the design of a state estimator for tracking and predicting an object's trajectory for manipulation using a dexterous multi-fingered Delta robot. Observations of the object state are acquired in real time from Basler cameras. First, background pixels are removed using a Mixture of Gaussians algorithm. Second, a color-invariant approach is implemented by means of a Hough transform and used to track the object; this yields a color-invariant thresholding that filters the region of interest. Since successive frames contain some noise, morphological operations are also performed to remove any remaining outliers. After the noise is removed from a frame, the object center is estimated using k-means clustering, followed by velocity estimation. A Kalman predictor is then used to predict the future state(s) from the current state and the known system dynamics. The catching strategy for the object using the Delta-robot-based multi-fingered architecture is also discussed, and different trajectories and objects are used to evaluate the catching performance.
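
As a rough illustration of the estimation pipeline described in the abstract, the sketch below chains a Mixture-of-Gaussians background subtractor, morphological filtering, a circular Hough transform, k-means-based centre estimation, and a constant-velocity Kalman predictor using OpenCV in Python. All parameter values, the constant-velocity motion model, and the single-camera, image-plane formulation are assumptions made for illustration; they are not the authors' implementation.

```python
# A minimal, hypothetical sketch of the pipeline summarized in the abstract,
# written with OpenCV. Parameter values, the constant-velocity motion model,
# and the single-camera formulation are illustrative assumptions, not the
# authors' implementation.
import cv2
import numpy as np

# Mixture-of-Gaussians model used to discard background pixels.
bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

# Constant-velocity Kalman filter: state [x, y, vx, vy], measurement [x, y].
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)
kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)


def track_and_predict(frame):
    """Return the measured object centre and the predicted next centre."""
    # 1. Remove background pixels with the Mixture-of-Gaussians model.
    fg_mask = bg_subtractor.apply(frame)

    # 2. Morphological opening/closing to suppress outliers in the mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, kernel)
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_CLOSE, kernel)

    # 3. Colour-invariant localisation via a circular Hough transform on the
    #    masked grayscale image; if a circle is found, keep only that region
    #    of interest in the mask.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.bitwise_and(gray, gray, mask=fg_mask)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=30, minRadius=5, maxRadius=80)
    if circles is not None:
        x, y, r = circles[0, 0]
        roi = np.zeros_like(fg_mask)
        cv2.circle(roi, (int(x), int(y)), int(r), 255, -1)
        fg_mask = cv2.bitwise_and(fg_mask, roi)

    # 4. Object centre via k-means (k = 1) over foreground pixel coordinates.
    pts = np.column_stack(np.nonzero(fg_mask)[::-1]).astype(np.float32)  # (x, y)
    if len(pts) == 0:
        # No measurement available: fall back to the Kalman prediction alone.
        return None, kf.predict()[:2].ravel()
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
    _, _, centres = cv2.kmeans(pts, 1, None, criteria, 3,
                               cv2.KMEANS_RANDOM_CENTERS)
    cx, cy = centres[0]

    # 5. Kalman predict/correct; the corrected state carries the velocity
    #    estimate, and propagating it through the motion model once more
    #    gives the predicted next centre.
    kf.predict()
    state = kf.correct(np.array([[cx], [cy]], dtype=np.float32))
    next_xy = (kf.transitionMatrix @ state)[:2].ravel()
    return (float(cx), float(cy)), next_xy
```

In the paper's setting, the image-plane estimates from the Basler cameras would feed the 3D trajectory prediction and the catching strategy of the Delta-robot-based multi-fingered hand; the sketch above covers only the per-frame image-processing and prediction step.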

Author information

Corresponding author

Correspondence to Sachin Kansal.

Ethics declarations

This paper does not contain any studies with human or animal subjects, and all authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Kansal, S., Kumar, R. & Mukherjee, S. Color invariant state estimator to predict the object trajectory and catch using dexterous multi-fingered delta robot architecture. Multimed Tools Appl 80, 11865–11886 (2021). https://doi.org/10.1007/s11042-020-09937-9
