
Speed-Bump Detection for Autonomous Vehicles by Lidar and Camera

  • Original Article
  • Published:
Journal of Electrical Engineering & Technology

Abstract

In this paper, we propose a speed-bump detection method for autonomous vehicles using a camera and light detection and ranging (lidar). A speed bump can cause an impact on a vehicle if the vehicle does not slow down before passing over it. To prevent this, the speed bump must be detected and its position determined in advance. In this study, we use a camera and lidar to detect and locate speed bumps, and two detectors are used to extract and verify speed-bump candidates. The method first extracts speed-bump candidate regions based on their image pattern. The speed bump is then detected within each candidate region using the image pattern together with lidar distance information. The output includes the speed-bump region, the classification result, and the speed-bump height. Experimental results show that the proposed method improves detection accuracy and better distinguishes speed bumps from pedestrian crossings, which have a similar pattern.
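As a rough illustration of this two-stage idea, the sketch below extracts candidate windows from a grayscale camera image by looking for a strong stripe-like intensity pattern and then verifies each candidate with lidar points projected into the image, accepting it as a speed bump only if the points inside the window rise above the road surface (a painted pedestrian crossing stays flat). The window format, thresholds, and the simple stripe test are illustrative assumptions, not the authors' implementation.

import numpy as np

# Hypothetical sketch of the two-stage pipeline described in the abstract:
# (1) extract speed-bump candidate regions from the camera image by their
#     stripe-like intensity pattern, and (2) verify each candidate with
#     lidar height information so that flat pedestrian crossings, which
#     show a similar pattern, are rejected.

def extract_candidates(gray, stride=32, stripe_contrast=40.0):
    """Return (row, col, height, width) windows whose column means alternate
    strongly between bright and dark, a crude proxy for a painted stripe."""
    rows, cols = gray.shape
    candidates = []
    for r in range(0, rows - stride + 1, stride):
        for c in range(0, cols - stride + 1, stride):
            window = gray[r:r + stride, c:c + stride].astype(np.float32)
            col_means = window.mean(axis=0)
            if np.abs(np.diff(col_means)).mean() > stripe_contrast:
                candidates.append((r, c, stride, stride))
    return candidates

def verify_with_lidar(candidate, lidar_uvz, road_z=0.0, min_height=0.05):
    """Verify a candidate with lidar points projected into the image.
    lidar_uvz: (N, 3) array of (image row, image col, height above road in m).
    Returns (is_speed_bump, estimated_height)."""
    r, c, h, w = candidate
    u, v, z = lidar_uvz[:, 0], lidar_uvz[:, 1], lidar_uvz[:, 2]
    inside = (u >= r) & (u < r + h) & (v >= c) & (v < c + w)
    if not inside.any():
        return False, 0.0
    # a pedestrian crossing stays flat; a speed bump rises above the road
    height = float(np.percentile(z[inside], 90) - road_z)
    return height > min_height, height

if __name__ == "__main__":
    # synthetic scene: a striped, slightly raised patch on a uniform road
    rng = np.random.default_rng(0)
    gray = np.full((128, 128), 90, dtype=np.uint8)
    gray[64:96, 32:96] = np.tile(np.repeat([40, 220], 2), 16)
    u = rng.uniform(0, 128, 300)
    v = rng.uniform(0, 128, 300)
    z = np.where((u >= 64) & (u < 96) & (v >= 32) & (v < 96), 0.10, 0.0)
    lidar = np.column_stack([u, v, z])
    for cand in extract_candidates(gray):
        print(cand, verify_with_lidar(cand, lidar))

In the paper, extraction and verification are performed by two separate detectors; the hand-tuned thresholds above merely stand in for that stage.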



Acknowledgements

This work was supported by an Institute for Information and Communications Technology Promotion (IITP) Grant funded by the Korean government (MSIT) (no. R7117-16-0164, Development of eight wide area driving environment awareness and cooperative driving technology, which are based on V2X wireless communication).

Author information


Corresponding author

Correspondence to Tae-Hyoung Park.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Yun, HS., Kim, TH. & Park, TH. Speed-Bump Detection for Autonomous Vehicles by Lidar and Camera. J. Electr. Eng. Technol. 14, 2155–2162 (2019). https://doi.org/10.1007/s42835-019-00225-7
