Winning the Sussex-Huawei Locomotion-Transportation Recognition Challenge

Part of the Springer Series in Adaptive Environments book series (SPSADENV)


The Sussex-Huawei Locomotion-Transportation Recognition Challenge presented a unique opportunity for the activity-recognition community to test its approaches on a large, real-life benchmark dataset with activities different from those typically recognized. The goal of the challenge was to recognize eight locomotion activities (Still, Walk, Run, Bike, Car, Bus, Train, Subway). This chapter describes the submissions that won first and second place. Both start with data preprocessing, including a normalization of the phone orientation. A wide set of hand-crafted features in both the time and frequency domains is then computed and evaluated for quality. The second-place submission feeds the best features into an XGBoost machine-learning model with optimized hyper-parameters, achieving an accuracy of 90.2%. The first-place submission builds an ensemble of models, including deep-learning models, and finally refines the ensemble's predictions by smoothing with a Hidden Markov model. Its accuracy on an internal test set was 96.0%.
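The pipeline outlined above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the window shape, the 100 Hz sampling rate, the particular features, and the "sticky" HMM transition model are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 3-axis accelerometer windows (n_windows, samples, axes);
# the shapes and 100 Hz rate are illustrative, not the challenge's setup.
windows = rng.normal(0.0, 1.0, size=(4, 500, 3))

def orientation_invariant_magnitude(w):
    # The Euclidean norm over the axes does not depend on phone orientation,
    # which is one simple way to normalize it away.
    return np.linalg.norm(w, axis=-1)

def handcrafted_features(signal, fs=100):
    # A few typical time- and frequency-domain features per window.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return {
        "mean": float(signal.mean()),
        "std": float(signal.std()),
        "energy": float(np.mean(signal ** 2)),
        "dominant_freq": float(freqs[1:][np.argmax(spectrum[1:])]),  # skip DC
    }

def hmm_smooth(probs, stay=0.9):
    # Viterbi decoding of per-window class probabilities with a simple
    # "sticky" transition matrix: a window keeps the previous window's
    # class with probability `stay`.
    n, k = probs.shape
    trans = np.full((k, k), (1.0 - stay) / (k - 1))
    np.fill_diagonal(trans, stay)
    log_p = np.log(probs + 1e-12)
    log_t = np.log(trans)
    score = log_p[0].copy()
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        cand = score[:, None] + log_t            # cand[i, j]: from i to j
        back[t] = np.argmax(cand, axis=0)
        score = cand[back[t], np.arange(k)] + log_p[t]
    path = [int(np.argmax(score))]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

feats = [handcrafted_features(m) for m in orientation_invariant_magnitude(windows)]
# A lone misclassified window ([0.4, 0.6]) is corrected by the smoothing.
probs = np.array([[0.9, 0.1], [0.9, 0.1], [0.4, 0.6], [0.9, 0.1], [0.9, 0.1]])
print(hmm_smooth(probs))  # -> [0, 0, 0, 0, 0]
```

In a full pipeline the feature dictionaries would be fed to a classifier such as XGBoost, and `hmm_smooth` would be applied to that classifier's predicted class probabilities; the sticky transition matrix encodes the observation that locomotion modes rarely change between adjacent windows.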


Keywords: Activity recognition · Machine learning · Deep learning · Ensembles · HMM · Competition



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Intelligent Systems, Jožef Stefan Institute, Ljubljana, Slovenia
  2. Jožef Stefan Postgraduate School, Ljubljana, Slovenia
