According to a new study published in the Journal of the ACM, apps will soon be able to detect the mode of transport a commuter is using and intelligently provide relevant information, tips and advice. The researchers say that machine learning techniques developed in a global competition could enable smartphones to predict road conditions and traffic levels, provide efficient route and parking recommendations, and even detect whether the smartphone user has consumed food or drink while on the move.
Daniel Roggen, author of the study and Professor at the University of Sussex, says the data set is unique in the richness of its sensor data and the quality of its annotations. While earlier research collected only motion and GPS data, this study has a wider scope, comprising all the sensor modalities a smartphone offers. It additionally considers four locations where users typically carry their phones: back pocket, hand, handbag and backpack.
The team led by Roggen gathered the equivalent of over 117 days' worth of data, monitoring various facets of the test commuters' travel in the UK across different modes of transport. It is slated to be the largest publicly available data set of its kind.
The team also held a global competition challenging entrants to develop the most accurate algorithms for identifying eight different transport modes (such as walking, sitting still, cycling, running, or taking the subway, car or bus). The data, collected from 15 different sensors, measures everything from ambient pressure to movement.
The winning team achieved a score of 93.9 percent by combining classical machine learning with deep learning.
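To give a flavour of how sensor readings can be turned into a transport-mode prediction, here is a minimal sketch of a classical machine learning baseline. Everything in it is an assumption for illustration: the synthetic feature windows, the mode list (the article names seven of the eight modes), and the random-forest model are not the winning team's pipeline or the study's actual data.

```python
# Illustrative sketch only: a classical-ML baseline for transport-mode
# detection from motion-sensor features. The features, labels, and model
# are hypothetical stand-ins, not the competition-winning method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Seven modes named in the article plus "train" as an assumed eighth.
MODES = ["still", "walk", "run", "bike", "car", "bus", "subway", "train"]

def synthetic_windows(n_per_mode=60):
    """Generate toy feature vectors per sensor window: mean, std and
    peak of acceleration magnitude, plus a pressure-change feature."""
    X, y = [], []
    for label, _mode in enumerate(MODES):
        # Hypothetical per-mode signatures, spaced so the classifier
        # has something separable to learn.
        accel_mean = 9.8 + 0.3 * label
        accel_std = 0.1 + 0.25 * label
        for _ in range(n_per_mode):
            X.append([
                rng.normal(accel_mean, 0.05),          # mean |accel|
                abs(rng.normal(accel_std, 0.05)),      # std of |accel|
                rng.normal(accel_mean + 2 * accel_std, 0.1),  # peak
                rng.normal(0.01 * label, 0.005),       # pressure drift
            ])
            y.append(label)
    return np.array(X), np.array(y)

X, y = synthetic_windows()
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

In practice, a pipeline like the winning team's would extract such window-level features from real sensor streams and combine a classical model with a deep network, rather than training on synthetic signatures as above.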