Hand Monitoring and Gesture Recognition for Human-Car Interaction
Gesture-based human-computer interaction is a well-established field of application for computer vision algorithms. In particular, we are studying its exploitation in automotive applications. Our main goal is the development of hand gesture-based interaction with in-car devices while the hands are kept on the steering wheel.
In order to detect and classify a gesture, it is mandatory to find the hand position in a given image.
Hand detection and tracking have been widely studied in the computer vision community. Hand motion analysis is a particularly interesting topic in the automotive context, since the hands are a crucial body part for studying motion, driver behavior, and the interaction between the driver, the car, and the environment.
Moreover, the growing adoption of vision-based systems that look both outside and inside the car provides a huge amount of data and new ways to acquire images during the driving activity.
An important aspect is that distracted driving plays a crucial role in road crashes. Driving distractions are commonly grouped into three categories:
- Manual distraction: driver's hands are off the wheel;
- Visual distraction: driver's eyes are not looking at the road;
- Cognitive distraction: driver's attention is not focused on the driving activity.
In this research project, we focus on the first category. We believe the hand position is a key element in assessing driver attention and behavior: the National Highway Traffic Safety Administration (NHTSA) defines driving distraction as "an activity that could divert a person's attention away from the primary task of driving"; accordingly, we assume that the hands have to grasp the wheel steadily during the whole driving activity, apart from short, well-delimited events such as shifting gears, adjusting the rear-view mirror, and so on.
For instance, smartphone use is one of the leading causes of fatal driving crashes: it causes about 18% of fatal driver accidents in North America and involves all three distraction categories mentioned above. Moreover, drivers today are increasingly engaged in secondary tasks behind the wheel.
For these reasons, we aim to propose a method able to determine whether the driver's hands are on or near the steering wheel. To this end, we place an infrared camera (a Leap Motion device) in a specific position inside the car cabin.
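To make the hand-on-wheel criterion concrete, the following minimal sketch (our illustration, not the project's actual code) approximates the steering wheel as a circle in the sensor's 3D coordinate frame and checks whether a tracked palm position lies within a tolerance of the rim. All coordinates, radii, and thresholds are hypothetical assumptions.

```python
import math

# Hypothetical geometry: the wheel rim is modeled as a circle of radius
# WHEEL_RADIUS lying in the x-y plane, centered at WHEEL_CENTER (meters).
WHEEL_CENTER = (0.0, 0.0, 0.0)   # assumed wheel hub position
WHEEL_RADIUS = 0.19              # assumed distance from hub to rim
RIM_TOLERANCE = 0.05             # how far from the rim still counts as "on"

def hand_on_wheel(hand_pos, center=WHEEL_CENTER,
                  radius=WHEEL_RADIUS, tol=RIM_TOLERANCE):
    """Return True if a 3D palm position is within `tol` of the wheel rim."""
    dx = hand_pos[0] - center[0]
    dy = hand_pos[1] - center[1]
    dz = hand_pos[2] - center[2]
    # Distance from the wheel axis, projected onto the wheel plane
    planar = math.hypot(dx, dy)
    # Distance to the closest point on the rim circle (torus-style test)
    rim_dist = math.hypot(planar - radius, dz)
    return rim_dist <= tol

# A hand grasping the rim at the "3 o'clock" position
print(hand_on_wheel((0.19, 0.0, 0.0)))    # True
# A hand held ~30 cm in front of the wheel plane (e.g., holding a phone)
print(hand_on_wheel((0.10, 0.05, 0.30)))  # False
```

In practice, the palm position would come from the infrared sensor's hand tracking rather than being hard-coded, and per-frame decisions would be smoothed over time to tolerate short events such as gear shifts.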
Click here to download the Turms dataset (published in HBU 2018 - FG 2018)
Click here to download the Briareo dataset (published in ICIAP 2019)
[1] Manganaro, F.; Pini, S.; Borghi, G.; Vezzani, R.; Cucchiara, R., "Hand Gestures for the Human-Car Interaction: the Briareo dataset", Proceedings of the 20th International Conference on Image Analysis and Processing (ICIAP), Trento, Italy, September 9-13, 2019.
[2] Caputo, F. M.; Burato, S.; Pavan, G.; Voillemin, T.; Wannous, H.; Vandeborre, J. P.; Maghoumi, M.; Taranta, E. M.; Razmjoo, A.; LaViola Jr., J. J.; Manganaro, F.; Pini, S.; Borghi, G.; Vezzani, R.; Cucchiara, R.; Nguyen, H.; Tran, M. T.; Giachetti, A., "Online Gesture Recognition", Eurographics Workshop on 3D Object Retrieval, Genova, Italy, May 5-6, 2019. DOI: 10.2312/3dor.20191067
[3] Borghi, G.; Frigieri, E.; Vezzani, R.; Cucchiara, R., "Hands on the wheel: a Dataset for Driver Hand Detection and Tracking", Proceedings of the 8th International Workshop on Human Behavior Understanding (HBU), Xi'An, China, pp. 564-570, May 15, 2018.
[4] Borghi, G.; Vezzani, R.; Cucchiara, R., "Fast Gesture Recognition with Multiple Stream Discrete HMMs on 3D Skeletons", Proceedings of the 23rd International Conference on Pattern Recognition (ICPR), Cancun, Mexico, December 4-8, 2016.