
Hand Monitoring and Gesture Recognition for Human-Car Interaction

Gesture-based human-computer interaction is a well-established application field for computer vision algorithms. In particular, we are studying its exploitation in the automotive domain. Our main goal is the development of a hand gesture-based interaction with car devices in which the hands are kept on the steering wheel.

In order to detect and classify a gesture, it is first necessary to locate the hands in a given image.
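As a minimal illustration of this hand-localization step (not the method developed in this project), the sketch below uses the off-the-shelf MediaPipe Hands detector to obtain a rough bounding box for each hand in a frame; the image path and the confidence threshold are placeholder assumptions.

```python
# Minimal hand-localization sketch with the off-the-shelf MediaPipe Hands
# detector. Only an illustration of the "find the hands first" step, not the
# approach developed in this research project.
import cv2
import mediapipe as mp

def detect_hand_boxes(image_bgr, min_conf=0.5):
    """Return a list of (x_min, y_min, x_max, y_max) pixel boxes, one per detected hand."""
    h, w = image_bgr.shape[:2]
    image_rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)
    with mp.solutions.hands.Hands(static_image_mode=True,
                                  max_num_hands=2,
                                  min_detection_confidence=min_conf) as hands:
        results = hands.process(image_rgb)
    boxes = []
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            xs = [lm.x * w for lm in hand.landmark]
            ys = [lm.y * h for lm in hand.landmark]
            boxes.append((int(min(xs)), int(min(ys)), int(max(xs)), int(max(ys))))
    return boxes

if __name__ == "__main__":
    frame = cv2.imread("driver_frame.jpg")  # placeholder image path
    print(detect_hand_boxes(frame))
```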


Figure: human-car interaction

Hand detection and tracking have been widely studied in the computer vision community. In particular, hand motion analysis is an interesting topic in the automotive context, since the hands are a crucial body part for studying motion, driver behavior, and the interaction between the human, the car, and the environment.
Besides, the rising adoption of vision-based systems that look both outside and inside the car provides a huge amount of data and new ways to acquire images during the driving activity.

An important aspect is that distracted driving plays a crucial role in road crashes. Driving distraction is usually divided into three categories:

  1. Manual distraction: the driver's hands are not on the wheel;
  2. Visual distraction: the driver's eyes are not looking at the road;
  3. Cognitive distraction: the driver's attention is not focused on the driving activity.

In this research project, we focused on the first one. We believe the hand position is a key element to monitor driver attention and behavior: according to the National Highway Traffic Safety Administration (NHTSA), which defines driving distraction as "an activity that could divert a person's attention away from the primary task of driving", we assume that the hands have to grasp the steering wheel steadily during the whole driving activity, apart from short and well time-delimited events such as shifting gears, adjusting the rear-view mirror, and so on.
For example, the smartphone is one of the major causes of fatal driving crashes: it accounts for about 18% of fatal driver accidents in North America and involves all three distraction categories mentioned above. Besides, drivers today are increasingly engaged in secondary tasks behind the wheel.
For these reasons, we aim to propose a method that is able to understand whether the driver's hands are on or next to the wheel. To do this, we place an infrared camera (Leap Motion) in a specific position inside the car.
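As a rough sketch of how such a "hands on / off the wheel" decision could be framed, the following PyTorch snippet defines a minimal binary classifier over single-channel infrared frames. The network, the 128x128 input size, and the class labels are illustrative assumptions and do not reflect the architecture used in the published work.

```python
# Minimal sketch: binary "hands on the wheel" vs "hands off the wheel"
# classifier over single-channel infrared frames. The architecture, the
# 1x128x128 input size, and the labels are illustrative assumptions, not the
# model used in this research project.
import torch
import torch.nn as nn

class HandsOnWheelNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, 2),  # 0 = hands off the wheel, 1 = hands on the wheel
        )

    def forward(self, x):
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = HandsOnWheelNet()
    ir_frame = torch.rand(1, 1, 128, 128)          # placeholder infrared frame
    probs = torch.softmax(model(ir_frame), dim=1)  # per-class probabilities
    print("P(hands on wheel) =", probs[0, 1].item())
```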

The Turms dataset is available for download

(published in HBU 2018 - FG 2018)


Publications

1. Borghi, Guido; Frigieri, Elia; Vezzani, Roberto; Cucchiara, Rita, "Hands on the wheel: a Dataset for Driver Hand Detection and Tracking", Proceedings of the 8th International Workshop on Human Behavior Understanding (HBU), Xi'An, 15 May 2018.

