Gesture-based control is already familiar from consumer electronics and has been making inroads into vehicles for some time.
Continental is now developing an innovation project that, for the first time, confines the gesture detection zone to the steering wheel. This is made possible by a time-of-flight sensor integrated into the instrument cluster. By narrowing the detection zone in this way, the solution minimizes driver distraction and advances the development of a holistic human-machine interface.
Whereas previous gesture-based control systems located around the center console required drivers to take a hand off the steering wheel or their eyes off the road, the field of action in Continental's solution is much more tightly focused.
“With gestures in a clearly defined area on the steering wheel, we can minimize distraction and increase safety. This narrowing down also prevents the driver from unintentionally starting gesture-based control with their usual everyday gestures and thus making unwanted selections,” explains Ralf Lenninger, head of Strategy, System Development, and Innovation in Continental’s Interior division.
The new operating concept integrates seamlessly into the holistic human-machine interface and can replace other elements such as buttons or even touch-sensitive surfaces on the steering wheel. Instead, it uses two transparent plastic panels, containing no electronic components, behind the steering wheel, which the driver can operate with their thumbs, almost like a touchpad.
As a result, drivers benefit from intuitive operation, while vehicle manufacturers benefit from optimized system costs for innovative operating concepts. The clear design of the panels is compatible with almost any control geometry, and new gestures can be added at any time. In addition, the variable complexity means the system can be integrated into many different vehicle classes, not just the luxury segment.
The time-of-flight sensor detects the motion of the hand and converts it into actions. The driver can navigate through the menus by swiping up or down and confirm a selection with a brief tapping motion. Touch-free operation is also possible for other functions: for example, by moving their fingers up and down in a uniform motion while keeping both hands on the steering wheel, the driver can accept or reject calls.
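Continental has not published its software interfaces, so the following sketch is purely illustrative: it shows how recognized gestures might be mapped to the actions described above using a simple dispatch table. All gesture labels and callback functions are hypothetical.

```python
from enum import Enum, auto

class Gesture(Enum):
    # Hypothetical labels for the thumb and finger motions described in the article
    SWIPE_UP = auto()      # swipe up through the menu
    SWIPE_DOWN = auto()    # swipe down through the menu
    TAP = auto()           # brief tapping motion to confirm a selection
    FINGER_WAVE = auto()   # uniform up/down finger motion with hands on the wheel

def navigate_menu(step: int) -> None:
    print(f"menu moved by {step}")

def confirm_selection() -> None:
    print("selection confirmed")

def toggle_call() -> None:
    print("incoming call accepted or rejected")

# Dispatch table: each recognized gesture triggers exactly one HMI action.
ACTIONS = {
    Gesture.SWIPE_UP: lambda: navigate_menu(-1),
    Gesture.SWIPE_DOWN: lambda: navigate_menu(+1),
    Gesture.TAP: confirm_selection,
    Gesture.FINGER_WAVE: toggle_call,
}

def handle_gesture(gesture: Gesture) -> None:
    action = ACTIONS.get(gesture)
    if action is not None:
        action()

handle_gesture(Gesture.SWIPE_DOWN)  # example: scroll one menu entry down
```

A table-driven mapping like this also reflects the article's point that new gestures can be added at any time: extending the system amounts to adding an entry, not redesigning the control logic.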
“These gestures are intuitive for the driver and are very closely based on the familiar operating methods of smartphones and other smart devices due to the transparent gesture panels. This simplifies the dialog between driver and vehicle, even for more complex applications, and driver distraction is minimized as well,” adds Lenninger.
A gesture is typically a movement linked to a specific function. Thanks to the time-of-flight sensor integrated into the instrument cluster, Continental's development achieves a high gesture-recognition rate. The sensor is a 3D camera system with an integrated image sensor that converts the reflected infrared signal into a three-dimensional image. As a result, the driver's hand positions and gestures are detected with millimeter precision and converted into actions.
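The article does not disclose Continental's algorithms, but the general principle of a time-of-flight pipeline can be sketched as follows: the camera delivers a depth image, the hand is segmented within the detection zone near the steering wheel, and its motion across successive frames is classified as a gesture. The frame size, zone boundaries, thresholds, and function names below are assumptions chosen for illustration only.

```python
import numpy as np

# A time-of-flight camera returns a depth map: one distance value (in mm) per pixel.
# Frame size, detection zone, and thresholds here are illustrative, not Continental's values.
FRAME_SHAPE = (240, 320)
ZONE = (slice(80, 200), slice(100, 260))   # assumed region of interest around the wheel
HAND_MAX_DEPTH_MM = 600                    # ignore anything farther than ~60 cm

def hand_centroid(depth_frame: np.ndarray):
    """Return the (row, col) centroid of the closest object inside the zone, if any."""
    roi = depth_frame[ZONE]
    mask = (roi > 0) & (roi < HAND_MAX_DEPTH_MM)
    if mask.sum() < 50:            # too few pixels: no hand in the detection zone
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def classify_motion(track):
    """Very rough classifier based on the vertical displacement of the centroid."""
    if len(track) < 2:
        return "none"
    dy = track[-1][0] - track[0][0]
    if dy < -15:
        return "swipe_up"
    if dy > 15:
        return "swipe_down"
    return "tap"

# Example with synthetic frames: an object drifting downward through the zone.
frames = []
for step in range(5):
    f = np.full(FRAME_SHAPE, 2000, dtype=np.int32)         # background ~2 m away
    f[100 + 10 * step:120 + 10 * step, 150:180] = 400      # "hand" at 40 cm, moving down
    frames.append(f)

track = [c for c in (hand_centroid(f) for f in frames) if c is not None]
print(classify_motion(track))   # -> "swipe_down"
```

A production system would of course use a far more robust model over the full 3D point cloud, but the sketch shows why restricting detection to a zone on the steering wheel helps: everything outside that region is simply ignored, so everyday hand movements elsewhere cannot trigger an action.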
The system currently detects four different gestures: setting the navigation, browsing through apps and starting music, answering calls, and controlling the on-board computer. Initial feedback from test users confirms the choice of these gestures; in particular, they welcomed the proximity to the steering wheel, the operation with the thumb, and the intuitive learnability of the gestures.
“The development of a holistic human-machine interface is crucial for further strengthening the driver’s confidence in their vehicle. Building up this confidence, combined with an intuitive dialog between driver and vehicle, is yet another important step on the road to automated driving, one that we are supporting with gesture-based control on the steering wheel,” Ralf Lenninger summarizes.