Multinational automotive manufacturer Hyundai is currently testing an artificial intelligence-based technology designed to aid hearing-impaired drivers.
Software and hardware convert sounds into visual and tactile cues that drivers can perceive. Sound patterns are analyzed using AI and routed to two systems: Audio-Visual Conversion (AVC) and Audio-Tactile Conversion (ATC). The first system, AVC, converts sounds into pictograms on the car's head-up display, while ATC transforms sounds into physical cues, such as vibrations sent through the steering wheel.
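The routing described above can be sketched in code. This is a minimal illustration, not Hyundai's actual implementation: the sound classes, pictogram names, and vibration patterns below are all hypothetical, standing in for the output of an AI sound classifier feeding the AVC and ATC systems.

```python
# Illustrative sketch only: route a classified sound event to a visual
# cue (AVC pictogram) and a tactile cue (ATC vibration pattern).
# All class names and mappings here are hypothetical.

# Hypothetical mapping from sound classes to head-up-display pictograms (AVC)
PICTOGRAMS = {
    "siren": "EMERGENCY_VEHICLE",
    "horn": "NEARBY_HORN",
    "tire_screech": "SUDDEN_BRAKING",
}

# Hypothetical mapping to steering-wheel vibration patterns (ATC):
# (pulse count, intensity from 0.0 to 1.0)
VIBRATIONS = {
    "siren": (3, 0.9),
    "horn": (2, 0.6),
    "tire_screech": (4, 1.0),
}

def route_sound_event(sound_class):
    """Return the AVC pictogram and ATC vibration for a classified sound."""
    pictogram = PICTOGRAMS.get(sound_class, "UNKNOWN_SOUND")
    pulses, intensity = VIBRATIONS.get(sound_class, (1, 0.3))
    return {"avc": pictogram, "atc": {"pulses": pulses, "intensity": intensity}}

if __name__ == "__main__":
    event = route_sound_event("siren")
    print(event["avc"], event["atc"]["pulses"])
```

In a real vehicle the classifier would run continuously on microphone input, and the ATC output would drive haptic actuators in the steering wheel rather than return a dictionary.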
The first road test with a hearing-impaired taxi driver has been running in Seoul since December 2018. Hyundai is adapting the technology to help drivers overcome an impairment, expanding the role of its systems from complementing the driving experience to enhancing and promoting driver safety.
How can automotive manufacturing companies benefit from AI?
How is technology solving problems in the automotive industry?
How does AI improve driver safety?