In Germany, more than six million people with impaired hearing currently go without hearing aids. One obstacle is the lack of options for customizing the devices to individual hearing needs. New, high-priced hearing aids already use artificial intelligence (AI) to recognize predefined listening environments such as conversation or music. However, their default sound settings are based on average values previously collected from users.
Objectives and procedure
The researchers of the AIHearS project are developing an AI-based algorithm for the sound adaptation of hearables (headphones, headsets or hearing aids) that users can train individually, according to their own preferences, in their everyday listening environments. The goal is for the system to find the best individual amplification setting for each hearing situation. In this way, hearing aids can be tailored to users' needs, increasing their acceptance among people over 60. To this end, the researchers are working with users to examine how automated sound adaptation affects the target group, collecting data repeatedly in laboratory and field tests: for example, on perceived autonomy, self-determination and trust in AI.
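The project's actual algorithm is not described in this summary, but the idea of user-trained sound adaptation can be illustrated with a toy sketch. The following hypothetical Python loop adjusts per-band gains from simple A/B preference feedback; the band names, step size, and `prefers_candidate` callback are all illustrative assumptions, not the AIHearS method.

```python
import random

BANDS = ["low", "mid", "high"]  # coarse frequency bands (illustrative)

def propose(gains, rng, step=1.0):
    """Return a candidate setting: one band nudged up or down by `step` dB."""
    candidate = dict(gains)
    band = rng.choice(BANDS)
    candidate[band] += rng.choice([-step, step])
    return candidate

def adapt(gains, prefers_candidate, rounds=300, seed=0):
    """Refine gains via repeated A/B comparisons with the user.

    `prefers_candidate(current, candidate)` returns True when the user
    picks the candidate setting over the current one.
    """
    rng = random.Random(seed)
    for _ in range(rounds):
        candidate = propose(gains, rng)
        if prefers_candidate(gains, candidate):
            gains = candidate
    return gains

# Simulated user who prefers settings closer to a hidden target profile.
target = {"low": 3.0, "mid": 6.0, "high": 9.0}

def simulated_user(current, candidate):
    dist = lambda g: sum((g[b] - target[b]) ** 2 for b in BANDS)
    return dist(candidate) < dist(current)

start = {b: 0.0 for b in BANDS}
fitted = adapt(start, simulated_user)
print(fitted)
```

In a real self-fitting system the feedback would come from the wearer comparing two sound settings in their actual listening environment, rather than from a simulated distance function.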
Innovations and perspectives
With the system, people with reduced hearing ability can determine the sound setting that suits them best entirely on their own. Because it is available without a doctor's prescription and can be adjusted without the help of hearing care professionals, the barrier to adopting hearing aids at an early stage is lowered.