A team of engineers at the University of Illinois Grainger College of Engineering is developing a new branch of mobile technology built around sophisticated, connected earphones.
“The leap from today’s earphones to a newer technology called ‘earables’ could be a revolution for the mobile industry,” said Professor Romit Roy Choudhury, an electrical and computer engineer. He draws an analogy with phones: their architecture has changed so much over the years that today’s smartphones, with their many capabilities, are no longer used only for calling. In the same way, earable technology may stop being a mere smartphone accessory and take on many applications of its own.
Roy Choudhury and his colleagues are developing new algorithms and testing them on earphone platforms already in users’ hands. They hope that in the future these wearable devices will continuously understand human behavior, assist users on many occasions, provide information in the manner of a digital assistant, and track the user’s health and fitness.
“If you want to find a particular store in a mall, the earphone could estimate the store’s relative position and play a three-dimensional sound that essentially says ‘follow me,’ then guide you to it,” said Zhijian Yang, a doctoral student. The sound in your ear would seem to come from the direction you need to walk, acting as a kind of voice escort.
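The core of such a guidance cue is mapping the target’s bearing, relative to the user’s heading, onto the stereo field so the sound appears to come from the right direction. The sketch below shows that idea in miniature with simple equal-power panning; every function name and parameter here is illustrative, not taken from the Illinois system, which would use far richer 3D audio rendering.

```python
import math

def bearing_to_pan(user_xy, user_heading_deg, target_xy):
    """Return a stereo pan in [-1.0, 1.0] (-1 = hard left, +1 = hard right)
    for a cue that should appear to come from the target's direction."""
    dx = target_xy[0] - user_xy[0]
    dy = target_xy[1] - user_xy[1]
    target_bearing = math.degrees(math.atan2(dx, dy))  # 0 deg = straight "north"
    # Angle of the target relative to where the user is facing, in (-180, 180].
    relative = (target_bearing - user_heading_deg + 180) % 360 - 180
    # Map the relative angle to a left/right pan; anything at or beyond
    # 90 degrees to a side is clamped to that side.
    return max(-1.0, min(1.0, relative / 90.0))

def stereo_gains(pan):
    """Equal-power panning: convert a pan value to (left, right) gains."""
    angle = (pan + 1.0) * math.pi / 4.0  # 0 .. pi/2
    return math.cos(angle), math.sin(angle)
```

With the target straight ahead the pan is 0 and both channels get equal gain; a target directly to the user’s right pans the cue fully to the right ear.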
One system the researchers are considering is “EarSense,” an earphone that acts as a sensor of tooth activity. It can detect facial and mouth movements such as tooth-to-tooth taps and slides, making a kind of hands-free communication with smartphones and other computing devices possible. In addition, because some medical conditions reveal themselves through patterns such as teeth chattering, a smart earphone could help identify them. The engineers also plan to investigate whether the earphone’s sensors can be used to analyze facial muscle movements and, from those, emotions.
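At its simplest, sensing a tooth tap means spotting short impulses in a vibration signal picked up near the jaw. The toy detector below, a threshold crossing plus a refractory period, is a crude stand-in for the signal processing EarSense would actually need; the function and its parameters are assumptions for illustration, not the published method.

```python
def detect_taps(samples, rate_hz, threshold, refractory_s=0.1):
    """Return the times (in seconds) of tap-like impulses in a 1-D
    vibration signal: a sample whose magnitude crosses `threshold`
    starts a tap, and further crossings within `refractory_s` seconds
    are treated as part of the same event."""
    taps = []
    last_tap = -refractory_s
    for i, x in enumerate(samples):
        t = i / rate_hz
        if abs(x) >= threshold and t - last_tap >= refractory_s:
            taps.append(t)
            last_tap = t
    return taps
```

On a synthetic 100 Hz trace that is silent except for spikes at 0.1 s and 0.6 s, the detector reports exactly those two times; a pair of taps close together could then be interpreted as a hands-free “double-click.”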
Another of their papers covers voice localization using nearby wall reflections, examining algorithms that detect the direction a sound arrives from. This would let a user’s earphones tune in to the person the user is talking to.
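A basic ingredient of direction-of-arrival estimation is the tiny delay between a sound reaching two microphones: cross-correlating the two channels finds that delay, and geometry converts it to an angle. The brute-force sketch below illustrates only this textbook time-difference-of-arrival step, assuming a two-microphone array with known spacing; the Illinois work on wall reflections goes well beyond it.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, in air at room temperature

def tdoa_samples(left, right, max_lag):
    """Lag (in samples) at which `right` best lines up with `left`,
    found by brute-force cross-correlation over -max_lag..max_lag."""
    best_lag, best_score = 0, float("-inf")
    n = len(left)
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i + lag]
                    for i in range(n) if 0 <= i + lag < n)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def doa_degrees(lag, rate_hz, mic_spacing_m):
    """Convert an inter-microphone delay to an arrival angle
    (0 deg = broadside, +/-90 deg = along the microphone axis)."""
    delay_s = lag / rate_hz
    s = max(-1.0, min(1.0, delay_s * SPEED_OF_SOUND / mic_spacing_m))
    return math.degrees(math.asin(s))
```

Feeding in one channel and a copy of it delayed by three samples recovers a lag of three, which for a 48 kHz rate and 10 cm spacing corresponds to a source roughly 12 degrees off broadside.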
“We’ve been working on mobile sensing and computing for 10 years,” said doctoral student Yu-Lin Wei. “We have a lot of experience to draw on in designing this emerging earable computing technology.”
Recently, Anders Andréen, CEO of the Swedish audio brand Urbanista, told ETT that these sensors are likely to play a role in making consumer audio products more useful in the future.