Ah – Nothing like a neologism
Hearing Aids, Hearables, Wearables, Earables – Evolving, But in Which Direction?
New product trends related to devices at the ear have taken up some rather interesting names – hearables, wearables, and earables – in addition to the more traditional hearing aids.
A hearable (Figure 1) as described by Wikipedia is “a wireless in-ear computational earpiece,… housing a micro computer and uses wireless technology to supplement and enhance your listening experience. Many hearables will also feature additional features such as heart rate monitoring…” Smart Advisors and Smart Headphones are terms that have been used to identify the same type of product, but neither appears to have been accepted for common use.
Hearable, as a term, was introduced in 2014 by product designer and wireless application specialist Nick Hunn, and reportedly simultaneously by Apple in the context of its acquisition of Beats Electronics.
Some individuals envision hearables as essentially wireless earbuds having advanced technology based on audio information principles and services. Others see hearables more as advanced and innovative hearing aids taking on additional functions.
Regardless, hearables are more than a variation of existing headsets/earbuds, or hearing aids. They have the potential to be whatever one wants them to be, in which case one can argue that if it sits at, on, in, hangs on, or maybe even hovers by the ear and incorporates audio in any way, it is a hearable. Hearables can be expected to employ greater sophistication and broader application than what they offer today. Valencell (2006) has been reported to have provided the first description of a wearable ear-worn multimedia platform for health monitoring, entertainment, guidance, and cloud-based communication. For context, the writer of this post was involved in a project with a major U.S. medical company in 2003 aimed primarily at vital-signs monitoring in the ear canal, so that aspect of the concept had been investigated even by then.
Even though the term hearable was not coined until 2014, Entrepreneur Magazine dubbed 2015 the Year of the Hearable. At the 2014 CES (Consumer Electronics Show), Hunn went so far as to nominate smart wearables as the breakout technology and predicted a global market of $5 billion by 2018 (Figure 2).
Hearable is a neologism that could be said to involve any of the combination terms of wearable, ear, hear, head, or headphone. Take your pick, although my preference is for combining hear and wearable, especially since it involves some kind of audio, implying hearing, and at the ear. The hearable has emerged partly from wearable technology.
If the wearable device is worn in the ear, and manages features other than audio reception (vital signs monitoring only, as an example), I would define this as an earable because it is gathering information from the ear while being worn.
A wearable, on the other hand, is described as consisting of smart sensor electronics that are worn on the body – either as an accessory or as part of material used in clothing (Figure 3). A major feature is the ability to connect the technology to the Internet (usually using Bluetooth), thus allowing for the exchange of data between the device and a network. Wearables have provided tracking information related to achieving goals of staying fit, being active, losing weight, or just to be better organized. They have been integrated into the flow of a synchronized daily life. As such, the technology has been worn on body parts having the most obvious link to fashion – especially the wrists, but also the eyes, and feet. The popular wrist band FitBit and Smart Watch provide good examples of wearables.
Although the term hearable is of fairly recent origin, the concept has been around for quite some time. Were not hearing aids perhaps the first such devices – presenting information into the ear, followed many years later by the Walkman in 1979? And, if at the ear, and wearable, doesn’t this make them also hearables and wearables – all at the same time?
Hearables – Chicken or the Egg?
Hearables have been featured in the news since 2014 as an emerging, innovative market. Audio industry tech sources have been pontificating about hearables, suggesting that, based on their potential expansion from the earbud industry, the hearing aid as we have known it might/would/could soon fall into the category of hearables.
This is somewhat laughable because hearing aids would have to be called the first hearables, regardless of how hearables are defined.
However, this observation does present an interesting question relative to trends. Are hearing aids moving in the hearable direction, or is it that hearables are moving in the direction of hearing aids? Although this might appear to be a chicken and egg argument, I tend to lean toward wireless hearables moving in the direction of hearing aids.
My reasons for believing that hearables are moving toward hearing aids, rather than the other way around, follow from the overall trend toward wireless devices delivering and gathering sound and information at the ear:
Signal Attenuation Caused by the Head Between the Units
- How is the signal from a wireless earbud sent to the earbud on the opposite side? This is a difficult task because Bluetooth and other 2.4GHz signals are heavily attenuated by the head when traveling from one ear to the other. Making a sufficiently large antenna would, at this time, require that it stick out of the ear too far to be cosmetically acceptable.
- Bluetooth, which some hearables use, relies on reflections from surrounding walls to overcome the attenuation created by the head between the units. When within a building, reflections can function well, but when outside in the open air, there is nothing to reflect the signal. And, when coupled with the continuing reduction in the size of earbuds and their deeper insertion into the ear canal, the antenna’s function is reduced even further.
- A solution is Near Field Magnetic Induction (NFMI), which the hearing aid industry has used for a number of years, and which Bragi has taken advantage of through its association with Starkey Labs.
- Companies that understand the implications of this question are most likely to be further along in development, having most likely worked with or in the hearing aid industry and gathered a high level of expertise.
- If audio streaming is involved, there is a need to synchronize the left and right audio streams to within 20 microseconds of each other (Hunn, 2016).
- Wireless streaming to a small hearing device at the ear has been successfully accomplished by GN ReSound (LiNX) and Starkey (Halo), and also from ear-to-ear by hearing aid companies.
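Two of the numbers behind the points above can be checked with back-of-the-envelope arithmetic: the size of an efficient quarter-wave antenna at Bluetooth’s 2.4 GHz carrier, and what a 20 microsecond left/right tolerance means at a typical audio sample rate. This is a minimal sketch, not any manufacturer’s design math; the 48 kHz sample rate is an assumption chosen for illustration.

```python
# Back-of-the-envelope numbers behind the ear-to-ear wireless challenge.
C = 3.0e8     # speed of light, m/s
F_BT = 2.4e9  # Bluetooth carrier frequency, Hz

# An efficient quarter-wave antenna at 2.4 GHz is roughly 31 mm long,
# far larger than anything that hides inside an ear canal.
wavelength_mm = C / F_BT * 1000
quarter_wave_mm = wavelength_mm / 4

# A 20 microsecond left/right sync tolerance (Hunn, 2016) is less than
# one sample period at a common 48 kHz audio rate.
SYNC_TOL_S = 20e-6
FS = 48_000  # assumed sample rate, Hz
tol_in_samples = SYNC_TOL_S * FS

print(f"quarter-wave antenna at 2.4 GHz: {quarter_wave_mm:.0f} mm")
print(f"20 us tolerance at {FS} Hz: {tol_in_samples:.2f} samples")
```

In other words, the two earpieces must stay aligned to sub-sample accuracy, which is why ear-to-ear links lean on techniques such as NFMI rather than raw 2.4 GHz radio alone.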
The Occlusion Effect
- Earbuds inserted shallowly into the ear canal create the undesirable effect of one’s own voice sounding loud and/or hollow.
- The hearing aid industry has dealt with this issue from the beginning of hearing aid fitting and has taken the lead in understanding and resolving it over the past twenty years or so. Members of the hearing aid industry, and related researchers, have conducted numerous studies over the years to understand the mechanics of the effect, its implications for successful device use, and solutions (venting and deep insertion). This has put the industry light years ahead of the earbud competition. Few earbuds have solved the problem because of the way they are designed to fit primarily in the ear concha and only marginally into the ear canal proper.
Fit and Comfort
- The best-fidelity earbud is worthless unless it fits properly, is comfortable, and is secure. No business has spent more time and money resolving these issues than the hearing aid industry. The fact that earbuds are starting to take on the physical characteristics and dimensions of custom-molded hearing aids is no coincidence.
- The electronics audio industry has traditionally placed the speaker either in a headset or earbud, neither of which has taken advantage of the acoustic effects allowed by the ear canal – features of many hearing aids.
- Mechanical engineers complain about attempting to place Bubba into a mini skirt.
- The hearing aid industry has been, and continues to be driven by cosmetics. Some may argue with this, claiming that optimal acoustics is the goal, but even they continue to design smaller housing for their technology to satisfy consumer requests. That hearing aids have been produced that fit completely and invisibly in the ear canal attests to the progress that the hearing aid industry has made, something that hearables have not yet accomplished.
- With hearables, this issue may take a slight turn by accepting larger devices, but don’t put your life’s savings into this as a long-term solution.
- The first headsets (for the Walkman), used primarily for music, were hard-wired to the processing device. This remained the norm until recently, as wireless technology has progressed.
- Hearing aids have been essentially tether-free since the introduction of eyeglass and behind-the-ear hearing aids in about the early 1950s. It is true that they were not dedicated to communicating with a specific device and/or to documenting activity using sensors, but they were completely self-contained amplification devices.
- The hearing aid industry long ago established wired and wireless methods to connect the hearing aid to the telephone, TV, radio, CD player, etc., using magnetic induction, RF, and infrared technology, among others, demonstrating such activity long before earbuds. The coupling of the hearing aid to other devices has led to improvements in the reception of such signals.
- Rather than having physical controls to effect change in hearables, a swiping motion (forward/backward/up/down – 2009) or touch (1988) on the faceplate of the device can change the gain, switch the device to music, stream music in, etc.
- Such use actions on hearables were preceded by hearing aids almost 28 years earlier. New with hearables are proposals for personalized head gestures projected to be “learned” to answer the phone (such as shaking the head), and other activities.
- Managing amplified signals for a hearable has been the forte of hearing aids, especially since the advent of digital signal processing. Sophisticated circuitry and algorithms have been introduced in hearing aids to manage adaptive compression, adaptive noise reduction, adaptive feedback cancellation, ear-to-ear communication, etc.
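Of these algorithms, compression is the simplest to illustrate. The sketch below is a generic peak-based static compressor, assumed for illustration only; real hearing aid compressors are multiband and add attack/release time smoothing, and specifics vary by manufacturer.

```python
import math

def compress_sample(x, threshold_db=-30.0, ratio=3.0):
    """Static dynamic-range compression of a single sample.

    Above the threshold, the output level grows at 1/ratio the rate of
    the input level; below it, the sample passes through unchanged.
    threshold_db and ratio are illustrative values, not fitting targets.
    """
    eps = 1e-12  # avoid log10(0) for silent samples
    level_db = 20.0 * math.log10(abs(x) + eps)
    if level_db > threshold_db:
        # Reduce gain by the excess over threshold, scaled by (1 - 1/ratio).
        gain_db = (threshold_db - level_db) * (1.0 - 1.0 / ratio)
        x *= 10.0 ** (gain_db / 20.0)
    return x

# A full-scale peak (0 dB) is pulled down 20 dB; a quiet sample passes through.
print(compress_sample(1.0))    # attenuated
print(compress_sample(0.001))  # unchanged
```

The point of the sketch is only to show the kind of level-dependent gain decision that hearing aid DSP has been making, adaptively and per frequency band, for decades.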
Many additional examples could be provided suggesting that the hearable takes more from hearing aids than from earbuds. However, that is not really the issue. The issue is what will hearables be and how will they be marketed as products evolve.
*This article was originally published at Wayne’s World on March 8, 2016.
Wayne Staab, PhD, is an internationally recognized authority on hearing aids. His professional career has included University teaching, hearing clinic work, hearing aid company management and sales, and extensive work with engineering in developing and bringing new technology and products to the discipline of hearing. Dr. Staab is the Founding Editor of Wayne’s World and served as the Editor-In-Chief of HHTM from 2015 to 2017.
Thank you for your insight as always, Dr Staab. We’re lucky enough to have Nick Hunn as an advisor on our team at Third Skin, and we’ve moved towards a hearing aid form factor as you may remember, as it hugely remedies many of the challenges faced by a traditional audio earbud design (signal propagation, occlusion effect, battery life).
That was neat hearing of Thirdsk.in! Some day we may even find a high-sense way to prevent sun damage to the ear while not swamping ears, snubbing neurostimulation on the ear (chiropractors), or leaving out some kind of mic and synthesis people would pay to have around. Probably not ATSC 3.x resource description on dictation, or ultrasonic ad tag interpretation and routing, but it might have a little chance to time-shift some conversations around!