Intel’s Initiatives for Hearing Accessibility: PC Connectivity and the Coming AI Revolution

May 15, 2024

Host Andrew Bellavia welcomes Eric McLaughlin, VP and GM of Wireless Solutions at Intel’s Client Computing Group, and Arnaud Pierres, Senior Director of Product and Ecosystem Enablement, to discuss Intel’s initiatives for supporting individuals who are Deaf and Hard of Hearing on Global Accessibility Awareness Day (GAAD).

Eric and Arnaud delve into Intel’s efforts to enable direct connectivity between PCs and hearing aids using Bluetooth® LE Audio technology. They showcase a live demo of how users can easily pair their hearing aids with a laptop, adjust settings like ambient sound levels, and switch between presets directly from the PC. The Intel Evo program, which co-engineers with hearing aid manufacturers to ensure high-quality experiences, is also highlighted.

Additionally, they discuss the potential of AI on Intel PCs to enhance accessibility further, such as real-time American Sign Language translation and contextual awareness to alert users to external sounds or events. The conversation emphasizes Intel’s commitment to improving the lives of everyone, including those with hearing challenges, through innovative technologies and collaboration with industry partners.


Full Episode Transcript

Hello everyone, and welcome to This Week in Hearing.

A year ago,

I had on as my guest Intel’s

director of accessibility,

Darryl Adams.

This was after Intel’s press

release on Global Accessibility

Awareness Day,

announcing their collaboration with 3DP4ME, enabling 3D printing of custom hearing aid molds in underserved areas, and also Intel’s plans for enabling direct connection of hearing aids to PCs using LE Audio.

Much has happened at Intel in the last year.

To learn more about Intel’s

initiatives to support hearing

impaired and Deaf people for

this year’s Global Accessibility

Awareness Day,

I would like to welcome

Eric McLaughlin,

VP and GM of Wireless Solutions in the Client Computing Group, and Arnaud Pierres, Senior Director of Product and Ecosystem Enablement. Thank you both for joining me.


Please share a bit of background with our viewers, and also, what is the Wireless Solutions Group that you lead?

Sure. Thanks, Andy.

We’re really excited to be here

and really appreciate the

opportunity to share what we’re

doing and where we think things

are heading in this area.

So at Intel,

the Wireless Solutions Group

is the team that has the

responsibility for all of the

wireless hardware and software

solutions and innovations that

we drive primarily for client

and IoT applications.

So if you look at radios like Wi-Fi and Bluetooth: in the past we’ve done cellular, we’ve done 60 GHz radios, and we’re always experimenting with things like UWB and other technologies.


On top of that, we do software solutions that enhance performance.

And of course, in this day and age, we’re all over AI.

So that’s what the team does and

it’s a very exciting and

innovative place to be.

And we have a lot of fun trying

to find ways to improve

the working and connecting lives

of everyone that uses

Intel platforms.

Oh that’s great. Thanks.

And Arnaud.

So, what we do in wireless, and why this is linked to accessibility: for years we’ve been working with the Bluetooth SIG to define the LE Audio technology. Now the specs are released, and this is really the time to scale this technology.

Last year, with Microsoft, we launched what we call the LE Audio essentials, the connectivity between earbuds and PCs, and we’re scaling that now.

And of course, many companies joined together to define this spec, to connect hearable devices and hearing aids to different types of devices, and the PC is one of them. This is really important for us, to enable better accessibility in the PC ecosystem. And this is what we are driving.

So what you are seeing is that our partner Microsoft announced the connectivity to hearing aids in October, and earlier this year they announced more hearing aid functionality. We will discuss and show you that, but this is really what we are trying to do: guarantee a good user experience and accessibility for hard-of-hearing users.

I was just going to add to that,

that you commented on it earlier

that you had another Intel

guest last year.

And as we look at accessibility,

part of Intel’s charter that

we repeat internally,

as we look at what it is we do

and how we spend our time and

energy and investments,

is to enhance and improve the

lives of every person on earth.

And so accessibility is a

passion of ours, right.

It’s a place where we really

feel like we can bring the

best of our technology,

the best of leading edge

technology like BLE,

as we’ve talked about and AI

together to really improve how

people interact with devices,

especially those,

as Arnaud said,

especially those that are

hearing impaired.

So this is a huge focus for us

and something we feel really

passionate about.


And this is all really exciting, because it’s a connected and half-virtual world.

Everybody is using PCs.

And to the extent that we can

enable everybody to perform at

their best I think is terrific.

So I really give credit to intel

for exploring all the different

ways they can use the pc for

these kinds of features. Now,

when we talk about people with

hearing impairment what

capabilities exist on a modern

Intel PC and how does

it actually work?

Sure. What we’re enabling is direct connectivity between the PC and hearing aids. In the past, you needed dongles or boxes to connect, and you almost had to be a geek to get your hearing aids connected. Now, with LE Audio and direct connectivity to the PC, users can pair directly, and pairing is easy: with Microsoft’s Swift Pair, you can pair your device very quickly. And we want to be sure that the quality is good.
So maybe what I can do is share the screen of my second PC. I can show you how it looks on a real laptop.

Please do.

So, this is my second laptop. I have a pair of hearing aids that I will unbox. They are here, and they need a couple of seconds to boot and to connect. And as you see here, my hearing aids are already connected.

I can see that I have two hearing aids, right and left, and I can see the battery level of the hearing aids. Very quickly, I can go into the settings and see the icon of the hearing aids. I could use them now. I’m using the sound on my second PC, but if I want to switch to this PC, I could do that

here. If I go into the menu, I can get additional settings for my hearing aids. For example, I can adjust the ambient sound level: if I want to focus on my code, I would probably turn down the ambient amplification, and if I want to hear what is around me, I can increase it. I also have some presets that have been configured in this set, and I can switch from one preset to the other. That’s really the complete experience.
And this is part of the Windows Insider build, and Microsoft has communicated that it will be part of a future release later this year. So this is working out of the box: easy pairing, good connectivity, and good voice quality, thanks to the new technology and the new codec in LE Audio.

So that’s really interesting.

So when you’re changing

the presets,

you’re actually selecting the

different hearing aid programs

as if you were using the

hearing aid app,

except you’re doing it on the PC now, correct?

Correct. Yeah, that has been defined by the Bluetooth SIG. So the Bluetooth spec not only defines the connectivity, but also defines the way to select and set those presets; it’s part of the standard.

And I can adjust the ambient

level as well.

So if I want to be aware

of my surroundings,

but on a more muted basis while

I’m concentrating on the call,

I can do that.

And you also have an automatic mode for that, it appears? Yeah. So if I set this mode, it will use the presets automatically. So the user really has the choice to use the automatic mode, or you can also control the ambient sound level manually.

Terrific. Thanks for sharing it.

It all looks very easy and

intuitive to operate and adds a

lot of functionality to the

hearing aids when you’re

connected online.


Easy to use. I’ll put my hearing aids back, and you can see that I can disconnect, along with the zoom window that I’m not using anymore. So yeah, easy to use.

Okay, terrific.

Now, is all of that out-of-the-box functionality for an LE Audio device, or is there more? I guess what I’m asking is, where does the Engineered for Intel Evo program fit in relative to this?

So, the functionality that I’ve shown you is part of what Microsoft will deliver, with the platform providers obviously providing the connectivity that enables it. What we do with Engineered for Evo is ensure the best end-to-end quality, and to do that, we realized that we need to work with our ecosystem.

We first started the program for headsets, earbuds, mice, and keyboards, and we realized that this is a great infrastructure for working with hearing aid providers as well. So we announced that at least two of them have joined the collaboration: GN ReSound, which launched their product at the end of last year, and Starkey, who is also part of the program.

For us, this is really a co-engineering program. We define a spec and requirements that we want the PC and hearing device to meet, voice quality, time to connect, and time to pair, as examples, and that forces us to meet those criteria. We work with our OS partner, we improve our solution, drivers, and firmware, and with the hearing aid vendors, for example, we are fixing bugs and making sure that the quality is there. So it is a co-engineering program, and we use it as well, obviously, to communicate better about the functionality. So it’s both co-engineering and a way to communicate about the experience.


And one thing I’ve learned

over time anywhere in the

accessibility space is that it’s

really important to have end

user input as you develop

a program like this one.

What have you done there?

Oh yeah, good question. We have several activities: we test in our lab, but we want to be sure that when end users get those products, the quality is great. So we have what we call our crowdsourcing validation. Specifically for hearing aids, we had to learn. I mean, I’m not a hearing aid user; we had to learn how people are using those devices. So we have now, I think, 25 or 30 Intel employees that we fitted with hearing aids from different companies, and we gave them laptops with LE Audio enabled, and they are using those laptops and hearing aids in their daily work and life. That is really beneficial.

A couple of the things we learned: first, they love the direct connectivity to the PC. They love being able to take a call directly. For example, for this call I could use the hearing aids, with no need for a dongle and so on. So that’s great, and the comfort is better; they don’t need to put an additional headset on top of the hearing aids, as some users did. What we also learned is that quality is key. We need to ensure that this is working, because this amplifies the need for quality. People with hearing aids must use their device, they need their device, so we cannot lose this connection.

I would say the second learning is that the connectivity must work with several devices. You must connect to your PC, but I want to use my hearing aids with my phone as well, so the seamless connection from one device to the other is really key. And the third, which is what we showed in the small live demo, is that the capability to adjust the ambient sound level and the presets is a very important feature for the user as well. Those are a couple of the learnings, and we’re still running those user trials and will continue to improve.

And so when you think about direct connection to hearing aids, then, and the feedback you were getting from the users, was it a better experience to be directly connected versus listening with the PC speakers or headphones?


Most of the users loved the experience of being less tired after calls. I think that’s important. That was really beneficial for them, the focus.


and I can testify to that

because I had done the things

you said. For example,

attaching a TV connector to the PC so I could stream directly to mine.

And the experience of getting

corrected audio directly to your

ears is much less fatiguing than

it is trying to listen to a

headphone or especially

to the PC speakers.

So I’m not surprised that you

got that feedback from

the users as well.

Did they have prior experience

with direct connectivity or

was this their first time?

Some of the users were using their hearing aids with their phone. But the primary device they use for working is the PC, so it was kind of, I have my PC and I need to use my phone on the side, which was not a good experience.


So they found this a truly meaningful improvement, being directly connected to the PC.


And this is something I hope the

hearing care professionals

in our audience hear,

because I have found that

generally speaking,

hearing aid users are not being

well educated on the

connectivity options,

and it’s really detrimental in

the workplace. For example,

there are studies that

show that people can

understand and recall better

if they can hear well.

Their ability to recall falls

off if they don’t hear well.

And that’s, of course,

really a burden on one’s career.

It’s important to understand

that role of connectivity and

the fact that it’s being made

easier and easier just makes it

that much better for hearing

impaired people and the hearing care professionals who are supporting them.


You described that you worked with the hearing aid companies, and you named two of them, to get them Engineered for Intel Evo verified. How do consumers actually learn which hearing aids are verified?

On the hearing aid side, we will let those companies communicate about it. GN, for example, announced their product line with LE Audio capability, so consumers need to look for LE Audio-enabled hearing aids. On the client side, the PC side, we are starting to communicate about it.

And the recommendation we have: we launched a new product at the end of last year called Intel Core Ultra, and we have enabled all the Evo platforms with Intel Core Ultra with LE Audio. So if you want to ensure that your laptop supports LE Audio, go for an Evo laptop with Intel Core Ultra. And we are working with all the OEMs to ensure that LE Audio is enabled out of the box, so you get this experience when the OS is launched later this year.


So you look for Core Ultra and Intel Evo. And are those labels on the computer, which will also be shown in advertising material? Yeah. So you have Intel Evo, a small black badge of Intel Evo. Yeah.

And so you’ve actually done

a lot in the last year.

When I talked with Darryl,

this was at the end of

the development,

but still in the future now

you’ve done so much.

What comes next?


We definitely have a lot of things that we’re looking at. I’d say what’s next for us, besides driving these existing products that Arnaud just showed: working with Microsoft to get these Intel Core Ultra PCs out the door and into the hands of users, and getting the next version of the OS with the full build in it out to users, so that those two things meet and users can actually take advantage of it. So we’re not quite finished with that; that has to happen, and we’ll continue to drive it throughout the rest of this year.

But then as we look

at what’s next,

we’re going to continue to scale

LE audio even outside of those

Core Ultra processors.

We want to get it on as

many PCs as possible.

And OEMs are driving that as well; they see the benefits and the opportunity to ensure that these devices really support the hearing community.

so we see the current list

of current …,

And as we get into the end of the year, we’re launching our next-generation Lunar Lake platform, which will also scale with these LE Audio capabilities and hearing enhancements. So that’s coming up.

And then from a feature standpoint, Auracast, that’s big. We’re preparing our platforms for that, and similar to what we’ve done here, we want to launch the overall end-to-end experience, working with our OS partners as well as our Engineered for Evo and other partners, to make sure that when you buy a solution, it’s high quality, high reliability, and really delivers on that amazing Auracast experience.

So those are the things that are

coming in the near term.


So Auracast on the PC means I might have somebody here in the meeting with me, and we both could be streaming the audio to our respective devices, because you’ve got Auracast capability on the PC.

I’ve also been following, and even saw at the Innovation Summit a little while ago, CEO Pat Gelsinger demonstrating the AI capabilities of the devices. That seems to offer a lot of useful applications in the greater accessibility space. For example, I saw an Intel video demonstrating OmniBridge real-time American Sign Language translation. How does that work?

Well, pretty exciting stuff,

we agree. So, obviously,

one of the best proponents

of all these

improvements in accessibility

is our CEO, Pat,

who also uses hearing aids

on a daily basis.

So he’s passionate, and he helps us identify opportunities to improve in this accessibility area.

So AI PC is absolutely a place

where this can happen.

I think you mentioned in some of our previous discussions that you saw, and Arnaud was front and center on this, that Pat demonstrated at Intel Vision last year how we used AI on the AI PC to do local processing.

Once a user is connected directly to the PC with their hearing aids and immersed in that experience, AI can actually differentiate a knock on the door, a ring of the doorbell, or a person trying to get the user’s attention, and a prompt comes up on the screen to let them know that, in addition to the immersive experience they’re having, there are things outside of it that need their attention, something they couldn’t really do easily before.

So that’s an example.

That’s a demo.

We’re working on how we take

them into product.

But as you mentioned, there are other things we’re working on, including using AI on the AI PC, with processing power that can be spread across a CPU, a GPU, and an NPU on an Intel platform, to enable training and use cases that allow us to take somebody doing sign language on one end of a call and translate it to text, and eventually voice, on the other side of the call.

So it’s something we’re really excited about. Again, it enhances that connection between hearing impaired or hearing challenged individuals and the rest of the people on the call who may not be hearing challenged.

And people can interact in a way

that is natural for them.

So pretty exciting things.

And a really great use of AI.

And a really great use of an AI PC.

Along with the capabilities we’ve been talking about, the directly connected hearing aids, it builds the hearing accessibility features into that experience.


I found the ASL translation to

be really fascinating and it

looked like in the video I could

actually use it in the real

world as well. For example,

if I was traveling.

Well, I’ll give you an example.

I went to an accessibility

conference called CSUN.

And at the booth I was at, a Deaf person came with a translator who was translating ASL for me, because I don’t sign.

And I thought, wow,

if I had like my portable

camera like this one,

I just faced it out.

I could sit in front of my PC and I could watch the translation, right?

So the burden wouldn’t be on

the Deaf person to do

the translation.

I would see the translation

just as I would if I were going

to a foreign country trying to

hear another spoken language.

did I understand that correctly?

Because that’s a really

fascinating capability to have.

You did, yeah.

And as we work to deliver this

capability we absolutely see

that as a great use case.

And as you say, we often meet in conferences, conference rooms, and other venues, and having the ability to utilize that AI PC and the cameras that are already in them, or connected to them like yours, can improve that interaction between those two types of individuals in real time. So yeah,

exciting use case and we’re

working diligently to try to

bring that to the market.

This is exciting. I mean,

I’m really impressed by the full

scope of work Intel is doing

in the accessibility space.

are there any other initiatives

we haven’t talked about today

that you’d like to highlight

or any closing thoughts,

either one of you?


I think I’ll let Arnaud

highlight anything

future wise that he wants to.

I think the way I would

summarize is we’ve kind of

talked about this, but I think,

first of all,

with accessibility and the PC, we’re just scratching the surface. It’s really important: these devices have become our primary communication devices. We use them every day for more types of calls and communication than we do our phones anymore.

And so we have to continue to invest to improve not just the quality, as Arnaud said, but the usability, the features, the swiftness, and especially the engagement between, for example, hearing impaired and/or deaf users and hearing users, and figure out how best to make this happen.

And we’re really thrilled at

what’s happening in

the industry.

We’re not the only ones engaged here: we’ve got Microsoft, we’ve got the hearing aid companies, we have the accessory companies, all engaged to try to make all of these things work together.

And when you layer AI on top of that, just the capabilities that AI will bring, to take the data that’s resident on our Bluetooth, wireless, or PC, understand the environment, be contextually aware, understand what’s being done, what apps are being used, what the intent is, and then analyze that data and put it into an AI engine, will deliver an experience that absolutely enhances our ability to communicate together.

So, really,

really excited about

what’s to come.

We’ll have more announcements as

we have more things to share.

But I just think that the world

is going to change a lot,

and we’re excited to

be a part of that.

Any last thoughts?


I think Eric summarized it pretty well. We need to work with the ecosystem, and we need to make sure we deliver this experience; AI will help us to improve it.


I really appreciate you both taking the time to dig into everything Intel is doing in this space. No doubt it’s going to have a positive impact on hearing impaired and deaf people navigating both the real and the virtual worlds, which, as you said, we go back and forth between almost seamlessly. Now,

if people want to learn more,

they want to reach out

to either one of you.

How would they do it?

You know, I’m on LinkedIn.

Contact me on LinkedIn.

And we can, you know,

we can connect there.

But I’m absolutely open to,

you know,

folks giving Arnaud and me a call and figuring out how we can engage together.


thanks again to you both,

and thanks also to everyone for watching or listening to this edition of This Week in Hearing.


Be sure to subscribe to the TWIH YouTube channel for the latest episodes each week, and follow This Week in Hearing on LinkedIn and on X (formerly Twitter).

Prefer to listen on the go? Tune into the TWIH Podcast on your favorite podcast streaming service, including Apple, Spotify, Google, and more.

About the Panel

Arnaud Pierres is the Senior Director of Wireless Strategic Planning, Product Planning, and Ecosystem Enablement at Intel Corporation.

Eric McLaughlin is the Vice President and General Manager of Wireless Solutions at Intel’s Client Computing Group

Andrew Bellavia is the Founder of AuraFuturity. He has experience in international sales, marketing, product management, and general management. Audio has been both an abiding interest and a market he has served professionally in these roles. Andrew has been deeply embedded in the hearables space since the beginning and is recognized as a thought leader in the convergence of hearables and hearing health. He has been a strong advocate for hearing care innovation and accessibility, work made more personal when he faced his own hearing loss and sought treatment. All these skills and experiences are brought to bear at AuraFuturity, providing go-to-market, branding, and content services to the dynamic and growing hearables and hearing health spaces.

