Auracast Demo: Accessibility & Audio Sharing LIVE from EUHA 2023

December 5, 2023

In this special Part 2 episode, Andrew Bellavia shares his experience demoing the new Auracast™ broadcast audio technology at the 2023 EUHA congress in Nuremberg, Germany.

He visits the Bluetooth SIG booth to learn how Auracast™ enables audio sharing and accessibility in public spaces and has the opportunity to participate in a demo where he listens to different TVs and gate announcements in an airport scenario.

He then stops by the GN booth to see their implementation of Auracast™ in the company’s latest ReSound Nexia hearing aids, allowing direct streaming from a Bluetooth® LE Audio enabled Windows PC.

He demos music streaming and Teams calls, highlighting the sound quality and microphone capabilities. Both demos showcase the low latency and enhanced experience Auracast can provide, and the technology’s potential to improve accessibility, audio sharing, and direct connectivity.

Full Episode Transcript

Welcome to EUHA, or ‘Oi-ha’ as you prefer. Day three was LE Audio and Auracast day. I went back into the hall, visited the Bluetooth SIG, experienced their demo, and had a conversation about the value of Auracast. It was interesting because I thought I knew what Auracast was all about, and yet I learned something new that I think you will also appreciate. And then I stopped back at the GN booth to see their Auracast demo and get a little bit deeper into their relationship with Intel and how that’s going to affect people. So I’m at the booth of the Bluetooth SIG with Chuck Sabin, Senior Director of Market Development. Thank you for spending some time with me. Thank you, happy to be here. So please tell me what you’re doing here. The Auracast demo: LE Audio is the new architecture for Bluetooth audio going forward. As part of that, there is a new capability called Auracast broadcast audio that comes with LE Audio. Auracast broadcast audio effectively allows a transmitter to transmit a Bluetooth audio signal to an unlimited number of in-range devices. So we see this as a possibility for new and next-generation assistive listening systems, but also as changing the way that people interact with the world around them through audio in public spaces. And you actually have my favorite demo, because I often use the sports bar as an example, where it’s a mass consumer application anybody can use. Yes, right. For example, very soon those Galaxy Buds2 Pro that I have in my bag will be enabled for Auracast. So it’s not just for hearing-impaired people, it’s for everybody. It is for everyone. There are really three key experiences that we talk about for Auracast. The first is sharing your audio. That’s the scenario where it’s me wanting to share an audio experience with you. It’s more personal: my friends, my family wanting to share an audio experience. No more changing earbuds and passing them back and forth.
You’re now able to set up basically an audio hotspot for your friends and family. The second area is exactly what you mentioned about the bars. It’s an “unmute your world” scenario where you’ve got silent TVs everywhere: in airports, gyms, waiting rooms, bars, restaurants, where you’re getting a visual experience but no audio experience with it. Now you’ll be able to scan for the available audio access points from those public TVs and tune into whichever audio you want. The one I’m really looking for is the gate announcements at the airport. Even before I was hearing impaired, if you were any distance away, you didn’t understand anything, right? Yes. That’s the “hear your best” scenario, right? I just need to hear better. And that can apply not only to people with hearing loss, it can apply to general consumers as well: noisy environments like an airport, a train station, or other public venues where there’s a lot of background noise, but you want to listen to exactly what you’re interested in. And the same thing can happen at conferences where you’re trying to listen to the person on stage. And there are other applications as well, like simultaneous translation. So you might be in a foreign country, you want to listen to the lecture in your own language, and they might have simultaneous translation going, and you can listen to that instead of listening to the person on stage. That’s a great example, because at the beginning of this conference I took the press tour, and they had two people with FM radio systems and microphones, one in German, so as we were all strung out, you could hear them. And then there was an English translator, and I had actually taken the FM radio and plugged my Roger Mic into it for streaming. Right. Well, in the future, I will directly tap in with Auracast. I won’t need that setup. How do you see the deployment taking place?
Okay, you defined three applications, and I had defined almost the same three. One would be personal, which is not only audio sharing but devices: if I buy a new Samsung TV, I can tap in with Auracast directly. Right, right. So there’s audio sharing/personal, there’s the multichannel environment like sports bars, and then there are the single-channel venues like theaters, for example. How do you see the deployment taking place over time in each of those three areas? Which will come first, and what’s the timeline? Yeah, that’s a fair question. Roughly speaking, I think you’re going to see a lot of the personal sharing applications come to market very quickly, as I said at the beginning. You’re going to see it on your mobile phone or your laptop, me being able to share with you. But we’re actually starting to see launches from companies that are providing other types of applications. GN has just launched their hearing aids and a streamer-plus product, and right behind you is actually a product, one that’s getting a little bit more public, from Ampetronic that’s designed for doing Auracast broadcast in large public spaces. But effectively, you are going to start seeing it implemented into smartphones and into your everyday devices for more personal use. So that’ll come first. Generally speaking, we believe a lot of that will come first, especially since you need the assistant application to see what is available to you as a broadcast, and a lot of that’s going to come through smartwatches or smartphones. So tell me if you agree or disagree and why, but I think the next application will actually be the multiple-TV application, because hearing loops can’t operate in sports bars, right. Single channel only, correct. Plus it’s mass market.
In other words, if there are three sports bars in town and I own one of them, the minute there are enough ordinary true wireless earbuds doing Auracast, I may install it to get a competitive advantage over the other two sports bars in town. Whereas an auditorium that has an FM system or a loop installed isn’t necessarily in a rush, because they’re already providing that service and they can wait a little longer before installing yet another one. Is that how you see it playing out? Yes, that’s a good example of how this all might work out. When we talked about the “unmute your world” scenario, places where you have multiple screens or multiple channels of audio available, Auracast is perfect for that. I do want to challenge one aspect associated with theaters and plays and so on, though. Even with a loop system, which might be a single channel, you might get just the audio from on stage. We’ve heard from a number of people and a number of these installations that are looking at different types of audio that people might want to have. Some of it might be just dialogue enhancement: so there’s general audio, there’s dialogue enhancement, and then there’s also a place for people without sight who actually want audio description as part of their experience at that theater. They can hear fine, they can hear what’s happening on stage, but they can’t see what’s happening on stage. So now, even within those theaters and so on, they can provide multiple different types of accessibility to the individual based on what their actual needs are. That is super interesting. I’d never heard it described that way before, but that is really interesting. Yeah. So to me, this is really about overall accessibility. It’s not accessibility for one group of people, it’s accessibility options for a large group of people, whether it’s
just the general consuming public or other people who have other types of accessibility challenges that need to be addressed through an audio experience. Oh, that’s fantastic. I love that, I really love that. So let’s do the demo. Sure. What we’re demonstrating here at EUHA is an Auracast experience. The scene that is set here is that you are traveling to a conference, and you may encounter various opportunities and options to use Auracast broadcast audio while you’re on your journey. So I have a set of earbuds that are associated with this particular phone. You can proxy these as hearing aids or earbuds or headphones or headsets or whatever that have LE Audio in them. So why don’t you put one in. This is a right earbud, and actually they’re narrow depth, so I can cheat and just put it in; I’ll hear it through the dome of the RIC. So why don’t you just put one in, so then you can still hear. Well, I can still hear you because of the hearing aid, that’s perfect. That’s cheating. I can hear you perfectly well. So what we have is an assistant application that will be able to see all of the Auracast audio access points that are available here in this experience. So if you hit scan, all right, that should start scanning for all the different access points that are available here. And what you’re seeing is, as I mentioned before, those three different experiences. If you’re coming to the airport to go to that conference, the first thing you might do is, you’ve arrived early and you’re looking for something to eat. You walk into a sports bar and there are TVs on the wall, and while you’re eating or drinking, you want to listen to what you’re watching. So if you select TV One or TV Two... Now you should be automatically listening to the TV One audio. And I am. If you change to TV Two... Yeah, I’d rather watch the football than the basketball. So there we are, TV Two. It’s perfect.
Now you’re on TV Two. Now imagine you’ve finished your food and you’re off to your gate, right? So when you’re at your gate, you want to make sure that you’re hearing all of the announcements associated with any gate changes and timing, when you’re supposed to board. And this is a great coincidence, because I’m actually leaving from gate B23, and that’s what it happened to be, right? There you go. There we are. So now you’re listening to the gate announcements. But while you’re at that gate, you happen to see a friend or colleague who’s watching a movie, and you’re like, whoa, while we’re waiting, can I watch the movie with you? So again, no more changing earbuds and so on, whether you have earbuds or hearing aids. Now I can set up essentially an audio hotspot, an Auracast hotspot, from my laptop, and that’s called “Dave’s Laptop.” If you select it... Now, again, this is personal, so it actually is encrypted. We have a less-than-secure encryption code, which is one-two-three-four. That’s worse than “password.” That’s worse than “password,” yes. And now you’re listening to Dave’s laptop. And now I’m listening to this laptop right now. So you and your friend are now experiencing this screen and the audio all at the same time. Terrific. Now you’ve taken off and flown to your destination. You’ve arrived, you’re at your conference, and you’re in the Aerospace Auditorium, but you happen to be in the very back. And conference centers are notorious for having not-so-good audio experiences, dead spots and so on. But luckily they’ve got an Auracast broadcast set up for this particular auditorium, so if you select it, you should now be listening to the auditorium directly. This is now a live mic scenario between me and you, coming from that same transmitter that you had for the Aerospace Auditorium.
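For the technically curious: under the hood, the “scan” step works at the BLE level. An Auracast transmitter includes a Broadcast Audio Announcement (16-bit service UUID 0x1852, per the Bluetooth LE Audio specifications) in its advertisements, carrying a 3-octet Broadcast ID, and an assistant app filters advertisements for that service to build the list you pick from. Here is a minimal Python sketch of that filtering logic; the advertisement payloads and function names are hypothetical examples, not the SIG’s demo code.

```python
# Sketch of an Auracast assistant's scan step: filter BLE advertisement
# service data for the Broadcast Audio Announcement Service and extract
# each transmitter's 24-bit Broadcast_ID. Payloads below are made up.

BROADCAST_AUDIO_ANNOUNCEMENT_UUID = 0x1852  # assigned 16-bit service UUID

def parse_broadcast_id(service_data: bytes):
    """Return the 24-bit Broadcast_ID from Broadcast Audio Announcement
    service data (2-byte little-endian UUID + 3-byte little-endian ID),
    or None if this payload is some other service."""
    if len(service_data) < 5:
        return None
    uuid = int.from_bytes(service_data[0:2], "little")
    if uuid != BROADCAST_AUDIO_ANNOUNCEMENT_UUID:
        return None
    return int.from_bytes(service_data[2:5], "little")

def scan_results(advertisements):
    """advertisements: iterable of (local_name, service_data_bytes).
    Returns only the Auracast broadcasts, as the assistant would list them."""
    found = []
    for name, svc in advertisements:
        broadcast_id = parse_broadcast_id(svc)
        if broadcast_id is not None:
            found.append((name, broadcast_id))
    return found

# Hypothetical scan: two broadcasting TVs and a phone advertising only
# the Battery Service (0x180F), which the assistant ignores.
ads = [
    ("TV One", bytes([0x52, 0x18, 0x01, 0x00, 0x00])),  # 0x1852 + ID 1
    ("TV Two", bytes([0x52, 0x18, 0x02, 0x00, 0x00])),  # 0x1852 + ID 2
    ("Phone",  bytes([0x0F, 0x18, 0x64])),              # not a broadcast
]
print(scan_results(ads))  # [('TV One', 1), ('TV Two', 2)]
```

Selecting an encrypted broadcast like “Dave’s Laptop” then additionally requires the Broadcast Code (the “1234” in the demo) to decrypt the audio stream; that key exchange is omitted here.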
So one of the things we hear most about from people with hearing loss is latency: watching your lips while you speak, am I also able to hear you at the exact same time? Because many people have hearing loss in only one ear, or they are hearing ambient sound along with the stream, a live mixed reality. I can dial in any mix I want, so I’ll usually keep a little ambient in so I can talk to the person next to me. Right. But then I’m going to get the speaker audio as well as the streamed audio. And there are your lips. Yes. So you don’t want any echo associated with that, because that kind of echo is what causes people to stop using these types of systems. But because this system is direct-to-ear audio from the transmitter, it’s not going through the phone. The phone is only being used as a control mechanism. The audio is coming directly from the transmitter into your hearing aids, so it keeps the latency very low. Well, I can tell you right now I’m actually getting both: I’m listening to you through the hearing aids, which are obviously a low-latency device, and the Auracast transmission through the buds, which I can hear through the domes of my hearing aids. And I’m watching your lips and hearing your live voice, and the latency is very, very low between all of them. It’s really working well. There’s some, but it’s almost imperceptible. It’s almost imperceptible. This is an unfair test, running the live audio through the hearing aids, because normally I would only use my hearing aids and I would have some ambient mix dialed in. Yes. But even in this less-than-ideal case, the latency is extremely low and doesn’t impede understanding at all. And the latency between your lips and the audio is fantastic. Yeah. So this is something that’s very important, especially to people with hearing loss.
And then it’ll also be important, in the end, to people who use their personal devices to listen to that same audio experience. Yeah. Because if somebody without hearing loss is using a personal device, the latency can still be disturbing. Yes. So especially if you’re running a mixed-reality mode, you still can’t have delay between two different audio sources. Correct. Well, this is really great, and I very much appreciate it. This is my very first Auracast demo. You know, I’ve always been enthusiastic about the possibilities that Auracast brings, and I learned a new one: a way that you can accommodate accessibility in different ways in a single-channel theater setting by having multiple broadcasts. So I very much appreciate it. Thanks a lot. Thank you very much, appreciate it. This has been great. Thank you. Thank you.

Hello! I’m at the GN booth and I’ve got Thomas Olsgaard, principal engineer, with me. For now we want to focus on Auracast and how GN is implementing Auracast in the Nexia hearing aids. Thank you for joining me, Thomas. Thank you for visiting us. You’re quite welcome. So please tell me how you view LE Audio and Auracast and what the advantages are of implementing them in the Nexia hearing aids. Well, basically this is the next generation of Bluetooth, and this is what we have been working on for ten years: getting direct connectivity to any multimedia device. We’ve had it for a long time with your phones, your smartphones and so forth, but now we are taking the next step with your PC, and in the longer term with TVs, and you don’t need any extra devices. On top of that you get the better sound quality we have been waiting on for a while. So now it’s there, now it’s live. Okay, very good. And in this particular case, this is a first for me, because Microsoft just announced their capability to support LE Audio yesterday, as we’re filming this. And because GN has a relationship with Microsoft, they already have a PC running it.
This is the other half of the conversation I had about Intel support for it in their latest processors. But it takes both Intel and Microsoft together, and you’ve been working with Microsoft and now have a fully functional PC. So let’s actually do the demo. Yeah, we have been working very closely with Microsoft. We have the same desire here: to make sure things work smoothly and easily, and do what they need to do for the hearing impaired as well as for the non-hearing-impaired. So basically, this is a standard PC. Out of the box, it’s LE Audio compliant. And with Windows running on top of it, you can directly connect your hearing aids. And you see here, it pops up with the hearing aid symbol. It tells you that it’s connected now, and it has the microphones connected also. So you can stream sound, but you can also send your voice the other way. So you can have bidirectional conversations through the hearing aids’ microphones and receivers. Exactly. So anything you can do as a normal-hearing person, you can also do as a hearing-impaired person now. And just like with ordinary earbuds, you can see the power status of the hearing aids. Exactly. And it’s very easy to pair, like any other device. The icon is different, so you get the recognition that yes, it’s your hearing aids that are now there, and it’s working and alive. Okay, terrific. Okay, so show me the demo here. You’ve got headphones, which are not hearing aids. Yeah. But then you have this gentleman; please introduce me. This is Mr. Chris, and he is the one who has been very patient and has been here throughout the day. And here we have the hearing aids on top of the ears. These are live hearing aids. We have a microphone sitting here at the end of the receivers; it comes out here with cables. And since most people here today are not hearing impaired but have normal hearing, they get the sound through the headset. Okay. So these Nexias are actually paired to the computer with LE Audio. They are paired directly.
Yeah. And then we’re listening to the output of the hearing aids with these headphones. Yes, correct. Okay, and let me show you. So this is streaming music directly, and one of the benefits is, of course, that now you don’t just get the stream... I don’t know if that worked, putting it up against my microphone so people could hear, but we tried it, and now I’m listening to it. And of course I put these on over my hearing aids, but I can hear the treble; the bandwidth is pretty high. So it’s a broad-bandwidth transmission. Yeah, you really have the nice high bandwidth, good sound quality here. And if you go into a Teams call, you also have the advantage that those on the other end hear you directly. So even though you have colleagues around you making their business a little bit noisy and so forth, it’s filtered out, and it’s just your voice that comes through, and it just works out of the box. And that’s what we wanted, what we’ve worked toward for ten years. Now it’s here. That’s terrific. I really appreciate you showing me the demo. You’re welcome. Thank you very much. Thank you. I’m glad you could come along for virtual EUHA with me. Thanks for watching this episode of This Week in Hearing.

**Click here to check out Part 1, where Andrew had the opportunity to engage in discussions with several prominent hearing aid companies, exploring the trajectory of hearing care advancements.

Be sure to subscribe to the TWIH YouTube channel for the latest episodes each week and follow This Week in Hearing on LinkedIn and Twitter.

Prefer to listen on the go? Tune into the TWIH Podcast on your favorite podcast streaming service, including Apple, Spotify, Google and more.

About the Panel

Andrew Bellavia is the Founder of AuraFuturity. He has experience in international sales, marketing, product management, and general management. Audio has been both an abiding interest and a market he served professionally in these roles. Andrew has been deeply embedded in the hearables space since the beginning and is recognized as a thought leader in the convergence of hearables and hearing health. He has been a strong advocate for hearing care innovation and accessibility, work made more personal when he faced his own hearing loss and sought treatment. All these skills and experiences are brought to bear at AuraFuturity, providing go-to-market, branding, and content services to the dynamic and growing hearables and hearing health spaces.

Chuck Sabin is responsible for market development and research at the Bluetooth SIG and leads a wide range of market research, market planning, and business development initiatives. Working with the Bluetooth SIG executive team, Board of Directors, and member companies, Chuck helps to expose insights, trends, and projections that influence and drive the development of strategic business priorities. A proud member of the Bluetooth SIG team for ten years, Chuck has an extensive background in marketing, product management, planning, and business development for mobile wireless networks, enterprise servers, mobile operating systems, mobile devices, and client software and services.

Thomas Olsgaard is a Principal Engineer at GN Group.


***The Bluetooth® word mark and logos are registered trademarks owned by Bluetooth SIG, Inc. The Auracast™ word mark and logos are trademarks owned by Bluetooth SIG, Inc.
