Brian Taylor, AuD

“Signal & Noise” is a bimonthly column by Brian Taylor, AuD.

Almost everyone is interested in maintaining a healthy lifestyle, a big part of which is paying close attention to diet and exercise. Some, armed with formal academic training, quickly take note of the latest research in these areas. Social media, cable news, and the internet make this information readily available in easy-to-digest sound bites.

Science McNuggets

It seems each day there is a new study, often with an attention-grabbing headline, that we can use to make better decisions about our health. And since many of us simply don't have the time to conduct a systematic review of the literature on the various components of our diet and exercise routines, we often rely on these nuggets of research to guide our decisions. The problem with this approach, as a recent New York Times report attests, is that much of the diet and exercise research reported in the popular press, distilled by reporters and bloggers to showcase the most attention-grabbing part of a study, is often misleading, contradictory, or, worse yet, just plain false.

Gina Kolata, a respected science writer for the New York Times, refers to this cacophony of research, which often finds its way into the mainstream consciousness, as "whipsaw literature." Its effects are dizzying: Does coffee cause cancer, or does it stave off heart disease? Do carbohydrates make me fat, or are they an essential part of a well-balanced diet? Without a systematic analysis of the actual studies, it's impossible to answer any of these questions from the whipsaw literature. We are left to yo-yo between the headline-grabbing summaries when the "truth" lies somewhere in between.

Attention-Grabbing but Fuzzy on the Details

Audiology has its own struggles with whipsaw literature, and its effects can be just as dizzying. Over the past three to four years, for example, there has been a sudden rise in the number of studies suggesting a relationship between adult-onset hearing loss and social isolation, increased healthcare expenditures, and various chronic medical conditions, such as dementia and depression. Further, we have seen a number of recent studies examining the relationship between hearing aid intervention and cognitive decline. Many of these studies have been publicized in the industry trade press under headline-grabbing titles, including several posted at HHTM. Indeed, reading the HHTM blog post snippets about hearing aid use and cognition in quick succession can produce a whipsaw effect.

Interestingly, one of the studies hyped in the industry trade press as equating hearing aid use with better cognition, when read more carefully, leaves the skeptical reader with the impression that the relationship may be less impressive than the headlines would lead us to believe. In another, similar study, the results were fuzzier still: the findings indicated that hearing aid use had no significant impact on cognitive performance or on the incidence of cognitive impairment between users and non-users. As you might expect, the latter study got minimal attention in the industry press, while the former enjoyed considerable attention.

In our exuberance to find clear answers that we can apply to our work with patients, it's easy to overinterpret research results. It's a cliché, but the devil really is in the details. In the two studies cited above examining the link between hearing aid use and cognition, the limitations of the research design, as well as a thoughtful explanation of correlation versus causality, were thoroughly addressed by the authors, yet their nuanced analysis was not reported in the industry press.

Clickbait Isn't Causality

As you might expect, these details didn't warrant attention-grabbing headlines. It's probably impossible to generate much interest in a headline that says, "the data presented in this study are correlational, and no conclusion that hearing aid use directly prevents cognitive decline can be supported by this research."

There is not a lot of clickbait in the Discussion section of a paper that addresses alternative explanations for the results of a short-term, observational study. As any well-versed consumer of research understands, without a randomized controlled trial, a plausible explanation for the results is not that hearing aids cause better cognition, but that individuals with better cognition tend to recognize their hearing handicap, see a hearing care professional, and wear their hearing aids more consistently. Yes, there is a linkage, but we don't know its direction.

Deep Dive the Data to Gain Real Insights

A less impressive result than what's touted in the industry press doesn't make these studies any less valuable. After all, each data point, when placed in the proper context, should help us make better, evidence-based decisions for our patients. It's just a matter of being a mindful consumer of the research: understanding the design limitations of a study and putting its results in proper context.

However, given the proclivity of most audiologists, who are trying to make a living and have a normal life, to focus on the attention-grabbing headlines, it's easy to see how the nuances of study findings are lost. The workaday audiologist doesn't have the required 30 to 60 minutes to read a study, parse the details, and think about how the findings can be meaningfully applied to the next patient. Instead, too many of us rely on the whipsaw literature. This can have serious negative consequences. For example, in the past week I have received mailers from two local clinics. The emphatic message in each was, "studies show that if you don't get hearing aids, you will suffer from dementia."

How can we avoid the dizzying effects of whipsaw literature? One way is to stay grounded in the rigors of evidence-based (EB) decision making. This requires knowing how to analyze the details of a study, understand its design limitations, and contextualize the findings within your existing experience with specific patient populations. When studies are properly read and interpreted with a healthy dose of skepticism, they can provide important insights into the clinical decision-making process.

Eleven years ago, the Journal of the American Academy of Audiology published a special issue devoted to evidence-based practice. The lead article in that issue, authored by Robyn Cox, provided an essential framework for bringing evidence-based decision making to life in a clinical practice. Since that issue was published, more meta-analyses, the gold standard in an evidence-based practice paradigm, have appeared in the dozen or so peer-reviewed audiology journals. When properly "consumed" by professionals, they can contribute to better decisions in the clinic.

The problem, in my opinion, is that most audiologists in the field have failed to take a deep dive into EB thinking because, by its very nature, it's time consuming and complex, and its end results are a little unclear. It's easier to take shortcuts by relying on the attention-grabbing headlines of the whipsaw summary articles. We need someone to do the heavy lifting for us, someone like Indiana University pediatrician Aaron Carroll of Healthcare Triage, who asks the tough clinical questions, analyzes the literature, and presents the key findings, all in eight minutes or less. Subscribe to his YouTube channel to see what I mean.

Practice the Profession, Skip the Whipsaw

Living with the fuzziness of science means we must be able to take something fundamentally complex and clearly communicate its implications to patients. By its very nature, this ability is an art form. It requires mentoring and leadership, something missing in a profession that seems overly reliant on even fuzzier product claims and whipsaw literature. As we move into an uncertain future, somewhere in the gray area between art and science is the humble and skeptical audiologist: actively listening to her patient, staying current with the literature, and applying the basic rules of the scientific method. In other words, practicing her profession, one patient at a time.


Brian Taylor, AuD, is Senior Director, Clinical Affairs, for Turtle Beach/Hypersound. He continues to serve as Editor of Audiology Practices, the quarterly publication of the Academy of Doctors of Audiology. During the first fifteen years of his career, he practiced clinical audiology in both medical and retail settings. Since 2005, Dr. Taylor has held a variety of leadership and management positions within the hearing aid industry in both the United States and Europe. He has published over 50 articles and book chapters on topics related to hearing aids, diagnostic audiology, and business management. Brian has authored three textbooks: Fitting and Dispensing Hearing Aids (co-authored with Gus Mueller), Consultative Selling Skills for Audiologists, and Quality in Audiology: Design & Implementation of the Patient Experience. His latest book, Marketing in an Audiology Practice, was published in March 2015. Brian lives in Golden Valley, MN, with his wife and three sons. He can be reached at brian.taylor.aud@gmail.com or brian.taylor@turtlebeach.com.

feature image courtesy of Cambridge in Color

One Response to Living in the Gray Area between Art and Science: Overcoming the Effects of the Whipsaw

  1. Fortunately, there are two sources where audiology-related research is “pre-digested” for busy practitioners:

    • Starkey Evidence Blog, where Jason Galster does a great job reviewing the literature:
    http://blog.starkeypro.com/

    • AudiologyOnline’s Wednesday webinars where leading professors present the research in an easy-to-digest format.