How to Cut through the Hype and Find the Smart Stories

By Karen Rafinski

On the radio or off, the medical beat may be the toughest part of a science journalist's job. The stakes are high: if you hype a questionable treatment, your listener's health could be at risk.

The issues are complicated and new developments are rapidly pouring in, so it's genuinely hard to keep up. Plus, there’s a vast medical public relations machine out there constantly pushing the latest "breakthrough" that isn't. Too often these releases come in on deadline, leaving you little time to decide whether there's a story there or not.

So here are a few tips to help you sort the wheat from the chaff.

Learn the Basics of Study Design

Many stories in the media fail to point out that while a study result is intriguing, it may be far from certain because of the way the study itself was conducted. Most of the time we report these things as fact—then come back two weeks later and report a completely contradictory finding, also as if it were scientific fact. No wonder people get frustrated with us! Avoid this problem by learning enough about how studies are designed and conducted to be able to tell your listeners how solid the research really is. As a bonus, this can help you weed out a lot of nonsense research that probably shouldn't be reported on in the first place. The gold standard is a randomized, double-blind, controlled trial. Other kinds of studies, like retrospective or cohort studies, can provide intriguing hints but not firm evidence. Here are a few links to get you started:

Stay Away from the One-Source Story

A firm rule of journalism is that there should be no one-source stories. Apply that to research as well. Don't base a story on a single study; find out the background and history. Does the study confirm past findings or does it buck the trend? Don't fall into the trap of assuming that because the study is new, its conclusions are more valid than previous studies. If it's an "outlier," be sure to find out why; it could turn out that the study has a flaw or that it's just one of those fluke things that doesn't mean much in the long run. On the other hand, it could be that the researchers took into account a factor that previous studies failed to address—making their contradictory finding more valid. You need to find out which it is, if possible, and share that with your listeners.

A good way to do this sort of backgrounding is to look for scientific reviews on the topic. A good review will not only hunt up all the previous studies but also evaluate them. It will tell you how strong the evidence is on a certain point, whether there are flaws with the existing research, and what questions remain. A well-done review can provide an even higher level of evidence than a randomized controlled trial because it draws on numerous studies. These are also great places to look for stories because a review of the evidence often ends up debunking current medical practice. Get in the habit of checking for reviews when you’re researching a story. A good source is the Cochrane Library, which both produces and disseminates evidence reviews. You can find them at www.cochrane.org. They’re also available through the National Library of Medicine at www.pubmed.gov. The Health Behavior News Service, at www.hbns.org, now provides alerts when new evidence reviews are coming out.

Always Look for the Minority Report

When you’re looking at a study, check the results section and find out how many participants didn't fit the premise. If a large percentage of them don't fit it, you have to question how strong the conclusions really are. A good example is the heart failure drug combination BiDil, which was the controversial first race-specific drug approved by the FDA. It did lower mortality overall in the population of black heart-failure patients that was studied. But that lowered mortality rate came from a small minority of patients who actually responded to the drug and did well on it. Most did not. Also, some patients of other races do respond to the drug. This is one of many reasons for the controversy: critics question how a drug can be linked to race when less than half of that racial population seems to respond to it. When you're interviewing researchers, always ask about this issue. It helps put a finding into context. Your listeners need to know if there's less than a 50 percent chance a treatment will actually work for them.

Distinguish Between Absolute and Relative Risk

Learn the difference between absolute risk and relative risk, and share that with your listeners. Again, this is often a problem with epidemiological and nutritional studies, but it can help you evaluate claims for a new drug as well. Press releases almost always include the relative risk but leave out the absolute risk, because the relative figure makes a finding sound much more important and newsworthy.

To pick an absurd hypothetical example, let's say a study finds that eating potato chips doubles your risk of dying from dandruff. That’s relative risk. It's just a comparison that shows the strength of the link between dandruff and potato chips. On the other hand, your absolute risk of dying from dandruff, let's say, is one in 100 billion. If you double that risk by scarfing down a bag of potato chips every day, you will have raised it to one in 50 billion. Call me a gambler, but if I were a potato chip devotee, I might find that an acceptable chance to take.
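To make that arithmetic concrete, here is a minimal sketch in Python using the made-up dandruff numbers above; the variable names and figures are purely illustrative, not drawn from any real study.

    # Illustrative only: translating a relative risk back into absolute terms,
    # using the article's hypothetical dandruff-and-potato-chips numbers.

    baseline_denominator = 100_000_000_000  # absolute risk: 1 in 100 billion
    relative_risk = 2                       # the study's claimed "doubled risk"

    baseline_risk = 1 / baseline_denominator
    new_risk = baseline_risk * relative_risk      # absolute risk after the exposure
    extra_risk = new_risk - baseline_risk         # the absolute increase

    print(f"Baseline absolute risk: 1 in {round(1 / baseline_risk):,}")
    print(f"Risk after exposure:    1 in {round(1 / new_risk):,}")
    print(f"Added absolute risk:    1 in {round(1 / extra_risk):,}")

    # Baseline absolute risk: 1 in 100,000,000,000
    # Risk after exposure:    1 in 50,000,000,000
    # Added absolute risk:    1 in 100,000,000,000

The relative doubling is real, but spelling out the absolute numbers shows listeners how little the exposure actually changes their odds.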

Understand that every medical treatment is a trade-off, not an absolute good. If a drug is powerful enough to work, it will always cause side effects for at least some people. The same is true of surgery or alternative medical approaches: there is always some risk or downside. Unfortunately, with surgery and alternative medicine, there's often little information about the dangers, though you may find some studies if you search the medical literature. It's important to include those risks in any story. Your listeners will have to make medical decisions in the real world, where both risks and benefits come into play. Help them by spelling out the trade-offs between risks and potential benefits. Drug company press releases and many studies themselves often don't focus on risks or side effects. So you'll have to search the medical literature and talk to experts in the field to sort this out for your listeners.

Beware of "New & Improved"

Don't fall into the "new & improved" trap. Always look for head-to-head comparisons of new treatments with existing treatments. You'll almost never see those in a press release touting a new treatment. Usually, they'll give you a study that compares a new treatment to a sugar pill, which is all that's needed to get FDA approval. But if your listeners are sick, they're unlikely to take a sugar pill; their real choices are between existing treatments and new ones. Such head-to-head studies are few and far between, but you should always look for them in the medical literature and ask experts about them. Evidence reviews will often cover this issue. This is where many reporters went wrong with coverage of Vioxx and other Cox-2 inhibitors, which were touted as being better than existing painkillers, mostly because they were supposed to have fewer side effects. If you had taken the trouble to check at the time, you would have found that these new drugs were no more effective than existing painkillers, and the evidence that they caused fewer side effects was weak. They were also a lot more expensive than older treatments. Later, of course, it turned out they carried significant heart risks, and Vioxx had to be pulled from the market.

Be a Skeptic

Be skeptical of all studies—even the ones that appear in the best peer-reviewed journals, like the Journal of the American Medical Association. First of all, even peer review is sometimes fallible. More to the point, medical science is awash in cash from drug, medical technology, and biotech companies. Too often, studies are funded by companies that have a huge financial stake in the outcome, and they're designed in a way that's most likely to show a favorable result. The more reputable journals will print the source of funding at the bottom of the study and list any other ties researchers have to interested parties. But if it isn't there, ask. Even if the research hasn't been directly funded by an interested party, ask the researchers whether they have other ties to interested parties or their competitors. Very often, they may have been paid many thousands of dollars for consulting work unrelated to the specific study. So be sure to ask about potential conflicts of interest and to look out for ways they may have affected the outcome. It also pays to cultivate relationships with experts in study design and biostatistics so that you can run studies by them. They can alert you to flaws and help you ask the right questions.

Think About Implications

Think about the real-world implications of a new treatment. How expensive is it going to be, and will it be covered by insurance? If not, many people won’t have access to it. If it's a new surgical procedure, how widely available will it be if most surgeons don't have the training to perform it? You could be touting a new treatment that your listeners can't really get. Also, new treatments carry more uncertainty than older treatments with long track records. As in the case of Vioxx, unsuspected side effects can turn up after a drug hits the market, so it’s good practice to warn people of this. With surgery, the first patients to try a new procedure, while a surgeon is still getting up to speed, are at the greatest risk.

Don't Trust Your Local Doctor

I know your editors might want a local angle and that’s fine. But many reporters just call a few local doctors and that’s that. Studies show that many community doctors simply don’t keep up with the medical literature and the latest developments. The longer they’ve been out of medical school, the more out-of-date they tend to be. So if you just consult a friendly local doctor, you can get really bad information. Aside from that, medicine is so specialized that doctors can’t possibly keep up with fields they don’t specialize in. Always try to find one who specializes in the topic at hand and then go beyond that to talk to at least a few of the best national experts on the issue. You may very well find that your local community doctors aren't delivering the current standard of care—which is actually a much better story.

General Resources

  • Background information and links to a lot of great, reliable health information: www.medlineplus.gov. This government-sponsored site includes links to all the federal health agencies, a reliable medical encyclopedia, and drug information.
  • If you’re interested in the Cochrane Library but don’t have access to it at work, you can get a free subscription by joining the Association of Health Care Journalists at www.ahcj.umn.edu. You can also get a free subscription to Health Affairs, a health policy journal.

Karen Rafinski is a freelance medical and science reporter based in Boston. A former Knight Science Journalism Fellow, she’s been writing about these issues for 15 years, including as a staff medical writer for the Miami Herald. Her work has been honored by the American Psychiatric Association, among others.