Last Friday I happened to be driving around and heard a well-done story on Chronic Wasting Disease on Science Friday. I know that many TSW readers are interested in wildlife, so here is the link.
I’ve decided to start a new feature, based on the links for a book I’m writing on the practice of science, and (of course) how it could be improved. This feature will be called “Practice of Science Friday” and will highlight a variety of studies and opinions on the practice of science. The intention is to focus on topics relevant to the forest and climate sciences, but we’ll also talk about the broader contexts of the sciences.
This week’s post is a link to an excellent piece in Vox by a science journalist, Brian Resnick, on the puzzling problem of why science journalists publish exaggerated stories about research (hint: they copy press releases!). But rather than engaging in protracted hand-wringing, he actually has practical suggestions for how scientists can get better stories about their work. I recommend reading the whole thing; there are many interesting examples from the health field.
To be honest, the research on how scientific press releases translate to press coverage doesn’t make my profession look all that good. It suggests that we largely just repeat whatever we’re told from the press releases, for good or for bad. It’s concerning. If we can’t evaluate the claims of press releases, how can we evaluate the merits of studies (which aren’t immune to shoddy methods and overhyped findings themselves)?
The difference between what scientists report in the studies and what journalists report in their articles can look like a game of broken telephone. A study investigating the neural underpinnings of why shopping is joyful gets garbled into a piece about how your brain thinks shopping is as good as sex. A study exploring how dogs intuit human emotions becomes “Our dogs can read our minds.”
Resnick cites a study by a researcher named Chambers:
Still, it comes to a conclusion that ought to be obvious: When universities put forth good, unhyped information, unhyped news follows. And perhaps more importantly, the researchers didn’t find evidence that these more careful press releases get less news coverage. Which should send a message: Universities don’t need to hype findings to get coverage.
“I think what we need is to establish that the responsibility [to be accurate] lies with everyone,” Chambers says. “The responsibility lies with the scientists to ensure that the press release is as accurate as possible. The responsibility lies with the press officer to ensure that they listen to the scientist. And then the science journalists need to be responsible for making sure they read the original article to the best of their ability and deflate exaggeration as much as possible that might persist despite all of our best efforts.”
As a science journalist, I’ve really appreciated it when academics have gone above and beyond to try to communicate tricky findings to the public. One of the best examples of this: Last year, when a huge, easy-to-misunderstand paper linked genetics to educational success, the researchers wrote an enormous FAQ, written in plain language, getting ahead of misconceptions. The FAQ was possibly longer than the scientific paper itself. In my interviews with researchers, I try to ask a version of the question: “What are the wrong conclusions to draw from this study?” Press releases, and other science communications from universities, could do better to include similar disclaimers in plain language.
I like the idea of FAQs published by the scientists.