A University of Michigan study published March 4 strongly encourages media outlets to include the numerical magnitude of the research findings they report on. Without a clear magnitude, people may incorrectly assume that findings are more significant than they truly are.

Audrey Michal, the study’s lead author and U-M psychology assistant research scientist, said the inspiration for the paper came from a student in her lab who found that college students rarely noticed when journalists made false claims. Michal said she worked with her team to further investigate the student’s finding.

“One of the things we focused on a lot in the lab is when people read about correlational studies, and this tendency to just assume that there’s a causality between two variables that are more relationally related,” Michal said. “We’ve also noticed that people don’t think about the magnitude of findings very often, especially if they’re laypeople, they just might not realize that a lot of scientific findings are quite small.”

The research comprised two online studies measuring U.S. adults’ endorsements of expensive interventions described in media reports. The effect sizes, or true correlations, of the interventions were small, large or of unreported magnitude. Participants were just as likely to endorse interventions with unreported effect magnitudes as interventions with large effects, suggesting a significance bias. When effect magnitudes were reported, however, participants adjusted their evaluations accordingly, favoring interventions with large effects.

Adriene Beltz, associate professor of psychology, said most people are inclined to trust their cognitive perceptions and assume that scientific effects are meaningful. However, Beltz said researchers should work to ensure their studies are properly represented in the media.

“(Researchers) have to work hard to make sure that some of the caveats of the work are also presented accurately in articles,” Beltz said. “And as this (study) shows us, one way that we can do that is by providing some numerical information about the size of our effects in ways that make sense in everyday life.”

Pam Davis-Kean, professor of psychology and research professor at the Institute for Social Research, said there is no “right” way to disseminate complex scientific information, but it is important to make the information digestible for people with little to no background in the field.

“(Researchers) just need to make sure that (findings are shared) in a way that they can be consumed well, and when that’s the case, maybe people will make a better decision,” Davis-Kean said. “When we’re trying to put out complex scientific information where it looks pretty easy, like we’ve answered a simple question, but it wasn’t simple to get to the answer to that question, that will be a challenge.”

Beltz said results are also difficult to interpret in media coverage because reports typically offer no baseline comparison.

“Results are inherently based on a comparison to something else — another baseline, some other condition,” Beltz said. “But often when statistical findings are recorded in popular outlets, the reporting is absolute with words like significant, impactful, less than or decrease. We should try to train ourselves to reflexively ask ‘Less than what, significance at what cost?’”

Michal said it is often difficult for members of the general public to determine on their own whether a study’s effects are meaningfully large. She added that it would be helpful for students to begin learning about misinformation in the media so they are better prepared.

“The first step for a person (is) to kind of pause and just ask, ‘Am I going to accept the science at face value or am I going to question it?’” Michal said. “High school, especially, I think would be a great time for students to start learning, maybe as part of general media literacy.”

Davis-Kean said many research papers use terms like “statistically significant,” which may not be meaningful depending on the context. She said it is important for people to accurately understand scientific findings to build trust between laypeople and the research community.

“We don’t want people not to believe scientists,” Davis-Kean said. “Then when there is something real that’s happening … things that have pretty strong science behind it, but now (people are) suspicious and they’re skeptical. And we don’t want that.”
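The gap Davis-Kean describes between “statistically significant” and practically meaningful can be made concrete with a short simulation. The sketch below is a hypothetical illustration, not drawn from the U-M study: with a large enough sample, a negligible difference between two groups can still produce a p-value far below the conventional 0.05 threshold, even though the standardized effect size (Cohen’s d) stays tiny.

```python
import numpy as np
from scipy import stats

# Hypothetical illustration (not from the U-M study): with a very large
# sample, a negligible group difference can still be "statistically
# significant" while the effect size remains small.
rng = np.random.default_rng(0)
n = 100_000
control = rng.normal(loc=0.00, scale=1.0, size=n)
treated = rng.normal(loc=0.02, scale=1.0, size=n)  # tiny true difference

t_stat, p_value = stats.ttest_ind(treated, control)

# Cohen's d: difference in means divided by the pooled standard deviation.
pooled_sd = np.sqrt((control.var(ddof=1) + treated.var(ddof=1)) / 2)
cohens_d = (treated.mean() - control.mean()) / pooled_sd

print(f"p-value:   {p_value:.2g}")   # typically well below 0.05
print(f"Cohen's d: {cohens_d:.3f}")  # around 0.02 -- a very small effect
```

In this toy setup the result would be reported as “significant,” yet the effect is so small it would likely not justify an expensive intervention, which is the distinction the study asks journalists to make explicit.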

Rackham student Wenshan Yu, who studies at the Institute for Social Research, said language is especially important when it comes to scientific reporting.

“There is a distinction between significant findings and practically meaningful findings,” Yu said. “The first one is meaningful for the scientists themselves and also for (their) scientific colleagues. And the latter is important for the paper to be accepted by the journals.”

According to Yu, communicating science is just as important as researching it.

“(This study) reminds me of the importance of remembering to train my skills to communicate the scientific findings to other people, not only my colleagues,” Yu said. “I think those daily communications may be very helpful to exercise the ability to explain in a meaningful way for the laypeople.”