To increase public understanding of the scientific process by improving science reporting in the news, the Annenberg Science Media Monitor analyzes news coverage of widely reported scientific findings and disseminates its analyses to science journalists. “Because media shape our perceptions,” noted Kathleen Hall Jamieson, director of the project and of the Annenberg Public Policy Center, “the scientific community needs to understand the storylines characterizing news accounts both about consequential research and about the scientific community’s responses to concerns about such matters as failures to replicate consequential findings and the rise in the rate of retractions.”
This first report of the Annenberg Science Media Monitor, a project supported by a grant from the Rita Allen Foundation, focuses on the ways in which scientific discovery is portrayed in the news.
As prior research has confirmed, news reports cast most scientific findings as a quest that leads to discovery.1 In this storyline, scientists produce knowledge through an honorable journey. Central to this story structure is a plotline in which a scientist or group of scientists arrives at the featured finding through a search that involves surmounting challenges to attain reliable knowledge, characterized in quest terms such as “advance,” “path-breaking,” “a breakthrough,” or “discovery.” Humankind is the beneficiary. And throughout, science is reliable, scientists are trustworthy, and the scientist’s report is accepted as a faithful account of the search.
Although this plotline is consistent with the process recounted in scholarly publications, lost in this narrative are the complexities that characterize the scientific process, the most reliable form of knowledge generation humans have devised. Instead, the quest/discovery storyline inaccurately implies that the path to scientific knowledge is inevitable. Underplayed in such press accounts are the false starts, disproven findings, and dead ends that characterize the investigative process.
The inaugural report of the Annenberg Science Media Monitor is a content analysis of news reports about 165 scholarly studies in The New York Times, USA Today, The Wall Street Journal, and The Washington Post. According to Altmetric (see the Appendix), these studies were among the most widely covered in the 33 months from April 2015 through December 2017.
1 Jamieson, K. H. (2017). Crisis or self-correction: Rethinking media narratives about the well-being of science. Proceedings of the National Academy of Sciences, 115(11), 2620–2627.
The selected scholarly articles were identified by Altmetric as having been among the most widely covered in the month in which they were published, from April 2015 through December 2017. Relevant news coverage of this research was collected using keyword searches on Factiva Dow Jones and LexisNexis. Articles unavailable through databases were found using the search functions on each outlet’s primary website: nytimes.com, usatoday.com, wsj.com, and washingtonpost.com.
Five coders were trained on a sample of 58 news articles using a coding instrument of 11 items. In six rounds of training, nine of those items (listed below) were coded with a Krippendorff’s alpha above 0.7.
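The reliability statistic used above, Krippendorff’s alpha for nominal data, can be computed from a coincidence matrix of within-unit coder pairings. The sketch below is a minimal illustration of that calculation, assuming complete data with two or more codings per unit; the function name and data layout are assumptions, not the project’s actual reliability software:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    `units` is a list of lists: each inner list holds the codes
    assigned to one article by the coders (missing codes omitted).
    """
    coincidence = Counter()          # o[(c, k)]: coincidence matrix
    n = 0                            # total number of pairable values
    for values in units:
        m = len(values)
        if m < 2:
            continue                 # units coded once cannot be paired
        n += m
        # Each ordered pair of codes within a unit adds 1/(m - 1)
        for c, k in permutations(values, 2):
            coincidence[(c, k)] += 1 / (m - 1)
    totals = Counter()               # marginal frequency n_c per category
    for (c, _), weight in coincidence.items():
        totals[c] += weight
    # Observed disagreement: off-diagonal mass of the coincidence matrix
    d_o = sum(w for (c, k), w in coincidence.items() if c != k) / n
    # Expected disagreement under chance pairing of the marginals
    d_e = sum(totals[c] * totals[k]
              for c in totals for k in totals if c != k) / (n * (n - 1))
    return 1 - d_o / d_e
```

Perfect agreement yields an alpha of 1; agreement at chance level yields 0, so the report’s 0.7 threshold indicates substantially better-than-chance coder agreement.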
Here we report the Science Media Monitor’s analysis of 281 news articles from The New York Times (88), USA Today (55), The Wall Street Journal (20), and The Washington Post (118) from April 2015 through December 2017.
| Item | Number of articles | Percentage in which item appears | Alpha (intercoder agreement) |
| --- | --- | --- | --- |
| Mentions of newsworthy findings | 277 | 98.58 | 1 |
| Use of words or phrases conveying that the finding is a discovery in the headline or first three paragraphs of an article | 60 | 21.35 | 0.73 |
| Use of discovery words anywhere in an article | 131 | 46.62 | 0.77 |
| The process leading to a finding | 210 | 74.73 | 0.73 |
| The significance of a finding | 272 | 96.80 | 1 |
| Whether authorities such as scientists or institutions involved in the finding were mentioned | 273 | 97.15 | 1 |
| Disagreement among scientists | 50 | 17.79 | 0.71 |
| Calls for continued research related to a finding | 138 | 49.11 | 0.72 |
| False starts in the process of acquiring new knowledge | 13 | 4.63 | 0.73 |
Words or phrases indicating that the finding is a discovery
Coders identified whether the words “advance,” “breakthrough,” “discovery,” “path-breaking,” or “paradigm shifting” appeared in the headline or first three paragraphs of an article, and whether the same words or synonyms appeared anywhere in the article, including terms such as “for the first time,” “groundbreaking,” and “unprecedented,” and superlatives like “earliest-known.”
- Overall, 21% of the articles used specific discovery words within the headline or first three paragraphs;
- 47% of the articles used synonyms for “discovery,” “breakthrough,” or “advance” anywhere in the article;
- The Washington Post most frequently characterized findings as discoveries, doing so in 53% of its articles (amounting to 22% of the total articles);
- Wall Street Journal articles most frequently used discovery words within the headline or first three paragraphs, doing so 40% of the time.
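The keyword rule the coders applied can be approximated in a few lines. This is an illustrative sketch, not the coders’ actual instrument; the function name is hypothetical, and plain substring matching will also catch inflected forms such as “advanced”:

```python
# The five discovery terms coded in the headline / first three paragraphs.
DISCOVERY_WORDS = ("advance", "breakthrough", "discovery",
                   "path-breaking", "paradigm shifting")

def lead_mentions_discovery(headline, paragraphs):
    """Return True if any discovery word appears in the headline
    or the first three paragraphs (case-insensitive)."""
    lead = " ".join([headline] + list(paragraphs[:3])).lower()
    return any(word in lead for word in DISCOVERY_WORDS)
```

The full-article check reported above works the same way but scans all paragraphs and a wider synonym list (“for the first time,” “groundbreaking,” “unprecedented,” and superlatives such as “earliest-known”).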
Process leading to a finding
Content captured by this item includes: explanations of the scope of a study, for instance, the number of participants, the locations in which the study occurred, and the questions addressed in the study; descriptions of the duration of a study, for example, the number of years in a longitudinal study; and mentions of tools and materials used, such as the gene editing tool CRISPR-Cas9.
- Overall, 75% of articles described the process of inquiry that led to the reported finding;
- The Wall Street Journal and the Washington Post most frequently described the process of inquiry, each doing so in 80% of its articles (6% and 33% of the total articles, respectively).
Authorities involved in a finding
Scientists, academic and research institutions, journals, and research companies were mentioned in 97% of the articles.
Significance of finding
Statements explaining why a finding is important were found in 97% of the articles.
Disagreement among scientists
Content coded as disagreement includes criticism of a finding by individual scientists as well as mentions of debate among scientists generally.
- 18% of articles in our sample described disagreement among scientists;
- The Wall Street Journal most frequently mentioned disagreement, doing so in 25% of articles published by the outlet;
- The New York Times and USA Today were equally likely to mention disagreement, doing so in 18% of articles published by each outlet;
- The Washington Post accounted for the greatest share of articles mentioning disagreement: its articles mentioning disagreement made up 7% of all articles analyzed.
Calls for further research
For this item, coders identified coverage that mentioned further questions relevant to the finding that the research could not answer, comments on the limitations of a finding, and direct calls for more research.
- Overall, 49% of articles noted that more research was needed in the context of a new finding;
- The Wall Street Journal was most likely to mention further research, doing so in 60% of articles published by the outlet, while the Washington Post was second most likely, doing so in 51% of articles.
False starts in the process of acquiring new knowledge
For our analysis of false starts, coders identified mentions of barriers to successful scientific exploration, specific failures, or past exploration resulting in dead ends.
- Overall, 5% of articles in the sample mentioned false starts;
- The New York Times most frequently mentioned false starts, doing so in 6% of its articles (2% of total articles).
Altmetrics serve as “a record of attention,” “a measure of dissemination,” and “an indicator of influence and impact.” (See altmetric.com/about-altmetrics/what-are-altmetrics/.) Altmetric’s system uses three items to rank articles: the output (the article or dataset itself); the identifier (the DOI, arXiv ID, SSRN ID, ISBN, or comparable unique identifier); and the number of mentions in a tracked source (mainstream media, public policy documents, blogs, citations, social media, multimedia, or other sources). Altmetric ranks articles by counting links made to the identifier of a specific output in tracked sources, as well as by mining text for mentions of author names, journal titles, and publication time frames.
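Identifier-based counting of this kind can be illustrated with a toy sketch. The regex, function name, and example DOI below are simplifications assumed for illustration only; Altmetric’s production system also mines author names, journal titles, and publication dates:

```python
import re

# Loose pattern for DOI-style identifiers (illustrative, not exhaustive).
DOI_PATTERN = re.compile(r'10\.\d{4,9}/[^\s"<>]+')

def count_mentions(documents):
    """Count how often each DOI-style identifier appears across
    a collection of tracked-source texts."""
    counts = {}
    for text in documents:
        for match in DOI_PATTERN.findall(text):
            doi = match.rstrip('.,;)')   # trim trailing punctuation
            counts[doi] = counts.get(doi, 0) + 1
    return counts
```

Two tracked-source texts linking the same identifier would thus contribute two mentions to that output’s tally.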
When tracking mainstream media, Altmetric collects data from real-time RSS feeds and APIs. As a result, some relevant news articles are excluded from its records. Our analysis includes relevant coverage, gathered manually, that did not appear on Altmetric.
The 165 research studies whose coverage was examined in this report were highlighted on Altmetric’s monthly High Five blog during the period covered.
The Annenberg Science Media Monitor is supported by a grant from the Rita Allen Foundation.