In 2020, a study (Bloom et al., 2020) highlighted a significant decline in the productivity of scientific research. Using sector-specific and aggregate data from the United States, the authors showed that the productivity of US researchers has fallen by 5.3% per year in aggregate data after 1970. This research sparked discussion and debate among growth experts, with a broad consensus on the main findings, as noted by Clancy (2023). Supporting evidence from the citation-based studies of Park et al. (2023) and Chu and Evans (2021) corroborates the decline in productivity and reveals a substantial decrease in the rate at which breakthrough innovations have emerged over the past fifty years or so.
Concurrently, there has been a notable rise in the volume of research papers published across diverse scientific fields, what Chu and Evans (2021) refer to as a “deluge” of publications. Landhuis (2016) estimates the annual growth rate of published scientific papers at around 8–9% over the past several decades. In 2020, the Scopus database recorded 4.6 million publications in the journals it covers, up from 1.3 million in 2000. Furthermore, Nicolaisen and Frandsen (2019) observe that a significant proportion of these papers (approximately 65%) receive no citations in the 20 years following their publication.
What lies behind the declining number of breakthrough ideas arriving simultaneously with a tsunami of publications?
According to Bhattacharya and Packalen (2020), the existing reward system in science, with its heavy emphasis on publication counting, can explain both the torrent of publications and the declining generation of major innovations. The system pays generously for the observable outcomes of research, publications and citations, but does not adequately reward hard-to-observe exploratory research.
Gold (2021) argues that a reduced willingness to take risks, against a background of increasingly expensive research, has prompted both academic organizations and private firms to tune incentives toward low-impact but predictable research outcomes. These incentives for safe results “push away scholars from risk-taking, and hence, breakthroughs.”
Chu and Evans (2021) argue that behind this ever-increasing number of published papers lies the “more is better” vision of scientific progress, which led to the generalization of quantitative measures of scientific performance and to incentives to publish no matter what. Citation databases, available from the 1960s and extended in the 1990s with the advent of the internet, made it possible to add an assessment of “quality” without challenging the quantitative nature of the measurement. They further claim that the ever-increasing flow of papers obscures the emergence of valuable new ideas, which are lost in the mass of irrelevant manuscripts.
In a recent paper in Research Policy (May 2024), Damien Besancenot (Université de Paris) and Radu Vranceanu (ESSEC) offer a signaling model as a complementary explanation for the declining number of breakthrough ideas combined with ever-rising publication numbers. Their analysis builds on the assumption that, because of its disruptive nature, a path-breaking innovation is generally recognized only with a significant lag. By the time a major innovation is acknowledged as such, its authors have likely moved to another organization, which makes it almost impossible to contract on breakthrough innovation. Besancenot and Vranceanu back this assumption with two contemporary examples, analyzed in detail: mRNA technology and large language models, both of which were recognized as major innovations only after a lag of several years.
If managers of research institutions can observe publications but cannot observe breakthrough innovations, low-skilled scholars may reduce their investment in exploratory research and instead spend their time publishing as many papers as high-skilled scholars, which would allow them to claim the same level of compensation. In response to this imitation, high-skilled scholars would publish even more, up to the point where low-skilled scholars abandon the imitation strategy.
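The logic is that of a standard costly-signaling (Spence-type) model. As a minimal sketch under generic assumptions, and not the authors’ exact formulation, separation requires a publication count n* that is too costly for low-skilled scholars to imitate yet still worthwhile for high-skilled scholars; the wages w_H > w_L attached to perceived skill and the per-paper effort costs c_L > c_H below are illustrative symbols, not taken from the paper:

\[
w_H - c_L\, n^* \;\le\; w_L \qquad \text{(low-skilled scholars prefer not to imitate)}
\]
\[
w_H - c_H\, n^* \;\ge\; w_L \qquad \text{(high-skilled scholars still gain from signaling)}
\]

Both conditions can hold at once only because c_L > c_H, which places n* between (w_H − w_L)/c_L and (w_H − w_L)/c_H: high-skilled scholars “overpublish” up to n* precisely in order to deter imitation.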
The outlook for scientific research implied by this signaling model appears discouraging. In the most likely equilibrium, in which low-skilled researchers give up emulating the publication strategy of their high-skilled counterparts, raising the relative rewards for major innovation would not necessarily increase the frequency of such innovations. Increasing the overall resources allocated to researchers might have a positive impact, but its efficacy could be diminished, as part of these resources might be diverted toward increasing the number of publications rather than fostering major innovations.
While experts continue to discuss the reasons behind the decreasing rate of groundbreaking research, one certainty remains: tackling some of humanity's most fundamental challenges, including resource depletion, climate change, widespread diseases and pandemics, and persistent poverty, will necessitate a significant infusion of innovation.
References
Besancenot, Damien and Radu Vranceanu, “Reluctance to pursue breakthrough research: A signaling explanation,” Research Policy, 2024, 53 (4).
Bhattacharya, Jay and Mikko Packalen, “Stagnation and scientific incentives,” Technical Report 26752, National Bureau of Economic Research 2020.
Bloom, Nicholas, Charles I. Jones, John Van Reenen, and Michael Webb, “Are ideas getting harder to find?,” American Economic Review, 2020, 110 (4), 1104–1144.
Chu, Johan S.G. and James A. Evans, “Slowed canonical progress in large fields of science,” Proceedings of the National Academy of Sciences, 2021, 118 (41), e2021636118.
Clancy, Matt, “Are ideas getting harder to find? A short review of the evidence,” in “Artificial Intelligence in Science. Challenges, Opportunities and the Future of Research,” OECD, 2023.
Gold, E. Richard, “The fall of the innovation empire and its possible rise through open science,” Research Policy, 2021, 50 (5), 104226.
Landhuis, Esther, “Scientific literature: Information overload,” Nature, 2016, 535 (7612), 457–458.
Nicolaisen, Jeppe and Tove Faber Frandsen, “Zero impact: a large-scale study of uncitedness,” Scientometrics, 2019, 119 (2), 1227–1254.
Park, Michael, Erin Leahey, and Russell J. Funk, “Papers and patents are becoming less disruptive over time,” Nature, 2023, 613 (7942), 138–144.