The corrupting effects of academic citation metrics

August 21, 2025

"When a measure becomes a target, it ceases to be a good measure." -- Goodhart's Law

The Commissioner of the United States Bureau of Labor Statistics was recently fired because the president was unhappy about weak job statistics [1]. For many, this event brought to mind past crises in which governments manipulated economic data. Greece once used complex financial deals to artificially suppress its reported budget deficit while it was working to adopt the euro. Argentina systematically underreported its inflation. Both deceptions held for a number of years, but when it became clear that the information lacked integrity, economic catastrophe followed. When the numbers we rely on to make critical decisions can't be trusted, the systems they support will crumble.

A similar crisis is unfolding in the halls of academia, where efforts to manipulate publication metrics risk undermining publishing and scholarship.

Academic publication metrics arose in the 1950s to address a real need: how to identify ideas that had widespread impact on subsequent scholarship [2]. Citation is an important intellectual tool, and it makes sense to identify highly cited work. Counting citations to a paper then evolved into counting how often a particular author or journal was cited. These metrics provided a quantitative way to evaluate scholars and institutions, countering the nepotism and bias of the earlier, purely qualitative system. Unfortunately, the new system made the metrics a target rather than an indicator, and the 2000s saw scholars increasingly being hired and promoted based on the number of papers they wrote, their citation counts, and whether they published in highly cited journals [3]. Goodhart's Law had arrived as researchers responded to these new incentives.

Today, these metrics are so ingrained as to be a natural part of the academic world. Faculty bios make statements like "Professor X has published more than 200 papers and has over 10,000 citations," rather than focusing on their most important insights or innovations. CVs sometimes categorize publication lists according to the associated Journal Impact Factors (a metric of citations to articles in the journal). Journal webpages highlight their Impact Factor more than their award-winning or most-read papers. These practices are not universal--I'm fortunate that my own institution explicitly avoids citation counting during hiring and promotion--but they are unfortunately widespread.
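To make the Journal Impact Factor concrete, here is a minimal sketch of the standard two-year calculation: citations received in a given year to items the journal published in the two preceding years, divided by the number of citable items published in those years. The journal and its numbers below are entirely hypothetical.

```python
def journal_impact_factor(citations_to_year, citable_items_by_year, year):
    """Two-year Journal Impact Factor for `year`:
    citations received in `year` to items published in the two prior
    years, divided by the citable items published in those years."""
    prior_years = (year - 1, year - 2)
    citations = sum(citations_to_year.get(y, 0) for y in prior_years)
    items = sum(citable_items_by_year.get(y, 0) for y in prior_years)
    return citations / items if items else 0.0

# Hypothetical journal: in 2025 it receives 150 citations to its 2024
# papers and 90 to its 2023 papers, having published 60 citable items
# in each of those years.
jif = journal_impact_factor({2024: 150, 2023: 90},
                            {2024: 60, 2023: 60}, 2025)
print(round(jif, 2))  # (150 + 90) / (60 + 60) = 2.0
```

Note that the average is taken over the journal's whole output, which is part of why DORA (discussed below) objects to using it to judge any individual article: a journal's mean citation rate says little about any one paper's contribution.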

Given the prominence of these metrics, many methods have arisen to manipulate them. Journals may coerce self-citations and preferentially publish 'review papers' that attract more citations than typical research papers [4]. Paper reviewers may (anonymously) suggest citations to their own work. Authors may excessively cite their own prior papers. Authors can also add each other as coauthors to papers despite minimal involvement, or can break a piece of research into multiple incremental papers rather than a single big one, increasing publication counts and opportunities for citations. Most abhorrently, there exist tools and companies to generate completely fake papers, boosting authors' publication and citation numbers at the cost of polluting academia with intellectual garbage. Much more benignly, scholars also make decisions based on metrics rather than ideas (e.g., publishing in a high-impact-factor journal rather than the most topically appropriate one, or choosing a research topic based on how many citations it might attract), subtly shifting societal efforts towards less valuable outcomes.

The effect of this manipulation is that researchers become cynical, believing that professional success depends upon playing games rather than doing the best research. The metrics themselves lose value as they cease to measure real contributions.

At present, the system is creaking along. But just like Greece and Argentina, academia's reckoning is coming. With enough manipulation, the metrics will have to be discarded, and what will happen to everyone who built their careers on suddenly devalued numbers? New statistical and artificial intelligence tools are beginning to reveal the bad actors in this game, just as forensic image analysis has revealed past abuses in the biomedical sciences [5].

So what is to be done?

At the system level, the current metrics must be recognized as corrupting rather than advancing science. The Declaration on Research Assessment (DORA) urges the elimination of Journal Impact Factors as indicators of individual article quality or individual scientist contributions [6]. New metrics may be less sensitive to manipulation, but they will inevitably remain subject to Goodhart's Law if they are used to reward scholars.

As an individual, fight the temptation to participate in metric games. Cite responsibly: be cautious about self-citation, and cite the best and most relevant work. Cite foundational papers rather than summary reviews when possible, so the original innovator gets credit. Focus on producing a few great publications rather than slicing your work into multiple incremental papers. Publish in the most relevant journals rather than those with the highest impact factors. And don't work with researchers or journals who manipulate these statistics. When promoting yourself, focus on your ideas and their impact, rather than your citation statistics.

Senior scholars with credibility and job security have the most ability to influence this system. Use your weight to promote true scholarship. If you are involved in hiring or promoting researchers, focus on their ideas and not their metrics. One effective approach is to evaluate their self-selected top three or four contributions. This focuses evaluations on ideas, and rewards researchers for focusing on their best work rather than producing high volumes of incremental papers. If you are a journal editor, avoid manipulative practices such as coercive citation. If you are on an editorial board, complain about such practices when you see them and resign if they aren't corrected. Read the Declaration on Research Assessment for more ideas of concrete actions, and sign it to show your support [6].

I'm uncertain whether this system will change through orderly reform or crisis. I'm uncertain when change will arrive. But someday we will look back at today's citation obsession the way we now view Greece's financial deceptions: a cautionary tale of what happens when numbers replace reality.

The current system has problems, but it can still be navigated with integrity. You can chase metrics or produce work that matters. Twenty years from now, citation counts will be forgotten. Make sure your contributions aren't.
