Exclusive: These universities have the most retracted scientific articles

A first-of-its-kind analysis by Nature reveals which institutions are retraction hotspots
Two days before the end of 2021, administrators at Jining First People’s Hospital in Shandong, China, issued a highly unusual report. The hospital announced that it had disciplined some 35 researchers who had been linked to fraud in publications, such as fabricating data. These sanctions were part of a countrywide crackdown motivated by concerns about a flood of sham medical papers emanating from hospitals.
The problem was that some young physicians at hospitals had purchased fake manuscripts from paper mills: companies that churn out fraudulent scientific reports to order. These doctors were under pressure because they were required to publish papers to get jobs or earn promotions, says integrity sleuth Elisabeth Bik in California. Sleuths such as Bik soon began spotting signs of this problem, identifying duplicated images in large numbers of papers. They publicized the issue and a wave of retractions followed.
This surge can now be seen in a first-of-its-kind analysis of institutional retraction rates around the globe over the past decade, for which Nature’s news team used figures supplied by three private research-integrity and analytics firms. Jining First People’s Hospital tops the charts, with more than 5% of its total output from 2014 to 2024 retracted — more than 100 papers (see ‘Highest retraction rates’). That proportion is an order of magnitude higher than China’s retraction rate, and 50 times the global average. Depending on how one counts, the hospital could be the institution with the world’s highest retraction rate.

Source: Nature analysis
Many other Chinese hospitals are retraction hotspots. But universities and institutes in China, Saudi Arabia, India, Pakistan and Ethiopia feature in the data as well. Retractions can be for honest mistakes and administrative errors, but evidence suggests the majority of cases in these data are related to misconduct.
This type of institution-level analysis has never been done before because it is particularly challenging: retractions data are messy, there are errors in assigning author addresses to institutions, and creators of the underlying data sets make different choices about how to count institutions and published papers. So the results from the three independent analyses don’t always agree, and should be seen as preliminary.
Despite those challenges, common trends emerge. They show that there is significant variation within countries: some institutions seem to be retraction hotspots, whereas others are relatively unscathed. In certain cases, that might just be because some institutions have evaded sleuths’ attention, but it might also be related to specific research environments that seem to be associated with high proportions of erroneous or fraudulent work.
“It’s tempting to think about whether differences are related to varying incentives for researchers in different institutions,” says Ivan Oransky, co-founder of the website Retraction Watch, which maintains a public database of retractions on which the companies contacted by Nature partly depend.
Nature’s analysis found that, for the majority of institutions with high retraction rates, retractions are spread across many authors. This hints at a problem with research-integrity culture rather than a few rogue researchers, says Dorothy Bishop, a retired neuropsychologist at the University of Oxford, UK, who investigates cases of research misconduct. This kind of analysis “could lead to some positive action” if institutions respond by examining what is leading to the patterns, she adds.
Integrity signals
The retraction data provided to Nature come from research-integrity tools that technology firms have launched over the past two years, which aim to help publishers stem a surge in fake and significantly flawed research. Among a flurry of these software products are Argos, from Scitility in Sparks, Nevada; Signals, from the firm Research Signals in London; and Dimensions Author Check, from the London-based company Digital Science. (The latter is part of Holtzbrinck, the majority shareholder in Nature’s publisher, Springer Nature. Nature’s news and features team is editorially independent of its publisher.) These three firms provided their institution-based retractions data to Nature; they also looked at countries and journals associated with retractions.
Their tools aim to alert users to potential ‘red flags’ in research articles or submitted manuscripts, such as authors who might have high numbers of misconduct-associated retractions. To build them, the firms have created internal data sets of retracted papers and their affiliations. These are largely based on Retraction Watch’s database; launched in 2018, it was acquired in 2023 for public distribution by Crossref, a US non-profit organization that indexes publishing data. This made it easier for others to use and analyse the information.
The firms Nature contacted have built on this data set by omitting retracted articles that lack DOIs (digital object identifiers) and adding retraction information gleaned from other online sources, such as Crossref, the life-sciences index PubMed and journal websites. The companies shared their internal work on the condition that their data sets would not be published in full, in part because they use them for the software tools that they sell to scientific publishers and institutions.
One firm, Scitility, says it will make its institution-level figures public later this year. “We think scientific publishing will be helped if we create transparency around this,” says the firm’s co-founder Jan-Erik de Boer, a former chief information officer at Springer Nature who is based in Roosendaal, the Netherlands.
The three firms’ numbers of retracted articles are some 6–15% greater (over the past decade) than in the Retraction Watch data set they build on. But in some cases, articles can be recorded erroneously in Crossref and other online sources1, so the data might not be 100% accurate, cautions Jodi Schneider, an information-sciences researcher at the University of Illinois Urbana-Champaign who studies retractions.
Retraction Watch staff have also manually filled in the reasons — as far as they can determine them — for each recorded retraction, indicating that the majority are due to misconduct. Those data aren’t available for the records that the firms have added.
Retraction patterns
Data on retractions show that they are rare events. Out of 50 million or more articles published over the past decade, for instance, a mere 40,000 or so (fewer than 0.1%) have been retracted, according to the firms’ data sets. But the rise in retraction notices (by which journals announce that a paper is being retracted) is outstripping the growth of published papers — partly because of the rise of paper mills and the growing number of sleuths who spot problems with published articles.

Source: Nature analysis
In 2023, as Nature reported, more than 10,000 retraction notices were issued (Nature 624, 479–481; 2023). Most of these were from Hindawi, a now-closed London subsidiary of the publisher Wiley, which found that Hindawi journals were affected by a blizzard of peer-review fraud and sham papers. (Wiley told Nature at the time that it had scaled up its research-integrity teams, put in place more rigorous processes to oversee manuscripts and removed “hundreds” of bad actors, including some guest editors, from its systems). During the past decade, the annual retraction rate — the proportion of published articles in a particular year that have been retracted — has trebled (although fewer retraction notices were issued overall in 2024 than in 2022 or 2023; see ‘A tide of retractions’). The proportion reached around 0.2% for papers published in 2022, and will rise as more articles are withdrawn (see ‘Rates on the rise’).

Source: Nature analysis
Over 2014–24, almost 60% of retracted articles (more than 20,000 of them) have authors with affiliations in China. Overall, about 0.3% of that country’s articles have been retracted so far — three times the global average.
But this is surpassed by retraction rates in Ethiopia and Saudi Arabia, and exceeded or rivalled (depending on the data set) by those in Iraq and Pakistan. (In the Retraction Watch data, Russia also rivals Iraq’s rates, but because many Russian retractions don’t have DOIs and aren’t in global databases, the firms omitted them from their analyses.) Meanwhile, countries such as the United States and the United Kingdom have rates of around 0.04%, much lower than the global average of 0.1%, and many countries have even lower rates (see ‘Retraction rates by country’).

Source: Nature analysis
The rates quoted depend on how analysts count a country’s overall number of research articles: the denominator. For instance, Signals and Argos use data from OpenAlex, a public bibliometric database, to tally published articles. But this data set’s coverage is generally larger than that of the Dimensions database, which Digital Science curates. Therefore, retraction rates are typically lower in the data from Signals and Argos.
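The effect of the denominator choice can be shown with a toy calculation. All counts below are invented for illustration; only the database names and the direction of the difference come from the article.

```python
# Toy illustration: the same number of retractions yields a different
# retraction rate depending on which bibliometric database supplies the
# denominator (total published articles). Counts here are hypothetical.
retracted = 150

# Hypothetical article tallies for one country over a decade
totals = {
    "OpenAlex (larger coverage; used by Signals and Argos)": 60_000,
    "Dimensions (smaller coverage; curated by Digital Science)": 50_000,
}

for source, published in totals.items():
    rate = retracted / published
    print(f"{source}: {rate:.3%}")
```

With the same numerator, the larger OpenAlex denominator gives 0.250% versus 0.300% from the smaller Dimensions tally, which is why rates computed from Signals and Argos data tend to come out lower.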
Institutional hotspots
Analysing institutions is an even thornier task, because there are both data errors and different approaches in the underlying databases, Nature’s analysis found. To map institutional affiliations, Dimensions uses a private Global Research Identifier Database (GRID), whereas OpenAlex uses the public Research Organization Registry (ROR). Both contain quirks — affiliations can be missing or wrongly attributed (a particular issue for some smaller institutes) or the database curators might simply have made different choices about how to assign an affiliation. Accordingly, the firms’ analyses vary.
Still, a general picture emerges of an institutional retraction-rate leader board that is dominated by small Chinese hospitals and medical universities. In the Dimensions data set, over 2014–24, around 70% of the 136 institutions with a retraction rate above 1% are from China, and about 60% of those are hospitals or medical universities. The Argos data set, with 186 institutions with rates above 1%, gives a similar split. Signals shared only its top-ten lists, again dominated by Chinese institutions. (Nature reached out to all of the institutions mentioned in this article for comment, and did not hear back before publication.)
China’s science ministry and other government bodies have tried to crack down on fraud and perverse publishing incentives. In guidelines issued in 2020 and 2021, they emphasized that “research publications should not be a mandatory requirement for professional advancement,” says Shaoxiong Brian Xu at Huanggang Normal University in China, who studies retraction notices. The government agencies have also announced other sweeping investigations into research fraud. But, as Xu points out in an analysis published this January, China’s annual retraction rates have risen2. (Jining First People’s Hospital, however, has seen only three retractions of papers published in 2022, and none since.)
Outside China, institutions that are notably high up on several leader boards include Ghazi University in Dera Ghazi Khan, Pakistan; Addis Ababa University in Ethiopia; and India’s KPR Institute of Engineering and Technology in Coimbatore. In KPR’s case, according to the Retraction Watch data set, almost all of the retractions relate to a decision by IOP Publishing (the publishing arm of the UK Institute of Physics), which in 2022 retracted 350 papers from 2 volumes of conference proceedings released the year before. The publisher cited misconduct as the reason, including “systematic manipulation of the publication process and considerable citation manipulation”. More than 100 of those papers had authors who were affiliated with KPR.
To avoid counting the smallest institutes, for which high rates reflect just a few retractions, Nature filtered out universities that produce fewer than 100 articles a year. This means, for instance, that Showa University Hospital in Tokyo — where the work of one researcher generated 124 retractions — is excluded from the list; otherwise, it would have notched up a retraction rate of almost 10%.
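The filtering step described above can be sketched as a simple threshold. The 100-articles-per-year cutoff is taken from Nature's method; the institution records below are invented stand-ins.

```python
# Drop institutions averaging fewer than 100 articles a year over the
# study window, so that a handful of retractions at a tiny institution
# cannot dominate the rate rankings. Records here are hypothetical.
YEARS = 11          # 2014-24 inclusive
MIN_PER_YEAR = 100  # cutoff used in Nature's analysis

institutions = [
    {"name": "Tiny hospital", "published": 900, "retracted": 85},
    {"name": "Large university", "published": 40_000, "retracted": 90},
]

kept = [i for i in institutions if i["published"] / YEARS >= MIN_PER_YEAR]
for inst in kept:
    rate = inst["retracted"] / inst["published"]
    print(f'{inst["name"]}: {rate:.2%}')
```

Here the tiny hospital averages about 82 papers a year and is excluded, despite a retraction rate near 10%; only the large university survives the filter.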
Still, the universities that top the lists tend to be relatively small. In Ghazi University’s case, around half of its total retracted papers are by just four authors.
Some sleuths have speculated that authors from Ethiopia and some other African countries that publish relatively few articles might sometimes have been added to paper-mill products that originated elsewhere, to take advantage of Hindawi’s waived open-access fee for scholars from low- or lower-middle-income countries. A spokesperson for Wiley (the owner of Hindawi) did not give details, but commented: “We are aware of authorship for sale and waiver manipulation schemes deployed by paper mills,” adding that the publisher has strengthened its internal checks.
Most retractions
A different way to view the data is to pick out institutions with the largest overall numbers of retractions. Most are Chinese universities, but King Saud University in Riyadh, Saudi Arabia, makes the top three. The institutions with the largest number of retracted papers tend not to have the highest retraction rates, because they are large and so publish a lot. The lists of highest-overall retractions also differ because for some Chinese universities — such as Jilin University — some analysts counted retractions from hospitals associated with the university, too.
One clear change on which all the analyses agree is that in the past half-decade, retractions from institutions in Saudi Arabia and India have come to the fore — largely because of paper-mill activity from Hindawi journals (see ‘Institutions with most retractions’).

Source: Nature analysis
Another way to examine the overall data is through scatter charts, which highlight that there is significant variation within countries. These show, for instance, that Chinese universities tend to have a lower retraction rate than do Chinese hospitals (see ‘Retractions across research institutions’).

Source: Nature analysis
Pressures to publish
In India, it’s notable that the institutes with the highest retraction rates are almost all private colleges and institutes in the state of Tamil Nadu, an education hub, says Achal Agrawal, a freelance data scientist in Raipur. That accords with his own analyses of retractions in India, which he has been posting online over the past year after founding India Research Watch (IRW), an online group of researchers and students that is trying to highlight plagiarism and other publication misconduct in the country. Working with IRW, Agrawal has also created a dashboard to visualize Retraction Watch’s data for other countries.
Researchers in public universities and government institutes in India face fewer pressures to publish than do those in private universities and colleges, says Agrawal. Private institutions, he says, push students and researchers to publish many articles, and in some cases pay bonuses for papers published.
Nature 638, 596-599 (2025)
doi: https://doi.org/10.1038/d41586-025-00455-y
This story originally appeared on Nature. Author: Richard Van Noorden