Retraction – The Publication Plan for everyone interested in medical writing, the development of medical publications, and publication planning
https://thepublicationplan.com
A central online news resource for professionals involved in the development of medical publications, publication planning, and medical writing.

When politics meets publishing: researchers fight back
https://thepublicationplan.com/2025/12/17/when-politics-meets-publishing-researchers-fight-back/
Wed, 17 Dec 2025

KEY TAKEAWAYS

  • US government executive orders targeting EDI programmes are prompting federally funded journals to censor demographic data and equity-focused language.
  • Authors and editors are pushing back to ensure data are made available and to maintain the integrity of the scientific record.

Following US government executive orders to end federal equity, diversity, and inclusion (EDI) programmes and to only recognise two sexes, The BMJ has emphasised the importance of retaining sex and gender data in published research. In an article in Undark, Peter Andrey Smith highlights another example of the scientific community pushing back against federal pressure to remove EDI-related data.

Authors make a stand

Smith describes the case of anthropologist Tamar Antin and co-authors, who faced an unusual request from the federally funded journal Public Health Reports following acceptance of their paper on tobacco use. The editors requested removal of the word “equitably” and demographic data, citing compliance with executive orders. Rather than grant the request, Antin and co-authors withdrew their paper entirely and went public. This “act of defiance” was met with widespread support from the scientific community, who argued that removing demographic data doesn’t just affect one paper’s conclusions – it hampers future studies by denying other scientists the opportunity to reanalyse findings or build on existing research.

“Removing demographic data doesn’t just affect one paper’s conclusions – it hampers future studies by denying other scientists the opportunity to reanalyse findings or build on existing research.”

The bigger picture

Smith also shares examples of federally funded researchers requesting changes to accepted papers, citing the political landscape, including:

  • withdrawal
  • removal of authors from bylines
  • specific wording changes.

While this directly affects only a minority of submissions, maintaining the integrity of the scientific record is paramount.

Looking ahead, the Committee on Publication Ethics’ position statement emphasises that publishing decisions and language choices should not be influenced by politics or government policies, and that retractions must never be used to censor the scientific record.

————————————————

Have the US executive orders around EDI directly impacted your work?

Safeguarding scientific image quality and integrity: what more can be done?
https://thepublicationplan.com/2025/10/29/safeguarding-scientific-image-quality-and-integrity-what-more-can-be-done/
Wed, 29 Oct 2025

KEY TAKEAWAYS

  • Scientific image editing serves a vital role in clear communication, but seeking presentation clarity must not compromise data integrity.
  • Combatting image manipulation requires systematic collaboration across the research ecosystem, including standardised guidelines and new verification technologies.

As concerns mount over image manipulation in scientific publishing, the research community has begun developing new strategies to balance visual clarity with data integrity. Writing in Nature, Sara Reardon explores the “fine line between clarifying and manipulating”, highlighting the challenge of making figures both accessible and faithful to original data.

The art and science of visual presentation

Scientific images often require editing for clarity, like adjusting brightness, adding scale bars, or enhancing contrast. While such modifications are essential for effective scientific communication, a 2021 study by Helena Jambor and colleagues revealed that poorly presented figures remain surprisingly common, suggesting researchers need better training in visual data presentation.

When enhancement becomes manipulation

The boundary between legitimate clarification and misconduct can be perilously thin. Science integrity consultant Elisabeth Bik warns that even minor edits – such as cloning image sections to cover dust particles – can undermine data credibility. Echoing a seminal 2004 article, Bik emphasises that “the images are the data”, meaning they should present the results actually observed rather than those the researchers expected. Any undisclosed alteration that changes the scientific message could constitute misconduct. As Reardon notes, the cardinal rule remains to “show your work” – enhancing clarity without obscuring underlying data.

“The boundary between legitimate clarification and misconduct can be perilously thin… the cardinal rule remains to ‘show your work’ – enhancing clarity without obscuring underlying data.”

Detection and prevention strategies

Phill Jones examines potential systemic solutions to what Bik calls science’s “nasty Photoshop problem” in The Scholarly Kitchen. Journals increasingly conduct pre-publication screening using image-integrity specialists or AI tools that have demonstrated substantial promise in identifying manipulated images. Guidelines such as those from the International Association of Scientific, Technical & Medical Publishers aim to standardise best practice, while individual journals are also establishing specific image integrity requirements. Beyond journals:

  • Institutions are urged to provide training and embed image integrity expectations into research culture.
  • Post-publication peer-review platforms also play a role in identifying problematic images after publication.

Looking ahead, technical innovations offer promise. Jones highlights developments such as encrypted hashes and digital ‘signatures’ embedded in images, akin to secure web certificates, that could enable reliable verification of image authenticity. Ongoing collaboration and systematic change across the research ecosystem will be required to ensure scientific images are both clear and credible.
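The verification idea Jones describes can be sketched in a few lines of Python, assuming a simple HMAC-based scheme (the key handling, scheme, and function names here are illustrative assumptions, not any publisher's actual implementation): the originating lab tags the raw image bytes with a secret key, and any later edit invalidates the tag.

```python
import hashlib
import hmac

def sign_image(image_bytes: bytes, secret_key: bytes) -> str:
    """Produce a hex tag (HMAC-SHA256) over the raw image bytes."""
    return hmac.new(secret_key, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, secret_key: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_image(image_bytes, secret_key), tag)

key = b"lab-secret-key"                        # hypothetical signing key
original = b"\x89PNG...raw image bytes..."     # stand-in for real image data
tag = sign_image(original, key)

assert verify_image(original, key, tag)        # untouched image verifies
tampered = original.replace(b"raw", b"new")
assert not verify_image(tampered, key, tag)    # any edit breaks the tag
```

In a real scheme the tag would be embedded in the image metadata, with public-key signatures in place of a shared secret so that anyone could verify authenticity.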

—————————————————

Are current image integrity detection tools sufficient to prevent manipulation in scientific publishing?

Retractions and corrections are falling under the radar: should open repositories step up?
https://thepublicationplan.com/2025/08/06/retractions-and-corrections-are-falling-under-the-radar-should-open-repositories-step-up/
Wed, 06 Aug 2025

KEY TAKEAWAYS

  • Most open access repositories have evolved without sufficient means to communicate corrections or retractions.
  • Metadata, such as DOIs, could be used to link all article versions and ensure corrections/retractions are clearly indicated to readers.

Open access repositories have an important role in disseminating scientific research. But what happens when a journal corrects or retracts a publication? A recent LSE Impact Blog article describes Frédérique Bordignon’s alarming discovery about how poorly this is captured by repositories.

Open repositories’ ‘blind spot’ to corrections and retractions

As Bordignon explains, most journals display up-to-date editorial notices alongside publications, although clarity can vary. On the other hand, open repositories do not necessarily pull through correction/retraction information from their published counterparts, and guidance from the Confederation of Open Access Repositories is lacking.

To examine the topic further, Bordignon’s team conducted a manually verified analysis of the world’s second largest institutional repository, HAL, by cross-checking its records against 24,430 corrected or retracted publications extracted from the Crossref x Retraction Watch database. Shockingly, they found that 91% of corrections/retractions were not indicated in the repository. Bordignon emphasises that this situation is not unique to HAL, but reflective of repositories across the world.

“91% of corrections/retractions were not indicated in the repository…this situation is…reflective of repositories across the world.”

How to ‘fill the gap’ in effective reporting of corrections

The solution? Bordignon points out that open repositories have a powerful opportunity to ‘fill the gap’ in effective reporting of corrections. However, rather than expecting repository managers to make individual version control decisions for every publication, Bordignon suggests that open repositories:

  • create their own archives
  • clearly display the editorial status of each article
  • include a permanent, bidirectional link to the corrected published version
  • enable automated updates through partnerships with Crossref x Retraction Watch, making use of metadata such as digital object identifiers
  • incorporate platforms that detect and report retractions, such as PubMed, PubPeer, and Scite.
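The metadata-driven updates in the list above can be illustrated with a minimal sketch (the record shape below is a simplified assumption loosely modelled on how Crossref links retraction notices to original DOIs, not an actual API response; all DOIs are invented):

```python
# Hypothetical index mapping an article's DOI to any editorial update,
# of the kind a repository could build from Crossref x Retraction Watch data.
UPDATE_INDEX = {
    "10.1234/example.123": ("retraction", "10.1234/example.456"),
    "10.1234/example.789": ("correction", "10.1234/example.790"),
}

def editorial_status(doi: str) -> dict:
    """Return the status a repository could display alongside an article,
    with a bidirectional link back to the notice when one exists."""
    update = UPDATE_INDEX.get(doi)
    if update is None:
        return {"doi": doi, "status": "no known updates"}
    kind, notice_doi = update
    return {"doi": doi, "status": kind, "notice_doi": notice_doi}

assert editorial_status("10.1234/example.123")["status"] == "retraction"
assert editorial_status("10.1234/unknown")["status"] == "no known updates"
```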

Bordignon provides a stark reminder that omitting correction/retraction notices from open repositories risks users learning from, citing, or even propagating flawed science; this can ultimately “erode public trust in science”. She urges open repositories to galvanise their position in the fight for research integrity, paving the way for a more streamlined archiving system that leaves readers in no doubt as to the reliability of the information they are accessing.

—————————————————

Do you agree that open repositories need to clearly identify corrected or retracted publications?

How can we prevent retracted research from polluting the literature?
https://thepublicationplan.com/2025/04/03/how-can-we-prevent-retracted-research-from-polluting-the-literature/
Thu, 03 Apr 2025

KEY TAKEAWAYS

  • A literature-mining project featured in Nature News showed that some papers heavily rely on retracted research.
  • Technological tools, such as Guillaume Cabanac’s ‘Problematic Paper Screener’, could be part of the solution.

The results of a project featured in Nature News showed that problematic research continues to amass citations in the literature, even after retraction. Perhaps unwittingly, some authors are citing large numbers of retracted papers, which can cause questions to be raised about their own work once it is published. With retracted works accounting for as many as 65% of citations in some papers, there is a drive to harness technology to solve the problem.

You are what you cite

Worryingly, problematic papers can continue to be cited long after retraction. Although not a definitive sign of misconduct, heavily relying on research that has been withdrawn can retrospectively undermine a paper’s reliability. Unfortunately, no system exists for alerting researchers to retractions that may impact papers they have already authored.

Hoping to change this, “research integrity sleuth” Guillaume Cabanac, who is behind the project reported in Nature News, has developed tools such as his ‘Feet of Clay’ detector, which flags papers that cite retracted works (and his earlier ‘annulled detector’, which tracks the retracted papers themselves).

As well as encouraging publishers to conduct regular checks and notify authors of any retractions they have cited, Cabanac urges authors to make use of these and other tools, such as plug-ins that can automatically flag papers that have received comments on PubPeer, before submitting papers.

“You always have to double-check what you’re basing your work on.”
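A toy version of this kind of pre-submission check (not Cabanac's actual code) simply cross-references a manuscript's cited DOIs against a retraction list; all DOIs below are invented for illustration:

```python
def flag_retracted_citations(references, retracted_dois):
    """Report which cited DOIs appear in a retraction list, and the share
    of the reference list they represent."""
    hits = [doi for doi in references if doi in retracted_dois]
    share = len(hits) / len(references) if references else 0.0
    return hits, share

retracted = {"10.1000/retracted.1", "10.1000/retracted.2"}   # hypothetical list
refs = ["10.1000/ok.1", "10.1000/retracted.1",
        "10.1000/ok.2", "10.1000/retracted.2"]

hits, share = flag_retracted_citations(refs, retracted)
print(hits)    # ['10.1000/retracted.1', '10.1000/retracted.2']
print(share)   # 0.5
```

A share approaching the 65% seen in some papers would clearly warrant a second look before submission.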

Tools to clean up the literature

As Cabanac himself reports, the Feet of Clay detector is just the latest addition to his Problematic Paper Screener, an automated system for flagging papers that may warrant further scrutiny. Launched in 2021, the screener tracks the global landscape of retractions and uses multiple detectors to automatically mine the literature for signs of potential misconduct, such as:

  • ‘tortured phrases’ typically seen when AI re-writes existing scientific content
  • ‘fingerprints’ associated with random paper generators such as SCIgen or Mathgen
  • nonsensical content, such as cell lines or nucleotide sequences that have been fabricated
  • citations for journals known to have been hijacked.
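As a rough illustration of the first detector, a dictionary lookup against documented tortured phrases (eg ‘counterfeit consciousness’ for ‘artificial intelligence’) can flag suspect wording; the real screener’s methods are considerably more sophisticated than this sketch:

```python
# Documented examples of tortured phrases and their likely originals.
TORTURED_PHRASES = {
    "counterfeit consciousness": "artificial intelligence",
    "profound neural organization": "deep neural network",
    "colossal information": "big data",
}

def flag_tortured_phrases(text: str):
    """Return (tortured phrase, likely original term) pairs found in the text."""
    lowered = text.lower()
    return [(phrase, original)
            for phrase, original in TORTURED_PHRASES.items()
            if phrase in lowered]

abstract = "We train a profound neural organization on colossal information."
print(flag_tortured_phrases(abstract))
# [('profound neural organization', 'deep neural network'),
#  ('colossal information', 'big data')]
```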

Cabanac hopes that this software will facilitate continuous evaluation of published literature, with over 875,000 papers having been flagged and assessed via the system so far. He calls publishers and authors to action: “A combined preventive and curative effort from all involved is key to sustaining the reliability of the scientific literature — a crucial undertaking for science and for public trust.”

————————————————–

Who should be responsible for checking whether cited works have been retracted?

Retractions as corrections: shifting the narrative
https://thepublicationplan.com/2024/09/25/retractions-as-corrections-shifting-the-narrative/
Wed, 25 Sep 2024

KEY TAKEAWAYS

  • Retractions should be seen as neutral corrections made to preserve the integrity of academic work, rather than punitive actions.
  • Consistent communication and transparency throughout the retraction process are key to maintaining trust within academic publishing.

Retractions in academic publishing have long been viewed as a mark of shame, often associated with misconduct. However, this perception can in itself be detrimental to the integrity of the scientific record. As Tim Kersjes argues in an LSE Impact Blog, in order for research to be self-correcting it might be time to shift the narrative and start to view retractions as ‘neutral tools’.

Remove the stigma

Kersjes outlines how the stigmatisation of retractions deters authors from retracting their work, even when errors are discovered. Viewing retractions as a routine part of the scientific process could encourage more authors and editors to retract flawed work, ensuring that the published record remains reliable. While past suggestions have included systems that categorise retractions based on the reasons behind them, Kersjes cautions against this, questioning whether these approaches really remove stigma or have the unintended consequence of increasing it.

Standardise reporting

Meanwhile, The Scholarly Kitchen reported on relevant new guidance from the National Information Standards Organization (NISO). The Communication of Retractions, Removals, and Expressions of Concern (CREC) Recommended Practice emphasises consistency and transparency in the way that retractions are communicated, rather than focusing on the reason for retraction. In particular, it recommends:

  • consistent terminology, including naming protocols
  • retraction status to be clearly indicated in the title of the article
  • use of watermarks and labels on landing pages
  • clear responsibilities regarding handling of associated metadata.

The way forward

It is crucial for all stakeholders—authors, editors, and publishers—to embrace retractions as correction tools and for retractions to be communicated clearly and consistently. In doing so, we can foster a culture whereby the integrity of published research is prioritised above all else.

————————————————–

Do you believe that retractions should be treated as neutral corrections in academic publishing?

The paper mill problem: are AI tools the answer?
https://thepublicationplan.com/2024/08/01/the-paper-mill-problem-are-ai-tools-the-answer/
Thu, 01 Aug 2024

KEY TAKEAWAYS

  • In a test run, a new AI-based system developed by scientific publisher Wiley flagged 10–13% of submitted manuscripts as potential fakes.
  • Generative AI tools could help combat the threat posed by paper mills to research integrity.

An AI-based service designed to detect bogus scientific articles flagged 10–13% of submitted manuscripts in a pilot run, according to a blog post by Ivan Oransky for Retraction Watch. The fake papers were caught by publisher Wiley’s Papermill Detection service, which screens submissions ahead of editorial review. The multi-tool system is a promising development in ongoing efforts to ensure the integrity of published research.

Spotting fake articles

Paper mills are paid to produce fake research papers, which can appear very similar to genuine manuscripts. According to Wiley, its new system uses 6 different approaches to identify what it calls “potentially compromised research content”:

  • checking for similarity with existing paper mill papers
  • flagging the use of “tortured phrases”
  • identifying authors with unusual publication behaviour
  • verifying the identity of researchers
  • detecting potential misuse of generative AI
  • checking that manuscripts fall within a journal’s scope.
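The first of these checks, similarity with known paper mill output, can be caricatured with a word-overlap (Jaccard) comparison; Wiley has not disclosed its method, so the approach, template text, and threshold here are purely illustrative:

```python
def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two texts."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

# Invented template standing in for known paper mill output.
KNOWN_MILL_TEXTS = [
    "the role of circrna in tumor progression and its clinical significance",
]

def similarity_flag(submission: str, threshold: float = 0.6) -> bool:
    """Flag a submission whose wording closely matches a known template."""
    return any(jaccard(submission, t) >= threshold for t in KNOWN_MILL_TEXTS)

print(similarity_flag("the role of circrna in tumor progression "
                      "and clinical significance"))                   # True
print(similarity_flag("randomised trial of a novel asthma inhaler"))  # False
```

Production systems use far richer features (as the list above suggests), but the design idea is the same: score each submission against a corpus of known-bad content and route high scorers to an editor.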

The test run involved over 270 Wiley journals, which rejected between 600 and 1,000 submitted manuscripts per month once they started using the tool. A spokesperson for Wiley told Retraction Watch that flagged papers would not automatically be rejected, but would be considered by an editor before being processed further. The publisher says it is partnering with Sage and IEEE for its next testing phase, and aims to roll out the service as early as next year.

“The test run involved over 270 Wiley journals, which rejected between 600 and 1,000 submitted manuscripts per month once they started using the tool.”

Paper mill problems

Paper mills are a major source of articles that end up being retracted after publication. Most manuscripts retracted in 2023 were published by Hindawi, a subsidiary of Wiley, with a high proportion involving Chinese authors. This led to a government-initiated review that required all university researchers in China to declare their retracted papers.

Last year, Wiley closed 4 Hindawi journals due to paper mill issues and announced that it will stop using the Hindawi brand. Wiley has since discontinued another 19 journals overseen by Hindawi, which it said was due to portfolio integration.

Possible solutions on the horizon

Investigations into retractions should help ensure the integrity of published research, but there is growing interest in using new tools such as Papermill Alarm to help stop fake papers getting published in the first place. Wiley say their new service will complement the STM Integrity Hub, a resource developed by academic publishers that incorporates Papermill Alarm and other tools to help combat fake science.

While much discussion around developments in AI has focused on possible threats to research integrity, spotting bogus manuscripts may be one area where AI helps restore trust in published science.

————————————————–

Will AI tools that spot fake manuscripts drive paper mills out of business?

Why are retraction rates rising?
https://thepublicationplan.com/2024/07/24/why-are-retraction-rates-rising/
Wed, 24 Jul 2024

KEY TAKEAWAYS

  • The retraction rate for biomedical science papers with corresponding authors based at European institutions quadrupled between 2000 and 2020.
  • Unreliable data has emerged as a leading reason for retraction, while duplication remains a key factor.

Research misconduct remains a major concern, with increasing efforts dedicated to monitoring retraction rates – and the underlying reasons. An analysis recently published in Scientometrics and discussed in Nature news uncovered a quadrupling of retraction rates since 2000 among biomedical science articles with corresponding authors based at European institutions, from about 11 per 100,000 articles to almost 45 per 100,000 in 2020.

Why are articles retracted?

Fabián Freijedo-Farinas and colleagues reviewed over 2,000 retracted English-, Spanish-, and Portuguese-language articles collated by Retraction Watch to identify underlying reasons. Research misconduct was the most prevalent factor, accounting for 67% of cases, while 16% of retractions were due to honest errors (with no reason provided for the remainder).

Reasons have shifted over time, with authorship and affiliation issues falling from one of the top reasons to joint 5th of 7. Duplication has remained steady as a cause, while retractions due to unreliable data – including bias and lack of original data availability – have skyrocketed. The authors suggest paper mills have a major role to play.

However, it’s not the same story across Europe: of the 4 countries with the most retractions, the proportion of duplication-related retractions has fallen in the UK but substantially increased in Italy and Spain.

Why are retraction rates increasing?

Arturo Casadevall, who identified similar rates of research misconduct-related retractions in a 2012 analysis, commented that the overall hike in retraction rates could be due to authors, institutions, and journals increasingly viewing retraction as the best route to correct the scientific record.

“The overall hike in retraction rates could be due to authors, institutions, and journals increasingly viewing retraction as the best route to correct the scientific record.”

In addition, publications have increasingly drawn the attention of online sleuths, who may raise concerns with journals, according to research integrity specialist Sholto David. New digital technologies are also making it easier to screen publications for suspicious text or data. Retraction Watch co-founder Ivan Oransky believes use of plagiarism-detection software could be partially responsible for the increase; looking to the future, tools like image manipulation detectors could mean retraction rates rise further.


————————————————–

How much do you think increasing use of image manipulation detectors will impact retraction rates?

Reviewing retractions and research misconduct: a national solution?
https://thepublicationplan.com/2024/04/30/reviewing-retractions-and-research-misconduct-a-national-solution/
Tue, 30 Apr 2024

KEY TAKEAWAYS

  • A recent government-initiated national review required all university researchers in China to declare research retractions.
  • Results are awaited, but outputs of national monitoring schemes of this nature could help to reduce research misconduct.

Numerous papers are retracted from academic journals each year, owing to honest mistakes or research misconduct. In 2023, most retractions were from Hindawi, a subsidiary of the publisher Wiley. A recent analysis performed by Nature revealed a high proportion of those retracted articles involved Chinese co-authors. In response, the Chinese government issued a national notice to universities to investigate retracted research papers and misconduct. Now, a recent Nature News article by Smriti Mallapaty summarises the key details of the review and discusses the wider impact it could have on academia.

Notice calling for disclosure of retractions

The notice, issued by the Ministry of Education’s Department of Science, Technology and Informatization, called for:

  • a record of listed and unlisted retractions from English- and Chinese-language journals from the past 3 years
  • reasons for retractions, such as misconduct (eg, image manipulation), or an honest mistake
  • penalties for misconduct or failure to declare retracted articles (eg, salary cuts, bonus withdrawals, demotions or suspensions from grant applications).

As reported by Mallapaty, this is considered to be the first national review on this scale, with a clearer target and broader scope than earlier efforts.

Short timeframe to complete review

Mallapaty also flagged that universities were required to complete their reviews within a strict timeframe, and that views on this approach were somewhat mixed. While some felt that the tight deadline might have ensured that universities worked hard to complete their reviews on time, others suggested that universities may only have submitted preliminary reports.

Impact of national review

Although the next actions from the Ministry are unclear, it is suggested that publicising the reasons for retractions could be useful alongside existing online retraction notices. A yearly review could also ensure universities monitor research integrity.

Science- and innovation-policy researcher Li Tang says “cultivating research integrity takes time, but China is on the right track”.

“Cultivating research integrity takes time, but China is on the right track”.

With reports submitted in mid-February, it will be interesting to see the ultimate impact of this national review and whether other countries undertake similar initiatives to investigate research retraction and misconduct.

————————————————–

What do you think – could national reviews that monitor research retractions and misconduct help to prevent such cases occurring?

Image manipulation: how AI tools are helping journals fight back
https://thepublicationplan.com/2024/04/09/image-manipulation-how-ai-tools-are-helping-journals-fight-back/
Tue, 09 Apr 2024

KEY TAKEAWAYS

  • Image manipulation is a prevalent issue in academic publishing and a potential sign of research misconduct.
  • Many journals are now using AI tools to identify problematic images prior to publication; however, these will need to evolve as image manipulation becomes increasingly sophisticated.

Image manipulation in research articles is a growing concern. In a recent article for Nature News, Nicola Jones outlines how academic journals are embracing the use of artificial intelligence (AI) tools to identify manipulated images pre-publication.

How prevalent is image manipulation?

While often unintentional, image manipulation is prevalent and a potential sign of research misconduct. As reported by Jones, a 2016 study by science integrity consultant Dr Elisabeth Bik and colleagues found that nearly 4% of published biomedical science papers contained problematic figures. Similarly, around 4% of the 51,000 documented retractions in the Retraction Watch database flag a concern relating to published images. A more recent study by Dr Sholto David, which used AI to help identify suspect images, puts this figure at up to 16%.
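One common class of problematic figure, duplicated regions within an image, can be illustrated with a toy exact-match check over pixel tiles (real screening tools of the kind David used must also cope with rotation, rescaling, and compression noise, which this sketch ignores):

```python
def find_duplicate_tiles(image, tile=2):
    """Hash non-overlapping tile x tile blocks of a 2D pixel grid and report
    coordinate pairs of blocks with identical pixel values."""
    seen, duplicates = {}, []
    rows, cols = len(image), len(image[0])
    for r in range(0, rows - tile + 1, tile):
        for c in range(0, cols - tile + 1, tile):
            block = tuple(tuple(image[r + i][c:c + tile]) for i in range(tile))
            if block in seen:
                duplicates.append((seen[block], (r, c)))
            else:
                seen[block] = (r, c)
    return duplicates

# A 4x6 'image' in which the top-left 2x2 block reappears at column 4.
img = [
    [1, 2, 9, 9, 1, 2],
    [3, 4, 9, 9, 3, 4],
    [5, 6, 7, 8, 0, 1],
    [6, 5, 8, 7, 1, 0],
]
print(find_duplicate_tiles(img))   # [((0, 0), (0, 4))]
```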

What action is being taken by journals?

Jones highlights that a number of journals are taking steps to identify problematic images prior to publication. Some, including Journal of Cell Science, PLOS Biology, and PLOS One, either ask for or require the submission of raw images used in figures. In addition, many journals now use AI tools such as ImageTwin, ImaChek, and Proofig to screen images for signs of manipulation prior to publication. In January 2024, the Science family of journals revealed it will be using Proofig across all submissions, while other publishers are developing their own AI image integrity software.

Will AI put an end to this issue?

Jones reports that while AI tools make it faster and easier to detect problematic images, experts warn that they have limited capabilities to detect more complex manipulations, such as those made using AI. Bernd Pulverer, chief editor of EMBO Reports, cautions that as image manipulation becomes increasingly sophisticated it will become ever harder to detect, with existing screening tools soon becoming largely obsolete.

“While AI tools make it faster and easier to detect problematic images, experts warn that they have limited capabilities to detect more complex manipulations, such as those made using AI.”

To stamp out image manipulation in the long run, we need to change how science is done, Dr Bik proposes. She calls for a greater focus on rigour and reproducibility and a crackdown on bullying and high pressure environments in research labs, which she believes create a culture where cheating is acceptable. We look forward to seeing how the development of increasingly advanced AI tools will help in the continuing fight against research misconduct.

————————————————

What do you think – are AI screening tools the answer to stopping image manipulation?

ISMPP poll: falling prey to a predatory journal – what would you do?
https://thepublicationplan.com/2024/02/22/ismpp-poll-falling-prey-to-a-predatory-journal-what-would-you-do/
Thu, 22 Feb 2024

KEY TAKEAWAYS

  • In a recent ISMPP poll, medical publication professionals were asked “What would you do?” when presented with a challenging hypothetical scenario.
  • In the scenario, a manuscript had been inadvertently submitted to a predatory journal. The majority of respondents opted to seek legal advice, attempt to retract the manuscript (even if this was made difficult by the predatory journal), and submit the article elsewhere.

Despite widespread recognition that predatory journals are a threat to research credibility, there is no consensus on the best course of action if study sponsors or authors fall prey to them. A recent poll from the International Society for Medical Publication Professionals (ISMPP) asked publication professionals how they would deal with this sticky situation. Dr Eric Y Wong (Janssen) discussed the poll’s findings in the MAP newsletter, providing additional insight and recommendations.

The poll asked: You are a medical publication professional and have been supporting a client with a secondary manuscript for a Phase 3 study. You recommended target journals and worked with the author team throughout the submission process. After publication, it becomes apparent that the manuscript was submitted to a predatory journal. Unfortunately, the journal has no retraction policy and asks for a large sum of money in processing fees to retract the article. The authors have signed a copyright agreement giving the journal full copyright of the manuscript.

What would you do?

The results of the poll, which was answered by 72 respondents, were:

  • Request the editorial office to retract the manuscript and seek legal advice from the sponsor company; at the same time plan a resubmission to another journal: 56.9%
  • Suggest that the authors add some new and substantive data to support submission to a new, reputable journal as a secondary publication: 20.8%
  • Work with co-authors to write a response to the journal highlighting its policies and exposing it as a predatory journal, and showcase this letter via the authors’ social media channels: 16.7%
  • Recommend against retraction as this can negatively impact reputation, and review the copyright agreement to determine if you are able to submit elsewhere, such as to a preprint server: 5.6%

Dr Wong agreed that the option selected by most respondents was the most reasonable course of action in this difficult situation. Seeking legal advice is vital, particularly as copyright ownership is in question. While paying a retraction fee may be the quickest route to an initial resolution, Dr Wong warned of potential challenges in recouping this fee at a later stage, as contact and personnel details for predatory publishers are “often not available or fictitious”. He remarked that the other poll options would not resolve the primary concern for the authors, ie, that doubt may be cast on study credibility due to association with a disreputable journal.

Prevention is better than cure; hence, despite the challenges associated with identifying predatory journals, Dr Wong recommends that medical publication professionals maintain a comprehensive list of legitimate journals. Moreover, Good Publication Practice guidance states:

“If the credibility of a journal or conference cannot be reasonably ascertained, it should be avoided.”

Dr Wong therefore calls on publication professionals to remain vigilant and to carefully assess new journals.

————————————————

Have you ever worked on a publication that was unwittingly submitted to a predatory journal?
