Research integrity – The Publication Plan
A central online news resource for professionals involved in the development of medical publications, publication planning, and medical writing.

Over 100 institutions back eLife’s reviewed preprint model
https://thepublicationplan.com/2025/11/26/over-100-institutions-back-elifes-reviewed-preprint-model/
Wed, 26 Nov 2025

KEY TAKEAWAY

  • More than 100 institutions have declared their support for eLife’s reviewed preprint model, following the journal’s loss of its impact factor.

Rather than only accepting papers recommended for publication by peer reviewers, eLife publishes all reviewed research as reviewed preprints. However, Clarivate, the provider of Web of Science, indexes only peer-reviewed content, and eLife therefore lost its impact factor for 2025. Instead of changing its publishing model, eLife agreed to be partially indexed in Web of Science’s Emerging Sources Citation Index (ESCI). But how has this been received?

As reported in Research Information, eLife surveyed over 100 institutions and funders to assess how its publishing model is viewed. Over 95% of respondents endorsed non-traditional publishing approaches like eLife’s, confirming that publications using such models will continue to be factored into hiring, promotion, and funding decisions.

Promoting integrity or outdated metrics?

Dr Nandita Quaderi, Senior Vice President and Editor-in-Chief of the Web of Science at Clarivate, stressed that policies must be applied universally to protect research integrity. Quaderi warned that “cover-to-cover indexing of journals in which publication is decoupled from validation by peer review risks allowing untrustworthy actors to benefit from publishing poor quality content”.

On the other hand, Ashley Farley, Senior Officer of Knowledge & Research Services at the Gates Foundation, believes Web of Science’s policy “reinforces outdated publishing metrics that hinder innovation”, while Damian Pattinson, Executive Director at eLife, noted that with increasing emphasis on open science, “eLife remains confident that its model represents the future of scholarly publishing – one that prioritises scientific quality, transparency, and integrity over outdated prestige metrics”.

“eLife remains confident that its model represents the future of scholarly publishing – one that prioritises scientific quality, transparency, and integrity over outdated prestige metrics.”
– Damian Pattinson, eLife

As debates over the future of the impact factor continue, Farley believes that “indexers must evolve to support responsible, transparent models like eLife’s”.

—————————————————

Are journal impact factors important when deciding where to publish research?

Will generative AI transform peer review?
https://thepublicationplan.com/2024/12/10/will-generative-ai-transform-peer-review/
Tue, 10 Dec 2024

KEY TAKEAWAYS

  • Pilot tests at Springer Nature suggest generative AI tools could soon transform the review process for authors, editors, and reviewers.
  • The publisher emphasises that AI will complement—not replace—human peer reviewers.

Delivering high-quality, timely peer review is challenging in the digital era. Could generative AI (GenAI) provide solutions? In an opinion article for Research Information, Springer Nature’s Director of Content Innovation, Markus Kaindl, suggests that GenAI tools could soon transform the review process for authors, editors, and referees.

Modern peer review: a Gordian knot?

The modern peer review system is increasingly voluminous and complex. According to Kaindl, authors struggle to explain discipline-specific data to reviewers and grow frustrated with multiple requests for clarifications and revisions. Editors struggle to find appropriately qualified reviewers amidst ever-increasing submission volumes and tight deadlines. Due to time pressures and the complexities of highly specialised or interdisciplinary submissions, reviewers struggle to provide feedback that is detailed, constructive, objective, unbiased, and appropriate to the speciality.

AI-assisted…

Following pilot tests at Springer Nature, Kaindl reports that GenAI could soon:

  • provide “actionable feedback” to authors early in the submission process to help reduce the number of time-consuming revision rounds
  • accelerate the editorial process by drafting pre-review notes, highlighting key strengths and limitations in advance, and optimising journal selection
  • generate pre-review drafts for reviewers and facilitate cross-disciplinary reviewing by simplifying complex, discipline-specific submissions.

Furthermore, GenAI tools could be used to help safeguard research integrity during the review process. Springer Nature has developed two such tools: Geppetto detects AI-generated content (a potential indicator of paper mill activity), whereas SnappShot analyses PDF files with gel and blot images for evidence of duplication.

…but human-centred!

Kaindl stresses that GenAI should augment human expertise, not replace it, and says that AI tools should complement and enhance existing screening systems. According to Kaindl, Springer Nature’s approach to AI promotes active community engagement (with authors, editors, and reviewers), tackles AI bias through rigorous testing, and creates a safe environment for pilot testing to prevent the disclosure of unpublished manuscripts.

GenAI should augment human expertise, not replace it.

Kaindl states that GenAI may already be in use to assist peer review. By embracing GenAI technology, publishers could help define practices for its effective use, understand its limitations, and future-proof academic publishing.

—————————————————

Have you used generative AI tools as part of the peer review process?

Retracted papers: are we doing enough to put discredited research to rest?
https://thepublicationplan.com/2023/01/12/retracted-papers-are-we-doing-enough-to-put-discredited-research-to-rest/
Thu, 12 Jan 2023

KEY TAKEAWAYS

  • Retracted journal articles can still be cited years later, disseminating misinformation and potentially affecting patient care.
  • Readers may find it difficult to identify articles that reference retracted work.

Although publications reporting flawed or fraudulent research can be retracted, many such papers linger in the literature thanks to post-retraction citations. In a Science article, Jeffrey Brainard describes how some retracted works are still being cited years later, with potential adverse consequences for patient care.

Avenell et al examined the impact of 27 clinical trial reports retracted between 2015 and 2019 on the publications that cited them: 70 systematic reviews and 18 clinical guidelines published between 2003 and 2020 included at least 1 of the retracted trials in their analyses, and many did not warn readers that they cited discredited research. In 44% of cases, the authors assessed that removing the retracted trial(s) would likely weaken the final conclusions substantially.

Systematic reviews and clinical guidelines are often used to guide medical treatments, and Mr Brainard warns that inclusion of retracted work can mislead clinicians and put patients at risk of harm.

Avenell’s team emailed the authors of the affected publications (with or without alerting the journal) to ask whether they believed any action needed to be taken in light of the retractions. Just over half of the authors replied, with 80% stating that they were unsure how to proceed or did not intend to amend their papers. The cited reasons included:

  • publication was too old
  • lack of time to re-analyse the data
  • removal of one retracted study would not affect the overall findings.

One year after the emails were sent, 1 of the papers had been retracted and warnings had been posted for a further 8; however, only 4 of these announcements were directly linked to the citing paper.

“Even if a retracted citation doesn’t change the bottom line, journals and authors have an obligation to say so publicly.”

The article ends on a positive note, highlighting the actions already taken to minimise the impact of retracted publications:

  • Reference managers such as EndNote and Zotero flag papers included in the Retraction Watch database.
  • The International Committee of Medical Journal Editors recommends that journal editors routinely check submitted manuscripts for post-retraction citations.
  • Cochrane now adds a warning to systematic reviews citing retracted papers and asks authors of flagged reviews to revise their work.

We look forward to seeing how other journals and publishers might address this issue.

—————————————————–

In your opinion, who is most responsible for ensuring that findings from retracted clinical trials are not perpetuated?

Research integrity: putting principles into practice
https://thepublicationplan.com/2021/03/02/research-integrity-putting-principles-into-practice/
Tue, 02 Mar 2021

Misconduct in medical research has the potential to mislead the scientific community and, in the worst cases, can have major repercussions for patients. Such misconduct includes fabrication, falsification, plagiarism, and the emerging trend of ‘post-production misconduct’. In addition to these examples of scientific fraud, a lack of transparency, reproducibility, and replicability in medical publications may also undermine research integrity.

While there have been several key declarations on the principles of research integrity (such as the European Code of Conduct for Research Integrity), high-profile cases of misconduct still occur. The reasons behind misconduct in medical research have been well documented. As outlined in an editorial by Prof Lee Harvey, it is a long-term problem associated with the immense pressure on researchers to publish articles that attract funding, which has led to the so-called ‘publish or perish’ mentality. This research environment has been compounded by the traditional citation-based metrics long adopted by the scientific community.

In order to combat misconduct, attention is now turning towards how organisations can translate the principles of research integrity into practice. As highlighted in an editorial by Prof Niels Mejlgaard and colleagues, the EU’s next research funding programme will confirm a strong commitment to research integrity. The authors note:

“It is expected that institutions receiving funding from the €81-billion (US$96-billion) programme will be required to have clear plans and procedures in place for research integrity.”

To evaluate which topics organisations should address in their plans to promote research integrity, Mejlgaard et al conducted a study called Standard Operating Procedures for Research Integrity (SOPs4RI). They identified nine key areas that should be considered:

  • Research environment: ensure fair assessment procedures and prevent hypercompetition and excessive publication pressure.
  • Supervision and mentoring: create clear guidelines and set up training and mentoring for PhD supervisors.
  • Integrity training: establish training and counselling for researchers.
  • Ethics structures: establish review procedures that accommodate different types of research.
  • Integrity breaches: formalise procedures that protect whistle-blowers and those accused of misconduct.
  • Data practices and management: provide training, incentives and infrastructure to curate and share data according to FAIR principles.
  • Research collaboration: establish rules for transparent working with industry and international partners.
  • Declaration of interests: state conflicts in research, review and other professional activities.
  • Publication and communication: respect authorship guidelines and ensure openness and clarity in public engagement.

Research integrity recommendations, together with procedures and other resources, are accessible through the SOPs4RI website. These will be refined over the next few years; the authors urge readers to provide views, concerns, and examples of best practice to help tailor these resources. While the vast majority of research is undoubtedly honest, tools and resources such as those highlighted by SOPs4RI may be needed to help organisations implement integrity principles and improve research.


——————————————————–

Summary by Josh Lilly PhD from Aspire Scientific

——————————————————–

With thanks to our sponsor, Aspire Scientific Ltd

