Reporting guidelines – The Publication Plan
A central online news resource for everyone interested in medical writing, the development of medical publications, and publication planning.
https://thepublicationplan.com

Seeing the full picture: the RIVA-C checklist for research infographics
Published 26 June 2025 – https://thepublicationplan.com/2025/06/26/seeing-the-full-picture-the-riva-c-checklist-for-research-infographics/

KEY TAKEAWAY

  • The RIVA-C checklist helps authors create clear, accurate, and standardised infographics and avoid misinterpretation of the results of comparative studies.

In the evolving landscape of scientific communication, visual tools such as infographics and visual abstracts are increasingly used to present research findings. While they offer quick and accessible summaries, concerns have emerged about their accuracy, clarity, and completeness – especially when used to convey complex comparative studies. To address these challenges, Joshua R. Zadro and colleagues developed the Reporting Infographics and Visual Abstracts of Comparative studies (RIVA-C) checklist and guide, a tool designed to improve the quality and reliability of infographics summarising comparative studies of health and medical interventions.

Why was RIVA-C developed?

Studies have shown that infographics can reduce full-text views, as readers turn to the infographic for a quick summary rather than reading the full article. However, infographics do not always include all the details needed to fully understand a study, increasing the risk of misinterpretation. The authors argue that previous infographic guidelines were either not rigorously developed or focused mainly on formatting and design.

Previous infographic guidelines were either not rigorously developed or focused mainly on formatting and design rather than content quality.

How was RIVA-C developed?

The checklist was developed through a structured consensus process involving 92 participants from a range of professional backgrounds. This process was led by an international Steering Group to ensure diversity of input and methodological robustness.

The RIVA-C checklist

The full checklist comprises 10 items across 3 categories: (1) study characteristics, (2) results, and (3) conclusions/takeaway message – each accompanied by detailed explanations and examples to aid practical implementation. The checklist was piloted over a 6-month period to evaluate its clarity, relevance, and usability.

The future of RIVA-C

RIVA-C aims to enhance the transparency and completeness of infographic reporting, reducing the risk of misinterpretation – especially in the context of influential studies like randomised controlled trials and systematic reviews.

The authors recommend that journals endorse RIVA-C, similar to other checklists listed on the EQUATOR Network, by including a link and relevant information on their “instructions for authors” page. They also stress that evaluating the implementation of RIVA-C will be essential to inform future modifications to the checklist, ultimately increasing its impact.

RIVA-C may provide a path to improving the clarity and integrity of comparative study infographics. The Steering Group also hopes RIVA-C will lead to the creation of similar checklists in other areas of healthcare research.

————————————————–

Do you think the RIVA-C checklist will improve the quality of infographics?

Should data sharing be the next checklist item in reporting guidelines?
Published 11 March 2025 – https://thepublicationplan.com/2025/03/11/should-data-sharing-be-the-next-checklist-item-in-reporting-guidelines/

KEY TAKEAWAY

  • Data sharing is an essential component of open science. The EQUATOR executive are calling for its inclusion in reporting guidelines.

Mandatory data sharing has been gaining pace in recent years, with data underlying US federally funded research soon needing to be made available immediately on publication. Since its inception in 2006, the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network, best known for hosting an online library of research reporting guidelines, has been an important advocate for improving the quality and transparency of medical research reporting. Now, in a recent article for The BMJ, the EQUATOR executive group explain how data sharing could be made standard practice as the next goal for open science.

Are current reporting guidelines sufficient?

The authors describe data sharing as a broad concept, spanning study registration, protocol availability, and data availability. While many stakeholders have emphasised trial registration and transparency, the requirement to discuss data sharing is missing from most reporting guidelines. A notable exception is a 2020 addition to the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) checklist.

“While many stakeholders have emphasised trial registration and transparency, the requirement to discuss data sharing is missing from most reporting guidelines.”

The EQUATOR executive encourage developers of new or updated reporting guidelines to cover data sharing, to ensure authors:

  • describe data sharing in protocol and results publications
  • are encouraged to share data
  • report on items linked to data sharing, like sharing of code and protocols.

What can authors do now?

While awaiting formal guidance on reporting data sharing, the EQUATOR executive suggest authors include the following in publications:

  • data definition, collection, and management methods
  • manuals or videos used in delivering the intervention
  • the statistical analysis plan, including any coding
  • any barriers to sharing data and other materials from the study.

The EQUATOR executive also encourage authors to adopt the findable, accessible, interoperable, and reusable (FAIR) and collective benefit, authority to control, responsibility, and ethics (CARE) principles when sharing data.

What’s next?

Beyond reporting guidelines, the authors signpost journal policies, funder expectations, and research assessment criteria as avenues to drive increased data sharing. They also point to data management and sharing plans, and suggest an opportunity for the EQUATOR Network to provide guidance on reporting these in a standardised way to further boost data sharing.

————————————————–

Do you routinely include data sharing statements in your publications?

Redefining research ethics for a fairer future
Published 19 February 2025 – https://thepublicationplan.com/2025/02/19/redefining-research-ethics-for-a-fairer-future/

KEY TAKEAWAYS

  • In late 2024, the Declaration of Helsinki underwent its most radical update in 60 years, including a revision to protect healthy volunteers.
  • However, critics suggest there is still a way to go and that other aspects of research ethics need to be incorporated, such as how to ensure the benefits of clinical research are felt by trial participants and their communities. 

The World Medical Association (WMA) recently updated a key ethical framework, the Declaration of Helsinki, at a scale not seen since the Declaration’s inception in 1964. As reported by Cathleen O’Grady in Science, the WMA hope that the changes will help to drive new standards in research equity.

“Humans”, not “subjects”, and the importance of healthy volunteers

As outlined by O’Grady, the 2024 revisions, which mark the tenth time the document has been updated, struck a new tone, with the Declaration’s title now referring to “human participants” rather than “human subjects”. The revisions, published in JAMA with an accompanying editorial, also include the first ever mention of healthy volunteers, rather than considering only patient participants in research.

The 2024 revisions, which mark the tenth time the document has been updated, struck a new tone, with the Declaration’s title now referring to “human participants” rather than “human subjects”.

Expanded scope

These important steps forward are not the only signs of the Declaration’s expanded scope and ambition. Other changes include:

  • a direction that all those involved in medical research should adopt the Declaration’s principles, not just doctors.
  • a focus on ensuring vulnerable groups are included in medical research. Previous guidance aimed at protecting groups such as pregnant people inadvertently led to their exclusion from clinical trials. The revised Declaration notes that this can exacerbate disparities and that the harms of exclusion and inclusion should both be considered.

Radical, but complete?

The WMA General Assembly unanimously supported the 2024 update, which Dr Ashok Philip, President of the WMA, described as a “landmark revision”. However, as reported by O’Grady, some feel the revisions should have gone even further and that there are still key omissions, namely:

  • Benefits for participants and the wider community: the update does not look at ways to ensure that trial participants and their communities benefit from research.
  • Other types of research: the Declaration’s focus remains medical research, with epidemiological and behavioural studies not yet covered.
  • Data protection: the use of data from insurance or pharmaceutical company databases in research, and related issues of informed consent, are not discussed.

Nevertheless, the Declaration of Helsinki remains a cornerstone of ethical conduct in medical research, and the latest revisions provide an important focus on the dignity of research participants. Chair of the revision workgroup, Dr Jack Resneck Jr, calls on all involved in medical research to uphold these renewed principles.

————————————————–

What do you think is the most important topic to be included in future updates to the Declaration of Helsinki?

AI in SLRs: a tool, not a replacement
Published 29 January 2025 – https://thepublicationplan.com/2025/01/29/ai-in-slrs-a-tool-not-a-replacement/

KEY TAKEAWAYS

  • AI can enhance efficiency at every stage of SLR development, facilitating projects of scale that may previously have been unfeasible.
  • Use of AI in SLRs requires human oversight to ensure quality, transparency, reproducibility, and accuracy, with authors remaining accountable for their work.

As the demand for up-to-date systematic literature reviews (SLRs) grows, artificial intelligence (AI) is an increasingly appealing tool given its efficiency and ability to manage a vast evidence base. In their article for the International Society for Medical Publication Professionals (ISMPP), Polly Field, Thomas Rees, and Richard White highlight the benefits of AI in SLRs and key considerations for its use.

Benefits and pitfalls of AI

AI tools can streamline SLRs by analysing large datasets, summarising and grouping data, identifying patterns, and visualising findings – all in a fraction of the time it would take a team of researchers. However, careful attention must be given to how AI tools handle sensitive input data, including confidential content, copyrighted material, and personal information. Human validation remains essential to address potential inaccuracies, ‘hallucinations’, omissions, and bias produced by AI.

When and how should AI be used?

Whether and how to use AI in SLRs depends on the context. AI can help to:

  • frame research questions
  • optimise search strategies
  • screen studies
  • extract data
  • assess the quality of evidence, and
  • synthesise findings.

Different AI tools suit different stages, but the authors stress that all use of AI must adhere to strict principles of transparency, reproducibility, quality, and accuracy.

Medical publication professionals should familiarise themselves with existing guidance from the International Committee of Medical Journal Editors (ICMJE) and individual journal policies, as well as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 guidelines on disclosure of AI use. These policies expound the following principles:

  • All authors remain fully accountable for the quality and accuracy of their work, including when AI is involved.
  • Transparency is critical – both the methods and acknowledgements sections must clearly document how and where AI was applied.

The authors emphasise that human oversight is essential, ensuring AI supports rather than replaces expert judgement.

“Human oversight is essential, ensuring AI supports rather than replaces expert judgement.”

As AI embeds deeper into SLRs, the authors encourage medical publication professionals to explore the potential use of AI in their research, while adopting key principles to ensure robust, transparent, and high-quality reviews.

————————————————–

Have you used generative AI tools in your work?

Are conflicts of interest reported transparently in healthcare guidelines?
Published 7 November 2024 – https://thepublicationplan.com/2024/11/07/are-conflicts-of-interest-reported-transparently-in-healthcare-guidelines/

KEY TAKEAWAYS

  • RIGHT-COI&F guides transparent reporting of COIs and funding in healthcare guidelines and policy documents of guideline organisations.
  • The checklist can also be used to assess the quality and completeness of reporting in published guidelines.

Healthcare guidelines substantially influence clinical practice and policy and are developed through extensive analysis and decision-making. Amid broader issues with accurate disclosure in medical publishing, a recent Annals of Internal Medicine article by Yangqin Xun and colleagues highlighted that while guidelines are especially sensitive to conflicts of interest (COIs) and funder influence, disclosure is generally poor.

Clear and complete reporting of COIs and funding is crucial for credibility and is monitored as a key open science indicator. Yet existing checklists, such as Reporting Items for practice Guidelines in HealThcare (RIGHT), often lack detail on how to report COIs and funding. Xun et al. aimed to address this, building on RIGHT to develop a COI- and funding-specific extension. RIGHT-COI&F can be used both while developing healthcare guidelines and to assess completeness of COI and funding reporting.

RIGHT-COI&F can be used both while developing healthcare guidelines and to assess completeness of COI and funding reporting.

Checklist development

RIGHT-COI&F development followed the recommendations of the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network, based on a published protocol. Key steps were:

  • establishing working groups, including an expert panel
  • generating an initial checklist based on existing materials and a stakeholder survey
  • agreeing checklist items through surveying experts and consensus meetings
  • refining and testing the checklist.

RIGHT-COI&F items: policy and implementation

RIGHT-COI&F has 27 items, 18 focused on COIs and 9 on funding. Most items are related to policy and include:

  • defining the types of interest to be disclosed (eg, based on relevance, financial amount, or time period) and by whom
  • how accuracy and completeness are verified
  • processes for determining whether interests are conflicts
  • strategies to manage COIs
  • whether accepting funding from certain sources is restricted.

Organisational policies may fulfil these items, removing the need for detailed descriptions in individual guidelines.

The remaining items relate to implementation in individual projects, such as ensuring that declared interests are reported in detail, alongside the funding received (and the role of funders).

Next steps

To promote adoption, the authors plan to translate RIGHT-COI&F into multiple languages, disseminate it through academic networks, and seek endorsement by medical journals. Further assessment of real-life feasibility and impact is planned. We look forward to seeing how RIGHT-COI&F helps uphold transparency and trust in the healthcare space.

————————————————–

What do you think – will the RIGHT-COI&F checklist improve the transparency and credibility of guidelines?

EQUATOR and COS join forces to bring open science to the fore
Published 17 September 2024 – https://thepublicationplan.com/2024/09/17/equator-and-cos-join-forces-to-bring-open-science-to-the-fore/

KEY TAKEAWAYS

  • A partnership between the EQUATOR Network and the Center for Open Science (COS) could further the objectives of both organisations and raise awareness of best practices for open science.
  • Anticipated activities include educational outreach for researchers and updated reporting guidelines.

The open science movement aims to improve the transparency, accessibility, and reproducibility of scientific research. In May this year, the EQUATOR Network and Center for Open Science (COS) announced a 3-year collaboration in the hopes of accelerating the uptake of open science practices in health research through a series of shared activities.

A shared mission

Since launching the Open Science Framework in 2012 – a project management tool designed to streamline collaboration on, and dissemination of, scientific research – COS have been on a mission to facilitate and incentivise open research practices. This approach is highly complementary to EQUATOR’s objective to improve research quality and transparency, leading the organisations to collaborate on development of the Transparency and Openness Promotion Guidelines in 2015.

Nearly a decade later, the two are joining forces officially.

What can we expect?

Planning is ongoing, but several potential strategies are being explored:

  • Educating researchers on processes such as writing and protocol creation, through a combination of outreach materials and toolkits
  • Developing toolkits to guide reviewers in assessing data sharing practices and protocol deviation
  • Increasing the visibility and use of existing tools, such as COS registration templates and EQUATOR reporting guidelines, through shared hosting
  • Integrating practices such as protocol posting, data sharing, and study replication into existing EQUATOR reporting guidelines, where these are not yet included.

In particular, COS is keen to utilise EQUATOR’s existing systems to enhance research credibility by promoting the uptake of preregistration.

The potential impact

Open science practices are already included in CONSORT, but inclusion in further reporting guidelines could scale up adoption substantially. In addition, the robustness of EQUATOR’s reporting standards could offer further structure and visibility to COS’ ongoing research.

Director of the EQUATOR Network, David Moher, has expressed his excitement around the partnership:

“Since its inception in 2006, the EQUATOR Network has worked hard to help improve comprehensive and transparent reporting of research. Collaborating with COS will help further achieve this objective.”

————————————————–

Do you think open science practices should be included in reporting guidelines?

Overcoming bias in ‘overviews of reviews’: a spotlight on appraisal tools
Published 30 July 2024 – https://thepublicationplan.com/2024/07/30/overcoming-bias-in-overviews-of-reviews-a-spotlight-on-appraisal-tools/

KEY TAKEAWAYS

  • ‘Overviews of systematic reviews’ are a feature of evidence-based decision making, but are only as strong as the individual reviews they include. Evaluating potential biases and the methodological quality of systematic reviews is therefore crucial.
  • A recent article examines 2 recommended systematic review assessment tools, AMSTAR-2 and ROBIS. While both have value, their use requires proper training, time, and know-how.

Synthesising evidence from multiple systematic reviews (also known as conducting an umbrella review or ‘overview of reviews’) can form a key part of evidence-based decision making and treatment guidelines. However, conducting effective ‘overviews of reviews’ requires careful planning to minimise bias, which can be present at either a primary study or individual review level. In a recent BMJ Medicine methods primer, Carole Lunny and colleagues address the challenges of assessing and reporting bias in systematic reviews. The group offer a detailed examination of AMSTAR-2 and ROBIS, two recommended appraisal tools, and provide practical guidance for authors of ‘overviews of reviews’.

AMSTAR-2 versus ROBIS

The group compared key features of each tool.

AMSTAR-2:

  • 16-item checklist
  • focuses on the methodological quality of systematic reviews of healthcare interventions, including risk of bias
  • reportedly favoured for its quick and easy-to-use format
  • may be preferred for broad assessment of systematic review quality.

ROBIS:

  • domain-based tool
  • 19 items, aimed at identifying biases in systematic reviews
  • useful for pinpointing concerns in review conduct and assessing relevance
  • requires “more thoughtful assessment and time”
  • may be preferred for more nuanced assessments, or comparisons of risk of bias across multiple types of systematic reviews.

Standardising ‘overviews of reviews’

The authors call for a standardised approach to ‘overviews of reviews’ to enhance their credibility and value.

Regardless of the appraisal tool used, the authors call for a standardised approach to ‘overviews of reviews’ to enhance their credibility and value. They outline several key recommendations:

  • Report methodological quality or bias by item, domain, and overall judgement, focusing on outcomes.
  • Discuss risk of bias for each outcome.
  • Highlight any individual review methodological quality issues or potential biases as limitations of the ‘overview of reviews’.
  • Use ROBIS to subgroup reviews by risk of bias, identifying overemphasised findings and excluding high-risk reviews.

An expanding toolkit

Previously, the launch of PRISMA-S provided much-needed guidance on reporting literature searches within systematic reviews, and Cochrane’s Hilda Bastian proposed solutions to ensure that systematic review protocols were robust. Now, Lunny and colleagues’ primer, and the tools therein, sit alongside initiatives from the LATITUDES Network to form part of a drive to reduce bias in evidence synthesis.

————————————————–

Do you use a specific tool(s) when synthesising evidence from systematic reviews?

Building trust: ACCORD guidelines for reporting consensus methods
Published 9 July 2024 – https://thepublicationplan.com/2024/07/09/building-trust-accord-guidelines-for-reporting-consensus-methods/

KEY TAKEAWAY

  • The ACCORD reporting guidelines comprise a 35-item checklist that aims to improve the transparency of reporting on consensus methods.

The COVID-19 pandemic highlighted the need for effective knowledge-sharing to guide healthcare decisions. In rapidly evolving situations, reaching consensus among experts from diverse backgrounds is crucial, especially when evidence is emergent or inconsistent. This process is best achieved using formal consensus methods.

Despite their critical role in healthcare and policy decision-making, consensus methods are often inadequately reported, leading to inconsistencies and lack of transparency. To address these issues, the ACcurate COnsensus Reporting Document (ACCORD) project was established to develop comprehensive guidelines for reporting the numerous consensus methods used in medical research.

The ACCORD reporting guidelines aim to enhance trust in the recommendations made by consensus panels, benefiting authors, journal editors, reviewers, and, ultimately, patients through more reliable healthcare recommendations.

The ACCORD checklist was formulated using the EQUATOR Network’s methodology for developing reporting guidelines, with the full study protocol published in Research Integrity and Peer Review. The project began with a systematic review, followed by 3 rounds of the Delphi process and several steering committee meetings. To ensure a comprehensive perspective, a diverse panel was engaged, comprising 72 participants from 6 continents and various professional backgrounds, including clinical, research, policy, and patient advocacy. Through this rigorous process, a preliminary checklist was refined to a final list of 35 essential items covering all sections of a manuscript.

The ACCORD reporting guidelines aim to enhance trust in recommendations made by consensus panels, benefiting authors, journal editors, reviewers, and ultimately patients through more reliable healthcare recommendations.

————————————————–

What do you think – will the ACCORD guidelines improve the transparency of reporting on consensus methods?

AI in scientific reporting: NASW’s position statement
Published 21 May 2024 – https://thepublicationplan.com/2024/05/21/ai-in-scientific-reporting-nasws-position-statement/

KEY TAKEAWAYS

  • NASW sets out its position on the use of AI, highlighting the importance of human writers and editors and the need for transparency.
  • NASW calls for members to follow these principles and for us all to remain vigilant in the use of AI to maintain integrity and accuracy in scientific reporting.

In the wake of organisations such as the International Society for Medical Publication Professionals (ISMPP) and Nature setting out their stance on the use of AI in medical publishing, the National Association of Science Writers (NASW) have now released their position statement on the use of generative AI tools.

Who are NASW?

NASW is a community of people who write and produce material intended to inform the public about science, health, engineering, and technology. At the forefront of NASW’s operating principles is their aim to “foster the dissemination of accurate information regarding science through all media normally devoted to informing the public”.

What is NASW’s position on AI?

In NASW’s statement, they highlight some of the current concerns around generative AI tools replacing human writers.

In light of these concerns, NASW go on to make the following commitments and recommend that members:

  • do not use generative AI tools to replace human writers and editors
  • do not support publication of content generated entirely by AI, without human input and oversight
  • do not use AI-generated images, except under very particular conditions and with safeguards in place
  • maintain transparency about the use of AI systems
  • support media unions in demanding worker protections and input into AI use.

What can you do?

NASW call on us all to “remain vigilant so that readers and writers alike can clearly distinguish between human- and algorithm-generated content”.

We must remain vigilant so that readers and writers alike can clearly distinguish between human- and algorithm-generated content.

————————————————–

What do you think – should we all be following the NASW guidelines to protect writers and the public from the potential pitfalls of AI?

ICMJE recommendations update 2024: what’s new and what’s next?
Published 2 April 2024 – https://thepublicationplan.com/2024/04/02/icmje-recommendations-update-2024-whats-new-and-whats-next/

KEY TAKEAWAYS

  • Key updates to the ICMJE Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals include guidance on the use of AI by authors, editors, and reviewers.
  • Other important updates include statements on fair authorship assignment, sustainability goals, funding support declarations, and protection of research participants.

The International Committee of Medical Journal Editors (ICMJE) recently updated its Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals. Key updates provide guidance on appropriate authorship of research carried out in low- and middle-income countries (LMICs) and the use of artificial intelligence (AI) in generating and reporting data. The latest recommendations and an annotated version of the previous recommendations are both freely available on the committee’s website, and a summary of all updates is provided below.

  • Authorship: local investigators should be included as authors on publications reporting data from LMICs. As well as ensuring fairness, local author contributions provide additional context on the implications of research.
  • Use of AI (authors): if AI is used to provide writing assistance, this should be clearly stated in the article acknowledgements. The use of AI by researchers to help collect data or generate figures should be noted in the methods.
  • Use of AI (editors and reviewers): journal editors should be aware of potential confidentiality concerns if AI is used in the review process. Reviewers must request permission from the journal before using AI assistance.
  • Carbon emissions: all stakeholders in medical publishing should collaborate to work towards net zero carbon emissions.
  • Acknowledging funding support: funding statements should relate directly to the work being reported, for example: “This study was funded by A; Dr. F’s time on the work was supported by B.” Other potential conflicts of interest and general funding support should be included in the disclosures section.
  • Protection of research participants: authors should be prepared to provide approval documentation for their study if requested by editors.
  • Citations: wherever possible, cited references should be published articles rather than abstracts.

In an editorial published in Cureus, Sankalp Yadav takes a detailed look at the evolution of the recommendations and their impact on medical publishing, describing the latest updates as a “beacon of ethical guidance in the ever-evolving domain of biomedical research and publishing”. Yadav also discusses some of the ongoing challenges in implementing the ICMJE guidance, such as the promotion of fair and ethical authorship practices and keeping pace with new developments – something that may be particularly true for AI and its increasing impact across all areas of medical research and publishing.

If AI is used to provide writing assistance, this should be clearly stated in the article acknowledgements.

————————————————

Which aspect of the updated ICMJE recommendations do you believe will have the most positive impact on the quality and integrity of medical publications?
