Citation – The Publication Plan
A central online news resource for professionals involved in the development of medical publications, publication planning, and medical writing.
https://thepublicationplan.com

How can we prevent retracted research from polluting the literature?
https://thepublicationplan.com/2025/04/03/how-can-we-prevent-retracted-research-from-polluting-the-literature/
Thu, 03 Apr 2025 08:53:50 +0000

KEY TAKEAWAYS

  • A literature-mining project featured in Nature News showed that some papers heavily rely on retracted research.
  • Technological tools, such as Guillaume Cabanac’s ‘Problematic Paper Screener’, could be part of the solution.

The results of a project featured in Nature News showed that problematic research continues to amass citations in the literature, even after retraction. Perhaps unwittingly, some authors cite large numbers of retracted papers, which can raise questions about their own work once it is published. With retracted works accounting for as much as 65% of citations in some papers, there is a drive to harness technology to solve the problem.

You are what you cite

Worryingly, problematic papers can continue to be cited long after retraction. Although not a definitive sign of misconduct, heavily relying on research that has been withdrawn can retrospectively undermine a paper’s reliability. Unfortunately, no system exists for alerting researchers to retractions that may impact papers they have already authored.

Hoping to change this, “research integrity sleuth” Guillaume Cabanac, who is behind the project reported in Nature News, has developed tools such as his ‘Feet of Clay’ detector, which flags papers that cite retracted works (and his earlier ‘annulled detector’, which tracks the retracted papers themselves).

As well as encouraging publishers to conduct regular checks and notify authors of any retractions they have cited, Cabanac urges authors to make use of these and other tools, such as plug-ins that can automatically flag papers that have received comments on PubPeer, before submitting papers.

“You always have to double-check what you’re basing your work on.”
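The basic check these tools automate is simple to sketch. The following is a hypothetical illustration (not Cabanac’s implementation), assuming you already have a set of retracted DOIs, for example exported from the Retraction Watch Database, and the DOIs cited by a manuscript:

```python
def find_retracted_citations(cited_dois, retracted_dois):
    """Return the cited DOIs that appear in a known-retracted set.

    DOIs are matched case-insensitively, since DOI names are
    case-insensitive by specification.
    """
    retracted = {doi.lower() for doi in retracted_dois}
    return [doi for doi in cited_dois if doi.lower() in retracted]

# Hypothetical example data -- not real DOIs.
references = ["10.1000/example.1", "10.1000/example.2"]
known_retracted = {"10.1000/EXAMPLE.2"}

print(find_retracted_citations(references, known_retracted))
# -> ['10.1000/example.2']
```

In practice, the hard part is keeping the retracted-DOI set current, which is exactly what services like the Feet of Clay detector do at scale.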

Tools to clean up the literature

As Cabanac himself reports, the Feet of Clay detector is just the latest addition to his Problematic Paper Screener, an automated system for flagging papers that may warrant further scrutiny. Launched in 2021, the screener tracks the global landscape of retractions and uses multiple detectors to automatically mine the literature for signs of potential misconduct, such as:

  • ‘tortured phrases’ typically seen when AI re-writes existing scientific content
  • ‘fingerprints’ associated with random paper generators such as SCIgen or Mathgen
  • nonsensical content, such as cell lines or nucleotide sequences that have been fabricated
  • citations for journals known to have been hijacked.

Cabanac hopes that this software will facilitate continuous evaluation of published literature, with over 875,000 papers having been flagged and assessed via the system so far. He calls publishers and authors to action: “A combined preventive and curative effort from all involved is key to sustaining the reliability of the scientific literature — a crucial undertaking for science and for public trust.”

————————————————–

Who should be responsible for checking whether cited works have been retracted?

Open access publishing: broadening the impact of research
https://thepublicationplan.com/2024/04/18/open-access-publishing-broadening-the-impact-of-research/
Thu, 18 Apr 2024 14:36:42 +0000

KEY TAKEAWAYS

  • Open access research is not only cited more often than research behind paywalls; these citations also come from a broader range of fields, institutions, and regions.
  • This citation diversity is an important metric for exploring the true impact of research.

Open access publications have the potential to reach a wider audience, as evidenced by increased citation counts compared with publications behind paywalls. However, a recent large-scale study by Dr Chun-Kai Huang and colleagues probes further, challenging the research community to look at the diversity of the readership behind those numbers.

In the largest study of its kind, the authors drew on 19 million research outputs and 420 million citation links, covering 2010–2019, to examine citation diversity. Their data reveal that:

  • open access is indeed associated with higher citation counts
  • open access consistently provides a ‘citation diversity advantage’ (in other words, open access publications are cited by researchers from a more diverse range of fields, institutions, and regions)
  • this citation diversity advantage is stronger for green open access than for gold.
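The full details of the study’s diversity measure are not given here, but one standard way to quantify how evenly a paper’s citations are spread across fields, institutions, or regions is Shannon entropy. The sketch below is illustrative only and is not the authors’ method:

```python
import math
from collections import Counter

def citation_diversity(citing_groups):
    """Shannon entropy of the distribution of citations across groups
    (fields, institutions, or regions). Higher = more diverse."""
    counts = Counter(citing_groups)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical citing regions for two papers with 10-12 citations each.
narrow = ["North America"] * 9 + ["Northern Europe"]
broad = ["North America", "Northern Europe", "Latin America", "North Africa"] * 3

print(citation_diversity(narrow) < citation_diversity(broad))  # True
```

A paper cited evenly by four regions scores higher than one cited almost entirely from a single region, which is the intuition behind a ‘citation diversity advantage’.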

One concern raised by the study is that, while open access increases citation diversity for research from regions typically under-represented in the published literature (eg, North Africa, Latin America, and the Caribbean), the advantage was more pronounced for those areas that already have greater visibility (eg, North America and Northern Europe). The authors question whether this is another instance in which the “rich get richer”, and urge the research community as a whole to advocate for equitable open access.

Open access consistently provides a ‘citation diversity advantage’.

Nevertheless, Dr Huang and colleagues put forward citation diversity as a key metric that looks beyond citation count to explore the impact of research. It would seem that if this measure were commonly used, and valued as highly as the impact factor, we could broaden the reach of medical research.

————————————————

What do you think – should we measure citation diversity of publications as standard?

Is there a citation advantage with open access?
https://thepublicationplan.com/2021/06/24/is-there-a-citation-advantage-with-open-access/
Thu, 24 Jun 2021 13:18:57 +0000

With Plan S coming into effect earlier this year, there has been much discussion regarding the potential benefits and challenges associated with open access. It has been suggested that open access articles are available to a larger audience than those published behind paywalls, leading to increased visibility, readership and impact. Citations are often used to measure these factors. However, a number of studies have failed to reach a consensus on whether an open access citation advantage exists.

Dr Isabel Basson and colleagues aimed to address this question by applying three measures of citation advantage:

  • normalised citation score (NCS) – indicates if an article received the expected number of citations and corrects for subject area and publication year
  • citedness – whether articles were cited by individuals other than the authors within 2 years of publication
  • most frequently cited – the percentage of publications in the most frequently cited 1%, 5% and 10% of articles in each subject area.
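As a rough sketch of the NCS idea (the exact normalisation used in the study may differ), an article’s citation count is divided by the average count for articles in the same subject area and publication year, so that a score of 1.0 means ‘as expected’:

```python
from statistics import mean

def normalised_citation_score(article_citations, field_year_citations):
    """Citations of one article divided by the mean citations of all
    articles in the same subject area and publication year.
    A value of 1.0 means the article performed exactly as expected;
    above 1.0 indicates a citation advantage."""
    return article_citations / mean(field_year_citations)

# Hypothetical cohort: citation counts for articles in one field and year.
cohort = [2, 4, 6, 8, 10]  # mean = 6

print(normalised_citation_score(12, cohort))  # 2.0
```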

The study, published in Scientometrics, used open access labels in the Web of Science (WoS) metadata to identify open access articles published in journals listed in the Directory of Open Access Journals, and compared measures of citation advantage with subscription journal articles. Limiting the analysis to English-language articles, to avoid a potentially confounding effect of language, the authors identified over 3.6 million articles, of which 87.3% were published in subscription journals and 12.7% were published open access. The proportion of open access versus subscription journal articles varied considerably across individual subject areas.

Basson and colleagues reported results for the three measures of citation advantage:

  • NCS – a relationship between NCS and access status was found in 76 (30%) of the 250 WoS subject areas investigated. An open access citation advantage was seen in only one subject area; in the remaining 75, subscription journal articles showed a citation advantage.
  • Citedness – a relationship between citedness and access status was seen in fewer than half of subject areas, only 4 of which showed an open access citation advantage.
  • Most frequently cited – the citation advantage favoured subscription journal articles rather than open access journal articles in the majority of subject areas.

Across all measures of citation advantage, only six of the 250 subject areas in WoS were reported to experience an open access citation advantage compared with subscription journal articles.

This study was one of the first to use open access labels in the WoS metadata to investigate citation advantage with access status. Overall, the authors conclude that access status accounts for little of the variability in the number of citations an article receives and suggest that other factors need to be considered when explaining variation in citation.

——————————————————–

Do you find the reported lack of citation advantage with open access surprising?

——————————————————–

A bot is monitoring new publications for retracted article citations
https://thepublicationplan.com/2021/06/01/a-bot-is-monitoring-new-publications-for-retracted-article-citations/
Tue, 01 Jun 2021 14:54:46 +0000

Despite steps taken by researchers and journals to maintain research integrity, a small proportion of published papers (around 0.04% as of 2018) go on to be retracted. Lately, retractions of high profile articles related to COVID-19 have been well publicised, although this is not the norm for other topics. Because retractions are often inadequately communicated, these papers may still be cited in other publications without mention of the retraction—potentially misleading the audience or even invalidating meta-analysis results. A recent article in Nature Index discusses the risks that retractions pose to the scientific literature and the steps being taken to tackle this issue.

One such strategy is the scite Reference Check bot, which posts on Twitter to flag publications that cite articles after those articles have been retracted. The tool can also be used to check the references in an uploaded manuscript, for a small fee. The tool will incorrectly flag citations that appropriately specify that the cited article was retracted; however, because authors rarely provide this kind of clarification within the citation, it is not expected to generate many false positives.
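The false-positive logic described above can be made concrete with a small sketch. This is hypothetical code, not scite’s actual implementation: a citation to a retracted DOI is flagged unless the reference text itself already notes the retraction.

```python
def flag_unmarked_retractions(references, retracted_dois):
    """Flag references that cite a retracted DOI without noting it.

    Each reference is a (doi, citation_text) pair. A citation whose
    text already contains the word 'retracted' (in any case) is
    skipped, mirroring the clarification authors rarely provide.
    """
    retracted = {d.lower() for d in retracted_dois}
    return [
        doi for doi, text in references
        if doi.lower() in retracted and "retracted" not in text.lower()
    ]

# Hypothetical references -- not real DOIs or papers.
refs = [
    ("10.1000/x.1", "Smith J, et al. Some study. J Example. 2019."),
    ("10.1000/x.2", "RETRACTED: Jones A, et al. Other study. 2018."),
]
print(flag_unmarked_retractions(refs, {"10.1000/x.1", "10.1000/x.2"}))
# -> ['10.1000/x.1']
```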

The bot is a new addition to the arsenal of resources that can be used to identify retracted articles.

The bot is a new addition to the arsenal of resources that can be used to identify retracted articles, including the Retraction Watch Database, Zotero and Open Retractions. These tools acknowledge the threat that retractions pose to science and represent a meaningful step towards achieving greater research integrity.

——————————————————–

When developing a publication, which strategy would you be most likely to use to check whether any of your references have been retracted?

——————————————————–


Journal citation counts increase despite discontinuation from Scopus
https://thepublicationplan.com/2020/07/16/journal-citation-counts-increase-despite-discontinuation-from-scopus/
Thu, 16 Jul 2020 10:21:47 +0000

Scopus is the largest bibliometric database of its kind, containing the greatest number of abstracts and articles from peer-reviewed academic journals. It also provides a range of journal, article, and author metrics. The quality of journals indexed in Scopus is reviewed at regular intervals or when publication concerns are raised. If a journal is found to fall below a specified standard, or the publication concerns are valid, the journal may be removed from the database. However, articles from these journals published prior to discontinuation remain part of Scopus and can, therefore, be cited. In their recent article published in F1000Research, Dr Cortegiani and colleagues scrutinised the citation metrics and other features of journals that had been discontinued by Scopus due to publication concerns.

The authors looked at 317 journals (from 135 publishers) that had been discontinued. Key findings included:

  • The mean number of citations per year and per document were both significantly higher after journal discontinuation compared with before.
  • Publishers with the most discontinued journals were Academic Journals Inc. (39 titles), Asian Network for Scientific Information (19 titles), and the OMICS Publishing Group (18 titles).
  • Subject areas with the most discontinued journals were medicine (16%), agriculture/biological science (11%), and pharmacology, toxicology and pharmaceutics (10%).
  • Open access publication models were used by 93% of discontinued journals.
  • Twenty-three percent of discontinued journals were included in Cabell’s blacklist and 2% in Cabell’s whitelist, while 77% were included in Beall’s list of predatory journals or publishers.
  • Nineteen percent had also been discontinued from the Directory of Open Access Journals.

The authors discuss how the metrics provided by Scopus may be used by institutions to rank journals in order to evaluate the publishing performance of current or potential employees, allocate financial bonuses, or evaluate funding applications. Rigorous quality control of content in Scopus is therefore important to ensure the accuracy of these activities.

The authors note that many of the discontinued journals displayed predatory behaviours, and it is right that they are no longer indexed in Scopus. They also feel that it would be unfair to remove articles published prior to discontinuation as this would punish researchers who chose to publish in these journals unaware of the quality issues or before a deterioration in journal performance. However, the authors conclude that clearer warnings to highlight whether articles have come from discontinued journals, alongside other creative solutions, are required to ensure the reliability of Scopus metrics both at the journal and author level.


——————————————————–

Summary by Alice Wareham PhD, CMPP from Aspire Scientific

——————————————————–

With thanks to our sponsors, Aspire Scientific Ltd and NetworkPharma Ltd


The link between data sharing and citation counts
https://thepublicationplan.com/2019/10/11/the-link-between-data-sharing-and-citation-counts/
Fri, 11 Oct 2019 08:20:37 +0000

As journals attempt to increase the openness and reproducibility of published data through mandatory data sharing policies, researchers from The Alan Turing Institute have examined the nature of data sharing statements and how they affect article impact.

In a preprint published on arXiv.org, researchers categorised the data sharing statements of 531,889 journal articles published by Public Library of Science (PLoS) and BioMed Central (BMC), since the introduction of data sharing policies. (Data sharing statements were required or encouraged from 2011 and mandated in 2015 by BMC, and mandated from 2014 for PLoS.)

Data sharing statements were categorised based on whether underlying data were available via a direct link to a repository, available in the paper or supplementary materials, available upon request, or not available at all. Articles linking to source data repositories were found to make up only a fraction (12–21%) of the total, while the most common data sharing statements encapsulated the default journal guidance wording of ‘not applicable’ or ‘data can be found within the article body or supplementary material’.
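A keyword-based categorisation along these lines might be sketched as follows. The rules are illustrative assumptions, not the researchers’ actual classifier:

```python
import re

def categorise_statement(statement):
    """Assign a data sharing statement to one of four rough categories.

    Keyword rules here are illustrative only; the study's actual
    classification scheme was more sophisticated.
    """
    s = statement.lower()
    if re.search(r"https?://|doi\.org|repository|deposited", s):
        return "repository link"
    if "supplementary" in s or "within the article" in s or "within the paper" in s:
        return "in paper/supplement"
    if "upon request" in s or "on request" in s:
        return "upon request"
    return "not available"

print(categorise_statement("All data are deposited at https://doi.org/10.1000/x"))
# -> 'repository link'
print(categorise_statement("Data are available from the authors upon request."))
# -> 'upon request'
```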

The researchers found that articles linking to a data repository had a 25% higher citation count than average. These findings expand on previous studies showing a similar link between open data and impact in other fields, providing evidence of a multi-disciplinary effect.

The authors acknowledge the potential for bias; articles with access to suitable repositories may be more likely to be authored by larger, successful research groups with greater resources at their disposal. However, efforts are underway to overcome these barriers via initiatives such as Dryad and FlowRepository, or journal linked repositories such as figshare.

The authors suggest that sharing data enhances the credibility of an article’s results, as it supports reproducibility, and encourages re-use (which might further contribute to citation counts). As an increasing number of funding bodies require that publications include data sharing statements, these findings illustrate the benefits to authors and researchers.


——————————————————–

Summary by Julianna Solomons PhD, CMPP from Aspire Scientific

——————————————————–



Can preprints increase article citations and impact?
https://thepublicationplan.com/2019/07/30/can-preprints-increase-article-citations-and-impact/
Tue, 30 Jul 2019 09:11:48 +0000

There has been much discussion of the potential advantages and disadvantages of preprints in medical publishing over recent years, and last month’s launch of medRxiv was much anticipated. While the community watches with interest to see how this new platform fares, findings from bioRxiv provide a useful indication of what we might expect to see. For instance, a recent preprint posted on the platform reports that papers posted as a preprint before being published in a peer reviewed journal have higher citation levels and Altmetric scores than other articles.

Nicholas Fraser and his co-authors evaluated articles that had been submitted as preprints to bioRxiv between November 2013 and December 2017. When compared with articles without a preprint, they found that:

  • The number of citations to journal articles with preprints was 61% higher.
  • The citation advantage continued 3 years after publication, with average monthly citations per paper around 50% higher for articles with a preprint.
  • Articles with preprints had higher mean counts for all Altmetrics assessed (tweets, blogs, mainstream media outlets, Wikipedia and Mendeley).

The group also noted that:

  • Further work is needed to identify the cause of this citation advantage, which did not appear to be driven by early access or quality effects.
  • Although there was a strong preference to cite a published article rather than the corresponding preprint, citations were also made to preprints themselves, some of which were not subsequently published in peer reviewed journals. The authors highlight that the increased willingness of researchers to cite unreviewed work may be a consideration in current debates on the role of peer review.
  • bioRxiv preprints themselves were widely shared on Twitter and on blogs, but received far less online attention in mainstream media outlets and Wikipedia than the final peer reviewed articles. Fraser and his colleagues suggest that, while authors are comfortable sharing preprints with their peers using informal media platforms, there may be an unwillingness to disclose non-reviewed research to the public.

The authors conclude that “In the continuing online debates surrounding the value of preprints and their role in modern scientific workflows, our results provide support for depositing preprints as a means to extend the reach and impact of work in the scientific community.”


—————————————————–

Summary by Debbie Sherwood BSc from Aspire Scientific

——————————————————–


 

Open source tools: why citations are important
https://thepublicationplan.com/2019/07/09/open-source-tools-why-citations-are-important/
Tue, 09 Jul 2019 14:53:21 +0000

Open source tools and packages such as NumPy, SciPy and scikit-image are used in research for analysing and processing data. However, such tools are often not cited in the publications that result from their use. This issue is highlighted by Juan Nunez-Iglesias, a research scientist at the University of Melbourne, in a recent blog post.

In academia, citations are a currency, proving that your work has been useful to others. They contribute significantly to the success of funding applications and career progression. As Nunez-Iglesias highlights, code, on the other hand, is not normally recognised as a valuable output by universities or funding bodies – so code developers often write papers describing their software, facilitating its citation. Nunez-Iglesias points out that most open source software developers are practising scientists in academia, not industry: citations are crucial for these contributors, but this is often overlooked by those using the software.

Nunez-Iglesias emphasises that the lack of citation of papers for open source tools is symptomatic of a wider problem: open source software is chronically undervalued. In a second blog post, he stresses that a structural change is required that recognises the importance of such software in science. He hopes that drawing attention to this issue will prompt a cultural change, ensuring that in the future, open source developers get the recognition that they deserve.


——————————————————–

Summary by Beatrice Tyrrell DPhil from Aspire Scientific

——————————————————–



Self-citation by peer reviewers: when is it appropriate and how to safeguard against malpractice
https://thepublicationplan.com/2019/03/19/self-citation-by-peer-reviewers-when-is-it-appropriate-and-how-to-safeguard-against-malpractice/
Tue, 19 Mar 2019 09:07:35 +0000

The potential vulnerability of the peer review system to unethical behaviour is highlighted in an article from Bioinformatics. The journal reports a case concerning a peer reviewer who asked for the addition of more than 30 citations in each article reviewed, of which ~90% were self-authored and the remainder heavily cited the reviewer’s work. The journal has since banned the reviewer. The decentralised nature of the peer review system was felt to have contributed to this behaviour going undetected for so long: editors and authors are likely to be reluctant to accuse someone of unethical conduct when they may have only one instance upon which to base their judgement.

The journal now calls for greater scrutiny by journal editors to ensure that citations added post peer review are relevant and important. Authors are encouraged to be vigilant and not to be tempted to comply with unjustified suggestions for citations to satisfy reviewers. Bioinformatics has also updated its guidance to peer reviewers (not publicly available), allowing the inclusion of a reviewer’s own papers only when there is clear rationale that the article under review would otherwise be scientifically weaker. Reviewers are asked to refrain from requesting the addition of ‘substantial’ numbers of references — defined as more than one reference per printed page — and should not make vague mention of incomplete reviews of the field but rather specify the nature of the missing studies to be included.

To provide a more cohesive record of peer reviewer behaviour, it is suggested that journals track which peer reviewers make self-citation requests. Where misconduct is suspected, the Committee on Publication Ethics (COPE) has guidelines on how editors should approach the sharing of information with other editors within the confines of confidentiality. It is hoped that, in the future, centralised resources such as the publication activity tracking website Publons and the researcher identifier ORCID may help to identify patterns of unethical behaviour. Once a concern has been raised, the article emphasises that simply removing a reviewer from reviewing duties may not address the wider implications. Instead, COPE advises contacting the reviewer for an explanation as a first step and, if it is unsatisfactory, bringing the matter to the attention of their immediate supervisor.


——————————————————–

Summary by Julianna Solomons PhD, CMPP from Aspire Scientific

——————————————————–



Reduced impact factor due to journal growth
https://thepublicationplan.com/2018/08/21/reduced-impact-factor-due-to-journal-growth/
Tue, 21 Aug 2018 13:45:42 +0000

In a recent blog for The Scholarly Kitchen, Phil Davis describes how journal growth can negatively affect citation performance measures, most notably the Journal Impact Factor (JIF).

Davis outlines how a particular year’s JIF score indicates the citation level of papers published in the previous two years – Clarivate Analytics released the latest JIF scores in Journal Citation Reports in June 2018. The JIF is one of a range of citation indicators that can be used to measure the impact of a journal. Despite being criticised for some time within scientific and publishing circles, the JIF is widely used by authors, along with factors such as journal reputation, when selecting a journal for submission of their research. Consequently, a decrease in JIF could lead to fewer high-quality submissions.

Davis highlights that journal growth can reduce the JIF because, for most journals, papers receive fewer citations in their second year than their third. Davis explains that “if a journal grows, its JIF calculation becomes unbalanced with a larger group of underperforming 2-year old papers and a relatively smaller group of 3-year old papers. The overall result is a decline in the JIF score. Conversely, a journal that shrinks can expect an artificial boost in its JIF, all other factors remaining the same.” He suggests that publishers aiming for journal growth should consider doing this strategically, in order to avoid artificially depressing future citation scores. This could include focussing on growth at the beginning of the calendar year. However, balancing this with production schedules and the needs of authors may be far from simple!
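Davis’s mechanism can be checked with simple arithmetic. The JIF for year Y is citations received in Y by items published in Y-1 and Y-2, divided by the number of citable items in those two years; growth makes the younger, lower-cited cohort larger. The per-paper citation rates below are assumed purely for illustration:

```python
def jif(n_prev1, n_prev2, cites_year2, cites_year3):
    """Journal Impact Factor for year Y.

    n_prev1 / n_prev2 : number of items published in Y-1 and Y-2.
    cites_year2       : average citations a paper earns in its second
                        calendar year (the Y-1 cohort, counted in Y).
    cites_year3       : average citations earned in its third calendar
                        year (the Y-2 cohort, counted in Y).
    """
    total_cites = n_prev1 * cites_year2 + n_prev2 * cites_year3
    return total_cites / (n_prev1 + n_prev2)

# Assumed rates: papers earn fewer citations in year 2 than year 3.
stable = jif(100, 100, cites_year2=2.0, cites_year3=4.0)   # 3.0
growing = jif(200, 100, cites_year2=2.0, cites_year3=4.0)  # ~2.67

print(round(stable, 2), round(growing, 2))
```

With identical per-paper citation behaviour, the growing journal’s JIF drops simply because the underperforming 2-year-old cohort carries more weight in the average.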


——————————————————–

Summary by Sophie Boyd, MSc Science Communication student at the University of Manchester. Contact Sophie at s_e_boyd@hotmail.com

——————————————————–


