Quickly Share, Gain Feedback, and Improve Your Papers with Research Square

The HSLS Update has published numerous articles about preprints over the years. Here we introduce another iteration of the preprint movement: Research Square, a multidisciplinary platform that helps researchers share their work early, gather feedback, and improve their manuscripts prior to (or in parallel with) journal submission.

So what differentiates Research Square from other preprint servers? The focus is on “added value” features such as:

  • Increased discoverability and readability due to indexed and machine-readable full text in HTML
  • Commenting via a custom-built system or the Hypothesis annotation tool
  • Figure rendering with a lightbox, which allows for zooming and downloading
  • Full metrics, including Altmetrics and Dimensions data
  • Research Square Badges to indicate preprint quality
  • Editing services to improve the manuscript prior to journal submission
  • Video and infographic services to help communicate the research

In addition, Research Square collaborates with Springer Nature on a free preprint service that gives authors the opportunity to have their manuscript posted online in conjunction with submission to select journals. This service, called In Review, lets authors and readers follow the manuscript's status through a peer-review timeline while the article is under review at the selected journal.

Research Square accepts many types of submissions: research articles, systematic reviews, methods articles, short reports, case reports, and data notes. The latter type is particularly compelling, as it provides an opportunity to post a brief write-up of a single dataset ( Data Note example ). All submissions are encouraged to include a Data Availability Statement documenting where to locate the data. Literature reviews, hypotheses, opinions, theories, and commentaries are not accepted, but manuscripts reporting negative results are welcome.

Each posted preprint is published under a Creative Commons CC-BY 4.0 license and assigned a Digital Object Identifier (DOI) issued through Crossref. The community-supported scholarly content preservation repository, Portico, permanently archives all content. In addition to preprints, Research Square posts protocols from Protocol Exchange, an open repository of community-contributed protocols sponsored by Nature Research. Explore the FAQ for additional information about the entire Research Square preprint platform.

~Carrie Iwema

Systematic examination of preprint platforms for use in the medical and biomedical sciences setting

BMJ Open, Volume 10, Issue 12

  • Jamie J Kirkham 1 (ORCID: 0000-0003-2579-9325)
  • Naomi C Penfold 2 (ORCID: 0000-0003-0568-1194)
  • Fiona Murphy 3
  • Isabelle Boutron 4
  • John P Ioannidis 5
  • Jessica Polka 2
  • David Moher 6 (ORCID: 0000-0003-2434-4206)
  • 1 Centre for Biostatistics, Manchester Academic Health Science Centre, University of Manchester, Manchester, UK
  • 2 ASAPbio, San Francisco, California, USA
  • 3 Murphy Mitchell Consulting Ltd, Chichester, UK
  • 4 Université de Paris, Centre of Research in Epidemiology and Statistics (CRESS), Inserm, Paris, France
  • 5 Meta-Research Innovation Center at Stanford (METRICS) and Departments of Medicine, of Epidemiology and Population Health, of Biomedical Data Science, and of Statistics, Stanford University, Stanford, California, USA
  • 6 Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
  • Correspondence to Professor Jamie J Kirkham; jamie.kirkham@manchester.ac.uk

Objectives The objective of this review is to identify all preprint platforms with biomedical and medical scope and to compare and contrast the key characteristics and policies of these platforms.

Study design and setting Preprint platforms that were launched up to 25 June 2019 and have a biomedical and medical scope according to MEDLINE’s journal selection criteria were identified using existing lists, web-based searches and the expertise of both academic and non-academic publication scientists. A data extraction form was developed, pilot tested and used to collect data from each preprint platform’s webpage(s).

Results A total of 44 preprint platforms were identified as having biomedical and medical scope: 17 (39%) were hosted by the Open Science Framework preprint infrastructure, 6 (14%) were provided by F1000 Research (the Open Research Central infrastructure) and 21 (48%) were other independent preprint platforms. Preprint platforms were either owned by non-profit academic groups, scientific societies or funding organisations (n=28; 64%), owned or partly owned by for-profit publishers or companies (n=14; 32%), or owned by individuals or small communities (n=2; 5%). Twenty-four (55%) preprint platforms accepted content from all scientific fields, although some of these had restrictions relating to funding source, geographical region or an affiliated journal's remit. Thirty-three (75%) preprint platforms provided details about article screening (basic checks) and 14 (32%) of these actively involved researchers with content expertise in the screening process. Almost all preprint platforms allow submission to any peer-reviewed journal following publication and have a preservation plan for read access, and most have a policy regarding reasons for retraction and the sustainability of the service.

Conclusion A large number of preprint platforms exist for use in biomedical and medical sciences, all of which offer researchers an opportunity to rapidly disseminate their research findings onto an open-access public server, subject to scope and eligibility.

  • statistics & research methods
  • medical journalism
  • medical ethics

This is an open access article distributed in accordance with the Creative Commons Attribution 4.0 International (CC BY 4.0) license, which permits others to copy, redistribute, remix, transform and build upon this work for any purpose, provided the original work is properly cited, a link to the licence is given, and an indication of whether changes were made. See:  https://creativecommons.org/licenses/by/4.0/ .

https://doi.org/10.1136/bmjopen-2020-041849


Strengths and limitations of this study

We developed robust methodology for systematically identifying relevant preprint platforms and involved platform owners/representatives wherever possible to verify data.

We undertook an internal pilot of developing and testing out the data collection form in collaboration with a preprint platform owner and funders.

For platforms that had a partner journal and without verification, it was sometimes unclear if the policy information related to the journal, preprint platform or both.

We provide a searchable database as a valuable resource for researchers, funders and policy-makers in the biomedical and medical science field to determine which preprint platforms are relevant to their research scope and which have the functionality and policies that they value most.

We plan to update this searchable database periodically to include any new relevant preprint platforms and to amend any changes in policy.

Introduction

A preprint is a scientific manuscript that authors can upload to a public preprint platform and make available almost immediately, without formal external peer review. Posting a preprint enables researchers to ‘claim’ priority of discovery of a research finding; this can be particularly useful for early-career researchers in a highly competitive research environment. Some preprint platforms provide digital object identifiers (DOIs) for each included manuscript, and this information can be included in grant applications. Indeed, progressive granting agencies recommend that applicants include preprints in their applications (eg, the National Institutes of Health (NIH, USA) 1 ), and in the UK, preprints are becoming recognised as eligible outputs in the Research Excellence Framework exercise, which assesses institutional research performance. 2

Preprints have been widely used in the physical sciences since the early 1990s; with the creation of the repository of electronic articles, arXiv, over 1.6 million preprints or accepted/published manuscripts have been deposited on this platform alone. 3 Since September 2003, arXiv has supported the sharing of quantitative biology preprints under the q-bio category. The use of preprints in the biomedical sciences is increasing, leading to the formation of the scientist-driven initiative Accelerating Science and Publication in biology (ASAPbio) to promote their use. 4 A preprint platform dedicated to life science-related research (bioRxiv), founded in 2013, has already attracted nearly 80 000 preprints. 5 This platform was set up to capture science manuscripts from all areas of biology; medRxiv was launched in June 2019 to provide a dedicated platform and processes for preprints in medicine and health-related sciences, 6 and it already hosts over 3400 preprints, having become particularly popular during the COVID-19 pandemic. The Center for Open Science (COS) 7 has also developed web infrastructure for these new ‘Rxiv’ (pronounced ‘archive’) services, 8 while F1000 Research has provided instances of its postpublication peer review and publishing platform for use by several funders (eg, Wellcome Trust) and research institutions to encourage preprint-first research publishing. 9 Recently, several large publishers (Springer Nature, Wiley, Elsevier) have developed, codeveloped or acquired preprint platforms or services, and in April 2020, SciELO launched a preprint platform that works with Open Journal Systems. 10 Many other preprint platforms also support dissemination of biomedical and medical sciences within their broader multidisciplinary scope.

Given the increase in the use and profile of preprint platforms, it is increasingly important to identify how many such platforms exist and to understand how they operate in relation to policies and practices important for dissemination. With this aim in mind, we conducted a review to identify all preprint platforms that have biomedical and medical science scope and contrasted them in terms of their characteristics (eg, scope of the preprint, preprint ownership) and policies (eg, administrative checking, copyright and licensing). We also provide a searchable repository of the platforms identified so that researchers, funders and policymakers have access to a structured approach for identifying preprint platforms that are relevant to their research area.

Terminology

We define a preprint according to the Committee of Publication Ethics definition:

‘A preprint is a scholarly manuscript posted by the author(s) in an openly accessible platform, usually before or in parallel with the peer review process’. 11

Any platform or server that hosts a collection of preprints will be referred to as a preprint platform. We use ‘platform’ instead of ‘server’ because, within this definition, we include both servers with no dedicated formal peer-review service and platforms where a manuscript has been submitted for peer review and is openly available to view before the peer review is formally complete.

Methods

Preprint platform identification

A preliminary list of potentially relevant preprint platforms was identified using Martyn Rittman’s original list 12 and extended through a basic Google web search for the term ‘preprint’ and the knowledge of the Steering Group (study authors). Additional preprint platforms that were launched up to 25 June 2019 were included.

Preprint platform selection

We included any preprint platform that has biomedical or medical scope according to MEDLINE’s journal selection criteria. 13 Generally this covers: ‘(manuscripts) predominantly devoted to reporting original investigations in the biomedical and health sciences, including research in the basic sciences; clinical trials of therapeutic agents; effectiveness of diagnostic or therapeutic techniques; or studies relating to the behavioural, epidemiological or educational aspects of medicine.’

We aimed to be overinclusive, such that preprint platforms hosting work within the above MEDLINE definition of scope as part of a broader scope (such as ‘all physical, chemical and life sciences’) were included. For inclusion, a platform’s primary function needed to be that of a preprint platform rather than a more general repository where preprints might be incidentally deposited. Platforms were included without any language restrictions on the content accepted for posting. Eligibility of preprint platforms was decided in discussion between two authors (JJK and NCP) and independently approved by the Steering Group. Preprint platforms were excluded for any of the following reasons: they were no longer active (as of 25 June 2019); they were print only or had no web presence (‘offline’); or their primary function was classed as a general-purpose repository with no exclusive preprint functionality. We also excluded service platforms that only host postprints (manuscripts after peer review), such as Science magazine’s ‘first release’.

Data extraction items

A data collection form was developed by the Steering Group, which aimed to capture both preprint platform characteristics and policies. The form was pilot tested with bioRxiv and revised accordingly following discussion with the platform owner. The final agreed data collection form is available online. 14 In brief, we extracted information on each platform’s scope, ownership, governance, indexing and citation identifiers, submission formatting, visibility/versioning, article processing charges, publication timings, editorial board membership and for-profit or not-for-profit status. We also collected data on any checking/screening before preprint posting, open access/copyright and licensing options, sustainability and long-term preservation strategies, usage metrics, the simultaneous deposition policy relating to a manuscript submitted to a journal and the manuscript on the platform, and, if appropriate, policies about the deposition of accepted and published papers onto the platform.
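As an illustration of the kind of record the data collection form produces, the extracted items could be represented as one structured row per platform. This is a hypothetical sketch: the field names and the example values are ours, not the labels of the actual form published on Zenodo.

```python
# Hypothetical sketch of one row of the data collection form described
# above; field names and example values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class PlatformRecord:
    name: str
    scope: str                        # eg, "all areas of biology"
    ownership: str                    # non-profit, for-profit, individual/community
    provides_doi: bool
    screening_checks: list = field(default_factory=list)
    licences_offered: list = field(default_factory=list)
    preservation_plan: str = ""       # eg, an archiving agreement with Portico
    apc_required: bool = False        # article processing charge

# Example record assembled from facts reported elsewhere in this review.
record = PlatformRecord(
    name="bioRxiv",
    scope="all areas of biology",
    ownership="non-profit",
    provides_doi=True,
    screening_checks=["scope", "plagiarism", "unfounded medical claims"],
    licences_offered=["CC BY 4.0"],
    preservation_plan="archiving agreement",
)
print(record.name, record.ownership)
```

A structure like this makes the later descriptive summaries (counts and percentages per characteristic) straightforward to compute over the 44 included platforms.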

Data extraction process

Manual extraction was completed for each platform using information found on the platform’s website where content was directly accessible, or found on associated webpages provided by the platform (eg, the ‘About’ pages for many Open Science Framework, OSF, platforms linked to external websites provided by the platform operators). Verbatim text from the online search was recorded alongside any relevant web links. The completed data extraction form was then sent to the platform contacts (usually the platform owner), who were asked to check the data for completeness, fill in any missing fields and respond to any queries. Where an independent review could not be undertaken due to language barriers on the platform’s website, the platform owner/representative provided the data. On receiving the responses from the platforms, the researcher updated the data form, in some cases simplifying the text records into categorised information. These data were then returned to the platform to confirm they were accurate and as complete as possible, and the records were then recorded as ‘verified by the platform representative/owner’.

If no contact with the platform was established, a second researcher independently completed the data extraction using information found on the platform’s website and consensus was reached. The completed data form was sent to the platforms informing them that the included information would be presented about their platform as ‘unverified’ data. The deadline for preprint platforms to approve any information and to confirm that all data could be shared publicly was 19 January 2020. Further datasets and records were updated with information provided up to 27 January 2020, and are available on the Zenodo repository. 14

Reporting of results

The preprint platform characteristics and policies were summarised descriptively and divided into preprint platforms (1) hosted on the OSF Preprints infrastructure, (2) provided by the Open Research Central infrastructure and (3) all other eligible platforms. Characteristics are presented as: (A) the scope and ownership of each platform; (B) content-specific characteristics and information relating to submission, journal transfer options and external discoverability; (C) screening, moderation and permanence of content; (D) usage metrics and other features and (E) metadata.

Patient and public involvement

No patients were involved in setting the research question nor were they involved in the design, implementation and reporting of the study. There are no plans to involve patients in the dissemination of results.

Results

From all sources, 90 potentially eligible preprint platforms were identified for this review; 46 were excluded because they were out of scope (n=23), inactive (n=13), had no online presence (n=5) or were general repositories (n=5) ( figure 1 ). A list of excluded preprint platforms can be found in online supplemental table 1 . Of the 44 included preprint platforms, 17 were hosted by the OSF preprint infrastructure (although MarXiv is no longer part of the OSF family), 6 were provided by the Open Research Central infrastructure and 21 were other independent preprint platforms ( figure 1 ). Of the 21 independent preprint platforms, four were First Look platforms (Cell Press Sneak Peek, Preprints with the Lancet, NeuroImage: Clinical and Surgery Open Science). While meeting the criteria for inclusion in this review, PeerJ Preprints decided to accept no new preprints after 30 September 2019. Thirty-eight (86%) of the 44 preprint platforms verified their own data. We present the data tables in this manuscript, though all tables and raw data are available in the Zenodo repository. 14 A searchable database of all the preprint platform information is also available ( https://asapbio.org/preprint-servers ).

Figure 1: Flow diagram of included preprint platforms covering biomedical and medical scope. OSF, Open Science Framework.
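The exclusion and inclusion counts reported above can be sanity-checked with a few lines of arithmetic; this sketch uses only the numbers stated in the review and shows why the reported subgroup percentages (39%, 14%, 48%) sum to 101% after rounding:

```python
# Counts reported in the review (figure 1): check that the exclusions
# and the inclusion subgroups add up to the stated totals.
identified = 90
excluded = {"out of scope": 23, "inactive": 13, "no online presence": 5,
            "general repository": 5}
included = {"OSF-hosted": 17, "Open Research Central": 6, "independent": 21}

assert sum(excluded.values()) == 46
assert identified - sum(excluded.values()) == 44
assert sum(included.values()) == 44

# Percentages rounded to the nearest whole number, as in the abstract;
# rounding is why 39 + 14 + 48 = 101 rather than 100.
for name, n in included.items():
    print(f"{name}: {n}/44 = {round(100 * n / 44)}%")
```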

Scope and ownership of preprint platforms

Twenty-eight platforms (64%) are owned by non-profit academic groups, scientific societies or funding organisations, while two platforms are owned by individuals or small communities (Frenxiv and ViXra) ( online supplemental table 2 ). Fourteen preprint platforms (32%) are affiliated with or partly owned by for-profit publishers or companies; however, the preprint service part of the operation was declared as non-profit for three of these (Preprints.org, ESSOAr and MitoFit Preprint Archives). Of the preprint platforms associated with ‘for-profit’ status, only F1000 Research requires authors to pay an article processing charge.

Twenty-four (55%) preprint platforms accepted articles with multidisciplinary scope, while 20 (45%) were discipline specific (eg, PsyArXiv for psychological research) ( online supplemental table 2 ). Despite the multidisciplinary scope, some platforms had further restrictions; for example, five regional platforms (AfricArxiv, Arabixiv, Frenxiv, INA-Rxiv, ChinaXiv) are aimed mostly at research conducted in a specific geographical region, although their content is globally accessible. The Open Research Central platforms only accept articles funded by certain funders (eg, the Wellcome Open Research platform only accepts research funded by the Wellcome Trust), and some preprint platforms only allow articles that fit the remit of their affiliated journals (eg, Cell Press Sneak Peek). Across all platforms, the median time active is 32 months (range 10 months, medRxiv, to 28 years 8 months, arXiv). In that time, over 2.72 million preprints have been posted, and in 2020, two platforms (Research Square and bioRxiv) have averaged more than 2500 biomedical postings per month.

Submission, journal transfer options and external discoverability

Where the information is known, all preprint platforms support the English language, and all accept research articles (with the exception of Thesis Commons, which accepts only theses) ( online supplemental table 3 ). Some platforms also accept other languages and other article types, including research presentation slides and posters. Readers can access the full content of articles on all platforms except JMIR Preprints and some of the First Look platforms (Cell Press Sneak Peek, Preprints with the Lancet and Surgery Open Science), where reader registration is required. All platforms support PDF as the main viewing option; on some platforms the PDF can be viewed in the browser, while on others it must be downloaded. On all platforms, authors can submit articles as either a Word document or a PDF. Many platforms offer authors a choice of licensing; where authors do not get a choice, the required licence is commonly CC BY.

In general, the OSF and many of the other platforms allow authors to submit their articles to any journal although in some cases there is facilitated submission to certain journals, for example, for bioRxiv there is a host of direct transfer journal options ( online supplemental table 3 ). Authors submitting to F1000 Research, the Open Research platforms and all First Look platforms can only submit articles to journals associated with the platform. Where the information is available, all platforms with the exception of Therapoid and ViXra are externally indexed and most are indexed on Google Scholar.

Screening, moderation and permanence of content

Thirty-three (75%) preprint platforms provided some detail about article screening; two of these (FocUS Archive and SocArxiv) mention checks without specifying what they involve ( online supplemental table 4 ). Therapoid does not perform screening checks but relies on moderation by site users after article posting, and ViXra does not perform screening checks but will retract articles in response to issues. Fourteen (32%) of the preprint platforms that perform screening checks actively involved researchers with content expertise in this process. The three most common screening checks related to the scope of the article (eg, scientific content, not spam, relevant material, language), plagiarism, and legal/ethical/societal issues and compliance. Only three preprint platforms (Research Square, bioRxiv and medRxiv) check whether the content contains unfounded medical claims.

All F1000 platforms (inclusive of the Open Research ones), MitoFit Preprint Archives, PeerJ Preprints and Preprints.org describe policies online in relation to NIH guidance for reporting preprints 15 with regard to plagiarism, competing interests, misconduct and all other hallmarks of reputable scholarly publishing ( online supplemental table 4 ). Some preprint platforms have policies but fall short of making them transparently visible online, while other platforms have no policies. If content is withdrawn, some platforms ensure that the article retains a web presence (eg, basic information on a tombstone page), although this is not standard across all platforms. Almost all platforms have (or are about to implement) a preservation plan for read access. Most commonly, platforms have set up an archiving agreement with Portico. Others have made their own arrangements: as a notable example, the OSF platforms are associated with a preservation fund provided by the COS to maintain read access for over 50 years. In addition, most platforms give details on the sustainability of the service. For the OSF platforms this comes from an external source (eg, grants to support the COS framework), while for the Open Research Central infrastructure platforms it comes from article processing charges covered by the respective funding agencies. For some of the other platforms, funding is received from internal or external sources or from other business-model services (eg, associated journal publishing).

Usage metrics and other features

With the exception of arXiv and MitoFit Preprint Archives (with Therapoid metrics arriving soon), all preprint platforms have some form of usage metrics, and apart from JMIR Preprints and ViXra, all provide the number of article downloads on the abstract page ( online supplemental table 5 ). The OSF preprints are limited to downloads, but the Open Research Central platforms also include the number of views, number of citations and altmetrics, while some of the independent platforms additionally report social media interactions directly on the platform (as opposed to the Altmetric attention score). Most platforms have some form of commenting (n=33; 75%) and onsite search options (n=35; 80%), and some (mostly but not exclusively the independent platforms) offer alerts such as RSS feeds or email alerts.

Forty (91%) of the platforms provided information on metadata, and all of these provide the manuscript title, publication date, abstract and author names in the metadata ( online supplemental table 6 ). Nearly all of them, with the exception of SciELO Preprints, also provide a DOI or other manuscript identifier. The majority also offer subject categories (n=34) and licence information (n=26), but fewer than half include author affiliations (n=17) and funder acknowledgements (n=13). Eleven platforms (all six platforms under the Open Research Central infrastructure, plus Authorea, bioRxiv, ChemRxiv, F1000 Research and Research Square) offer full-text content, but only five include references in the metadata. Half of the platforms (n=22) offer a relational link in the metadata to the journal publication (if it exists).

Discussion

Forty-four preprint platforms were identified that cover biomedical and medical scope. This review characterises each of these preprint platforms so that authors can make a more informed choice about which ones might be relevant to their research field. Moreover, funders can use the data from this review to compare platforms if they wish to explicitly support and/or encourage their researchers to use certain platforms.

Preprint platforms are fast evolving, and despite our cut-off of 25 June 2019, we are aware of new eligible preprint platforms that have been or are about to be launched after this date, for example, the Open Anthropology Research Repository 16 and Cambridge Open Engage. 17 However, the recent growth in the number of preprint platforms in this field has meant that one platform in this review (PeerJ Preprints) ceased to accept new preprints from the end of September 2019 to focus on its peer-reviewed journal activities. 18 Through our searchable database ( https://asapbio.org/preprint-servers ), we will endeavour to keep this information up to date. More specifically, the database will be maintained by ASAPbio for at least the next 2 years, and longer pending additional funding, and will be available as a CC BY resource. Our plan for maintenance is to enable preprint platforms to update their listings on demand, pending verification of publicly accessible information by ASAPbio staff. We will periodically archive the database in Zenodo to preserve prior versions.

Because many platforms lack formal external peer review (with the exception of those that follow the F1000 Research model), preprint platforms that include medical content have been criticised on the grounds that unreviewed work may contain errors in methods, results and interpretation, with the potential to harm patients. 19 20 This review has demonstrated that, contrary to the perception that preprints are not reviewed at all, many preprints do undergo some checks before going online. Research Square, bioRxiv and medRxiv specifically check whether a preprint’s dissemination before peer review poses potential harm. Research Square also offers a transparent checklist to indicate the status of various quality assurance checks (not equivalent to scientific peer review) for each preprint.

Empirical evidence to support the use of editors and peer reviewers as a mechanism to ensure the quality of biomedical research is relatively weak, 21 22 although other studies have found peer review to be potentially useful. 23 24 This review provides some justification that preprint platforms might be a reasonable option for researchers, especially given the time spent on and associated cost of peer review. 25 In a recent survey of authors who have published with F1000 Research, 70% of respondents found the speed of publication to be important or very important. 26 In some scenarios, the time to deliver research findings may be as important as research quality, and may be critical to healthcare provision. A good example is the current outbreak of novel coronavirus, where much of the preliminary evidence had been made available through preprints by the time WHO declared the epidemic a public health emergency. 27 The availability of preprints before peer review, and the level of screening before a preprint is posted, have been particularly pertinent in this case. For example, bioRxiv has rapidly adapted to ensure users appreciate that the COVID-19-related work presented on the platform has not undergone peer review. In light of COVID-19, people, including patients and the public, might be interested in a quick and easy way to search across platforms. As a start at improving discoverability, Europe PMC aggregates preprints from several repositories, and nearly 3000 preprint articles with ‘COVID-19’ in the title are already listed. 28
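Europe PMC exposes a public REST search API, so a cross-platform preprint search like the one described above can be scripted. This is a hedged sketch: the endpoint path and the SRC:PPR source filter for preprint records reflect our reading of Europe PMC's REST documentation and may change over time.

```python
# Sketch: building a Europe PMC search URL restricted to preprint records.
# Assumptions: the public /search endpoint and the SRC:PPR source filter,
# per Europe PMC's REST API documentation at the time of writing.
from urllib.parse import urlencode

EUROPE_PMC = "https://www.ebi.ac.uk/europepmc/webservices/rest/search"

def preprint_search_url(title_term: str, page_size: int = 25) -> str:
    """URL for preprint records whose title contains title_term."""
    query = f'TITLE:"{title_term}" AND SRC:PPR'
    return EUROPE_PMC + "?" + urlencode(
        {"query": query, "format": "json", "pageSize": page_size}
    )

print(preprint_search_url("COVID-19"))
# The URL can then be fetched with any HTTP client; the JSON response
# carries a hitCount field with the total number of matching records.
```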

Strengths and limitations of the study

A strength of this study is that we developed a robust methodology for systematically identifying relevant preprint platforms and involved platform owners/representatives wherever possible to verify data that were either unclear or not available on platform websites; when this was not possible, a second researcher was involved in the data acquisition process. Systematically identifying web-based data that are not indexed in an academic bibliographic database is challenging, 29 though the methods employed here are compatible with the principles of a systematic search: they are transparent and reproducible. This approach builds on an earlier list of preprint servers, 12 the process behind which, as far as we are aware, did not use systematic methods or involve platform owners.

We undertook an internal pilot of developing and testing the data collection form in collaboration with a preprint platform owner, ASAPbio staff and funders (promoters of preprint use) to ensure that the list of characteristics collected was both complete and relevant to different stakeholder groups, including academics and funders. Much of the general policy information for some platforms was not well reported or easy to find online; an unexpected but positive by-product of this research is therefore that several platforms have updated their webpages to improve the visibility and transparency of their policies in response to it. Similarly, some platforms became aware of policy attributes that they had not previously considered and are now considering these for future implementation.

One limitation is that we focused our attention on the ‘main’ preprint article, although in some cases different policies existed for the supplementary material (for example, acceptable formats and licensing options). This level of detail will be included in our searchable database. Another potential shortcoming was that some preprint platforms had a partner journal, and without verification it was sometimes unclear whether the policy information related to the journal, the preprint platform or both. Finally, we defined preprint platforms as hosting work before peer review is formally complete, and we acknowledge that some platforms included here also host content that has already been peer-reviewed and/or published in a journal (eg, postprints) 30 ; this is unlikely to affect the interpretation of policies for preprinted works discussed herein.

Implications for authors of biomedical and medical research

With the increase in the number of preprint platforms available in the biomedical and medical research field, authors have the option to make their research findings publicly available, and to gain some early ownership of them, at little or no cost to themselves. Moreover, with many preprint platforms there is little restriction on authors later publishing their preprints in peer-reviewed journals of their choice. While we did not tabulate information on this specifically, we noted that some platforms (notably OSF platforms) recommended that authors check the SHERPA/RoMEO service for details of a journal’s sharing policy. There is also some evidence that preprinting an article first may even boost citation rates, 31 owing to increased attention from tweets, blogs and news articles compared with articles published without a preprint. With many platforms carrying out suitable quality-control checks and having long-term preservation strategies, preprint platforms offer authors direct control of the dissemination of their research in a rapid, free and open environment. As well as supporting primary research, preprints are also vital to users of research (systematic reviewers and decision makers). As an example, a living mapping systematic review of ongoing research into COVID-19 is currently being undertaken, and almost all included studies to date have been identified through preprint platforms. 32

Implications for preprint platforms

There has been a sharp rise in the number of preprints being published each month: it has been estimated that (as of June 2019) preprints in biology represent approximately 2.4% of all biomedical publications, 33 and as of April 2020 there are already over 2.72 million preprints in the platforms that we evaluated. This review has summarised the key characteristics and policies of preprint platforms posting both medical and biomedical content, although there is a need for some of these platforms to update their policies and to make them more transparent online. As preprints are not formally reviewed for scientific rigour through peer review, it is important to make clear that their validity is less certain than for peer-reviewed articles (although even the latter may still not be valid). There is perhaps a growing need to standardise the checking process across platforms; such a process should not diminish the speed of publication (what authors value most about a preprint 22 ). There is a temptation to make the checking process more rigorous, for example, by including relevant researchers within the field as gatekeepers. However, this may slow down the process of making scientific work rapidly available and may promote groupthink, preventing innovative contrarian ideas from being circulated for public open review on preprint platforms. Based on current checks, our review shows that most preprint platforms manage to post preprints within 48 hours, and all within a week on average. Further challenges may arise around resources if the number of preprints continues to rise at a similar rate and the number of new platforms begins to plateau.
Now, as several initiatives progress with work to build scientific review directly onto preprints (eg, Peer Community In, 34 Review Commons, 35 PREreview 36 ), it may become even more important to provide clarity about the level of checks a manuscript has already received, and would need to receive, to be considered ‘certified’ by the scientific community. If anything, the wide public availability of preprints allows for far more extensive review by many reviewers, as opposed to typical journal peer review, where only a couple of reviewers are involved. Our review identified 14 platforms linked to for-profit publishers and companies, but only F1000 Research currently charges a small article processing charge to authors. With the increase in demand and the resources needed to maintain preprint platforms, we should be mindful that funding models may change downstream, meaning that platforms may have to charge authors.

One outcome of this review has been to understand the various drivers behind the proliferation of preprint platforms for the life and biomedical sciences. While arXiv, bioRxiv, chemRxiv and medRxiv aim to provide dedicated servers for academics within their respective fields, several academic groups have offered alternative subject-specific or regional services in line with their own communities’ needs, such as sharing work in languages other than English, using the OSF infrastructure. A third group of providers comprises industry stakeholders: academic publishers providing or acquiring preprint services to support the content they receive as submissions to their journals, and biotechnology or pharmaceutical companies looking to support the sharing of relevant research content. Whether any platform becomes dominant may be influenced by the communities who adopt them, the influencers who promote them (funders and researchers who influence hiring and promotion decisions) and the financial sustainability underpinning them. We hope that enabling transparency into the processes and policies at each platform empowers the research community (including researchers, funders and others involved in the enterprise) to identify and support the platform(s) that help them to share research results most effectively.

Acknowledgments

We thank John Inglis for his advice on developing the data collection form and helpful comments on the manuscript. We also thank Robert Kiley, Geraldine Clement-Stoneham, Michael Parkin, Amy Riegelman, and Claire Yang for helpful feedback and conversations. We also would like to thank collectively the preprint platform owners and representatives who provided both data and verified information.

  • ↵ NIH enables investigators to include draft preprints in grant proposals [online] . Available: http://www.sciencemag.org/news/2017/03/nih-enables-investigators-include-draft-preprints-grant-proposals [Accessed 8 Apr 2020 ].
  • ↵ REF 2021: Guidance on submissions [online] . Available: https://www.ref.ac.uk/media/1092/ref-2019_01-guidance-on-submissions.pdf [Accessed 8 Apr 2020 ].
  • ↵ arXiv [online] . Available: https://arxiv.org/ [Accessed 8 Apr 2020 ].
  • ↵ ASAPbio [online] . Available: http://asapbio.org/preprint-info [Accessed 8 Apr 2020 ].
  • ↵ bioRxiv [online] . Available: https://www.biorxiv.org/ [Accessed 20 Apr 2020 ].
  • ↵ medRxiv [online] . Available: https://www.medrxiv.org/ [Accessed 21 Apr 2020 ].
  • ↵ Center for open science [online] . Available: https://cos.io/ [Accessed 8 Apr 2020 ].
  • ↵ INLEXIO: The rising tide of preprint servers [online] . Available: https://www.inlexio.com/rising-tide-preprint-servers/ [Accessed 8 Apr 2020 ].
  • ↵ F1000 Research Ltd [online] . Available: https://f1000research.com/ [Accessed 8 Apr 2020 ].
  • ↵ SciELO Preprints [online] . Available: https://preprints.scielo.org/index.php/scielo [Accessed 8 Apr 2020 ].
  • ↵ Martyn Rittman: Research preprints: server list [online] . Available: https://docs.google.com/spreadsheets/d/17RgfuQcGJHKSsSJwZZn0oiXAnimZu2sZsWp8Z6ZaYYo/edit#gid=0 [Accessed 8 Apr 2020 ].
  • ↵ Zenodo: Practices and policies of preprint platforms for life and biomedical sciences [online] . Available: https://zenodo.org/record/3700874 [Accessed 8 Apr 2020 ].
  • ↵ Reporting preprints and other interim research products [online] . Available: https://grants.nih.gov/grants/guide/notice-files/not-od-17-050.html [Accessed 8 Apr 2020 ].
  • ↵ Open anthropology research repository [online] . Available: https://www.openanthroresearch.org/ [Accessed 8 Apr 2020 ].
  • ↵ Cambridge open engage [online] . Available: https://www.cambridge.org/engage/coe/public-dashboard [Accessed 8 Apr 2020 ].
  • ↵ PeerJ blog: PeerJ Preprints to stop accepting new preprints Sep 30th 2019 [online] . Available: https://peerj.com/blog/post/115284881747/peerj-preprints-to-stop-accepting-new-preprints-sep-30-2019/ [Accessed 8 Apr 2020 ].
  • ↵ World Economic Forum: Coronavirus and the risks of ‘speed science’ [online] . Available: https://www.weforum.org/agenda/2020/03/speed-science-coronavirus-covid19-research-academic [Accessed 14 Apr 2020 ].
  • ↵ Preprints in Europe PMC: reducing friction for discoverability [online] . Available: http://blog.europepmc.org/2018/07/preprints.html [Accessed 14 Apr 2020 ].
  • ↵ scholcommlab: analyzing preprints: the challenges of working with OSF metadata . Available: https://www.scholcommlab.ca/2019/09/11/preprints-challenges-part-two/ [Accessed 14 Apr 2020 ].
  • ↵ Nature Index: preprints boost article citations and mentions [online] . Available: https://www.natureindex.com/news-blog/preprints-boost-article-citations-and-mentions [Accessed 14 Apr 2020 ].
  • ↵ Living mapping and living systematic review of Covid-19 studies [online] . Available: https://covid-nma.com/ [Accessed 14 Apr 2020 ].
  • ↵ ASAPbio: Biology preprints over time [online] . Available: https://asapbio.org/preprint-info/biology-preprints-over-time [Accessed 13 Feb 2020 ].
  • ↵ Peer Community in [online] . Available: https://peercommunityin.org/ [Accessed 14 Apr 2020 ].
  • ↵ Review COMMONS [online] . Available: https://www.reviewcommons.org/ [Accessed 14 Apr 2020 ].
  • ↵ PREREVIEW [online] . Available: https://content.prereview.org/ [Accessed 14 Apr 2020 ].

Supplementary materials

Supplementary data.

This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.

  • Data supplement 1
  • Data supplement 2
  • Data supplement 3
  • Data supplement 4
  • Data supplement 5
  • Data supplement 6

Twitter @dmoher

Contributors JJK and DM jointly conceived the study and are the guarantors. JJK, DM and NCP designed the study methods and developed the data collection form. JJK, NCP, FM, IB, JPI, JP and DM were involved in identifying eligible platforms. JJK, NCP and FM were all involved in data extraction and JJK and NCP did the analysis and prepared the data tables. JP developed the online searchable database. JJK prepared the initial manuscript. JJK, NCP, FM, IB, JPI, JP and DM were involved in the revision of this manuscript. JJK, NCP, FM, IB, JPI, JP and DM read and approved the final manuscript and are accountable for all aspects of the work, including the accuracy and integrity.

Funding NCP, FM and JP received funding for ASAPbio preprint research from The Wellcome Trust, Chan Zuckerberg Initiative, Howard Hughes Medical Institute, Simons Foundation, Medical Research Council and Canadian Institutes of Health Research.

Competing interests JP is executive director of ASAPbio.

Patient consent for publication Not required.

Ethics approval Not required. This is a descriptive study of publicly available information made available on websites. Data was confirmed by preprint platform owners/representatives using only email contacts available on those public websites.

Provenance and peer review Not commissioned; externally peer reviewed.

Data availability statement Data are available in a public, open access repository. The data from this study are available in Zenodo ( https://zenodo.org/record/3700874 ), which we will update periodically with a new version number as new platforms come online and policies of platforms currently identified change.

Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.

PLOS ONE

Downstream retraction of preprinted research in the life and medical sciences

Michele Avissar-Whiting

Research Square Company, Durham, North Carolina, United States of America

Associated Data

The Retraction Watch Database is available from Retraction Watch, and requests for this data should be sent to team@retractionwatch.com .

Retractions have been on the rise in the life and clinical sciences in the last decade, likely due to both broader accessibility of published scientific research and increased vigilance on the part of publishers. In this same period, there has been a greater than ten-fold increase in the posting of preprints by researchers in these fields. While this development has significantly accelerated the rate of research dissemination and has benefited early-career researchers eager to show productivity, it has also introduced challenges with respect to provenance tracking, version linking, and, ultimately, back-propagation of events such as corrigenda, expressions of concern, and retractions that occur on the journal-published version. The aim of this study was to understand the extent of this problem among preprint servers that routinely link their preprints to the corollary versions published in journals. To present a snapshot of the current state of downstream retractions of articles preprinted in three large preprint servers (Research Square, bioRxiv, and medRxiv), the DOIs of the journal-published versions linked to preprints were matched to entries in the Retraction Watch database. A total of 30 retractions were identified, representing only 0.01% of all content posted on these servers. Of these, 11 retractions were clearly noted by the preprint servers; however, the existence of a preprint was only acknowledged by the retracting journal in one case. The time from publication to retraction averaged 278 days, notably lower than the average for articles overall (839 days). In 70% of cases, retractions downstream of preprints were due–at least in part–to ethical or procedural misconduct. In 63% of cases, the nature of the retraction suggested that the conclusions were no longer reliable. Over time, the lack of propagation of critical information across the publication life cycle will pose a threat to the scholarly record and to scientific integrity. 
It is incumbent on preprint servers, publishers, and the systems that connect them to address these issues before their scale becomes untenable.

Introduction

The use of preprints as a mode of rapidly sharing research findings in the biological and medical sciences has become ubiquitous over the last decade, and their adoption has particularly surged since the onset of the COVID-19 pandemic in early 2020 [ 1 ]. The global public health emergency drove researchers to deposit preprints as they awaited peer review in a journal, a practice that became widely embraced by major publishers as the pandemic intensified [ 2 ]. But this embrace was generally not accompanied by the development of new mechanisms to link the eventual journal publications consistently and unambiguously to previous versions of the work in preprint servers.

In the life sciences, the version of the work that is typically recognized for career advancement purposes and, reportedly, preferred by researchers [ 3 ] is the one published in a journal following peer review–often referred to as the version of record. The preprint version, however, does not become inconsequential once a later version of it is published. Preprints are permanent contributions to the scholarly record. They are fully open access and have their own DOIs, so they can be circulated and cited widely long before a later version is published by a journal. Throughout the pandemic, it has not been unusual to see a preprint cited by another preprint within days of its appearance online–an unprecedented pace of collaborative problem-solving.

Often, preprints continue to be cited even once a later version of the work is published in a journal [ 4 ]. There are numerous potential reasons for this, ranging from the technical limitations of reference management software (programs do not automatically update preprints with versions of record) to an active choice by the citing researcher. But perhaps the most critical reason is that linking of preprints to associated versions of record has been unreliable at best and often nonexistent at worst. Because they tend to operate on limited budgets, most preprint servers do not have automated mechanisms for updating preprints with links to their journal-published versions [ 5 ]. The mechanisms that do exist on a minority of servers have limited fidelity, as links are typically based on fuzzy matching of titles and author lists, which are subject to change.
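Fuzzy matching of this kind is typically implemented as a similarity threshold on normalized titles. A minimal sketch, where the normalization and the 0.9 threshold are illustrative assumptions rather than any server's actual algorithm:

```python
from difflib import SequenceMatcher

def normalize(title: str) -> str:
    """Lowercase and collapse whitespace so trivial formatting
    differences do not affect the comparison."""
    return " ".join(title.lower().split())

def titles_match(preprint_title: str, journal_title: str,
                 threshold: float = 0.9) -> bool:
    """Treat two titles as the same work when their similarity
    ratio meets the (illustrative) threshold."""
    ratio = SequenceMatcher(None, normalize(preprint_title),
                            normalize(journal_title)).ratio()
    return ratio >= threshold

# Titles often change capitalization between versions:
print(titles_match("Downstream retraction of preprinted research",
                   "Downstream Retraction of Preprinted Research"))  # True
```

The weakness the text describes follows directly: a substantive retitling between preprint and journal version drops the ratio below any reasonable threshold, and the link is silently missed.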

The issue of nonexistent or unreliable linking becomes particularly salient in instances where a journal-published version of a preprint is retracted or earns an expression of concern, a note typically issued by an editor to alert readers about potential problems with the article. The incidence of retraction, once an extremely rare occurrence, has increased dramatically since the year 2000 [ 6 ]. Retractions are typically carried out because a critical issue comes to light, invalidating the results and conclusions of the work. Misconduct is found to be a factor in about half of these cases [ 7 ]. However, because it is unusual for journals to acknowledge the existence of a preprint associated with an article, information about a retraction does not reach the preprint server unless the author updates the server or there are procedures in place at the preprint server to explicitly search for such information. There is some data to suggest that there are generally few meaningful differences between preprints and their corollary journal-published articles [ 8 ]. Thus, problems discovered in the latter are likely to impact the former. Moreover, the issue of persistent citation of retracted research in the literature [ 9 ] will only be exacerbated by a failure to link versions.

Because preprints have only become popular among life and medical scientists in the past few years and research on retractions in general is sparse, there is little information to be found about the intersection of these two domains. In this analysis, I assess linked journal articles from three major life science and clinical preprint servers to present a snapshot of 1) the incidence of retractions among previously preprinted work, 2) the degree to which these events are acknowledged on the preprints, and 3) other characteristics of these retractions.

The Retraction Watch database was chosen for this analysis, as it is the most comprehensive database of retractions, errata, and corrections available, containing (at the time of access on 23 November 2021) 31,492 records. The three preprint servers Research Square, bioRxiv, and medRxiv were used in the analysis because they are the largest life and medical science servers with automated mechanisms in place to link to journal-published versions and from which the data are easily retrievable via API. Data from Research Square was accessed directly, and data from bioRxiv and medRxiv were obtained via their APIs ( https://api.biorxiv.org/ and https://api.medrxiv.org/ , respectively). The DOIs of journal-published versions of the preprints were matched to entries in the Retraction Watch database, and corollary information about the retractions was collected.
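The matching step described above can be sketched as a set intersection over normalized DOIs (DOIs are case-insensitive, so comparisons should be done in lowercase). The record structures below are fabricated for illustration; the actual Retraction Watch and server exports differ:

```python
def normalize_doi(doi: str) -> str:
    """Strip whitespace and lowercase, since DOIs are
    case-insensitive identifiers."""
    return doi.strip().lower()

def find_retracted(published_dois, retraction_db_dois):
    """Return journal DOIs (linked from preprints) that also
    appear in the retraction database."""
    published = {normalize_doi(d) for d in published_dois if d}
    retracted = {normalize_doi(d) for d in retraction_db_dois}
    return published & retracted

# Hypothetical example records: preprints without a journal link
# carry None and are skipped.
linked = ["10.1000/journal.0001", "10.1000/journal.0002", None]
watchlist = ["10.1000/JOURNAL.0002", "10.1000/other.0009"]
print(find_retracted(linked, watchlist))  # {'10.1000/journal.0002'}
```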

Misconduct in each case of retraction was categorized according to the areas defined by the Council of Science Editors [ 10 ] as follows: Mistreatment of research subjects (includes failure to obtain approval from an ethical review board or consent from human subjects before conducting the study and failure to follow an approved protocol); Falsification and fabrication of data ; Piracy and plagiarism (including unauthorized use of third-party data or other breach of policy by authors); or No evidence of misconduct . The determination of the presence and type of misconduct was based on information contained in the individual retraction notices as well as on the reasons for retraction briefly noted in the Retraction Watch database.

To account for the relative recency of preprints, the calculation of the average time-to-retraction for entries in the Retraction Watch database was limited only to articles published after 2014, when the first life science preprints began to emerge on bioRxiv.

Because the discovery of retractions downstream of preprints relies heavily on the existence of a linkage between the preprint and its journal-published version, I first assessed the proportion of preprints for which such links appear for all three servers. Of all posted preprints on Research Square, bioRxiv, and medRxiv, 24%, 54%, and 35% are linked to a published journal article, respectively ( Table 1 ). When using a 2-year cut-off (i.e., limiting the analysis only to articles posted as preprints more than 2 years ago, as previously done to approximate “publication success” [ 11 ]), the values increase to 45%, 73%, and 74% ( Table 1 ).

1 all time; excludes versions.

Among the three preprint servers, a total of 30 downstream retractions were identified: 17 at Research Square, 11 at bioRxiv, and 2 at medRxiv ( Table 2 ). This represents 0.05%, 0.01%, and 0.02% of all preprints with journal publication links at Research Square, bioRxiv, and medRxiv, respectively ( Table 1 ). All 30 of the retracted papers in this analysis had been published in open access journals.

The time from preprint posting to publication in a journal ranged from 41 to 468 days, with an average time of 169 days. The time from publication in the journal to retraction ranged from 11 days to 993 days and averaged 278 days. Among all retractions in the Retraction Watch database for papers published after 2014 (the year that bioRxiv preprints began to appear), the average time to retraction was 678 days.
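Each interval above is a simple date difference, averaged across records. A sketch with hypothetical dates chosen to reproduce the reported minimum and maximum of 11 and 993 days:

```python
from datetime import date

def days_between(published: date, retracted: date) -> int:
    """Whole days from journal publication to retraction."""
    return (retracted - published).days

def mean_days(intervals) -> float:
    return sum(intervals) / len(intervals)

# Hypothetical (publication date, retraction date) pairs:
records = [(date(2020, 1, 1), date(2020, 1, 12)),   # 11 days
           (date(2018, 1, 1), date(2020, 9, 20))]   # 993 days
intervals = [days_between(p, r) for p, r in records]
print(intervals)  # [11, 993]
```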

In 20/30 cases (67%), the retraction was due–at least in part–to some form of research misconduct. Of these, 6 were categorized as Piracy and plagiarism, 10 as Falsification and fabrication of data, and 5 as Mistreatment of research subjects; in one of these cases, the retraction fell into two categories ( Table 2 ). 8/30 cases (27%) were due to errors, contamination, or irreproducible/unreliable results and did not qualify as research misconduct. In 18/30 cases (60%), the nature of the retraction suggested that the conclusions of the study should no longer be considered valid. In one case, a clear reason for retraction could not be determined, and in another, the presence or absence of misconduct could not be determined conclusively.

Among the 30 preprints linked to retracted journal publications, 11 (37%) included a clear indication of the retraction on the preprint itself. In 5/30 cases (17%), the preprint itself was marked as withdrawn ( Table 2 ). None of the 30 retracted journal articles visibly indicated the existence of an associated preprint.

Preprints have introduced a new level of speed and transparency to the dissemination of life science research. They have removed barriers to research communication and have particularly benefited early-career researchers, who use them to share their work on their own terms, to show productivity, and to receive valuable feedback from a vast community of peers [ 12 , 13 ]. However, the rapid growth of preprint servers has also introduced some challenges and complexity into the environment of scholarly publication [ 14 ]. The many new preprint servers that have emerged in the past few years have varying budgets, governance, and features as well as disparate policies and operating procedures. The preprint advocacy organization ASAPbio has been pivotal in uniting representatives from the different servers to develop standards and best practices with the aim of establishing consistency in the most important areas, such as those pertaining to misinformation and trust [ 15 ]. Due to various limitations, however, many servers do not have the means and/or the capacity to connect preprints with their associated journal publications.

Journals, for their part, have been generally slow or reluctant to prioritize surfacing preprint links. Only 6% of journals claim to have a mechanism for linking to preprints (transpose-publishing.github.io/#/), yet the actual appearance of such links is much rarer still. Notably, eLife , as the first journal to require deposition of a preprint at the point of submission [ 16 ], now consistently supplies preprint links on their article pages.

Of note, Google Scholar has its own algorithm for aggregating related publications and their citations and privileging the version of record, even in the absence of formal mechanisms for linking at the preprint server or journal [ 17 ]. Over time and with improved sophistication, this technology could address the linkage issues that exist. It already helps to allay concerns over citation dilution–the issue of citations accumulating across different stages and versions of a paper [ 18 ]. As the scholarly publishing landscape continues to evolve in the direction of author-led dissemination, enshrining a record of versions is likely to take precedence over the traditional norm of privileging a version of record.

One negative repercussion of the linking gap is that preprints cannot be effectively updated when critical events, such as corrigenda, expressions of concern, or retractions, occur downstream on their journal-published versions. Indeed, the current study shows that even under ideal circumstances–in which links are consistently established by the preprint server–fewer than 50% of preprints indicate a downstream retraction.

Metascience researchers have observed a significant, progressive decline in the time to retraction since 2000 [ 6 ]. This acceleration has been attributed to multiple factors, including wider access to research, which increases the probability of errors or issues being caught, and an increased emphasis on integrity by reputable journals [ 19 ]. Articles published in high-impact-factor journals and in open access journals would both seem to benefit from better scrutiny: the former potentially more careful or thorough, the latter more extensive (by virtue of their broad accessibility). Interestingly, Cokol et al. concluded from their analysis of the burden of retractions that the higher retraction rates observed at high-impact journals are a reflection of the scrutiny their articles receive post-publication [ 20 ]. Indeed, all 30 of the retractions identified in this study and, as of 2018, ~25% of all retractions in the PubMed database are of open access articles [ 21 ].

The time from publication to retraction among the previously preprinted articles in this analysis averaged 9.2 months, notably lower than the average of 33 months observed by Steen et al [ 6 ]. Since this discrepancy could simply be attributed to the relatively short time of existence of preprint servers, I limited my analysis of overall time to retraction to only articles published after 2013, when bioRxiv was launched. In this set, the time to retraction averaged 23 months, which is still considerably higher than that of the smaller set of previously preprinted articles. This observation could simply be an artifact of a relatively small sample size, but it might hint at a benefit of early exposure and accessibility.

Due to its integration with Springer Nature journals, Research Square has three (rather than two) mechanisms for linking published papers, so the fidelity of linking is likely to be higher in this preprint server than in others, including bioRxiv and medRxiv. Despite this, a smaller proportion of preprints are linked to journal publications on Research Square compared to bioRxiv and medRxiv. There are a number of factors that could account for this discrepancy, including known technical deficits preventing reliable linking of Research Square articles to Springer Nature submissions, the longer time of existence of bioRxiv and medRxiv relative to Research Square, the multidisciplinary nature of the Research Square server, differential screening procedures between the servers, and the quality of preprints that the servers receive.

In this study, fewer than half of the preprints were clearly marked with an indication of the downstream retraction. Preprint servers that issue DOIs via Crossref have the ability to use the “is-preprint-of” metadata relationship to link preprints to their downstream publications. This makes it easier to check for updates in the Crossref metadata associated with the journal publication. However, this requires that journals properly register retractions and other events via Crossref and that preprint servers both initiate the link and regularly check it against Crossref’s metadata. Crossmark–the Crossref service that provides public-facing updates on post-publication events–is not currently enabled on preprint platforms, so the platforms must establish their own mechanisms for finding and surfacing this information. Across the rapidly growing landscape of preprint servers and journals that currently exist, it is unlikely that this occurs reliably. This failure to back-propagate critical information not only leaves preprint readers in the dark about the invalidation of some research, but it could also exacerbate the problem of papers being cited persistently after retraction [ 22 , 23 ]. To be clear, the problem is not with the occurrence of retractions themselves–which should be viewed as an indication that corrective systems are working properly [ 24 ]–but, rather, with the persistence of these papers in the literature due to their continued citation.
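A server that has registered the "is-preprint-of" relation can recover the journal DOI from its own record's Crossref metadata. A minimal sketch of parsing that field; the sample record is fabricated for illustration, whereas a real lookup would fetch `https://api.crossref.org/works/<preprint-doi>` and read the `message` object:

```python
def published_doi_from_crossref(message: dict):
    """Extract the journal-article DOI from a preprint's Crossref
    metadata via the 'is-preprint-of' relation, if present."""
    relations = message.get("relation", {}).get("is-preprint-of", [])
    for rel in relations:
        if rel.get("id-type") == "doi":
            return rel["id"]
    return None

# Fabricated example of the 'message' object returned by the
# Crossref works API for a preprint DOI:
sample = {
    "DOI": "10.21203/rs.3.rs-000000/v1",
    "relation": {
        "is-preprint-of": [{"id-type": "doi",
                            "id": "10.1000/example.12345"}],
    },
}
print(published_doi_from_crossref(sample))  # 10.1000/example.12345
```

As the text notes, this only helps when the link was initiated in the first place and when the journal registers retraction events in the metadata of the linked work.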

In 5 out of the 30 retractions identified in this study, the preprint had also been marked as withdrawn. Withdrawal is considered analogous to retraction in the preprint sphere [ 25 ], but the question of whether a preprint should be withdrawn following retraction of a later version has not been addressed in any formal capacity. ASAPbio, an organization promoting the productive use of preprints as tools for research dissemination, currently does not include downstream retraction as cause for withdrawal in their published recommendations for preprint withdrawal and removal, indicating that preprint servers should be notified and left to decide the appropriate course of action in each individual case. These guidelines also emphasize that while journals should make an effort to alert preprint servers to these events, it is ultimately the author’s responsibility to update both parties about actions taken on an associated record [ 25 ]. However, as it is not uncommon for authors to disagree with a retraction or become unresponsive to inquiries about issues with their publications, it may be unrealistic to rely on authors to properly propagate such information.

Importantly, even if automated connections via Crossref and Crossmark are established for all preprint servers, several issues will persist. First, journal-published versions whose titles and author lists do not align with their preprints will fail to be linked. Second, updates will be facilitated only for retractions, which are issued their own DOIs. Unless journals take responsibility for establishing connections to previously published versions, linkage will continue to be suboptimal, and preprint readers will continue to be oblivious to events such as expressions of concern, which can take months or even years of investigation before resolving or resulting in a retraction [26].
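The title-mismatch failure mode can be made concrete with a small sketch. This is not Crossref’s actual matching algorithm; the normalization step and the 0.9 threshold are illustrative assumptions about how a near-exact matcher might behave:

```python
# Sketch of fuzzy title matching of the kind a preprint-journal linking
# service might rely on. The normalization and threshold are illustrative.
import difflib
import string

def normalize(title):
    """Lowercase a title and strip punctuation and extra whitespace."""
    cleaned = title.lower().translate(str.maketrans("", "", string.punctuation))
    return " ".join(cleaned.split())

def titles_match(a, b, threshold=0.9):
    """True if the normalized titles are near-exact matches."""
    ratio = difflib.SequenceMatcher(None, normalize(a), normalize(b)).ratio()
    return ratio >= threshold

print(titles_match(
    "Downstream retraction of preprinted research",
    "Downstream Retraction of Pre-printed Research."))  # True

print(titles_match(
    "Downstream retraction of preprinted research",
    "An entirely different study of unrelated topics"))  # False
```

A journal version whose title was substantially reworded between preprint and publication would fall below any such threshold and silently lose its link.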

Preprints have proven valuable to researchers [13, 18] and are likely to become a fixture among authors of biological and medical research, increasingly serving as the earliest version of a study that is shared with the world. But as preprints become more common, so too will the incidence of downstream retractions or other problems that are not properly accounted for on the preprint. As adoption of preprints continues to grow, serious consideration should be given to ensuring that preprints are digitally connected with associated publications and to building reliable mechanisms for propagating critical updates. Future studies should include an analysis of preprints and their journal article links across the broader group of preprint servers to provide a more comprehensive picture of the state of information propagation across the publication continuum.

Limitations

Counts of linked articles via Crossref are known to be limited to near-exact matches of titles and author lists between preprints and journal publications. For Research Square, counts of links established using internal mechanisms are also underestimates, due to ongoing technical deficits that prevent perfect linking of Research Square preprints to Springer Nature articles. Thus, the actual percentage of preprints that are later published is likely higher than the counts presented here suggest.

Acknowledgments

My sincere thanks to Ivan Oransky and Alison Abritis of Retraction Watch for their guidance and access to the Retraction Watch database.

Funding Statement

The author(s) received no specific funding for this work.

Data Availability

PLoS One. 2022; 17(5): e0267971.

Decision Letter 0

16 Mar 2022

PONE-D-22-05282: Downstream retraction of preprinted research in the life and medical sciences (PLOS ONE)

Dear Dr. Avissar-Whiting,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. The manuscript lacks referencing to related work. This topic has been and is discussed in a number of articles that are not sufficiently cited.

Please submit your revised manuscript by April 25, 2022. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols . Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols .

We look forward to receiving your revised manuscript.

Kind regards,

Frederique Lisacek

Academic Editor

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at 

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and 

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Thank you for stating the following in the Competing Interests section: 

"Michele Avissar-Whiting is the Editor in Chief of the Research Square preprint platform"

Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, by including the following statement: "This does not alter our adherence to  PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests ).  If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared. 

Please include your updated Competing Interests statement in your cover letter; we will change the online submission form on your behalf.

3. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability .

"Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories . Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions . Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.

4. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: N/A

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: In my opinion this paper is relevant because it sheds light on the link between preprints and publications, focusing on retracted papers.

It is very relevant to know how journals manage retraction but also how preprint servers identify those retracted papers from journals.

This paper addresses these questions and provides some responses.

I have very few comments.

The first one is related to the references. In general there is a very low number of references, and some effort should be made by the author on this. In fact, the introduction only has 6 references, with some paragraphs having no reference.

I have some doubts about the classification of retracted papers. From Table 1 you can see that a number of retractions (most of them) were due to misconduct, but some of them did not show misconduct. This is the case for case 17, where it was a journal mistake. Regarding IRB approval, though it is bad practice, it is not misconduct as such in how we understand publishing-related misconduct. I wonder if you could be more precise on this in your results and how you consider case 17.

From Table 1: the journal and the name of the first author should be included. I think that journals publishing retracted papers should be identified, along with unethical authors.

Methods, second paragraph: a reference is needed here, because you are classifying misconduct. Is this your own classification? There are other published classifications that could add more comparability to your results.

Results, lines 122-126: the cases do not add up to 30, and they should. I guess that perhaps some retractions are classified in two categories. An explanation is needed here.

You give importance to the number of retractions by preprint server. I suggest that a new table showing retractions per server and retraction notices per server, along with time to notice or time to retraction, would provide useful information on how these servers work when retractions are detected and on their performance in this regard.

Some more references are needed on the discussion.

Professor Alberto Ruano-Ravina

Reviewer #2: This work provides us with a convincing observation on the status of preprints and links between preprints and subsequent publications. This is more an opinion on a socio-political topic of great interest than an authentic research article, but it is worth communicating to the community.

6. PLOS authors have the option to publish the peer review history of their article ( what does this mean? ). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy .

Reviewer #1:  Yes:  Alberto Ruano-Ravina

Reviewer #2:  Yes:  Antoine Danchin

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Author response to Decision Letter 0

21 Mar 2022

Dear Dr. Lisacek,

I am very grateful to the referees for the time they invested in reviewing the manuscript and for their important feedback, which has led to revisions that greatly improved the presentation. My point-by-point responses to the referees’ comments follow.

Reviewer #1: In my opinion this paper is relevant because it gives light on the link between preprints, publications focusing on retracted papers.

The first one is related to the references. In general there is a very low number of references and a small effort should be done by the author on this. In fact, the introduction only has 6 references with some paragraphs with no reference.

Response: I agree that the manuscript was lacking sufficient references. I have now added six new relevant references, including in places where new information was added based on other suggestions by the reviewer. I have also added new citations to existing references where needed throughout the text.

Response: While not all cases were defined as being due to misconduct in the original manuscript, the reviewer makes an excellent point that these categories should have been defined in a more standardized and clearer way. To clarify the categorization of these retractions, I have redesignated them based on the definitions of research misconduct provided by the Council of Science Editors. Three discrete designations have been used, and the following text has been added to the Methods section:

“Misconduct in each case of retraction was categorized according to the areas defined by the Council of Science Editors [7] as follows: Mistreatment of research subjects (includes failure to obtain approval from an ethical review board or consent from human subjects before conducting the study and failure to follow an approved protocol); Falsification and fabrication of data; Piracy and plagiarism (including unauthorized use of third-party data or other breach of policy by authors); or No evidence of misconduct. The determination of presence and type of misconduct was based on information contained in the individual retraction notices as well as on the reasons for retraction briefly noted in the Retraction Watch database.”

Note that this definitional change also changed the quantification of retractions in each category, so the numbers in the results have shifted slightly as a result.

Response: While I agree in principle with the referee’s comment, I could not include this information due to our agreement with the Center for Scientific Integrity, which limits the granularity of the data I could share in this publication. Additionally (and fortunately), the specifics of the individual studies are not relevant to the thesis of this study, which is focused more on the integrity of information propagation rather than the individual instances of retraction themselves. Thus, not identifying them here does not seem to undermine the fundamental premise of the work. On a related note, I have removed the Article Type and Country information from the table, as neither is discussed or relevant to the topic at hand.

Response: Thank you for this comment, which has now been addressed above.

Response: Thank you for this comment. The reviewer is correct that retractions could fall into multiple categories. I have now acknowledged this clearly in this section. Note that substantial changes were made to this section due to the re-classification of research misconduct.

Response: I’m grateful to the reviewer for this suggestion. I have added a table (new Table 1) that contains the server-specific information. I have left the retraction notice information in the larger table, however.

Response: I thank the reviewer for this suggestion. Six new references and four new citations to existing references have been added to the paper.

Reviewer #2: This work provides us with a convincing observation on the status of preprints and links between preprints and subsequent publications. This is more an opinion on a socio-political topic of great interest than an authentic research article, but it is worth communicating to the community.

Response: I thank the reviewer for their comment and agree that there is certainly an element of opinion or “call to action” in this work. My hope is that I’ve provided sufficient empirical evidence to start a discussion and provide a foundation for future, more comprehensive, studies assessing the fidelity of information transfer between associated article types. More importantly, I hope it will convince the myriad stakeholders of the importance of strengthening these connections. This comment compelled me to add a statement regarding future studies to the end of the discussion.

Submitted filename: Response to Reviewers.docx

Decision Letter 1

20 Apr 2022

PONE-D-22-05282R1

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/ , click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Additional Editor Comments (optional):

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

2. Is the manuscript technically sound, and do the data support the conclusions?

3. Has the statistical analysis been performed appropriately and rigorously?

4. Have the authors made all data underlying the findings in their manuscript fully available?

5. Is the manuscript presented in an intelligible fashion and written in standard English?

6. Review Comments to the Author

Reviewer #1: The author has answered all my comments satisfactorily and I have no further concerns about the contents of the manuscript.

7. PLOS authors have the option to publish the peer review history of their article ( what does this mean? ). If published, this will include your full peer review and any attached files.

Reviewer #1:  Yes:  Alberto Ruano-Raviña

Acceptance letter

22 Apr 2022

Dear Dr. Avissar-Whiting:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

PLOS ONE Editorial Office Staff

on behalf of

Dr. Frederique Lisacek


Authors' needs are changing.

Societies and publishers should evolve with them.

Many researchers want more control of how, and when, they share their research. How are your journals evolving to meet these changing needs and expectations? Through In Review, we can help you adapt while also enabling the success of your authors. Learn more below, then contact us to discuss integrating your journal.

In Review is a simple journal integration for publishers and academic societies. It allows authors to seamlessly post their manuscripts on the multidisciplinary preprint platform, Research Square, while simultaneously submitting those same manuscripts to journals for peer review. In Review includes a series of beneficial features for publishers and authors alike.

Nearly 500 journals are using In Review integration.

Authors are opting to join In Review 40% of the time.

Nearly 100,000 preprints have been posted on Research Square, covering hundreds of disciplines.

For Publishers and Societies

In Review helps you quickly and seamlessly integrate preprinting into your manuscript submission process without the need to build or manage a separate preprint platform.

The value of In Review to Publishers and Societies

  • Zero-cost implementation
  • Seamless integration of In Review into your manuscript submission systems and journals
  • Possible revenue sharing for Research Square's paid author services (research promotion services, editorial badges, etc.)
  • Branding integration
  • White label opportunities
  • A peer-review timeline that provides varying levels of transparency for the publisher
  • Integration with one of the largest, most robust multidisciplinary preprint servers in the world

For Authors

Using In Review, authors can post their manuscripts as preprints on Research Square while submitting them to participating journals. If the manuscript is not accepted, the author may still submit to other journals, and their work will remain on the Research Square preprint platform.

The value of In Review to Authors

Posting a preprint immediately establishes the primacy of the work. Authors can also track status changes in real time as their manuscripts advance through the peer-review process.

  • A peer-review timeline with instant status updates as your manuscript moves through the peer-review process
  • Established primacy of your findings upon publication of your preprint
  • Access to your preliminary paper while readers await your peer-reviewed publication
  • A citable DOI for your preprinted manuscript
  • Access to our tools and services for manuscript improvement, like community comments, language assessment tools, and more
  • A link from the preprint to the latest published version of record
  • Full-text indexing for better search engine discoverability
  • Attractive HTML formatting for all preprints, allowing better readability, image resizing, and figure downloads

"I opted-in because I knew I should be submitting preprints, but I didn’t know where to start. In Review made it convenient."

In Review author, Johns Hopkins University

"Every publication should have a system like In Review, where everyone can see when and where you submitted."

Pritam Sukul, University Medicine Rostock


IMAGES

  1. More than 150,000 Preprints Now Posted on Research Square

    research square preprint update

  2. How to Get the Most Impact from Your Preprint

    research square preprint update

  3. Preprints

    research square preprint update

  4. 2020: The Year of the Preprint

    research square preprint update

  5. How do I submit a revised version of my preprint?

    research square preprint update

  6. What is a Preprint?

    research square preprint update

VIDEO

  1. preLights webinar series: From preprint to publication

  2. PUBLISHING AN OBGYN PAPER IN A JOURNAL

  3. The Importance of Publications for R16 Applications

  4. Make lighting healthier

  5. MPS Ltd

  6. preLights webinar series: From preprint to publication

COMMENTS

  1. Updating/Modifying a Preprint

    Updating/Modifying a Preprint. How do I submit a new version of my preprint? Can I withdraw or remove my preprint from the platform? There are issues with the author list on my preprint. How can I update this? How can I make changes/corrections to my preprint? My manuscript has been published. Can this be reflected on my preprint?

  2. Preprints

    Research Square is a multidisciplinary preprint and author services platform. You can share your work early in the form of a preprint, gain feedback from the community, and use our tools and services to improve your paper. You can also learn about breakthroughs in your field and find potential collaborators before publishing in a scholarly journal.

  3. How can I make changes/corrections to my preprint?

    Research Square allows preprint versioning. Corrections or changes can be made by submitting a new version (revision) of your preprint. The procedure for how to submit a new version is here. 1 year ago.

  4. How do I submit a new version of my preprint?

    Click on "View your private pages" link located above the preprint title. Select "Add new version" from the left hand side menu. Fully update the title, authors, abstract, keywords, and upload any necessary files to reflect all changes to the manuscript by using the edit icons in the upper right hand corner of each section. Click the ...

  5. Preprints

    Updating/Modifying a Preprint How do I submit a new version of my preprint? Can I withdraw or remove my preprint from the platform? There are issues with the author list on my preprint. How can I update this? How can I make changes/corrections to my preprint? My manuscript has been published. Can this be reflected on my preprint?

  6. Home

    Research Square is a preprint platform that makes research communication faster, fairer, and more useful. Browse. Preprints. COVID-19 Preprints. Protocols. Videos. Journals. Tools & Services. Overview. ... Post your manuscript as a preprint directly to Research Square or while under consideration at a participating journal through In Review ...

  7. Submitting a Preprint Directly to Research Square

    How do I submit a preprint? What article types do you accept? What checks will my manuscript undergo before it is posted? How long will it take for my preprint to be posted? Can I post my manuscript on another preprint server as well? Where are preprints posted on Research Square indexed? What are Research Square's editorial policies?

  8. New Draft

    Research Square allows you to get credit and gain visibility for your work as soon as you feel it's ready. Post your results online as a preprint, gain early feedback, and start making changes prior to peer review in a journal. Enter your manuscript's title. Your manuscript will be saved as a private draft until you are ready to submit.

  9. Can I opt into preprinting?

    Yes, you can opt in to preprinting directly from Research Square: log into your Research Square account, select the paper, then choose either "Post my preprint" from the left-hand side menu or the "Learn More" button. Select the button labeled either "Opt in to In ...

  10. In Review

    When authors opt in to In Review, their paper is posted as a preprint and made available for comments. In addition to peer review at the journal, community comments help authors to improve their article. A peer review timeline allows authors and readers to track the status of a manuscript with real time updates. In Review allows authors to:

  12. Can I withdraw or remove my preprint from the platform?

    A preprint may be withdrawn due to issues that cannot be addressed by submitting a revised version of the preprint, such as detection of plagiarism, authorship disputes, and content errors that will not or cannot be corrected. When a preprint is withdrawn, a notification explaining the reason for withdrawal is placed on the preprint as a new ...

  13. More than 150,000 Preprints Now Posted on Research Square

    Durham, NC, USA (May 26, 2022) -- Just nine months after reaching its 100,000-preprint milestone, Research Square now hosts more than 150,000 preprints. The rapid growth of the Research Square preprint server is due in great part to the expansion of In Review, a journal-integrated service that allows researchers to post preprints of their manuscripts during article submission ...

  14. Participating Journals & Platforms

    Research Square is a preprint platform that makes research communication faster, fairer, and more useful. Opt in when you submit to participating journals, such as 3D Printing in Medicine, and receive real-time updates on your manuscript. As a division of Research Square Company, we're committed to making research ...

  15. In Review

    Authors have the option to opt in to In Review directly from the submission system of participating journals. To opt in, all co-authors agree to have their manuscript posted as a preprint with a CC-BY 4.0 license and a DOI, becoming a permanent part of the scholarly record. Read more about our editorial policies here. Once the submission has been sent out for review by the journal and has ...

  16. SARS-CoV-2 Preprints

    SARS-CoV-2 and COVID-19 Preprints. Ultrastructural morphology exhibited by coronaviruses. Since November 2019, the novel coronavirus known as SARS-CoV-2 (formerly, 2019-nCoV) has devastated communities and overwhelmed healthcare facilities worldwide. Research on the virus, its epidemiology, modes of infection, and potential treatments has been ...

  17. Research Square Reaches 100,000 Preprint Milestone

    Research Square Reaches 100,000 Preprint Milestone. Durham, NC, USA (August 12, 2021) -- Fewer than three years after the first preprint was posted on Research Square, the world's fastest-growing multidisciplinary preprint platform has surpassed 100,000 preprints. These 100,000-plus preprints combined were produced by 530,415 unique co ...

  18. Research Square Reaches 20,000 Preprints

    DURHAM, NORTH CAROLINA, USA - Research Square, known for its multidisciplinary preprint platform, reached 20,000 preprints just 18 months after its launch in 2018. Having recently celebrated 10,000 preprints in Dec. 2019, the platform doubled its number of preprints in the last 4 months and is effectively the world's fastest-growing preprint platform.

  19. Systematic examination of preprint platforms for use in the medical and

    In that time, over 2.72 million preprints have been posted, and in 2020 two platforms (Research Square and bioRxiv) averaged more than 2,500 biomedical postings per month. ... Our plan for maintenance is to enable preprint platforms to update their listings on demand, pending verification of publicly accessible information by ASAPbio ...

  20. Research Square Now Offering Badges via Preprint Platform

    Launched in 2018, Research Square now hosts over 20,000 preprints across all fields and is the world's fastest-growing preprint platform and the #4 source of research on COVID-19. Preprint authors can choose to purchase a badge in Methods or Statistics Reporting to certify that fundamental standards of scientific reporting around methodology ...

  22. Downstream retraction of preprinted research in the life and medical

    Among the three preprint servers, a total of 30 downstream retractions were identified: 17 at Research Square, 11 at bioRxiv, and 2 at medRxiv (Table 2). This represents 0.05%, 0.01%, and 0.02% of all preprints with journal publication links at Research Square, bioRxiv, and medRxiv, respectively (Table 1). All 30 of the retracted papers in this ...

  23. In Review

    In Review helps you quickly and seamlessly integrate preprinting into your manuscript submission process without the need to build or manage a separate preprint platform. The value of In Review to publishers and societies: zero-cost implementation; seamless integration of In Review into your manuscript submission systems and journals; possible revenue sharing for Research Square's paid author ...
