
The ARRIVE guidelines 2.0: Updated guidelines for reporting animal research

Abstract

Reproducible science requires transparent reporting. The ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments) were originally developed in 2010 to improve the reporting of animal research. They consist of a checklist of information to include in publications describing in vivo experiments to enable others to scrutinise the work adequately, evaluate its methodological rigour, and reproduce the methods and results. Despite considerable levels of endorsement by funders and journals over the years, adherence to the guidelines has been inconsistent, and the anticipated improvements in the quality of reporting in animal research publications have not been achieved. Here, we introduce ARRIVE 2.0. The guidelines have been updated and information reorganised to facilitate their use in practice. We used a Delphi exercise to prioritise and divide the items of the guidelines into 2 sets, the “ARRIVE Essential 10,” which constitutes the minimum requirement, and the “Recommended Set,” which describes the research context. This division facilitates improved reporting of animal research by supporting a stepwise approach to implementation. This helps journal editors and reviewers verify that the most important items are being reported in manuscripts. We have also developed the accompanying Explanation and Elaboration (E&E) document, which serves (1) to explain the rationale behind each item in the guidelines, (2) to clarify key concepts, and (3) to provide illustrative examples. We aim, through these changes, to help ensure that researchers, reviewers, and journal editors are better equipped to improve the rigour and transparency of the scientific process and thus reproducibility.

See S1 Annotated Byline for individual authors' positions at the time this article was submitted.

Why good reporting is important

In recent years, concerns about the reproducibility of research findings have been raised by scientists, funders, research users, and policy makers [1, 2]. Factors that contribute to poor reproducibility include flawed study design and analysis, variability and inadequate validation of reagents and other biological materials, insufficient reporting of methodology and results, and barriers to accessing data [3]. The bioscience community has introduced a range of initiatives to address the problem, from open access and open practices to enable the scrutiny of all aspects of the research [4, 5] through to study preregistration to shift the focus towards robust methods rather than the novelty of the results [6, 7], as well as resources to improve experimental design and statistical analysis [8–10].

Transparent reporting of research methods and findings is an essential component of reproducibility. Without it, the methodological rigour of a study cannot be adequately scrutinised, the reliability of the findings cannot be assessed, and the work cannot be repeated or built upon by others. Despite the development of specific reporting guidelines for preclinical and clinical research, evidence suggests that scientific publications often lack key information and that there continues to be considerable scope for improvement [11–18]. Animal research is a case in point: poor reporting hampers the development of therapeutics, and irreproducible findings can spawn entire fields of research or trigger clinical studies, subjecting patients to interventions that are unlikely to be effective [2, 19, 20].

In an attempt to improve the reporting of animal research, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) guidelines were published in 2010. The guidelines consist of a checklist of the items that should be included in any manuscript that reports in vivo experiments, to ensure a comprehensive and transparent description [21–30]. They apply to any area of research using live animal species and are especially pertinent for describing comparative research in the laboratory or other formal test settings. The guidelines are also relevant in a wider context, for example, for observational research, studies conducted in the field, and where animal tissues are used. In the 10 years since publication, the ARRIVE guidelines have been endorsed by more than a thousand journals from across the life sciences. Endorsement typically includes advocating their use in guidance to authors and reviewers. However, despite this level of support, recent studies have shown that important information set out in the ARRIVE guidelines is still missing from most publications sampled. This includes details on randomisation (reported in only 30%–40% of publications), blinding (reported in only approximately 20% of publications), sample size justification (reported in less than 10% of publications), and animal characteristics (all basic characteristics reported in less than 10% of publications) [11, 31, 32].

Evidence suggests that 2 main factors limit the impact of the ARRIVE guidelines. The first is the extent to which editorial and journal staff are actively involved in enforcing reporting standards. This is illustrated by a randomised controlled trial at PLOS ONE, designed to test the effect of requesting a completed ARRIVE checklist in the manuscript submission process. This single editorial intervention, which did not include further verification from journal staff, failed to improve the disclosure of information in published papers [33]. In contrast, other studies using shorter checklists (primarily focused on experimental design) with more editorial follow-up have shown a marked improvement in the nature and detail of the information included in publications [34–36]. It is likely that the level of resource required from journals and editors currently prohibits the implementation of all the items of the ARRIVE guidelines.

The second issue is that researchers and other individuals and organisations responsible for the integrity of the research process are not sufficiently aware of the consequences of incomplete reporting. There is some evidence that awareness of ARRIVE is linked to the use of more rigorous experimental design standards [37]; however, researchers are often unfamiliar with the much larger systemic bias in the publication of research and in the reliability of certain findings and even of entire fields [33, 38–40]. This lack of understanding affects how experiments are designed and grant proposals prepared, how animals are used and data recorded in the laboratory, and how manuscripts are written by authors or assessed by journal staff, editors, and reviewers.

Approval for experiments involving animals is generally based on a harm–benefit analysis, weighing the harms to the animals involved against the benefits of the research to society. If the research is not reported in enough detail, even when conducted rigorously, the benefits may not be realised, and the harm–benefit analysis and public trust in the research are undermined [41]. As a community, we must do better to ensure that, where animals are used, the research is well designed, properly analysed, and transparently reported. Here, we introduce the revised ARRIVE guidelines, referred to as ARRIVE 2.0. The information included has been updated, extended, and reorganised to facilitate the use of the guidelines, helping to ensure that researchers, editors, and reviewers, as well as other relevant journal staff, are better equipped to improve the rigour and reproducibility of animal research.

Introducing ARRIVE 2.0

In ARRIVE 2.0, we have improved the clarity of the guidelines, prioritised the items, added new information, and generated the accompanying Explanation and Elaboration (E&E) document to provide context and rationale for each item [42] (also available at https://www.arriveguidelines.org). New additions comprise inclusion and exclusion criteria, which are a key aspect of data handling and prevent the ad hoc exclusion of data [43]; protocol registration, a recently emerged approach that promotes scientific rigour and encourages researchers to carefully consider the experimental design and analysis plan before any data are collected [44]; and data access, in line with the FAIR Data Principles [45] (Findable, Accessible, Interoperable, and Reusable). S1 Table summarises the changes.

The most significant departure from the original guidelines is the classification of items into 2 prioritised groups, as shown in Tables 1 and 2. There is no ranking of the items within each group. The first group is the “ARRIVE Essential 10,” which describes information that is the basic minimum to include in a manuscript, as without this information, reviewers and readers cannot confidently assess the reliability of the findings presented. It includes details on the study design, the sample size, measures to reduce subjective bias, outcome measures, statistical methods, the animals, experimental procedures, and results. The second group, referred to as the “Recommended Set,” adds context to the study described. This includes the ethical statement, declaration of interest, protocol registration, and data access, as well as more detailed information on the methodology such as animal housing, husbandry, care, and monitoring. Items on the abstract, background, objectives, interpretation, and generalisability also describe what to include in the more narrative parts of a manuscript.
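
To make the grouping concrete, the following is a minimal sketch in Python of how the 2 sets might be represented as a machine-readable checklist. The item names are paraphrased from the descriptions above rather than quoted from Tables 1 and 2, and the helper function is purely illustrative.

```python
# Illustrative representation of the ARRIVE 2.0 item groups as described in
# the text above. Item names are paraphrased summaries, not the official
# wording of Tables 1 and 2; neither list implies a ranking within the group.
ARRIVE_ESSENTIAL_10 = [
    "Study design",
    "Sample size",
    "Inclusion and exclusion criteria",
    "Randomisation",
    "Blinding",
    "Outcome measures",
    "Statistical methods",
    "Experimental animals",
    "Experimental procedures",
    "Results",
]

RECOMMENDED_SET = [
    "Abstract",
    "Background",
    "Objectives",
    "Ethical statement",
    "Housing and husbandry",
    "Animal care and monitoring",
    "Interpretation/scientific implications",
    "Generalisability/translation",
    "Protocol registration",
    "Data access",
    "Declaration of interests",
]

def missing_essential_items(reported: set[str]) -> list[str]:
    """Return the Essential 10 items not marked as reported in a manuscript."""
    return [item for item in ARRIVE_ESSENTIAL_10 if item not in reported]

# Example: a manuscript reporting only randomisation and sample size would
# still be missing the other 8 essential items.
print(missing_essential_items({"Randomisation", "Sample size"}))
```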

Revising the guidelines has been an extensive and collaborative effort, with input from the scientific community carefully built into the process. The revision of the ARRIVE guidelines has been undertaken by an international working group—the authors of this publication—with expertise from across the life sciences community, including funders, journal editors, statisticians, methodologists, and researchers from academia and industry. We used a Delphi exercise [46] with external stakeholders to maximise diversity in fields of expertise and geographical location, with experts from 19 countries providing feedback on each item, suggesting new items, and ranking items according to their relative importance for assessing the reliability of research findings. This ranking resulted in the prioritisation of the items of the guidelines into the 2 sets. Demographics of the Delphi panel and full methods and results are presented in Supporting Information S1 Delphi and S1 Data. Following their publication on bioRxiv, the revised guidelines and the E&E were also road tested with researchers preparing manuscripts describing in vivo studies, to ensure that these documents were well understood and useful to the intended users. This study is presented in Supporting Information S1 Road Testing and S2 Data.
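
The precise scoring and consensus rules are documented in S1 Delphi. Purely as an illustration of the kind of aggregation such an exercise involves, the sketch below assumes a hypothetical 1–9 importance scale and splits items on their median panel score; the cutoff value is invented for the example.

```python
from statistics import median

# Illustrative only: S1 Delphi documents the actual procedure. Here we assume
# a hypothetical 1-9 importance scale and split items on their median score.
def prioritise(scores: dict[str, list[int]], cutoff: float = 7.0):
    """Split items into an 'essential' and a 'recommended' group by median score."""
    essential, recommended = [], []
    for item, panel_scores in scores.items():
        (essential if median(panel_scores) >= cutoff else recommended).append(item)
    return essential, recommended

# Example: three panel members scoring two items.
essential, recommended = prioritise({
    "Sample size": [9, 8, 9],
    "Ethical statement": [6, 5, 7],
})
print(essential)     # ['Sample size']
print(recommended)   # ['Ethical statement']
```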

While reporting animal research in adherence to all 21 items of ARRIVE 2.0 represents best practice, the classification of the items into 2 groups is intended to facilitate improved reporting of animal research by allowing an initial focus on the most critical issues. This makes it easier for journal staff, editors, and reviewers to verify that the items have been adequately reported in manuscripts. The first step should be to ensure compliance with the ARRIVE Essential 10 as a minimum requirement. Items from the Recommended Set can then be added over time, in line with specific editorial policies, until all the items are routinely reported in all manuscripts. The ARRIVE 2.0 guidelines are fully compatible with and complementary to other guidelines published in recent years. By providing a comprehensive set of recommendations specifically tailored to the description of in vivo research, they help authors reporting animal experiments adhere to the National Institutes of Health (NIH) standards [43] and the minimum standards framework and checklist (Materials, Design, Analysis and Reporting [MDAR] [47]). The revised guidelines are also in line with many journals' policies and will assist authors in complying with information requirements on the ethical review of the research [48, 49], data presentation and access [50–52], statistical methods [51, 52], and conflicts of interest [53, 54].

Although the guidelines are written with researchers and journal editorial policies in mind, it is important to stress that researchers alone should not have to carry the responsibility for transparent reporting. Endorsement of ARRIVE by funders, institutions, and publishers has been instrumental in raising awareness to date; these organisations now have a key role to play in building capacity and championing the behavioural changes required to improve reporting practices. This includes embedding ARRIVE 2.0 in appropriate training, workflows, and processes to support researchers in their different roles. While the primary focus of the guidelines has been on the reporting of animal studies, ARRIVE also has applications earlier in the research process, including in the planning and design of in vivo experiments. For example, requesting a description of the study design in line with the guidelines in funding or ethical review applications ensures that steps to minimise experimental bias are considered at the beginning of the research cycle [55].

Conclusion

Transparent reporting is clearly essential if animal studies are to add to the knowledge base and inform future research, policy, and clinical practice. ARRIVE 2.0 prioritises the reporting of information related to study reliability. This enables research users to assess how much weight to ascribe to the findings and, in parallel, promotes the use of rigorous methodology in the planning and conduct of in vivo experiments [37], thus increasing the likelihood that the findings are reliable and, ultimately, reproducible.

The intention of ARRIVE 2.0 is not to supersede individual journal requirements but to promote a harmonised approach across journals, ensuring that all manuscripts contain the essential information needed to appraise the research. Journals usually share a common objective of improving the methodological rigour and reproducibility of the research they publish, but different journals emphasise different pieces of information [56–58]. Here, we propose an expert consensus on the information to prioritise. This will provide clarity for authors, facilitate the transfer of manuscripts between journals, and accelerate improvements in reporting standards.

Concentrating the efforts of the research and publishing communities on the ARRIVE Essential 10 provides a manageable approach to evaluating reporting quality efficiently and assessing the effect of interventions and policies designed to improve the reporting of animal experiments. It also provides a starting point for the development of operationalised checklists to assess reporting, ultimately enabling automated or semi-automated artificial intelligence tools that can rapidly detect missing information [59].
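
As a toy illustration of the simplest possible form such a tool could take, the sketch below screens manuscript text for keywords associated with a few Essential 10 items. The regular expressions are invented for the example; a production tool of the kind cited here [59] would rely on far more sophisticated natural language processing.

```python
import re

# Toy keyword screen for a few ARRIVE Essential 10 items. The patterns are
# invented for illustration; a mention of a keyword does not imply adequate
# reporting, only that the topic appears in the text at all.
PATTERNS = {
    "randomisation": re.compile(r"\brandomi[sz](?:ed|ation)\b", re.IGNORECASE),
    "blinding": re.compile(r"\bblind(?:ed|ing)\b", re.IGNORECASE),
    "sample size justification": re.compile(
        r"\b(?:sample size|power (?:analysis|calculation))\b", re.IGNORECASE
    ),
}

def screen_manuscript(text: str) -> dict[str, bool]:
    """Flag which screened items are at least mentioned in the manuscript text."""
    return {item: bool(pattern.search(text)) for item, pattern in PATTERNS.items()}

report = screen_manuscript(
    "Animals were randomised to treatment groups; outcome assessment was blinded."
)
print(report)
# {'randomisation': True, 'blinding': True, 'sample size justification': False}
```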

Improving reporting is a collaborative endeavour, and concerted effort from the biomedical research community is required to ensure maximum impact. We welcome collaboration with other groups operating in this area, as well as feedback on ARRIVE 2.0 and our implementation strategy.

Supporting information

S1 Table. Noteworthy changes in ARRIVE 2.0.

This table recapitulates noteworthy changes in the ARRIVE guidelines 2.0, compared to the original ARRIVE guidelines published in 2010.

https://doi.org/10.1371/journal.pbio.3000410.s001

(PDF)

S1 Delphi. Delphi methods and results.

Methodology and results of the Delphi study that was used to prioritise the items of the guidelines into the ARRIVE Essential 10 and Recommended Set.

https://doi.org/10.1371/journal.pbio.3000410.s002

(PDF)

S1 Data. Delphi data.

Tabs 1, 2, and 3: Panel members’ scores for each of the ARRIVE items during rounds 1, 2, and 3, along with descriptive statistics. Tab 4: Qualitative feedback, collected from panel members during round 1, on the importance and the wording of each item. Tab 5: Additional items suggested for consideration in ARRIVE 2.0; similar suggestions were grouped together before processing. Tab 6: Justifications provided by panel members for changing an item’s score between round 1 and round 2.

https://doi.org/10.1371/journal.pbio.3000410.s003

(XLSX)

S2 Data. Road testing data.

Tab 1: Participants’ demographics and general feedback on the guidelines and the E&E preprints. Tab 2: Outcome of each manuscript’s assessment and justifications provided by participants for not including information covered in the ARRIVE guidelines.

https://doi.org/10.1371/journal.pbio.3000410.s004

(XLSX)

S1 Road Testing. Road testing methods and results.

Methodology used to road test the revised ARRIVE guidelines and E&E (as published in preprint) and how this information was used in the development of ARRIVE 2.0.

https://doi.org/10.1371/journal.pbio.3000410.s005

(PDF)

S1 Annotated Byline. Individual authors’ positions at the time this article was submitted.

https://doi.org/10.1371/journal.pbio.3000410.s006

(DOCX)

Acknowledgments

We would like to thank the members of the expert panel for the Delphi exercise and the participants of the road testing for their time and feedback. We are grateful to the DelphiManager team for advice and use of their software. We would like to acknowledge the late Doug Altman’s contribution to this project; Doug was a dedicated member of the working group and his input to the guidelines’ revision has been invaluable.

References

  1. Goodman SN, Fanelli D, Ioannidis JPA. What does research reproducibility mean? Sci Transl Med. 2016;8(341):341ps12. pmid:27252173
  2. Begley CG, Ioannidis JP. Reproducibility in science: improving the standard for basic and preclinical research. Circ Res. 2015;116(1):116–26. pmid:25552691
  3. Freedman LP, Venugopalan G, Wisman R. Reproducibility2020: Progress and priorities. F1000Research. 2017;6:604. pmid:28620458; PMCID: PMC5461896
  4. Kidwell MC, Lazarevic LB, Baranski E, Hardwicke TE, Piechowski S, Falkenberg LS, et al. Badges to acknowledge open practices: a simple, low-cost, effective method for increasing transparency. PLoS Biol. 2016;14(5):e1002456. pmid:27171007; PMCID: PMC4865119
  5. Else H. Radical open-access plan could spell end to journal subscriptions. Nature. 2018;561(7721):17–8. pmid:30181639
  6. Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. Proc Natl Acad Sci U S A. 2018;115(11):2600–6. pmid:29531091; PMCID: PMC5856500
  7. Chambers CD, Forstmann B, Pruszynski JA. Registered reports at the European Journal of Neuroscience: consolidating and extending peer-reviewed study pre-registration. Eur J Neurosci. 2017;45(5):627–8. pmid:28027598
  8. Bate ST, Clark RA. The design and statistical analysis of animal experiments. Cambridge, United Kingdom: Cambridge University Press; 2014. 310 p.
  9. Percie du Sert N, Bamsey I, Bate ST, Berdoy M, Clark RA, Cuthill I, et al. The Experimental Design Assistant. PLoS Biol. 2017;15(9):e2003779. pmid:28957312; PMCID: PMC5634641
  10. Lazic SE. Experimental design for laboratory biologists: maximising information and improving reproducibility. Cambridge: Cambridge University Press; 2016.
  11. Macleod MR, Lawson McLean A, Kyriakopoulou A, Serghiou S, de Wilde A, Sherratt N, et al. Risk of bias in reports of in vivo research: a focus for improvement. PLoS Biol. 2015;13(10):e1002273. pmid:26460723; PMCID: PMC4603955
  12. Macleod MR, Fisher M, O'Collins V, Sena ES, Dirnagl U, Bath PM, et al. Good laboratory practice: preventing introduction of bias at the bench. Stroke. 2009;40(3):e50–2. pmid:18703798
  13. Rice AS, Cimino-Brown D, Eisenach JC, Kontinen VK, Lacroix-Fralish ML, Machin I, et al. Animal models and the prediction of efficacy in clinical trials of analgesic drugs: a critical appraisal and call for uniform reporting standards. Pain. 2009;139(2):243–7. pmid:18814968
  14. McCance I. Assessment of statistical procedures used in papers in the Australian Veterinary Journal. Aust Vet J. 1995;72(9):322–8. pmid:8585846
  15. Hackam DG, Redelmeier DA. Translation of research evidence from animals to humans. JAMA. 2006;296(14):1731–2. pmid:17032985
  16. Kilkenny C, Parsons N, Kadyszewski E, Festing MF, Cuthill IC, Fry D, et al. Survey of the quality of experimental design, statistical analysis and reporting of research using animals. PLoS ONE. 2009;4(11):e7824. pmid:19956596
  17. van der Worp HB, Howells DW, Sena ES, Porritt MJ, Rewell S, O'Collins V, et al. Can animal models of disease reliably inform human studies? PLoS Med. 2010;7(3):e1000245. pmid:20361020
  18. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76. pmid:24411647
  19. Begley CG, Ellis LM. Drug development: raise standards for preclinical cancer research. Nature. 2012;483(7391):531–3. pmid:22460880
  20. Scott S, Kranz JE, Cole J, Lincecum JM, Thompson K, Kelly N, et al. Design, power, and interpretation of studies in the standard murine model of ALS. Amyotroph Lateral Scler. 2008;9(1):4–15. pmid:18273714
  21. Kilkenny C, Altman DG. Improving bioscience research reporting: ARRIVE-ing at a solution. Lab Anim. 2010;44(4):377–8. pmid:20660161
  22. Kilkenny C, Browne W, Cuthill IC, Emerson M, Altman DG. Animal research: reporting in vivo experiments: the ARRIVE guidelines. J Gene Med. 2010;12(7):561–3. pmid:20607692
  23. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. PLoS Biol. 2010;8(6):e1000412. pmid:20613859; PMCID: PMC2893951
  24. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. J Pharmacol Pharmacother. 2010;1(2):94–9. pmid:21350617
  25. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Animal Research: Reporting In Vivo Experiments: the ARRIVE guidelines. J Physiol. 2010;588(Pt 14):2519–21. pmid:20634180; PMCID: PMC2916981
  26. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Animal Research: Reporting In Vivo Experiments: the ARRIVE guidelines. Exp Physiol. 2010;95(8):842–4. pmid:20610776
  27. McGrath JC, Drummond GB, McLachlan EM, Kilkenny C, Wainwright CL. Guidelines for reporting experiments involving animals: the ARRIVE guidelines. Br J Pharmacol. 2010;160(7):1573–6. pmid:20649560; PMCID: PMC2936829
  28. Kilkenny C, Browne W, Cuthill IC, Emerson M, Altman DG. Animal Research: Reporting In Vivo Experiments: the ARRIVE guidelines. J Cereb Blood Flow Metab. 2011. pmid:21206507
  29. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. Vet Clin Pathol. 2012;41(1):27–31. pmid:22390425
  30. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. Osteoarthritis Cartilage. 2012;20(4):256–60. pmid:22424462
  31. Avey MT, Moher D, Sullivan KJ, Fergusson D, Griffin G, Grimshaw JM, et al. The devil is in the details: incomplete reporting in preclinical animal research. PLoS ONE. 2016;11(11):e0166733. pmid:27855228; PMCID: PMC5113978
  32. Leung V, Rousseau-Blass F, Beauchamp G, Pang DSJ. ARRIVE has not ARRIVEd: support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesia. PLoS ONE. 2018;13(5):e0197882. pmid:29795636; PMCID: PMC5967836
  33. Hair K, Macleod MR, Sena ES, et al. A randomised controlled trial of an intervention to improve compliance with the ARRIVE guidelines (IICARus). Research Integrity and Peer Review. 2019;4(1):12. pmid:31205756
  34. The NPQIP Collaborative Group. Did a change in Nature journals' editorial policy for life sciences research improve reporting? BMJ Open Science. 2019;3(1):e000035.
  35. Han S, Olonisakin TF, Pribis JP, Zupetic J, Yoon JH, Holleran KM, et al. A checklist is associated with increased quality of reporting preclinical biomedical research: a systematic review. PLoS ONE. 2017;12(9):e0183591. pmid:28902887; PMCID: PMC5597130
  36. Ramirez FD, Motazedian P, Jung RG, Di Santo P, MacDonald ZD, Moreland R, et al. Methodological rigor in preclinical cardiovascular studies: targets to enhance reproducibility and promote research translation. Circ Res. 2017;120(12):1916–26. pmid:28373349; PMCID: PMC5466021
  37. Reichlin TS, Vogt L, Wurbel H. The researchers' view of scientific rigor: survey on the conduct and reporting of in vivo research. PLoS ONE. 2016;11(12):e0165999. pmid:27911901; PMCID: PMC5135049
  38. Hurst V, Percie du Sert N. The ARRIVE guidelines survey. Open Science Framework. 2017.
  39. Fraser H, Parker T, Nakagawa S, Barnett A, Fidler F. Questionable research practices in ecology and evolution. PLoS ONE. 2018;13(7):e0200303. pmid:30011289; PMCID: PMC6047784
  40. The Academy of Medical Sciences. Reproducibility and reliability of biomedical research: improving research practice. 2015. Available from: https://acmedsci.ac.uk/policy/policy-projects/reproducibility-and-reliability-of-biomedical-research [cited 2020 June 16].
  41. Sena ES, Currie GL. How our approaches to assessing benefits and harms can be improved. Animal Welfare. 2019;28(1):107–15.
  42. Percie du Sert N, Ahluwalia A, Alam S, Avey MT, Baker M, Browne WJ, et al. Reporting animal research: explanation and elaboration for the ARRIVE guidelines 2.0. PLoS Biol. 2020;18(7):e3000411.
  43. Landis SC, Amara SG, Asadullah K, Austin CP, Blumenstein R, Bradley EW, et al. A call for transparent reporting to optimize the predictive value of preclinical research. Nature. 2012;490(7419):187–91. pmid:23060188; PMCID: PMC3511845
  44. Kimmelman J, Anderson JA. Should preclinical studies be registered? Nat Biotechnol. 2012;30(6):488–9. pmid:22678379
  45. Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, et al. The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data. 2016;3:160018. pmid:26978244
  46. Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7(2):e1000217. pmid:20169112
  47. Chambers K, Collings A, Graf C, Kiermer V, Mellor DT, Macleod M, et al. Towards minimum reporting standards for life scientists. MetaArXiv. 2019.
  48. Rands SA. Inclusion of policies on ethical standards in animal experiments in biomedical science journals. J Am Assoc Lab Anim Sci. 2011;50(6):901–3. pmid:22330784; PMCID: PMC3228928
  49. Osborne NJ, Payne D, Newman ML. Journal editorial policies, animal welfare, and the 3Rs. Am J Bioeth. 2009;9(12):55–9. pmid:20013503
  50. Vasilevsky NA, Minnier J, Haendel MA, Champieux RE. Reproducible and reusable research: are journal data sharing policies meeting the mark? PeerJ. 2017;5. pmid:28462024
  51. Giofre D, Cumming G, Fresc L, Boedker I, Tressoldi P. The influence of journal submission guidelines on authors' reporting of statistics and use of open research practices. PLoS ONE. 2017;12(4). pmid:28414751
  52. Michel MC, Murphy TJ, Motulsky HJ. New author guidelines for displaying data and reporting data analysis and statistical methods in experimental biology. Mol Pharmacol. 2020;97(1):49–60. pmid:31882404
  53. Rowan-Legg A, Weijer C, Gao J, Fernandez C. A comparison of journal instructions regarding institutional review board approval and conflict-of-interest disclosure between 1995 and 2005. J Med Ethics. 2009;35(1):74–8. pmid:19103950
  54. Ancker JS, Flanagin A. A comparison of conflict of interest policies at peer-reviewed journals in different scientific disciplines. Sci Eng Ethics. 2007;13(2):147–57. pmid:17717729
  55. Updated RCUK guidance for funding applications involving animal research. 2015. Available from: https://mrc.ukri.org/news/browse/updated-rcuk-guidance-for-funding-applications-involving-animal-research/ [cited 2020 June 16].
  56. Prager EM, Chambers KE, Plotkin JL, McArthur DL, Bandrowski AE, Bansal N, et al. Improving transparency and scientific rigor in academic publishing. J Neurosci Res. 2018. pmid:30506706
  57. Enhancing reproducibility. Nat Methods. 2013;10(5):367. pmid:23762900
  58. Curtis MJ, Alexander S, Cirino G, Docherty JR, George CH, Giembycz MA, et al. Experimental design and analysis and their reporting II: updated and simplified guidance for authors and peer reviewers. Br J Pharmacol. 2018;175(7):987–93. pmid:29520785; PMCID: PMC5843711
  59. Heaven D. AI peer reviewers unleashed to ease publishing grind. Nature. 2018;563(7733):609–10. pmid:30482927