Original Paper
Abstract
Background: Reproducible research is a foundational component for scientific advancements, yet little is known regarding the extent of reproducible research within the dermatology literature.
Objective: This study aimed to determine the quality and transparency of the literature in dermatology journals by evaluating for the presence of 8 indicators of reproducible and transparent research practices.
Methods: By implementing a cross-sectional study design, we conducted an advanced search of publications in dermatology journals from the National Library of Medicine catalog. Our search included articles published between January 1, 2014, and December 31, 2018. After generating a list of eligible dermatology publications, we then searched for full-text PDF versions by using Open Access Button, Google Scholar, and PubMed. Publications were analyzed for 8 indicators of reproducibility and transparency—availability of materials, data, analysis scripts, protocol, preregistration, conflict of interest statement, funding statement, and open access—using a pilot-tested Google Form.
Results: After exclusion, 127 studies with empirical data were included in our analysis. Certain indicators were more poorly reported than others. We found that most publications (113, 89.0%) did not provide the unmodified, raw data used to make computations, 124 (97.6%) failed to make the complete protocol available, and 126 (99.2%) did not include step-by-step analysis scripts.
Conclusions: Our sample of studies published in dermatology journals does not appear to include sufficient detail for the studies to be accurately and successfully reproduced in their entirety. Solutions to increase the quality, reproducibility, and transparency of dermatology research are warranted. More robust reporting of key methodological details, open data sharing, and stricter standards imposed by journals on authors regarding disclosure of study materials might help to better the climate of reproducible research in dermatology.
doi:10.2196/16078
Introduction
Scientific research is currently facing a reproducibility crisis, with an estimated 50% to 90% of research suggested to be irreproducible [
- ]. Supporting the notion of this crisis, the Reproducibility Project: Cancer Biology experienced failure of 32 of 50 replication attempts, in part owing to insufficient reporting of information necessary to reproduce the original study [ ]. One study examined in this large-scale project was the melanoma genome sequencing study by Berger et al [ ]. Aiming to better understand the causes of melanoma, the authors conducted whole-genome sequencing of 25 metastatic melanomas and reported that PREX2 gene mutations are common to melanoma cells. They additionally asserted that 6 different PREX2 mutations, expressed in human telomerase reverse transcriptase–immortalized melanocytes, can increase the rate of tumor incidence compared with controls [ ]. However, attempts to replicate these findings failed. In one such attempt, described by Baker and Dolgin [ ], investigators obtained samples of the human skin cells used in the original study and assiduously copied the study’s experimental conditions. They found that the median tumor-free survival was only 1 week, whereas the original study found that 70% of mice remained tumor-free at 9 weeks. These results ultimately made it impossible to determine whether PREX2 mutations influenced the rate of tumor incidence compared with controls.
Reproducible research is a foundational component for scientific advancement [
]; however, many published works often lack essential reproducibility-related elements, such as openly shared data files, materials, and protocols [ , ]. Equally problematic in terms of the lack of information sharing is the rate at which trials are prospectively registered before study commencement. For example, Nankervis et al [ ] found that only 5% of eczema randomized controlled trials (RCTs) were preregistered, registered correctly, and registered with enough accessible information to assess whether the primary outcome aligned with the original registration. Preregistration can protect against selective outcome reporting bias and aid in reducing the prevalence of spurious and misleading results [ - ]. In addition, the dissemination of raw datasets from clinical research through Web-based repositories allows complex issues to be reanalyzed for confirmation or refutation by replication studies [ ]. Furthermore, data sharing allows for further clarification through open discussion and helps to legitimize the quality and integrity of research outcomes [ , ]. Clinical trials are now required to include a data sharing plan in the trial registration as a condition to be considered for publication in journals that are members of the International Committee of Medical Journal Editors [ ]. Journals following this policy in dermatology include JAMA Dermatology, Dermatology, American Journal of Clinical Dermatology, and Journal of Surgical Dermatology, among others. Optimizing good statistical practices—as well as using methods that promote reproducibility and transparency—could ultimately increase reproducibility within the dermatology literature. As questionable findings and false leads impede scientific advancement, researchers and physicians must advocate for efficient scientific methods that bolster reproducible research [ , ].
As little is known about the extent of reproducible literature within dermatology journals, further investigation is warranted. We therefore explored the current state of reproducibility-related research practices in a random sample of publications from the field of dermatology. Our study examined specific indicators of reproducibility and transparency, building upon similar studies, to provide baseline data for subsequent investigations [
, , ].
Methods
Overview
This cross-sectional analysis evaluating indicators of reproducibility and transparency was based on the methodology of Hardwicke et al [
], with slight modifications. To promote transparency and clarity of our research, all protocols, data, and appropriate materials are available on Open Science Framework [ ]. This analysis did not include human subjects and was not subject to institutional review board oversight [ ]. This investigation was reported using the guidelines for conducting meta-research as detailed by Murad and Wang [ ] and, when necessary, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines [ ]. Our primary objective was to evaluate for the presence of specific indicators of reproducibility and transparency in the published dermatology literature.
Journal and Publication Selection
On June 6, 2019, one author (DT) searched the National Library of Medicine (NLM) catalog for journals in the field of dermatology using the subject terms tag “Dermatology [ST].” To be included, journals had to be (1) MEDLINE indexed and (2) published in the English language. One investigator (DT) used the electronic ISSN to extract the list of journals. The same journal search string of ISSNs was then used in PubMed on June 7, 2019, to collect all publications published between January 1, 2014, and December 31, 2018. A random sample of 300 publications was selected for our analysis using Excel’s random number function. Our search string and the complete list of publications returned from our search are available for reference [
].
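As a rough illustration (ours, not the authors' Excel workflow), the same kind of simple random sample can be drawn in Python; the file name pubmed_result.csv and the PMID column are hypothetical stand-ins for the exported PubMed results:

```python
import csv
import random

# Hypothetical export of the PubMed search results, one row per publication;
# the file name and "PMID" column are illustrative assumptions.
with open("pubmed_result.csv", newline="", encoding="utf-8") as f:
    publications = [row["PMID"] for row in csv.DictReader(f)]

random.seed(2019)  # a fixed seed makes the draw itself reproducible
sample = random.sample(publications, k=300)  # 300 publications, without replacement

print(f"Sampled {len(sample)} of {len(publications)} publications")
```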
Data Extraction
Before data extraction, 2 investigators (MA and AN) completed training (conducted by DT) to ensure reliability between investigators. This training session (which was recorded and is available for reference [
]) involved reviewing study objectives, study design, study protocol, and the data extraction form. After completion of training, MA and AN extracted data from the 300 randomly sampled publications in a blinded and independent manner. Data extraction began on June 10, 2019, and concluded on June 30, 2019. Investigators held a final consensus meeting to resolve any discrepancies. DT was available for adjudication, if necessary. Publications were separated into 2 categories: (1) those that contained empirical data and (2) those that lacked empirical data. Our dataset is available on a Web-based repository [ ].
Specific Indicators of Reproducibility and Transparency
A pilot-tested Google Form similar to that created by Hardwicke et al [
] was used for data extraction. This form prompted investigators to identify the presence of prespecified indicators considered necessary to reproduce a study [ ]. Information extracted from each publication varied according to the study design. Studies with empirical data were assessed for the following indicators: materials availability, data availability, analysis scripts, protocol, preregistration, conflict of interest (COI) statement, funding statement, and open access. Nonempirical studies were only assessed for the presence of 3 indicators: COI statement, funding statement, and open access. Furthermore, despite case reports and case series often providing empirical data, previous studies have demonstrated that key methodological information needed to reproduce these study types is commonly absent or insufficient [ ]. Thus, we decided to omit these study types from certain assessments. The table below details the 8 queried indicators of reproducibility and transparency, their importance, and a description of the study designs included in each analysis.

| Indicators of reproducibility and transparency | Study types included for analysis of reproducibility indicator | Usefulness for reproducing the medical literature |
| --- | --- | --- |
| Materials available | Empirical studies^a | Having access to all materials (eg, stimuli, survey instruments, and computer code/software used for data collection or running experiments) increases the feasibility by which researchers are able to replicate a study using identical methodology |
| Raw data | Empirical studies^b | Sharing of data in their unaltered, digital form facilitates validation of study outcomes and helps prevent forms of bias, such as selective outcome reporting |
| Analysis scripts available | Empirical studies^b | Having access to well-documented, step-by-step instructions detailing data preparation and analysis can help to increase the clarity of data interpretation. In addition, thorough analysis scripts can help limit inadvertent computations and misrepresentation of study findings in replication studies |
| Protocol available | Empirical studies^b | To completely and accurately reproduce a study, the full protocol must be available in its entirety. Slight alterations to the original study protocol have the potential to influence study outcomes, thereby hindering reproducibility |
| Preregistration | Empirical studies^b | Publications restricted behind a paywall contribute to the irreproducible environment of biomedical research. One way to circumvent this obstacle is through study preregistration. Making study methods, hypotheses, and analysis scripts available could potentially help increase the transparency of biomedical research while simultaneously mitigating reporting bias, data dredging, and p-hacking |
| Disclosure of conflicts of interest | All eligible studies^c | Disclosure of authors’ financial conflicts of interest might help facilitate the publication of the most robust and unbiased research possible |
| Funding source | All eligible studies^c | Funding sources help make costly study designs possible by providing resources to conduct experiments. The transparency of biomedical research is enhanced by disclosure of funding sources |
| Open access | All studies included in random sample^d | Open access increases the availability of pertinent information for study reproduction. Failing to make available complete records of the study’s protocol, data, and analyses hinders a comprehensive evaluation of the given study |

^a Empirical studies refers to studies with empirical data, including clinical trial, cohort, case control, chart review, and cross-sectional designs; even though case studies and case series often include empirical data, this category excludes these study types owing to the inherent difficulty surrounding their reproduction, as discussed by Wallach et al [ ]. Meta-analyses and commentaries were also excluded from this analysis as materials are not typically included (n=114).
^b Empirical studies (clinical trial, cohort, case control, secondary analysis, chart review, commentary [with data analysis], and cross-sectional designs), excluding case reports and case series. Meta-analyses were included in this analysis (n=127).
^c All empirical and nonempirical studies were included in this analysis (n=280).
^d All publications included in the random sample were included in this analysis (n=300).
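To make the branching assessment concrete, the record below is a minimal sketch of how one extraction entry could be structured; the type and field names are ours and mirror the 8 indicators rather than the Google Form's actual wording:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtractionRecord:
    """One publication's responses; None marks indicators not assessed
    for that study design (eg, nonempirical studies)."""
    materials_available: Optional[bool]
    raw_data_available: Optional[bool]
    analysis_scripts_available: Optional[bool]
    protocol_available: Optional[bool]
    preregistered: Optional[bool]
    coi_statement: bool        # assessed for all eligible studies
    funding_statement: bool    # assessed for all eligible studies
    open_access: bool          # assessed for all 300 sampled publications

# A nonempirical publication is scored only on the last 3 indicators:
editorial = ExtractionRecord(None, None, None, None, None,
                             coi_statement=True,
                             funding_statement=False,
                             open_access=True)
```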
Assessing Open Access
We employed a systematic process to determine the public’s ability to access full-text PDF versions of publications included in our sample. First, a search using the publication’s title, digital object identifier, and/or PubMed ID on Open Access Button [
] was performed. If this search yielded no return, investigators then performed the same search using Google Scholar and PubMed. Publications were determined to be inaccessible and paywall restricted if a full-text version was unobtainable.
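This classification reduces to a short fallback cascade; here is a minimal sketch under the assumption that each manual search is modeled as a function returning a full-text URL or None (these helpers are hypothetical, not an actual Open Access Button or PubMed API client):

```python
from typing import Callable, Optional

SearchFn = Callable[[str], Optional[str]]  # publication ID -> full-text URL or None

def classify_access(pub_id: str, steps: list[tuple[str, SearchFn]]) -> str:
    """Try each source in order (Open Access Button, then Google Scholar,
    then PubMed); label the publication paywall restricted only if every
    step fails to locate a full-text version."""
    for name, search in steps:
        if search(pub_id) is not None:
            return f"accessible via {name}"
    return "paywall restricted"
```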
Attempts at Replication and Citation in Research Synthesis
To evaluate whether a publication with empirical data was cited in a systematic review and/or meta-analysis, we used Web of Science [
], following previous studies [ , , ]. We determined the citing publications to be either a replication study or a meta-analysis or systematic review by individually screening the title, abstract, or the full text when necessary.
Statistical Analysis
We presented outcomes as percentages with associated 95% CIs, calculated using the Wilson binomial proportion confidence interval method. Descriptive statistics, medians, and upper and lower quartiles were reported using functions available in Microsoft Excel.
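For reference, the Wilson interval is a closed-form calculation; the sketch below (ours, not the authors' Excel formulas) reproduces the interval reported later for data availability statements (14 of 127; 6.7%-17.7%):

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion (z=1.96 for a 95% CI)."""
    p = successes / n
    center = p + z**2 / (2 * n)
    radius = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    denom = 1 + z**2 / n
    return (center - radius) / denom, (center + radius) / denom

low, high = wilson_ci(14, 127)
print(f"{14/127:.1%} (95% CI {low:.1%}-{high:.1%})")  # 11.0% (95% CI 6.7%-17.7%)
```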
Results
Our search of the NLM catalog returned 100 dermatology journals. In all, 46 of these journals met the inclusion criteria and accounted for 46,615 publications from 2014 to 2018. Data were extracted from a random sample of 300 publications. A total of 280 were deemed eligible and accessible, whereas the remaining 20 were inaccessible.
Sample Characteristics
Our final analysis of 280 dermatology publications included 127 publications (45.4%) with empirical data from reproducible study designs and 153 publications (54.6%) that lacked empirical data or were inherently difficult to reproduce. The median 5-year journal impact factor was 2.719. Journal impact factors were inaccessible for 21 publications.
The 2 tables below provide additional characteristics for our sample of dermatology publications.

| Characteristics | Value, n (%) |
| --- | --- |
| Study design^a | |
| Publications with nonempirical data | 69 (24.6) |
| Meta-analysis | 9 (3.2) |
| Commentary with reanalysis | 4 (1.4) |
| Cost effectiveness | 0 (0.0) |
| Clinical trial | 14 (5.0) |
| Case study | 68 (24.3) |
| Case series | 16 (5.7) |
| Cohort | 17 (6.1) |
| Case control | 0 (0.0) |
| Survey | 8 (2.9) |
| Laboratory | 53 (18.9) |
| Multiple | 0 (0.0) |
| Other | 22 (7.9) |
| Funding source^a | |
| University | 6 (2.1) |
| Hospital | 0 (0.0) |
| Public | 19 (6.8) |
| Private/industry | 22 (7.9) |
| Nonprofit | 6 (2.1) |
| No funding statement listed | 125 (44.6) |
| No external funding received | 77 (27.5) |
| Mixed | 25 (9.0) |
| Test subjects^a | |
| Animals | 11 (3.9) |
| Humans | 178 (63.6) |
| Both | 0 (0.0) |
| Neither | 91 (32.5) |
| Country of journal publication^a | |
| United States | 233 (83.2) |
| Japan | 0 (0.0) |
| United Kingdom | 8 (2.9) |
| France | 11 (3.9) |
| India | 6 (2.1) |
| Canada | 1 (0.4) |
| Other^b | 21 (7.5) |
| Country of corresponding author^a | |
| United States | 75 (26.8) |
| China | 9 (3.2) |
| United Kingdom | 9 (3.2) |
| Germany | 16 (5.7) |
| Japan | 26 (9.3) |
| France | 12 (4.3) |
| Canada | 5 (1.8) |
| Italy | 11 (3.9) |
| India | 10 (3.6) |
| Spain | 16 (5.7) |
| Other^c | 91 (32.5) |

^a All empirical and nonempirical studies included in this study (n=280); nonempirical designs included editorials, commentaries (without reanalysis), simulations, news, and reviews.
^b Brazil, Ireland, New Zealand, and Switzerland.
^c Argentina, Australia, Austria, Belgium, Brazil, Croatia, Czech Republic, Denmark, Hungary, Iran, Ireland, Israel, Netherlands, Nigeria, Pakistan, Poland, Portugal, Scotland, Singapore, Slovakia, South Korea, Sweden, Switzerland, Taiwan, Turkey, and Ukraine.
| Characteristics and Google Form response | Response rate, n (%) | 95% CI |
| --- | --- | --- |
| Data availability statement (n=127) | | |
| Data availability statement provided, the data (or some of the data) are available | 14 (11.0) | 6.7-17.7 |
| Data availability statement provided, the statement declares the data are not available | 0 (0.0) | 0.0-0.0 |
| No data availability statement provided | 113 (89.0) | 82.4-93.3 |
| Means by which additional data are available (n=14) | | |
| Personal/institutional website | 1 (7.1) | —^a |
| Supplementary information hosted by the journal | 12 (85.7) | — |
| Online third-party repository | 0 (0.0) | — |
| Upon request from the corresponding author(s) | 1 (7.1) | — |
| Accessibility of additional data (n=14) | | |
| All data files were successfully accessed and downloaded | 11 (78.6) | — |
| One or more data files could not be accessed or downloaded | 3 (21.4) | — |
| Data files containing all raw numerical data | 3 (21.4) | — |
| Data files without all raw numerical data | 8 (57.1) | — |
| Materials availability statement (n=114) | | |
| Materials availability statement provided, some materials are available | 23 (20.2) | 13.8-28.5 |
| Materials availability statement provided, materials are not available | 0 (0.0) | 0.0-0.0 |
| No materials availability statement provided | 91 (79.8) | 71.5-86.2 |
| Means by which supplemental materials are available (n=23) | | |
| Personal/institutional website | 0 (0.0) | — |
| Supplementary information hosted by the journal | 23 (100) | — |
| Online third party | 0 (0.0) | — |
| Upon request from the corresponding author(s) | 0 (0.0) | — |
| Accessibility of additional materials (n=23) | | |
| Materials availability statement provided, all supplemental materials were accessible | 21 (91.3) | — |
| Materials availability statement provided, but the materials were not accessible | 2 (8.7) | — |
| Protocol availability statement (n=127) | | |
| Protocol availability statement provided | 3 (2.4) | 0.8-6.7 |
| No protocol availability statement provided | 124 (97.6) | 93.3-99.2 |
| Accessibility of additional protocols (n=3) | | |
| Full protocol was available using provided link | 3 (100) | — |
| Full protocol was not available using provided link | 0 (0.0) | — |
| Hypotheses were included in the linked protocol | 0 (0.0) | — |
| Methods were included in the linked protocol | 3 (100) | — |
| Analysis plans were included in the linked protocol | 3 (100) | — |
| Analysis script availability statement (n=127) | | |
| Analysis script statement provided, declares that the analysis scripts (or some of the analysis scripts) are available | 1 (0.8) | 0.1-4.3 |
| Analysis script statement provided, declares that the analysis scripts are not available | 0 (0.0) | 0.0-0.0 |
| No analysis script statement provided | 126 (99.2) | 95.7-99.9 |
| Preregistration statement (n=127) | | |
| Statement provided, declaring the study was preregistered | 3 (2.4) | 0.8-6.7 |
| Statement provided, declaring the study was not preregistered | 0 (0.0) | 0.0-0.0 |
| No preregistration statement provided | 124 (97.6) | 93.3-99.2 |
| Accessibility of publication registration (n=3) | | |
| Preregistration was accessible | 3 (100) | — |
| Preregistration was not accessible | 0 (0.0) | — |
| Number of studies preregistered on ClinicalTrials.gov | 2 (66.7) | — |
| Number of studies preregistered on GlaxoSmithKline Clinical Study Register (gsk-clinicalstudyregister.com) | 1 (33.3) | — |
| Conflicts of interest statement (n=280) | | |
| Disclosure statement provided, author(s) declare one or more conflicts of interest | 30 (10.7) | 7.6-14.9 |
| Disclosure statement provided, author(s) declare that there are no conflicts of interest | 203 (72.5) | 67.0-77.4 |
| No conflicts of interest statement provided | 47 (16.8) | 12.9-21.6 |
| Open access (n=300) | | |
| Publication found via Open Access Button (openaccessbutton.org) | 65 (21.7) | 17.4-26.7 |
| Publication found via Google Scholar and/or PubMed | 136 (45.3) | 39.8-51.0 |
| Publication determined to be paywall restricted | 99 (33.0) | 27.9-38.5 |

^a Not applicable.
Eight Indicators of Reproducibility and Transparency
Among the 280 eligible publications, 201 (71.8%) were publicly available, whereas the remaining 79 (28.2%) were only available through a paywall. We classified the 20 publications for which full-text PDF versions were unattainable as being paywall restricted. Thus, a total of 99 publications (of 300, 33.0%) were classified as being unavailable to the public. Only 23 publications (of 114, 20.2%) provided a statement indicating that additional materials were available. Only 3 publications (of 127, 2.4%) provided a protocol availability statement. All 3 of these statements provided a valid link to a Web-based protocol. Almost all publications lacked data availability statements. A total of 14 publications (of 127, 11.0%) included data availability statements; however, only 11 of these data statements were linked to supplemental data files. Of the 11 accessible supplemental data files, only 3 provided access to complete and unmodified raw datasets. In addition, only 1 publication (of 127, 0.8%) provided an analysis script or code. Our analysis revealed only 3 publications (of 127, 2.4%) were prospectively registered. A total of 233 publications (of 280, 83.2%) provided a COI statement. Of these 280 publications, 30 (10.7%) indicated that 1 or more authors had a COI, and 203 (72.5%) declared that the author(s) did not have a COI. The remaining 47 publications (of 280, 16.8%) failed to provide a COI statement. Furthermore, 155 publications (of 280, 55.4%) provided a funding statement, whereas the remaining 125 (44.6%) did not. Finally, 23 publications (of 114, 20.2%) included in our analysis were cited in a subsequent data synthesis or review paper (see the table below
). No publication included in our analysis was cited in a replication study.

| Citation frequency | Value, n (%) |
| --- | --- |
| No citation | 91 (79.8) |
| A single citation | 15 (13.2) |
| 2 to 5 citations | 8 (7.0) |
| Greater than 5 citations | 0 (0.0) |
Discussion
Principal Findings
Our findings suggest that the current climate of dermatology research does not encourage reproducible and transparent research practices. Few studies provided access to datasets, analysis scripts, or complete study protocols. These findings are congruent with previous reports that found that studies often fail to promote transparent and reproducible research practices [
], and they align with a study published in Nature that found that 90% of more than 1500 researchers agreed that biomedical science is facing a significant reproducibility crisis [ ]. This environment of poor research practice is problematic for clinicians and researchers who might seek to validate or reproduce a study in its entirety. As scientists and clinicians continue to make medical advances, studies must be readily reproducible to ensure proper validation of results and to allow for sustained progression in clinical practice. In the following text, we describe 2 practices in the field of dermatology—study protocols and preregistration—that were commonly omitted by researchers. We follow with actionable recommendations for research funders, journals, and researchers that, if implemented successfully, might help better the climate of reproducible research in published dermatology literature.
Most studies included in our sample did not provide additional materials or complete study protocols. Precisely outlining methodology is essential for study reproducibility [
], whether this information is provided within the publication or in supplementary materials [ ]. The Journal of the American Academy of Dermatology’s (JAAD) instructions to authors state, “submissions of research articles should be accompanied by a supplementary document that includes the protocol and statistical analysis plan; this should be labeled ‘For editor/reviewer reference only’ and is not for publication” (emphasis ours) [ ]. The British Journal of Dermatology (BJD) author guidelines state, “The editorial team has found that providing the study protocol facilitates acceptance of the paper if it is available. Therefore, the BJD encourages submission of the protocol at the time of manuscript submission, with the protocol identified as a ‘Supplementary file for review.’ Submission of the trial protocol is also strongly encouraged for industry-sponsored trials.” [ ] JAMA Dermatology guidance states, “authors of manuscripts reporting clinical trials must submit trial protocols (including the complete statistical analysis plan) along with their manuscripts… and that if the manuscript is accepted, the protocol and statistical analysis plan will be published as a supplement” [ ]. The widespread variability in guidance provided by these 3 prominent dermatology journals—which ranges from nonpublication of study protocols by JAAD to protocol publication upon article acceptance by JAMA Dermatology—suggests differing views toward implementing reproducible research practices within the field. BJD does not require protocol submission but simply encourages it. As journals are the final arbiters of studies that move on to publication, they have a high degree of influence on the climate of reproducibility and transparency in dermatology research. We highly recommend that dermatology journals adopt stronger requirements for submitting authors to promote greater transparency and reproducibility.
According to the Food and Drug Administration Amendments Act, established in 2007, all applicable RCTs must be registered before participant enrollment [
]. Although the number of preregistered RCTs has increased, other study designs have not shown as much improvement. Boccia et al found that only 1109 cancer observational studies were registered on ClinicalTrials.gov across an 11-year period [ ]. In addition, systematic reviews have a preregistration platform, the International Prospective Register of Systematic Reviews (PROSPERO), which has increased in usage exponentially since its inception in 2011 [ ]. These study designs are preregistered solely at the authors’ discretion, with few journals or funders having concrete guidance on the subject. Of the 3 journals discussed above, only BJD mentions registering systematic reviews, stating that authors are required to preregister on PROSPERO [ ]. Transparent research practices such as prospective registration can help mitigate unethical research practices by providing access to date-stamped protocol details and informing the public about current clinical trials being performed [ ]. For example, p-hacking (using different statistical analyses until a nonsignificant finding is found to be significant) [ ] and HARKing (forming study hypotheses after results have been calculated) [ ] might be avoided if investigators disclose the expected statistical analyses that will be used throughout the study before its commencement. It should be noted that HARKing can be beneficial to the scientific process by generating important discoveries during post hoc analyses [ - ]. In addition, previous studies have shown that reviewers often encourage authors to add hypotheses post hoc as part of the peer review process [ ]. However, the crossover into research misconduct occurs when authors contend that these post hoc hypotheses were part of the original study design, thereby potentially decreasing the confidence of statistically significant outcomes [ ].
Future Recommendations
Changes to the landscape of dermatology research are warranted; however, the optimal framework for implementing them is unclear. Here, we offer recommendations for research stakeholders—including funding agencies, journals, and researchers—that, if implemented successfully, may help increase the quality of reproducible research practices in dermatology.
With respect to funding, some foundations and governmental agencies have established measures to promote reproducibility and transparency of the research they fund. A nonexhaustive list of these funders includes the National Institutes of Health (NIH), the National Science Foundation, the Wellcome Trust, and the Bill and Melinda Gates Foundation. As one example, the Gates Foundation, which funds approximately 2000 to 2500 research articles per year totaling US $5 billion [
], has established an open access policy requiring that all research data and manuscripts resulting from its funds be promptly and broadly disseminated [ ]. To further its goals for widespread dissemination, the foundation has launched its own open access journal, Gates Open Research. Currently, research funded by the foundation is not eligible for publication in some of the world’s most renowned journals, such as Nature, Science, Proceedings of the National Academy of Sciences, and New England Journal of Medicine, owing to these funding restrictions [ ]. The NIH has established the Rigor and Reproducibility Initiative, embedding requirements that submitted grant applications outline strategies for more reproducible research [ ]. Strategies such as these are the first steps toward adoption of more transparent and reproducible research practices.
For journals, we recommend consideration of adopting stricter standards on the disclosure of study materials, raw datasets, protocols, and analysis scripts. Journals should consider requiring that authors share all study materials on public repositories, such as Open Science Framework. With essential study materials publicly available, outcomes may be reproduced and validated with greater ease. A recent survey found that open access to study data increased the public’s trust and confidence in research outcomes [
]. Depositing all study materials and data before publication may increase the public’s faith and confidence in the literature published in journals with such requirements.
Finally, for researchers, we believe a need exists to train and equip principal investigators to adopt more reproducible and transparent research practices. This goal may be best accomplished through continuing education, academic conferences, webinars, and journal clubs. A need also exists to train and equip the next generation of scientists. Given the apprenticeship nature of many biomedical laboratories, principal investigators should take the lead in fostering such cultures within their laboratories and instilling such practices with mentees. Courses on open science are being developed across the country, many posted on the Open Science Framework [
]. The National Institute of General Medical Sciences has posted several Web-based training modules to increase the overall rigor and reproducibility of medical research [ ]. As these courses continue to expand at universities and with funders, continued development and uptake of such training may help remedy the scant reporting of reproducibility and transparency indicators in the dermatology literature.
Strengths and Limitations
Our study has many strengths, but some limitations are present. Regarding strengths, all materials, protocols, analysis plans, and raw data from our study are publicly available on Open Science Framework. In addition, we implemented numerous measures to ensure the reliability of study outcomes by (1) using a blinded, double data extraction technique—the gold standard for meta-research practices [
] and (2) providing thorough training of each investigator to ensure reliability of results between investigators. Regarding limitations, data extraction was limited to the content of the full-text PDFs and available supplemental materials for each publication. Additional materials may be attainable by contacting the corresponding author. Furthermore, this study focused specifically on publications in dermatology journals. Thus, the results from this study may not be generalizable to other fields or years of publication. For the aforementioned reasons, our findings should be interpreted as a lower-bound estimate of the reproducibility of publications in dermatology journals.
In conclusion, the rate of disclosure of study materials, data, protocols, and analysis scripts in sampled dermatology publications is unacceptably low. Without implementing and adhering to more robust reporting standards and open science practices, reproducibility-related factors of dermatologic research may remain poor.
Acknowledgments
This study was funded through the 2019 Presidential Research Fellowship Mentor—Mentee Program at the Oklahoma State University Center for Health Sciences.
Conflicts of Interest
None declared.
References
- Baker M. 1,500 scientists lift the lid on reproducibility. Nature 2016 May 26;533(7604):452-454. [CrossRef] [Medline]
- Freedman LP, Cockburn IM, Simcoe TS. The economics of reproducibility in preclinical research. PLoS Biol 2015 Jun;13(6):e1002165 [FREE Full text] [CrossRef] [Medline]
- Salman R, Beller E, Kagan J, Hemminki E, Phillips RS, Savulescu J, et al. Increasing value and reducing waste in biomedical research regulation and management. Lancet 2014 Jan 11;383(9912):176-185 [FREE Full text] [CrossRef] [Medline]
- Center for Open Science. Reproducibility Project: Cancer Biology URL: https://cos.io/rpcb/ [accessed 2019-06-28]
- Baker M, Dolgin E. Cancer reproducibility project releases first results. Nature 2017 Jan 18;541(7637):269-270. [CrossRef] [Medline]
- Berger MF, Hodis E, Heffernan TP, Deribe YL, Lawrence MS, Protopopov A, et al. Melanoma genome sequencing reveals frequent PREX2 mutations. Nature 2012 May 9;485(7399):502-506 [FREE Full text] [CrossRef] [Medline]
- Munafò MR, Nosek BA, Bishop DV, Button KS, Chambers CD, Percie du Sert N, et al. A manifesto for reproducible science. Nat Hum Behav 2017 Jan 10;1(1):0021. [CrossRef]
- Hardwicke TE, Wallach JD, Kidwell M, Ioannidis J. An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014-2017). MetaArXiv 2019 [FREE Full text] [CrossRef]
- Wallach JD, Boyack KW, Ioannidis JP. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015-2017. PLoS Biol 2018 Nov;16(11):e2006930 [FREE Full text] [CrossRef] [Medline]
- Nankervis H, Baibergenova A, Williams HC, Thomas KS. Prospective registration and outcome-reporting bias in randomized controlled trials of eczema treatments: a systematic review. J Invest Dermatol 2012 Dec;132(12):2727-2734 [FREE Full text] [CrossRef] [Medline]
- Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet 2009 Jul 4;374(9683):86-89. [CrossRef] [Medline]
- Weber WE, Merino JG, Loder E. Trial registration 10 years on. Br Med J 2015 Jul 6;351:h3572. [CrossRef] [Medline]
- Cook C, Checketts JX, Atakpo P, Nelson N, Vassar M. How well are reporting guidelines and trial registration used by dermatology journals to limit bias? A meta-epidemiological study. Br J Dermatol 2018 Jun;178(6):1433-1434. [CrossRef] [Medline]
- Peng RD, Dominici F, Zeger SL. Reproducible epidemiologic research. Am J Epidemiol 2006 May 1;163(9):783-789. [CrossRef] [Medline]
- Warren E. Strengthening research through data sharing. N Engl J Med 2016 Aug 4;375(5):401-403. [CrossRef] [Medline]
- Groves T, Godlee F. Open science and reproducible research. Br Med J 2012 Jun 26;344:e4383. [CrossRef] [Medline]
- Taichman DB, Sahni P, Pinborg A, Peiperl L, Laine C, James A, et al. Data sharing statements for clinical trials: a requirement of the international committee of medical journal editors. Ethiop J Health Sci 2017 Jul;27(4):315-318 [FREE Full text] [Medline]
- Scherer RW, Langenberg P, von Elm E. Full publication of results initially presented in abstracts. Cochrane Database Syst Rev 2007 Apr 18(2):MR000005. [CrossRef] [Medline]
- Nuzzo R. How scientists fool themselves - and how they can stop. Nature 2015 Oct 8;526(7572):182-185. [CrossRef] [Medline]
- Iqbal SA, Wallach JD, Khoury MJ, Schully SD, Ioannidis JP. Reproducible research practices and transparency across the biomedical literature. PLoS Biol 2016 Jan;14(1):e1002333 [FREE Full text] [CrossRef] [Medline]
- OSF.io. 2019 Jun 7. Dermatology Reproducibility Project protocol URL: https://osf.io/qang3/ [accessed 2019-10-01]
- eCFR — Code of Federal Regulations. 2018 Jul 19. Electronic Code of Federal Regulations URL: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=83cd09e1c0f5c6937cd9d7513160fc3f&pitd=20180719&n=pt45.1.46&r=PART&ty=HTML#se45.1.46_1102 [accessed 2019-11-04]
- Murad MH, Wang Z. Guidelines for reporting meta-epidemiological methodology research. Evid Based Med 2017 Aug;22(4):139-142 [FREE Full text] [CrossRef] [Medline]
- PRISMA. URL: http://www.prisma-statement.org/ [accessed 2019-06-29]
- OSF.io. 2019 Jun 7. pubmed_result URL: https://osf.io/c7a4t/ [accessed 2019-10-01]
- OSF.io. 2019 Jun 7. Data Extraction Video URL: https://osf.io/jczx5/ [accessed 2018-10-01]
- OSF.io. 2019 Jun 7. Reproducibility Data - Dermatology URL: https://osf.io/ev38h/ [accessed 2019-10-01]
- OSF.io. 2019 Jun 7. Collection Form - Google Forms URL: https://osf.io/3nfa5/ [accessed 2019-10-01]
- Open Access Button. URL: https://openaccessbutton.org/ [accessed 2019-08-31]
- Web of Science. URL: http://webofknowledge.com [accessed 2019-08-31]
- Niven DJ, McCormick TJ, Straus SE, Hemmelgarn BR, Jeffs L, Barnes TR, et al. Reproducibility of clinical research in critical care: a scoping review. BMC Med 2018 Feb 21;16(1):26 [FREE Full text] [CrossRef] [Medline]
- Goodman SN, Fanelli D, Ioannidis JP. What does research reproducibility mean? Sci Transl Med 2016 Jun 1;8(341):341ps12. [CrossRef] [Medline]
- Elsevier. Guide for Authors URL: https://www.elsevier.com/journals/journal-of-the-american-academy-of-dermatology/0190-9622/guide-for-authors [accessed 2019-08-08]
- Wiley Online Library. British Journal of Dermatology Author Guidelines URL: https://onlinelibrary.wiley.com/page/journal/13652133/homepage/forauthors.html [accessed 2019-08-03]
- JAMA Network. Instructions for Authors URL: https://jamanetwork.com/journals/jamadermatology/pages/instructions-for-authors [accessed 2019-08-08]
- Boccia S, Rothman KJ, Panic N, Flacco ME, Rosso A, Pastorino R, et al. Registration practices for observational studies on ClinicalTrials.gov indicated low adherence. J Clin Epidemiol 2016 Feb;70:176-182. [CrossRef] [Medline]
- Page MJ, Shamseer L, Tricco AC. Registration of systematic reviews in PROSPERO: 30,000 records and counting. Syst Rev 2018 Feb 20;7(1):32 [FREE Full text] [CrossRef] [Medline]
- Zarin DA, Tse T, Williams RJ, Rajakannan T. Update on trial registration 11 years after the ICMJE policy was established. N Engl J Med 2017 Jan 26;376(4):383-391 [FREE Full text] [CrossRef] [Medline]
- Head ML, Holman L, Lanfear R, Kahn AT, Jennions MD. The extent and consequences of p-hacking in science. PLoS Biol 2015 Mar;13(3):e1002106 [FREE Full text] [CrossRef] [Medline]
- Kerr NL. HARKing: hypothesizing after the results are known. Pers Soc Psychol Rev 1998;2(3):196-217. [CrossRef] [Medline]
- Aguinis H, Vandenberg RJ. An ounce of prevention is worth a pound of cure: improving research quality before data collection. Annu Rev Organ Psychol Organ Behav 2014 Mar 21;1(1):569-595 [FREE Full text] [CrossRef]
- Bamberger P, Ang S. The quantitative discovery: what is it and how to get it published. Acad Manag Disc 2016;2(1):1-6. [CrossRef]
- Locke EA. The case for inductive theory building. J Manag 2007;33(6):867-890. [CrossRef]
- Bedeian AG, Taylor SG, Miller AN. Management science on the credibility bubble: cardinal sins and various misdemeanors. Acad Manag Learn Educ 2010 Dec 1;9(4):715-725. [CrossRef]
- Murphy KR, Aguinis H. HARKing: How badly can cherry-picking and question trolling produce bias in published results? J Bus Psychol 2019 Feb;34(1):1-17. [CrossRef]
- Bill & Melinda Gates Foundation. Foundation Fact Sheet URL: https://www.gatesfoundation.org/who-we-are/general-information/foundation-factsheet [accessed 2019-08-12]
- Bill & Melinda Gates Foundation. Open Access Policy URL: https://www.gatesfoundation.org/How-We-Work/General-Information/Open-Access-Policy/Page-2 [accessed 2019-08-12]
- van Noorden R. Gates Foundation research can't be published in top journals. Nature 2017 Jan 13;541(7637):270. [CrossRef] [Medline]
- National Institutes of Health. Rigors and Reproducibility URL: https://www.nih.gov/research-training/rigor-reproducibility [accessed 2019-08-12]
- Pew Research Center. Trust and Mistrust in Americans’ Views of Scientific Experts URL: https://www.pewresearch.org/science/2019/08/02/trust-and-mistrust-in-americans-views-of-scientific-experts/ [accessed 2019-08-08]
- Open Science Framework. URL: https://osf.io/ [accessed 2019-08-31]
- National Institute of General Medical Sciences. 2015. Clearinghouse for Training Modules to Enhance Data Reproducibility URL: https://www.nigms.nih.gov/training/pages/clearinghouse-for-training-modules-to-enhance-data-reproducibility.aspx [accessed 2019-08-12]
- Higgins J, Green S. Cochrane Handbook for Systematic Reviews of Interventions. Chichester, UK: Wiley; 2006.
Abbreviations
BJD: British Journal of Dermatology
JAAD: Journal of the American Academy of Dermatology
NIH: National Institutes of Health
NLM: National Library of Medicine
PROSPERO: International Prospective Register of Systematic Reviews
RCT: randomized controlled trial
Edited by G Eysenbach; submitted 31.08.19; peer-reviewed by T Hardwicke; comments to author 01.10.19; revised version received 07.10.19; accepted 20.10.19; published 07.11.19
Copyright©J Michael Anderson, Andrew Niemann, Austin L Johnson, Courtney Cook, Daniel Tritz, Matt Vassar. Originally published in JMIR Dermatology (http://derma.jmir.org), 07.11.2019.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Dermatology, is properly cited. The complete bibliographic information, a link to the original publication on http://derma.jmir.org, as well as this copyright and license information must be included.