This workshop covers key considerations for preparing and sharing data and software to improve the reproducibility of published results. Participants will have the opportunity to put lessons learned into practice while exploring published data sets, and will examine openly available tools that support reproducibility.
Part I: Introduction
Part II: Reproducibility from the perspective of data creators/curators
- Open Activity 1: Organization & assessment of research materials.
- Compile the following dataset materials in OSF:
- Balanced Journalism Amplifies Minority Positions: A Case Study of the Newspaper Coverage of a Fluoridation Plebiscite (Kiss et al., 2018)
- The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles (Piwowar et al., 2018)
- A review of riverine ecosystem service quantification: research gaps and recommendations (Hanna et al., 2017)
- Article: https://besjournals.onlinelibrary.wiley.com/doi/full/10.1111/1365-2664.13045
- Repo (data & code): https://zenodo.org/record/1013254
- Complete Activity 2: Reproducibility Framework.
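Before compiling materials in OSF (Activity 1), it can help to agree on a folder structure for the collected files. A minimal sketch in shell, assuming a conventional data/code/results split; the folder names here are illustrative (in the spirit of Broman's "Initial steps toward reproducible research"), not something OSF or the workshop prescribes:

```shell
# Hypothetical layout for one dataset's materials before upload to OSF.
# Separating raw from cleaned data makes the processing chain auditable.
mkdir -p project/data/raw project/data/clean project/code project/results

# A top-level README describing each component is the minimum documentation.
touch project/README.md
```

Whatever names are chosen, the key property is that a stranger can locate the raw inputs, the transformation code, and the outputs without asking the authors.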
Part III: Reproducibility from the perspective of data re-users
- Open the following link:
- Is there a relationship between countries’ wealth or spending on schooling and its students’ performance in PISA? (Klajnerok, 2017)
- Open Activity 3: Exploring the repository and the notebook.
- Open Activity 4: Experimenting with the code.
- Bonus activity: Play around with another repository:
- How significant are the public dimensions of faculty work in review, promotion, and tenure documents? (Alperin et al., 2018)
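Activities 3 and 4 involve re-running someone else's notebook, and a first practical check is recording the environment it actually ran in. The helper below is a hypothetical sketch (not part of the workshop materials) of how the Python standard library can capture interpreter and package versions for a reproducibility report:

```python
import platform
from importlib import metadata


def capture_environment(packages):
    """Record interpreter and package versions so a notebook's
    results can later be re-run under comparable conditions."""
    env = {
        "python": platform.python_version(),
        "platform": platform.platform(),
        "packages": {},
    }
    for name in packages:
        try:
            env["packages"][name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            # Package absent from this environment; record the gap explicitly.
            env["packages"][name] = None
    return env


if __name__ == "__main__":
    snapshot = capture_environment(["numpy", "pandas"])
    for pkg, version in snapshot["packages"].items():
        print(f"{pkg}=={version}" if version else f"{pkg} (not installed)")
```

Comparing such a snapshot against a repository's declared dependencies (e.g. a `requirements.txt`) often explains why a rerun diverges from the published results.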
Resources
- Benureau, F. C. Y., & Rougier, N. P. (2018). Re-run, repeat, reproduce, reuse, replicate: Transforming code into scientific contributions. Frontiers in Neuroinformatics, 11, Article 69. https://doi.org/10.3389/fninf.2017.00069
- Boettiger, C. (2015). An introduction to Docker for reproducible research. SIGOPS Operating Systems Review, 49(1), 71–79. https://doi.org/10.1145/2723872.2723882
- Broman, K. (n.d.). Initial steps toward reproducible research. Retrieved September 5, 2019, from https://kbroman.org/steps2rr/
- Clyburn-Sherin, A. (2019). Preparing data and code for reproducible publication using container technology. Workshop at the Research Data Access and Preservation (RDAP) Summit 2019, Coral Gables, FL. Slides retrieved September 5, 2019, from http://bit.ly/rdap-workshop
- Itech Gal. (2018, March 1). Better Python dependency while packaging your project. Python Pandemonium. https://medium.com/python-pandemonium/better-python-dependency-and-package-management-b5d8ea29dff1
- Kitzes, J., Turek, D., & Deniz, F. (Eds.). (2018). The practice of reproducible research: Case studies and lessons from the data-intensive sciences. Oakland, CA: University of California Press. https://www.practicereproducibleresearch.org
- Reproducibility in science: A guide to enhancing reproducibility in scientific results and writing. (n.d.). Retrieved October 15, 2019, from https://ropensci.github.io/reproducibility-guide
- Sawchuk, S. L., & Khair, S. (2021). Computational reproducibility: A practical framework for data curators. Journal of eScience Librarianship, 10(3), 7. https://doi.org/10/gmgkth
- Steeves, V., Rampin, R., & Chirigati, F. (2018). Using ReproZip for reproducibility and library services. IASSIST Quarterly, 42(1), 14. https://doi.org/10/gf9hw5
- Tatman, R., VanderPlas, J., & Dane, S. (2018). A practical taxonomy of reproducibility for machine learning research. Reproducibility in Machine Learning Workshop at ICML 2018, Stockholm, Sweden. http://www.rctatman.com/files/2018-7-14-MLReproducability.pdf
- Wang, H. (2019, February 17). LibGuides: Research reproducibility. Carnegie Mellon University Libraries. Retrieved October 11, 2019, from https://guides.library.cmu.edu/reproducibility/home