Background

Open Science is an umbrella term for making the methodologies, datasets, analyses, and results of research publicly accessible for anyone to use freely [1, 2]. The term began appearing frequently in the early 2010s, when researchers noticed that they were unable to replicate or reproduce prior work within a discipline [3]. There was also often considerable ambiguity about what process was followed to conduct a study, or whether a specific material was used but never clearly defined. Open science consequently gained traction as a way to provide greater context, robustness, and reproducibility, with each subtopic under the term receiving its own formal definition and usage. Adoption accelerated sharply when large-scale studies conducted in the mid-2010s found that numerous works were difficult or impossible to reproduce or replicate in psychology [4] and other disciplines [5].

Several principles are commonly included under open science and its processes: open data, open materials, open methodology, and preregistration. Open Data targets datasets and their documentation, released for public use without restriction, typically under a permissive license or in the public domain [6]. Not all data can be openly released (for example, data containing personally identifiable information), but protected-access specifications allow anonymized datasets to be released or provide a method for obtaining the raw dataset itself. Open Materials is similar but targets tools, source code, and their documentation [7]. In software development this is often treated as synonymous with Open Source, but "materials" covers not only the source code but also the available, free-to-use technologies around it. Open Methodology defines the full workflow and processes used to conduct the research, including how participants were recruited, what they were told, how the collected data were analyzed, and what the final results were [1]. Methodologies typically expand upon the original paper, covering technical details that would not fit the paper format. Finally, Preregistration acts as an initial methodology drafted before the start of an experiment, defining the research process without knowledge of its outcomes [8, 9]. Preregistrations can also be updated, or new ones created, to preserve both the initially planned experiment and its development as more context emerges.

Open science principles and reproducibility metrics are becoming more commonplace within numerous scientific disciplines. Within many subfields of educational technology, such as learning analytics, however, these principles and metrics remain neglected or only sparsely considered [10]. Some subfields of educational technology have taken the initiative to introduce open science principles (special education [11], gamification [12], education research [13]); others have seen little to no adoption. Concerns and inexperience, ranging from what can be made publicly available to how to reproduce another's work, are among the reasons researchers may avoid or postpone engaging with open science and reproducibility. On the other hand, the lack of such discussion can lead to tedious, repetitive communication over datasets and materials, or contribute to a reproducibility crisis [5] within the field of study. As such, there is a need for accessible resources and guidance on open science: what it is, how it can be used, and how to mitigate potential issues that may arise within one's work at a later date.


  1. Kraker, P., Leony, D., Reinhardt, W., & Beham, G. (2011). The case for an open science in technology enhanced learning. International Journal of Technology Enhanced Learning 3(6), 643–654. https://doi.org/10.1504/IJTEL.2011.045454 

  2. Vicente-Saez, R. & Martinez-Fuentes, C. (2018). Open Science now: A systematic literature review for an integrated definition. Journal of Business Research 88, 428–436. https://doi.org/10.1016/j.jbusres.2017.12.043 

  3. Spellman, B. (2015). A Short (Personal) Future History of Revolution 2.0. Perspectives on Psychological Science 10(6), 886–899. https://doi.org/10.1177/1745691615609918 

  4. Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science 349(6251), aac4716. https://doi.org/10.1126/science.aac4716 

  5. Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature 533(7604), 452–454. https://doi.org/10.1038/533452a 

  6. Murray-Rust, P. (2008). Open data in science. Nature Precedings 1(1), 1. https://doi.org/10.1038/npre.2008.1526.1 

  7. Johnson-Eilola, J. (2002). Open Source Basics: Definitions, Models, and Questions. In Proceedings of the 20th Annual International Conference on Computer Documentation (Toronto, Ontario, Canada) (SIGDOC ’02). Association for Computing Machinery, New York, NY, USA, 79–83. https://doi.org/10.1145/584955.584967 

  8. Nosek, B., Beck, E., Campbell, L., Flake, J., Hardwicke, T., Mellor, D., Veer, A., & Vazire, S. (2019). Preregistration Is Hard, And Worthwhile. Trends in Cognitive Sciences 23(10), 815–818. https://doi.org/10.1016/j.tics.2019.07.009 

  9. Nosek, B., Ebersole, C., DeHaven, A., Mellor, D. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences 115(11), 2600–2606. https://doi.org/10.1073/pnas.1708274114 

  10. Nosek, B. (2022). Making the Most of the Unconference. Presented at the Unconference on Open Scholarship Practices in Education Research. Available at https://osf.io/9k6pd/ 

  11. Cook, B. G., Lloyd, J. W., Mellor, D., Nosek, B. A., & Therrien, W. J. (2018). Promoting open science to increase the trustworthiness of evidence in special education. Exceptional Children, 85(1), 104-118. https://doi.org/10.1177/0741932516637198 

  12. García-Holgado, A., García-Peñalvo, F. J., de la Higuera, C., Teixeira, A., Ehlers, U. D., Bruton, J., ... & Burgos, D. (2020, October). Promoting Open Education Through Gamification in Higher Education: the OpenGame project. In Eighth International Conference on Technological Ecosystems for Enhancing Multiculturality, 399–404. https://doi.org/10.1145/3434780.3436688 

  13. Makel, M. C., Smith, K. N., McBee, M. T., Peters, S. J., & Miller, E. M. (2019). A path to greater credibility: Large-scale collaborative education research. AERA Open, 5(4). https://doi.org/10.1177/2332858419891963