Presentations

Author

Steven J. Pierce

Modified

2025-04-25 18:35:21 EDT

This is where I’ll post information about recent or upcoming presentations.

1 CSTAT Webinars

Below is the abstract for my upcoming presentation (Pierce, 2025, October 2) called Reproducible research: Principles, practices, and tools for generating reproducible statistical analyses and reports. This will be the second iteration of a presentation I did earlier this year (Pierce, 2025, March 6). Slides and example files from the first iteration are available in my CSTAT.RR2025 repository, and a video recording is also available. The upcoming session will be mostly the same, though I may update some materials, add some time for discussion, and create a new repository.

Abstract

This seminar will introduce the audience to a set of principles, practices, and free, open-source software tools that enable scientists to generate reproducible statistical analyses and reports. We will cover why reproducibility is important, then offer a vision of how to enhance the reproducibility of your work, with concrete steps you can take to achieve that goal. We will discuss tailoring the degree of reproducibility you aim to achieve for a given project, which may vary due to project context or constraints. In terms of software, we will describe how R, RStudio, Quarto, and TinyTeX comprise a powerful suite of tools that uses dynamic documents to automate the production of fully formatted reports, manuscripts, or slides complete with narrative text, analysis results, figures, tables, and references. Git and GitHub.com add further value through support for version control and collaboration on the source code for dynamic documents. The session will include conceptual content, examples of dynamic documents, and links to supporting resources the audience can use to accelerate learning how to make their work more reproducible.
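
To make the dynamic document idea concrete, here is a minimal, hypothetical Quarto file of my own (a sketch, not taken from the presentation materials); the file name, inline R code, and figure chunk are purely illustrative.

````
---
title: "Example dynamic document"
format: html
---

Cars in the built-in `mtcars` data set averaged
`r round(mean(mtcars$mpg), 1)` miles per gallon.

```{r}
#| label: fig-mpg
#| fig-cap: "Fuel economy by vehicle weight"
# Scatterplot of miles per gallon against vehicle weight
plot(mtcars$wt, mtcars$mpg,
     xlab = "Weight (1000 lbs)", ylab = "Miles per gallon")
```
````

Saving this as, say, example.qmd and running `quarto render example.qmd` would produce an HTML report in which the inline code and the figure chunk are replaced by computed results embedded alongside the narrative text.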

2 MSU Program Evaluation Occasional Speaker Series

Below is the abstract for my presentation (Pierce, 2024, December 5) called Fundamentals of missing data in evaluation. Slides are available in my FMDE2024 repository, and a video recording is also available.

Abstract

This talk will discuss some fundamental concepts and issues related to missing data in program evaluation contexts. We will cover why missing data matter, the types of missing data, and how they affect statistical results. Then we will highlight how to describe the nature, scope, and patterns of missing data and how they relate to observed data. Finally, we will discuss some ways to prevent missing data and options for obtaining valid, unbiased statistical results even when some data are missing.

3 Ann Arbor R Users’ Group

Below is the abstract for my presentation (Pierce, 2024, November 14) called R and Quarto: A foundation for generating reproducible reports. Slides and example files are available in my Pierce.AARUG2024 repository.

Abstract

This presentation will discuss the importance of reproducibility and illustrate how R and Quarto are foundational pieces of an integrated set of tools for generating fully-formatted, reproducible reports in a variety of output formats. We’ll cover some key principles and practices that enhance reproducibility.

4 American Evaluation Association 2024

Below is the abstract for my presentation (Pierce, 2024, October 21-26) called Generating reproducible statistical analyses and evaluation reports: Principles, practices, and free software tools. Slides and example files are available in my Pierce.AEA2024 repository.

Abstract

Fully reproducible statistical analyses are ones for which investigators have shared all the materials required to exactly recreate their findings so others can verify them or conduct alternative analyses. That requires sharing the original (usually de-identified) data, supporting documentation, and the software code used to analyze the data. While reproducibility has been described as an attainable minimum standard for trustworthy, credible scientific work, it is not yet well embedded in evaluators’ professional training. This session will introduce the audience to a set of principles, practices, and free, open-source software tools that enable evaluators to efficiently generate reproducible statistical analyses and evaluation reports. We will cover why reproducibility is important in an evaluation context, then offer a vision of how to improve the reproducibility of your work and suggest concrete steps you can take to achieve that goal. We will discuss tailoring the degree of reproducibility you aim to achieve for a given project, which may vary due to project context or constraints. In terms of software, we will describe how R, RStudio, Quarto, and TinyTeX comprise a powerful suite of tools for creating dynamic documents that mix narrative text with R code and can be compiled to automatically produce a fully formatted report, manuscript, or set of slides complete with analysis results, figures, tables, and references. Git and GitHub.com add further value through support for version control and collaboration on the source code for dynamic documents. The session will include conceptual content, examples of dynamic documents, and links to supporting resources the audience can use to accelerate learning how to make their work more reproducible.

5 References

Pierce, S. J. (2024, December 5). Fundamentals of missing data in evaluation [Invited oral presentation]. Program Evaluation Occasional Speaker Series hosted by Michigan State University Department of Psychology, East Lansing, MI, United States. https://github.com/sjpierce/FMDE2024
Pierce, S. J. (2024, November 14). R and Quarto: A foundation for generating reproducible reports [Invited oral presentation]. Ann Arbor R Users’ Group, Ann Arbor, MI, United States. https://github.com/sjpierce/Pierce.AARUG2024
Pierce, S. J. (2024, October 21-26). Generating reproducible statistical analyses and evaluation reports: Principles, practices, and free software tools [Demonstration session]. Evaluation 2024: Amplifying and Empowering Voices in Evaluation, the annual conference of the American Evaluation Association, Portland, OR, United States. https://github.com/sjpierce/Pierce.AEA2024
Pierce, S. J. (2025, March 6). Reproducible research: Principles, practices, and tools for generating reproducible statistical analyses and reports [Online seminar]. Center for Statistical Training and Consulting webinar series on Responsible and Ethical Conduct of Research. https://github.com/sjpierce/CSTAT.RR2025
Pierce, S. J. (2025, October 2). Reproducible research: Principles, practices, and tools for generating reproducible statistical analyses and reports [Online seminar]. Center for Statistical Training and Consulting webinar series on Responsible and Ethical Conduct of Research. https://github.com/sjpierce/CSTAT.RR2025