Post provided by Coralie Williams
Have you ever wondered if your simulation study could be replicated? The replication crisis has been a hot topic in empirical research for years, but it’s only recently that we’ve started discussing it in statistical methods research (Boulesteix et al., 2020; Luijken et al., 2024). Methodological research often relies on simulations – computer experiments that assess how well statistical methods perform under predefined conditions. Simulations are a valuable tool for evaluation and comparison, but just like any scientific experiment, they face challenges with reproducibility.
These challenges could be due to two key issues: (1) selective reporting of only positive findings, and (2) poor reporting that omits important details about the study’s design, execution, and results. When these details are missing, it can be difficult to understand the potential limitations or biases of the methods being evaluated.
When I started my PhD, I found several helpful resources on how to design simulation studies (Burton et al., 2006; Lotterhos et al., 2022; Morris et al., 2019), but I was surprised to find no clear guidelines for reporting them. This led me to think about how one would ideally report simulation studies to enhance their reproducibility.
Current practices in ecology and evolution
A recent tutorial paper by Morris et al. (2019) provided key steps and decisions to plan and conduct a simulation study. Building on their work, we propose in our paper 11 key reporting items, covering the three stages of a simulation study: planning, coding, and analysis. These reporting items aim to improve transparency and ensure that all essential aspects of a study are clearly communicated.
To compare our proposed reporting items against current practices, my co-authors and I decided to carry out a survey of recent simulation studies in ecology and evolution. We were particularly interested in seeing which components of simulation studies were being reported, whether implementation details and code were shared, and how thoroughly each study’s methods and results were described.
We surveyed 100 published simulation studies in ecology and evolution and found room for improvement: code sharing was lacking, and reporting of Monte Carlo uncertainty, an essential measure of variability in simulation results, was limited. Our findings align with similar surveys in medicine (Morris et al., 2019) and psychology (Siepe et al., 2023), suggesting that the need for better reporting practices spans many fields of methodological research.
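For readers who have not come across Monte Carlo uncertainty before, the sketch below gives a sense of what reporting it involves. It is a purely illustrative Python example (not taken from any of the surveyed papers): it estimates the coverage of a 95% confidence interval over repeated simulated datasets and attaches a Monte Carlo standard error to that estimate.

```python
import numpy as np

rng = np.random.default_rng(2024)

n_sim = 1000      # number of simulation repetitions
n_obs = 30        # sample size within each repetition
true_mean = 0.0   # the true parameter value used to generate the data

covered = np.empty(n_sim, dtype=bool)
for i in range(n_sim):
    sample = rng.normal(loc=true_mean, scale=1.0, size=n_obs)
    est = sample.mean()
    se = sample.std(ddof=1) / np.sqrt(n_obs)
    # Does the 95% normal-approximation interval contain the true mean?
    covered[i] = (est - 1.96 * se) <= true_mean <= (est + 1.96 * se)

coverage = covered.mean()
# Monte Carlo standard error of an estimated proportion: sqrt(p * (1 - p) / n_sim)
mc_se = np.sqrt(coverage * (1 - coverage) / n_sim)

print(f"Estimated coverage: {coverage:.3f} (Monte Carlo SE: {mc_se:.4f})")
```

Reporting the Monte Carlo standard error alongside the performance estimate, together with the number of repetitions used, is exactly the kind of detail our survey found was often missing.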

Towards a reporting guideline and community effort
Simulation studies are an important part of statistical research and frequently feature in methods papers across many disciplines, including ecology and evolution. However, just like any research, we need to ask ourselves whether we might unknowingly be engaging in questionable practices when conducting and reporting these studies.
To address the issue of poor reporting, we propose to turn the ideas from our paper into a set of guidelines. These guidelines would help researchers report their simulation studies more transparently. To make sure they meet the needs of the ecology and evolution community, we want your input. We’ll be launching a survey to gather feedback and build a consensus. If you’d like to be part of this process, please reach out!
Of course, improving transparency in reporting is only the beginning. Even with perfect reporting, a simulation study can still be poorly designed and fail to achieve its intended objective. However, transparent reporting is an important first step. It opens up dialogue, enables proper peer review and assessment, and ultimately leads to higher-quality simulation studies. We’re excited to see future work in this direction to improve simulation research beyond transparent reporting.
Get involved
If you’re interested in contributing or sharing your thoughts, please reach out at coralie.williams@unsw.edu.au.

[Image credit: ©Alberto Ghizzi Panizza]
Read the full article here.
Post edited by Lydia Morley