Learning from Authoritative Security Experiment Results

The 2014 LASER Workshop

Invited Speaker: Alex Reinhart

Alex Reinhart of Carnegie Mellon University spoke on the subject of "Statistics Done Wrong: Pitfalls in Experimentation".

Most research relies on statistical hypothesis testing to report its conclusions, but the seeming precision of statistical significance actually hides many possible biases and errors. The prevalence of these errors suggests that most published results are exaggerated or false. Reinhart explained several of these errors, such as inadequate sample sizes, multiple comparisons, and the keep-looking bias, and their impact on published results. He also suggested solutions to these problems, including statistical improvements and changes to scientific funding and publication practices.
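The multiple-comparisons pitfall mentioned above can be illustrated with a short simulation (a hypothetical sketch, not material from the talk): if an experiment runs many independent significance tests and every null hypothesis is actually true, the chance of at least one spurious "significant" result grows quickly with the number of tests.

```python
import random

def familywise_error_rate(num_tests, num_experiments, seed=0):
    """Estimate the probability that at least one of `num_tests`
    independent null-hypothesis tests comes out 'significant' at the
    5% level, across `num_experiments` simulated experiments.

    Under the null, each test statistic is a standard normal draw;
    |z| > 1.96 corresponds to p < 0.05 (two-sided)."""
    rng = random.Random(seed)
    z_crit = 1.959964  # two-sided 5% critical value of the standard normal
    hits = 0
    for _ in range(num_experiments):
        if any(abs(rng.gauss(0, 1)) > z_crit for _ in range(num_tests)):
            hits += 1
    return hits / num_experiments

# With 20 comparisons, roughly 1 - 0.95**20 ≈ 64% of all-null
# experiments still produce at least one "significant" finding.
rate = familywise_error_rate(num_tests=20, num_experiments=5000)
```

A single test stays near the nominal 5% error rate; twenty tests push the familywise error rate toward two-thirds, which is why corrections such as Bonferroni exist.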

Alex Reinhart is a PhD student in Statistics at Carnegie Mellon University and is the author of Statistics Done Wrong, a guide to common statistical errors. He earned his BSc in physics at the University of Texas at Austin while doing research on statistical methods to detect unexpected radioactive sources using mobile detectors.


The 2014 LASER proceedings are published by USENIX, which provides free, perpetual online access to technical papers. USENIX has been committed to the "Open Access to Research" movement since 2008.

Further Information

If you have questions or comments about LASER, or if you would like additional information about the workshop, contact us at: info@laser-workshop.org.

Join the LASER mailing list to stay informed of LASER news.