Learning from Authoritative Security Experiment Results
Invited Speaker: Alex Reinhart
Alex Reinhart of Carnegie Mellon University spoke on the subject of "Statistics Done Wrong: Pitfalls in Experimentation".
Most research relies on statistical hypothesis testing to report its conclusions, but the apparent precision of statistical significance hides many possible biases and errors, and the prevalence of these errors suggests that most published results are exaggerated or false. Reinhart explained several of these errors, such as inadequate sample sizes, multiple comparisons, and the "keep-looking" bias (repeatedly testing the data and stopping collection as soon as the result appears significant), and their impact on published results. He concluded by suggesting solutions to these problems, including statistical improvements and changes to scientific funding and publication practices.
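Two of the pitfalls named above lend themselves to a quick simulation. The sketch below (not from the talk; it assumes numpy and scipy are installed, and all function names are hypothetical) estimates the false positive rate when many independent tests are run on pure noise, and when data collection stops the moment a test looks significant. In both cases the realized error rate is far above the nominal 5% level.

    # Illustrative sketch only: simulating the multiple comparisons and
    # "keep-looking" (optional stopping) pitfalls on data with no real effect.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    ALPHA = 0.05
    N_SIMULATIONS = 2000

    def multiple_comparisons_rate(n_tests=20, n=30):
        """Chance of at least one 'significant' result when every null is true."""
        hits = 0
        for _ in range(N_SIMULATIONS):
            # n_tests independent two-sample t-tests, both samples from the same distribution
            pvals = [
                stats.ttest_ind(rng.normal(size=n), rng.normal(size=n)).pvalue
                for _ in range(n_tests)
            ]
            hits += min(pvals) < ALPHA
        return hits / N_SIMULATIONS

    def keep_looking_rate(max_n=100, start_n=10, step=10):
        """Chance of a false positive when we re-test as data accumulate and stop at 'significance'."""
        hits = 0
        for _ in range(N_SIMULATIONS):
            a = rng.normal(size=max_n)
            b = rng.normal(size=max_n)  # same distribution: any 'effect' is noise
            for n in range(start_n, max_n + 1, step):
                if stats.ttest_ind(a[:n], b[:n]).pvalue < ALPHA:
                    hits += 1
                    break  # stop collecting data as soon as the result looks significant
        return hits / N_SIMULATIONS

    if __name__ == "__main__":
        print(f"Multiple comparisons (20 tests): {multiple_comparisons_rate():.0%} "
              f"false positive rate vs. nominal {ALPHA:.0%}")
        print(f"Keep-looking / optional stopping: {keep_looking_rate():.0%} "
              f"false positive rate vs. nominal {ALPHA:.0%}")

With 20 tests at the 5% level, the chance of at least one spurious "significant" finding is roughly 1 - 0.95^20, or about 64%, which is what the first simulation recovers; the stopping-rule simulation shows a similar inflation from repeated peeking.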
Alex Reinhart is a PhD student in Statistics at Carnegie Mellon University and the author of Statistics Done Wrong, a guide to common statistical errors. He earned his BSc in physics at the University of Texas at Austin while doing research on statistical methods for detecting unexpected radioactive sources with mobile detectors.