Kevin Quinlan
Statistician

About Me


I am currently a Ph.D. student in the Penn State Department of Statistics, working under Dennis Lin. I graduated from Duquesne University with a B.S. in Mathematics and minors in Biochemistry and Theology.

I previously served as president of the Statistics Graduate Student Association (SGSA) and as the Stat 200 Shared Office Hours coordinator. That role included holding TA training sessions on communicating effectively with students without giving away answers, organizing office hours and finals schedules for the TAs, and sitting in on meetings with the Stat 200 faculty about the future direction of the course.

Courses I have taught include Stat 100 (Statistical Concepts and Reasoning), an introductory statistical literacy course, and Stat 301 (Statistical Analysis I), an introductory calculus-based statistics course for non-majors. I have also taught Stat 200 online for Penn State World Campus.

I was also an intern at Los Alamos National Laboratory, where my mentors were Christine Anderson-Cook and Kary Myers.


Research and Publications


My research interests include topics in Design of Experiments such as systematic run orders and t-covering arrays. I am also interested in forensic science applications.

Publications:

Quinlan, K. R., & Anderson-Cook, C. M. & Myers, K. L. (2017) The Weighted Priors Approach for Combining Expert Opinion in Logistic Regression Experiments. Quality Engineering , 1-15

Quinlan, K. R., & Lin, D. K. (2015). Run order considerations for Plackett and Burman designs. Journal of Statistical Planning and Inference , 165, 56-62.


Talks


The Weighted Priors Approach for Combining Expert Opinion in Logistic Regression Experiments

Los Alamos National Laboratory, 7/27/2016

Abstract: When modeling the reliability of a system or component, it is not uncommon for more than one expert to provide very different prior estimates of the expected reliability as a function of an explanatory variable such as age or an accelerating factor. Our goal is to incorporate all information from the experts when choosing a design about which units to test. Bayesian design of experiments has been shown to be very successful for generalized linear models, including logistic regression models. We use this approach to develop methodology for the case where there are multiple potentially non-overlapping priors under consideration. The Weighted Priors method performs well for a broad range of true underlying model parameter choices and is more robust when compared to other reasonable design choices. The method is illustrated through multiple scenarios and a case study.
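For readers curious about the mechanics, here is a minimal Python sketch (not the published implementation) of the underlying idea: score a candidate design by averaging a Bayesian D-optimality measure over draws from each expert's prior, then combine the per-expert scores with weights. The priors, weights, and candidate test points below are hypothetical.

    import numpy as np

    def logistic_info(design_x, beta):
        # Fisher information for a simple logistic model: logit(p) = b0 + b1*x.
        X = np.column_stack([np.ones_like(design_x), design_x])
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1.0 - p)
        return X.T @ (w[:, None] * X)

    def weighted_prior_score(design_x, priors, weights, n_draws=500, seed=0):
        # Average the log-determinant of the information matrix over draws from
        # each expert's prior, then combine the per-expert averages with weights.
        rng = np.random.default_rng(seed)
        scores = []
        for mean, cov in priors:  # each expert prior on (b0, b1), assumed normal here
            draws = rng.multivariate_normal(mean, cov, size=n_draws)
            vals = [np.linalg.slogdet(logistic_info(design_x, b))[1] for b in draws]
            scores.append(np.mean(vals))
        return float(np.dot(weights, scores))

    # Two hypothetical experts with non-overlapping priors on the intercept/slope.
    priors = [(np.array([-2.0, 1.0]), 0.25 * np.eye(2)),
              (np.array([1.0, 0.5]), 0.25 * np.eye(2))]
    candidate = np.linspace(0.0, 5.0, 8)  # candidate test ages / stress levels
    print(weighted_prior_score(candidate, priors, weights=[0.5, 0.5]))

A higher score is better, so competing candidate designs can be compared by evaluating this criterion on each and keeping the best.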

Run Order Considerations in Plackett and Burman Designs

Contributed Talk, JSM, 8/2/2016

Abstract: Run order considerations for two-level full and fractional factorial designs have been studied in depth, but are lacking for Plackett and Burman designs. We examine the level-change problem in Plackett and Burman designs. When a systematic run order is appropriate (as opposed to the conventional random run order), minimizing level changes minimizes experiment costs. We thus aim to find optimal run orders with respect to minimizing level changes. It is shown that the number of level changes is constant for saturated Plackett and Burman designs. Methods for obtaining the minimum/maximum level changes are given. Tables with example run orders for the cases N = 12 and N = 20 are provided for practical use. By finding minimum level change designs, we also obtain maximum level change designs, and these results can be directly extended to trend-robust designs.
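As a concrete illustration of the quantity being minimized, the short Python sketch below counts level changes (sign changes between consecutive runs, summed over all factors) for a 12-run Plackett and Burman design built from one common cyclic construction. This is a toy illustration, not the optimization procedure from the paper.

    import numpy as np

    def plackett_burman_12():
        # Standard generator row for N = 12; cyclic shifts plus a row of -1s.
        first = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
        rows = [np.roll(first, k) for k in range(11)]
        rows.append(-np.ones(11, dtype=int))
        return np.array(rows)

    def level_changes(design):
        # Count sign changes between consecutive runs, summed over all factors.
        return int(np.sum(design[1:] != design[:-1]))

    D = plackett_burman_12()
    print("level changes in standard order:", level_changes(D))

    # Reordering the runs changes the count; a systematic run order seeks the
    # ordering that minimizes it (here we just compare a random permutation).
    rng = np.random.default_rng(0)
    perm = rng.permutation(len(D))
    print("level changes after random reordering:", level_changes(D[perm]))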

Bayesian Design of Experiments for Logistic Regression to Accommodate Multiple Forensic Algorithms

Invited Talk, Quality and Productivity Research Conference, 6/15/2017

Abstract: When evaluating the performance of several forensic classification algorithms, it is desirable to create a design that considers a variety of performance levels across the algorithms. We describe a strategy to use Bayesian design of experiments with multiple prior estimates to capture anticipated performance. Our goal is to characterize results from the different algorithms as a function of different explanatory variables and use this to help choose a design about which units to test. Bayesian design of experiments has been shown to be very successful for generalized linear models, including logistic regression models. We develop methodology for the case where there are several potentially non-overlapping priors under consideration. The Weighted Priors method performs well for a broad range of true underlying model parameter choices and is more robust when compared to other reasonable design choices. Additionally, we show how this can be applied in the multivariate input case and provide some useful summary measures. We illustrate the method with several examples.
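A hedged sketch of how the same idea extends to the multivariate-input setting described above: two explanatory variables, one prior per (hypothetical) algorithm, and the per-algorithm expected log-determinant reported as a simple summary measure alongside an equal-weight combination. All numbers are illustrative assumptions, not values from the talk.

    import numpy as np
    from itertools import product

    def info_matrix(points, beta):
        # Fisher information for logit(p) = b0 + b1*x1 + b2*x2 at the design points.
        X = np.column_stack([np.ones(len(points)), points])
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1.0 - p)
        return X.T @ (w[:, None] * X)

    def per_algorithm_scores(points, priors, n_draws=300, seed=0):
        # Expected log-det information under each algorithm's prior (a summary measure).
        rng = np.random.default_rng(seed)
        out = []
        for mean, cov in priors:
            draws = rng.multivariate_normal(mean, cov, size=n_draws)
            out.append(np.mean([np.linalg.slogdet(info_matrix(points, b))[1] for b in draws]))
        return np.array(out)

    # Candidate design: a 4 x 4 grid over two standardized inputs.
    grid = np.array(list(product(np.linspace(-1, 1, 4), repeat=2)))
    priors = [(np.array([0.0, 1.5, -0.5]), 0.2 * np.eye(3)),   # algorithm A (hypothetical)
              (np.array([-1.0, 0.5, 1.0]), 0.2 * np.eye(3))]   # algorithm B (hypothetical)
    scores = per_algorithm_scores(grid, priors)
    print("per-algorithm scores:", scores, "weighted:", scores.mean())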