Advances in Fisherian Alternatives to Conventional Inference


Organizer and Chair: David Bickel (University of Ottawa)

AO YUAN, Howard University
Bayesian Frequentist Hybrid Model with Applications to CGH Data Analysis

Comparative genomic hybridization (CGH) is a common technique for screening GCN changes in mutant cells genome-wide. Existing statistical methods for analyzing such data are either frequentist or fully Bayesian. The former has the advantage of objectivity, while the latter can incorporate useful prior information. To take full advantage of both approaches, we develop a Bayesian-frequentist hybrid procedure in which some of the parameters are inferred in a Bayesian fashion while the remaining parameters are inferred in a frequentist way. This is especially useful when sound prior information is available for only some of the parameters and the sample size is relatively small.
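A minimal sketch of what such a hybrid could look like in the simplest setting, assuming a normal model: the variance is estimated frequentistly by maximum likelihood (no prior), while the mean gets a conjugate normal prior and a Bayesian posterior computed with the variance estimate plugged in. The model, prior values, and simulated data are illustrative assumptions, not the authors' actual CGH procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=50)  # simulated intensity log-ratios (hypothetical)

# Frequentist step: variance inferred by maximum likelihood, with no prior assumed.
sigma2_hat = x.var()  # MLE of the variance (ddof=0)

# Bayesian step: conjugate normal prior N(mu0, tau2) on the mean only,
# reflecting that sound prior information exists for just this parameter.
mu0, tau2 = 0.0, 1.0  # illustrative prior mean and variance
n = x.size
post_var = 1.0 / (1.0 / tau2 + n / sigma2_hat)
post_mean = post_var * (mu0 / tau2 + x.sum() / sigma2_hat)

print(post_mean, post_var)
```

With even moderate n, the posterior for the mean concentrates near the sample mean while the prior supplies shrinkage when data are scarce, which is the small-sample situation the abstract targets.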

ALAN POLANSKY, Northern Illinois University
Observed Confidence Levels: An Alternative to Multiple Testing Techniques

Many applications in statistics do not fall within the framework of traditional hypothesis testing. These problems typically require a sequence of hypothesis tests, and the sequence and choice of hypotheses are important, as different choices can lead to different overall conclusions. Observed confidence levels provide a new approach to these problems by computing a simultaneous measure of the relative truth of each hypothesis, without requiring the specification of a sequence of hypothesis tests. A general theory for the application of this method is developed, and the method is demonstrated in several examples.
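One rough way to approximate the idea: attach to each candidate region of the parameter space the proportion of bootstrap estimates that fall in it, a percentile-bootstrap sketch of an observed confidence level. The data, regions, and resampling scheme here are illustrative assumptions, not the specific construction of the talk.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=0.3, scale=1.0, size=40)  # hypothetical sample

# Percentile-bootstrap approximation: the observed confidence level of a
# region is the fraction of bootstrap estimates of the parameter landing there.
B = 2000
boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                       for _ in range(B)])

regions = {"theta < 0": boot_means < 0, "theta > 0": boot_means > 0}
ocl = {name: mask.mean() for name, mask in regions.items()}
print(ocl)
```

Unlike a sequence of pairwise tests, every region receives a level simultaneously, so the regions can be compared directly on a common scale.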

JEFFREY BLUME, Vanderbilt University School of Medicine
Fisher's Likelihood Inference: Coming of Age

Likelihood methods for measuring statistical evidence have evolved slowly since R.A. Fisher first introduced them in the early 1920s. Nearly a century later, the likelihood paradigm has matured enough to warrant careful consideration. Likelihood methods are the natural compromise between Bayesian and frequentist approaches: they retain the desirable properties of both paradigms (irrelevance of the sample space, good performance probabilities) while shedding undesirable ones (dependence on prior distributions, ad hoc adjustments to control error probabilities). In this talk, I will introduce the modern likelihood paradigm, show how this evidential framework resolves multiplicity paradoxes, and discuss recent advances.
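In the likelihood paradigm, the likelihood ratio itself measures the strength of evidence for one simple hypothesis over another, with no prior and no sample-space adjustment. A minimal sketch for binomial data, with the two hypothesized success probabilities and the data chosen purely for illustration:

```python
# Evidence for H1: p = 0.7 over H0: p = 0.5, from k successes in n trials.
def likelihood_ratio(k, n, p1=0.7, p0=0.5):
    # The binomial coefficient is common to both likelihoods and cancels.
    return (p1**k * (1 - p1)**(n - k)) / (p0**k * (1 - p0)**(n - k))

lr = likelihood_ratio(k=16, n=20)
print(lr)
```

Benchmarks such as ratios of 8 and 32 are often used in this literature to label evidence as "fairly strong" or "strong"; the key point is that the ratio depends only on the observed data through the likelihood function, not on how many other comparisons were contemplated.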