Robust Inference via Lq-Likelihood

Date
2013-10-23
Publisher
Johns Hopkins University
Abstract
Robust inference is of great importance in modern statistics. In this dissertation, we introduce a series of robust statistical procedures based on the concept of Lq-likelihood. The Lq-likelihood function partially preserves the desirable properties of the log-likelihood function while providing remarkable robustness, on which we build our robust statistical procedures. The tuning parameter q of the Lq-likelihood makes these procedures flexible: as q tends to 1, the Lq-likelihood reduces to the traditional log-likelihood, so q can be used to adjust the efficiency-robustness trade-off as well as the bias-variance trade-off. We first introduce a new robust estimator, the maximum Lq-likelihood estimator (MLqE), and derive its properties from a robust-statistics point of view. We then develop a robust testing procedure, the Lq-likelihood ratio test (LqLR), and demonstrate its effectiveness on contaminated data. We further turn to the problem of robust estimation of mixture models and propose an expectation-maximization algorithm for the Lq-likelihood (EM-Lq). Finally, we develop a robust clustering technique and provide an application of our technique to brain graph data.
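
For concreteness, a minimal sketch of the Lq-likelihood idea referenced in the abstract, using one commonly used formulation (the dissertation's exact notation may differ): the logarithm in the log-likelihood is replaced by a q-deformed logarithm L_q, which recovers log u as q tends to 1, and the MLqE maximizes the resulting objective.

% Assumed formulation, added for illustration; not quoted from the dissertation.
\[
  L_q(u) =
  \begin{cases}
    \dfrac{u^{1-q} - 1}{1-q}, & q \neq 1,\\[6pt]
    \log u, & q = 1,
  \end{cases}
  \qquad
  \widehat{\theta}_{\mathrm{MLq}}
  = \arg\max_{\theta} \sum_{i=1}^{n} L_q\!\bigl(f(x_i;\theta)\bigr).
\]

Under this formulation, taking q slightly below 1 downweights observations with small density values (e.g., outliers under a gross error model), which is the mechanism behind the efficiency-robustness trade-off described above.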
Keywords
robustness, relative efficiency, gross error model, hypothesis testing, mixture model, clustering