Heidelberger Kolloquium

Schedule of event:

06/11/2017, 16:15



Dominic Edelmann, C060


Dipl.-Ing. Dr. Florian Frommlet, Dept. of Statistics and Operations Research, Universität Wien




High dimensional model selection with FDR controlling L0 penalties

Penalized selection criteria like AIC or BIC are among the most popular methods for variable selection. Their theoretical properties have been studied intensively and are well understood in the case of a moderate number of variables. However, these criteria do not work well in a high-dimensional setting under the assumption of sparsity. We will introduce different modifications of AIC and BIC which allow control of the family-wise error rate (mAIC, mBIC) or the false discovery rate (mAIC2, mBIC2), respectively, with regard to the inclusion of false-positive regressors in the model. We will discuss for which purposes one might prefer mAIC2 or mBIC2 and give the conditions under which they provide selection procedures that are asymptotically Bayes optimal under sparsity (ABOS).
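As a rough illustration of the penalty structure behind such modified criteria, the following minimal sketch computes mBIC-type scores for a linear model. The extra terms (2k log(p/c) for mBIC, minus 2 log(k!) for the mBIC2-style relaxation) follow the general form discussed in the literature on modified information criteria; the constant c = 4 and the exact parametrization here are illustrative assumptions, not necessarily the versions used in the talk.

```python
import numpy as np
from math import lgamma, log

def mbic_criteria(y, X_sel, p_total, const=4.0):
    """Illustrative sketch of mBIC-type criteria for a linear model.

    y       : response vector of length n
    X_sel   : (n, k) design matrix of the k currently selected regressors
    p_total : total number p of candidate regressors
    const   : tuning constant (c = 4 is an assumed illustrative default)
    """
    n, k = X_sel.shape
    # Least-squares fit restricted to the selected columns
    beta, *_ = np.linalg.lstsq(X_sel, y, rcond=None)
    rss = np.sum((y - X_sel @ beta) ** 2)
    bic = n * log(rss / n) + k * log(n)
    # mBIC-style term: the extra penalty 2k log(p/c) targets
    # family-wise error control over the p candidate regressors
    mbic = bic + 2 * k * log(p_total / const)
    # mBIC2-style relaxation: subtracting 2 log(k!) weakens the
    # penalty toward false discovery rate control
    mbic2 = mbic - 2 * lgamma(k + 1)
    return mbic, mbic2
```

Because 2 log(k!) is nonnegative, the mBIC2-style score never exceeds the mBIC-style score for the same model, which is why it admits more regressors.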

The second part of the talk will discuss the difficult task of model search and introduce four different strategies. The simplest and most straightforward approach is stepwise selection as implemented in MOSGWA, a software package designed for GWAS analysis. Alternatively, adaptive ridge regression can be used, where iteratively reweighted ridge problems are solved whose weights are updated in such a way that the procedure converges towards selection with L0 penalties. Finally, we will introduce two different algorithms which allow for Bayesian model selection based on our selection criteria and compare them in the context of GWAS analysis.
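The adaptive ridge idea above can be sketched as follows: each iteration solves an ordinary weighted ridge problem, and the weight update w_j = 1/(beta_j^2 + delta^2) makes the effective penalty approach an L0 count of nonzero coefficients. This is a hedged illustration, not the implementation from MOSGWA or the talk; the parameter names, defaults, and the final thresholding rule are all assumptions.

```python
import numpy as np

def adaptive_ridge_l0(X, y, lam=2.0, delta=1e-5, n_iter=100):
    """Sketch of iteratively reweighted ridge regression approximating
    selection with an L0 penalty (all defaults are illustrative)."""
    n, p = X.shape
    w = np.ones(p)          # initial ridge weights
    beta = np.zeros(p)
    for _ in range(n_iter):
        # Weighted ridge solve: (X'X + lam * diag(w)) beta = X'y
        A = X.T @ X + lam * np.diag(w)
        beta = np.linalg.solve(A, X.T @ y)
        # Weight update: w_j -> 1 / (beta_j^2 + delta^2).
        # At convergence w_j * beta_j^2 is close to 1 for nonzero
        # coefficients and close to 0 for vanishing ones, so the
        # penalty lam * sum_j w_j beta_j^2 mimics lam * ||beta||_0.
        w = 1.0 / (beta ** 2 + delta ** 2)
    # Report the selected support by thresholding tiny coefficients
    support = np.abs(beta) > np.sqrt(delta)
    return beta, support
```

Coefficients driven toward zero acquire ever larger weights and are shrunk further in the next ridge solve, while sizeable coefficients receive negligible penalty, which is the mechanism behind convergence to an L0-type selection.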


ATV - Seminarraum A0.106
