2007/039 - Bias-Variance tradeoff in Hybrid Generative-Discriminative models
The Sixth International Conference on Machine Learning and Applications (ICMLA 07), Cincinnati, Ohio, USA, 13-15 Dec. 2007.
Given any generative classifier based on an inexact density model, we can define a discriminative counterpart that reduces its asymptotic error rate, at the cost of increased estimation variance. An optimal bias-variance balance might be found using Hybrid Generative-Discriminative (HGD) approaches. In this paper, these methods are defined in a unified framework. This allows us to find sufficient conditions under which an improvement in generalization performance is guaranteed. Numerical experiments illustrate the soundness of our claims.
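To make the hybrid idea concrete, here is a minimal sketch (not the paper's actual method) of one common HGD construction: interpolating between the joint (generative) log-likelihood and the conditional (discriminative) log-likelihood of a simple class-conditional Gaussian model with a mixing weight `lam`. All function names and the choice of model are illustrative assumptions; `lam = 0` recovers the purely generative fit and `lam = 1` the purely discriminative one.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_hybrid(xs, ys, lam, steps=500, lr=0.01):
    """Fit a 1-D two-class model p(x|y) = N(mu_y, 1) with equal priors
    by gradient ascent on the hybrid objective (an illustrative choice):
        J = (1 - lam) * sum_i log p(x_i, y_i) + lam * sum_i log p(y_i | x_i)
    """
    mu0, mu1 = -1.0, 1.0  # initial class means
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            # Generative part: d log N(x; mu_y, 1) / d mu_y = x - mu_y
            if y == 0:
                g0 += (1 - lam) * (x - mu0)
            else:
                g1 += (1 - lam) * (x - mu1)
            # Discriminative part: the model implies
            # p(y=1|x) = sigmoid(a*x + b), a = mu1 - mu0, b = (mu0^2 - mu1^2)/2
            p = sigmoid((mu1 - mu0) * x + 0.5 * (mu0 ** 2 - mu1 ** 2))
            g1 += lam * (y - p) * (x - mu1)
            g0 += lam * (y - p) * (mu0 - x)
        mu0 += lr * g0 / n
        mu1 += lr * g1 / n
    return mu0, mu1

def predict(mu0, mu1, x):
    """Classify x with the fitted equal-variance Gaussian model."""
    return 1 if (mu1 - mu0) * x + 0.5 * (mu0 ** 2 - mu1 ** 2) > 0 else 0

# Misspecified data: class 1 truly has a larger variance than the model assumes,
# which is the "inexact density model" setting where discriminative tuning helps.
random.seed(0)
xs = [random.gauss(-1.0, 1.0) for _ in range(100)] + \
     [random.gauss(2.0, 3.0) for _ in range(100)]
ys = [0] * 100 + [1] * 100
mu0, mu1 = fit_hybrid(xs, ys, lam=0.5)
acc = sum(predict(mu0, mu1, x) == y for x, y in zip(xs, ys)) / len(xs)
```

The intermediate value `lam = 0.5` is one point on the bias-variance path the paper studies; sweeping `lam` from 0 to 1 traces the full generative-to-discriminative family.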