arXiv Analytics


arXiv:1805.06144 [math.ST]

On Difference Between Two Types of $γ$-divergence for Regression

Takayuki Kawashima, Hironori Fujisawa

Published 2018-05-16 (Version 1)

The $\gamma$-divergence is well known for its strong robustness against heavy contamination, and by virtue of this property many applications based on the $\gamma$-divergence have been proposed. There are two types of $\gamma$-divergence for the regression problem, which differ in their treatment of the base measure. In this paper, we compare them and point out a distinct difference between the two divergences under heterogeneous contamination, where the outlier ratio depends on the explanatory variable. One divergence retains strong robustness under heterogeneous contamination. The other does not in general, but does when the parametric model of the response variable belongs to a location-scale family whose scale does not depend on the explanatory variables, or under homogeneous contamination, where the outlier ratio does not depend on the explanatory variable. Hung et al. (2017) discussed the strong robustness in a logistic regression model under the additional assumption that the tuning parameter $\gamma$ is sufficiently large. The results obtained in this paper hold for any parametric model without such an additional assumption.
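
As a rough guide to the distinction, the display below sketches, in the generic notation of the $\gamma$-divergence literature (Fujisawa and Eguchi, 2008), two ways the base measure can enter a $\gamma$-type cross entropy for regression; here $g(x)$ denotes the covariate density, $g(y\mid x)$ the true conditional density, and $f(y\mid x;\theta)$ the parametric model. This is an illustrative sketch, not the paper's exact definitions: the first form normalizes the model density conditionally for each $x$ before averaging over the base measure, while the second normalizes only after averaging over $g(x)$.

\begin{align*}
  d_\gamma^{(1)}(g, f)
    &= -\frac{1}{\gamma}\log \int
       \frac{\int g(y\mid x)\, f(y\mid x;\theta)^{\gamma}\, dy}
            {\bigl( \int f(y\mid x;\theta)^{1+\gamma}\, dy \bigr)^{\gamma/(1+\gamma)}}
       \, g(x)\, dx , \\
  d_\gamma^{(2)}(g, f)
    &= -\frac{1}{\gamma}\log \iint g(y\mid x)\, f(y\mid x;\theta)^{\gamma}\, dy\, g(x)\, dx
       \;+\; \frac{1}{1+\gamma}\log \iint f(y\mid x;\theta)^{1+\gamma}\, dy\, g(x)\, dx .
\end{align*}

Roughly speaking, in the first form the normalizing term $\int f(y\mid x;\theta)^{1+\gamma}\, dy$ acts separately at each value of the explanatory variable, whereas in the second it acts only after marginalizing over $g(x)$; this difference in where the base measure enters is the kind of structural contrast the abstract refers to.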
