2009 IEEE International Symposium on Parallel & Distributed Processing (IPDPS)

Abstract

In this paper, we propose a kernel hat matrix based learning stage for outlier removal. In particular, we show that the Gaussian kernel hat matrix has very interesting discriminative properties, provided that appropriate values are chosen for the kernel parameters. We therefore develop a practical model selection criterion to separate the "outlier" distribution from the "dominant" distribution. This learning stage, applied beforehand to the training data set, offers a new way to down-weight outliers that corrupt both the response and predictor variables in regression tasks. Experiments on simulated and real data demonstrate the robustness of the proposed approach.
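The abstract does not spell out the model selection criterion or the exact down-weighting rule, so the following is only a minimal sketch of the general idea: form the hat matrix of Gaussian kernel ridge regression, H = K(K + nλI)^{-1}, and use its diagonal (the leverage values) to flag and down-weight points that sit far from the dominant distribution. The function names, the regularization parameter `lam`, the bandwidth `sigma`, and the quantile cutoff are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def gaussian_kernel(X, sigma):
    """Pairwise Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_hat_matrix(X, sigma, lam):
    """Hat matrix H = K (K + n*lam*I)^{-1} of Gaussian kernel ridge
    regression, so that the fitted responses are y_hat = H @ y."""
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    return K @ np.linalg.inv(K + n * lam * np.eye(n))

def outlier_weights(X, sigma=0.5, lam=1e-3, quantile=0.95, low_weight=0.1):
    """Down-weight points with unusually large leverage (diagonal hat
    values): with a suitably small bandwidth, isolated points can only be
    fitted by 'trusting' themselves, so their leverage approaches 1."""
    H = kernel_hat_matrix(X, sigma, lam)
    leverage = np.diag(H)
    cutoff = np.quantile(leverage, quantile)
    weights = np.where(leverage > cutoff, low_weight, 1.0)
    return weights, leverage

# Toy usage: a dominant Gaussian cluster plus a few far-away outliers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (95, 2)),
               rng.normal(6.0, 0.5, (5, 2))])
weights, leverage = outlier_weights(X, sigma=0.5)
```

The resulting weights could then be passed to any weighted regression fit; the choice of `sigma` is the critical parameter the paper's model selection criterion is designed to tune.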
