Speaker: Dr. Xingye Qiao received his Ph.D. in 2010 from the Department of Statistics and Operations Research at the University of North Carolina. He is currently an Assistant Professor in the Department of Mathematical Sciences at Binghamton University, State University of New York. His research focuses on high-dimensional data analysis, machine learning, and applications of big data.
Time: Monday, January 19, 2015, 9:00–10:30 a.m.
Venue: Academic Lecture Hall, 2nd Floor, Mathematics Building
Abstract: Stability has been of great concern in statistics: similar statistical conclusions should be drawn based on different data sampled from the same population. In this article, we introduce a general measure of classification instability (CIS) to capture the sampling variability of the predictions made by a classification procedure. The minimax rate of CIS is established for general plug-in classifiers. As a concrete example, we consider the stability of the nearest neighbor classifier. In particular, we derive an asymptotically equivalent form for the CIS of a weighted nearest neighbor classifier. This allows us to develop a novel stabilized nearest neighbor classifier that balances the trade-off between classification accuracy and stability. The resulting classification procedure is shown to possess the minimax optimal rates in both excess risk and CIS. Extensive experiments demonstrate a significant improvement in CIS over existing nearest neighbor classifiers at a negligible cost in classification accuracy.
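To make the notion of classification instability concrete, here is a minimal Monte Carlo sketch, not the speaker's implementation: it trains a k-nearest-neighbor classifier on two independent samples from the same population and records how often the two fitted classifiers disagree on fresh points. The helper names `estimate_cis` and `gaussian_mixture_sample`, and the toy data-generating process, are illustrative assumptions.

```python
# Illustrative sketch: empirically estimate classification instability (CIS)
# for a k-NN classifier as the rate of disagreement between two fits trained
# on independent samples of the same size from one population.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def estimate_cis(make_sample, n_train, n_test, k=5, n_rep=100, seed=0):
    """Monte Carlo estimate of P(prediction from D1 != prediction from D2)."""
    rng = np.random.default_rng(seed)
    disagreements = []
    for _ in range(n_rep):
        # Two independent training samples from the same population.
        X1, y1 = make_sample(n_train, rng)
        X2, y2 = make_sample(n_train, rng)
        # Fresh evaluation points from the same population (labels unused).
        X_test, _ = make_sample(n_test, rng)
        clf1 = KNeighborsClassifier(n_neighbors=k).fit(X1, y1)
        clf2 = KNeighborsClassifier(n_neighbors=k).fit(X2, y2)
        disagreements.append(np.mean(clf1.predict(X_test) != clf2.predict(X_test)))
    return float(np.mean(disagreements))


def gaussian_mixture_sample(n, rng):
    """Toy two-class Gaussian mixture, used only for illustration."""
    y = rng.integers(0, 2, size=n)
    X = rng.normal(loc=y[:, None] * 1.5, scale=1.0, size=(n, 2))
    return X, y


if __name__ == "__main__":
    for k in (1, 5, 25):
        cis = estimate_cis(gaussian_mixture_sample, n_train=100, n_test=200, k=k)
        print(f"k={k:2d}  estimated CIS ~ {cis:.3f}")
```

In this toy setting, larger k typically yields a smaller estimated CIS (more stable predictions), which illustrates the accuracy–stability trade-off that the stabilized nearest neighbor classifier in the talk is designed to balance.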