Optimal sparse kernel learning for hyperspectral anomaly detection

Prudhvi Gurram, Heesung Kwon, Zhimin Peng, and Wotao Yin

IEEE WHISPERS Conference, Gainesville, FL, 2013

Overview

In this paper, a novel framework of sparse kernel learning for Support Vector Data Description (SVDD)-based anomaly detection is presented. Optimal sparse feature selection for anomaly detection is first modeled as a mixed integer programming (MIP) problem. Because the MIP is prohibitively expensive to solve, it is relaxed into a quadratically constrained linear program (QCLP). The QCLP can then be solved practically with an iterative optimization method, in which multiple subsets of features are found rather than a single subset. The QCLP-based iterative optimization problem is solved in a finite-dimensional space called the empirical kernel feature space (EKFS) instead of in the input space or the reproducing kernel Hilbert space (RKHS). This is possible because the EKFS preserves the geometrical properties of the corresponding RKHS. An explicit nonlinear exploitation of the data in the finite EKFS thus becomes achievable, yielding an optimal feature ranking. Experimental results on a hyperspectral image show that the proposed method provides improved performance over current state-of-the-art techniques.
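The key property exploited above is that inner products in the EKFS reproduce the kernel evaluations of the RKHS on the training set. The following sketch, not taken from the paper, illustrates that idea with a Gaussian RBF kernel; the function names, the choice of kernel, and the normalization by K^{-1/2} are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def empirical_kernel_map(X_train, X, gamma=1.0, eps=1e-10):
    # Map samples X into the empirical kernel feature space (EKFS) spanned
    # by the n training samples: phi(x) = K^{-1/2} [k(x, x_1), ..., k(x, x_n)].
    # The K^{-1/2} factor makes EKFS inner products match kernel values
    # on the training data, preserving the RKHS geometry there.
    K = rbf_kernel(X_train, X_train, gamma)
    w, V = np.linalg.eigh(K)              # K is symmetric PSD
    w = np.clip(w, eps, None)             # guard against tiny eigenvalues
    K_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    return (K_inv_sqrt @ rbf_kernel(X_train, X, gamma)).T  # shape (m, n)

# Check: for training points, Phi @ Phi.T recovers the kernel matrix K.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))          # 20 synthetic samples, 5 features
Phi = empirical_kernel_map(X, X, gamma=0.5)
K = rbf_kernel(X, X, gamma=0.5)
print(np.allclose(Phi @ Phi.T, K, atol=1e-6))  # True
```

Because the EKFS is finite-dimensional (one coordinate per training sample), optimization and feature ranking can be carried out explicitly in it, which is what makes the QCLP-based iterative scheme tractable.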

Citation

P. Gurram, H. Kwon, Z. Peng, and W. Yin, "Optimal sparse kernel learning for hyperspectral anomaly detection," in Proc. IEEE WHISPERS, Gainesville, FL, 2013.
