Abstract
In this paper, we compare several detection algorithms that are based on spectral matched (subspace) filters. Nonlinear (kernel) versions of these spectral matched (subspace) detectors are also discussed and their performance is compared with the linear versions. These kernel-based detectors exploit the nonlinear correlations between the spectral bands that are ignored by the conventional detectors. Several well-known matched detectors, such as the matched subspace detector, orthogonal subspace detector, spectral matched filter, and adaptive subspace detector (adaptive cosine estimator), are extended to their corresponding kernel versions by using the idea of kernel-based learning theory. In kernel-based detection algorithms the data is implicitly mapped into a high dimensional kernel feature space by a nonlinear mapping which is associated with a kernel function. The detection algorithm is then derived in the feature space and kernelized in terms of the kernel functions in order to avoid explicit computation in the high dimensional feature space. Experimental results based on simulated toy examples and real hyperspectral imagery show that the kernel versions of these detectors outperform the conventional linear detectors.
1 Introduction
Detecting signals of interest, particularly with wide signal variability, in noisy environments has long been a challenging issue in various fields of signal processing. Among a number of previously developed detectors, the well-known matched subspace detector (MSD) [1], orthogonal subspace detector (OSD) [1, 2], spectral matched filter (SMF) [3, 4], and adaptive subspace detector (ASD), also known as the adaptive cosine estimator (ACE) [5, 6], have been widely used to detect a desired signal (target).
Matched signal detectors, such as the spectral matched filter and matched subspace detectors (whether adaptive or non-adaptive), exploit only second order correlations, thus completely ignoring the nonlinear (higher order) spectral inter-band correlations that could be crucial for discriminating between target and background. In this paper, our aim is to introduce nonlinear versions of the MSD, OSD, SMF, and ASD detectors which effectively exploit the higher order spectral inter-band correlations in a high (possibly infinite) dimensional feature space associated with a certain nonlinear mapping via kernel-based learning methods [7]. A nonlinear mapping of the input data into a high dimensional feature space is often expected to increase the data separability and reduce the complexity of the corresponding data structure. The nonlinear versions of a number of signal processing techniques, such as principal component analysis (PCA) [8], Fisher discriminant analysis [9], linear classifiers [10], and kernel-based anomaly detection [11], have already been defined in a kernel space.
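As a simple illustration of this separability argument (a toy sketch of our own, not an experiment from this paper), consider two classes separated by a circle in the input space: they are not linearly separable there, but become linearly separable under an explicit degree-2 polynomial feature map.

```python
import numpy as np

# Toy illustration: two classes separated by a circle are not linearly
# separable in the 2-D input space, but become linearly separable under
# the quadratic map phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2).
rng = np.random.default_rng(0)
angles = rng.uniform(0.0, 2.0 * np.pi, 200)
radii = np.concatenate([rng.uniform(0.0, 1.0, 100),    # class 0: inside the circle
                        rng.uniform(2.0, 3.0, 100)])   # class 1: outside the circle
X = np.column_stack([radii * np.cos(angles), radii * np.sin(angles)])
labels = np.concatenate([np.zeros(100), np.ones(100)])

def phi(X):
    """Explicit degree-2 polynomial feature map applied row-wise."""
    return np.column_stack([X[:, 0] ** 2,
                            np.sqrt(2.0) * X[:, 0] * X[:, 1],
                            X[:, 1] ** 2])

F = phi(X)
# In feature space, the weight vector w = (1, 0, 1) gives
# w . phi(x) = x1^2 + x2^2 = r^2, so a single linear threshold on the
# mapped data separates the two classes perfectly.
scores = F @ np.array([1.0, 0.0, 1.0])
print(np.mean((scores > 2.25) == labels))  # -> 1.0
```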
This paper is organized as follows. Section 2 provides the background to kernel-based learning methods and the kernel trick. Section 3 introduces the linear matched subspace detector and its kernel version. The orthogonal subspace detector is defined in Section 4, as well as its kernel version. In Section 5 we describe the conventional spectral matched filter and its kernel version in the feature space and reformulate the expression in terms of the kernel function using the kernel trick. Finally, in Section 6 the adaptive subspace detector and its kernel version are introduced. Performance comparison between the conventional and the kernel versions of these algorithms is provided in Section 7 and conclusions are given in Section 8.
2 Kernel-based Learning and Kernel Trick
Suppose that the input hyperspectral data is represented by the data space ($\mathcal{X}$) and $\mathcal{F}$ is a feature space associated with $\mathcal{X}$ by a nonlinear mapping function $\Phi$,

$$
\Phi : \mathcal{X} \rightarrow \mathcal{F}, \qquad \mathbf{x} \mapsto \Phi(\mathbf{x}),
\tag{1}
$$

where $\mathbf{x}$ is an input vector in $\mathcal{X}$ which is mapped into a potentially much higher (could be infinite) dimensional feature space. Due to the high dimensionality of the feature space $\mathcal{F}$, it is computationally not feasible to implement any algorithm directly in the feature space. However, kernel-based learning algorithms use an effective kernel trick, given by Eq. (2), to implement dot products in the feature space by employing kernel functions [7]. The idea in kernel-based techniques is to obtain a nonlinear version of an algorithm defined in the input space by implicitly redefining it in the feature space and then converting it in terms of dot products. The kernel trick is then used to implicitly compute the dot products in $\mathcal{F}$ without mapping the input vectors into $\mathcal{F}$; therefore, in the kernel methods, the mapping $\Phi$ does not need to be identified.
The kernel representation for the dot products in $\mathcal{F}$ is expressed as

$$
k(\mathbf{x}_i, \mathbf{x}_j) = \langle \Phi(\mathbf{x}_i), \Phi(\mathbf{x}_j) \rangle,
\tag{2}
$$

where $k$ is a kernel function in terms of the original data. There are a large number of Mercer kernels that have the kernel trick property; see [7] for detailed information about the properties of different kernels and kernel-based learning. Our choice of kernel in this paper is the Gaussian RBF kernel, and the nonlinear function $\Phi$ associated with this kernel generates a feature space of infinite dimensionality.
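To make the kernel trick concrete, the following minimal sketch (our own illustration; the feature map, test vectors, and the bandwidth parameter `sigma` are assumptions, not taken from the paper) verifies Eq. (2) numerically for a degree-2 polynomial kernel, whose feature map can be written out explicitly, and then evaluates a Gaussian RBF kernel, whose induced feature space is infinite dimensional so only the kernel side of Eq. (2) can be computed.

```python
import numpy as np

def poly2_features(x):
    """Explicit feature map for the degree-2 polynomial kernel in 2-D:
    phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return np.array([x[0] ** 2, np.sqrt(2.0) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# Kernel trick, Eq. (2): k(x, y) = <phi(x), phi(y)>.
lhs = poly2_features(x) @ poly2_features(y)   # explicit mapping, then dot product
rhs = (x @ y) ** 2                            # kernel evaluation only
print(np.isclose(lhs, rhs))                   # -> True (both equal 16.0)

# Gaussian RBF kernel: the associated feature space is infinite
# dimensional, so the explicit-mapping side of Eq. (2) cannot be
# computed; the kernel function alone supplies the dot product.
def rbf_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

print(rbf_kernel(x, y))
```

The design point is that the algorithms in the following sections are first rewritten so that the data appear only inside dot products; every such dot product can then be replaced by a kernel evaluation as above.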