Central Attention Network for Hyperspectral Imagery Classification

Date
2022-03-10
Journal Title
IEEE Transactions on Neural Networks and Learning Systems
Publisher
IEEE
Abstract
In this article, the intrinsic properties of hyperspectral imagery (HSI) are analyzed, and two principles for spectral-spatial feature extraction of HSI are established: the foundation of pixel-level HSI classification and the definition of spatial information. Based on these two principles, scaled dot-product central attention (SDPCA) tailored for HSI is designed to extract spectral-spatial information from a central pixel (i.e., a query pixel to be classified) and from the pixels on an HSI patch that are similar to it. Building on the HSI-tailored SDPCA module, a central attention network (CAN) is then proposed that combines HSI-tailored dense connections among the hidden-layer features with the spectral information of the query pixel. MiniCAN, a simplified version of CAN, is also investigated. The superior classification performance of CAN and miniCAN on three datasets of different scenarios demonstrates their effectiveness and benefits over state-of-the-art methods.
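The abstract describes SDPCA only at a high level: the central pixel of a patch serves as the query, and attention weights concentrate on patch pixels spectrally similar to it. The following is a minimal sketch of that central-pixel-as-query idea under stated assumptions, not the authors' implementation; the function name, tensor shapes, and the omission of learned query/key/value projections and multi-head structure are all illustrative choices.

import torch
import torch.nn.functional as F

def central_attention(patch: torch.Tensor) -> torch.Tensor:
    """Sketch of scaled dot-product attention in which the central
    pixel of an HSI patch acts as the sole query.

    patch: (H, W, C) hyperspectral patch; the center pixel is the
           query pixel to be classified.
    Returns a (C,) spectral-spatial feature for the central pixel.
    """
    H, W, C = patch.shape
    pixels = patch.reshape(H * W, C)            # keys/values: all patch pixels
    query = patch[H // 2, W // 2].unsqueeze(0)  # (1, C) central (query) pixel

    # Scaled dot-product similarity between the query and every pixel;
    # pixels similar to the central pixel receive larger weights.
    scores = query @ pixels.T / C ** 0.5        # (1, H*W)
    weights = F.softmax(scores, dim=-1)

    # Weighted aggregation yields a spectral-spatial feature that mixes
    # the query pixel's spectrum with that of its similar neighbors.
    return (weights @ pixels).squeeze(0)        # (C,)

For example, central_attention(torch.randn(7, 7, 200)) returns a 200-dimensional feature for the center pixel of a 7 x 7 patch with 200 spectral bands; a full SDPCA module would additionally learn projection weights and feed such features through the dense connections the abstract mentions.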
Description
Copyright 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. This article was originally published in IEEE Transactions on Neural Networks and Learning Systems. The version of record is available at: https://doi.org/10.1109/TNNLS.2022.3155114
Keywords
Central attention, hyperspectral imagery (HSI), spectral-spatial feature extraction, transformer
Citation
H. Liu, W. Li, X.-G. Xia, M. Zhang, C.-Z. Gao, and R. Tao, "Central Attention Network for Hyperspectral Imagery Classification," IEEE Transactions on Neural Networks and Learning Systems, doi: 10.1109/TNNLS.2022.3155114.