Hyperspectral Image Denoising With Group Sparse and Low-Rank Tensor Decomposition

Zhihong Huang, Shutao Li, Leyuan Fang, Huali Li, Jon Atli Benediktsson

Research output: Contribution to journal › Article › peer-review

21 Citations (Scopus)

Abstract

Hyperspectral images (HSIs) are usually corrupted by various types of noise, including Gaussian noise, impulse noise, stripes, and deadlines. Recently, sparse and low-rank matrix decomposition (SLRMD) has been shown to be an effective tool for HSI denoising. However, the matrix-based SLRMD technique cannot fully exploit the spatial and spectral information in 3-D HSI data. In this paper, a novel group sparse and low-rank tensor decomposition (GSLRTD) method is proposed to remove different kinds of noise in HSIs while preserving their spectral and spatial characteristics. Since clean 3-D HSI data can be regarded as a 3-D tensor, the proposed GSLRTD method formulates the HSI recovery problem within a sparse and low-rank tensor decomposition framework. Specifically, the HSI is first divided into a set of overlapping 3-D tensor cubes, which are then clustered into groups by the K-means algorithm. Each group thus contains similar tensor cubes, which are combined into a new group tensor by unfolding them into matrices and stacking the results. Finally, the sparse and low-rank tensor decomposition (SLRTD) model is applied to generate a noise-free estimate of each group tensor. By aggregating all the reconstructed group tensors, the denoised HSI is obtained. Experiments on both simulated and real HSI data sets demonstrate the effectiveness of the proposed method.
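The grouping and decomposition pipeline described in the abstract can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: all function names are hypothetical, and the SLRTD solver is replaced here by a simple truncated Tucker (higher-order SVD) projection as a low-rank surrogate, assuming NumPy and scikit-learn are available.

import numpy as np
from sklearn.cluster import KMeans

def extract_cubes(hsi, size=6, stride=4):
    # Slide a (size x size x bands) window over the spatial plane and
    # collect the overlapping 3-D cubes together with their positions.
    h, w, _ = hsi.shape
    cubes, coords = [], []
    for i in range(0, h - size + 1, stride):
        for j in range(0, w - size + 1, stride):
            cubes.append(hsi[i:i + size, j:j + size, :])
            coords.append((i, j))
    return np.stack(cubes), coords

def group_cubes(cubes, n_groups=10, seed=0):
    # Cluster the vectorized cubes with K-means, as in the paper's
    # grouping step, returning one index array per group of similar cubes.
    feats = cubes.reshape(len(cubes), -1)
    labels = KMeans(n_clusters=n_groups, n_init=10,
                    random_state=seed).fit_predict(feats)
    return [np.where(labels == g)[0] for g in range(n_groups)]

def unfold(t, mode):
    # Mode-n unfolding: move the chosen mode to the front and flatten.
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def fold(mat, mode, shape):
    # Inverse of unfold for a tensor of the given shape.
    tmp = [shape[mode]] + [s for k, s in enumerate(shape) if k != mode]
    return np.moveaxis(mat.reshape(tmp), 0, mode)

def lowrank_project(group, ranks):
    # Surrogate for the SLRTD model: project each mode of the stacked
    # 4-D group tensor onto its leading singular subspace (truncated HOSVD).
    approx = group
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(group, mode), full_matrices=False)
        u = u[:, :r]
        approx = fold(u @ (u.T @ unfold(approx, mode)), mode, approx.shape)
    return approx

For a group with index array idx, lowrank_project(cubes[idx], ranks=(10, 4, 4, 8)) would return denoised cubes (the rank values are illustrative); averaging the denoised cubes back into the image at their recorded coordinates then realizes the aggregation step.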
Original language: English
Pages (from-to): 1380-1390
Journal: IEEE Access
Volume: 6
DOIs
Publication status: Published - 2018

Other keywords

  • Hyperspectral image
  • Denoising
  • Sparse and low-rank tensor decomposition
  • Nonlocal similarity
  • Image processing
