Latent Variable and Classification Performance Analysis of Bird-Drone Spectrograms with Elementary Autoencoder

Research output: Contribution to journal › Article › peer-review


Abstract

Deep learning with convolutional neural networks has been widely utilised in radar research concerning automatic target recognition. Maximising numerical metrics to gauge the performance of such algorithms does not necessarily correspond to model robustness against untested targets, nor does it lead to improved model interpretability. Approaches designed to explain the mechanisms behind the operation of a classifier on radar data are proliferating, but bring with them a significant computational and analysis overhead. This work uses an elementary unsupervised convolutional autoencoder (CAE) to learn a compressed representation of a challenging dataset of urban bird and drone targets, and subsequently examines whether the apparent quality of that representation, judged by preservation of class labels, leads to better classification performance after a separate supervised training stage. It is shown that a CAE that reduces the number of features output after each layer of the encoder gives rise to the best drone-versus-bird classifier. A clear connection between unsupervised evaluation via label preservation in the latent space and subsequent classification accuracy after supervised fine-tuning is demonstrated, supporting further efforts to optimise radar data latent representations for optimal performance and model interpretability.
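
By way of illustration, the following is a minimal sketch, not the authors' implementation, of the pipeline the abstract describes: a feature-reducing convolutional encoder trained with a decoder for reconstruction, a nearest-neighbour label-preservation probe on the latent codes, and a supervised fine-tuning stage with a classification head. All shapes, channel counts, and hyperparameters (the 1×128×128 spectrogram size, the 16→8→4 feature-map schedule, the 32-dimensional latent code) are illustrative assumptions, not values from the paper; PyTorch is assumed.

```python
import torch
import torch.nn as nn


class Encoder(nn.Module):
    """Encoder whose feature-map count shrinks after every conv layer
    (16 -> 8 -> 4), mirroring the feature-reducing CAE variant the
    abstract reports as yielding the best classifier. (Sizes assumed.)"""

    def __init__(self, latent_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),  # 128 -> 64
            nn.Conv2d(16, 8, 3, stride=2, padding=1), nn.ReLU(),  # 64 -> 32
            nn.Conv2d(8, 4, 3, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Flatten(),
            nn.Linear(4 * 16 * 16, latent_dim),  # latent code
        )

    def forward(self, x):
        return self.net(x)


class Decoder(nn.Module):
    """Mirror-image decoder used only during unsupervised training."""

    def __init__(self, latent_dim: int = 32):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 4 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(4, 8, 3, 2, 1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(8, 16, 3, 2, 1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, 2, 1, output_padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 4, 16, 16))


# Dummy batch standing in for bird/drone spectrograms and their labels.
spectrograms = torch.rand(8, 1, 128, 128)
labels = torch.randint(0, 2, (8,))  # 0 = bird, 1 = drone (illustrative)

# Stage 1: unsupervised reconstruction training of the CAE (one step shown).
encoder, decoder = Encoder(), Decoder()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3
)
opt.zero_grad()
recon = nn.MSELoss()(decoder(encoder(spectrograms)), spectrograms)
recon.backward()
opt.step()

# Unsupervised label-preservation probe: how often does a sample's nearest
# neighbour in latent space carry the same class label? (A stand-in for the
# paper's latent-space evaluation, not its exact metric.)
with torch.no_grad():
    z = encoder(spectrograms)
    dist = torch.cdist(z, z)
    dist.fill_diagonal_(float("inf"))  # exclude self-matches
    agreement = (labels[dist.argmin(dim=1)] == labels).float().mean()

# Stage 2: supervised fine-tuning -- a linear head on top of the encoder.
classifier = nn.Sequential(encoder, nn.Linear(32, 2))
clf_opt = torch.optim.Adam(classifier.parameters(), lr=1e-4)
clf_opt.zero_grad()
clf_loss = nn.CrossEntropyLoss()(classifier(spectrograms), labels)
clf_loss.backward()
clf_opt.step()

print(f"recon {recon.item():.4f} | label agreement {agreement.item():.2f} "
      f"| clf {clf_loss.item():.4f}")
```

In this sketch the latent label-preservation score from stage 1 is the quantity one would track across CAE variants before committing to supervised fine-tuning, reflecting the connection the paper draws between the two.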
Original language: English
Article number: 10804883
Pages (from-to): 115-123
Number of pages: 9
Journal: IEEE Transactions on Radar Systems
Volume: 3
Early online date: 17 Dec 2024
DOIs
Publication status: Published - 10 Jan 2025

Keywords

  • Autoencoder
  • classification
  • convolutional autoencoder (CAE)
  • latent variables
  • spectrograms
