Yuewen Xu

AMC-Transformer: Automatic Modulation Classification based on Enhanced Attention Model

High-accuracy automatic modulation classification (AMC) is essential for spectrum monitoring and interference-aware access in future 6G systems [1]. We propose AMC-Transformer, which tokenizes raw I/Q sequences into fixed-length patches, augments them with learnable positional embeddings, and applies multi-layer, multi-head self-attention to capture global temporal–spatial correlations without handcrafted features or convolutions. On RadioML2018.01A, the model achieves 98.8% accuracy in the high-SNR regime (SNR of at least 10 dB), a relative improvement of 4.44% over a CNN baseline and 1.96% over a ResNet reimplementation; averaged across all SNRs, it also outperforms MCformer, CNN, and ResNet baselines. Consistent gains on the RadioML2016.10A dataset further validate robustness across benchmarks. Ablations over depth, patch size, and head count provide practical guidance under different SNR regimes and compute budgets. These results demonstrate the promise of transformer-based AMC for robust recognition in complex wireless environments.
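The pipeline the abstract describes (patch tokenization of raw I/Q, positional embeddings, self-attention) can be sketched minimally in NumPy. This is an illustrative toy, not the paper's implementation: the patch length, model width, random weights, and single attention head are placeholder assumptions, and the positional term is random noise standing in for learned embeddings.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def patchify(iq, patch_len):
    """Split a (2, N) I/Q frame into fixed-length patch tokens.

    Each token concatenates the I and Q samples of one patch,
    giving shape (n_patches, 2 * patch_len)."""
    n_patches = iq.shape[1] // patch_len
    iq = iq[:, :n_patches * patch_len]
    return iq.reshape(2, n_patches, patch_len).transpose(1, 0, 2).reshape(n_patches, -1)

def self_attention(tokens, Wq, Wk, Wv):
    # Single-head scaled dot-product attention over the patch tokens
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return scores @ v

rng = np.random.default_rng(0)
iq = rng.standard_normal((2, 1024))   # one frame of 1024 I/Q samples (illustrative size)
patch_len = 32                        # hypothetical patch length
d_model = 2 * patch_len               # token dimension after flattening I and Q

tokens = patchify(iq, patch_len)                   # (32, 64)
pos = rng.standard_normal(tokens.shape) * 0.02     # stand-in for learned positional embeddings
x = tokens + pos

# Random projection weights in place of trained parameters
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # one attended representation per patch token
```

In the full model, this attention step would be stacked over multiple layers and heads, followed by a classification head over the modulation classes.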

Reference:

DOI: 10.36244/ICJ.2025.4.5


Please cite this paper as follows:

Yuewen Xu, "AMC-Transformer: Automatic Modulation Classification based on Enhanced Attention Model", Infocommunications Journal, Vol. XVII, No 4, December 2025, pp. 32–40, https://doi.org/10.36244/ICJ.2025.4.5