Xue Fu, Guan Gui, Yu Wang, T. Ohtsuki, B. Adebisi, H. Gačanin, F. Adachi
1 March 2022

Lightweight Automatic Modulation Classification Based on Decentralized Learning

Due to the implementation and performance limitations of the centralized learning automatic modulation classification (CentAMC) method, this paper proposes a decentralized learning AMC (DecentAMC) method based on model consolidation and a lightweight design. Specifically, model consolidation is realized by a central device (CD) that performs model averaging (MA) over the edge devices (EDs), while multiple EDs carry out the ED model training. The lightweight design is realized by a separable convolutional neural network (S-CNN), in which separable convolutional layers replace the standard convolutional layers and most of the fully connected layers are removed. Simulation results show that the proposed method substantially reduces the storage and computational requirements of the EDs as well as the communication overhead, and the training efficiency is also remarkably improved. Compared with a standard convolutional neural network (CNN), the space complexity (i.e., model parameters and output feature maps) of the S-CNN is decreased by about 94% and its time complexity (i.e., floating-point operations) is decreased by about 96%, while the average correct classification probability is degraded by less than 1%. Compared with S-CNN-based CentAMC, and without considering model weight uploading and downloading, the training efficiency of the proposed method is about $N$ times higher, where $N$ is the number of EDs. When model weight uploading and downloading are taken into account, the training efficiency can still be maintained at a high level (e.g., with 12 EDs, the training efficiency of the proposed AMC method is about 4 times that of S-CNN-based CentAMC on dataset $D_{1} = \{\mathrm{2FSK, 4FSK, 8FSK, BPSK, QPSK, 8PSK, 16QAM}\}$ and about 5 times that of S-CNN-based CentAMC on dataset $D_{2} = \{\mathrm{2FSK, 4FSK, 8FSK, BPSK, QPSK, 8PSK, PAM2, PAM4, PAM8, 16QAM}\}$), while the communication overhead is reduced by more than 35%.
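
A minimal sketch (not the authors' implementation) of the two ideas summarized above, assuming a PyTorch setting with I/Q samples as two-channel inputs: a depthwise-separable convolution that stands in for a standard convolutional layer in the S-CNN, and the CD-side model averaging (MA) step over the ED models. All layer sizes, names, and hyperparameters are illustrative assumptions.

import copy
import torch
import torch.nn as nn

class SeparableConv1d(nn.Module):
    """Depthwise + pointwise convolution, replacing a standard Conv1d."""
    def __init__(self, in_ch, out_ch, kernel_size, padding=0):
        super().__init__()
        self.depthwise = nn.Conv1d(in_ch, in_ch, kernel_size,
                                   padding=padding, groups=in_ch)
        self.pointwise = nn.Conv1d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

class SCNN(nn.Module):
    """Toy S-CNN: separable convolutions, single fully connected head."""
    def __init__(self, num_classes, in_ch=2):            # 2 channels for I/Q
        super().__init__()
        self.features = nn.Sequential(
            SeparableConv1d(in_ch, 32, 7, padding=3), nn.ReLU(),
            SeparableConv1d(32, 64, 7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, num_classes)      # most FC layers removed

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def model_averaging(ed_models):
    """CD-side MA: average the weights of the N locally trained ED models."""
    avg_state = copy.deepcopy(ed_models[0].state_dict())
    for key in avg_state:
        avg_state[key] = torch.stack(
            [m.state_dict()[key].float() for m in ed_models]).mean(dim=0)
    return avg_state

After averaging, each ED would reload the consolidated weights with model.load_state_dict(avg_state) before its next round of local training; the actual training schedule, optimizer, and signal preprocessing are not specified here.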

