Boosting Lightweight CNNs Through Network Pruning and Knowledge Distillation for SAR Target Recognition

Deep convolutional neural networks (CNNs) have yielded remarkable results in synthetic aperture radar (SAR) target recognition. However, overparameterization is a widely recognized property of deep CNNs, and most previous works pursued high accuracy while neglecting the requirements of model deployment in radar systems, i.e., small computational cost and low memory footprint. Further research on lightweight CNNs for SAR target recognition is therefore necessary. In this article, we devise an effective CNN with a channel-wise attention mechanism for SAR target recognition, then compress the network structure through network pruning and recover the lightweight network's performance through knowledge distillation.
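The abstract does not spell out the attention design, but channel-wise attention is commonly realized in the squeeze-and-excitation style. The following is a minimal sketch under that assumption; the reduction ratio and layer layout are illustrative choices, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (an assumed design)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global spatial average
        self.fc = nn.Sequential(                     # excitation: two-layer bottleneck
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                            # per-channel attention in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c))         # attention vector, shape (b, c)
        return x * w.view(b, c, 1, 1)                # rescale each feature channel
```

The sigmoid output gives each channel a score between 0 and 1, which is exactly the signal the pruning step below can reuse as an importance measure.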

The attention values produced by the network are used to evaluate the importance of convolution kernels, and unimportant kernels are pruned. In addition, a novel bridge-connection-based knowledge distillation method is proposed. Instead of directly mimicking the hidden-layer output or hand-designing a function to extract the knowledge in hidden layers, bridge connections are introduced to distill internal knowledge via the teacher network.
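A hedged sketch of the pruning criterion follows: attention values are averaged over a calibration set, and the lowest-scoring kernels are removed. The function name and keep-ratio parameter are illustrative, not the authors' API.

```python
import torch

def select_channels(attention_scores: torch.Tensor, keep_ratio: float = 0.1):
    """attention_scores: (num_samples, num_channels) attention values collected
    from the trained network; returns indices of the kernels to keep."""
    importance = attention_scores.mean(dim=0)            # average score per channel
    k = max(1, int(keep_ratio * importance.numel()))     # e.g. keep 10%, prune 90%
    return torch.topk(importance, k).indices.sort().values

# The kept indices would then slice the convolution weights, e.g.
# conv.weight.data = conv.weight.data[kept_idx], along with the matching
# input channels of the following layer.
```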
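The abstract only names the bridge-connection idea, so the following is a speculative reading: a small "bridge" layer maps a student feature map into the teacher's feature space, the frozen teacher's remaining layers carry it to logits, and the student is trained to make those logits agree with the teacher's own. All names here (`teacher_tail`, `bridge`) are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def bridge_kd_loss(student_feat, bridge, teacher_tail, teacher_logits, T=4.0):
    """student_feat: intermediate student feature map (B, Cs, H, W)
    bridge: e.g. a 1x1 conv mapping Cs -> Ct channels
    teacher_tail: the teacher's layers after the matching depth (frozen)."""
    with torch.no_grad():
        target = F.softmax(teacher_logits / T, dim=1)    # teacher's soft targets
    bridged_logits = teacher_tail(bridge(student_feat))  # knowledge flows via teacher
    log_p = F.log_softmax(bridged_logits / T, dim=1)
    return F.kl_div(log_p, target, reduction="batchmean") * (T * T)
```

Note that `teacher_tail`'s parameters stay frozen, but gradients still flow through it back to the bridge and the student, which is what lets the teacher's internal computation supervise the student's hidden features without a hand-designed knowledge-extraction function.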

Experiments are conducted on the moving and stationary target acquisition and recognition (MSTAR) benchmark dataset. The proposed network shows excellent generalization performance and reaches an accuracy of 99.46% on ten-class target classification without any data augmentation.

Furthermore, through the network pruning and knowledge distillation algorithm, we remove 90% of the proposed CNN's parameters while maintaining model performance.
