Efficient Neural Network Reduction for AI-on-the-edge Applications through Structural Compression
DOI: https://doi.org/10.64552/wipiec.v11i1.89
Keywords: IoT, Edge AI, Deep Model Optimization, Neural Network Compression
Abstract
Modern neural networks often rely on overparameterized architectures to ensure stability and accuracy, but in many real-world scenarios large models are unnecessarily expensive to train and deploy. This is especially true in Internet of Things (IoT) and edge computing settings, where computational resources and available memory are severely limited. Reducing the size of a neural network without compromising its ability to solve the target task remains a practical challenge, especially when the goal is to simplify the architecture itself, not just the weight space. To address this problem, we introduce ImproveNet, a simple and general method that reduces the size of a neural network without compromising its ability to solve the original task. The approach requires no pre-trained model, architecture-specific knowledge, or manual tuning. Starting from a standard-sized network and a standard training configuration, ImproveNet monitors the model's performance during training. Once the performance requirements are met, it reduces the network by resizing feature maps or removing internal layers, making it ready for AI-on-the-edge deployment and execution.
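To make the procedure concrete, the following is a minimal PyTorch sketch of the loop the abstract describes: train a standard-sized network, monitor held-out performance, and once the requirement is met, shrink the architecture and continue training. Every identifier here (build_model, shrink, train_and_reduce) and the halve-the-widest-layer reduction policy are our illustrative assumptions; the abstract does not specify ImproveNet's actual reduction criterion or schedule.

    import torch
    import torch.nn as nn

    def build_model(widths, in_dim=784, n_classes=10):
        """MLP whose hidden widths stand in for feature-map sizes."""
        layers = []
        for w in widths:
            layers += [nn.Linear(in_dim, w), nn.ReLU()]
            in_dim = w
        layers.append(nn.Linear(in_dim, n_classes))
        return nn.Sequential(*layers)

    def shrink(widths, floor=16):
        """One reduction step (assumed policy): halve the widest hidden
        layer; once every width is at the floor, drop the last hidden layer."""
        if max(widths) > floor:
            widths[widths.index(max(widths))] //= 2
        elif len(widths) > 1:
            widths.pop()
        return widths

    @torch.no_grad()
    def evaluate(model, loader):
        """Plain accuracy on a held-out loader."""
        correct = total = 0
        for x, y in loader:
            pred = model(x.view(x.size(0), -1)).argmax(dim=1)
            correct += (pred == y).sum().item()
            total += y.numel()
        return correct / total

    def train_and_reduce(train_loader, val_loader, target_acc=0.95, epochs=20):
        widths = [256, 256, 128]              # standard-sized starting network
        model = build_model(widths)
        opt = torch.optim.Adam(model.parameters())
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            for x, y in train_loader:         # ordinary supervised training
                opt.zero_grad()
                loss_fn(model(x.view(x.size(0), -1)), y).backward()
                opt.step()
            if evaluate(model, val_loader) >= target_acc:
                # Requirement met: reduce the architecture and keep training.
                # (Whether ImproveNet retrains from scratch or transfers
                # weights is not stated in the abstract; we rebuild here.)
                widths = shrink(widths)
                model = build_model(widths)
                opt = torch.optim.Adam(model.parameters())
        return model

In a convolutional setting the same loop would resize feature maps (channel counts) or delete whole blocks rather than MLP widths, matching the abstract's description more directly.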
License
Copyright (c) 2025 Adriano Puglisi, Flavia Monti, Christian Napoli, Massimo Mecella

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
License Terms:
Except where otherwise noted, content on this website is licensed under a Creative Commons Attribution-NonCommercial License (CC BY-NC).
Use, distribution, and reproduction in any medium are permitted, provided the original work is properly cited and is not used for commercial purposes.
Copyright to any article published by WiPiEC is retained by the author(s). Authors grant WiPiEC Journal a license to publish the article and to identify itself as the original publisher. Authors also grant any third party the right to use the article freely, as long as it is not used for commercial purposes and its original authors, citation details, and publisher are identified, in accordance with the CC BY-NC license.