Reliability-Aware Hyperparameter Optimization for ANN-to-SNN Conversion
DOI: https://doi.org/10.64552/wipiec.v11i1.85
Keywords: deep neural networks, spiking neural networks, reliability, edge applications, safety-critical applications
Abstract
Spiking Neural Networks (SNNs) have emerged as an energy-efficient alternative to Artificial Neural Networks (ANNs), particularly for edge-computing and safety-critical applications. Unlike conventional ANNs, SNNs leverage sparse, event-driven processing to reduce energy consumption while maintaining high computational efficiency. This paper presents a framework designed to optimize the conversion of ANNs into equivalent SNNs while balancing accuracy, reliability, and energy efficiency. The proposed framework systematically explores SNN hyperparameters to identify configurations that achieve superior performance compared to their ANN counterparts. Experimental evaluations on the MNIST and Fashion-MNIST datasets with different network topologies demonstrate that the optimized SNNs achieve comparable accuracy while offering, in some cases, 27.81× and 15.17× lower energy consumption and 1.92× and 1.84× smaller accuracy drops in the presence of faults, respectively, over the ANN baseline. The results highlight the applicability of SNNs in reliability-critical, power-constrained environments.
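The abstract describes the framework only at a high level. As a rough illustration of the kind of search it outlines, the sketch below converts a pre-trained two-layer ANN into a rate-coded integrate-and-fire SNN and grid-searches two hyperparameters (firing threshold and number of timesteps), scoring each configuration on clean accuracy, a spike-count energy proxy, and accuracy under randomly zeroed weights. This is not the paper's implementation: the IF neuron model, the fault model, the hyperparameter grid, and all function names (snn_forward, inject_faults, search) are illustrative assumptions.

# Minimal sketch of reliability-aware hyperparameter search for ANN-to-SNN
# conversion (illustrative only, not the paper's actual framework).
import numpy as np

def snn_forward(x, W1, W2, threshold, timesteps):
    """Rate-coded IF inference reusing ANN weights; returns (scores, spike count)."""
    v1 = np.zeros((x.shape[0], W1.shape[1]))      # hidden membrane potentials
    out_counts = np.zeros((x.shape[0], W2.shape[1]))
    spikes_total = 0
    for _ in range(timesteps):
        inp = (np.random.rand(*x.shape) < x).astype(float)  # Bernoulli rate coding of pixels in [0, 1]
        v1 += inp @ W1
        s1 = (v1 >= threshold).astype(float)      # hidden-layer spikes
        v1 -= s1 * threshold                      # soft reset: subtract threshold
        out_counts += s1 @ W2                     # accumulate evidence at the output layer
        spikes_total += s1.sum() + inp.sum()      # crude energy proxy: total spike events
    return out_counts, spikes_total

def inject_faults(W, rate, rng):
    """Crude fault model: zero out a random fraction of the weights."""
    return W * (rng.random(W.shape) >= rate)

def search(x, y, W1, W2, thresholds, timestep_grid,
           fault_rate=0.01, min_accuracy=0.9, seed=0):
    """Grid search balancing clean accuracy, energy proxy, and fault robustness."""
    rng = np.random.default_rng(seed)
    results = []
    for th in thresholds:
        for T in timestep_grid:
            scores, spikes = snn_forward(x, W1, W2, th, T)
            acc = (scores.argmax(1) == y).mean()
            W1f, W2f = inject_faults(W1, fault_rate, rng), inject_faults(W2, fault_rate, rng)
            scores_f, _ = snn_forward(x, W1f, W2f, th, T)
            acc_faulty = (scores_f.argmax(1) == y).mean()
            results.append({"threshold": th, "timesteps": T, "accuracy": acc,
                            "faulty_accuracy": acc_faulty, "spike_count": spikes})
    # Among configurations that meet a clean-accuracy floor, prefer the smallest
    # accuracy drop under faults, breaking ties by lower spike count (energy).
    candidates = [r for r in results if r["accuracy"] >= min_accuracy] or results
    return min(candidates, key=lambda r: (r["accuracy"] - r["faulty_accuracy"], r["spike_count"]))

A real framework would of course sweep more dimensions (neuron leak, weight/threshold scaling, encoding scheme) and use a hardware-calibrated fault and energy model rather than the spike count and weight-dropout stand-ins used here.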
License
Copyright (c) 2025 Saeed Sharifian, Mahdi Taheri, Vahid Rashtchi, Ali Azarpeyvand, Christian Herglotz, Maksim Jenihhin

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
License Terms:
Except where otherwise noted, content on this website is licensed under a Creative Commons Attribution-NonCommercial license (CC BY-NC).
Use, distribution, and reproduction in any medium are permitted, provided the original work is properly cited and is not used for commercial purposes.
Copyright to any article published by WiPiEC is retained by the author(s). Authors grant the WiPiEC Journal a license to publish the article and identify itself as the original publisher. Authors also grant any third party the right to use the article freely, as long as it is not used for commercial purposes and its original authors, citation details, and publisher are identified, in accordance with the CC BY-NC license.