IJML 2025 Vol.15(1): 23-28
DOI: 10.18178/ijml.2025.15.1.1174

A Binarized Feature Mapping Technique for Enhancing Squeeze-and-Excitation (SE) Channel Attention Mechanism

Wu Shaoqing1,* and Hiroyuki Yamauchi2
1 Fukuoka Institute of Technology/Graduate School, Fukuoka, Japan
2 Fukuoka Institute of Technology/Computer Science and Engineering, Fukuoka, Japan
Email: mfm22202@bene.fit.ac.jp (S.Q.W.); yamauchi@fit.ac.jp (H.Y.)
*Corresponding author

Manuscript received May 31, 2024; revised June 22, 2024; accepted July 17, 2024; published February 25, 2025

Abstract—Representing network weights with only 1 bit reduces the required memory footprint. Channel attention with the squeeze-and-excitation (SE) technique can eliminate redundant channels, further reducing the number of weights. However, combining the two leads to an unstable and slow learning curve. To address this issue, this paper presents the first attempt to accelerate the learning curve even with a 1-bit weight representation across the whole SEResNet14 network, significantly reducing the number of model parameters with only a minimal loss in accuracy. We also experimented with more aggressive activation functions such as HardTanh. We demonstrate that the proposed Feature Map Binarization (FMB) method reduces the number of active channels across different layers, thereby decreasing the number of weights in the channel direction. We also introduce the first use of EigenCAM for evaluating channel attention effects. Experimental results demonstrate the efficacy of the proposed technique in the SE module in terms of learning-curve speed-up and positional accuracy of the EigenCAM heat map, and reveal a clear difference in heat-map position between the cases with and without the proposed technique.
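The abstract describes the approach only at a high level. As a rough illustration, the minimal PyTorch sketch below shows a standard SE block with a hypothetical sign-based binarization of the channel feature maps applied before the squeeze step, and a HardTanh-style gate in place of the usual sigmoid. The binarization function, the straight-through gradient, and the gate range are assumptions for illustration and are not the paper's exact FMB formulation.

import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Binarize to {-1, +1} in the forward pass; pass gradients straight through (assumed scheme)."""

    @staticmethod
    def forward(ctx, x):
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output  # straight-through estimator


class SEBlockWithFeatureMapBinarization(nn.Module):
    """Illustrative SE block: feature maps are binarized before the squeeze (global pooling)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Hardtanh(0.0, 1.0),  # harder channel gate than sigmoid, as explored in the abstract
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        # Binarize feature maps so the channel descriptor is computed from {-1, +1} activations.
        x_bin = BinarizeSTE.apply(x)
        squeezed = x_bin.mean(dim=(2, 3))             # squeeze: (B, C) channel descriptor
        weights = self.fc(squeezed).view(b, c, 1, 1)  # excitation: per-channel gates
        return x * weights                            # reweight the original feature maps


if __name__ == "__main__":
    block = SEBlockWithFeatureMapBinarization(channels=64)
    out = block(torch.randn(2, 64, 32, 32))
    print(out.shape)  # torch.Size([2, 64, 32, 32])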

Keywords—EigenCAM, ResNet14, CIFAR-10, SVHN, SE attention mechanism, 1-bit quantization, model compression, activation functions, channel feature map binarization, ultra-compact AI deployment


Cite: Wu Shaoqing and Hiroyuki Yamauchi, "A Binarized Feature Mapping Technique for Enhancing Squeeze-and-Excitation (SE) Channel Attention Mechanism," International Journal of Machine Learning, vol. 15, no. 1, pp. 23-28, 2025.

Copyright © 2025 by the authors. This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).

General Information

  • E-ISSN: 2972-368X
  • Abbreviated Title: Int. J. Mach. Learn.
  • Frequency: Quarterly
  • DOI: 10.18178/IJML
  • Editor-in-Chief: Dr. Lin Huang
  • Executive Editor: Ms. Cherry L. Chen
  • Abstracting/Indexing: Inspec (IET), Google Scholar, Crossref, ProQuest, Electronic Journals Library, CNKI
  • E-mail: editor@ijml.org
  • APC: 500 USD

