TY - CONF
T1 - An Automated Approach for Improving the Inference Latency and Energy Efficiency of Pretrained CNNs by Removing Irrelevant Pixels with Focused Convolutions
AU - Tung, Caleb
AU - Eliopoulos, Nicholas
AU - Jajal, Purvish
AU - Ramshankar, Gowri
AU - Yang, Chen-Yun
AU - Synovic, Nicholas
AU - Zhang, Xuecen
AU - Chaudhary, Vipin
AU - Thiruvathukal, George K.
AU - Lu, Yung-Hsiang
N1 - Tung, Caleb; Eliopoulos, Nicholas; Jajal, Purvish; Ramshankar, Gowri; Yang, Chen-Yun; Synovic, Nicholas; Zhang, Xuecen; Chaudhary, Vipin; Thiruvathukal, George K.; Lu, Yung-Hsiang. "An automated approach for improving the inference latency and energy efficiency of pretrained CNNs by removing irrelevant pixels with focused convolutions." In Proceedings of the 29th Asia and South Pacific Design Automation Conference (ASP-DAC 2024). https://doi.org/10.6084/m9.figshare.25058516
PY - 2024/03/25
Y1 - 2024/03/25
N2 - Computer vision often uses highly accurate Convolutional Neural Networks (CNNs), but these deep learning models are associated with ever-increasing energy and computation requirements. Producing more energy-efficient CNNs often requires model training, which can be cost-prohibitive. We propose a novel, automated method to make a pretrained CNN more energy-efficient without re-training. Given a pretrained CNN, we insert a threshold layer that filters activations from the preceding layers to identify regions of the image that are irrelevant, i.e., that can be ignored by the following layers while maintaining accuracy. Our modified focused convolution operation reduces inference latency (by up to 25%) and energy costs (by up to 22%) on various popular pretrained CNNs, with little to no loss in accuracy.
AB - Computer vision often uses highly accurate Convolutional Neural Networks (CNNs), but these deep learning models are associated with ever-increasing energy and computation requirements. Producing more energy-efficient CNNs often requires model training, which can be cost-prohibitive. We propose a novel, automated method to make a pretrained CNN more energy-efficient without re-training. Given a pretrained CNN, we insert a threshold layer that filters activations from the preceding layers to identify regions of the image that are irrelevant, i.e., that can be ignored by the following layers while maintaining accuracy. Our modified focused convolution operation reduces inference latency (by up to 25%) and energy costs (by up to 22%) on various popular pretrained CNNs, with little to no loss in accuracy.
KW - artificial intelligence
KW - energy-efficient computing
KW - computer vision
UR - https://ecommons.luc.edu/cs_facpubs/361
U2 - 10.1109/ASP-DAC58780.2024.10473884
DO - 10.1109/ASP-DAC58780.2024.10473884
M3 - Conference Paper
T2 - Proceedings of the 29th Asia and South Pacific Design Automation Conference (ASP-DAC 2024)
JF - Computer Science: Faculty Publications and Other Works
ER -