Representative Literature
Probability-Based Channel Pruning for Depthwise Separable Convolutional Networks
Abstract:
Channel pruning can reduce memory consumption and running time with the least performance damage, and is one of the most important techniques in network compression. However, existing channel pruning methods mainly focus on the pruning of standard convolutional networks, and they rely intensively on time-consuming fine-tuning to achieve performance improvement. To this end, we present a novel, efficient probability-based channel pruning method for depthwise separable convolutional networks. Our method leverages a new, simple yet effective probability-based channel pruning criterion by taking the scaling and shifting factors of batch normalization layers into consideration. A novel shifting factor fusion technique is further developed to improve the performance of the pruned networks without requiring extra time-consuming fine-tuning. We apply the proposed method to five representative deep learning networks, namely MobileNetV1, MobileNetV2, ShuffleNetV1, ShuffleNetV2, and GhostNet, to demonstrate the efficiency of our pruning method. Extensive experimental results and comparisons on the publicly available CIFAR10, CIFAR100, and ImageNet datasets validate the feasibility of the proposed method.
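The abstract states that the pruning criterion builds a probability from the per-channel scaling (gamma) and shifting (beta) factors of batch normalization layers, and that a shifting factor fusion step removes the need for fine-tuning. The paper's exact formulation is not reproduced here, so the following PyTorch sketch is only one plausible illustration: it scores each channel by the probability that its post-BN activation is positive, Phi(beta / |gamma|), and marks the lowest-scoring channels for pruning. The function names and the fixed pruning ratio are hypothetical, not the authors' method.

```python
# Hedged sketch of a probability-based channel pruning criterion.
# Assumption: post-BN activations are roughly N(beta, gamma^2) per channel,
# so the probability a channel fires (output > 0) is Phi(beta / |gamma|).
# This illustrates how BN scaling and shifting factors can jointly rank
# channels; it is not the paper's exact criterion.
import torch
import torch.nn as nn

def channel_keep_probability(bn: nn.BatchNorm2d) -> torch.Tensor:
    """Score each channel by P(post-BN output > 0) = Phi(beta / |gamma|)."""
    gamma = bn.weight.detach()  # per-channel scaling factors
    beta = bn.bias.detach()     # per-channel shifting factors
    standard_normal = torch.distributions.Normal(0.0, 1.0)
    return standard_normal.cdf(beta / (gamma.abs() + 1e-12))

def select_channels_to_prune(bn: nn.BatchNorm2d, ratio: float = 0.25) -> torch.Tensor:
    """Return indices of the lowest-probability channels (hypothetical ratio rule)."""
    scores = channel_keep_probability(bn)
    num_prune = int(ratio * scores.numel())
    return torch.argsort(scores)[:num_prune]

if __name__ == "__main__":
    bn = nn.BatchNorm2d(32)
    nn.init.normal_(bn.weight, mean=0.5, std=0.2)  # toy gamma values
    nn.init.normal_(bn.bias, mean=0.0, std=0.5)    # toy beta values
    to_prune = select_channels_to_prune(bn)
    print(f"pruning {to_prune.numel()} of 32 channels: {to_prune.tolist()}")
```

The shifting factor fusion described in the abstract (compensating subsequent layers for the beta contributions of removed channels so that no extra fine-tuning is required) is not shown above; its implementation would depend on the structure of the following layer.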
Keywords:
CLC Number:
Authors:
Han-Li Zhao;Kai-Jie Shi;Xiao-Gang Jin;Ming-Liang Xu;Hui Huang;Wang-Long Lu;Ying Liu
Author Affiliations:
College of Computer Science and Artificial Intelligence, Wenzhou University, Wenzhou 325035, China; State Key Laboratory of CAD&CG, Zhejiang University, Hangzhou 310058, China; School of Information Engineering, Zhengzhou University, Zhengzhou 450000, China; Department of Computer Science, Memorial University of Newfoundland, St. John's, NL A1B 3X5, Canada
Source:
Journal of Computer Science and Technology, 2022(03): 584-600
Citation:
[1] Han-Li Zhao, Kai-Jie Shi, Xiao-Gang Jin, Ming-Liang Xu, Hui Huang, Wang-Long Lu, Ying Liu. Probability-Based Channel Pruning for Depthwise Separable Convolutional Networks[J]. Journal of Computer Science and Technology, 2022(03): 584-600.