Featured Article
SIGNGD with Error Feedback Meets Lazily Aggregated Technique: Communication-Efficient Algorithms for Distributed Learning
Abstract:
The proliferation of massive datasets has led to significant interest in distributed algorithms for solving large-scale machine learning problems. However, communication overhead is a major bottleneck that hampers the scalability of distributed machine learning systems. In this paper, we design two communication-efficient algorithms for distributed learning tasks. The first is named EF-SIGNGD, in which we use the 1-bit (sign-based) gradient quantization method to save communication bits. Moreover, the error feedback technique, i.e., incorporating the error made by the compression operator into the next step, is employed for the convergence guarantee. The second algorithm is called LE-SIGNGD, in which we introduce a well-designed lazy gradient aggregation rule to EF-SIGNGD that can detect gradients with small changes and reuse the outdated information. LE-SIGNGD saves communication costs both in transmitted bits and in communication rounds. Furthermore, we show that LE-SIGNGD is convergent under some mild assumptions. The effectiveness of the two proposed algorithms is demonstrated through experiments on both real and synthetic data.
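The two ideas in the abstract can be sketched concretely. Below is a minimal, hedged illustration of (a) a worker-side error-feedback sign-compression step and (b) a lazy-upload check that skips rounds when the gradient has barely changed. Function names (`ef_sign_step`, `lazy_upload_needed`), the mean-magnitude scaling of the sign vector, and the squared-norm skipping threshold are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def ef_sign_step(grad, error):
    """One worker-side step of error-feedback sign compression.

    The worker adds the residual left over from the previous round to the
    fresh gradient, transmits only the sign of each coordinate (1 bit each)
    scaled by one scalar (the mean magnitude), and stores what the
    compression lost so it can be fed back next round.
    """
    corrected = grad + error                 # incorporate past compression error
    scale = np.mean(np.abs(corrected))       # single scalar preserving magnitude
    compressed = scale * np.sign(corrected)  # the 1-bit-per-coordinate message
    new_error = corrected - compressed       # residual carried to the next round
    return compressed, new_error

def lazy_upload_needed(grad, last_sent_grad, threshold):
    """Lazy aggregation check: upload only if the gradient moved enough.

    If the squared change since the last transmitted gradient is below the
    threshold, the server simply reuses the outdated gradient, saving a
    whole communication round.
    """
    return float(np.sum((grad - last_sent_grad) ** 2)) > threshold
```

In this sketch, compression error is never discarded: `compressed + new_error` always reconstructs the corrected gradient exactly, which is the mechanism behind the convergence guarantee the abstract mentions.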
Keywords:
Authors:
Xiaoge Deng; Tao Sun; Feng Liu; Dongsheng Li
Affiliation:
National Laboratory for Parallel and Distributed Processing (PDL), College of Computer, National University of Defense Technology, Changsha 410073, China
Citation:
[1] Xiaoge Deng, Tao Sun, Feng Liu, Dongsheng Li. SIGNGD with Error Feedback Meets Lazily Aggregated Technique: Communication-Efficient Algorithms for Distributed Learning [J]. Tsinghua Science and Technology, 2022(01): 174-185.
Similar Literature
Toward High-Performance Delta-Based Iterative Processing with a Group-Based Approach
Hui Yu; Xin-Yu Jiang; Jin Zhao; Hao Qi; Yu Zhang; Xiao-Fei Liao; Hai-Kun Liu; Fu-Bing Mao; Hai Jin. National Engineering Research Center for Big Data Technology and System, Huazhong University of Science and Technology, Wuhan 430074, China; Service Computing Technology and System Laboratory, Huazhong University of Science and Technology, Wuhan 430074, China; Cluster and Grid Computing Laboratory, Huazhong University of Science and Technology, Wuhan 430074, China; School of Computer Science and Technology, Huazhong University of Science and Technology (HUST), Wuhan 430074, China