Typical Literature
ADC-DL:Communication-Efficient Distributed Learning with Hierarchical Clustering and Adaptive Dataset Condensation
Abstract:
The rapid growth of modern mobile devices leads to a large amount of distributed data, which is extremely valuable for learning models. Unfortunately, model training by collecting all these original data to a centralized cloud server is not applicable due to data privacy and communication cost concerns, hindering artificial intelligence from empowering mobile devices. Moreover, these data are not identically and independently distributed (non-IID) because of their different contexts, which will deteriorate the performance of the model. To address these issues, we propose a novel Distributed Learning algorithm based on hierarchical clustering and Adaptive Dataset Condensation, named ADC-DL, which learns a shared model by collecting the synthetic samples generated on each device. To tackle the heterogeneity of the data distribution, we propose an entropy-TOPSIS comprehensive tiering model for hierarchical clustering, which distinguishes clients in terms of their data characteristics. Subsequently, synthetic dummy samples are generated based on the hierarchical structure utilizing adaptive dataset condensation. The procedure of dataset condensation can be adjusted adaptively according to the tier of the client. Extensive experiments demonstrate that ADC-DL outperforms existing algorithms in both prediction accuracy and communication cost.
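The entropy-TOPSIS tiering mentioned in the abstract is a standard multi-criteria scoring recipe: the entropy weight method gives larger weights to criteria that vary more across clients, TOPSIS ranks each client by its closeness to an ideal solution, and the scores are then cut into tiers. Below is a minimal sketch of that recipe only, not the authors' implementation; the client criteria (sample count, label entropy, class coverage) are illustrative assumptions.

```python
# Minimal entropy-weight + TOPSIS tiering sketch (not the paper's code).
# The client criteria and numbers below are assumed for illustration only.
import numpy as np

def entropy_weights(X):
    """Entropy weight method: criteria with more dispersion get larger weights."""
    P = X / X.sum(axis=0, keepdims=True)                        # column-wise proportions
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(X))   # normalized entropy per criterion
    d = 1.0 - E                                                  # degree of diversification
    return d / d.sum()

def topsis_scores(X, w, benefit):
    """TOPSIS closeness coefficient in [0, 1]; higher means closer to the ideal client."""
    V = (X / np.linalg.norm(X, axis=0, keepdims=True)) * w       # weighted, vector-normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# One row per client: [num_samples, label_entropy, class_coverage] (assumed criteria)
X = np.array([[500, 2.1, 0.9],
              [120, 0.8, 0.3],
              [300, 1.5, 0.6],
              [800, 2.3, 1.0]], dtype=float)
w = entropy_weights(X)
scores = topsis_scores(X, w, benefit=np.array([True, True, True]))
tiers = np.digitize(scores, np.quantile(scores, [1/3, 2/3]))     # 0 = lowest tier, 2 = highest
print(w, scores, tiers)
```

The per-client condensation step can likewise be sketched with the generic gradient-matching idea: learn synthetic samples whose training gradients mimic those of the real local data, then upload only the synthetic samples. The paper's tier-dependent "adaptive" schedule (e.g. how many synthetic samples or optimization steps each tier gets) is not reproduced here; the model, batch sizes, and learning rate below are assumptions.

```python
# Hedged sketch of a gradient-matching condensation loop (generic technique,
# not the paper's adaptive variant); all sizes and the probe model are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))   # toy probe classifier
x_real = torch.randn(64, 1, 28, 28)                           # a client's real batch (stand-in)
y_real = torch.randint(0, 10, (64,))
x_syn = torch.randn(10, 1, 28, 28, requires_grad=True)        # learnable synthetic samples
y_syn = torch.arange(10)                                       # one synthetic sample per class

# Real-data gradients; fixed here because the probe model is not updated in this sketch.
g_real = torch.autograd.grad(F.cross_entropy(model(x_real), y_real),
                             tuple(model.parameters()))

for _ in range(100):
    g_syn = torch.autograd.grad(F.cross_entropy(model(x_syn), y_syn),
                                tuple(model.parameters()), create_graph=True)
    # Push the synthetic batch's gradients toward the real batch's gradients.
    match = sum(((gs - gr.detach()) ** 2).sum() for gs, gr in zip(g_syn, g_real))
    grad_x = torch.autograd.grad(match, x_syn)[0]
    with torch.no_grad():
        x_syn -= 0.1 * grad_x                                  # update synthetic samples only
# x_syn would then be shared with the server instead of the raw local data.
```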
Keywords:
Author Names:
Zhipeng Gao; Yan Yang; Chen Zhao; Zijia Mo
Author Affiliations:
State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China
Citation Format:
[1] Zhipeng Gao, Yan Yang, Chen Zhao, Zijia Mo. ADC-DL: Communication-Efficient Distributed Learning with Hierarchical Clustering and Adaptive Dataset Condensation[J]. China Communications, 2022(12): 73-85.
Class A:
empow,tiering
Class B:
ADC,DL,Communication,Efficient,Distributed,Learning,Hierarchical,Clustering,Adaptive,Dataset,Condensation,rapid,growth,modern,mobile,leads,large,number,distributed,which,extremely,valuable,learning,models,Unfortu,nately,training,by,collecting,these,origi,nal,centralized,cloud,server,not,applica,due,privacy,communication,costs,cerns,hindering,artificial,intelligence,from,devices,Moreover,identically,independently,Non,caused,their,different,context,will,deterio,performance,To,address,issues,we,propose,novel,hierarchical,clustering,named,learns,shared,synthetic,samples,generated,each,tackle,heterogene,ity,distribution,entropy,topsis,comprehensive,distinguishes,clients,terms,characteristics,Subsequently,dummy,structure,utilizing,dataset,condensation,procedure,can,adjusted,adaptively,cording,Extensive,experiments,demonstrate,that,our,more,outstanding,prediction,accuracy,compared,existing,algorithms
AB Value:
0.566249
Similar Literature
A Distributed Framework for Large-scale Protein-protein Interaction Data Analysis and Prediction Using MapReduce
Lun Hu - School of Computer Science and Technology, Dongguan University of Technology, Dongguan 523808, China; Xinjiang Technical Institute of Physics and Chemistry, Chinese Academy of Sciences, Urumqi 830000, China; School of Computer Science and Technology, Wuhan University of Technology, Wuhan 430070, China; Chongqing Engineering Research Center of Big Data Application for Smart Cities, and Chongqing Key Laboratory of Big Data and Intelligent Computing, Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, Chongqing 400714, China; Center of Research Excellence in Renewable Energy and Power Systems, and the Department of Electrical and Computer Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia; Department of Electrical and Computer Engineering, New Jersey Institute of Technology, Newark, NJ 07102, USA