
Communication Optimization Strategy for Distributed Deep Learning Based on Sample Importance




Distributed deep learning communication optimization strategy based on sample importance

MENG Yugong (Guangxi Codemaker Information Technology Co., Ltd., Nanning 530003, China)

Abstract: The computing nodes in distributed deep learning (DDL) need to frequently exchange gradient data with the server, which results in large communication overhead. In view of this, a DDL communication optimization strategy based on sample importance is proposed. It mainly includes three contents. The importance distribution of data samples is explored by confirmatory experiments. The importance of data samples is evaluated by cross-entropy loss. In combination with the network status awareness mechanism, and by taking the end-to-end network delay as the network status feedback indicator, the computing nodes dynamically adjust the compression ratios of the transmitted gradients, which reduces network traffic while ensuring model convergence, thereby improving the training efficiency of DDL. Experimental results show that the proposed method can improve communication efficiency effectively in distributed training scenarios of different scales. In comparison with the existing gradient compression strategies, the proposed method can reduce distributed training time by up to 40%.
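The abstract outlines three mechanisms: scoring samples by their cross-entropy loss, measuring end-to-end network delay, and using that delay to adjust a gradient compression ratio. Since the full text is truncated here, the following is only a minimal sketch of those three pieces; the function names, the delay-to-ratio rule, and all thresholds are illustrative assumptions, not the paper's actual formulas.

```python
import numpy as np

def sample_importance(probs, labels):
    """Per-sample cross-entropy loss used as an importance score
    (a higher loss marks a more informative sample)."""
    eps = 1e-12  # guard against log(0)
    return -np.log(probs[np.arange(len(labels)), labels] + eps)

def dynamic_compression_ratio(delay_ms, base_ratio=0.01,
                              min_ratio=0.001, max_ratio=0.1,
                              target_delay_ms=20.0):
    """Scale the fraction of gradient entries kept inversely with the
    measured end-to-end delay (hypothetical feedback rule): when the
    network is slow, send fewer entries; clip to a safe range."""
    ratio = base_ratio * target_delay_ms / max(delay_ms, 1e-6)
    return float(np.clip(ratio, min_ratio, max_ratio))

def topk_sparsify(grad, ratio):
    """Keep only the largest-magnitude gradient entries; a worker
    would transmit just these (index, value) pairs to the server."""
    flat = grad.ravel()
    k = max(1, int(ratio * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]
```

A worker would measure the round-trip delay of its last gradient push, feed it to `dynamic_compression_ratio`, and pass the result to `topk_sparsify` before the next exchange; the importance scores could additionally bias which samples enter each minibatch.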

Keywords: DDL; stochastic gradient descent; sample importance; cross-entropy; network status awareness; dynamic compression

0 Introduction

Deep neural networks (DNNs) are widely used to support a growing number of artificial intelligence applications, such as computer vision, natural language processing [2], and network optimization. [Remaining 7,952 characters of the full text are not included in this excerpt.]
