CN2Conv: A Strongly Robust CNN Design Method for IoT Devices


Keywords: convolutional neural network; cloud-assisted training; few-parameter model; robustness

CLC number: TP183   Document code: A   Article ID: 1001-3695(2025)07-031-2154-07

doi: 10.19734/j.issn.1001-3695.2024.12.0500

Abstract: The use of cloud-assisted training for CNNs with few parameters can enable their deployment on resource-constrained IoT devices. However, existing models with few parameters suffer from an insufficient ability to extract complex data features and poor robustness. This article proposed a CNN design method that was adaptable to complex data and had strong robustness, called combined non-linearity convolution kernel generation (CN2Conv). Firstly, it randomly selected some convolution kernels from the convolutional layers of the CNN model as seed convolution kernels, and used multiple generation functions to perform nonlinear transformations on the seed convolution kernels to obtain diverse generated convolution kernels. Secondly, the different generation functions used different hyperparameters to control the regularization effect of the model and improve its robustness. Finally, it used the feature maps generated by the convolution kernels to perform channel shuffling and convolutional dimensionality reduction operations, while using group normalization techniques to improve the distribution consistency of features and enhance the ability to capture complex data features. In order to verify the effectiveness of CN2Conv, this paper carried out several experiments on the CIFAR-10, CIFAR-100, CIFAR-10-C and Icons-50 datasets. On the CIFAR-10-C dataset, the accuracy of ResNet34 using CN2Conv is 8.22% higher than the standard ResNet34, and 11.86% higher than MonoCNN. The results show that the accuracy of the CNN model based on CN2Conv is better than the comparison methods on multiple datasets, and the robustness is significantly improved.

Key words: convolutional neural network (CNN); cloud-assisted training; few-parameter model; robustness
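To make the mechanism described in the abstract more concrete, the following PyTorch sketch shows one possible reading of the CN2Conv idea: a small set of trainable seed kernels is expanded through fixed nonlinear generation functions with different hyperparameters, the resulting feature maps are channel-shuffled, reduced by a 1x1 convolution, and group-normalized. This is only a minimal sketch under stated assumptions, not the authors' implementation: the module name CN2ConvBlock, the tanh-based generation functions, and all hyperparameter values (n_seeds, alphas, groups) are illustrative, and the seeds here are a standalone parameter tensor rather than kernels sampled from an existing convolutional layer.

# Minimal sketch of the CN2Conv idea, assuming tanh-based generation
# functions and illustrative hyperparameters; not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CN2ConvBlock(nn.Module):
    """Expand a few trainable seed kernels with fixed nonlinear functions,
    then channel-shuffle, reduce with a 1x1 convolution, and group-normalize."""

    def __init__(self, in_ch, out_ch, k=3, n_seeds=8, groups=4, alphas=(0.5, 1.0, 2.0)):
        super().__init__()
        gen_ch = n_seeds * (1 + len(alphas))      # seed kernels + generated kernels
        assert gen_ch % groups == 0 and out_ch % groups == 0
        # Trainable seed kernels of shape (n_seeds, in_ch, k, k).
        self.seed = nn.Parameter(torch.randn(n_seeds, in_ch, k, k) * 0.1)
        # Hyperparameters of the non-trainable generation functions; different
        # values act as different regularization strengths (an assumption here).
        self.alphas = alphas
        self.groups = groups
        self.reduce = nn.Conv2d(gen_ch, out_ch, 1)   # 1x1 dimensionality reduction
        self.norm = nn.GroupNorm(groups, out_ch)     # group normalization

    def _generated_kernels(self):
        # Apply element-wise nonlinear transforms to the seed kernels.
        kernels = [self.seed]
        for a in self.alphas:
            kernels.append(torch.tanh(a * self.seed))
        return torch.cat(kernels, dim=0)             # (gen_ch, in_ch, k, k)

    def forward(self, x):
        w = self._generated_kernels()
        y = F.conv2d(x, w, padding=w.shape[-1] // 2) # feature maps from all kernels
        # Channel shuffle: interleave channels across groups.
        b, c, h, wdt = y.shape
        y = y.view(b, self.groups, c // self.groups, h, wdt).transpose(1, 2).reshape(b, c, h, wdt)
        return F.relu(self.norm(self.reduce(y)))

In a setup like the abstract's ResNet34 comparison, a block of this kind would presumably replace standard convolutions inside the residual blocks, so that only the seed kernels and the 1x1 reduction carry trainable convolutional weights.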

0 Introduction

According to IDC's latest report, Worldwide Global DataSphere IoT Device and Data Forecast, 2023–2027 [1], as IoT technology continues to advance and see widespread adoption, the number of IoT devices worldwide is expected to surge from 16.7 billion in 2023 to 29 billion in 2027, reaching 19 billion in 2024.
