MATLAB Modeling Experiment on Fine-Grained Forecasting of the Masson Pine Caterpillar (Dendrolimus punctatus) Based on Neural Networks

Zhang Guoqing (Forestry Bureau of Qianshan County, Anhui Province)
1. Data sources
Occurrence-amount and occurrence-period data for the Masson pine caterpillar come from the monitoring records of Qianshan County; meteorological data come from the National Climate Center.

2. Data preprocessing
To preserve the temporal integrity of caterpillar development, the overwintering-generation data were merged with the previous year's second-generation data during preprocessing. In this way each Masson pine caterpillar generation is kept complete in time, which makes modeling and forecasting more convenient.

(1) Meteorological data. Based on academic references such as 松毛虫综合管理 (Integrated Management of Pine Caterpillars) and 中国松毛虫 (Pine Caterpillars in China), together with recent papers on monitoring and forecasting of the Masson pine caterpillar, sixteen meteorological factors with some correlation to occurrence amount and occurrence period were selected: minimum temperature, mean temperature, accumulated temperature (degree-days) and rainfall for the egg stage, for the 1st and 2nd instar stage, for the larval stage, and for the whole generation. The raw data from the National Climate Center were converted, year by year and generation by generation, into these sixteen variable series.

(2) Occurrence-amount data. So that outbreak intensity could be analysed during modeling, the raw monitoring data of Qianshan County for 1983–2014 were summarized by generation and by year into three intensity classes: light, moderate and heavy.

(3) Occurrence-period data. The raw occurrence-period monitoring data of Qianshan County for 1983–2014 were first summarized by generation and by year, and the dates were then converted to day-of-year values so that they could be used quantitatively in modeling.
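The preprocessing code itself is not listed in the paper. A minimal MATLAB sketch of the two conversions described above (calendar dates to day-of-year values, and daily mean temperatures to accumulated temperature in degree-days) might look like the following; the variable names, the development threshold T0 and all numbers are illustrative placeholders rather than values from the study.

% Sketch only: convert observed stage dates to day-of-year and accumulate
% degree-days over a stage. obsDates, dailyMeanT and T0 are illustrative.
obsDates   = datetime({'2014-05-12','2014-05-20'});  % example stage dates
dayOfYear  = day(obsDates,'dayofyear');              % quantified dates for modeling

dailyMeanT = [14.2 15.8 17.1 16.4 18.9];             % example daily mean temperatures (degC)
T0         = 8;                                      % assumed development threshold (degC)
degreeDays = sum(max(dailyMeanT - T0, 0));           % accumulated temperature (degree-days)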

3. Selection of factor variables
Based on correlation analysis and comparative modeling trials, the factor variables were chosen as follows. For first-generation occurrence amount: 1st and 2nd instar minimum temperature, egg-stage minimum temperature, control efficacy of the previous generation, and control area of the previous generation. For second-generation occurrence amount: 1st and 2nd instar minimum temperature, egg-stage minimum temperature, control efficacy of the previous generation, control area of the previous generation, 1st and 2nd instar rainfall, and egg-stage rainfall. For the first-generation larval peak period: 1st and 2nd instar mean temperature, 1st and 2nd instar accumulated temperature (degree-days), 1st and 2nd instar minimum temperature, and egg-stage minimum temperature. For the second-generation larval peak period: first appearance date of adults, egg-stage mean temperature, egg-stage accumulated temperature (degree-days), and 1st and 2nd instar minimum temperature.

The first-generation occurrence-amount target variable is named s1y and its input variables s1x; the second-generation occurrence-amount target is s2y with inputs s2x; the first-generation larval-peak target is t1y with inputs t1x; and the second-generation larval-peak target is t2y with inputs t2x.
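The layout of these workspace variables is not shown in the paper, but the generated scripts transpose them (x = s1x'), which implies one row per generation-year sample and one column per factor, with the target in a matching column vector. A hypothetical example, with placeholder values used only to illustrate the expected shapes:

% Hypothetical layout for the first-generation occurrence-amount data set.
% Rows are generation-year samples; columns follow the factor list above
% (instar 1-2 minimum temperature, egg-stage minimum temperature,
% previous-generation control efficacy, previous-generation control area).
% All numbers are placeholders, not monitoring data.
s1x = [ -1.5  2.3  85  120
        -0.8  3.1  90  150
        -2.2  1.7  80  100 ];
s1y = [ 35
        12
        60 ];       % first-generation occurrence amount for each sample

x = s1x';           % the Neural Fitting scripts expect one column per sample
t = s1y';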

4. Modeling experiment for first-generation occurrence amount
4.1 Program code
The simple script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 19:28:48 CST 2015
%
% This script assumes these variables are defined:
%   s1x - input data.
%   s1y - target data.
x = s1x';
t = s1y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. nftool falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

The advanced script is:

% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 19:29:03 CST 2015
%
% This script assumes these variables are defined:
%   s1x - input data.
%   s1y - target data.
x = s1x';
t = s1y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. nftool falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize,trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean Squared Error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression', 'plotfit'};

% Train the Network
[net,tr] = train(net,x,t);

% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, plotfit(net,x,t)
%figure, plotregression(t,y)
%figure, ploterrhist(e)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
    % Generate MATLAB function for neural network for application deployment
    % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
    % to examine the calculations your trained neural network performs.
    genFunction(net,'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
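The generated scripts end with training diagnostics and optional deployment blocks. To actually issue a forecast, the trained network only needs to be evaluated on a new factor vector. A minimal sketch, assuming the script above has been run so that net is still in the workspace; newFactors is a hypothetical name, and its values and column order are placeholders rather than data from the study:

% Sketch: one-step forecast with the trained first-generation model.
% newFactors holds the four first-generation factors for the coming
% generation-year, one column per sample (hypothetical values).
newFactors = [-1.2; 2.5; 88; 130];
predictedOccurrence = net(newFactors)   % forecast of first-generation occurrence amount

% Equivalent, after enabling the deployment block that calls
% genFunction(net,'myNeuralNetworkFunction'):
% predictedOccurrence = myNeuralNetworkFunction(newFactors);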

4.2 Network training process
The network training process is shown in Figure 1.
Figure 1. Network training process for first-generation occurrence amount

4.3 Training results
The training results are shown in Figure 2.
Figure 2. Network training results for first-generation occurrence amount
The R values of the training, validation and test samples are 0.875337, -1 and 1, respectively. The error histogram is shown in Figure 3.
Figure 3. Error histogram of the network training results for first-generation occurrence amount
The regression plots for the training, validation and test samples and for all data are shown in Figure 4.
Figure 4. Regression plots of the network training results for first-generation occurrence amount
The R values of the validation and test samples are both 1.
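The R values quoted above are read from the regression plots drawn by the Neural Fitting app. They can also be recomputed in the workspace after running the script in 4.1; a short sketch, assuming net, tr, x and t are still defined:

% Sketch: recompute the per-subset correlation coefficients (R) after training.
y = net(x);                                           % network outputs for all samples
rTrain = regression(t(tr.trainInd), y(tr.trainInd));  % training-sample R
rVal   = regression(t(tr.valInd),   y(tr.valInd));    % validation-sample R
rTest  = regression(t(tr.testInd),  y(tr.testInd));   % test-sample R
rAll   = regression(t, y);                            % R over all data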

5. Modeling experiment for second-generation occurrence amount
5.1 Program code
The simple script (created Wed Oct 28 20:04:18 CST 2015) and the advanced script (created Wed Oct 28 20:04:31 CST 2015) generated by the Neural Fitting app are identical to those listed in Section 4.1, except that the input and target variables are s2x and s2y:

x = s2x';
t = s2y';

5.2 Network training process
The network training process is shown in Figure 5.
Figure 5. Network training process for second-generation occurrence amount

5.3 Training results
The training results are shown in Figure 6.
Figure 6. Network training results for second-generation occurrence amount
The R values of the training, validation and test samples are 0.942388, 0.999999 and 1, respectively. The error histogram is shown in Figure 7.
Figure 7. Error histogram of the network training results for second-generation occurrence amount
The regression plots for the training, validation and test samples and for all data are shown in Figure 8.
Figure 8. Regression plots of the network training results for second-generation occurrence amount
The R values of the validation and test samples are both 1, the training-sample R is 0.94239, and the all-data R is 0.89479.

6. Modeling experiment for the first-generation larval peak period
6.1 Program code
The simple script (created Wed Oct 28 20:16:32 CST 2015) and the advanced script (created Wed Oct 28 20:17:08 CST 2015) are likewise identical to those in Section 4.1, with t1x and t1y as the input and target variables:

x = t1x';
t = t1y';

6.2 Network training process
The network training process is shown in Figure 9.
Figure 9. Network training process for the first-generation larval peak period

6.3 Training results
The training results are shown in Figure 10.
Figure 10. Network training results for the first-generation larval peak period
The R values of the training, validation and test samples are 0.875337, -1 and 1, respectively. The error histogram is shown in Figure 11.
Figure 11. Error histogram of the network training results for the first-generation larval peak period
The regression plots for the training, validation and test samples and for all data are shown in Figure 12.
Figure 12. Regression plots of the network training results for the first-generation larval peak period
The R values of the validation and test samples are both 1.

7. Modeling experiment for the second-generation larval peak period
7.1 Program code
The simple script (created Wed Oct 28 20:22:04 CST 2015) is identical to the simple script in Section 4.1, with t2x and t2y as the input and target variables:

x = t2x';
t = t2y';

Only the opening lines of the advanced script (created Wed Oct 28 20:22:29 CST 2015) are present in the available text; as far as they go, they match the advanced script in Section 4.1 with the same variable substitution.
