Research on Optimizing BP Neural Networks and Genetic Algorithms Based on Numerical Computation Methods
Classification No.: ——    Security level: ——
UDC: ——    Serial No.: ——
YUNNAN NORMAL UNIVERSITY
Master's Degree Thesis
Title: Research on Optimizing BP Neural Networks and Genetic Algorithms Based on Numerical Computation Methods
School: ——
Major: ——
Research direction: ——
Candidate: ——    Student No.: ——
Supervisor: ——    Professional title: ——
June 2006
Abstract
Artificial neural networks and genetic algorithms are both achievements of bionics that apply biological principles to computer science. Because of their strong problem-solving ability, they have attracted the interest and participation of many researchers in recent years and have become one of the popular interdisciplinary topics in academia.
In practical applications of artificial neural networks, roughly 90% of the models in use are BP networks or variants of them, and the BP network is also the core of feedforward networks. BP networks are widely applied to function approximation, pattern recognition, classification, data compression, and so on, and have now become one of the important areas of artificial intelligence research. However, because the BP algorithm is a gradient-descent search method, it inevitably has inherent shortcomings: it converges slowly, it easily falls into local minima of the error function, and for large search spaces and multimodal or non-differentiable objective functions it cannot effectively find the global minimum.
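For concreteness, here is a minimal sketch of the plain gradient-descent weight update that the BP algorithm performs on a one-hidden-layer network with sigmoid units. The network shape, learning rate, and toy usage are illustrative assumptions, not details taken from the thesis.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bp_step(W1, W2, x, t, lr=0.1):
    """One gradient-descent (BP) step for a 1-hidden-layer network with MSE loss."""
    h = sigmoid(W1 @ x)             # hidden activations
    y = sigmoid(W2 @ h)             # network output
    e = y - t                       # output error
    d2 = e * y * (1 - y)            # output-layer delta
    d1 = (W2.T @ d2) * h * (1 - h)  # error back-propagated to the hidden layer
    W2 -= lr * np.outer(d2, h)      # steepest-descent weight updates
    W1 -= lr * np.outer(d1, x)
    return W1, W2, 0.5 * float(e @ e)

# Toy usage: W1 maps 2 inputs to 4 hidden units, W2 maps them to 1 output.
# rng = np.random.default_rng(0)
# W1, W2 = rng.normal(size=(4, 2)), rng.normal(size=(1, 4))
# W1, W2, err = bp_step(W1, W2, rng.normal(size=2), np.array([1.0]))

Every such step follows the negative gradient of the error, which is exactly why convergence slows in flat regions and stalls at local minima of the error surface.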
As an intelligent global search algorithm, the genetic algorithm has shown its particular appeal since its appearance in the 1980s in numerical optimization, system control, structural design optimization, and many other fields, but it has also exposed a number of shortcomings. It relies entirely on probabilistic random operations; although this avoids getting trapped in local minima, under practical search conditions it generally yields only an approximate global optimum and rarely the exact one. Binary encoding of the parameters artificially discretizes a continuous space, creating a conflict between computational precision, string length, and computational cost. As a stochastic optimization technique it consumes a great deal of time. The crossover and mutation operators are highly random, which lowers search efficiency; in particular, during the evolutionary iterations the best individual of a generation can be worse than the best individual of its parent generation, a "degradation" phenomenon. Finally, although the genetic algorithm has strong global search ability, its local search ability is weak and it is prone to premature convergence.
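To make the precision/string-length conflict concrete: if a parameter x \in [a, b] is encoded as a binary string of L bits, the achievable resolution is the textbook relation (not a formula quoted from the thesis)

\[
\Delta x = \frac{b-a}{2^{L}-1},
\]

so halving \Delta x costs roughly one extra bit per parameter; higher precision therefore lengthens the chromosome and increases the work of every crossover, mutation, and evaluation.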
The main work of this thesis:
(1) The defects of the BP neural network are analyzed. To address its slow convergence, the unipolar sigmoid transfer function and the bipolar sigmoid function of the classical BP network are analyzed mathematically, and their different mathematical properties and the criteria for choosing between them are given (the two standard forms are recalled in a sketch after this list).
(2) Numerical optimization methods are used to improve the BP neural network and raise its convergence speed. The BP learning algorithm is improved with the quasi-Newton method, the optimal step-length method, and the conjugate gradient method, respectively; the improvements are analyzed and compared, the network scale to which each is suited is given, and their convergence is analyzed and proved (a conjugate-gradient training sketch appears after this list).
(3) Numerical optimization methods are applied to analyze the crossover operator, mutation operator, encoding scheme, and fitness function of the genetic algorithm, and crossover and mutation operators are given that are based on the optimal strategy for one-dimensional minimization (the Fibonacci method) and on the near-optimal strategy (the golden-section method); the golden-section procedure is sketched after this list.
(4) The genetic algorithm, which excels at global search, is combined with the BP algorithm, which has strong local optimization ability. The crossover, mutation, and selection operators of the GA search the whole variable space for the global solution with high probability, while near a solution the BP network converges quickly and accurately. By fusing these advantages and combining the two organically, and by using the genetic algorithm to train the network weights and topology simultaneously, the method avoids local minima, improves convergence speed, and quickly reaches the global optimum of the problem (an outline of such a hybrid loop is sketched after this list).
(5) The improved BP neural network algorithm, the improved genetic algorithm, and their fusion are verified through experiments.
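For item (1), the two transfer functions usually meant by "unipolar" and "bipolar" sigmoid are the standard textbook forms below (stated here as an assumption about the thesis's notation):

\[
f_1(x) = \frac{1}{1+e^{-x}}, \qquad f_1'(x) = f_1(x)\bigl(1-f_1(x)\bigr), \qquad f_1(x) \in (0,1),
\]
\[
f_2(x) = \frac{1-e^{-x}}{1+e^{-x}} = \tanh\!\Bigl(\frac{x}{2}\Bigr), \qquad f_2'(x) = \frac{1-f_2(x)^2}{2}, \qquad f_2(x) \in (-1,1).
\]

Since f_2 = 2 f_1 - 1, the bipolar form produces zero-centered outputs, which is the usual reason it is preferred when the training data are roughly symmetric about zero.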
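For item (2), here is a minimal sketch of how a conjugate-gradient direction update can replace the plain gradient step in BP training, assuming Fletcher-Reeves coefficients and a crude backtracking search in place of the thesis's optimal step-length method. The callables grad(w) and loss(w), returning the gradient and the error of the network for a flattened NumPy weight vector w, are hypothetical and must be supplied by the caller.

def train_cg(w, grad, loss, iters=100, step0=1.0):
    """Fletcher-Reeves conjugate-gradient training of a flattened weight vector w."""
    g = grad(w)
    d = -g                                # first direction: steepest descent
    for _ in range(iters):
        f0, step = loss(w), step0
        # backtracking line search standing in for an exact step-length search
        while loss(w + step * d) > f0 and step > 1e-8:
            step *= 0.5
        w = w + step * d
        g_new = grad(w)
        if g_new @ g_new < 1e-12:         # gradient has vanished: stop
            break
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d             # conjugate direction update
        g = g_new
    return w

Reusing the previous direction through beta is what distinguishes the method from plain BP and is the source of its faster convergence on well-conditioned problems.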
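For item (3), golden-section search is the near-optimal one-dimensional minimization strategy named above. The sketch below shows only the standard procedure and does not reproduce how the thesis embeds it in its crossover and mutation operators; the unimodal function f and the interval [a, b] are assumptions supplied by the caller.

import math

def golden_section_min(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    r = (math.sqrt(5) - 1) / 2             # 1/phi, about 0.618
    x1, x2 = b - r * (b - a), a + r * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                        # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - r * (b - a)
            f1 = f(x1)
        else:                              # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + r * (b - a)
            f2 = f(x2)
    return (a + b) / 2

The Fibonacci method follows the same interval-reduction scheme but uses ratios of Fibonacci numbers with a fixed number of function evaluations, which makes it the optimal strategy for that evaluation budget.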
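For item (4), here is an outline, under stated assumptions, of one way such a hybrid can be organized: a GA explores flattened weight vectors globally, then gradient (BP-style) descent refines the best candidate. The population size, operators, rates, and the callables loss(w) and grad(w) are illustrative, and the topology evolution mentioned in the abstract is not shown.

import numpy as np

def ga_bp_hybrid(loss, grad, dim, pop=30, gens=50, bp_iters=200, lr=0.05):
    """GA global search over weight vectors followed by gradient refinement."""
    rng = np.random.default_rng(0)
    P = rng.normal(size=(pop, dim))                    # initial population
    for _ in range(gens):
        fit = np.array([loss(w) for w in P])
        parents = P[np.argsort(fit)[:pop // 2]]        # truncation selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            c = np.where(rng.random(dim) < 0.5, a, b)  # uniform crossover
            mutate = rng.random(dim) < 0.05            # sparse Gaussian mutation
            c = c + mutate * rng.normal(scale=0.1, size=dim)
            children.append(c)
        P = np.vstack([parents] + children)
    w = P[np.argmin([loss(w) for w in P])]             # best individual found by the GA
    for _ in range(bp_iters):                          # local BP-style refinement
        w = w - lr * grad(w)
    return w

The division of labor mirrors the abstract: the GA supplies a good basin of attraction, and the gradient phase supplies the fast, precise local convergence.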
Keywords:
Neural network, genetic algorithm, convergence, conjugate gradient method, golden-section method, Fibonacci method, generalization ability
Abstract
Artificial neural networks and genetic algorithms are both achievements of bionics that apply biological principles to computer science. Because of their extremely strong problem-solving ability, they have attracted the interest and participation of numerous scholars in recent years and have become one of the hot interdisciplinary topics in academia.

In practical applications of artificial neural networks, about 90% of the models adopt the BP network or one of its variants, and the BP network is also a key part of feedforward networks. BP networks are widely applied to function approximation, pattern recognition, classification, data compression, and so on, and have become one of the important fields of artificial intelligence research. However, because the BP algorithm is a gradient-descent search method, it has unavoidable inherent deficiencies: it converges slowly, it easily falls into local minima of the error function, and for large search spaces and multimodal or non-differentiable functions it cannot effectively find the global minimum.

As an intelligent global search algorithm, the genetic algorithm has shown its particular appeal since the 1980s in numerical optimization, system control, structural design optimization, and many other fields, while also exposing a number of deficiencies. It relies entirely on probabilistic random operations; although this avoids local minima, under practical search conditions it generally obtains only an approximate global optimum and rarely the exact optimum. Binary encoding of the parameters artificially discretizes the continuous space, causing a conflict between computational precision, string length, and computational cost. As a stochastic optimization technique it consumes a great deal of time. The crossover and mutation operators are highly random, which lowers search efficiency; during the evolutionary iterations the best offspring individual can be worse than the best parent individual, a "degradation" phenomenon. Although the genetic algorithm has strong global search ability, its local search ability is weak and it is prone to premature convergence.

The main work of this thesis:

(1) The defects of the BP neural network are analyzed. To address its slow convergence, a mathematical analysis of the unipolar sigmoid transfer function and the bipolar sigmoid function of the classical BP network is carried out, and their different mathematical properties and the way to choose between them are given.

(2) The BP neural network is improved with numerical optimization methods to raise its convergence speed. The BP learning algorithm is improved with the quasi-Newton method, the optimal step-length method, and the conjugate gradient method, respectively; the improvements are analyzed and compared, the network scale to which each is suited is given, and their convergence is analyzed and proved.

(3) The crossover operator, mutation operator, encoding scheme, and fitness function of the genetic algorithm are analyzed with numerical optimization methods, and crossover and mutation operators based on the optimal strategy for one-dimensional minimization (the Fibonacci method) and on the near-optimal strategy (the golden-section method) are given.

(4) The genetic algorithm, which is good at global search, is combined with the BP algorithm, which has strong local optimization ability. The crossover, mutation, and selection operators of the GA search the whole variable space for the global solution with high probability, while near a solution the BP network converges quickly and accurately. By merging these advantages and combining the two organically, and by using the genetic algorithm to train the network weights and topology simultaneously, the method avoids local minima, improves convergence speed, and quickly obtains the global optimum of the problem.

(5) The improved BP network algorithm, the improved genetic algorithm, and their fusion are verified through experiments.
Keywords:
Neural network, Genetic algorithm, Convergence property, Fibonacci method, Conjugate gradient method, Golden-section method, Generalization ability