MATLAB Neural Networks: An Engineering Example (Detailed)
This post walks through an example of applying a neural network algorithm to the optimization of a mechanical structure.
(To reuse it, you only need to replace the input and output variables with your own data. If you still have questions after reading, you can leave me a message on my Weibo account "極南師兄" and we can improve together.)
Eight dimensional parameters of a structure are taken as the design variables, as shown in the figure above,
and the corresponding mass, temperature difference, and area are the outputs. A neural network is used to fit a mathematical model between the variables and the outputs. First we need a data source: here a central composite design is used to construct the sample points, and by its rules the eight variables yield 81 design points. The 81 cases are then simulated in ANSYS
Workbench (build the model in Pro/E with the dimensions defined as parameters, import it into Workbench, and set it up accordingly; the 81 simulations then run automatically, and the results are exported to an Excel file).
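The 81-point count can be checked from the structure of a central composite design with a fractional factorial core; a quick arithmetic sketch (the function name is mine, not from the original):

```python
# Number of runs in a central composite design (CCD):
# a 2^(k-p) (possibly fractional) factorial core, plus 2k axial
# points, plus 1 center point.
def ccd_runs(k, p=0):
    return 2 ** (k - p) + 2 * k + 1

# For 8 variables with a quarter-fraction core this reproduces the
# 81 design points used here: 64 + 16 + 1 = 81.
print(ccd_runs(8, p=2))  # 81
```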
The MATLAB program is as follows:
P=[20 2.5 6 14.9 16.5 6 14.9 16.5
15 2.5 6 14.9 16.5 6 14.9 16.5
25 2.5 6 14.9 16.5 6 14.9 16.5
20 1 6 14.9 16.5 6 14.9 16.5
20 4 6 14.9 16.5 6 14.9 16.5
20 2.5 2 14.9 16.5 6 14.9 16.5
20 2.5 10 14.9 16.5 6 14.9 16.5
20 2.5 6 10 16.5 6 14.9 16.5
20 2.5 6 19.8 16.5 6 14.9 16.5
20 2.5 6 14.9 10 6 14.9 16.5
20 2.5 6 14.9 23 6 14.9 16.5
20 2.5 6 14.9 16.5 2 14.9 16.5
20 2.5 6 14.9 16.5 10 14.9 16.5
20 2.5 6 14.9 16.5 6 10 16.5
20 2.5 6 14.9 16.5 6 19.8 16.5
20 2.5 6 14.9 16.5 6 14.9 10
20 2.5 6 14.9 16.5 6 14.9 23
17.51238947 1.75371684 4.009911573 12.46214168 13.26610631 4.009911573 12.46214168 19.73389369
22.48761053 1.75371684 4.009911573 12.46214168 13.26610631 4.009911573 12.46214168 13.26610631
17.51238947 3.24628316 4.009911573 12.46214168 13.26610631 4.009911573 17.33785832 19.73389369
22.48761053 3.24628316 4.009911573 12.46214168 13.26610631 4.009911573 17.33785832 13.26610631
17.51238947 1.75371684 7.990088427 12.46214168 13.26610631 4.009911573 17.33785832 19.73389369
22.48761053 1.75371684 7.990088427 12.46214168 13.26610631 4.009911573 17.33785832 13.26610631
17.51238947 3.24628316 7.990088427 12.46214168 13.26610631 4.009911573 12.46214168 19.73389369
22.48761053 3.24628316 7.990088427 12.46214168 13.26610631 4.009911573 12.46214168 13.26610631
17.51238947 1.75371684 4.009911573 17.33785832 13.26610631 4.009911573 17.33785832 13.26610631
22.48761053 1.75371684 4.009911573 17.33785832 13.26610631 4.009911573 17.33785832 19.73389369
17.51238947 3.24628316 4.009911573 17.33785832 13.26610631 4.009911573 12.46214168 13.26610631
22.48761053 3.24628316 4.009911573 17.33785832 13.26610631 4.009911573 12.46214168 19.73389369
17.51238947 1.75371684 7.990088427 17.33785832 13.26610631 4.009911573 12.46214168 13.26610631
22.48761053 1.75371684 7.990088427 17.33785832 13.26610631 4.009911573 12.46214168 19.73389369
17.51238947 3.24628316 7.990088427 17.33785832 13.26610631 4.009911573 17.33785832 13.26610631
22.48761053 3.24628316 7.990088427 17.33785832 13.26610631 4.009911573 17.33785832 19.73389369
17.51238947 1.75371684 4.009911573 12.46214168 19.73389369 4.009911573 17.33785832 13.26610631
22.48761053 1.75371684 4.009911573 12.46214168 19.73389369 4.009911573 17.33785832 19.73389369
17.51238947 3.24628316 4.009911573 12.46214168 19.73389369 4.009911573 12.46214168 13.26610631
22.48761053 3.24628316 4.009911573 12.46214168 19.73389369 4.009911573 12.46214168 19.73389369
17.51238947 1.75371684 7.990088427 12.46214168 19.73389369 4.009911573 12.46214168 13.26610631
22.48761053 1.75371684 7.990088427 12.46214168 19.73389369 4.009911573 12.46214168 19.73389369
17.51238947 3.24628316 7.990088427 12.46214168 19.73389369 4.009911573 17.33785832 13.26610631
22.48761053 3.24628316 7.990088427 12.46214168 19.73389369 4.009911573 17.33785832 19.73389369
17.51238947 1.75371684 4.009911573 17.33785832 19.73389369 4.009911573 12.46214168 19.73389369
22.48761053 1.75371684 4.009911573 17.33785832 19.73389369 4.009911573 12.46214168 13.26610631
17.51238947 3.24628316 4.009911573 17.33785832 19.73389369 4.009911573 17.33785832 19.73389369
22.48761053 3.24628316 4.009911573 17.33785832 19.73389369 4.009911573 17.33785832 13.26610631
17.51238947 1.75371684 7.990088427 17.33785832 19.73389369 4.009911573 17.33785832 19.73389369
22.48761053 1.75371684 7.990088427 17.33785832 19.73389369 4.009911573 17.33785832 13.26610631
17.51238947 3.24628316 7.990088427 17.33785832 19.73389369 4.009911573 12.46214168 19.73389369
22.48761053 3.24628316 7.990088427 17.33785832 19.73389369 4.009911573 12.46214168 13.26610631
17.51238947 1.75371684 4.009911573 12.46214168 13.26610631 7.990088427 17.33785832 13.26610631
22.48761053 1.75371684 4.009911573 12.46214168 13.26610631 7.990088427 17.33785832 19.73389369
17.51238947 3.24628316 4.009911573 12.46214168 13.26610631 7.990088427 12.46214168 13.26610631
22.48761053 3.24628316 4.009911573 12.46214168 13.26610631 7.990088427 12.46214168 19.73389369
17.51238947 1.75371684 7.990088427 12.46214168 13.26610631 7.990088427 12.46214168 13.26610631
22.48761053 1.75371684 7.990088427 12.46214168 13.26610631 7.990088427 12.46214168 19.73389369
17.51238947 3.24628316 7.990088427 12.46214168 13.26610631 7.990088427 17.33785832 13.26610631
22.48761053 3.24628316 7.990088427 12.46214168 13.26610631 7.990088427 17.33785832 19.73389369
17.51238947 1.75371684 4.009911573 17.33785832 13.26610631 7.990088427 12.46214168 19.73389369
22.48761053 1.75371684 4.009911573 17.33785832 13.26610631 7.990088427 12.46214168 13.26610631
17.51238947 3.24628316 4.009911573 17.33785832 13.26610631 7.990088427 17.33785832 19.73389369
22.48761053 3.24628316 4.009911573 17.33785832 13.26610631 7.990088427 17.33785832 13.26610631
17.51238947 1.75371684 7.990088427 17.33785832 13.26610631 7.990088427 17.33785832 19.73389369
22.48761053 1.75371684 7.990088427 17.33785832 13.26610631 7.990088427 17.33785832 13.26610631
17.51238947 3.24628316 7.990088427 17.33785832 13.26610631 7.990088427 12.46214168 19.73389369
22.48761053 3.24628316 7.990088427 17.33785832 13.26610631 7.990088427 12.46214168 13.26610631
17.51238947 1.75371684 4.009911573 12.46214168 19.73389369 7.990088427 12.46214168 19.73389369
22.48761053 1.75371684 4.009911573 12.46214168 19.73389369 7.990088427 12.46214168 13.26610631
17.51238947 3.24628316 4.009911573 12.46214168 19.73389369 7.990088427 17.33785832 19.73389369
22.48761053 3.24628316 4.009911573 12.46214168 19.73389369 7.990088427 17.33785832 13.26610631
17.51238947 1.75371684 7.990088427 12.46214168 19.73389369 7.990088427 17.33785832 19.73389369
22.48761053 1.75371684 7.990088427 12.46214168 19.73389369 7.990088427 17.33785832 13.26610631
17.51238947 3.24628316 7.990088427 12.46214168 19.73389369 7.990088427 12.46214168 19.73389369
22.48761053 3.24628316 7.990088427 12.46214168 19.73389369 7.990088427 12.46214168 13.26610631
17.51238947 1.75371684 4.009911573 17.33785832 19.73389369 7.990088427 17.33785832 13.26610631
22.48761053 1.75371684 4.009911573 17.33785832 19.73389369 7.990088427 17.33785832 19.73389369
17.51238947 3.24628316 4.009911573 17.33785832 19.73389369 7.990088427 12.46214168 13.26610631
22.48761053 3.24628316 4.009911573 17.33785832 19.73389369 7.990088427 12.46214168 19.73389369
17.51238947 1.75371684 7.990088427 17.33785832 19.73389369 7.990088427 12.46214168 13.26610631
22.48761053 1.75371684 7.990088427 17.33785832 19.73389369 7.990088427 12.46214168 19.73389369
17.51238947 3.24628316 7.990088427 17.33785832 19.73389369 7.990088427 17.33785832 13.26610631
22.48761053 3.24628316 7.990088427 17.33785832 19.73389369 7.990088427 17.33785832 19.73389369
]'; % 81 simulation runs were performed, hence the transpose: in the network
% model the input P is an 8x81 matrix (one column per design point).
% The corresponding target matrix T below is 3x81.
T=[150.749 2.28499 13.466
165.148 2.64021 9.6525
138.061 1.92976 17.2795
149.446 2.25704 13.766
151.642 2.31293 13.166
147.146 2.22947 14.062
154.131 2.3405 12.87
144.164 2.2576 13.76
155.889 2.31237 13.172
150.646 2.28499 13.466
150.621 2.28499 13.466
147.091 2.22947 14.062
154.166 2.3405 12.87
144.289 2.2576 13.76
155.553 2.31237 13.172
150.653 2.28499 13.466
150.704 2.28499 13.466
148.424 2.37609 12.4879
134.952 2.01917 16.3197
154.264 2.41865 12.0311
141.207 2.06864 15.7885
156.492 2.44051 11.7964
142.671 2.08358 15.6282
152.473 2.44664 11.7306
138.329 2.09663 15.488
159.696 2.41252 12.0969
145.947 2.05559 15.9287
155.401 2.41865 12.0311
141.73 2.06864 15.7885
157.408 2.45858 11.6024
144.1 2.10166 15.4341
163.483 2.50114 11.1455
150.483 2.15114 14.9029
154.111 2.3943 12.2924
140.418 2.03738 16.1242
149.253 2.40044 12.2266
135.997 2.05043 15.984
151.518 2.4223 11.9919
137.257 2.06537 15.8237
158.05 2.46485 11.535
143.739 2.11485 15.2925
153.641 2.3943 12.2924
140.723 2.03738 16.1242
158.956 2.43686 11.8355
146.933 2.08685 15.593
160.731 2.4768 11.4068
149.315 2.11987 15.2386
156.842 2.48293 11.341
145.17 2.13292 15.0984
156.942 2.45858 11.6024
143.948 2.10166 15.4341
152.503 2.44664 11.7306
138.486 2.09663 15.488
154.84 2.4685 11.4959
139.795 2.11157 15.3276
161.574 2.52914 10.845
147.502 2.17913 14.6024
156.975 2.44051 11.7964
143.06 2.08358 15.6282
162.688 2.50114 11.1455
150.483 2.15114 14.9029
164.588 2.54108 10.7168
153.024 2.18415 14.5485
160.908 2.52914 10.845
147.794 2.17913 14.6024
151.437 2.4223 11.9919
137.386 2.06537 15.8237
156.979 2.48293 11.341
144.915 2.13292 15.0984
159.167 2.50479 11.1063
146.229 2.14786 14.9381
155.699 2.49285 11.2345
140.767 2.14284 14.992
161.782 2.4768 11.4068
149.124 2.11987 15.2386
157.819 2.46485 11.535
143.8 2.11485 15.2925
159.553 2.50479 11.1063
146.186 2.14786 14.9381
166.512 2.56542 10.4554
153.896 2.21542 14.2129
]'; % T is the target matrix (mass, temperature difference, area; 3x81 after transpose)
[PP,ps]=mapminmax(P,-1,1); % normalize P into PP, scaled to the range (-1,1)
% normalize T into TT, also scaled to (-1,1); normalization mainly removes the
% effect of the outputs' different units and magnitudes on training
[TT,ps]=mapminmax(T,-1,1); % note: ps now holds T's settings, needed later to un-normalize the network output
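For readers without the Neural Network Toolbox, here is a minimal NumPy sketch of what `mapminmax` does (the function names are mine): each row is scaled linearly so its minimum maps to -1 and its maximum to +1, and the stored settings let the scaling be undone later.

```python
import numpy as np

def mapminmax_apply(X, ymin=-1.0, ymax=1.0):
    """Scale each row of X linearly so its min maps to ymin and its
    max to ymax, like mapminmax(X,-1,1)."""
    xmin = X.min(axis=1, keepdims=True)
    xmax = X.max(axis=1, keepdims=True)
    Y = (ymax - ymin) * (X - xmin) / (xmax - xmin) + ymin
    ps = {"xmin": xmin, "xmax": xmax, "ymin": ymin, "ymax": ymax}
    return Y, ps

def mapminmax_reverse(Y, ps):
    """Undo the scaling, like mapminmax('reverse', A, ps)."""
    return ((Y - ps["ymin"]) / (ps["ymax"] - ps["ymin"])
            * (ps["xmax"] - ps["xmin"]) + ps["xmin"])
```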
% create a three-layer feedforward network: 15 hidden neurons, 3 output neurons
net=newff(minmax(PP),[15,3],{'tansig','purelin'},'traingdm')
% ---------------------------------------------------------------
% Training function traingdm: gradient-descent backpropagation with
% momentum, which updates the network's weights and biases. Its main
% parameters are:
% epochs:   number of training epochs, default 100
% goal:     error performance goal, default 0
% lr:       learning rate, default 0.01
% max_fail: maximum number of validation failures, default 5
% mc:       momentum constant, default 0.9
% min_grad: minimum performance gradient, default 1e-10
% show:     epochs between progress displays, default 25
% time:     maximum training time in seconds, default inf
% ---------------------------------------------------------------
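The idea behind `traingdm` is the classic momentum update: each weight step blends a fraction of the previous step with the current negative gradient. A schematic NumPy version of one update (MATLAB's exact scaling of the two terms may differ; the function name is mine):

```python
def gdm_step(w, grad, velocity, lr=0.05, mc=0.9):
    """One gradient-descent-with-momentum step: reuse a fraction mc of
    the previous update and add the learning-rate-scaled negative
    gradient; the momentum term smooths the trajectory and damps
    oscillations."""
    velocity = mc * velocity - lr * grad
    return w + velocity, velocity

# Minimizing f(w) = w^2 (gradient 2w) starting from w = 1:
w, v = 1.0, 0.0
for _ in range(100):
    w, v = gdm_step(w, 2.0 * w, v)
```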
inputWeights=net.IW{1,1} % current input-layer weights
inputbias=net.b{1}       % current input-layer biases
% current layer weights and biases of the output layer
layerWeights=net.LW{2,1}
layerbias=net.b{2}
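How these weights and biases produce an output: the 8-15-3 network applies tanh ('tansig') in the hidden layer and a linear ('purelin') output layer. A NumPy sketch of the forward pass, with shapes assumed from the network above (function name mine):

```python
import numpy as np

def forward(P, IW, b1, LW, b2):
    """Forward pass of an 8-15-3 network:
    hidden = tansig(IW*P + b1), output = purelin(LW*hidden + b2),
    where tansig is tanh and purelin is the identity."""
    hidden = np.tanh(IW @ P + b1)   # 15 x N
    return LW @ hidden + b2         # 3 x N

# Shapes assumed to match net.IW{1,1} (15x8), net.b{1} (15x1),
# net.LW{2,1} (3x15), net.b{2} (3x1):
rng = np.random.default_rng(0)
A = forward(rng.standard_normal((8, 81)),
            rng.standard_normal((15, 8)), rng.standard_normal((15, 1)),
            rng.standard_normal((3, 15)), rng.standard_normal((3, 1)))
print(A.shape)  # (3, 81)
```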
% set the training parameters
net.trainParam.show = 2;
net.trainParam.lr = 0.05;
net.trainParam.mc = 0.9;
net.trainParam.epochs = 10000;
net.trainParam.goal = 1e-3;
% train the BP network with the TRAINGDM algorithm (chosen when net was created)
[net,tr]=train(net,PP,TT);
A = sim(net,PP); % simulate the trained BP network on the training inputs
A = mapminmax('reverse',A,ps); % un-normalize A back to the original scale of T
% compute the simulation error
E = T - A
MSE = mse(E)
echo off
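The error measure at the end is simple to state explicitly; a NumPy equivalent of `E = T - A` followed by `mse(E)`, i.e. the mean of the squared elements:

```python
import numpy as np

def mse(E):
    """Mean squared error: the mean of the squared elements of E,
    matching MATLAB's mse() applied to an error matrix."""
    E = np.asarray(E, dtype=float)
    return float(np.mean(E ** 2))

print(mse([[1.0, -1.0], [2.0, 0.0]]))  # 1.5
```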
Running the script produces the results shown in the figures. If the outputs matched the targets exactly we would have R = 1; here R is already very close to 1, so the fit is good. The right-hand figure shows the sum-squared error over the course of training; training stops once it reaches the specified goal of 0.001.
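The R value shown in MATLAB's regression plot is the correlation coefficient between targets and network outputs; a small helper (name mine) to compute it:

```python
import numpy as np

def regression_r(targets, outputs):
    """Pearson correlation R between flattened targets and network
    outputs; R = 1 would mean the outputs match the targets exactly."""
    t = np.asarray(targets, dtype=float).ravel()
    y = np.asarray(outputs, dtype=float).ravel()
    return float(np.corrcoef(t, y)[0, 1])

print(regression_r([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 1.0
```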
總結