MATLAB and Neural Networks: Study Notes
These are my notes from working through *Neural Networks and Deep Learning*. When I started, my computer had Python 3 installed, while the author's code is written in Python 2. Since I needed to get familiar with the code anyway, and I had used MATLAB heavily in university, I didn't feel like setting up Python 2 or porting everything to Python 3, so I implemented the book's examples and exercises in MATLAB instead. The book is well suited to beginners, especially people like me without a formal CS background: it guides the reader from the shallow to the deep with a heuristic approach, and working through the code yourself is very rewarding. These notes walk through the book's examples and my solutions to the exercises; mistakes are inevitable, and corrections are welcome.
Below is the main code for the first two chapters; everything later builds on it.
function NW(a)
% NW([784,30,10]) builds a network with 784 inputs, 30 hidden units and 10 outputs.
global NetWork;
NetWork.length = length(a);
for i = 2:NetWork.length
    % Gaussian-initialised bias vector and weight matrix for each layer transition
    NetWork.bias{i-1} = randn(a(i),1);
    NetWork.weight{i-1} = randn(a(i),a(i-1));
end
end
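For comparison, the same layer-by-layer Gaussian initialisation can be sketched in NumPy (the book's original language). This is my own translation, not the book's code; `init_network` and the fixed seed are illustrative choices:

```python
import numpy as np

def init_network(sizes):
    """Standard-normal biases and weights, one pair per layer
    transition, mirroring the MATLAB call NW([784, 30, 10])."""
    rng = np.random.default_rng(0)  # seeded only for reproducibility
    biases = [rng.standard_normal((n, 1)) for n in sizes[1:]]
    weights = [rng.standard_normal((n, m))
               for m, n in zip(sizes[:-1], sizes[1:])]
    return biases, weights

biases, weights = init_network([784, 30, 10])
# weights[0] has shape (30, 784); weights[1] has shape (10, 30)
```

As in the MATLAB version, layer `i` owns a weight matrix whose rows index that layer's neurons and whose columns index the previous layer's neurons.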
function update_mini_batch(mini_batch,eta,mini_batch_size)
global NetWork;
% accumulators for the gradients summed over the mini-batch
for i = 1:NetWork.length-1
    nabla_bi{i} = zeros(size(NetWork.bias{i}));
    nabla_wi{i} = zeros(size(NetWork.weight{i}));
end
for i = 1:mini_batch_size
    [delta_nabla_b,delta_nabla_w] = backprop(mini_batch{1,1}(:,i),mini_batch{1,2}(i));
    for j = 1:NetWork.length-1
        nabla_bi{j} = nabla_bi{j} + delta_nabla_b{j};
        nabla_wi{j} = nabla_wi{j} + delta_nabla_w{j};
    end
end
% gradient-descent step with the averaged gradients
for k = 1:NetWork.length-1
    NetWork.weight{k} = NetWork.weight{k} - (eta/mini_batch_size)*nabla_wi{k};
    NetWork.bias{k}   = NetWork.bias{k}   - (eta/mini_batch_size)*nabla_bi{k};
end
end
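The update rule above is just `w -= (eta/m) * sum(nabla_w)` applied per layer. A minimal NumPy sketch of the same step (my own helper, with `grads` as a list of per-sample `(nabla_b, nabla_w)` pairs) looks like this:

```python
import numpy as np

def update_mini_batch(weights, biases, grads, eta):
    """Average the per-sample gradients and take one
    gradient-descent step, as in the MATLAB loop above."""
    m = len(grads)
    for layer in range(len(weights)):
        gw = sum(g[1][layer] for g in grads)  # summed weight gradients
        gb = sum(g[0][layer] for g in grads)  # summed bias gradients
        weights[layer] -= (eta / m) * gw
        biases[layer] -= (eta / m) * gb
    return weights, biases
```

Dividing by `m` makes the step size independent of the mini-batch size, so `eta` keeps the same meaning whether you train with batches of 10 or 100.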
function [nabla_b,nabla_w] = backprop(x,y)
global NetWork;
for i = 1:NetWork.length-1
    nabla_b{i} = zeros(size(NetWork.bias{i}));
    nabla_w{i} = zeros(size(NetWork.weight{i}));
end
% feedforward (pixel values scaled into [0,1))
activation = x./256;
activations{1} = activation;
for i = 1:NetWork.length-1
    z = NetWork.weight{i}*activation + NetWork.bias{i};
    zs{i} = z;
    activation = sigmoid(z);
    activations{i+1} = activation;
end
% backward pass
% output-layer error: delta = (a_L - y) .* sigma'(z_L)
delta = cost_derivative(activations{NetWork.length},y).*sigmoid_prime(zs{NetWork.length-1});
nabla_b{NetWork.length-1} = delta;
nabla_w{NetWork.length-1} = delta*activations{NetWork.length-1}';
% propagate the error backwards through the hidden layers
for i = NetWork.length-2:-1:1
    z = zs{i};
    sp = sigmoid_prime(z);
    delta = (NetWork.weight{i+1}'*delta).*sp;
    nabla_b{i} = delta;
    nabla_w{i} = delta*activations{i}';
end
end
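The same feedforward/backward recurrence can be written in NumPy and verified against a numerical gradient, which is a good sanity check when porting the book's code. This is my own sketch for the quadratic cost; `weights`, `biases` and `y_onehot` are assumed to be plain lists/arrays rather than the global struct used above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(weights, biases, x, y_onehot):
    """Quadratic-cost backprop for a fully-connected sigmoid net,
    following the same recurrence as the MATLAB code above."""
    activation = x
    activations = [x]          # layer-by-layer activations
    zs = []                    # weighted inputs
    for w, b in zip(weights, biases):
        z = w @ activation + b
        zs.append(z)
        activation = sigmoid(z)
        activations.append(activation)
    # output-layer error: delta = (a_L - y) * sigma'(z_L)
    sp = sigmoid(zs[-1]) * (1 - sigmoid(zs[-1]))
    delta = (activations[-1] - y_onehot) * sp
    nabla_b = [delta]
    nabla_w = [delta @ activations[-2].T]
    # propagate the error backwards through the hidden layers
    for l in range(2, len(weights) + 1):
        sp = sigmoid(zs[-l]) * (1 - sigmoid(zs[-l]))
        delta = (weights[-l + 1].T @ delta) * sp
        nabla_b.insert(0, delta)
        nabla_w.insert(0, delta @ activations[-l - 1].T)
    return nabla_b, nabla_w
```

Perturbing one weight by a small epsilon and comparing the finite-difference change in the cost `0.5*||a_L - y||^2` against the corresponding entry of `nabla_w` should agree to several decimal places.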
function c = cost_derivative(output_activations,y)
% y is the digit label 0..9; convert it to a one-hot column vector
y1 = zeros(10,1);
y1(y+1) = 1;
c = output_activations - y1;
end
function s = sigmoid(z)
s = 1./(1+exp(-z));
end
function sp = sigmoid_prime(z)
% derivative of the sigmoid: sigma'(z) = sigma(z).*(1-sigma(z))
sp = sigmoid(z).*(1-sigmoid(z));
end