MATLAB Convolutional Neural Networks for Image Denoising: Repairing Adversarial Examples
總流程類似于幫助文檔幫助文檔圖像回歸 神經(jīng)網(wǎng)絡(luò)結(jié)構(gòu)類似于Unet,由卷積、激活、池化、轉(zhuǎn)置卷積、深度連接等layers組成 先準(zhǔn)備好數(shù)據(jù)集,我上傳在資源里,可下載https://download.csdn.net/download/qq_51750957/14115738 1.matlab自帶的0-9手寫(xiě)數(shù)字,共10000張,為[28,28]單通道圖像
2.對(duì)應(yīng)的對(duì)抗樣本(加了干擾的圖片)10000張(如何生成對(duì)抗樣本可參照另一篇)
導(dǎo)入數(shù)據(jù)
imds = imageDatastore('DigitDataset', 'IncludeSubfolders', true, 'LabelSource', 'foldernames'); % read the original images
[imdsTrain, imdsValidation] = splitEachLabel(imds, 0.9); % split off a validation set
advs = imageDatastore('digit_adv0.15', 'IncludeSubfolders', true, 'LabelSource', 'foldernames'); % read the adversarial examples
[advsTrain, advsValidation] = splitEachLabel(advs, 0.9);
Check that the first ten image pairs correspond
for i = 1:10
    figure(i);
    subplot(1, 2, 1); imshow(imread(imds.Files{i})); % original image
    subplot(1, 2, 2); imshow(imread(advs.Files{i})); % adversarial counterpart
    list1{i} = imds.Files{i};
    list2{i} = advs.Files{i};
end
組合數(shù)據(jù),形成輸入輸出
dsTrain = combine(advsTrain, imdsTrain); % two-column data: column 1 adversarial examples (input), column 2 original images (output)
dsVal = combine(advsValidation, imdsValidation); % same as above
dsTrain = transform(dsTrain, @commonPreprocessing); % normalize to [0,1], convert to grayscale, resize to [32,32]; commonPreprocessing is defined at the end
dsVal = transform(dsVal, @commonPreprocessing); % same as above
dsTrain = shuffle(dsTrain); % shuffle
dataOut = readall(dsTrain);
% display the first 8 pairs
inputs = dataOut(:, 1);
responses = dataOut(:, 2);
minibatch = cat(2, inputs, responses);
montage(minibatch', 'Size', [8 2])
title('Inputs (Left) and Responses (Right)')
設(shè)置訓(xùn)練參數(shù)
options = trainingOptions('adam', ...
    'MaxEpochs', 30, ...
    'MiniBatchSize', 32, ...
    'ValidationData', dsVal, ...
    'ValidationFrequency', 300, ...
    'ValidationPatience', 100, ...
    'Shuffle', 'never', ...
    'Plots', 'training-progress', ...
    'LearnRateSchedule', 'piecewise', ...
    'LearnRateDropPeriod', 5, ...
    'LearnRateDropFactor', 0.5, ...
    'Verbose', false);
打開(kāi)深度學(xué)習(xí)app,開(kāi)始搭建網(wǎng)絡(luò),輸入大小[32,32,1],輸出也是[32,32,1],輸入和輸出每個(gè)像素都在【0,1】范圍內(nèi)。網(wǎng)絡(luò)包含轉(zhuǎn)置卷積層、深度連接層,比較復(fù)雜 搭建好后可導(dǎo)出為lgraph 網(wǎng)絡(luò)很長(zhǎng),可以自己修改
創(chuàng)建深度學(xué)習(xí)網(wǎng)絡(luò)架構(gòu)
該腳本可創(chuàng)建具有以下屬性的深度學(xué)習(xí)網(wǎng)絡(luò)
:
層數(shù)
: 42
連接數(shù)
: 44
運(yùn)行腳本以在工作區(qū)變量 lgraph 中創(chuàng)建層。
要了解詳細(xì)信息,請(qǐng)參閱 從 Deep Network Designer 生成 MATLAB 代碼。
由 MATLAB 于
2021 - 01 - 11 20 : 52 : 38 自動(dòng)生成
創(chuàng)建層次圖
創(chuàng)建層次圖變量以包含網(wǎng)絡(luò)層。
lgraph
= layerGraph ( ) ; 添加層分支
將網(wǎng)絡(luò)分支添加到層次圖中。每個(gè)分支均為一個(gè)線性層組。
tempLayers = [
    imageInputLayer([32 32 1], "Name", "imageinput", "Normalization", "rescale-zero-one")
    convolution2dLayer([3 3], 32, "Name", "conv_1_1", "Padding", "same")
    batchNormalizationLayer("Name", "batchnorm_1")
    reluLayer("Name", "relu_1_1")
    convolution2dLayer([3 3], 32, "Name", "conv_1_2", "Padding", "same")
    batchNormalizationLayer("Name", "batchnorm_2")
    reluLayer("Name", "relu_1_2")
    convolution2dLayer([3 3], 32, "Name", "conv_1_3", "Padding", "same")
    batchNormalizationLayer("Name", "batchnorm_3")
    reluLayer("Name", "relu_1_3")];
lgraph = addLayers(lgraph, tempLayers);

tempLayers = [
    maxPooling2dLayer([2 2], "Name", "maxpool_1", "Padding", "same", "Stride", [2 2])
    convolution2dLayer([3 3], 64, "Name", "conv_2", "Padding", "same")
    reluLayer("Name", "relu_2")];
lgraph = addLayers(lgraph, tempLayers);

tempLayers = [
    dropoutLayer(0.15, "Name", "dropout")
    maxPooling2dLayer([2 2], "Name", "maxpool_2", "Padding", "same", "Stride", [2 2])
    convolution2dLayer([3 3], 128, "Name", "conv_3", "Padding", "same")
    reluLayer("Name", "relu_3")];
lgraph = addLayers(lgraph, tempLayers);

tempLayers = [
    maxPooling2dLayer([2 2], "Name", "maxpool_3", "Padding", "same", "Stride", [2 2])
    convolution2dLayer([3 3], 256, "Name", "conv_5", "Padding", "same")
    batchNormalizationLayer("Name", "batchnorm_6")
    reluLayer("Name", "relu_7")
    transposedConv2dLayer([4 4], 128, "Name", "transposed-conv_1", "Cropping", [1 1 1 1], "Stride", [2 2])
    reluLayer("Name", "relu_4")];
lgraph = addLayers(lgraph, tempLayers);

tempLayers = [
    depthConcatenationLayer(2, "Name", "depthcat_3")
    convolution2dLayer([3 3], 128, "Name", "conv_1_4", "Padding", "same")
    batchNormalizationLayer("Name", "batchnorm_4")
    reluLayer("Name", "relu_1_4")
    transposedConv2dLayer([4 4], 64, "Name", "transposed-conv_2", "Cropping", [1 1 1 1], "Stride", [2 2])
    reluLayer("Name", "relu_5")];
lgraph = addLayers(lgraph, tempLayers);

tempLayers = [
    depthConcatenationLayer(2, "Name", "depthcat_1")
    convolution2dLayer([3 3], 64, "Name", "conv_1_5", "Padding", "same")
    batchNormalizationLayer("Name", "batchnorm_5")
    reluLayer("Name", "relu_1_5")
    transposedConv2dLayer([4 4], 32, "Name", "transposed-conv_3", "Cropping", [1 1 1 1], "Stride", [2 2])
    reluLayer("Name", "relu_6")];
lgraph = addLayers(lgraph, tempLayers);

tempLayers = [
    depthConcatenationLayer(2, "Name", "depthcat_2")
    convolution2dLayer([3 3], 32, "Name", "conv_6", "Padding", "same")
    batchNormalizationLayer("Name", "batchnorm_7")
    reluLayer("Name", "relu_8")
    convolution2dLayer([1 1], 1, "Name", "conv_4", "Padding", "same")
    clippedReluLayer(1, "Name", "clippedrelu")
    regressionLayer("Name", "regressionoutput")];
lgraph = addLayers(lgraph, tempLayers);

% clean up the helper variable
clear tempLayers;

Connect layer branches
Connect all the branches of the network to create the network graph.
lgraph = connectLayers(lgraph, "relu_1_3", "maxpool_1");
lgraph = connectLayers(lgraph, "relu_1_3", "depthcat_2/in2");
lgraph = connectLayers(lgraph, "relu_2", "dropout");
lgraph = connectLayers(lgraph, "relu_2", "depthcat_1/in1");
lgraph = connectLayers(lgraph, "relu_3", "maxpool_3");
lgraph = connectLayers(lgraph, "relu_3", "depthcat_3/in1");
lgraph = connectLayers(lgraph, "relu_4", "depthcat_3/in2");
lgraph = connectLayers(lgraph, "relu_5", "depthcat_1/in2");
lgraph = connectLayers(lgraph, "relu_6", "depthcat_2/in1");

Plot the layers

plot(lgraph);
開(kāi)始訓(xùn)練,訓(xùn)練耗時(shí)很長(zhǎng),可以適當(dāng)更改maxEpoch
denoise_net = trainNetwork(dsTrain, lgraph, options);
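Because training is slow, it is worth persisting the result; a minimal sketch (variable and file names follow the code above, the .mat filename is my own choice):

```matlab
% Save the trained network so prediction can be run later without retraining
save('denoise_net.mat', 'denoise_net');

% ...later, in a fresh session:
loaded = load('denoise_net.mat');
denoise_net = loaded.denoise_net;
```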
訓(xùn)練完成,嘗試預(yù)測(cè)5張圖片,左邊是輸入,右邊是輸出
for ii = 0:5
    figure(ii + 1)
    I = imread(['digit_adv0.15/', num2str(ii), '/0000.png']); % read the image
    I = imresize(I, [32, 32]);
    % I = rgb2gray(I); % if the image is RGB, convert to grayscale first
    I = single(I);
    I = rescale(I);
    subplot(1, 2, 1); imshow(I);
    ypred = predict(denoise_net, I); % predict
    subplot(1, 2, 2); imshow(rescale(ypred));
end
The simple adversarial examples are successfully repaired.
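Beyond visual inspection, repair quality can be measured numerically; a sketch using the Image Processing Toolbox psnr and ssim functions on the validation pairs prepared above (not part of the original script):

```matlab
% Average PSNR/SSIM of repaired images against the clean originals
valData = readall(dsVal);           % column 1: adversarial, column 2: clean
n = size(valData, 1);
p = zeros(n, 1); s = zeros(n, 1);
for k = 1:n
    yk  = predict(denoise_net, single(valData{k, 1})); % repaired image
    ref = single(valData{k, 2});                       % clean reference
    p(k) = psnr(yk, ref);
    s(k) = ssim(yk, ref);
end
fprintf('mean PSNR: %.2f dB, mean SSIM: %.3f\n', mean(p), mean(s));
```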
The helper function
function dataOut = commonPreprocessing(data)
% Resize every image to [32,32], convert RGB images to grayscale,
% and normalize pixel values to [0,1].
dataOut = cell(size(data));
for col = 1:size(data, 2)
    for idx = 1:size(data, 1)
        temp = data{idx, col};
        temp = imresize(temp, [32, 32]);
        if ndims(temp) == 3
            temp = rgb2gray(temp);
        end
        temp = rescale(temp); % regression targets must lie in [0,1] to match the clipped ReLU output
        dataOut{idx, col} = temp;
    end
end
end