Deep Learning: Flower Recognition with a Convolutional Neural Network
Following the same approach as the classic cats-vs-dogs task, flower recognition is implemented here both with a CNN built from scratch and with the ready-made VGG16.
1. Import the libraries
Note: some of the imported libraries may never be used; it is a habit left over from ACM contests — import everything first and worry about whether it is needed later.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Conv2D, Flatten, Dropout, MaxPooling2D
from tensorflow.keras.preprocessing.image import ImageDataGenerator
import os, PIL
import numpy as np
import matplotlib.pyplot as plt
import pathlib
from PIL import ImageFile
# Allow PIL to load slightly truncated images instead of raising an error
ImageFile.LOAD_TRUNCATED_IMAGES = True

2. Download the data
# Download and extract the dataset
dataset_url = "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
dataset_dir = tf.keras.utils.get_file(fname='flower_photos', origin=dataset_url, untar=True, cache_dir='E:/Deep-Learning/flowers')
dataset_dir = pathlib.Path(dataset_dir)

The downloaded files look like this:
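The original screenshot is not reproduced here; a quick way to see what get_file extracted is to list the class sub-directories. A small sketch, assuming the archive holds one folder of .jpg images per flower class:

# Inspect the extracted dataset: one sub-directory per flower class
for class_dir in sorted(d for d in dataset_dir.iterdir() if d.is_dir()):
    print(class_dir.name, len(list(class_dir.glob("*.jpg"))), "images")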
For easier handling, the dataset was split manually (rather than in code) into a training set and a test set at an 8:2 ratio.
The folder structure after this processing is shown in the figure:
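For reference, the same 8:2 split could also be scripted. This is only a sketch: the split_dataset helper below is not from the original post, and it assumes the raw class folders sit directly under dataset_dir and contain .jpg images.

# Sketch only: a code-based 8:2 train/test split (the original post did this step by hand)
import random, shutil, pathlib

def split_dataset(src_dir, dst_dir, train_ratio=0.8, seed=0):
    """Copy src_dir/<class>/*.jpg into dst_dir/train/<class>/ and dst_dir/test/<class>/."""
    random.seed(seed)
    src_dir, dst_dir = pathlib.Path(src_dir), pathlib.Path(dst_dir)
    for class_dir in (d for d in src_dir.iterdir() if d.is_dir()):
        images = sorted(class_dir.glob("*.jpg"))
        random.shuffle(images)
        n_train = int(len(images) * train_ratio)
        for split, files in (("train", images[:n_train]), ("test", images[n_train:])):
            out = dst_dir / split / class_dir.name
            out.mkdir(parents=True, exist_ok=True)
            for f in files:
                shutil.copy(f, out / f.name)

# e.g. split_dataset(dataset_dir, dataset_dir) creates train/ and test/ under dataset_dir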
3. Count the number of images
# Load the class sub-directory paths under train and test into variables
train_daisy = os.path.join(dataset_dir, "train", "daisy")
train_dandelion = os.path.join(dataset_dir, "train", "dandelion")
train_roses = os.path.join(dataset_dir, "train", "roses")
train_sunflowers = os.path.join(dataset_dir, "train", "sunflowers")
train_tulips = os.path.join(dataset_dir, "train", "tulips")

test_daisy = os.path.join(dataset_dir, "test", "daisy")
test_dandelion = os.path.join(dataset_dir, "test", "dandelion")
test_roses = os.path.join(dataset_dir, "test", "roses")
test_sunflowers = os.path.join(dataset_dir, "test", "sunflowers")
test_tulips = os.path.join(dataset_dir, "test", "tulips")

# Load the training and test directories into variables
train_dir = os.path.join(dataset_dir, "train")
test_dir = os.path.join(dataset_dir, "test")

# Count the number of images in the training and test sets
train_daisy_num = len(os.listdir(train_daisy))
train_dandelion_num = len(os.listdir(train_dandelion))
train_roses_num = len(os.listdir(train_roses))
train_sunflowers_num = len(os.listdir(train_sunflowers))
train_tulips_num = len(os.listdir(train_tulips))
train_all = train_tulips_num + train_daisy_num + train_dandelion_num + train_roses_num + train_sunflowers_num

test_daisy_num = len(os.listdir(test_daisy))
test_dandelion_num = len(os.listdir(test_dandelion))
test_roses_num = len(os.listdir(test_roses))
test_sunflowers_num = len(os.listdir(test_sunflowers))
test_tulips_num = len(os.listdir(test_tulips))
test_all = test_tulips_num + test_daisy_num + test_dandelion_num + test_roses_num + test_sunflowers_num
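A quick print helps sanity-check these counts before training (the exact numbers depend on how the 8:2 split came out):

print("training images:", train_all)
print("test images:", test_all)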
4. Hyperparameter settings

batch_size = 32
epochs = 10
height = 180
width = 180

5. Data preprocessing
# Rescale pixel values to [0, 1]
train_generator = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1.0/255)
test_generator = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1.0/255)
# Set the batch size, directory, shuffling, target image size, and one-hot ("categorical") labels
train_data_gen = train_generator.flow_from_directory(batch_size=batch_size, directory=train_dir, shuffle=True, target_size=(height, width), class_mode="categorical")
test_data_gen = test_generator.flow_from_directory(batch_size=batch_size, directory=test_dir, shuffle=True, target_size=(height, width), class_mode="categorical")
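flow_from_directory infers the labels from the sub-directory names, so it is worth confirming the class mapping and the batch shapes before training; a small check:

# Confirm the class-name -> index mapping and the shape of one batch
print(train_data_gen.class_indices)   # e.g. {'daisy': 0, 'dandelion': 1, ...}
images, labels = next(train_data_gen)
print(images.shape, labels.shape)     # (32, 180, 180, 3) and (32, 5) with the settings above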
6. Model building and training

The model is three convolution + max-pooling blocks, followed by Dropout, Flatten, and two fully connected layers.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu", input_shape=(height, width, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax")
])
# class_mode="categorical" yields one-hot labels and the last layer already applies softmax,
# so categorical cross-entropy without from_logits is the matching loss
model.compile(optimizer="adam", loss=tf.keras.losses.CategoricalCrossentropy(), metrics=["acc"])
history = model.fit_generator(train_data_gen, steps_per_epoch=train_all//batch_size, epochs=epochs,
                              validation_data=test_data_gen, validation_steps=test_all//batch_size)

The results are shown below:
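The curves referenced here can be reproduced from the history object; a minimal matplotlib sketch (with metrics=["acc"] above, the history keys are "acc" and "val_acc"):

# Plot training vs. validation accuracy and loss
acc, val_acc = history.history["acc"], history.history["val_acc"]
loss, val_loss = history.history["loss"], history.history["val_loss"]
epochs_range = range(len(acc))

plt.figure(figsize=(10, 4))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label="train acc")
plt.plot(epochs_range, val_acc, label="val acc")
plt.legend()
plt.title("Accuracy")
plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label="train loss")
plt.plot(epochs_range, val_loss, label="val loss")
plt.legend()
plt.title("Loss")
plt.show()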
Overfitting shows up. With epochs raised to 50, the results look like this:
Test accuracy is roughly 10 percentage points higher than with 10 epochs, but it is still low, the overfitting is still severe, and training takes a very long time.
7. Data augmentation
train_generator = tf.keras.preprocessing.image.ImageDataGenerator(
    rescale=1.0/255,
    rotation_range=45,        # random rotation of up to 45 degrees
    width_shift_range=.15,
    height_shift_range=.15,
    horizontal_flip=True,     # random horizontal flip
    zoom_range=0.5)           # random zoom

The experimental results are shown below:
With 20 epochs, the overfitting is clearly reduced.
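One step the post does not show: after redefining train_generator with augmentation, the training iterator has to be rebuilt and the model retrained. A minimal sketch, mirroring the earlier settings (epochs=20 as in the result above):

# Rebuild the training iterator from the augmented generator and retrain (not shown in the original post)
train_data_gen = train_generator.flow_from_directory(batch_size=batch_size, directory=train_dir,
                                                     shuffle=True, target_size=(height, width),
                                                     class_mode="categorical")
history = model.fit_generator(train_data_gen, steps_per_epoch=train_all//batch_size, epochs=20,
                              validation_data=test_data_gen, validation_steps=test_all//batch_size)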
8. Transfer learning
Train on the same data using the pre-trained VGG16 network.
# Load the VGG16 model pre-trained on ImageNet, without its classification head
conv_base = tf.keras.applications.VGG16(weights='imagenet', include_top=False)
# Freeze the convolutional base
conv_base.trainable = False
# Build the model
model = tf.keras.Sequential()
model.add(conv_base)
model.add(tf.keras.layers.GlobalAveragePooling2D())
model.add(tf.keras.layers.Dense(128, activation='relu'))
model.add(tf.keras.layers.Dense(5, activation='softmax'))  # softmax over the 5 mutually exclusive classes

The training results are shown in the figure below:
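The compile and fit calls for this model are not shown in the original post; presumably they mirror the earlier run, for example:

# Compile and train the transfer-learning model (sketch; settings mirror the earlier run)
model.compile(optimizer="adam", loss=tf.keras.losses.CategoricalCrossentropy(), metrics=["acc"])
history = model.fit_generator(train_data_gen, steps_per_epoch=train_all//batch_size, epochs=10,
                              validation_data=test_data_gen, validation_steps=test_all//batch_size)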
With only 10 epochs, the accuracy already reaches about 80%, a clear improvement.
Keep working hard!