CV: Training an emotion classification model (.hdf5) with the mini_XCEPTION CNN architecture in Keras and saving it to a specified folder
Contents

Illustrated process

Core code
Illustrated process

The overall workflow: load the fer2013 facial-expression dataset, normalize the 64x64 grayscale faces, augment them on the fly with ImageDataGenerator, build and compile the mini_XCEPTION network, and train it with callbacks (CSV logging, early stopping, learning-rate reduction, checkpointing) that save the best model as an .hdf5 file under ../trained_models/emotion_models/.
Core code
from keras import layers
from keras.layers import Activation, Conv2D, SeparableConv2D, BatchNormalization
from keras.layers import Input, MaxPooling2D, GlobalAveragePooling2D
from keras.models import Model
from keras.regularizers import l2


def mini_XCEPTION(input_shape, num_classes, l2_regularization=0.01):
    regularization = l2(l2_regularization)

    # base: two plain 3x3 convolutions
    img_input = Input(input_shape)
    x = Conv2D(8, (3, 3), strides=(1, 1), kernel_regularizer=regularization,
               use_bias=False)(img_input)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = Conv2D(8, (3, 3), strides=(1, 1), kernel_regularizer=regularization,
               use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)

    # module 1: depthwise-separable convolutions with a 1x1 residual shortcut
    residual = Conv2D(16, (1, 1), strides=(2, 2),
                      padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)

    x = SeparableConv2D(16, (3, 3), padding='same',
                        kernel_regularizer=regularization,
                        use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(16, (3, 3), padding='same',
                        kernel_regularizer=regularization,
                        use_bias=False)(x)
    x = BatchNormalization()(x)

    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # module 2
    residual = Conv2D(32, (1, 1), strides=(2, 2),
                      padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)

    x = SeparableConv2D(32, (3, 3), padding='same',
                        kernel_regularizer=regularization,
                        use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(32, (3, 3), padding='same',
                        kernel_regularizer=regularization,
                        use_bias=False)(x)
    x = BatchNormalization()(x)

    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # module 3
    residual = Conv2D(64, (1, 1), strides=(2, 2),
                      padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)

    x = SeparableConv2D(64, (3, 3), padding='same',
                        kernel_regularizer=regularization,
                        use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(64, (3, 3), padding='same',
                        kernel_regularizer=regularization,
                        use_bias=False)(x)
    x = BatchNormalization()(x)

    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # module 4
    residual = Conv2D(128, (1, 1), strides=(2, 2),
                      padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)

    x = SeparableConv2D(128, (3, 3), padding='same',
                        kernel_regularizer=regularization,
                        use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = SeparableConv2D(128, (3, 3), padding='same',
                        kernel_regularizer=regularization,
                        use_bias=False)(x)
    x = BatchNormalization()(x)

    x = MaxPooling2D((3, 3), strides=(2, 2), padding='same')(x)
    x = layers.add([x, residual])

    # classification head: num_classes feature maps, global average pooling, softmax
    x = Conv2D(num_classes, (3, 3),
               # kernel_regularizer=regularization,
               padding='same')(x)
    x = GlobalAveragePooling2D()(x)
    output = Activation('softmax', name='predictions')(x)

    model = Model(img_input, output)
    return model
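As a quick sanity check, the function can be called on its own. This is a minimal sketch, assuming the imports above and the same 64x64 grayscale input and 7 emotion classes used by the training script below:

# build the model standalone and confirm it ends in a 7-way softmax
model = mini_XCEPTION(input_shape=(64, 64, 1), num_classes=7)
model.summary()                # layer-by-layer shapes and parameter counts
print(model.output_shape)      # expected: (None, 7)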
# CV: train the emotion classification model (.hdf5) with the mini_XCEPTION architecture
# and save it to the specified folder
from keras.callbacks import CSVLogger, ModelCheckpoint, EarlyStopping
from keras.callbacks import ReduceLROnPlateau
from keras.preprocessing.image import ImageDataGenerator

from models.cnn import mini_XCEPTION
# project-specific helpers (module paths follow the face_classification-style layout
# used in this article; adjust them to your own repository if they differ)
from utils.datasets import DataManager, split_data
from utils.preprocessor import preprocess_input

# parameters -- 1. define the hyperparameters: batch size, number of epochs, input shape,
# validation split, verbosity, number of classes, patience, and the save path
batch_size = 32            # samples per batch; one gradient-descent step is taken per batch
num_epochs = 10000         # upper bound on training epochs (early stopping usually ends training sooner)
input_shape = (64, 64, 1)
validation_split = .2      # fraction of the training data held out for validation; evaluated after each epoch
verbose = 1                # 0 = silent, 1 = progress bar, 2 = one line per epoch
num_classes = 7
patience = 50              # stop training when the monitored metric has not improved for this many epochs
base_path = '../trained_models/emotion_models/'

# data generator: ImageDataGenerator performs real-time data augmentation on mini-batches of images
data_generator = ImageDataGenerator(
    featurewise_center=False,
    featurewise_std_normalization=False,
    rotation_range=10,
    width_shift_range=0.1,
    height_shift_range=0.1,
    zoom_range=.1,
    horizontal_flip=True)

# model parameters / compilation -- 2. build the mini_XCEPTION model, compile it, and print its summary
model = mini_XCEPTION(input_shape, num_classes)   # build the network from the input shape and class count
model.compile(optimizer='adam',
              loss='categorical_crossentropy',    # configure optimizer, loss and metrics for training
              metrics=['accuracy'])
model.summary()                                   # prints a string summary of the network

# 3. datasets to train on (fer2013: the facial-expression dataset)
datasets = ['fer2013']

# 4. loop over the datasets: set up callbacks, load the data, and train
for dataset_name in datasets:
    print('Training dataset:', dataset_name)

    # callbacks: CSVLogger, EarlyStopping, ReduceLROnPlateau and ModelCheckpoint collected in a list
    log_file_path = base_path + dataset_name + '_emotion_training.log'
    csv_logger = CSVLogger(log_file_path, append=False)        # stream epoch results to a csv file
    early_stop = EarlyStopping('val_loss', patience=patience)  # stop when val_loss stops improving
    reduce_lr = ReduceLROnPlateau('val_loss', factor=0.1,      # reduce the learning rate on a plateau
                                  patience=int(patience/4), verbose=1)
    trained_models_path = base_path + dataset_name + '_mini_XCEPTION'
    model_names = trained_models_path + '.{epoch:02d}-{val_acc:.2f}.hdf5'
    model_checkpoint = ModelCheckpoint(model_names, 'val_loss', verbose=1,  # save the best model seen so far
                                       save_best_only=True)
    callbacks = [model_checkpoint, csv_logger, early_stop, reduce_lr]

    # loading dataset: DataManager loads the dataset by name, get_data returns its ground-truth data
    data_loader = DataManager(dataset_name, image_size=input_shape[:2])
    faces, emotions = data_loader.get_data()
    faces = preprocess_input(faces)               # cast to float32 and scale by 1/255.0
    num_samples, num_classes = emotions.shape     # read the label-matrix shape
    train_data, val_data = split_data(faces, emotions, validation_split)  # split into train/validation sets
    train_faces, train_emotions = train_data

    # training: fit_generator fits the model on batches produced by the data generator
    model.fit_generator(data_generator.flow(train_faces, train_emotions,  # flow() returns a Numpy array iterator
                                            batch_size),
                        steps_per_epoch=len(train_faces) / batch_size,
                        epochs=num_epochs, verbose=1, callbacks=callbacks,
                        validation_data=val_data)
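Once training has produced a checkpoint, it can be reloaded for inference. This is a hedged sketch: the file name below is a hypothetical example of the '{epoch}-{val_acc}' pattern written by ModelCheckpoint, and the 7-class label order is the usual fer2013 convention (an assumption, not taken from this article).

import numpy as np
from keras.models import load_model

# hypothetical checkpoint name following the ModelCheckpoint pattern above
model_path = '../trained_models/emotion_models/fer2013_mini_XCEPTION.102-0.66.hdf5'
emotion_labels = ['angry', 'disgust', 'fear', 'happy', 'sad', 'surprise', 'neutral']  # assumed fer2013 order

classifier = load_model(model_path, compile=False)

# one 64x64 grayscale face, already scaled to [0, 1] as preprocess_input does
face = np.random.rand(1, 64, 64, 1).astype('float32')
probabilities = classifier.predict(face)[0]
print(emotion_labels[int(np.argmax(probabilities))], float(probabilities.max()))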
Summary

The script above builds the mini_XCEPTION model, trains it on fer2013 with real-time data augmentation, early stopping, and learning-rate reduction, and saves the best-performing checkpoints as .hdf5 files to the specified folder (../trained_models/emotion_models/).