Python Library Notes: timm (1)
1 Introduction to timm
`timm` is a deep learning library: a collection of SOTA computer vision models, layers, utilities, optimizers, schedulers, data loaders, and augmentations, together with training/validation scripts capable of reproducing ImageNet training results.
2 Basic usage
2.1 Loading models
2.1.1 Loading a pretrained model
Using mobilenetv3 as an example:
```python
import timm

m = timm.create_model('mobilenetv3_large_100', pretrained=True)
print(m)
'''
MobileNetV3(
  (conv_stem): Conv2d(3, 16, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
  ...
the rest of the model is omitted for brevity
'''
```
2.1.2 Loading a model (without pretrained weights)
```python
import timm
import torch
from torchsummary import summary

model = timm.create_model('resnet18')
summary(model, (3, 224, 224))
'''
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 64, 112, 112]           9,408
       BatchNorm2d-2         [-1, 64, 112, 112]             128
              ReLU-3         [-1, 64, 112, 112]               0
         MaxPool2d-4           [-1, 64, 56, 56]               0
            Conv2d-5           [-1, 64, 56, 56]          36,864
       BatchNorm2d-6           [-1, 64, 56, 56]             128
              ReLU-7           [-1, 64, 56, 56]               0
            Conv2d-8           [-1, 64, 56, 56]          36,864
       BatchNorm2d-9           [-1, 64, 56, 56]             128
             ReLU-10           [-1, 64, 56, 56]               0
       BasicBlock-11           [-1, 64, 56, 56]               0
           Conv2d-12           [-1, 64, 56, 56]          36,864
      BatchNorm2d-13           [-1, 64, 56, 56]             128
             ReLU-14           [-1, 64, 56, 56]               0
           Conv2d-15           [-1, 64, 56, 56]          36,864
      BatchNorm2d-16           [-1, 64, 56, 56]             128
             ReLU-17           [-1, 64, 56, 56]               0
       BasicBlock-18           [-1, 64, 56, 56]               0
           Conv2d-19          [-1, 128, 28, 28]          73,728
      BatchNorm2d-20          [-1, 128, 28, 28]             256
             ReLU-21          [-1, 128, 28, 28]               0
           Conv2d-22          [-1, 128, 28, 28]         147,456
      BatchNorm2d-23          [-1, 128, 28, 28]             256
           Conv2d-24          [-1, 128, 28, 28]           8,192
      BatchNorm2d-25          [-1, 128, 28, 28]             256
             ReLU-26          [-1, 128, 28, 28]               0
       BasicBlock-27          [-1, 128, 28, 28]               0
           Conv2d-28          [-1, 128, 28, 28]         147,456
      BatchNorm2d-29          [-1, 128, 28, 28]             256
             ReLU-30          [-1, 128, 28, 28]               0
           Conv2d-31          [-1, 128, 28, 28]         147,456
      BatchNorm2d-32          [-1, 128, 28, 28]             256
             ReLU-33          [-1, 128, 28, 28]               0
       BasicBlock-34          [-1, 128, 28, 28]               0
           Conv2d-35          [-1, 256, 14, 14]         294,912
      BatchNorm2d-36          [-1, 256, 14, 14]             512
             ReLU-37          [-1, 256, 14, 14]               0
           Conv2d-38          [-1, 256, 14, 14]         589,824
      BatchNorm2d-39          [-1, 256, 14, 14]             512
           Conv2d-40          [-1, 256, 14, 14]          32,768
      BatchNorm2d-41          [-1, 256, 14, 14]             512
             ReLU-42          [-1, 256, 14, 14]               0
       BasicBlock-43          [-1, 256, 14, 14]               0
           Conv2d-44          [-1, 256, 14, 14]         589,824
      BatchNorm2d-45          [-1, 256, 14, 14]             512
             ReLU-46          [-1, 256, 14, 14]               0
           Conv2d-47          [-1, 256, 14, 14]         589,824
      BatchNorm2d-48          [-1, 256, 14, 14]             512
             ReLU-49          [-1, 256, 14, 14]               0
       BasicBlock-50          [-1, 256, 14, 14]               0
           Conv2d-51            [-1, 512, 7, 7]       1,179,648
      BatchNorm2d-52            [-1, 512, 7, 7]           1,024
             ReLU-53            [-1, 512, 7, 7]               0
           Conv2d-54            [-1, 512, 7, 7]       2,359,296
      BatchNorm2d-55            [-1, 512, 7, 7]           1,024
           Conv2d-56            [-1, 512, 7, 7]         131,072
      BatchNorm2d-57            [-1, 512, 7, 7]           1,024
             ReLU-58            [-1, 512, 7, 7]               0
       BasicBlock-59            [-1, 512, 7, 7]               0
           Conv2d-60            [-1, 512, 7, 7]       2,359,296
      BatchNorm2d-61            [-1, 512, 7, 7]           1,024
             ReLU-62            [-1, 512, 7, 7]               0
           Conv2d-63            [-1, 512, 7, 7]       2,359,296
      BatchNorm2d-64            [-1, 512, 7, 7]           1,024
             ReLU-65            [-1, 512, 7, 7]               0
       BasicBlock-66            [-1, 512, 7, 7]               0
AdaptiveAvgPool2d-67                  [-1, 512]               0
          Flatten-68                  [-1, 512]               0
SelectAdaptivePool2d-69               [-1, 512]               0
           Linear-70                 [-1, 1000]         513,000
================================================================
Total params: 11,689,512
Trainable params: 11,689,512
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 62.80
Params size (MB): 44.59
Estimated Total Size (MB): 107.97
----------------------------------------------------------------
'''
```
2.1.3 Creating a model with a custom number of classes
```python
import timm
import torch
from torchsummary import summary

model = timm.create_model('resnet18', num_classes=10)
summary(model, (3, 224, 224))
'''
The last few lines are:
SelectAdaptivePool2d-69               [-1, 512]               0
           Linear-70                  [-1, 10]           5,130
================================================================
'''
```
2.2 Listing models
2.2.1 Listing all pretrained models
Most models come with pretrained weights. The weights were:
- obtained from their original sources
- ported by the author from other frameworks (e.g. TensorFlow models)
- trained from scratch using the included training scripts
2.2.2 Listing all ResNet-family models
```python
import timm
from pprint import pprint

model_names = timm.list_models('*resne*t*')
pprint(len(model_names))  # 1803
```
3 Accuracy of pretrained models
Model Summaries - Pytorch Image Models
The table below lists ImageNet-1k validation results for the subset of model weights that the author trained himself.
4 Training with timm
Hyperparameters for training the different models:
Training Scripts | timmdocs
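For orientation, training runs through the `train.py` script at the root of the timm repository. A rough sketch of an invocation (the dataset path and hyperparameter values below are placeholders, not a recommended recipe; see the linked docs for the actual per-model settings):

```shell
# Placeholder values -- consult the hyperparameter tables for real recipes
python train.py /path/to/imagenet \
    --model resnet50 \
    --epochs 100 \
    --batch-size 128 \
    --lr 0.1 \
    --sched cosine \
    --amp
```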