tensorflow-yolov4 Implementation Guide
tensorflow-yolov4-tflite
YOLOv4: Optimal Speed and Accuracy of Object Detection
Paper: https://arxiv.org/abs/2004.10934
Code: https://github.com/AlexeyAB/darknet
Abstract
A huge number of features are claimed to improve Convolutional Neural Network (CNN) accuracy. Practical testing of combinations of such features on large datasets, together with theoretical justification of the results, is required. Some features work only on certain models, for certain problems, or only on small-scale datasets, while others, such as batch normalization and residual connections, apply to the majority of models, tasks, and datasets. We assume that such universal features include Weighted Residual Connections (WRC), Cross-Stage Partial connections (CSP), Cross mini-Batch Normalization (CmBN), Self-Adversarial Training (SAT), and Mish activation. We use the new features WRC, CSP, CmBN, SAT, Mish activation, Mosaic data augmentation, DropBlock regularization, and CIoU loss, and combine some of them to achieve state-of-the-art results: 43.5% AP (65.7% AP50) on the MS COCO dataset at a real-time speed of about 65 FPS on a Tesla V100.
YOLOv4 Implemented in TensorFlow 2.0. Convert YOLOv4, YOLOv3, and YOLO-tiny .weights to .pb, .tflite, and trt format for TensorFlow, TensorFlow Lite, and TensorRT.
Download yolov4.weights file: https://drive.google.com/open?id=1cewMfusmPjYWbrnuJRuKhPMwRe_b9PaT
Prerequisites
Tensorflow 2.1.0
tensorflow_addons 0.9.1 (required for mish activation)
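Both packages can be installed with pip. A quick sanity check that the environment is ready (a minimal sketch assuming the standard package names; nothing here is specific to this repo):

import tensorflow as tf
import tensorflow_addons as tfa

# Verify the versions the repo expects and that the Mish activation is available.
print("tensorflow:", tf.__version__)          # expect 2.1.0
print("tensorflow_addons:", tfa.__version__)  # expect 0.9.1
print(tfa.activations.mish(tf.constant([-1.0, 0.0, 1.0])))  # mish(x) = x * tanh(softplus(x))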
Demo
# yolov4
python detect.py --weights ./data/yolov4.weights --framework tf --size 608 --image ./data/kite.jpg
# yolov4 tflite
python detect.py --weights ./data/yolov4-int8.tflite --framework tflite --size 416 --image ./data/kite.jpg
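For reference, the tflite demo essentially feeds the converted model to the TensorFlow Lite interpreter. A minimal sketch of that inference step (not the repo's detect.py; the model path, input size, and dummy input are assumptions, and box decoding/NMS is omitted):

import numpy as np
import tensorflow as tf

# Load the converted model and allocate tensors (path assumed from the demo above).
interpreter = tf.lite.Interpreter(model_path="./data/yolov4-int8.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy 416x416 image batch in place of a real preprocessed frame;
# the dtype is taken from the model so float and quantized inputs both work.
image = np.random.random_sample((1, 416, 416, 3)).astype(input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], image)
interpreter.invoke()

predictions = [interpreter.get_tensor(o["index"]) for o in output_details]
print([p.shape for p in predictions])  # raw YOLO outputs, still need decoding + NMS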
Convert to tflite
# yolov4
python convert_tflite.py --weights ./data/yolov4.weights --output ./data/yolov4.tflite

# yolov4 quantize float16
python convert_tflite.py --weights ./data/yolov4.weights --output ./data/yolov4-fp16.tflite --quantize_mode float16

# yolov4 quantize int8
python convert_tflite.py --weights ./data/yolov4.weights --output ./data/yolov4-int8.tflite --quantize_mode full_int8 --dataset ./coco_dataset/coco/val207.txt
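convert_tflite.py wraps TensorFlow's post-training quantization; the --dataset file supplies calibration images for the representative dataset that full-integer quantization needs. A rough sketch of the underlying tf.lite converter API is shown below (illustrative only: the SavedModel path, the use of OpenCV, and the assumption that each line of val207.txt starts with an image path are mine, not the script's exact code):

import cv2
import numpy as np
import tensorflow as tf

SAVED_MODEL_DIR = "./checkpoints/yolov4-416"    # assumed export of the Keras model
CALIB_LIST = "./coco_dataset/coco/val207.txt"   # calibration image list

def representative_dataset(n=100):
    # Yield a few preprocessed images so the converter can calibrate int8 ranges.
    for line in open(CALIB_LIST).read().splitlines()[:n]:
        img = cv2.imread(line.split()[0])
        img = cv2.resize(img, (416, 416)).astype(np.float32) / 255.0
        yield [img[np.newaxis, ...]]

converter = tf.lite.TFLiteConverter.from_saved_model(SAVED_MODEL_DIR)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
open("./data/yolov4-int8.tflite", "wb").write(converter.convert())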
Convert to TensorRT
# yolov3
python save_model.py --weights ./data/yolov3.weights --output ./checkpoints/yolov3.tf --input_size 416 --model yolov3
python convert_trt.py --weights ./checkpoints/yolov3.tf --quantize_mode float16 --output ./checkpoints/yolov3-trt-fp16-416

# yolov3-tiny
python save_model.py --weights ./data/yolov3-tiny.weights --output ./checkpoints/yolov3-tiny.tf --input_size 416 --tiny
python convert_trt.py --weights ./checkpoints/yolov3-tiny.tf --quantize_mode float16 --output ./checkpoints/yolov3-tiny-trt-fp16-416

# yolov4
python save_model.py --weights ./data/yolov4.weights --output ./checkpoints/yolov4.tf --input_size 416 --model yolov4
python convert_trt.py --weights ./checkpoints/yolov4.tf --quantize_mode float16 --output ./checkpoints/yolov4-trt-fp16-416
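convert_trt.py builds on TensorFlow's TF-TRT integration, which rewrites a SavedModel so that supported subgraphs run through TensorRT. A rough sketch of that API (requires a TensorRT-enabled TensorFlow build; the exact parameters used by convert_trt.py may differ):

from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Convert the SavedModel produced by save_model.py into a TensorRT-optimized
# SavedModel with FP16 precision (paths taken from the commands above).
params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(precision_mode=trt.TrtPrecisionMode.FP16)
converter = trt.TrtGraphConverterV2(input_saved_model_dir="./checkpoints/yolov4.tf",
                                    conversion_params=params)
converter.convert()
converter.save("./checkpoints/yolov4-trt-fp16-416")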
Evaluate on COCO 2017 Dataset
# run script in /script/get_coco_dataset_2017.sh to download COCO 2017 Dataset

# preprocess coco dataset
cd data
mkdir dataset
cd ..
cd scripts
python coco_convert.py --input ./coco/annotations/instances_val2017.json --output val2017.pkl
python coco_annotation.py --coco_path ./coco
cd ..

# evaluate yolov4 model
python evaluate.py --weights ./data/yolov4.weights
cd mAP/extra
python remove_space.py
cd ..
python main.py --output results_yolov4_tf
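The evaluation above reports mAP through the repo's mAP/ scripts. If you prefer the official COCO metrics, the same detections can be scored with pycocotools (a generic sketch; "detections.json" is an assumed file in COCO result format, not something the repo produces):

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Ground truth and detections; detections.json is a hypothetical file of
# [{"image_id": ..., "category_id": ..., "bbox": [x, y, w, h], "score": ...}, ...]
coco_gt = COCO("./coco/annotations/instances_val2017.json")
coco_dt = coco_gt.loadRes("detections.json")

coco_eval = COCOeval(coco_gt, coco_dt, iouType="bbox")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints AP, AP50, AP75, ...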
mAP50 on COCO 2017 Dataset
Benchmark
python benchmarks.py --size 416 --model yolov4 --weights ./data/yolov4.weights
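If you only need a rough FPS number, a minimal timing loop over an exported SavedModel works too (a sketch, assuming the ./checkpoints/yolov4.tf export from the TensorRT section and a 416x416 input; benchmarks.py reports more detail):

import time
import numpy as np
import tensorflow as tf

# Load the SavedModel and grab its default serving signature.
model = tf.saved_model.load("./checkpoints/yolov4.tf")
infer = model.signatures["serving_default"]
image = tf.constant(np.random.random_sample((1, 416, 416, 3)).astype(np.float32))

for _ in range(10):            # warm-up runs
    infer(image)

runs = 100
start = time.perf_counter()
for _ in range(runs):
    infer(image)
print(f"{runs / (time.perf_counter() - start):.1f} FPS")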
TensorRT performance
Training your own model
# Prepare your dataset
# If you want to train from scratch:
In config.py set FISRT_STAGE_EPOCHS=0

# Run script:
python train.py

# Transfer learning:
python train.py --weights ./data/yolov4.weights
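For clarity, the "train from scratch" step is a single config edit, something like the following (the exact namespace around the key may differ in your copy of config.py, but the key is spelled FISRT_STAGE_EPOCHS as referenced above):

# config.py -- setting referenced in the training steps above; 0 disables the first training stage
FISRT_STAGE_EPOCHS = 0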
The training performance has not been fully reproduced yet, so it is recommended to train your own data with Alex's Darknet and then convert the resulting .weights file to TensorFlow or tflite.