Four ways to adjust the learning rate
Fixed
The learning rate stays constant for the whole training run.
base_lr = 0.01  lr_policy = "fixed"
Step
The learning rate is multiplied by gamma after every stepsize iterations: LR(t) = base_lr × gamma^floor(t / stepsize)
base_lr = 0.01  lr_policy = "step"  gamma = 0.1  stepsize = 10000
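As an illustration, here is a minimal Python sketch of the step schedule; the helper name step_lr and its defaults simply mirror the solver fields above and are not Caffe's own code:

def step_lr(iteration, base_lr=0.01, gamma=0.1, stepsize=10000):
    # Multiply the base learning rate by gamma once per completed stepsize interval.
    return base_lr * gamma ** (iteration // stepsize)

# step_lr(0) -> 0.01, step_lr(10000) -> ~0.001, step_lr(25000) -> ~0.0001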
Polynomial
The learning rate decays along a polynomial curve: LR(t) = base_lr × (1 − t/T)^power, where T is the maximum number of training iterations (max_iter).
base_lr = 0.01  lr_policy = "poly"  power = 0.5
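A sketch of the polynomial schedule in the same style, assuming max_iter is the solver's maximum iteration count (the function name poly_lr is only illustrative):

def poly_lr(iteration, max_iter, base_lr=0.01, power=0.5):
    # Decays from base_lr at iteration 0 down to 0 at max_iter.
    return base_lr * (1.0 - iteration / max_iter) ** power

With power = 1 the decay is linear; power = 0.5 falls slowly at first and steeply near max_iter, while power > 1 does the opposite.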
Inv
The learning rate decreases as the number of iterations grows: LR(t) = base_lr × (1 + gamma × t)^(−power)
base_lr = 0.01  lr_policy = "inv"  gamma = 0.0001  power = 0.75
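And a sketch of the inv schedule (inv_lr is an illustrative name, not a Caffe API):

def inv_lr(iteration, base_lr=0.01, gamma=0.0001, power=0.75):
    # Smooth decay with no fixed end point; gamma controls how quickly it falls off.
    return base_lr * (1.0 + gamma * iteration) ** (-power)

# inv_lr(0) -> 0.01; inv_lr(100000) -> 0.01 * 11 ** -0.75, roughly 0.0017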