MXNet from Basics to Advanced (15): Gradients and Backpropagation
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
"""
Created on Fri Aug 10 16:13:29 2018

@author: myhaspl
"""
from mxnet import nd
from mxnet import autograd

x = nd.array([[1, 2], [3, 4]])
x.attach_grad()  # reserve space on the NDArray to store the gradient
with autograd.record():  # define f(x): y = 2 * x * x
    y = 2 * x * x
# backpropagation: backward()
y.backward()
# f'(x) = 4 * x
z = x.grad
print x
print z
[[1. 2.]
[3. 4.]]
<NDArray 2x2 @cpu(0)>
[[ 4. 8.]
[12. 16.]]
<NDArray 2x2 @cpu(0)>
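To make the link between backward() and the hand-derived gradient explicit, here is a minimal verification sketch (assuming MXNet 1.x with the nd/autograd API used above; the names expected and diff are illustrative, not from the original post):

from mxnet import nd
from mxnet import autograd

x = nd.array([[1, 2], [3, 4]])
x.attach_grad()                # reserve space for the gradient of x
with autograd.record():        # record the forward computation
    y = 2 * x * x              # f(x) = 2 * x^2, element-wise
y.backward()                   # propagate gradients back to x

expected = 4 * x               # hand-derived gradient: f'(x) = 4 * x
diff = (x.grad - expected).abs().sum().asscalar()
print diff                     # 0.0 means autograd matches the analytic derivative

Note that y here is not a scalar: MXNet treats backward() on a non-scalar NDArray as if y.sum().backward() were called, which is why the element-wise derivative 4 * x comes out directly.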
######################
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
"""
Created on Fri Aug 10 16:13:29 2018

@author: myhaspl
"""
from mxnet import nd
from mxnet import autograd

def f(x):
    b = x
    while b.norm().asscalar() < 100:  # norm() computes the Euclidean norm
        b = b * 2  # y = a * x, with a = 2*2*...*2
        print b
    if b.sum().asscalar() >= 0:
        y = b[0]
    else:
        y = b[1]
    return y

x = nd.array([1, 4])
x.attach_grad()  # reserve space on the NDArray to store the gradient
with autograd.record():  # define f(x)
    y = f(x)
# backpropagation: backward()
y.backward()
# f'(x) = a, since y = a * x
z = x.grad
print "======="
print [z, x, y, y / x]  # a = y / x

[2. 8.]
<NDArray 2 @cpu(0)>
[ 4. 16.]
<NDArray 2 @cpu(0)>
[ 8. 32.]
<NDArray 2 @cpu(0)>
[16. 64.]
<NDArray 2 @cpu(0)>
[ 32. 128.]
<NDArray 2 @cpu(0)>
=======
[
[32. 0.]
<NDArray 2 @cpu(0)>,
[1. 4.]
<NDArray 2 @cpu(0)>,
[32.]
<NDArray 1 @cpu(0)>,
[32. 8.]
<NDArray 2 @cpu(0)>]
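The printed gradient [32. 0.] can be reproduced by hand: for x = [1, 4] the norm is sqrt(17) ≈ 4.12, so the loop doubles b five times before the norm reaches 100, giving b = 32 * x = [32, 128]. Because b.sum() = 160 >= 0, f returns y = b[0] = 32 * x[0], so the derivative is 32 with respect to x[0] and 0 with respect to x[1]; y / x = [32, 8] agrees only in the first component because y never depends on x[1]. A minimal sketch of that check (reusing f as defined above; expected is derived from this reasoning, not taken from the original post):

from mxnet import nd
from mxnet import autograd

x = nd.array([1, 4])
x.attach_grad()
with autograd.record():
    y = f(x)                   # f doubles x until its norm reaches 100, then returns one element
y.backward()

a = 2 ** 5                     # five doublings for this particular x, so b = 32 * x
expected = nd.array([a, 0])    # y = b[0] depends only on x[0]
diff = (x.grad - expected).abs().sum().asscalar()
print diff                     # 0.0 confirms the reasoning above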
Summary

The two examples above cover the basic MXNet autograd workflow: call attach_grad() on an NDArray to reserve gradient storage, run the forward computation inside autograd.record(), call backward() on the result, and read the gradient from x.grad. The second example shows that this also works when f(x) contains Python control flow such as loops and branches.