TFboy养成记: Multilayer Perceptron (MLP)
This post is a summary of Mofan's (莫烦) video tutorials.
The multilayer perceptron code here implements a simple three-layer neural network: an input layer, one hidden layer, and an output layer. The goal of the code is to fit a quadratic curve. To keep the data looking natural, Gaussian noise with mean 0 and stddev 0.05 is added.
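Concretely, the training data can be generated with NumPy, exactly as in the full code at the end of the post:

import numpy as np

# 300 points in [-1, 1], reshaped into a (300, 1) column vector
x_data = np.linspace(-1, 1, 300)[:, np.newaxis]
# Gaussian noise with mean 0 and stddev 0.05, same shape as x_data
noise = np.random.normal(0, 0.05, x_data.shape)
# target curve y = x^2 - 0.5, plus the noise
y_data = np.square(x_data) - 0.5 + noise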
Code for adding a layer:
def addLayer(inputs, inSize, outSize, activ_func=None):
    # inSize and outSize are the sizes of the layer's input and output, and
    # inputs is the input tensor. activ_func is the activation function; the
    # output layer has no activation, so the default is None.
    with tf.name_scope("layer"):
        with tf.name_scope("weights"):
            Weights = tf.Variable(tf.random_normal([inSize, outSize]), name="W")
        bias = tf.Variable(tf.zeros([1, outSize]), name="bias")
        W_plus_b = tf.matmul(inputs, Weights) + bias
        if activ_func is None:
            return W_plus_b
        else:
            return activ_func(W_plus_b)
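As a quick sanity check (a minimal sketch, not in the original post), you can inspect the static shape of a layer built with addLayer:

x = tf.placeholder(tf.float32, [None, 1])
hidden = addLayer(x, 1, 10, activ_func=tf.nn.relu)
print(hidden.get_shape())  # (?, 10): unknown batch size, 10 hidden units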
Inputs:

with tf.name_scope("inputs"):  # name_scope is mainly so the graph displays nicely in TensorBoard
    xs = tf.placeholder(tf.float32, [None, 1], name="x_input")  # note: None here, not -1
    ys = tf.placeholder(tf.float32, [None, 1], name="y_input")
l1 = addLayer(xs, 1, 10, activ_func=tf.nn.relu)
y_pre = addLayer(l1, 10, 1, activ_func=None)
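The None in the placeholder shape leaves the batch dimension open, so the same graph accepts any number of rows at feed time. A minimal sketch (assuming the graph above and the x_data array are already in scope, and using the newer initializer name):

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    five = np.linspace(-1, 1, 5)[:, np.newaxis]  # a batch of 5 rows
    print(sess.run(y_pre, feed_dict={xs: five}).shape)    # (5, 1)
    print(sess.run(y_pre, feed_dict={xs: x_data}).shape)  # (300, 1)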
The remaining pieces. A few points worth noting are flagged in the comments:
1 with tf.name_scope("loss"): 2 loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys-y_pre), 3 reduction_indices=[1]))#這里reduction_indices=[1]類似于numpy中的那種用法,是指橫向還是豎向,reduce_sum函數貌似主要是用于矩陣的,向量可以不使用 4 with tf.name_scope("train"): 5 train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss) 6 #在以后的版本中,這里的initialize_all_variable()可能被逐步拋棄使用global_variable_init(大概是這么寫的)那個函數。歡迎指正。 7 init = tf.initialize_all_variables()#init這一步很重要,在訓練前一定要是使用sess.run(init)操作(只要是你用到了Variable) 8 writer = tf.summary.FileWriter("logs/",sess.graph) 9 with tf.Session() as sess: 10 11 sess.run(init) 12 13 for i in range(1000): 14 sess.run(train_step,feed_dict = {xs:x_data,ys:y_data}) 15 if i % 50 == 0: 16 print(sess.run(loss,feed_dict = {xs:x_data,ys:y_data}))#只要是你的操作中有涉及到placeholder一定要記得使用feed_dict?所有代碼:
Full code:

# -*- coding: utf-8 -*-
"""
Created on Tue Jun 13 15:41:23 2017

@author: Jarvis
"""

import tensorflow as tf
import numpy as np

def addLayer(inputs, inSize, outSize, activ_func=None):
    with tf.name_scope("layer"):
        with tf.name_scope("weights"):
            Weights = tf.Variable(tf.random_normal([inSize, outSize]), name="W")
        bias = tf.Variable(tf.zeros([1, outSize]), name="bias")
        W_plus_b = tf.matmul(inputs, Weights) + bias
        if activ_func is None:
            return W_plus_b
        else:
            return activ_func(W_plus_b)

# training data: a quadratic curve with Gaussian noise
x_data = np.linspace(-1, 1, 300)[:, np.newaxis]
noise = np.random.normal(0, 0.05, x_data.shape)
y_data = np.square(x_data) - 0.5 + noise

with tf.name_scope("inputs"):
    xs = tf.placeholder(tf.float32, [None, 1], name="x_input")  # None, not -1
    ys = tf.placeholder(tf.float32, [None, 1], name="y_input")
l1 = addLayer(xs, 1, 10, activ_func=tf.nn.relu)
y_pre = addLayer(l1, 10, 1, activ_func=None)

with tf.name_scope("loss"):
    loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - y_pre),
                                        reduction_indices=[1]))
with tf.name_scope("train"):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

init = tf.initialize_all_variables()  # deprecated; newer: tf.global_variables_initializer()

with tf.Session() as sess:
    # the FileWriter needs sess.graph, so it is created once the session exists
    writer = tf.summary.FileWriter("logs/", sess.graph)
    sess.run(init)
    for i in range(1000):
        sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
        if i % 50 == 0:
            print(sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
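Once the script has run, the graph written by the FileWriter can be inspected by launching TensorBoard (for example, tensorboard --logdir logs/) and opening the Graphs tab in a browser. And as noted in the comments, initialize_all_variables() was deprecated; in later TF 1.x releases the equivalent call is:

init = tf.global_variables_initializer()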
Reposted from: https://www.cnblogs.com/silence-tommy/p/7039702.html