Andrew Ng Deep Learning: TensorFlow Version Issues
1. `module 'tensorflow' has no attribute 'global_variables_initializer'`

Fix: right after `import tensorflow as tf`, add `tf.compat.v1.disable_eager_execution()`, then change the original code to:

```python
y_hat = tf.constant(36, name='y_hat')               # Define y_hat constant. Set to 36.
y = tf.constant(39, name='y')                       # Define y. Set to 39.
loss = tf.Variable((y - y_hat)**2, name='loss')     # Create a variable for the loss
init = tf.compat.v1.global_variables_initializer()  # When init is run later (session.run(init)),
                                                    # the loss variable will be initialized
with tf.compat.v1.Session() as session:             # Create a session and print the output
    session.run(init)                               # Initializes the variables
    print(session.run(loss))                        # Prints the loss
```

2. `RuntimeError: The Session graph is empty. Add operations to the graph before calling run()`

Fix: create the session through the `compat.v1` API:
```python
with tf.compat.v1.Session() as session:
```

3. `placeholder` errors

Same as above: switching to the `compat.v1` API solves it:
```python
# Change the value of x in the feed_dict
x = tf.compat.v1.placeholder(tf.int64, name='x')
print(sess.run(2 * x, feed_dict={x: 3}))
sess.close()
```

4. `tf.contrib.layers.xavier_initializer` initialization

Two fixes can be found online.
One is to replace the code with:

```python
initializer = tf.keras.initializers.GlorotUniform(seed=0)
```

The other is to replace it with:
```python
initializer = tf.compat.v1.truncated_normal_initializer(stddev=1.0, seed=0)
```

(Note that in TF 2.x this initializer lives under `tf.compat.v1`, and a truncated normal is not Xavier initialization, so results may differ slightly from the first option.)

5. `tf.contrib.layers.flatten()`

Replace with:
```python
tf.compat.v1.layers.flatten()
```

6. Fully connected layers in TensorFlow 2.0

Replace with:
```python
tf.compat.v1.layers.dense(P2, 6)
```

Summary
The fixes above cover the TensorFlow version issues encountered in Andrew Ng's deep learning course assignments; hopefully they help you resolve the same errors.
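Putting fixes 1–3 together, here is a minimal self-contained sketch of the compat.v1 workflow (assuming TensorFlow 2.x is installed; the values follow the course snippet, where the loss is `(y - y_hat)**2 = (39 - 36)**2 = 9`):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # Fix 1: restore graph-mode semantics

y_hat = tf.constant(36, name='y_hat')
y = tf.constant(39, name='y')
loss = tf.Variable((y - y_hat) ** 2, name='loss')
init = tf.compat.v1.global_variables_initializer()

x = tf.compat.v1.placeholder(tf.int64, name='x')  # Fix 3: placeholder via compat.v1

with tf.compat.v1.Session() as session:  # Fix 2: Session via compat.v1
    session.run(init)                              # initialize the loss variable
    print(session.run(loss))                       # 9
    print(session.run(2 * x, feed_dict={x: 3}))    # 6
```

Running this prints `9` and `6`; every v1-only symbol (`Session`, `placeholder`, `global_variables_initializer`) is reached through `tf.compat.v1`, which is the common thread in all the fixes above.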