variable vs get_variable
Variable
TensorFlow has two ops for creating variables, tf.Variable() and tf.get_variable(). This post explains the difference between them.
tf.Variable vs tf.get_variable()
```python
tf.Variable(initial_value=None, trainable=True, collections=None,
            validate_shape=True, caching_device=None, name=None,
            variable_def=None, dtype=None, expected_shape=None,
            import_scope=None)

tf.get_variable(name, shape=None, dtype=None, initializer=None,
                regularizer=None, trainable=True, collections=None,
                caching_device=None, partitioner=None,
                validate_shape=True, custom_getter=None)
```
Differences

tf.Variable() always creates a new variable object; if a variable with the given name already exists, TensorFlow uniquifies the name by appending a suffix. tf.get_variable() requires a name, and either creates the variable or, when reuse is enabled in the enclosing variable_scope, returns the existing variable registered under that name.
The essential difference between get_variable() and Variable
Consider the following code:
```python
import tensorflow as tf

with tf.variable_scope("scope1"):
    w1 = tf.get_variable("w1", shape=[])
    w2 = tf.Variable(0.0, name="w2")

with tf.variable_scope("scope1", reuse=True):
    w1_p = tf.get_variable("w1", shape=[])
    w2_p = tf.Variable(1.0, name="w2")

print(w1 is w1_p, w2 is w2_p)
# Output:
# True False
```
This makes clear what the documentation means by variable sharing. Since tf.Variable() creates a new object every time, reuse=True has no effect on it at all. tf.get_variable(), on the other hand, returns the existing variable object if one has already been created under that name, and creates a new one otherwise.
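The get-or-create behavior of get_variable() can be sketched in plain Python with a dictionary acting as the variable store. This is a simplified stand-in for illustration only: the FakeVariable class, the module-level store, and the scope/reuse arguments are hypothetical, not the actual TensorFlow implementation.

```python
class FakeVariable:
    """Stand-in for a TF variable; only holds a name and a value."""
    def __init__(self, name, value):
        self.name = name
        self.value = value

_store = {}  # maps full variable name -> FakeVariable

def get_variable(scope, name, value=0.0, reuse=False):
    # Mimic tf.get_variable: under reuse, return the existing object;
    # otherwise create and register a new one, refusing duplicates.
    full_name = scope + "/" + name
    if reuse:
        return _store[full_name]
    if full_name in _store:
        raise ValueError("Variable %s already exists" % full_name)
    var = FakeVariable(full_name, value)
    _store[full_name] = var
    return var

w1 = get_variable("scope1", "w1")
w1_p = get_variable("scope1", "w1", reuse=True)
print(w1 is w1_p)  # True: the same object comes back
```

Note that, as in TensorFlow, asking for an existing name without reuse raises an error rather than silently creating a second variable.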
random Tensor
Any of the following can be used as the initial_value (first argument) of tf.Variable():
```python
tf.random_normal(shape, mean=0.0, stddev=1.0, dtype=tf.float32, seed=None, name=None)
tf.truncated_normal(shape, mean=0.0, stddev=1.0, dtype=tf.float32, seed=None, name=None)
tf.random_uniform(shape, minval=0, maxval=None, dtype=tf.float32, seed=None, name=None)
tf.random_shuffle(value, seed=None, name=None)
tf.random_crop(value, size, seed=None, name=None)
tf.multinomial(logits, num_samples, seed=None, name=None)
tf.random_gamma(shape, alpha, beta=None, dtype=tf.float32, seed=None, name=None)
tf.set_random_seed(seed)
```
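Of these, tf.truncated_normal is documented to redraw any sample that falls more than two standard deviations from the mean. A plain-Python sketch of that rule, using only the standard library (an illustration of the documented behavior, not the TF implementation):

```python
import random

def truncated_normal(n, mean=0.0, stddev=1.0, seed=None):
    # Draw n Gaussian samples, redrawing any value that lands more than
    # two standard deviations from the mean -- tf.truncated_normal's rule.
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.gauss(mean, stddev)
        if abs(x - mean) <= 2 * stddev:
            out.append(x)
    return out

samples = truncated_normal(1000, mean=0.0, stddev=1.0, seed=42)
print(max(abs(x) for x in samples))  # always <= 2.0 by construction
```

This truncation is why truncated_normal is a common choice for weight initialization: it avoids the occasional large outlier that a plain normal draw can produce.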
constant value tensor
```python
tf.zeros(shape, dtype=tf.float32, name=None)
tf.zeros_like(tensor, dtype=None, name=None)
tf.ones(shape, dtype=tf.float32, name=None)
tf.ones_like(tensor, dtype=None, name=None)
tf.fill(dims, value, name=None)
tf.constant(value, dtype=None, shape=None, name='Const')
```
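tf.zeros and tf.ones can be understood as special cases of tf.fill, which builds a tensor of the given shape with every element set to one value. A nested-list sketch of that semantics (illustration only, not the TF implementation):

```python
def fill(dims, value):
    # Recursively build a nested list of shape `dims` where every
    # element equals `value` -- the semantics of tf.fill.
    if not dims:
        return value
    return [fill(dims[1:], value) for _ in range(dims[0])]

zeros = fill([2, 3], 0.0)  # like tf.zeros([2, 3])
ones = fill([3], 1.0)      # like tf.ones([3])
print(zeros)  # [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
```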
initializer
```python
tf.constant_initializer(value=0, dtype=tf.float32)
tf.random_normal_initializer(mean=0.0, stddev=1.0, seed=None, dtype=tf.float32)
tf.truncated_normal_initializer(mean=0.0, stddev=1.0, seed=None, dtype=tf.float32)
tf.random_uniform_initializer(minval=0, maxval=None, seed=None, dtype=tf.float32)
tf.uniform_unit_scaling_initializer(factor=1.0, seed=None, dtype=tf.float32)
tf.zeros_initializer(shape, dtype=tf.float32, partition_info=None)
tf.ones_initializer(dtype=tf.float32, partition_info=None)
tf.orthogonal_initializer(gain=1.0, dtype=tf.float32, seed=None)
```
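These are passed to tf.get_variable(..., initializer=...), and each is essentially a factory: calling it returns a function that maps a shape to initial values. A minimal plain-Python sketch of tf.constant_initializer in that style (simplified; the nested-list representation stands in for real tensors):

```python
def constant_initializer(value=0.0):
    # Factory: returns a callable that, given a shape, produces the
    # initial values (nested lists here instead of tensors).
    def init(shape):
        if not shape:
            return value
        return [init(shape[1:]) for _ in range(shape[0])]
    return init

init = constant_initializer(0.5)
print(init([2, 2]))  # [[0.5, 0.5], [0.5, 0.5]]
```

The factory shape explains why you write tf.constant_initializer(0.5) with parentheses when passing it to get_variable: the outer call configures the initializer, and get_variable later invokes the result with the variable's shape.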
References

- https://www.tensorflow.org/api_docs/python/state_ops/variables#Variable
- https://www.tensorflow.org/api_docs/python/state_ops/sharing_variables#get_variable
- https://www.tensorflow.org/versions/r0.10/api_docs/python/constant_op/
- https://www.tensorflow.org/api_docs/python/state_ops/
Source: http://blog.csdn.net/u012436149/article/details/53696970