Learning rate decay in TensorFlow

import tensorflow as tf

x = tf.Variable(1.0)
y = x.assign_add(1.0)            # an op that adds 1 to x each time it is run
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(x))           # 1.0 -- x still holds its initial value
    print(sess.run(y))           # 2.0 -- running y performs the in-place update
    print(sess.run(x))           # 2.0 -- x has been mutated

The output is 1.0, 2.0, 2.0. Note that x itself changes: y is not a value but an update op, and running it mutates x in place.
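Because y is an op rather than a value, every sess.run(y) applies the update again. A minimal sketch (assuming the same TF1 graph/session API as above) makes this explicit; this run-on-demand behavior is exactly what makes assign_add usable as a step counter in the decay example below:

import tensorflow as tf

x = tf.Variable(1.0)
y = x.assign_add(1.0)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y))  # 2.0
    print(sess.run(y))  # 3.0 -- the update re-runs on every sess.run(y)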

import tensorflow as tf

global_step = tf.Variable(0, trainable=False)
initial_learning_rate = 0.1  # initial learning rate
learning_rate = tf.train.exponential_decay(initial_learning_rate,
                                           global_step=global_step,
                                           decay_steps=10,
                                           decay_rate=0.9)
opt = tf.train.GradientDescentOptimizer(learning_rate)  # would consume the decayed rate
add_global = global_step.assign_add(1)  # manually advance the step counter
with tf.Session() as sess:
    tf.global_variables_initializer().run()
    print(sess.run(learning_rate))      # 0.1 at global_step = 0
    for i in range(1):
        _, rate = sess.run([add_global, learning_rate])
        print(rate)                     # ~0.098952 at global_step = 1
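tf.train.exponential_decay computes

decayed_learning_rate = initial_learning_rate * decay_rate ^ (global_step / decay_steps)

(with the exponent floored to an integer when staircase=True, so the rate drops in discrete steps instead of continuously). A quick hand check in plain Python, with the same constants as above, reproduces the value printed after one step:

initial_learning_rate = 0.1
decay_rate = 0.9
decay_steps = 10.0
global_step = 1.0
print(initial_learning_rate * decay_rate ** (global_step / decay_steps))
# ~0.098952 -- matches the rate printed by the session above

In real training you normally do not run assign_add yourself: passing global_step to minimize() makes the optimizer increment it on every training step, so the learning rate decays automatically. A minimal sketch of that wiring, assuming a toy quadratic loss (the loss, variable names, and step count here are illustrative, not from the original post):

import tensorflow as tf

global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(0.1, global_step,
                                           decay_steps=10, decay_rate=0.9)
w = tf.Variable(5.0)
loss = tf.square(w)  # toy loss: drive w toward 0
opt = tf.train.GradientDescentOptimizer(learning_rate)
train_op = opt.minimize(loss, global_step=global_step)  # increments global_step

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(3):
        sess.run(train_op)  # one gradient step; also advances global_step
        step, rate = sess.run([global_step, learning_rate])
        print(step, rate)   # the rate shrinks as the step counter grows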

Reference:
http://blog.csdn.net/u012436149/article/details/62058318
