TensorFlow: the train.exponential_decay() method


This method applies exponential decay to the learning rate, so that in the later stages of training the model takes smaller, more precise update steps.

When training a model, it is often recommended to lower the learning rate as the training progresses. This function applies an exponential decay function to a provided initial learning rate. It requires a global_step value to compute the decayed learning rate. You can just pass a TensorFlow variable that you increment at each training step.

exponential_decay(
    learning_rate,
    global_step,
    decay_steps,
    decay_rate,
    staircase=False,
    name=None
)
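For illustration, a minimal TensorFlow 1.x-style usage sketch (the dummy loss and the names global_step and train_op are chosen for this example, not part of the API):

    import tensorflow as tf

    # Dummy variable and loss, for illustration only.
    w = tf.Variable(5.0)
    loss = tf.square(w)

    # Non-trainable step counter; minimize() below increments it once per training step.
    global_step = tf.Variable(0, trainable=False)

    # Start at 0.1 and multiply by 0.96 every 1000 steps (smoothly, since staircase=False).
    learning_rate = tf.train.exponential_decay(
        learning_rate=0.1,
        global_step=global_step,
        decay_steps=1000,
        decay_rate=0.96,
        staircase=False)

    optimizer = tf.train.GradientDescentOptimizer(learning_rate)
    train_op = optimizer.minimize(loss, global_step=global_step)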

Formula:

decayed_learning_rate = learning_rate * decay_rate ^ (global_step / decay_steps)
If the argument staircase is True, then global_step / decay_steps is an integer division and the decayed learning rate follows a staircase function.

That is, when staircase is True, global_step / decay_steps is truncated to an integer, so the learning rate stays constant within each block of decay_steps steps and only drops at the boundaries.
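To see the difference, here is a small plain-Python (3.x) sketch of the formula itself; decayed_lr is a hypothetical helper written for this post, not TensorFlow code:

    def decayed_lr(lr, global_step, decay_steps, decay_rate, staircase=False):
        # Integer (floor) division when staircase=True, real-valued exponent otherwise.
        p = global_step // decay_steps if staircase else global_step / decay_steps
        return lr * decay_rate ** p

    # staircase=False: the rate shrinks a little at every step.
    print(decayed_lr(0.1, 500, 1000, 0.96))                  # ~0.0979795897
    # staircase=True: unchanged until step 1000, then it drops to 0.096.
    print(decayed_lr(0.1, 500, 1000, 0.96, staircase=True))  # 0.1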

Examples:

Example 1:
    learning_rate = 0.1, decay_rate = 0.96, decay_steps = 1000, global_step = 1
    decayed learning rate = 0.0999959179

Example 2:
    learning_rate = 0.1, decay_rate = 0.96, decay_steps = 1000, global_step = 900
    decayed learning rate = 0.0963926921
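Both numbers can be reproduced directly from the formula, for example in plain Python:

    # global_step = 1:   0.1 * 0.96 ** (1 / 1000)   -> 0.0999959179...
    print(0.1 * 0.96 ** (1.0 / 1000))
    # global_step = 900: 0.1 * 0.96 ** (900 / 1000) -> 0.0963926921...
    print(0.1 * 0.96 ** (900.0 / 1000))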


