Mini-Batch Gradient Descent

1. What is Mini-Batch Gradient Descent?

Mini-Batch Gradient Descent is an algorithm between Batch Gradient Descent and Stochastic Gradient Descent. Concretely, it uses some number M of examples (more than one, but not all) in each iteration, as in the sketch below.
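For concreteness, here is a minimal NumPy sketch of a single mini-batch update for linear regression; the data, batch size M, and learning rate alpha are illustrative assumptions, not from the original post.

```python
import numpy as np

# Hypothetical setup (names and sizes are illustrative):
# N examples with a bias column x_0 = 1 and one feature x_1.
N, M, alpha = 1000, 32, 0.1                  # dataset size, batch size, learning rate
X = np.column_stack([np.ones(N), np.random.rand(N)])
y = 3.0 + 2.0 * X[:, 1] + 0.1 * np.random.randn(N)
theta = np.zeros(2)

# One iteration: estimate the gradient from M examples (not one, not all).
idx = np.random.choice(N, size=M, replace=False)
X_b, y_b = X[idx], y[idx]
grad = X_b.T @ (X_b @ theta - y_b) / M       # average gradient over the mini-batch
theta -= alpha * grad
```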

2. Computational Cost

The compute time of this algorithm depends on how many examples each iteration uses, so it is not fixed; in the worst case (M equal to the dataset size N) it matches Batch Gradient Descent: $O(N^2)$.

The table below shows the differences among these three Gradient Descent variants:

| | Batch Gradient Descent | Mini-Batch Gradient Descent | Stochastic Gradient Descent |
| --- | --- | --- | --- |
| Examples per iteration | all N examples | some (M, with 1 < M < N) examples | 1 example |
| Relative compute per iteration | intensive | in between | light |

3. Gradient Descent Formula

For all $\theta_i$:

$$\frac{\partial J(\theta)}{\partial \theta_i} = \frac{1}{M} \sum_{k=1}^{M} \left[ h_\theta(x^{(k)}) - y^{(k)} \right] x_i^{(k)}$$

where $(x^{(k)}, y^{(k)})$ is the $k$-th example in the mini-batch of size $M$.

E.g., with two parameters $\theta_0, \theta_1$: $h_\theta(x) = \theta_0 + \theta_1 x_1$

For i = 0:

$$\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{M} \sum_{k=1}^{M} \left[ h_\theta(x^{(k)}) - y^{(k)} \right] x_0^{(k)}$$

(with $x_0^{(k)} = 1$)

For i = 1:

$$\frac{\partial J(\theta)}{\partial \theta_1} = \frac{1}{M} \sum_{k=1}^{M} \left[ h_\theta(x^{(k)}) - y^{(k)} \right] x_1^{(k)}$$
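As a hedged sketch, these two partial derivatives can be computed like this (the function name `gradients` and the variable names are hypothetical; `x1_batch` holds the feature values and $x_0 \equiv 1$ is folded into the first term):

```python
import numpy as np

def gradients(theta0, theta1, x1_batch, y_batch):
    """Partial derivatives of J w.r.t. theta_0 and theta_1 on one mini-batch."""
    M = len(x1_batch)
    h = theta0 + theta1 * x1_batch                 # h_theta(x) = theta_0 + theta_1 * x_1
    residual = h - y_batch                         # h_theta(x^(k)) - y^(k)
    d_theta0 = residual.sum() / M                  # multiplied by x_0^(k) = 1
    d_theta1 = (residual * x1_batch).sum() / M     # multiplied by x_1^(k)
    return d_theta0, d_theta1
```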

Note that the dataset needs to be shuffled before iteration (typically re-shuffled at the start of every pass over the data), as in the sketch below.
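Putting the pieces together, a self-contained sketch of the full training loop might look as follows; it re-shuffles the data at the start of every epoch before slicing it into mini-batches (the hyperparameter defaults are arbitrary assumptions):

```python
import numpy as np

def minibatch_gd(X, y, alpha=0.1, epochs=50, M=32):
    """Mini-batch gradient descent for linear regression; X includes the bias column."""
    N, d = X.shape
    theta = np.zeros(d)
    for _ in range(epochs):
        perm = np.random.permutation(N)            # shuffle before every pass
        for start in range(0, N, M):
            batch = perm[start:start + M]          # next mini-batch of (up to) M indices
            X_b, y_b = X[batch], y[batch]
            grad = X_b.T @ (X_b @ theta - y_b) / len(batch)
            theta -= alpha * grad
    return theta
```

On synthetic data like that in the first sketch, this should recover parameters close to the true values (3.0, 2.0).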
