Brief overview of backward and forward

Let’s say we only feed in one data point xi with true value yi. The four calls below make up one forward/backward pass (a runnable sketch follows the list).

  • out = model:forward(xi) computes fw(xi) where fw is our model with its current parameters w, and stores the result in out.
  • loss = criterion:forward(out, yi) computes the loss ℓ(fw(xi), yi) with respect to the true value yi.
  • dl_dout = criterion:backward(out, yi) computes the gradient of the loss with respect to the model output, ∂ℓ(fw(xi), yi)/∂fw(xi).
  • model:backward(xi, dl_dout) computes the gradient of the loss with respect to the parameters, ∂ℓ(fw(xi), yi)/∂w, and stores this gradient in a place we have a reference to, usually called gradParameters in our code.
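
Put together, the four calls look like this in practice. The sketch below is only illustrative: the nn.Linear model, nn.MSECriterion loss, and random data are assumptions, and the getParameters call plus the zeroing of gradParameters are the usual boilerplate around this pattern rather than something the list above prescribes.

```lua
require 'nn'

-- Illustrative toy model and loss (assumptions, not from the text above)
local model = nn.Linear(10, 1)          -- f_w: a small model with parameters w
local criterion = nn.MSECriterion()     -- loss l(f_w(x_i), y_i)

-- Flatten the parameters and their gradient buffer;
-- model:backward() accumulates into gradParameters.
local parameters, gradParameters = model:getParameters()

local xi = torch.randn(10)              -- one data point
local yi = torch.randn(1)               -- its true value

gradParameters:zero()                   -- gradients accumulate, so clear them first

local out     = model:forward(xi)            -- fw(xi)
local loss    = criterion:forward(out, yi)   -- l(fw(xi), yi)
local dl_dout = criterion:backward(out, yi)  -- dl/d fw(xi)
model:backward(xi, dl_dout)                  -- dl/dw, accumulated into gradParameters
```

From here, an optimizer such as optim.sgd would read gradParameters to update parameters, which is why keeping a reference to that buffer matters.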