A Roundup of Important Papers and Blog Posts on Recurrent Neural Networks


Recurrent Neural Network: A Learning Path

Adapted from http://blog.csdn.net/yangyangyang20092010/article/details/50374289

In the original post, each item is followed by representative figures from the corresponding article.

1. Read paper: “A Critical Review of Recurrent Neural Networks for Sequence Learning” (Lipton et al.).


2. Read blog: An introduction to LSTM and its mathematical derivation (full BPTT), a Chinese-language post.
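For reference, a standard formulation of the LSTM forward equations that such a derivation works through (the post's own notation may differ slightly):

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```

Full BPTT then backpropagates the loss through these equations across every time step of the unrolled sequence.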


3. Read blog: Recurrent Neural Networks Tutorial, Part 1 – Introduction to RNNs


Another way to build deep recurrent neural networks: “Generating Sequences with Recurrent Neural Networks” (Alex Graves, 2013).
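To illustrate the stacking idea, here is a minimal numpy sketch of a deep RNN in which each layer's hidden-state sequence becomes the input sequence of the layer above it. It uses plain tanh cells and omits the skip connections and LSTM units used in Graves' paper; all names and sizes below are made up for the example.

```python
import numpy as np

def rnn_layer(xs, Wx, Wh, b):
    """Run one vanilla (tanh) RNN layer over a sequence of input vectors."""
    h = np.zeros(Wh.shape[0])
    states = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h + b)
        states.append(h)
    return states

def deep_rnn(xs, layers):
    """Stack layers: layer l's hidden sequence is layer l+1's input sequence."""
    seq = xs
    for Wx, Wh, b in layers:
        seq = rnn_layer(seq, Wx, Wh, b)
    return seq  # hidden states of the top layer

# Toy example: a 2-layer deep RNN, input dim 4, hidden dim 8 per layer.
rng = np.random.default_rng(0)
layers = [
    (0.1 * rng.standard_normal((8, 4)), 0.1 * rng.standard_normal((8, 8)), np.zeros(8)),
    (0.1 * rng.standard_normal((8, 8)), 0.1 * rng.standard_normal((8, 8)), np.zeros(8)),
]
xs = [rng.standard_normal(4) for _ in range(5)]
top_states = deep_rnn(xs, layers)
```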



4. Read blog: “The Unreasonable Effectiveness of Recurrent Neural Networks” (Andrej Karpathy's blog)





5. Read blog: Understanding LSTM Networks (a nice post)
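As a code-level companion to that post, here is a minimal numpy sketch of a single LSTM step with the four gates computed from one stacked matrix multiply. The function and parameter names are just for this example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step; W, U, b hold the gates stacked in the order [i, f, o, g]."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # shape (4H,)
    i = sigmoid(z[0:H])               # input gate
    f = sigmoid(z[H:2*H])             # forget gate
    o = sigmoid(z[2*H:3*H])           # output gate
    g = np.tanh(z[3*H:4*H])           # candidate cell update
    c = f * c_prev + i * g            # new cell state
    h = o * np.tanh(c)                # new hidden state
    return h, c
```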



6. Read the important book: “Supervised Sequence Labelling with Recurrent Neural Networks” by Alex Graves

 

7. Read the LSTM tutorial slides from Jürgen Schmidhuber (IDSIA):

http://people.idsia.ch/~juergen/lstm/sld001.htm


8. Take notes on this paper: “A Clockwork RNN” (Koutník et al., ICML 2014)
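The core idea, roughly: the hidden state is partitioned into modules with exponentially increasing clock periods (1, 2, 4, 8, ...), and at time step t only the modules whose period divides t are recomputed; the others simply carry over their previous state. A simplified numpy sketch of that masked update (it omits the paper's block-triangular connectivity between modules):

```python
import numpy as np

def cwrnn_step(x, h_prev, t, Wx, Wh, b, periods, module_size):
    """Simplified Clockwork RNN step: only modules whose period divides t update."""
    h_full = np.tanh(Wx @ x + Wh @ h_prev + b)   # candidate update for all units
    h = h_prev.copy()
    for i, T in enumerate(periods):
        if t % T == 0:                           # module i is active at step t
            s = slice(i * module_size, (i + 1) * module_size)
            h[s] = h_full[s]
    return h
```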


9. Read a paper about early stopping during training: “Early Stopping - But When?” (Lutz Prechelt)
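That paper compares several stopping criteria; the simplest family stops once validation error has risen a given percentage above the best value seen so far. A small sketch of that kind of rule (the exact criteria and thresholds in the paper differ, and the helper names here are hypothetical):

```python
def should_stop(val_errors, alpha=5.0):
    """Stop when generalization loss GL = 100 * (E_val / E_best - 1) exceeds alpha."""
    e_best = min(val_errors)
    gl = 100.0 * (val_errors[-1] / e_best - 1.0)
    return gl > alpha

# Hypothetical use inside a training loop:
#   val_errors.append(evaluate(model, validation_set))
#   if should_stop(val_errors):
#       break
```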

10. Keep up with the work from DeepMind: https://deepmind.com/publications.html


Code to learn from:

1. Blog: http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/

2. Andrej Karpathy:

Minimal character-level language model with a Vanilla Recurrent Neural Network (the min-char-rnn.py gist)
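The gist is short enough to read in one sitting; its core is a tanh recurrence plus a softmax over characters. A loose numpy sketch of the sampling loop, in the spirit of that code (variable names here are illustrative, not copied from the gist):

```python
import numpy as np

def sample(h, seed_ix, n, Wxh, Whh, Why, bh, by, vocab_size):
    """Sample n character indices from a trained character-level vanilla RNN."""
    x = np.zeros(vocab_size)
    x[seed_ix] = 1                                # one-hot encoding of the seed char
    indices = []
    for _ in range(n):
        h = np.tanh(Wxh @ x + Whh @ h + bh)       # recurrent hidden-state update
        y = Why @ h + by                          # unnormalized log-probabilities
        p = np.exp(y) / np.sum(np.exp(y))         # softmax over the vocabulary
        ix = np.random.choice(vocab_size, p=p)    # draw the next character
        x = np.zeros(vocab_size)
        x[ix] = 1
        indices.append(ix)
    return indices
```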

3. Code from uyaseen; it has been verified to run.

https://github.com/uyaseen/theano-recurrence



