We will implement a Recurrent Neural Network (RNN) in numpy.
Recurrent Neural Networks (RNNs) are very effective for Natural Language Processing and other sequence tasks because they have "memory". They read the inputs \(x^{\langle t \rangle}\) (such as words) one at a time, and remember some information/context by passing a hidden-layer activation from one time-step to the next. This allows a uni-directional RNN to use information from the past to process later inputs, while a bidirectional RNN can draw on context from both the past and the future. A minimal sketch of a single forward time-step is shown below.
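To make the idea of "memory" concrete, here is a rough numpy sketch of one forward time-step of a basic RNN cell. The parameter names `Wax`, `Waa`, `Wya`, `ba`, `by` and the helper `rnn_cell_forward` follow the common assignment convention and are assumptions for illustration, not code taken from this post:

```python
import numpy as np

def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
    # One time-step of a simple RNN cell: the new hidden state a_next
    # mixes the current input xt with the previous hidden state a_prev
    # (the "memory"); yt_pred is the softmax prediction at this step.
    a_next = np.tanh(Wax @ xt + Waa @ a_prev + ba)
    z = Wya @ a_next + by
    yt_pred = np.exp(z) / np.sum(np.exp(z), axis=0, keepdims=True)
    return a_next, yt_pred

# Toy dimensions (assumed): n_x input features, n_a hidden units,
# n_y output classes, m examples in the batch.
np.random.seed(0)
n_x, n_a, n_y, m = 3, 5, 2, 10
xt     = np.random.randn(n_x, m)
a_prev = np.random.randn(n_a, m)
Wax    = np.random.randn(n_a, n_x)
Waa    = np.random.randn(n_a, n_a)
Wya    = np.random.randn(n_y, n_a)
ba     = np.random.randn(n_a, 1)
by     = np.random.randn(n_y, 1)

a_next, yt_pred = rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by)
print(a_next.shape, yt_pred.shape)  # (5, 10) (2, 10)
```

Running the full RNN then amounts to calling this cell once per time-step, feeding each `a_next` back in as the next step's `a_prev`.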
Notation:
Superscript \([l]\) denotes an object associated with the \(l^{th}\) layer.
Superscript \((i)\) denotes an object associated with the \(i^{th}\) example.
Superscript \(\langle t \rangle\) denotes an object at the \(t^{th}\) time-step.
Subscript \(i\) denotes the \(i^{th}\) entry of a vector.
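For instance, under these conventions \(a^{(2)\langle 3 \rangle}_5\) would denote the 5th entry of the hidden activation at time-step 3 of training example 2 (a combined example added here for illustration; it does not appear in the original notation list).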
Sequence Models - Week 1 Programming Assignment 1 (RNN step by step)
Original article: https://www.cnblogs.com/douzujun/p/13179646.html