
Sequence Model - Week 1 Programming Assignment 1 (RNN step by step)


Building a Recurrent Neural Network, Step by Step

We will implement a recurrent neural network in numpy.

Recurrent Neural Networks (RNN) are very effective for Natural Language Processing and other sequence tasks because they have "memory". They read one input \(x^{\langle t \rangle}\) (such as a word) at a time, and remember some information/context by passing hidden-layer activations from one time-step to the next. This allows a uni-directional RNN to use information from the past to process later inputs, while a bidirectional RNN can draw context from both the past and the future.

Notation:

  • Superscript \([l]\) denotes an object associated with the \(l^{th}\) layer.

    • Example: \(a^{[4]}\) is the \(4^{th}\) layer activation. \(W^{[5]}\) and \(b^{[5]}\) are the \(5^{th}\) layer parameters.
  • Superscript \((i)\) denotes an object associated with the \(i^{th}\) example.

    • Example: \(x^{(i)}\) is the \(i^{th}\) training example input.
  • Superscript \(\langle t \rangle\) denotes an object at the \(t^{th}\) time-step.

    • Example: \(x^{\langle t \rangle}\) is the input x at the \(t^{th}\) time-step. \(x^{(i)\langle t \rangle}\) is the input at the \(t^{th}\) timestep of example \(i\).
  • Subscript \(i\) denotes the \(i^{th}\) entry of a vector.

    • Example: \(a^{[l]}_i\) denotes the \(i^{th}\) entry of the activations in layer \(l\).
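
As a preview of the assignment, here is a minimal numpy sketch of a single RNN cell forward step using the notation above. The parameter names `Wax`, `Waa`, `Wya`, `ba`, `by` and the `softmax` helper follow common conventions for this exercise and are assumptions here, not definitions taken from the text above.

```python
import numpy as np

def softmax(x):
    # Column-wise, numerically stable softmax
    e_x = np.exp(x - np.max(x, axis=0, keepdims=True))
    return e_x / np.sum(e_x, axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, parameters):
    """
    One time-step of a basic RNN cell.

    xt         -- input x<t>, shape (n_x, m)
    a_prev     -- hidden state a<t-1>, shape (n_a, m)
    parameters -- dict with Wax (n_a, n_x), Waa (n_a, n_a),
                  Wya (n_y, n_a), ba (n_a, 1), by (n_y, 1)
    """
    Wax, Waa, Wya = parameters["Wax"], parameters["Waa"], parameters["Wya"]
    ba, by = parameters["ba"], parameters["by"]

    # Hidden state: a<t> = tanh(Waa a<t-1> + Wax x<t> + ba)
    a_next = np.tanh(np.dot(Waa, a_prev) + np.dot(Wax, xt) + ba)
    # Prediction: y_hat<t> = softmax(Wya a<t> + by)
    yt_pred = softmax(np.dot(Wya, a_next) + by)

    return a_next, yt_pred

# Example usage with made-up dimensions (n_x=3, n_a=5, n_y=2, m=10 examples)
np.random.seed(1)
xt = np.random.randn(3, 10)
a_prev = np.random.randn(5, 10)
parameters = {
    "Wax": np.random.randn(5, 3), "Waa": np.random.randn(5, 5),
    "Wya": np.random.randn(2, 5), "ba": np.random.randn(5, 1),
    "by": np.random.randn(2, 1),
}
a_next, yt_pred = rnn_cell_forward(xt, a_prev, parameters)
print(a_next.shape, yt_pred.shape)  # (5, 10) (2, 10)
```

Running this cell over \(t = 1, \dots, T_x\), feeding each `a_next` back in as `a_prev`, is what gives the network its "memory" across the sequence.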


Original article: https://www.cnblogs.com/douzujun/p/13179646.html
