
Andrew Ng's *Deep Learning* Specialization - Quiz - Course 1 (Neural Networks and Deep Learning) - Week 3 - Shallow Neural Networks

Posted: 2019-12-12 11:57:04

Week 3 Quiz - Shallow Neural Networks

1. Which of the following are true? (Check all that apply.) Note: only the correct options are listed below.

【 】\(X\) is a matrix in which each column is one training example.

【 】\(a_4^{[2]}\) is the activation output by the 4th neuron of the 2nd layer.

【 】\(a^{[2](12)}\) denotes the activation vector of the 2nd layer for the 12th training example.

【 】\(a^{[2]}\) denotes the activation vector of the 2nd layer.

Answer

All of the above are correct.

2. The tanh activation usually works better than the sigmoid activation function for hidden units because the mean of its output is closer to zero, and so it centers the data better for the next layer. True/False?

【 】True 【 】False

Answer

True

Note: You can check this post and this paper.

As seen in lecture, the output of tanh is between -1 and 1; it thus centers the data, which makes learning simpler for the next layer.
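The zero-centering effect is easy to check empirically. This is a small sketch (the distribution of pre-activations is an assumption chosen for illustration): for symmetric inputs, tanh outputs average near 0 while sigmoid outputs average near 0.5.

```python
import numpy as np

# Hypothetical pre-activation values drawn from a symmetric distribution.
rng = np.random.default_rng(0)
z = rng.normal(loc=0.0, scale=2.0, size=10000)

sigmoid_out = 1.0 / (1.0 + np.exp(-z))  # outputs in (0, 1), mean near 0.5
tanh_out = np.tanh(z)                   # outputs in (-1, 1), mean near 0

print(f"sigmoid mean: {sigmoid_out.mean():.3f}")
print(f"tanh mean:    {tanh_out.mean():.3f}")
```

A hidden layer whose activations average near 0.5 passes a constant positive offset to the next layer; the tanh outputs are centered near zero and avoid that.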

3. Which of these is a correct vectorized implementation of forward propagation for layer \(l\), where \(1 \le l \le L\)? Note: only the correct options are listed below.

【 】\(Z^{[l]} = W^{[l]} A^{[l-1]} + b^{[l]}\)

【 】\(A^{[l]} = g^{[l]}(Z^{[l]})\)

Answer

Both are correct.
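The two formulas above can be combined into a single vectorized layer step. A minimal numpy sketch (the dimensions and the `forward_layer` helper are assumptions for illustration), where each column of the input matrix is one training example:

```python
import numpy as np

def forward_layer(A_prev, W, b, g):
    """One vectorized forward-propagation step for layer l:
    Z[l] = W[l] A[l-1] + b[l],  then  A[l] = g[l](Z[l])."""
    Z = W @ A_prev + b   # b has shape (n_l, 1) and broadcasts across columns
    return g(Z)

# Toy dimensions (assumed): 3 input features, 4 hidden units, 5 examples.
rng = np.random.default_rng(1)
A0 = rng.normal(size=(3, 5))   # each column is one training example
W1 = rng.normal(size=(4, 3))
b1 = np.zeros((4, 1))
A1 = forward_layer(A0, W1, b1, np.tanh)
print(A1.shape)                # one activation column per example
```

Note the shapes: \(W^{[l]}\) is (units in layer \(l\)) × (units in layer \(l-1\)), so all examples are processed in one matrix multiply instead of a loop.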

4. You are building a binary classifier for recognizing cucumbers (y=1) vs. watermelons (y=0). Which one of these activation functions would you recommend using for the output layer?

【 】ReLU 【 】Leaky ReLU 【 】sigmoid 【 】tanh

Answer

sigmoid

Note: The output value from a sigmoid function can be easily interpreted as a probability.

Sigmoid outputs a value between 0 and 1, which makes it a very good choice for binary classification: classify as 0 if the output is less than 0.5 and as 1 if it is greater than 0.5. This could also be done with tanh, but it is less convenient because the output lies between -1 and 1.
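The sigmoid-plus-threshold decision rule can be sketched in a few lines (the pre-activation values here are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical output-layer pre-activations for 4 examples.
z_out = np.array([-2.0, -0.1, 0.3, 4.0])
probs = sigmoid(z_out)              # each value in (0, 1), read as P(y=1)
preds = (probs > 0.5).astype(int)   # threshold at 0.5
print(preds)                        # [0 0 1 1]
```

Because sigmoid(0) = 0.5, thresholding the probability at 0.5 is the same as checking the sign of the pre-activation \(z\).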


Source: https://www.cnblogs.com/phoenixash/p/12027801.html
