
Neural Networks and Deep Learning (1.1)


1. Perceptrons

A perceptron takes several binary inputs, x1, x2, ..., and produces a single binary output:

  [figure: a perceptron with several inputs and a single output]

Weights: real numbers expressing the importance of the respective inputs to the output.

The neuron's output, 0 or 1, is determined by whether the weighted sum $\sum_j w_j x_j$ is less than or greater than some threshold value. Just like the weights, the threshold is a real number which is a parameter of the neuron.

algebraic form:

  $$\text{output} = \begin{cases} 0 & \text{if } \sum_j w_j x_j \le \text{threshold} \\ 1 & \text{if } \sum_j w_j x_j > \text{threshold} \end{cases}$$

rewritten, using the dot product $w \cdot x \equiv \sum_j w_j x_j$ and the bias $b \equiv -\text{threshold}$:

$$\text{output} = \begin{cases} 0 & \text{if } w \cdot x + b \le 0 \\ 1 & \text{if } w \cdot x + b > 0 \end{cases}$$
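A minimal sketch of this rule in Python (the function name perceptron_output and the sample weights are illustrative, not from the original notes):

```python
def perceptron_output(w, x, b):
    """Perceptron rule: return 1 if w·x + b > 0, else 0."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if z > 0 else 0

# Example with made-up weights: two inputs, both weighted equally.
print(perceptron_output(w=(0.6, 0.6), x=(1, 1), b=-1))   # 1, since 0.2 > 0
print(perceptron_output(w=(0.6, 0.6), x=(1, 0), b=-1))   # 0, since -0.4 <= 0
```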

 

a NAND gate

  example:

  [figure: a two-input perceptron implementing a NAND gate]

0,0 --> w·x+b positive --> output 1

0,1 --> w·x+b positive --> output 1

1,0 --> w·x+b positive --> output 1

1,1 --> w·x+b negative --> output 0
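This sign pattern matches the weights used for the NAND example in Nielsen's book, w1 = w2 = -2 with bias b = 3 (treat these exact values as an assumption if the figure above differs); a quick check in Python:

```python
# Assumed NAND weights and bias (from Nielsen's example): w = (-2, -2), b = 3.
w, b = (-2, -2), 3

for x1 in (0, 1):
    for x2 in (0, 1):
        z = w[0] * x1 + w[1] * x2 + b        # weighted sum plus bias
        out = 1 if z > 0 else 0              # perceptron rule
        print((x1, x2), "positive" if z > 0 else "negative", "-> output", out)

# (0,0), (0,1), (1,0) give output 1; (1,1) gives output 0 -- exactly NAND.
```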

We can use perceptrons to compute simple logical functions.

In fact, we can use networks of perceptrons to compute any logical function at all.

  [figure: a network of perceptrons built from NAND gates]

2. Sigmoid neurons

The key property we want: small changes in any weight (or bias) cause only a small corresponding change in the output.

    [figure: a small change in any weight produces a small change in the output]

By changing the weights and biases over and over to produce better and better output, the network would be learning.

sigmoid function:

  $$\sigma(z) \equiv \frac{1}{1 + e^{-z}}$$

The output of a sigmoid neuron with inputs $x_1, x_2, \ldots$, weights $w_1, w_2, \ldots$, and bias $b$ is:

  $$\frac{1}{1 + \exp\!\left(-\sum_j w_j x_j - b\right)}$$

 

z = w·x + b            output σ(z)

large and positive     ≈ 1

large and negative     ≈ 0
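A short Python check of this behaviour (the function names are illustrative):

```python
import math

def sigmoid(z):
    """sigma(z) = 1 / (1 + e^(-z))"""
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_output(w, x, b):
    """Output of a sigmoid neuron: sigma(w·x + b)."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return sigmoid(z)

print(sigmoid(10.0))    # ~0.99995: z large and positive, output close to 1
print(sigmoid(-10.0))   # ~0.00005: z large and negative, output close to 0
print(sigmoid(0.0))     # 0.5: in between, the output is not simply 0 or 1
```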

In fact, the exact form of σ isn't so important - what really matters is the shape of the function when plotted.

  [figure: plot of the sigmoid function]

This shape is a smoothed out version of a step function:  

    [figure: plot of the step function]

$\Delta\text{output}$ is well approximated by

  $$\Delta\text{output} \approx \sum_j \frac{\partial\,\text{output}}{\partial w_j}\,\Delta w_j + \frac{\partial\,\text{output}}{\partial b}\,\Delta b$$
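A rough numerical check of this approximation for a single weight, using the sigmoid neuron defined above (all concrete inputs, weights, and the step size are made up for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x = (1.0, 0.5)            # illustrative inputs
w = [0.4, -0.3]           # illustrative weights
b = 0.1                   # illustrative bias

def output():
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Partial derivative of the output with respect to w[0]:
# d(output)/d(w0) = sigma'(z) * x0, where sigma'(z) = sigma(z) * (1 - sigma(z)).
z = sum(wj * xj for wj, xj in zip(w, x)) + b
d_output_dw0 = sigmoid(z) * (1 - sigmoid(z)) * x[0]

dw0 = 0.01                # a small change in w[0]
before = output()
w[0] += dw0
after = output()

print(after - before)      # actual change in the output
print(d_output_dw0 * dw0)  # predicted change: (d output / d w0) * delta w0
# The two numbers agree closely, as the linear approximation promises.
```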


3. The architecture of neural networks

input layer, output layer, hidden layer (means nothing more than "not an input or an output")

example:  

  [figure: an example multi-layer network with an input layer, hidden layers, and an output layer]

 

feedforward networks:

  There are no loops in the network - information is always fed forward, never fed back.

recurrent neural networks:

  The idea in these models is to have neurons which fire for some limited duration of time, before becoming quiescent. That firing can stimulate other neurons, which may fire a little while later, also for a limited duration. That causes still more neurons to fire, and so over time we get a cascade of neurons firing. Loops don't cause problems in such a model, since a neuron's output only affects its input at some later time, not instantaneously.
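A toy illustration of that idea in Python (the update rule and all numbers are made-up assumptions, not a model from the notes): the neuron's own output is fed back as part of its input, but only at the next time step, so the loop never has to resolve instantaneously.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative recurrent neuron: one external input plus its own previous output.
w_in, w_back, b = 1.0, -2.0, 0.5

output = 0.0                           # the neuron starts quiescent
for t, x in enumerate([1, 1, 0, 0, 1]):
    # The output computed at time t only influences the neuron at time t + 1.
    output = sigmoid(w_in * x + w_back * output + b)
    print("t =", t, " output =", round(output, 3))
```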

4. A simple network to classify handwritten digits

Two sub-problems:

(1) Breaking an image containing many digits into a sequence of separate images, each containing a single digit.

For example, break the image

[image: a handwritten number containing several digits]

into six separate images,

[image: the six digits as separate single-digit images]

(2) Classify each individual digit.

Recognize that the digit

[image: a handwritten digit]

is a 5.

 

A three-layer neural network:

[figure: a three-layer neural network for classifying a single handwritten digit]
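A skeletal forward pass for such a network in Python. The layer sizes (784 = 28x28 input pixels, 15 hidden neurons, 10 output neurons, one per digit) follow the book's setup and should be treated as assumptions here; with random weights the untrained network's prediction is meaningless until it learns.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed layer sizes: 784 input pixels, 15 hidden neurons, 10 output neurons.
sizes = [784, 15, 10]
rng = np.random.default_rng(0)
weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal(m) for m in sizes[1:]]

def feedforward(a):
    """Propagate an input vector through the network, layer by layer."""
    for w, b in zip(weights, biases):
        a = sigmoid(w @ a + b)
    return a

image = rng.random(784)                 # stand-in for a flattened 28x28 digit image
activations = feedforward(image)
print("most active output neuron:", int(np.argmax(activations)))
```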

 


Original: http://www.cnblogs.com/zhoulixue/p/6489724.html
