
Machine Learning Notes - from Andrew Ng's lecture videos

Date: 2015-01-09 17:11:16

I've had a bit of free time recently and didn't want to let it go to waste. I remembered that a while back I had bookmarked a machine learning link, Andrew Ng's course on NetEase Open Courses; its overfitting section came up when I was preparing a group-meeting report. Since I have time these days, I decided to finish the whole course and at least come away with a basic understanding.

I had originally planned to look up machine learning books online, and found that Li Hang's Statistical Learning Methods and PRML (Pattern Recognition and Machine Learning) are both highly recommended. I'll read them when I get the chance.

Then I ran into Tianyou at the library, and he recommended the Coursera site to me, which hosts Andrew Ng's machine learning course adapted for the web. It's quite good. The notes below are based on this course.

https://www.coursera.org/course/ml

Week one:

a: Machine learning

Supervised learning: regression, classification

Unsupervised learning: clustering

Also: reinforcement learning, recommender systems

b: Linear regression with one variable

Linear regression:

Hypothesis, cost function (why the least-squares cost has a factor of 2 in the denominator: the 1/2 cancels the 2 produced when differentiating the square, so the gradient comes out cleaner), contour plots (every point on a single contour line has the same cost value)
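For reference, the hypothesis and cost function for single-variable linear regression as used in the course:

h_\theta(x) = \theta_0 + \theta_1 x

J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2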

Gradient descent:

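The update rule from the lectures, repeated until convergence with both parameters updated simultaneously:

\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1), \quad j = 0, 1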

alpha: the learning rate

If α is too large, gradient descent can overshoot the minimum. It may fail to converge, or even diverge.

Gradient descent can converge to a local minimum, even with the learning rate α fixed.

Gradient descent for linear regression:

The squared-error cost function for linear regression is convex (bowl-shaped), so it has a single global minimum and gradient descent cannot get stuck in some other local minimum.

“Batch” Gradient Descent:

Batch: Each step of gradient descent uses all the training examples.
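A minimal Python/NumPy sketch of batch gradient descent for single-variable linear regression; the function name, learning rate, and toy data below are illustrative choices, not from the course:

import numpy as np

def batch_gradient_descent(x, y, alpha=0.01, num_iters=1500):
    """Fit h(x) = theta0 + theta1 * x by batch gradient descent."""
    m = len(y)                      # number of training examples
    theta0, theta1 = 0.0, 0.0       # initial parameters
    for _ in range(num_iters):
        h = theta0 + theta1 * x     # predictions for all m examples
        # "Batch": each step uses ALL training examples
        grad0 = (1.0 / m) * np.sum(h - y)
        grad1 = (1.0 / m) * np.sum((h - y) * x)
        # simultaneous update of both parameters
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# toy data: y is roughly 2x + 1
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.1, 4.9, 7.2, 9.0, 10.8])
print(batch_gradient_descent(x, y))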

c: Linear Algebra Review

If A is an m x m (square) matrix and it has an inverse A^-1, then A A^-1 = A^-1 A = I.

(How to tell whether a matrix has an inverse: a square matrix is invertible exactly when its determinant is nonzero, i.e. when it has full rank.)

Matrices that don’t have an inverse are “singular” or “degenerate”
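A small NumPy sketch of the determinant test and the inverse; the matrices here are made-up examples:

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.linalg.det(A))     # -2.0, nonzero, so A is invertible
print(np.linalg.inv(A))     # A^-1; A @ np.linalg.inv(A) gives the identity

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # rows are linearly dependent
print(np.linalg.det(S))     # 0.0, so S is singular ("degenerate")
# np.linalg.inv(S) would raise numpy.linalg.LinAlgError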


Original post: http://www.cnblogs.com/holyprince/p/4213623.html
