
Normalizing flows

Posted: 2020-02-12 09:04:51

Probability vs. likelihood (see the sketch after the links below):

https://zhuanlan.zhihu.com/p/25768606

http://sdsy888.me/%E9%9A%8F%E7%AC%94-Writing/2018/%E4%BC%BC%E7%84%B6%EF%BC%88likelihood%EF%BC%89%E5%92%8C%E6%A6%82%E7%8E%87%EF%BC%88probability%EF%BC%89%E7%9A%84%E5%8C%BA%E5%88%AB%E4%B8%8E%E8%81%94%E7%B3%BB/

https://www.psychologicalscience.org/observer/bayes-for-beginners-probability-and-likelihood
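
To make the distinction in those links concrete, here is a minimal Python sketch (not from the original post; the Gaussian model and the specific numbers are illustrative assumptions): probability fixes the parameters and asks about the data, while likelihood fixes the observed data and is read as a function of the parameters.

# A minimal sketch of probability vs. likelihood, using an assumed Gaussian example.
from scipy.stats import norm

# Probability: parameters fixed (mu=0, sigma=1), question asked about the data,
# e.g. P(x <= 1.0) under N(0, 1).
prob = norm(loc=0.0, scale=1.0).cdf(1.0)

# Likelihood: data fixed (observed x = 1.0), viewed as a function of the parameter mu,
# i.e. L(mu | x=1.0) = density of x=1.0 under N(mu, 1), evaluated for several mu.
observed_x = 1.0
likelihoods = {mu: norm(loc=mu, scale=1.0).pdf(observed_x) for mu in (-1.0, 0.0, 1.0, 2.0)}

print(f"P(x <= 1.0 | mu=0, sigma=1) = {prob:.4f}")
for mu, lik in likelihoods.items():
    print(f"L(mu={mu:+.1f} | x=1.0) = {lik:.4f}")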

To approach MLE, let's come at it from the Bayesian angle and use Bayes' theorem to frame the question:

P(β | y) = P(y | β) × P(β) / P(y)

Or, in English:

posterior = likelihood × prior / evidence
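
As a minimal numeric sketch of this identity (the binomial model, flat prior, and parameter grid below are illustrative assumptions, not from the original post), the following evaluates posterior = likelihood × prior / evidence on a discrete grid of β and compares the maximum-likelihood estimate with the posterior mode:

# Posterior = likelihood x prior / evidence on a discrete grid of beta values.
import numpy as np
from scipy.stats import binom

# Assume y = 7 successes in n = 10 trials, with beta the unknown success probability.
y, n = 7, 10
beta_grid = np.linspace(0.01, 0.99, 99)

prior = np.ones_like(beta_grid) / beta_grid.size   # flat prior P(beta)
likelihood = binom.pmf(y, n, beta_grid)            # P(y | beta)
evidence = np.sum(likelihood * prior)              # P(y), summed over the grid
posterior = likelihood * prior / evidence          # P(beta | y)

# MLE maximizes the likelihood alone; MAP maximizes the posterior.
beta_mle = beta_grid[np.argmax(likelihood)]
beta_map = beta_grid[np.argmax(posterior)]
print(f"MLE beta = {beta_mle:.2f}")   # 0.70 = y / n
print(f"MAP beta = {beta_map:.2f}")   # equals the MLE under the flat prior

Under the flat prior the posterior mode coincides with the MLE, which is the usual sense in which the Bayesian framing reduces to maximum likelihood.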


Original post: https://www.cnblogs.com/dulun/p/12297635.html
