In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families). The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth). The Fisher information is thus not only used in the asymptotics of maximum-likelihood estimation; it is also used in the calculation of the Jeffreys prior in Bayesian statistics.
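Concretely, for a scalar parameter θ and a model with density f(x; θ), the two characterizations above can be written out as follows (the symbols ℓ, s, and 𝓘 are generic notation chosen here for illustration, and the usual regularity conditions are assumed):

\[
\ell(\theta; X) = \log f(X; \theta), \qquad
s(\theta; X) = \frac{\partial}{\partial \theta} \log f(X; \theta),
\]
\[
\mathcal{I}(\theta)
  = \operatorname{E}\!\left[\, s(\theta; X)^{2} \mid \theta \,\right]
  = \operatorname{Var}_{\theta}\!\big( s(\theta; X) \big)
  = -\operatorname{E}\!\left[\, \frac{\partial^{2}}{\partial \theta^{2}} \log f(X; \theta) \mid \theta \,\right].
\]

The second equality holds because the score has expectation zero under the model, so its second moment equals its variance; the third expression, minus the expected second derivative of the log-likelihood, is the expected observed information, and additionally requires that differentiation and integration can be interchanged twice.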