Linear normalization
It can generally be written as follows:
function N = linear_normalize(D)
% Linear Normalization
max_value = max(max(D));
min_value = min(min(D));
N = (D - min_value) / (max_value - min_value);
end
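A quick check (the matrix below is just a hypothetical example): the output should span [0, 1].

D = [1 2; 3 5];               % hypothetical test matrix
N = linear_normalize(D);      % N = [0 0.25; 0.5 1]
% min(N(:)) is 0 and max(N(:)) is 1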
% MATLAB code: normalization to an arbitrary output range
function normalized = normalize(A, minOut, maxOut)
% input:
%   A:      the matrix to be normalized
%   minOut: the minimum value after normalization
%   maxOut: the maximum value after normalization
% output:
%   normalized: the normalized matrix
minA = min(min(A));
maxA = max(max(A));
normalized = (A - minA) / (maxA - minA) * (maxOut - minOut) + minOut;
end % end function
% normalization test
load A;
% keep the minimum and maximum of A so they can be used for denormalization
minA = min(min(A));
maxA = max(max(A));
% normalize A to B
B = normalize(A, 0, 255);
% denormalize B back to the original range of A
inverse_B = normalize(B, minA, maxA);
% at this point inverse_B should equal A
function B = gaussNormalization(M)
% input:
%   M: the matrix to be normalized
% output:
%   B: the normalized matrix
% Gauss normalization formula:
%   Z = ((X - mu) / (3 * sigma) + 1) / 2
%   mu is the mean of X, sigma is the standard deviation of X
% coder: flyskymlf
% time: 2009.10.31
[x, y] = size(M);
ave = sum(sum(M)) / (x * y);                      % mean value
total = sum(sum(M.^2 - M .* (2 * ave) + ave^2));  % sum of squared deviations
if total ~= 0
    len = 1 / (3 * sqrt(total / (x * y)));        % 1 / (3 * sigma)
else
    len = 1.0;
end
B = (M .* len - ave * len + 1) / 2;               % ((M - ave) / (3 * sigma) + 1) / 2
end
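A quick usage sketch (magic(4) is just a hypothetical input); inputs within μ ± 3σ of the mean map into [0, 1]:

M = magic(4);                 % hypothetical 4x4 test matrix
B = gaussNormalization(M);    % here the values fall roughly in [0.23, 0.77]
% values within mu +/- 3*sigma map into [0, 1], centred at 0.5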
Note: the code rearranges the formula slightly. For example, when computing the variance, the squared-deviation term (x_{i,j} - ave)^2 is expanded as x_{i,j}^2 - 2*x_{i,j}*ave + ave^2, i.e. (a - b)^2 = a^2 - 2ab + b^2.
Appendix: the Gaussian normalization formula:
Z = ((X - μ) / (3 * σ) + 1) / 2, where μ is the mean and σ is the standard deviation (σ² is the variance).
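For reference, a minimal equivalent using the built-in mean and std functions (a sketch, assuming the population standard deviation std(..., 1) so that it matches the hand-rolled computation above):

X = magic(4);                                         % hypothetical input matrix
Z = ((X - mean(X(:))) / (3 * std(X(:), 1)) + 1) / 2;  % same result as gaussNormalization(X)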
Energy normalization (scale so that sum(y.^2) equals 1 afterwards):
y = y / sqrt(sum(y.^2));
Scale so that the maximum value is 100:
y = y / max(y) * 100;
Scale so that the maximum absolute value is 8:
y = y / max(abs(y)) * 8;
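A quick sanity check of the three scalings on a hypothetical vector:

y = [3 -4 5];               % hypothetical test vector
e = y / sqrt(sum(y.^2));    % unit energy: sum(e.^2) == 1
m = y / max(y) * 100;       % maximum value is 100
a = y / max(abs(y)) * 8;    % maximum absolute value is 8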
References:
http://hi.baidu.com/mhyuycwnspbqswe/item/08bcd456e19729968c12eddc
https://answers.yahoo.com/question/index?qid=20100114224156AABcN2j
http://blog.csdn.net/mpbchina/article/details/7384487
Original article: http://blog.csdn.net/lansatiankongxxc/article/details/24020515