The paper:
Hui Zou, Trevor Hastie, and Robert Tibshirani,
Sparse Principal Component Analysis,
Journal of Computational and Graphical Statistics, 15(2): 265-286, 2006.
Reproduction of the synthetic example in Section 5.2 using R:
library(elasticnet)

## sample version of SPCA
n = 1000
v1 = rnorm(n, 0, sqrt(290))
v2 = rnorm(n, 0, sqrt(300))
v3 = -0.3*v1 + 0.925*v2 + rnorm(n)

# x1-x4 are noisy copies of the hidden factor v1
x1 = v1 + rnorm(n)
x2 = v1 + rnorm(n)
x3 = v1 + rnorm(n)
x4 = v1 + rnorm(n)

# x5-x8 are noisy copies of the hidden factor v2
x5 = v2 + rnorm(n)
x6 = v2 + rnorm(n)
x7 = v2 + rnorm(n)
x8 = v2 + rnorm(n)

# x9-x10 are noisy copies of v3, a linear combination of v1 and v2
x9  = v3 + rnorm(n)
x10 = v3 + rnorm(n)

x = cbind(x1, x2, x3, x4, x5, x6, x7, x8, x9, x10)
x.cov = t(x) %*% x / n; head(x.cov)   # (approximate) sample covariance; the simulated variables have mean ~0

# two sparse PCs, each restricted to 4 non-zero loadings
a = spca(x, 2, type = "predictor", sparse = "varnum", para = c(4, 4), lambda = 0)
a
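For comparison (not part of the original code), the ordinary PCA loadings of the same simulated data can be inspected with base R's prcomp; unlike the sparse loadings below, they put non-zero weight on all ten variables, which is exactly the behaviour SPCA is meant to avoid. A minimal sketch:

# Ordinary PCA on the same simulated data, for contrast with the sparse loadings.
pc = prcomp(x, center = TRUE, scale. = FALSE)
round(pc$rotation[, 1:2], 3)       # dense loadings: every variable gets some weight
summary(pc)$importance[2, 1:2]     # proportion of variance explained by PC1 and PC2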
## population version of SPCA
# within-block covariance of x1-x4: Var(v1) = 290, plus unit noise on the diagonal
g1 = matrix(290, 4, 4)
diag(g1) = 291

# within-block covariance of x5-x8: Var(v2) = 300, plus unit noise on the diagonal
g2 = matrix(300, 4, 4)
diag(g2) = 301

# within-block covariance of x9-x10: Var(v3) = 0.3^2*290 + 0.925^2*300 + 1 = 283.7875
g3 = matrix(283.7875, 2, 2)
diag(g3) = diag(g3) + 1

# cross-covariances: Cov(v1, v3) = -0.3*290 = -87, Cov(v2, v3) = 0.925*300 = 277.5
g1g3 = matrix(-87, 4, 2)
g2g3 = matrix(277.5, 4, 2)

# construct the exact covariance matrix
x.cov = matrix(0, 10, 10)
x.cov[1:4, 1:4]   = g1
x.cov[5:8, 5:8]   = g2
x.cov[9:10, 9:10] = g3
x.cov[1:4, 9:10]  = g1g3
x.cov[9:10, 1:4]  = t(g1g3)
x.cov[5:8, 9:10]  = g2g3
x.cov[9:10, 5:8]  = t(g2g3)

b = spca(x.cov, 2, type = "Gram", sparse = "varnum", para = c(4, 4), lambda = 0)
b
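The hand-entered blocks follow from the usual variance formulas, e.g. Var(v3) = 0.3^2*290 + 0.925^2*300 + 1 = 283.7875 and Cov(v1, v3) = -0.3*290 = -87. As a cross-check (not in the original post), the same matrix can be rebuilt from the factor-model form B D B' + I:

# Cross-check: rebuild the exact covariance from the factor model
# x = B %*% c(v1, v2, e3) + unit noise, with Var(v1) = 290, Var(v2) = 300, Var(e3) = 1.
B = rbind(matrix(c(1, 0, 0), 4, 3, byrow = TRUE),          # x1-x4 load on v1
          matrix(c(0, 1, 0), 4, 3, byrow = TRUE),          # x5-x8 load on v2
          matrix(c(-0.3, 0.925, 1), 2, 3, byrow = TRUE))   # x9-x10 load on v3
Sigma = B %*% diag(c(290, 300, 1)) %*% t(B) + diag(10)
max(abs(Sigma - x.cov))   # zero up to floating-point rounding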
The results of the population version, which uses the exact covariance matrix, match the paper exactly:
> b
Call:
spca(x = x.cov, K = 2, para = c(4, 4), type = "Gram", sparse = "varnum", lambda = 0)

2 sparse PCs
Pct. of exp. var. :  40.9 39.5
Num. of non-zero loadings :  4 4
Sparse loadings
      PC1 PC2
 [1,] 0.0 0.5
 [2,] 0.0 0.5
 [3,] 0.0 0.5
 [4,] 0.0 0.5
 [5,] 0.5 0.0
 [6,] 0.5 0.0
 [7,] 0.5 0.0
 [8,] 0.5 0.0
 [9,] 0.0 0.0
[10,] 0.0 0.0
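Those two percentages can be verified by hand. spca reports adjusted explained variance, but because the two sparse PCs here load on uncorrelated blocks of variables, this reduces to the plain quadratic form v'Σv divided by the total variance trace(Σ). A quick sanity check (not in the original post):

# Verify the reported 40.9% and 39.5% from the exact covariance matrix.
load1 = c(rep(0, 4), rep(0.5, 4), 0, 0)    # PC1 loads on x5-x8
load2 = c(rep(0.5, 4), rep(0, 4), 0, 0)    # PC2 loads on x1-x4
total = sum(diag(x.cov))                   # total variance = trace of the covariance
100 * c(t(load1) %*% x.cov %*% load1,
        t(load2) %*% x.cov %*% load2) / total   # about 40.9 and 39.5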
The sample version, however, varies a little from run to run because the data are randomly generated:
> a
Call:
spca(x = x, K = 2, para = c(4, 4), type = "predictor", sparse = "varnum", lambda = 0)

2 sparse PCs
Pct. of exp. var. :  37.9 37.6
Num. of non-zero loadings :  4 4
Sparse loadings
       PC1    PC2
x1   0.000 -0.303
x2   0.000 -0.533
x3   0.000 -0.576
x4   0.000 -0.540
x5  -0.492  0.000
x6  -0.287  0.000
x7  -0.481  0.000
x8  -0.666  0.000
x9   0.000  0.000
x10  0.000  0.000
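The original code does not fix a random seed, so repeated runs give slightly different loadings. A small sketch (the seed and the larger n are arbitrary choices, not from the original post) that makes a run reproducible and also shows the sample covariance settling toward the exact covariance matrix built above:

# With a fixed seed the simulation is reproducible; with a larger n the sample
# covariance gets close to the exact covariance matrix x.cov constructed above.
set.seed(2006)                                   # arbitrary seed
n2 = 1e5
w1 = rnorm(n2, 0, sqrt(290)); w2 = rnorm(n2, 0, sqrt(300))
w3 = -0.3*w1 + 0.925*w2 + rnorm(n2)
xs = cbind(w1, w1, w1, w1, w2, w2, w2, w2, w3, w3) + matrix(rnorm(10*n2), n2, 10)
max(abs(cov(xs) - x.cov))   # small relative to the entries of x.cov (which are ~300)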
Have fun learning sparse PCA!
Sparse PCA: reproduction of the synthetic example
Original post: http://www.cnblogs.com/shalijiang/p/4077566.html