Looking for guidance from a KPCA expert
I am new to KPCA and downloaded this short KPCA program from the internet. It is probably correct, but I do not fully understand it. Suppose I have a source matrix s; I multiply it by a random orthogonal matrix to obtain a measurement matrix x, and then run KPCA on x to extract the principal components:
% square needs the Signal Processing Toolbox, exprnd the Statistics Toolbox
s = [rand(1,100); square((1:100)/100*20); exprnd(1,1,100)]; % 3 source signals, 100 samples each
m = size(s,1);     % Wg must have m = 3 columns so that Wg*s is defined
Wg = rand_orth(m); % m orthogonal columns
x = Wg*s;          % measurement matrix
kernel = {'gaussian',0.1};
Wcca = kpca_calc(x,kernel);
The function kpca_calc is as follows:
function basis = kpca_calc(xs,kernel,d,kmataxis);
% KPCA_CALC calculates a kernel PCA basis.
%
% usage
% basis = kpca_calc(xs,kernel,d);
%
% input
% xs matrix of column vectors
% kernel a chosen kernel, default = {'gaussian',1}
% d number of eigenvectors (give for efficiency),
% default = size(xs,2)
% kmataxis is a figure handle where the kernel matrix will be
% plotted (default = 0 no plot)
%
% output
% basis struct containing the following entries
% basis.V eigenvectors
% basis.Lambda eigenvalues
% basis.xs used vectors
% basis.kernel used kernel
%
% see also
% kpca_plot, kpca_map
%
% STH * 12MAR2002
if ~exist('kernel','var') || isempty(kernel), kernel = {'gaussian',1}; end
if ~exist('d','var') || isempty(d), d = size(xs,2); end
if ~exist('kmataxis','var') || isempty(kmataxis), kmataxis = 0; end
% d can't be larger than the number of samples
if d>size(xs,2)
warning('d is larger than the number of samples, resetting d')
d = size(xs,2);
end
xsc = size(xs,2); % number of columns in xs (not used below)
% calculate the kernel matrix
K = kpca_matrix(xs,xs,kernel); % calls kpca_matrix.m
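% Here K(i,j) = k(x_i, x_j), the kernel evaluated between columns i and j
% of xs; the exact parameterization of the {'gaussian',sigma} kernel
% (e.g. exp(-||x_i - x_j||^2 / (2*sigma^2))) is defined inside kpca_matrix.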
if kmataxis>0
cf = gcf;
figure(kmataxis)
imagesc(K)
figure(cf)
end
% center the kernel matrix
sk = size(K,1); % note: K is a square matrix
rowK = sum(K)/sk; % column means of K
allK = sum(K(:))/(sk*sk); % mean of all entries
K = K - repmat(rowK,[sk 1]) - repmat(rowK',[1 sk]) + repmat(allK,[sk sk]);
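% In matrix form the line above computes Kc = K - On*K - K*On + On*K*On with
% On = ones(sk)/sk (valid because K is symmetric), i.e. it centers the mapped
% samples phi(x_i) around their mean in feature space.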
% find the eigenvectors and eigenvalues
switch 2 % method selector: case 1 = jdqr (external solver), case 2 = eigs (built-in); hard-coded to 2
case 1
[V,Lambda] = jdqr(K/sk,d);
case 2
opts.disp = 0;
[V,Lambda,flag] = eigs(K/sk,d,'LM',opts);
if flag
warning([mfilename ': not all eigenvalues converged'])
end
end
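% At this point each column of V holds the expansion coefficients alpha_k of
% one feature-space eigenvector v_k = sum_j alpha_k(j)*phi_c(x_j), where
% phi_c is the centered feature map, and Lambda holds the corresponding
% eigenvalues of the centered K/sk.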
% we cannot assume that the eigenvalues are sorted
[dummy, ind] = sort(-diag(Lambda));
Lambda = Lambda(ind,ind);
V = V(:,ind);
% due to numerical instabilities some eigenvalues might be negative
% or smaller than eps; we want to ignore those
invalid = find(diag(Lambda)<2*eps); % indices of eigenvalues that are effectively zero or negative
if isempty(invalid)
% all eigenvalues are valid, keep d unchanged
else
% cut d back to the last usable eigenvalue (eigenvalues are sorted in descending order)
d = invalid(1)-1;
warning([mfilename ': some eigenvalues of kernel matrix are less than eps'])
end
clear invalid
% cut off those eigenvalues and eigenvectors
V = V(:,1:d);
Lambda = Lambda(1:d,1:d);
% normalize the eigenvectors in feature space
V = V*inv(sqrtm(sk*Lambda));
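% Dividing each column by sqrt(sk*lambda_k) makes the corresponding
% feature-space eigenvector v_k = sum_j alpha_k(j)*phi_c(x_j) have unit norm,
% since ||v_k||^2 = alpha_k'*K*alpha_k = sk*lambda_k*||alpha_k||^2 for the
% unit-length eigenvector alpha_k returned by eigs.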
% assign output struct
basis.V = V;
basis.Lambda = Lambda;
basis.xs = xs;
basis.kernel = kernel;
After the program has run, do I just compute x*basis.V to obtain the principal components?
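From what I have read, the textbook KPCA projection goes through the (centered) kernel matrix rather than through x itself, something like the sketch below. This is only my guess, and it assumes kpca_matrix(xs,ys,kernel) behaves the same way it is used inside kpca_calc (the companion function kpca_map mentioned in the help text probably does exactly this):

Kt = kpca_matrix(x,x,kernel);        % kernel matrix of the training data
n  = size(Kt,1);
On = ones(n)/n;
Kc = Kt - On*Kt - Kt*On + On*Kt*On;  % same centering as inside kpca_calc
pc = Kc*Wcca.V;                      % Wcca is the struct returned by kpca_calc above;
                                     % row i = KPCA coordinates of sample x(:,i)

Which of the two is the correct way to obtain the principal components?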
Any guidance from the experts would be much appreciated.