Deep learning: PCA in 2D, a MATLAB implementation
Contents of pcaData.txt (2 rows of 45 values; row k holds feature k, column k holds data point k):
-6.7644914e-01 -6.3089308e-01 -4.8915202e-01 -4.8005424e-01 -3.7842021e-01 -3.3788391e-01 -3.2023528e-01 -3.1108837e-01 -2.3145555e-01 -1.9623727e-01 -1.5678926e-01 -1.4900779e-01 -1.0861557e-01 -1.0506308e-01 -8.0899829e-02 -7.1157518e-02 -6.3251073e-02 -2.6007219e-02 -2.2553443e-02 -5.8489047e-03 -4.3935323e-03 -1.7309716e-03 7.8223728e-03 7.5386969e-02 8.6608396e-02 9.6406046e-02 1.0331683e-01 1.0531131e-01 1.1493296e-01 1.3052813e-01 1.6626253e-01 1.7901863e-01 1.9267343e-01 1.9414427e-01 1.9770003e-01 2.3043613e-01 3.2715844e-01 3.2737163e-01 3.2922364e-01 3.4869293e-01 3.7500704e-01 4.2830153e-01 4.5432503e-01 5.4422436e-01 6.6539963e-01
-4.4722050e-01 -7.4778067e-01 -3.9074344e-01 -5.6036362e-01 -3.4291940e-01 -1.3832158e-01 1.2360939e-01 -3.3934986e-01 -8.2868433e-02 -2.4759514e-01 -1.0914760e-01 4.2243921e-01 -5.2329327e-02 -2.0126541e-01 1.3016657e-01 1.2293321e-01 -3.4787750e-01 -1.4584897e-01 -1.0559656e-01 -5.4200847e-02 1.6915422e-02 -1.1069762e-01 9.0859816e-02 1.5269096e-01 -9.4416463e-02 1.5116385e-01 -1.3540126e-01 2.4592698e-01 5.1087447e-02 2.4583340e-01 -5.9535372e-02 2.9704742e-01 1.0168115e-01 1.4258649e-01 1.0662592e-01 3.1698532e-01 6.1577841e-01 4.3911172e-01 2.7156501e-01 1.3572389e-01 3.1918066e-01 1.5122962e-01 3.4979047e-01 6.2316971e-01 5.2018811e-01
The data can also be downloaded from the deep learning tutorial's official site.

The MATLAB implementation is as follows:
clc; clear all; close all

%%================================================================
%% Step 0: Load data
%  We have provided the code to load data from pcaData.txt into x.
%  x is a 2 * 45 matrix, where the kth column x(:,k) corresponds to
%  the kth data point. You do not need to change the code below.

x = load('pcaData.txt', '-ascii');
figure(1);
scatter(x(1, :), x(2, :));
title('Raw data');

%%================================================================
%% Step 1a: Implement PCA to obtain U
%  Implement PCA to obtain the rotation matrix U, which is the
%  eigenbasis of sigma.

% -------------------- YOUR CODE HERE --------------------
u = zeros(size(x, 1)); % You need to compute this
% Each row of x is one feature: the 2*45 matrix holds 2 features
% measured over 45 data points.
avg = mean(x, 2);
x = x - repmat(avg, 1, size(x, 2)); % subtract the per-feature mean
sigma = x * x' / size(x, 2);        % covariance matrix
[U, S, V] = svd(sigma);
u = U;
% --------------------------------------------------------

hold on
plot([0 u(1,1)], [0 u(2,1)]);
plot([0 u(1,2)], [0 u(2,2)]);
scatter(x(1, :), x(2, :));
hold off

%%================================================================
%% Step 1b: Compute xRot, the projection on to the eigenbasis
%  Now, compute xRot by projecting the data on to the basis defined
%  by U. Visualize the points by performing a scatter plot.

% -------------------- YOUR CODE HERE --------------------
xRot = zeros(size(x)); % You need to compute this
xRot = U' * x;
% --------------------------------------------------------

% Visualise the covariance matrix. You should see a line across the
% diagonal against a blue background.
figure(2);
scatter(xRot(1, :), xRot(2, :));
title('xRot');

%%================================================================
%% Step 2: Reduce the number of dimensions from 2 to 1
%  Compute xRot again (this time projecting to 1 dimension).
%  Then, compute xHat by projecting the xRot back onto the original
%  axes to see the effect of dimension reduction.

% -------------------- YOUR CODE HERE --------------------
k = 1; % Use k = 1 and project the data onto the first eigenbasis
xHat = zeros(size(x)); % You need to compute this
xHat = U * [U(:,1:k), zeros(size(U,1), size(U,2)-k)]' * x;
% --------------------------------------------------------

figure(3);
scatter(xHat(1, :), xHat(2, :));
title('xHat');

%%================================================================
%% Step 3: PCA Whitening
%  Compute xPCAWhite and plot the results.

epsilon = 1e-5;
% -------------------- YOUR CODE HERE --------------------
xPCAWhite = zeros(size(x)); % You need to compute this
xPCAWhite = diag(1 ./ sqrt(diag(S) + epsilon)) * U' * x;
% --------------------------------------------------------

figure(4);
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');

%%================================================================
%% Step 4: ZCA Whitening
%  Compute xZCAWhite and plot the results.

% -------------------- YOUR CODE HERE --------------------
xZCAWhite = zeros(size(x)); % You need to compute this
xZCAWhite = U * xPCAWhite;
% Equivalently: xZCAWhite = U * diag(1./sqrt(diag(S)+epsilon)) * U' * x;
% --------------------------------------------------------

figure(5);
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');

%% Congratulations! When you have reached this point, you are done!
%  You can now move onto the next PCA exercise. :)
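The same pipeline can be sketched in Python/NumPy for readers without MATLAB. This is a hypothetical port, not part of the exercise: it uses randomly generated correlated 2-D data in place of pcaData.txt, but follows the identical steps (mean-centering, SVD of the covariance, rotation, 1-D reduction, PCA whitening, ZCA whitening).

```python
import numpy as np

# Toy data: 2 features x 45 samples, mirroring the shape of pcaData.txt.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 45))
x[1] += 0.8 * x[0]  # correlate the two features

# Step 1a: zero-mean each feature, then diagonalize the covariance via SVD.
x = x - x.mean(axis=1, keepdims=True)
sigma = x @ x.T / x.shape[1]
U, S, _ = np.linalg.svd(sigma)

# Step 1b: rotate into the eigenbasis; the coordinates become uncorrelated.
x_rot = U.T @ x

# Step 2: keep only the top component, then map back to the original axes.
k = 1
x_hat = U[:, :k] @ x_rot[:k, :]

# Step 3: PCA whitening - rescale each rotated coordinate to unit variance.
epsilon = 1e-5
x_pca_white = np.diag(1.0 / np.sqrt(S + epsilon)) @ x_rot

# Step 4: ZCA whitening - rotate the whitened data back to the input space.
x_zca_white = U @ x_pca_white

# The covariance of the whitened data is (approximately) the identity.
cov_white = x_pca_white @ x_pca_white.T / x.shape[1]
print(np.allclose(cov_white, np.eye(2), atol=1e-3))  # -> True
```

Note the role of epsilon: it regularizes the division so that near-zero eigenvalues do not blow up the whitened coordinates, which is why the whitened covariance is only approximately, not exactly, the identity.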