Machine Learning Week 8 Programming Assignment: K-Means Clustering and PCA


1. ex7.m

%% Machine Learning Online Class
%  Exercise 7 | Principal Component Analysis and K-Means Clustering
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  exercise. You will need to complete the following functions:
%
%     pca.m
%     projectData.m
%     recoverData.m
%     computeCentroids.m
%     findClosestCentroids.m
%     kMeansInitCentroids.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% ================= Part 1: Find Closest Centroids ====================
%  To help you implement K-Means, we have divided the learning algorithm
%  into two functions -- findClosestCentroids and computeCentroids. In this
%  part, you should complete the code in the findClosestCentroids function.
%
fprintf('Finding closest centroids.\n\n');

% Load an example dataset that we will be using
load('ex7data2.mat');

% Select an initial set of centroids
K = 3; % 3 Centroids
initial_centroids = [3 3; 6 2; 8 5];

% Find the closest centroids for the examples using the
% initial_centroids
idx = findClosestCentroids(X, initial_centroids);

fprintf('Closest centroids for the first 3 examples: \n')
fprintf(' %d', idx(1:3));
fprintf('\n(the closest centroids should be 1, 3, 2 respectively)\n');

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ===================== Part 2: Compute Means =========================
%  After implementing the closest centroids function, you should now
%  complete the computeCentroids function.
%
fprintf('\nComputing centroid means.\n\n');

%  Compute means based on the closest centroids found in the previous part.
centroids = computeCentroids(X, idx, K);

fprintf('Centroids computed after initial finding of closest centroids: \n')
fprintf(' %f %f \n' , centroids');
fprintf('\n(the centroids should be\n');
fprintf('   [ 2.428301 3.157924 ]\n');
fprintf('   [ 5.813503 2.633656 ]\n');
fprintf('   [ 7.119387 3.616684 ]\n\n');

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =================== Part 3: K-Means Clustering ======================
%  After you have completed the two functions computeCentroids and
%  findClosestCentroids, you have all the necessary pieces to run the
%  kMeans algorithm. In this part, you will run the K-Means algorithm on
%  the example dataset we have provided.
%
fprintf('\nRunning K-Means clustering on example dataset.\n\n');

% Load an example dataset
load('ex7data2.mat');

% Settings for running K-Means
K = 3;
max_iters = 10;

% For consistency, here we set centroids to specific values
% but in practice you want to generate them automatically, such as by
% setting them to be random examples (as can be seen in
% kMeansInitCentroids).
initial_centroids = [3 3; 6 2; 8 5];

% Run K-Means algorithm. The 'true' at the end tells our function to plot
% the progress of K-Means
[centroids, idx] = runkMeans(X, initial_centroids, max_iters, true);
fprintf('\nK-Means Done.\n\n');

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ============= Part 4: K-Means Clustering on Pixels ===============
%  In this exercise, you will use K-Means to compress an image. To do this,
%  you will first run K-Means on the colors of the pixels in the image and
%  then you will map each pixel onto its closest centroid.
%
%  You should now complete the code in kMeansInitCentroids.m
%

fprintf('\nRunning K-Means clustering on pixels from an image.\n\n');

%  Load an image of a bird
A = double(imread('bird_small.png'));

% If imread does not work for you, you can try instead
%   load ('bird_small.mat');

A = A / 255; % Divide by 255 so that all values are in the range 0 - 1

% Size of the image
img_size = size(A);

% Reshape the image into an Nx3 matrix where N = number of pixels.
% Each row will contain the Red, Green and Blue pixel values
% This gives us our dataset matrix X that we will use K-Means on.
X = reshape(A, img_size(1) * img_size(2), 3);

% Run your K-Means algorithm on this data
% You should try different values of K and max_iters here
K = 16;
max_iters = 10;

% When using K-Means, it is important to initialize the centroids
% randomly.
% You should complete the code in kMeansInitCentroids.m before proceeding
initial_centroids = kMeansInitCentroids(X, K);

% Run K-Means
[centroids, idx] = runkMeans(X, initial_centroids, max_iters);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================= Part 5: Image Compression ======================
%  In this part of the exercise, you will use the clusters of K-Means to
%  compress an image. To do this, we first find the closest centroid for
%  each example, then map each pixel to that centroid's color.

fprintf('\nApplying K-Means to compress an image.\n\n');

% Find closest cluster members
idx = findClosestCentroids(X, centroids);

% Essentially, now we have represented the image X in terms of the
% indices in idx.

% We can now recover the image from the indices (idx) by mapping each pixel
% (specified by its index in idx) to the centroid value
X_recovered = centroids(idx,:);

% Reshape the recovered image into proper dimensions
X_recovered = reshape(X_recovered, img_size(1), img_size(2), 3);

% Display the original image
subplot(1, 2, 1);
imagesc(A);
title('Original');

% Display compressed image side by side
subplot(1, 2, 2);
imagesc(X_recovered)
title(sprintf('Compressed, with %d colors.', K));

fprintf('Program paused. Press enter to continue.\n');
pause;
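
Part 4 of ex7.m calls kMeansInitCentroids.m, one of the functions you must complete, but that file is not reproduced in this post. A minimal sketch of the standard random-examples initialization (randomly reorder the training examples, then take the first K as the starting centroids) would look like this:

function centroids = kMeansInitCentroids(X, K)
%KMEANSINITCENTROIDS Initialize K centroids for K-Means on the dataset X
%   Sketch of the usual approach: pick K distinct random training examples.

% Randomly reorder the indices of the examples...
randidx = randperm(size(X, 1));
% ...then take the first K examples as the initial centroids
centroids = X(randidx(1:K), :);

end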

2. findClosestCentroids.m

function idx = findClosestCentroids(X, centroids)
%FINDCLOSESTCENTROIDS computes the centroid memberships for every example
%   idx = FINDCLOSESTCENTROIDS (X, centroids) returns the closest centroids
%   in idx for a dataset X where each row is a single example. idx = m x 1
%   vector of centroid assignments (i.e. each entry in range [1..K])
%

% Set K
K = size(centroids, 1); % number of centroids (rows of centroids)

% You need to return the following variables correctly.
idx = zeros(size(X,1), 1); % m x 1

% ====================== YOUR CODE HERE ======================
% Instructions: Go over every example, find its closest centroid, and store
%               the index inside idx at the appropriate location.
%               Concretely, idx(i) should contain the index of the centroid
%               closest to example i. Hence, it should be a value in the
%               range 1..K
%
% Note: You can use a for-loop over the examples to compute this.
%

m = size(X, 1); % number of examples
for i = 1:m
    % Squared Euclidean distance from example i to each of the K centroids
    dist = zeros(1, K);
    for j = 1:K
        dist(j) = sum((X(i, :) - centroids(j, :)) .^ 2);
    end
    % Index of the nearest centroid
    [min_dist, min_idx] = min(dist);
    idx(i) = min_idx;
end

% =============================================================

end
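
The two nested loops follow the note in the instructions directly. If you prefer to vectorize the inner loop, the sketch below (with the hypothetical name findClosestCentroidsVec, not part of the assignment) computes all K distances for one example in a single expression; bsxfun is used so the broadcast also works on older Octave/MATLAB versions:

function idx = findClosestCentroidsVec(X, centroids)
%FINDCLOSESTCENTROIDSVEC Same result as findClosestCentroids, inner loop removed.
idx = zeros(size(X, 1), 1);
for i = 1:size(X, 1)
    % Subtract example i from every centroid and take squared row norms
    diff = bsxfun(@minus, centroids, X(i, :));   % K x n
    [~, idx(i)] = min(sum(diff .^ 2, 2));        % index of nearest centroid
end
end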

3. computeCentroids.m

function centroids = computeCentroids(X, idx, K)
%COMPUTECENTROIDS returns the new centroids by computing the means of the
%data points assigned to each centroid.
%   centroids = COMPUTECENTROIDS(X, idx, K) returns the new centroids by
%   computing the means of the data points assigned to each centroid. It is
%   given a dataset X where each row is a single data point, a vector
%   idx of centroid assignments (i.e. each entry in range [1..K]) for each
%   example, and K, the number of centroids. You should return a matrix
%   centroids, where each row of centroids is the mean of the data points
%   assigned to it.
%

% Useful variables
[m n] = size(X); % m examples, n features

% You need to return the following variables correctly.
centroids = zeros(K, n); % K x n

% ====================== YOUR CODE HERE ======================
% Instructions: Go over every centroid and compute mean of all points that
%               belong to it. Concretely, the row vector centroids(i, :)
%               should contain the mean of the data points assigned to
%               centroid i.
%
% Note: You can use a for-loop over the centroids to compute this.
%

for i = 1:K
    % Examples currently assigned to centroid i
    idx_set = find(idx == i);
    ck = numel(idx_set);
    % Skip empty clusters so we do not divide by zero
    if ck ~= 0
        % Sum along dimension 1 so a single-member cluster stays a 1 x n row
        cen_sum = sum(X(idx_set, :), 1);
        centroids(i, :) = cen_sum / ck;
    end
end

% =============================================================

end
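
The same loop can be written with logical indexing and mean, which avoids the explicit sum/divide; this is a sketch under the hypothetical name computeCentroidsVec, not the graded file:

function centroids = computeCentroidsVec(X, idx, K)
%COMPUTECENTROIDSVEC Same result as computeCentroids via logical masks.
[m, n] = size(X);
centroids = zeros(K, n);
for i = 1:K
    members = (idx == i);              % logical mask of points in cluster i
    if any(members)
        % mean along dimension 1 keeps the result a 1 x n row vector
        centroids(i, :) = mean(X(members, :), 1);
    end
end
end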


4. pca.m

function [U, S] = pca(X)
%PCA Run principal component analysis on the dataset X
%   [U, S] = pca(X) computes eigenvectors of the covariance matrix of X
%   Returns the eigenvectors U, the eigenvalues (on diagonal) in S
%

% Useful values
[m, n] = size(X); % m examples, n features

% You need to return the following variables correctly.
U = zeros(n); % n x n
S = zeros(n); % n x n

% ====================== YOUR CODE HERE ======================
% Instructions: You should first compute the covariance matrix. Then, you
%               should use the "svd" function to compute the eigenvectors
%               and eigenvalues of the covariance matrix.
%
% Note: When computing the covariance matrix, remember to divide by m (the
%       number of examples).
%

% Covariance matrix of the (mean-normalized) data
Sigma = X' * X / m;
[U, S] = svd(Sigma);

% =========================================================================

end
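
Note that the covariance formula Sigma = X' * X / m assumes each column of X already has zero mean, so the data must be normalized before calling pca. A typical call sequence, assuming the featureNormalize.m helper shipped with the exercise handout, might look like this:

% Normalize first: the Sigma = X' * X / m formula assumes zero-mean columns
[X_norm, mu, sigma] = featureNormalize(X);
[U, S] = pca(X_norm);

% Fraction of variance retained by the top K components,
% read off the diagonal of S
K = 1;
s = diag(S);
fprintf('Variance retained: %f\n', sum(s(1:K)) / sum(s));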


5. projectData.m

function Z = projectData(X, U, K)
%PROJECTDATA Computes the reduced data representation when projecting only
%on to the top k eigenvectors
%   Z = projectData(X, U, K) computes the projection of
%   the normalized inputs X into the reduced dimensional space spanned by
%   the first K columns of U. It returns the projected examples in Z.
%

% You need to return the following variables correctly.
Z = zeros(size(X, 1), K); % m x K

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the projection of the data using only the top K
%               eigenvectors in U (first K columns).
%               For the i-th example X(i,:), the projection on to the k-th
%               eigenvector is given as follows:
%                    x = X(i, :)';
%                    projection_k = x' * U(:, k);
%

% Keep only the first K eigenvectors and project all examples at once
Ureduce = U(:, 1:K); % n x K
Z = X * Ureduce;     % m x K

% =============================================================

end
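
With X_norm and U from the pca sketch above, the projection is a single matrix product. As a usage check (if I recall the handout correctly, ex7_pca.m expects the first projected value to be about 1.481274 for the 2-D example data):

% Project the normalized data onto the first principal component
K = 1;
Z = projectData(X_norm, U, K);   % Z is m x 1
fprintf('Projection of the first example: %f\n', Z(1));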

6. recoverData.m

function X_rec = recoverData(Z, U, K)
%RECOVERDATA Recovers an approximation of the original data when using the
%projected data
%   X_rec = RECOVERDATA(Z, U, K) recovers an approximation of the
%   original data that has been reduced to K dimensions. It returns the
%   approximate reconstruction in X_rec.
%

% You need to return the following variables correctly.
X_rec = zeros(size(Z, 1), size(U, 1)); % m x n, i.e. size(X)

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the approximation of the data by projecting back
%               onto the original space using the top K eigenvectors in U.
%
%               For the i-th example Z(i,:), the (approximate)
%               recovered data for dimension j is given as follows:
%                    v = Z(i, :)';
%                    recovered_j = v' * U(j, 1:K)';
%
%               Notice that U(j, 1:K) is a row vector.
%

% Project back onto the original n-dimensional space
Ureduce = U(:, 1:K);  % n x K
X_rec = Z * Ureduce'; % m x n

% =============================================================

end
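
A round trip through projectData and recoverData shows how much of the data K components preserve; X_norm, U, and K are carried over from the sketches above, and the mean squared residual is exactly the variance discarded with the last n-K components:

Z = projectData(X_norm, U, K);     % compress to K dimensions
X_rec = recoverData(Z, U, K);      % map back to the original space
residual = mean(sum((X_norm - X_rec) .^ 2, 2));
fprintf('Mean squared reconstruction error: %f\n', residual);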

7. Submission results

