Multivariate Linear Regression


Reposted from http://www.cnblogs.com/tornadomeet/archive/2013/03/15/2962116.html, with some additional content.

Code for linear_grad_ascent_test3.m:

For the linear_grad_ascent function, see the Linear Regression post.

function linear_grad_ascent_test3
% Multivariate linear regression test
%% Method 1: gradient descent
x = load('ex3x.dat');
y = load('ex3y.dat');
x = [ones(size(x,1),1) x];       % prepend the intercept column
meanx = mean(x);                 % feature means
sigmax = std(x);                 % feature standard deviations
x(:,2) = (x(:,2)-meanx(2))./sigmax(2);   % normalize each feature column
x(:,3) = (x(:,3)-meanx(3))./sigmax(3);
figure
itera_num = 5000;        % number of iterations to run
sample_num = size(x,1);  % number of training samples
alpha = [0.01, 0.03, 0.1, 0.3, 1, 1.3]; % learning rates to try, each roughly 3x the previous, so just enumerate them
plotstyle = {'b', 'r', 'g', 'k', 'b--', 'r--'};
theta_grad_descent = zeros(size(x(1,:)));
for alpha_i = 1:length(alpha)    % try each learning rate to see which works best
    theta = zeros(size(x,2),1);  % initialize theta to zeros
%     Jtheta = zeros(itera_num, 1);
%     for i = 1:itera_num  % run itera_num iterations with learning rate alpha
%         Jtheta(i) = (1/(2*sample_num)).*(x*theta-y)'*(x*theta-y);  % cost at iteration i
%         grad = (1/sample_num).*x'*(x*theta-y);
%         theta = theta - alpha(alpha_i).*grad;
%     end
    [theta,Jtheta] = linear_grad_ascent(x, y, theta, alpha(alpha_i), itera_num);
    plot(0:49, Jtheta(1:50), char(plotstyle(alpha_i)), 'LineWidth', 2)  % char() is needed to convert the cell entry to a string
    hold on
    if(1 == alpha(alpha_i))  % experimentally alpha = 1 works best, so keep its theta as the result
        theta_grad_descent = theta
    end
end
legend('0.01','0.03','0.1','0.3','1','1.3');
xlabel('Number of iterations')
ylabel('Cost function')
% predict the price of a 1600 sq-ft, 3-bedroom house; the query point must be
% normalized with the same mean/std as the training data (note the added
% parenthesis around (3-meanx(3)), which the original code was missing)
p = theta_grad_descent'*[1 (1600-meanx(2))/sigmax(2) (3-meanx(3))/sigmax(3)]'
end
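The linear_grad_ascent function itself is defined in the referenced Linear Regression post and is not reproduced here. Based on the commented-out loop above, a minimal sketch of what it presumably looks like is given below; the name and argument order follow the call in the script, but the body is an assumption and may differ from the original.

function [theta, Jtheta] = linear_grad_ascent(x, y, theta, alpha, itera_num)
% Batch gradient descent for linear regression (sketch inferred from the
% commented-out loop in linear_grad_ascent_test3).
sample_num = size(x, 1);          % number of training samples
Jtheta = zeros(itera_num, 1);     % cost recorded at every iteration
for i = 1:itera_num
    Jtheta(i) = (1/(2*sample_num)) .* (x*theta - y)' * (x*theta - y);  % squared-error cost
    grad = (1/sample_num) .* x' * (x*theta - y);                       % gradient of the cost
    theta = theta - alpha .* grad;                                     % gradient descent step
end
end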
The learning rate alpha must not be too large: if it is, each update overshoots the minimum, so J(theta) grows without bound (eventually overflowing) and the iterations never converge.
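For intuition, this is the cost and update rule the script implements, together with a standard convergence bound (not from the original post):

\[
J(\theta) = \frac{1}{2m}(X\theta - y)^\top (X\theta - y), \qquad
\theta := \theta - \frac{\alpha}{m} X^\top (X\theta - y)
\]

For this quadratic cost, gradient descent converges only when \(\alpha < 2/\lambda_{\max}\), where \(\lambda_{\max}\) is the largest eigenvalue of \(\frac{1}{m}X^\top X\); a larger step moves farther from the minimum on every iteration. Feature normalization keeps \(\lambda_{\max}\) small, which is why a learning rate as large as 1 still works here.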
Test data: http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=DeepLearning&doc=exercises/ex3/ex3.html
