Solving Multivariate Linear Regression with Gradient Descent


Today I experimented with gradient descent for multivariate linear regression. Given fitting data, minimizing the squared error lets the coefficients be written in closed form as a function of the inputs (the normal equations); here I instead pick an initial coefficient vector, fix a step size, and iterate at most 500 times. The result is fairly stable.
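
In symbols, with x_j denoting the j-th row of the design matrix (including a leading 1 for the intercept), the script below minimizes the un-normalized squared-error cost by repeated gradient steps; written out (my notation, matching the code rather than the usual 1/(2M)-scaled form):

J(\theta) = \tfrac{1}{2} \sum_{j=1}^{M} \left( x_j \theta - y_j \right)^2,
\qquad
\frac{\partial J}{\partial \theta_i} = \sum_{j=1}^{M} \left( x_j \theta - y_j \right) x_{ji},
\qquad
\theta \leftarrow \theta - \alpha \, \nabla J(\theta).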

Original data X (10 samples, 5 features):

 9.200   2.732   1.471   0.332   1.138
 9.100   3.732   1.820   0.112   0.828
 8.600   4.882   1.872   0.383   2.131
10.233   3.968   1.587   0.181   1.349
 5.600   3.732   1.841   0.297   1.815
 5.367   4.236   1.873   0.063   1.352
 6.133   3.146   1.987   0.280   1.647
 8.200   4.646   1.615   0.379   4.565
 8.800   4.378   1.543   0.744   2.073
 7.600   3.864   1.599   0.342   2.432

y:

1.155
1.146
1.841
1.365
0.863
0.903
0.114
0.898
1.930
1.104
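
The script below loads the data from regressdemo.mat; presumably that file was created from the values above along these lines (a sketch, with the literal values re-typed from the tables above):

% Hypothetical setup: build the data file loaded by the demo script.
X = [ 9.200   2.732  1.471  0.332  1.138;
      9.100   3.732  1.820  0.112  0.828;
      8.600   4.882  1.872  0.383  2.131;
     10.233   3.968  1.587  0.181  1.349;
      5.600   3.732  1.841  0.297  1.815;
      5.367   4.236  1.873  0.063  1.352;
      6.133   3.146  1.987  0.280  1.647;
      8.200   4.646  1.615  0.379  4.565;
      8.800   4.378  1.543  0.744  2.073;
      7.600   3.864  1.599  0.342  2.432 ];
y = [1.155; 1.146; 1.841; 1.365; 0.863; 0.903; 0.114; 0.898; 1.930; 1.104];
save('regressdemo.mat', 'X', 'y');   % so that "load regressdemo.mat" restores X and y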

%%% Gradient descent demo
clear; close all;

%% Read data, initialize
load regressdemo.mat;              % provides X (10x5) and y (10x1)
alpha = 0.0021;                    % step size
MaxIter = 500;                     % maximum number of iterations
etc = 1e-3;                        % convergence tolerance on the parameter update
[M, N] = size(X);
X = [ones(M,1), X];                % prepend intercept column
thetaO = ones(N+1,1);              % initial coefficients
Jth = zeros(N+1,1);                % gradient of the squared-error cost
for i = 1:N+1
    csum = 0;
    for j = 1:M
        csum = csum + (X(j,:)*thetaO - y(j)) * X(j,i);
    end
    Jth(i) = csum;
end

%% Iteration
iter = 0;
thetaN = thetaO - alpha*Jth;
while iter <= MaxIter && norm(thetaN - thetaO, 2) > etc
    thetaO = thetaN;
    for i = 1:N+1                  % recompute the gradient at the current theta
        csum = 0;
        for j = 1:M
            csum = csum + (X(j,:)*thetaO - y(j)) * X(j,i);
        end
        Jth(i) = csum;
    end
    thetaN = thetaO - alpha*Jth;   % gradient step
    iter = iter + 1;
end

if iter > MaxIter
    disp('Reach Max Iteration!');
end
if norm(thetaN - thetaO, 2) < etc
    disp('Precision Got!');
end
err = abs(X*thetaN - y);           % absolute errors of the fit
disp('Accuracy:');
disp(norm(err, 2));                % 2-norm of the error vector
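
The inner double loop simply assembles the gradient X'*(X*thetaO - y); a vectorized sketch of the same iteration (same variable names, step size, and stopping rules as in the script above):

% Vectorized equivalent of the update (X assumed to already include the ones column):
thetaO = ones(N+1, 1);
thetaN = thetaO - alpha * (X' * (X*thetaO - y));   % same gradient as the Jth loops
iter = 0;
while iter <= MaxIter && norm(thetaN - thetaO, 2) > etc
    thetaO = thetaN;
    thetaN = thetaO - alpha * (X' * (X*thetaO - y));
    iter = iter + 1;
end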

The final regression coefficients obtained:

0.406038824192068
0.0644305972109697
0.206808862959160
-0.300906571023362
0.999414853186913
-0.189645999954232

The vector of absolute errors:

0.0821571494721867
0.0254255517077873
0.455874008914591
0.0315024000516371
0.0743105693626545
0.0321560807957608
0.707400200100757
0.0242838259943150
0.165433064539306
0.00974798969549395

For comparison, these are the regression coefficients obtained directly with MATLAB's regress (a sketch of the call is given after the residuals below):

0.388820826048249
0.0610652390388895
0.662061220787943
-1.15507263289582
0.961473551930387
-0.337782747357585

and the corresponding absolute residuals:

0.159927109843097
0.00490429158351668
0.208701871341208
0.161017859528120
0.115408118045256
0.0584888131579902
0.149933625751205
0.0244701503154238
0.127534569005781
0.0325036269454444
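
The regress comparison above was presumably produced with something like the following (a sketch; regress is part of the Statistics Toolbox and expects the intercept column to be supplied explicitly, so X here is the raw 10x5 matrix reloaded from the data file, not the augmented one from the script):

load regressdemo.mat;                  % restores the raw X (10x5) and y
Xd  = [ones(size(X,1),1), X];          % design matrix with intercept column
b   = regress(y, Xd);                  % closed-form least-squares coefficients
res = abs(Xd*b - y);                   % absolute residuals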

Conclusion: the choice of step size has a large effect on the result. If the step is too small, the iteration converges slowly; if it is too large, the iteration does not converge at all.
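
One quick way to see this sensitivity is to sweep the step size and print the residual norm after a fixed number of iterations; a minimal sketch (the alpha grid is hypothetical, and X is assumed to already contain the intercept column as in the script above):

% Hypothetical step-size sweep: small alpha converges slowly, large alpha blows up (Inf/NaN).
for alpha = [1e-4 1e-3 0.0021 5e-3 1e-2]
    theta = ones(size(X,2), 1);
    for k = 1:500
        theta = theta - alpha * (X' * (X*theta - y));
    end
    fprintf('alpha = %.4f   ||X*theta - y|| = %g\n', alpha, norm(X*theta - y, 2));
end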
