[Machine Learning][Octave]Logistic Regression Practice


After finishing Andrew Ng's lessons, I used his exercises to practice logistic regression.
First, I wrote a sigmoid function.

function g = sigmoid(z)
  % Compute the sigmoid of each element of z (scalar, vector, or matrix).
  g = 1 ./ (1 + exp(-z));
end
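A quick sanity check on the function (standard values that follow directly from the definition):

sigmoid(0)              % ans = 0.5000
sigmoid([-10 0 10])     % ans ≈ 0.0000  0.5000  1.0000
sigmoid([1 2; 3 4])     % works elementwise on matrices too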

And then I wrote the costFunction.
Note that X is an m*n matrix and y is an m*1 vector. (At first there were some bugs in my program; after printing the sizes of many variables, I found the cause: the expression I had derived did not match the given sizes of X and y. With theta being n*1, theta'*X' is a 1*m row vector, which lines up with y'.)
As in the previous exercise, the code is vectorized instead of using for loops.
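For reference, the formulas being implemented here (the standard ones from the course) are, per training example i:

J(theta) = (1/m) * sum_i( -y_i * log(h_i) - (1 - y_i) * log(1 - h_i) )
dJ/dtheta_j = (1/m) * sum_i( (h_i - y_i) * x_ij )

where h_i = sigmoid(theta' * x_i) is the predicted probability for example i.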

function [J, grad] = costFunction(theta, X, y)
  m = length(y); % number of training examples
  % Cross-entropy cost, vectorized: theta'*X' is a 1 x m row of linear scores.
  J = (1/m) * (sum((-y') .* log(sigmoid(theta'*X'))) - sum((1 - y') .* log(1 - sigmoid(theta'*X'))));
  % Gradient: (1/m) * X' * (h - y), written with row vectors and transposed back to n x 1.
  grad = (1/m) .* ((sigmoid(theta'*X') - y') * X)';
end
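An equivalent, slightly tidier vectorized form (a sketch of the same math, computing the m x 1 prediction vector h once) would be:

h = sigmoid(X * theta);                              % m x 1 predictions
J = (1/m) * (-y' * log(h) - (1 - y)' * log(1 - h));  % scalar cost
grad = (1/m) * (X' * (h - y));                       % n x 1 gradient

A quick sanity check: with theta all zeros, every prediction is 0.5, so J should come out to log(2) ≈ 0.693 regardless of the data.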

Finally, ex2.m works. The code that calls fminunc is also worth writing down.

initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 400);
%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
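Setting 'GradObj' to 'on' tells fminunc that costFunction returns the analytic gradient as its second output, so the optimizer does not have to approximate it numerically. To confirm convergence, fminunc's standard interface also returns an exit flag as a third output (a sketch):

[theta, cost, exit_flag] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
% exit_flag > 0 means fminunc reports it converged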

And here is the explanation from his instructions:

To specify the actual function we are minimizing, we use a “short-hand” for specifying functions with the @(t) ( costFunction(t, X, y) ) . This creates a function, with argument t, which calls your costFunction. This allows us to wrap the costFunction for use with fminunc.

I'm still confused by the statement '@(t)(costFunction(t,X,y))': I don't know what the '@' and 't' mean. So here is a problem that still needs to be solved.
Mark here!
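(For when I come back to this: the '@' syntax creates an anonymous function, which is standard Octave rather than anything specific to this exercise. A minimal illustration:

% '@(t) expr' builds a function handle whose single argument is named t.
f = @(t) t.^2;
f(3)                  % ans = 9
% Variables other than the argument, such as k below, are captured from
% the workspace at the moment the handle is created:
k = 10;
g = @(t) k * t;
g(2)                  % ans = 20

So @(t)(costFunction(t, X, y)) is a one-argument function of t that calls costFunction with the current X and y baked in, which is exactly the form fminunc expects.)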
