[Machine Learning][Octave]Logistic Regression Practice
After completing Andrew Ng's lessons, I used his exercises to practice logistic regression.
First, I wrote a sigmoid function.
function g = sigmoid(z)
  g = 1 ./ (1 + exp(-z));
end
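As a quick sanity check (my own addition, not part of the exercise), sigmoid operates elementwise, maps 0 to 0.5, and saturates toward 0 and 1 for large negative and positive inputs:

```octave
% Sanity check: sigmoid should give 0.5 at z = 0
% and values near 0 / 1 for large negative / positive z.
z = [-10 0 10];
g = sigmoid(z);
disp(g)   % approximately [0.0000 0.5000 1.0000]
```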
Then I wrote the cost function.
Note that X is an m×n matrix and y is an m×1 vector. (At first there were some bugs in my program; after displaying the sizes of many variables, I found the cause: the expression I had derived did not match the given sizes of X and y.)
As before, the code is vectorized instead of using a for loop.
function [J, grad] = costFunction(theta, X, y)
  m = length(y); % number of training examples
  J = (1/m) * (sum((-y') .* log(sigmoid(theta' * X'))) ...
      - sum((1 - y') .* log(1 - sigmoid(theta' * X'))));
  grad = (1/m) .* ((sigmoid(theta' * X') - y') * X)';
end
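One easy way to check this implementation (my own sketch; the dataset below is a made-up placeholder, not the exercise data): with theta initialized to zeros, every hypothesis is sigmoid(0) = 0.5, so the cost must equal -log(0.5) ≈ 0.6931 no matter what the data is.

```octave
% Hypothetical tiny dataset: 3 examples, intercept column plus one feature
X = [1 0.5; 1 1.5; 1 2.5];
y = [0; 0; 1];
theta = zeros(2, 1);

[J, grad] = costFunction(theta, X, y);
disp(J)   % 0.6931, i.e. -log(0.5), since sigmoid(0) = 0.5
```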
Finally, ex2.m works. It is also worth writing down the code that calls fminunc.
initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
And here is the explanation from his instructions:
To specify the actual function we are minimizing, we use a "short-hand" for specifying functions with the @(t)(costFunction(t, X, y)). This creates a function, with argument t, which calls your costFunction. This allows us to wrap the costFunction for use with fminunc.
I'm still confused by the statement '@(t)(costFunction(t,X,y))'; I don't know what '@' and 't' mean. So here is a problem that needs to be solved.
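For reference (this is standard Octave behavior, not from the exercise handout): '@' creates a function handle, and '@(t) expr' defines an anonymous function whose single argument is t. Variables such as X and y are captured from the surrounding workspace when the handle is created, which is how costFunction receives the fixed X and y while fminunc only varies t. A minimal sketch:

```octave
% Anonymous function: one argument t, body t .^ 2
square = @(t) t .^ 2;
disp(square(3))   % 9

% Workspace variables are captured at creation time,
% just as X and y are captured in @(t)(costFunction(t, X, y)):
a = 10;
addA = @(t) t + a;
disp(addA(5))   % 15
```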
Mark here!