[Coursera][Stanford] Machine Learning Week 1 & 2


Many things slip away if they are never revisited, so these notes double as a record of my learning.

Start date: August 2014

1. Introduction
        Definition of Machine Learning (Arthur Samuel): "Field of study that gives computers the ability to learn without being explicitly programmed."
        Algorithms covered in the course:
                --- Supervised learning
                --- Unsupervised learning
                Others: reinforcement learning, recommender systems
                Also covered: practical advice for applying learning algorithms.

2. Linear Regression with One Variable
        Introduces linear regression with a single variable and the gradient descent algorithm for computing theta; the key formulas are reproduced below.
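        For reference, the model, cost function, and batch gradient descent update from the lectures are:

            h_\theta(x) = \theta_0 + \theta_1 x

            J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} ( h_\theta(x^{(i)}) - y^{(i)} )^2

            \theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} ( h_\theta(x^{(i)}) - y^{(i)} ) x_j^{(i)}    (update all \theta_j simultaneously)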

3. Multiple Features (Variables)
        Extends linear regression to multiple variables and introduces feature scaling and the learning rate for gradient descent. Feature scaling (mean normalization) replaces each feature with (x_j - mu_j) / s_j so that all features span a similar range, which helps gradient descent converge; a sketch of the learning-rate check follows.
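        To pick alpha, the lectures suggest recording J(theta) after every iteration and plotting it; with a suitable alpha, J should decrease on every iteration. A minimal Octave sketch, assuming X, y, m, alpha, and num_iters are already defined as in the exercise:

            % Run gradient descent and track the cost after every update.
            theta = zeros(size(X, 2), 1);
            J_history = zeros(num_iters, 1);
            for iter = 1:num_iters
                theta = theta - alpha * (1 / m) * (X' * (X * theta - y));
                J_history(iter) = (1 / (2 * m)) * sum((X * theta - y).^2);
            end
            plot(1:num_iters, J_history, '-b');  % should slope steadily downward
            xlabel('Iteration'); ylabel('Cost J');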

4. Normal Equation
        A method to solve for theta analytically in a single step, with no iterations and no learning rate.
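        In matrix form the closed-form solution is:

            \theta = (X^T X)^{-1} X^T y

        which the exercise implements with pinv (see section 3.3 below).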

5. Comparing Gradient Descent and the Normal Equation
        Gradient descent requires choosing alpha and running many iterations, but it scales well when the number of features n is large. The normal equation needs no alpha and no iterations, but it computes the inverse of X'X, which costs on the order of O(n^3) and becomes slow for large n.

6. Octave Basics
        Introduces basic Octave functionality; a quick sampler follows.
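        A few of the operations covered (a sampler, not the full tutorial):

            A = [1 2; 3 4];           % 2x2 matrix literal
            size(A)                   % dimensions -> 2 2
            A'                        % transpose
            A .* A                    % element-wise product
            A * A                     % matrix product
            v = 1:0.5:3;              % row vector 1, 1.5, ..., 3
            ones(2, 3)                % 2x3 matrix of ones
            hist(randn(1000, 1))      % histogram of 1000 Gaussian samples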

7. Vectorized Implementation
        Shows how explicit for-loops over examples and features can be replaced with matrix and vector operations, which is both shorter and faster; a comparison sketch follows.
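        As an illustration, both versions below compute the vector of predictions h for all m examples, assuming X is an m-by-n design matrix and theta an n-by-1 vector:

            % Unvectorized: accumulate each prediction element by element.
            h = zeros(m, 1);
            for i = 1:m
                for j = 1:n
                    h(i) = h(i) + X(i, j) * theta(j);
                end
            end

            % Vectorized: a single matrix-vector product does the same work.
            h = X * theta;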

Programming Exercise 1

2 Linear regression with one variable
2.2 Gradient Descent
2.2.3 Computing the cost J(θ)
% Vectorized cost: J = (1/(2m)) * sum of squared prediction errors.
J = (1 / (2 * m)) * sum((X * theta - y).^2);
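In the exercise this line belongs in computeCost.m; a usage sketch from the Octave prompt, loading the data the way ex1.m does:

    data = load('ex1data1.txt');              % exercise dataset
    X = [ones(size(data, 1), 1), data(:, 1)]; % add intercept column of ones
    y = data(:, 2);
    theta = zeros(2, 1);
    J = computeCost(X, y, theta)              % cost at theta = [0; 0]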
2.2.4 Gradient descent
    % 1st: vectorized simultaneous update (used in the submission).
    theta = theta - alpha * (1 / m) * (X' * (X * theta - y));

    % 2nd: equivalent parameter-by-parameter version, kept commented out.
    % temp1 = theta(1) - alpha * (1 / m) * sum(X * theta - y);
    % temp2 = theta(2) - alpha * (1 / m) * sum(X(:,2) .* (X * theta - y));
    % theta = [temp1; temp2];
3 Linear regression with multiple variables
3.1 Feature Normalization
mu = mean(X);
sigma = std(X);
[m, n] = size(X);
X_norm = zeros(m, n);
% Mean-normalize each column: subtract its mean, divide by its std.
% (A loop-free version is sketched below.)
for i = 1:n
    X_norm(:, i) = (X(:, i) - mu(i)) / sigma(i);
end
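A loop-free alternative (a sketch; bsxfun keeps it working on older Octave/MATLAB releases that lack implicit broadcasting):

    % Subtract each column's mean, then divide by its standard deviation.
    X_norm = bsxfun(@rdivide, bsxfun(@minus, X, mu), sigma);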
3.2 Gradient Descent
% Same vectorized cost as the single-variable case; it already handles any number of features.
J = (1 / (2 * m)) * sum((X * theta - y).^2);

% The vectorized update is likewise unchanged for multiple variables.
theta = theta - alpha * (1 / m) * (X' * (X * theta - y));
3.3 Normal Equations
% Closed-form solution; pinv works even if X' * X is singular.
theta = pinv(X' * X) * X' * y;
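As noted in the lectures, pinv is preferred over inv here: X' * X can be non-invertible when features are redundant (linearly dependent) or when there are more features than examples, and pinv still returns a usable theta in those cases.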

