[#1] Least squares and nearest neighbors
- Least squares and nearest neighbors
- 1.1 Least squares in linear regression
- 1.2 Nearest neighbors
- Rationality and difference of least squares and nearest neighbors
- Rationality of least squares and nearest neighbors
- Extension of these simple procedures
1. Least squares and nearest neighbors
1.1 Least squares in linear regression
Assume we have a data set
$$Y = \left(y^{(1)}, y^{(2)}, \cdots, y^{(N)}\right), \qquad X = \left(X^{(1)T}, X^{(2)T}, \cdots, X^{(N)T}\right)^T$$
First, we need to choose a loss function. Here we choose least squares, which minimizes the residual sum of squares, a quadratic function of the parameter $\beta$:
$$\mathrm{RSS}(\beta) = \sum_{i=1}^{N}\left(y^{(i)} - X^{(i)T}\beta\right)^2,$$
which leads to the solution (assuming $X^T X$ is nonsingular)
$$\hat{\beta} = \left(X^T X\right)^{-1} X^T Y.$$
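A minimal sketch of this closed-form fit, assuming NumPy and a small synthetic dataset (neither appears in the original notes):

```python
import numpy as np

# Hypothetical synthetic data: N = 100 observations, p = 2 inputs plus an intercept column.
rng = np.random.default_rng(0)
N, p = 100, 2
X = np.column_stack([np.ones(N), rng.normal(size=(N, p))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=N)

# Minimize RSS(beta) = sum_i (y_i - x_i^T beta)^2; the normal equations give
# beta_hat = (X^T X)^{-1} X^T y, solved here without forming the inverse explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Prediction at a new point x0 (the first entry is the intercept).
x0 = np.array([1.0, 0.3, -1.2])
print(beta_hat, x0 @ beta_hat)
```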
1.2 Nearest neighbors
In regression, the nearest-neighbors method averages the outputs of the $k$ training points closest to the query point $x$:
$$\hat{Y}(x) = \frac{1}{k}\sum_{X^{(i)} \in N_k(x)} y^{(i)},$$
where $N_k(x)$ denotes the $k$ training points nearest to $x$.
In classification, the nearest-neighbors method takes a majority vote over the labels of the $k$ training points closest to $x$.
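A minimal sketch of both uses of k-nearest neighbors, assuming NumPy, Euclidean distance, and illustrative synthetic data (the notes fix neither a metric nor a dataset):

```python
import numpy as np

def knn_regress(X, y, x0, k=5):
    """Average the outputs of the k training points closest to x0 (Euclidean distance)."""
    dist = np.linalg.norm(X - x0, axis=1)
    nearest = np.argsort(dist)[:k]
    return y[nearest].mean()

def knn_classify(X, labels, x0, k=5):
    """Majority vote over the labels of the k training points closest to x0."""
    dist = np.linalg.norm(X - x0, axis=1)
    nearest = np.argsort(dist)[:k]
    values, counts = np.unique(labels[nearest], return_counts=True)
    return values[np.argmax(counts)]

# Tiny illustrative usage on synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
y = X[:, 0] + rng.normal(scale=0.1, size=50)
labels = (y > 0).astype(int)
x0 = np.array([0.2, -0.1])
print(knn_regress(X, y, x0), knn_classify(X, labels, x0))
```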
2. Rationality and difference of least squares and nearest neighbors
- Least squares makes heavy assumptions about the structure of the model (linearity), while nearest neighbors makes almost none.
- Least squares yields stable but possibly inaccurate predictions, while the predictions of nearest neighbors are often accurate but can be unstable.
Note that:
- stable means low variance
- accurate means low bias
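A small simulation sketch of this trade-off, assuming an illustrative nonlinear truth f(x) = sin(πx), Gaussian noise, and 1-nearest-neighbor versus a linear least-squares fit; all settings here are assumptions, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(4)
f = lambda x: np.sin(np.pi * x)        # illustrative nonlinear truth, E(Y|X=x) = f(x)
x0, n_reps, N = 0.5, 500, 50

ls_preds, nn_preds = [], []
for _ in range(n_reps):
    x = rng.uniform(-1, 1, size=N)
    y = f(x) + rng.normal(scale=0.2, size=N)
    # Least squares prediction at x0: linear assumption, all training data used.
    X = np.column_stack([np.ones(N), x])
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    ls_preds.append(beta[0] + beta[1] * x0)
    # 1-nearest-neighbor prediction at x0: no structural assumption, only local data.
    nn_preds.append(y[np.argmin(np.abs(x - x0))])

for name, preds in [("least squares", np.array(ls_preds)), ("1-NN", np.array(nn_preds))]:
    print(name, "bias:", preds.mean() - f(x0), "variance:", preds.var())
```

Run repeatedly over fresh training sets, the linear fit tends to show larger bias and smaller variance at the query point than 1-NN, matching the stability/accuracy contrast above.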
Rationality of least squares and nearest neighbors
Suppose that we have random variables $X \in \mathbb{R}^p$ and $Y \in \mathbb{R}$ with joint distribution $\Pr(X, Y)$, and we seek a function $f(X)$ minimizing the expected prediction error under squared loss,
$$\mathrm{EPE}(f) = E\left[(Y - f(X))^2\right].$$
It suffices to minimize the EPE pointwise, conditioning on $X = x$:
$$f(x) = \arg\min_{c} E\left[(Y - c)^2 \mid X = x\right].$$
The solution is the conditional expectation, also known as the regression function:
$$f(x) = E\left[Y \mid X = x\right].$$
Both least squares and nearest neighbors aim to approximate this conditional expectation by averaging.
Least squares assumes a linear structure and approximates the expectation under the squared loss by averaging over all of the training data.
Nearest neighbors approximates the conditional expectation directly by averaging the outputs of training points near the target $x$.
So two approximations are happening in both least squares and nearest neighbors.
Least squares
- model structure assumption (linearity);
- averaging over all training data in the EPE.

Nearest neighbors
1. conditioning on a small region around the target point $x$ instead of conditioning on it exactly;
2. averaging the outputs which are near to $x$.
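A small numerical sketch of these two approximations at a single query point, assuming an illustrative one-dimensional linear truth (names and data are hypothetical, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200
x = rng.uniform(-2, 2, size=N)
y = 1.5 * x + rng.normal(scale=0.3, size=N)    # here E(Y|X=x) = 1.5 x

x0 = 0.7                                       # query point

# Least squares: assume a linear structure and average over *all* training data.
X = np.column_stack([np.ones(N), x])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
ls_pred = beta_hat[0] + beta_hat[1] * x0

# k-NN: condition on a small region around x0 and average only the nearby outputs.
k = 15
nearest = np.argsort(np.abs(x - x0))[:k]
knn_pred = y[nearest].mean()

print(ls_pred, knn_pred, 1.5 * x0)             # both approximate E(Y|X=x0) = 1.05
```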
Extension of these simple procedures
Many more complex algorithms are derived from these two simple procedures:
- Kernel methods use weights that decrease smoothly to zero with distance from the target point, rather than the effective 0/1 weights used by k-nearest neighbors (see the sketch after this list).
- In high-dimensional spaces the distance kernels are modified to emphasize some variables more than others.
- Local regression fits linear models by locally weighted least squares, rather than fitting constants locally.
- Linear models fit to a basis expansion of the original inputs allow arbitrarily complex models.
- Projection pursuit and neural network models consist of sums of nonlinearly transformed linear models.
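As a minimal sketch of the first point above, a Nadaraya-Watson-style weighted average with a Gaussian kernel, assuming a one-dimensional input and a hand-picked bandwidth (both illustrative):

```python
import numpy as np

def kernel_regress(x_train, y_train, x0, bandwidth=0.3):
    """Weighted average with smooth Gaussian weights instead of 0/1 k-NN weights."""
    w = np.exp(-0.5 * ((x_train - x0) / bandwidth) ** 2)   # weights decay with distance from x0
    return np.sum(w * y_train) / np.sum(w)

# Illustrative usage on synthetic one-dimensional data.
rng = np.random.default_rng(3)
x_train = rng.uniform(-2, 2, size=200)
y_train = np.sin(np.pi * x_train) + rng.normal(scale=0.2, size=200)
print(kernel_regress(x_train, y_train, x0=0.5))
```

Unlike k-NN, every training point contributes here, but points far from the query receive vanishingly small weight.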