Machine Learning: A Python Implementation of the Perceptron
I. Theoretical Background
1. Loss Function
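The formula that originally appeared in this subsection seems to have been an image that did not survive extraction. For reference, the standard perceptron loss, which is what the implementation below is meant to minimize, is the negated sum of functional margins over the misclassified points:

```latex
L(\theta, b) = -\sum_{x_i \in M} y_i \left( \theta^{\mathsf{T}} x_i + b \right)
```

where $M$ is the set of misclassified samples and the labels are $y_i \in \{+1, -1\}$. A point is misclassified exactly when $y_i(\theta^{\mathsf{T}} x_i + b) \le 0$.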
2. Parameter Update
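The derivation that originally appeared here also seems to be lost. Taking the gradient of the loss above with respect to $\theta$ and $b$ gives the standard perceptron update: for each misclassified sample $(x_i, y_i)$, with learning rate $\alpha$,

```latex
\theta \leftarrow \theta + \alpha\, y_i x_i, \qquad b \leftarrow b + \alpha\, y_i
```

No update is made for correctly classified samples.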
II. Python Implementation
1. Code
Perceptron.py
# encoding=utf-8
'''implements the perceptron'''
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd


class Perceptron:
    def __init__(self, alpha=0.1, iterator_num=100):
        self.alpha = alpha                  # learning rate
        self.iterator_num = iterator_num    # number of passes over the data

    def train(self, x_train, y_train):
        x_train = np.mat(x_train)
        y_train = np.mat(y_train)
        [m, n] = x_train.shape
        self.theta = np.mat(np.zeros((n, 1)))
        self.b = 0
        self.__stochastic_gradient_decent__(x_train, y_train)

    def __gradient_decent__(self, x_train, y_train):
        # Batch update. Note: this updates on ALL samples every iteration;
        # the perceptron rule only updates on misclassified samples.
        x_train = np.mat(x_train)
        y_train = np.mat(y_train)
        for i in range(self.iterator_num):
            self.theta = self.theta + self.alpha * x_train.T * y_train
            self.b = self.b + self.alpha * np.sum(y_train)

    def __stochastic_gradient_decent__(self, x_train, y_train):
        # Stochastic update. Same caveat: there is no misclassification
        # check before each update here.
        x_train = np.mat(x_train)
        y_train = np.mat(y_train)
        [m, n] = x_train.shape
        for i in range(self.iterator_num):
            for j in range(m):
                self.theta = self.theta + self.alpha * x_train[j].T * y_train[j]
                self.b = self.b + self.alpha * y_train[j]


def main():
    ''' test unit '''
    print("step 1: load data...")
    data = pd.read_csv('/home/LiuYao/Documents/MarchineLearning/data.csv')
    # data = data.iloc[0:30, :]
    x = np.mat(data[['x', 'y']].values)
    y = np.mat(data['label'].values).T
    y[y == 0] = -1  # map labels {0, 1} to {-1, +1}
    print(y[y == 1])
    print("positive samples : ", y[y == 1].shape)
    print("negative samples : ", y[y == -1].shape)

    ## step 2: training...
    print("step 2: training...")
    perceptron = Perceptron(alpha=0.1, iterator_num=100)
    perceptron.train(x, y)

    ## step 3: show the decision boundary
    print("step 3: show the decision boundary...")
    print(perceptron.theta)
    x_min = np.min(x[:, 0])
    x_max = np.max(x[:, 0])
    y_min = (-perceptron.b - perceptron.theta[0] * x_min) / perceptron.theta[1]
    y_max = (-perceptron.b - perceptron.theta[0] * x_max) / perceptron.theta[1]
    plt.plot([x_min, x_max], [y_min[0, 0], y_max[0, 0]])
    plt.scatter(x[:, 0].getA1(), x[:, 1].getA1(), c=y.getA1())
    plt.show()


if __name__ == '__main__':
    main()
2. Results
The final results have some problems, and I don't know why — if any expert knows, please explain. I tried both gradient descent and stochastic gradient descent, with the same outcome:
* with the first 30 data points, the decision boundary is correct;
* with the first 60 data points, the boundary is way off;
* with all of the data, the boundary is visibly skewed.
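For what it's worth, a drifting boundary is consistent with the update being applied to every sample on every pass: the perceptron rule only updates on misclassified samples, and the training loops above omit that check. Below is a minimal sketch of stochastic training with the check added (`perceptron_train` and the toy data are my own, not from the original post):

```python
import numpy as np

def perceptron_train(x, y, alpha=0.1, epochs=100):
    """Stochastic perceptron: update only on misclassified samples."""
    m, n = x.shape
    theta = np.zeros(n)
    b = 0.0
    for _ in range(epochs):
        for j in range(m):
            # A sample is misclassified when y_j * (theta . x_j + b) <= 0
            if y[j] * (x[j] @ theta + b) <= 0:
                theta += alpha * y[j] * x[j]
                b += alpha * y[j]
    return theta, b

# Toy linearly separable data: class is the sign of x0 - x1
x = np.array([[2.0, 0.0], [3.0, 1.0], [0.0, 2.0], [1.0, 3.0]])
y = np.array([1, 1, -1, -1])
theta, b = perceptron_train(x, y)
# After training, every point should sit on the correct side
print(all(yi * (xi @ theta + b) > 0 for xi, yi in zip(x, y)))  # → True
```

On linearly separable data this version is guaranteed to converge; without the misclassification check, the parameters keep growing in the direction of the class imbalance instead of settling on a separating line.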
[Figure: first 30 data points]
[Figure: first 60 data points]
[Figure: all data points]
The data is as follows:
x,y,label
10.6,13.5,0
12.55,12.1,0
12.05,13.95,0
10.85,15.05,0
7.5,12.75,0
9.45,11.25,0
8.95,13.3,0
8.45,15.5,0
12.15,12.2,0
5.15,8.25,0
17.45,6.0,0
18.55,5.8,1
16.1,4.45,1
13.95,6.75,1
15.4,7.85,1
17.7,9.25,1
19.3,9.8,1
20.5,8.1,1
8.15,2.05,1
11.7,4.9,1
21.1,4.6,1
21.1,9.75,1
17.65,11.4,1
6.95,9.9,1
5.8,12.05,1
7.35,10.0,0
8.15,11.05,0
7.4,11.65,0
4.55,11.35,0
4.4,15.2,0
4.2,16.6,0
7.85,17.1,0
13.45,18.95,0
15.35,18.9,0
18.35,17.1,0
16.85,15.75,0
15.75,10.8,0
13.95,9.25,0
10.25,10.7,0
9.85,12.05,0
14.25,17.45,0
10.15,17.55,0
7.0,14.1,0
4.85,11.8,0
4.75,8.6,0
3.25,6.65,0
1.9,9.55,0
2.1,14.75,0
1.5,10.9,0
5.75,9.65,0
7.65,8.1,0
9.6,9.1,0
10.1,2.0,1
12.2,2.75,1
8.0,6.3,1
6.8,5.1,1
7.35,3.65,1
9.5,4.65,1
13.05,7.7,1
17.85,5.15,1
24.35,7.4,1
20.4,13.1,1
14.55,15.4,1
24.95,11.05,1
22.15,11.15,1
22.85,5.85,1
22.5,4.15,1
19.3,1.6,1
15.6,0.25,1
14.5,1.55,1
14.5,3.95,1
10.35,7.1,1
13.65,6.75,1
14.0,5.55,1
12.15,4.8,1
10.5,4.15,1
22.95,8.75,1
21.25,7.05,1
17.05,7.9,1
17.05,7.9,1