Getting Started with Deep Learning in Keras
Goal: classify the Iris dataset.
The Iris data is 4-dimensional, so the first layer has 4 input neurons. We define two hidden layers with 10 neurons each. Since the Iris data is split into 3 classes, the last layer has 3 neurons and uses a softmax activation, which normalizes the outputs to the 0-1 range so they form a probability distribution over the classes.
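The layer sizes above can be checked with a plain NumPy sketch of the forward pass (randomly initialized weights, purely to verify the dimensions; this is not the Keras implementation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 10)), np.zeros(10)   # input: 4 features -> 10 neurons
W2, b2 = rng.normal(size=(10, 10)), np.zeros(10)  # hidden: 10 -> 10
W3, b3 = rng.normal(size=(10, 3)), np.zeros(3)    # output: 10 -> 3 classes

x = rng.normal(size=(5, 4))         # a batch of 5 samples, 4 features each
h1 = sigmoid(x @ W1 + b1)
h2 = sigmoid(h1 @ W2 + b2)
y = softmax(h2 @ W3 + b3)
print(y.shape)                      # (5, 3): one probability triple per sample
print(y.sum(axis=1))                # each row sums to 1 because of softmax
```

Because the last layer is softmax, each output row sums to 1, so the three values can be read directly as class probabilities.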
Key function:
keras.utils.to_categorical() converts integer labels into one-hot vectors. For example, with data_label = [0, 1, 4], the function infers the one-hot dimension from the largest label, so the labels above map to:
[[ 1. 0. 0. 0. 0.]
 [ 0. 1. 0. 0. 0.]
 [ 0. 0. 0. 0. 1.]]
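The same behavior can be reproduced with plain NumPy (a sketch of what to_categorical does, not the Keras source):

```python
import numpy as np

def one_hot(labels, num_classes=None):
    labels = np.asarray(labels)
    if num_classes is None:
        # dimension inferred from the largest label, as to_categorical does
        num_classes = labels.max() + 1
    out = np.zeros((labels.size, num_classes))
    out[np.arange(labels.size), labels] = 1.0
    return out

print(one_hot([0, 1, 4]))
# [[1. 0. 0. 0. 0.]
#  [0. 1. 0. 0. 0.]
#  [0. 0. 0. 0. 1.]]
```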
In fit(), the training data is the network's input and the labels are its target output; both align with the neurons layer by layer, so for this problem the input has shape (None, 4) and the output has shape (None, 3), with each input row paired one-to-one with a label row. The batch_size argument of fit() is the number of samples drawn per gradient update; with GPU acceleration the samples in a batch are processed in parallel, which speeds up training, and mini-batch updates are typically faster and more stable than updating on single samples.
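The shape pairing and mini-batching described above can be sketched in NumPy (random stand-in data, purely illustrative of what fit() does internally each epoch):

```python
import numpy as np

rng = np.random.default_rng(42)
N, batch_size = 150, 20
X = rng.normal(size=(N, 4))   # stand-in for the (None, 4) Iris inputs
Y = np.zeros((N, 3))          # stand-in for the (None, 3) one-hot labels
Y[np.arange(N), rng.integers(0, 3, size=N)] = 1.0

# One epoch: shuffle once, then walk the data in mini-batches
order = rng.permutation(N)
for start in range(0, N, batch_size):
    idx = order[start:start + batch_size]
    xb, yb = X[idx], Y[idx]   # rows stay paired: input i matches label i
    # a gradient step on (xb, yb) would happen here

# 150 = 7 * 20 + 10, so the final batch holds the remaining 10 samples
print(xb.shape, yb.shape)     # (10, 4) (10, 3)
```

Shuffling once per epoch and indexing inputs and labels with the same index array keeps every sample paired with its own label.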
Code:

```python
# -*- coding: utf-8 -*-
"""
Created on Fri Sep 15 20:00:39 2017
@author: wjw
"""
import keras
import numpy as np
from keras.layers import Dense, Activation

def readText(filePath):
    """Read the Iris text file: 4 float features plus a class name per line."""
    with open(filePath, 'r') as f:
        lines = f.readlines()
    data = []
    dataClass = []
    for line in lines:
        dataList = line.split(',')
        data.append([float(dataList[0]), float(dataList[1]),
                     float(dataList[2]), float(dataList[3])])
        dataClass.append(dataList[4].strip())
    # Map the three class names to integer labels 0, 1, 2
    new_class = []
    for name in dataClass:
        if name == "Iris-setosa":
            new_class.append(0)
        elif name == "Iris-versicolor":
            new_class.append(1)
        else:
            new_class.append(2)
    return np.array(data), np.array(new_class)

model = keras.models.Sequential()   # initialize a network
model.add(Dense(10, input_dim=4))   # Dense is a fully connected layer
model.add(Activation("sigmoid"))
model.add(Dense(10))
model.add(Activation("sigmoid"))
model.add(Dense(3))
model.add(Activation("softmax"))
# categorical_crossentropy defines the loss; the optimizer is adam
model.compile(loss="categorical_crossentropy", optimizer="adam",
              metrics=["accuracy"])

filePath = r"E:\data\iris.txt"
traindata, dataClass = readText(filePath)
dataClass = keras.utils.to_categorical(dataClass)
print(dataClass)
model.fit(traindata, dataClass, batch_size=20, epochs=2000)
score = model.evaluate(traindata, dataClass, batch_size=20)
print(score)
```
Partial training output:

```
Epoch 1/200
150/150 [==============================] - 0s - loss: 1.2763 - acc: 0.3333
Epoch 2/200
150/150 [==============================] - 0s - loss: 1.2426 - acc: 0.3333
Epoch 3/200
150/150 [==============================] - 0s - loss: 1.2117 - acc: 0.3333
Epoch 4/200
150/150 [==============================] - 0s - loss: 1.1872 - acc: 0.3333
Epoch 5/200
150/150 [==============================] - 0s - loss: 1.1667 - acc: 0.3333
Epoch 6/200
150/150 [==============================] - 0s - loss: 1.1497 - acc: 0.3333
Epoch 7/200
150/150 [==============================] - 0s - loss: 1.1342 - acc: 0.3333
Epoch 8/200
150/150 [==============================] - 0s - loss: 1.1214 - acc: 0.3333
Epoch 9/200
150/150 [==============================] - 0s - loss: 1.1107 - acc: 0.3333
Epoch 10/200
150/150 [==============================] - 0s - loss: 1.1015 - acc: 0.3333
Epoch 11/200
150/150 [==============================] - 0s - loss: 1.0942 - acc: 0.3333
Epoch 12/200
150/150 [==============================] - 0s - loss: 1.0873 - acc: 0.2467
Epoch 13/200
150/150 [==============================] - 0s - loss: 1.0816 - acc: 0.1867
Epoch 14/200
150/150 [==============================] - 0s - loss: 1.0747 - acc: 0.3267
Epoch 15/200
150/150 [==============================] - 0s - loss: 1.0693 - acc: 0.3200
Epoch 16/200
150/150 [==============================] - 0s - loss: 1.0637 - acc: 0.3933
Epoch 17/200
150/150 [==============================] - 0s - loss: 1.0583 - acc: 0.3933
Epoch 18/200
150/150 [==============================] - 0s - loss: 1.0524 - acc: 0.4733
Epoch 19/200
150/150 [==============================] - 0s - loss: 1.0461 - acc: 0.6533
Epoch 20/200
150/150 [==============================] - 0s - loss: 1.0393 - acc: 0.8000
Epoch 21/200
150/150 [==============================] - 0s - loss: 1.0318 - acc: 0.7333
Epoch 22/200
150/150 [==============================] - 0s - loss: 1.0248 - acc: 0.7200
Epoch 23/200
150/150 [==============================] - 0s - loss: 1.0165 - acc: 0.7133
Epoch 24/200
150/150 [==============================] - 0s - loss: 1.0087 - acc: 0.7067
Epoch 25/200
150/150 [==============================] - 0s - loss: 1.0005 - acc: 0.6933
Epoch 26/200
150/150 [==============================] - 0s - loss: 0.9926 - acc: 0.7067
Epoch 27/200
150/150 [==============================] - 0s - loss: 0.9839 - acc: 0.7467
Epoch 28/200
150/150 [==============================] - 0s - loss: 0.9752 - acc: 0.7467
Epoch 29/200
150/150 [==============================] - 0s - loss: 0.9666 - acc: 0.7333
Epoch 30/200
150/150 [==============================] - 0s - loss: 0.9581 - acc: 0.7400
Epoch 31/200
150/150 [==============================] - 0s - loss: 0.9488 - acc: 0.7267
Epoch 32/200
150/150 [==============================] - 0s - loss: 0.9398 - acc: 0.7200
Epoch 33/200
150/150 [==============================] - 0s - loss: 0.9305 - acc: 0.7467
```