Getting Started with Deep Learning in Keras

Source: Internet · Editor: 程序博客网 · Time: 2024/04/28 09:31

Goal: classify the Iris dataset

Each Iris sample has 4 features, so the input layer has 4 neurons. We define two hidden layers with 10 neurons each. Since the Iris samples fall into 3 classes, the final layer has 3 neurons with a softmax activation, which normalizes the outputs into values between 0 and 1 that sum to 1 and can be read as class probabilities.
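As a quick sanity check on this 4-10-10-3 architecture, the number of trainable parameters can be computed by hand: each fully connected layer contributes in_dim * out_dim weights plus out_dim biases.

```python
# Parameter count for the 4 -> 10 -> 10 -> 3 fully connected network
layer_sizes = [4, 10, 10, 3]
params = sum(layer_sizes[i] * layer_sizes[i + 1] + layer_sizes[i + 1]
             for i in range(len(layer_sizes) - 1))
print(params)  # 50 + 110 + 33 = 193
```

This matches the total that model.summary() would report for the network built below.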

Key functions:

keras.utils.to_categorical() converts integer labels into one-hot vectors. For example, given data_label = [0, 1, 4], the function infers the one-hot dimension from the largest label (largest label + 1 = 5 columns here), so these labels map to the following one-hot matrix:

[[ 1.  0.  0.  0.  0.]
 [ 0.  1.  0.  0.  0.]
 [ 0.  0.  0.  0.  1.]]
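The behavior can be sketched in plain NumPy; to_one_hot below is a hypothetical stand-in written for illustration, not the actual Keras implementation:

```python
import numpy as np

def to_one_hot(labels):
    """Minimal sketch of keras.utils.to_categorical for integer labels."""
    labels = np.asarray(labels)
    n_classes = labels.max() + 1          # dimension = largest label + 1
    out = np.zeros((labels.size, n_classes))
    out[np.arange(labels.size), labels] = 1.0
    return out

print(to_one_hot([0, 1, 4]))
```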

In fit(), the first argument (train) is the network's input data and the second (label) is the target output; each maps one-to-one onto the input and output neurons. For this problem the training input therefore has shape (None, 4) and the targets have shape (None, 3), with inputs and targets paired row by row. The batch_size argument of fit() is the number of randomly sampled training examples used per gradient update; the samples within a batch can be processed in parallel on a GPU, which speeds up training.
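The mini-batch sampling that fit() performs with batch_size=20 can be sketched as follows. X and y here are random placeholder arrays with the shapes used in this problem, not the real Iris data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 4))   # stand-in for the (None, 4) Iris features
y = np.zeros((150, 3))          # stand-in for the (None, 3) one-hot labels

batch_size = 20
idx = rng.permutation(len(X))   # shuffle the sample indices once per epoch
batches = [idx[i:i + batch_size] for i in range(0, len(idx), batch_size)]
# each gradient update then uses one pair X[b], y[b] for a batch b;
# 150 samples with batch_size=20 gives 8 batches, the last holding 10 samples
print(len(batches), batches[-1].size)
```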

Code:

# -*- coding: utf-8 -*-
"""
Created on Fri Sep 15 20:00:39 2017
@author: wjw
"""
import keras
import numpy as np
from keras.layers.core import Dense, Activation

def readText(filePath):
    lines = open(filePath, 'r').readlines()
    data = []
    dataClass = []
    for line in lines:
        dataList = line.split(',')
        data.append([float(dataList[0]), float(dataList[1]),
                     float(dataList[2]), float(dataList[3])])
        dataClass.append(dataList[4].strip())
    # map the species names to integer class labels 0/1/2
    new_class = []
    for name in dataClass:
        if name == "Iris-setosa":
            new_class.append(0)
        elif name == "Iris-versicolor":
            new_class.append(1)
        else:
            new_class.append(2)
    return np.array(data), np.array(new_class)

model = keras.models.Sequential()  # initialize a sequential network
model.add(Dense(input_dim=4, output_dim=10))  # Dense is a fully connected layer
model.add(Activation("sigmoid"))
model.add(Dense(output_dim=10))
model.add(Activation("sigmoid"))
model.add(Dense(output_dim=3))
model.add(Activation("softmax"))
model.compile(loss="categorical_crossentropy",
              optimizer="adam",
              metrics=["accuracy"])
# loss is categorical cross-entropy; the optimizer is adam

filePath = r"E:\data\iris.txt"
traindata, dataClass = readText(filePath)
dataClass = keras.utils.to_categorical(dataClass)
print(dataClass)

model.fit(traindata, dataClass, batch_size=20, epochs=2000)
score = model.evaluate(traindata, dataClass, batch_size=20)
print(score)

Partial training output:

Epoch 1/200
150/150 [==============================] - 0s - loss: 1.2763 - acc: 0.3333
Epoch 2/200
150/150 [==============================] - 0s - loss: 1.2426 - acc: 0.3333
Epoch 3/200
150/150 [==============================] - 0s - loss: 1.2117 - acc: 0.3333
Epoch 4/200
150/150 [==============================] - 0s - loss: 1.1872 - acc: 0.3333
Epoch 5/200
150/150 [==============================] - 0s - loss: 1.1667 - acc: 0.3333
Epoch 6/200
150/150 [==============================] - 0s - loss: 1.1497 - acc: 0.3333
Epoch 7/200
150/150 [==============================] - 0s - loss: 1.1342 - acc: 0.3333
Epoch 8/200
150/150 [==============================] - 0s - loss: 1.1214 - acc: 0.3333
Epoch 9/200
150/150 [==============================] - 0s - loss: 1.1107 - acc: 0.3333
Epoch 10/200
150/150 [==============================] - 0s - loss: 1.1015 - acc: 0.3333
Epoch 11/200
150/150 [==============================] - 0s - loss: 1.0942 - acc: 0.3333
Epoch 12/200
150/150 [==============================] - 0s - loss: 1.0873 - acc: 0.2467
Epoch 13/200
150/150 [==============================] - 0s - loss: 1.0816 - acc: 0.1867
Epoch 14/200
150/150 [==============================] - 0s - loss: 1.0747 - acc: 0.3267
Epoch 15/200
150/150 [==============================] - 0s - loss: 1.0693 - acc: 0.3200
Epoch 16/200
150/150 [==============================] - 0s - loss: 1.0637 - acc: 0.3933
Epoch 17/200
150/150 [==============================] - 0s - loss: 1.0583 - acc: 0.3933
Epoch 18/200
150/150 [==============================] - 0s - loss: 1.0524 - acc: 0.4733
Epoch 19/200
150/150 [==============================] - 0s - loss: 1.0461 - acc: 0.6533
Epoch 20/200
150/150 [==============================] - 0s - loss: 1.0393 - acc: 0.8000
Epoch 21/200
150/150 [==============================] - 0s - loss: 1.0318 - acc: 0.7333
Epoch 22/200
150/150 [==============================] - 0s - loss: 1.0248 - acc: 0.7200
Epoch 23/200
150/150 [==============================] - 0s - loss: 1.0165 - acc: 0.7133
Epoch 24/200
150/150 [==============================] - 0s - loss: 1.0087 - acc: 0.7067
Epoch 25/200
150/150 [==============================] - 0s - loss: 1.0005 - acc: 0.6933
Epoch 26/200
150/150 [==============================] - 0s - loss: 0.9926 - acc: 0.7067
Epoch 27/200
150/150 [==============================] - 0s - loss: 0.9839 - acc: 0.7467
Epoch 28/200
150/150 [==============================] - 0s - loss: 0.9752 - acc: 0.7467
Epoch 29/200
150/150 [==============================] - 0s - loss: 0.9666 - acc: 0.7333
Epoch 30/200
150/150 [==============================] - 0s - loss: 0.9581 - acc: 0.7400
Epoch 31/200
150/150 [==============================] - 0s - loss: 0.9488 - acc: 0.7267
Epoch 32/200
150/150 [==============================] - 0s - loss: 0.9398 - acc: 0.7200
Epoch 33/200
150/150 [==============================] - 0s - loss: 0.9305 - acc: 0.7467



