Writing Your First Artificial Neural Network with Keras


Train a simple four-layer fully connected network.

Ref: http://machinelearningmastery.com/tutorial-first-neural-network-python-keras/

 

1. Load Data


from keras.models import Sequential
from keras.layers import Dense
import numpy

# 1. fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)

# 2. load the pima indians dataset into a 2-D array
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")

# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
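Before defining the model, it helps to sanity-check what was loaded (a minimal check; the expected shapes assume the standard 768-row Pima dataset):

# quick sanity check of the loaded arrays
print(X.shape)      # expected: (768, 8) -- 8 input features per sample
print(Y.shape)      # expected: (768,)   -- one 0/1 label per sample
print(X[0], Y[0])   # first sample and its label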

 

2. Define Model


    • Parameter 1: number of nodes in each layer [layer 1 (input): 8 --> layer 2: 12 --> layer 3: 8 --> layer 4 (output): 1]
    • Parameter 2: weight initialization [uniform distribution over 0-0.05]
    • Parameter 3: activation function [layers 2 and 3: rectifier ('relu'); output layer: sigmoid]

 

The corresponding code is as follows:

# create model
model = Sequential()
model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))
model.add(Dense(8, init='uniform', activation='relu'))
model.add(Dense(1, init='uniform', activation='sigmoid'))
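To verify that the stack matches the layer plan above, print a summary; the per-layer parameter counts can also be checked by hand (weights + biases):

# layer-by-layer output shapes and parameter counts
model.summary()
# expected parameters:
#   Dense(12): 8*12 + 12 = 108
#   Dense(8) : 12*8 + 8  = 104
#   Dense(1) : 8*1  + 1  =   9
#   total                = 221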

 

3. Compile Model


    • Loss function: logarithmic loss (binary_crossentropy for a two-class problem)
    • Optimizer: adam, an efficient gradient descent algorithm

 

The corresponding code is as follows:

# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

Finally, because it is a classification problem, we will collect and report the classification accuracy as the metric.
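If you want to tune the optimizer rather than use the 'adam' string shortcut, you can pass an optimizer object instead (a sketch; lr=0.001 is Adam's default and is shown only to illustrate the knob):

from keras.optimizers import Adam

# same compile step, with an explicit optimizer object
model.compile(loss='binary_crossentropy', optimizer=Adam(lr=0.001), metrics=['accuracy'])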

 

Alternatively, see <6. Save & load model>: an existing model can be loaded and training continued from it.

 

4. Fit Model 


    • Number of passes over the training data: epoch
    • Number of samples per weight update: batch

 

The corresponding code is as follows:

# Fit the model
history_callback = model.fit(X, Y, nb_epoch=150, batch_size=10)
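fit() can also hold out part of the data to monitor generalization while training (a sketch; validation_split is a standard fit() argument and 0.2 is an illustrative ratio):

# hold out the last 20% of X/Y as a validation set;
# val_loss and val_acc are then reported alongside loss/acc each epoch
history_callback = model.fit(X, Y, nb_epoch=150, batch_size=10, validation_split=0.2)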

 

5. Evaluate Model

After training, use model.evaluate(...) to report the accuracy:

# evaluate the model
scores = model.evaluate(X, Y)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

 

Run result: acc: 78.91%

Epoch 1/150
768/768 [==============================] - 0s - loss: 0.6826 - acc: 0.6328
Epoch 2/150
768/768 [==============================] - 0s - loss: 0.6590 - acc: 0.6510
Epoch 3/150
768/768 [==============================] - 0s - loss: 0.6475 - acc: 0.6549
...
Epoch 148/150
768/768 [==============================] - 0s - loss: 0.4659 - acc: 0.7786
Epoch 149/150
768/768 [==============================] - 0s - loss: 0.4596 - acc: 0.7799
Epoch 150/150
768/768 [==============================] - 0s - loss: 0.4615 - acc: 0.7773
 32/768 [>.............................] - ETA: 0s
acc: 78.91%
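Note that the 78.91% above is measured on the same data the network was trained on, so it is a training accuracy, not an estimate of performance on new patients. A minimal sketch of what splitting the data first would look like (the 80/20 ratio is illustrative):

# train on the first 80% of rows, evaluate on the held-out 20%
split = int(len(X) * 0.8)          # 614 of 768 rows
model.fit(X[:split], Y[:split], nb_epoch=150, batch_size=10)
scores = model.evaluate(X[split:], Y[split:])
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))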

 

6. Save & load model

Analyzing the log: How to log Keras loss output to a file

loss_history = history_callback.history["loss"]
acc_history  = history_callback.history["acc"]
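To persist these per-epoch values to a file (the point of the linked question), one simple option is numpy.savetxt (file names are illustrative):

import numpy

# one value per line, one line per epoch
numpy.savetxt("loss_history.txt", numpy.array(loss_history), delimiter=",")
numpy.savetxt("acc_history.txt", numpy.array(acc_history), delimiter=",")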

Save and Load Your Keras Deep Learning Models

    • Model architecture: model.json
    • Weights: model.h5

 

The corresponding code is as follows:

from keras.models import model_from_json

# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w+") as json_file:
    json_file.write(model_json)

# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later...

# load json and create model
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)

# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

 

from keras.models import model_from_yaml

# serialize model to YAML
model_yaml = model.to_yaml()
with open("model.yaml", "w") as yaml_file:
    yaml_file.write(model_yaml)

# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later...

# load YAML and create model
yaml_file = open('model.yaml', 'r')
loaded_model_yaml = yaml_file.read()
yaml_file.close()
loaded_model = model_from_yaml(loaded_model_yaml)

# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")
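With the JSON/YAML route, the restored model must be compiled again before calling evaluate() or resuming training. Alternatively, Keras versions that provide model.save()/load_model() can store architecture, weights, and optimizer state in a single HDF5 file (a sketch; "model_full.h5" is an illustrative name):

from keras.models import load_model

# recompile a model restored from JSON/YAML before using it:
loaded_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# single-file alternative: no separate compile step needed after loading
model.save("model_full.h5")
restored = load_model("model_full.h5")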

  

7. Make Predictions

Load new data with numpy.loadtxt(...) and place it in X.

# calculate predictions
predictions = model.predict(X)
# round predictions to 0/1 class labels
rounded = [int(round(x[0])) for x in predictions]
print(rounded)
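The same call works for a single new patient passed as a 1-row array (a sketch; the feature values are illustrative, in the dataset's column order):

import numpy

# one hypothetical patient: 8 feature values
new_sample = numpy.array([[6, 148, 72, 35, 0, 33.6, 0.627, 50]])
probability = model.predict(new_sample)[0][0]   # sigmoid output in [0, 1]
print("P(diabetic) = %.3f -> %s" % (probability, "diabetic" if probability > 0.5 else "not diabetic"))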