Implementing the LeNet-5 model in Keras

Source: Internet | Editor: 程序博客网 | Date: 2024/05/22 06:30

First, the structure diagram of the LeNet-5 model:

[Figure: LeNet-5 model architecture diagram]

This article does not cover the exact parameter counts of each LeNet-5 layer or how the layers are connected; interested readers can follow this link.

The following code implements the model in Keras, using Theano as the backend.

```python
# encoding: utf-8
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Flatten, Conv2D, MaxPool2D
from keras.optimizers import SGD
from keras.utils import to_categorical

(x_train, y_train), (x_test, y_test) = mnist.load_data()
print(x_train.shape)
print(y_train.shape)
print(x_test.shape)
print(y_test.shape)

# With the Theano backend, image tensors put the channel dimension first;
# TensorFlow, by contrast, puts the channel dimension last.
x_train = x_train.reshape(60000, 1, 28, 28)
x_test = x_test.reshape(10000, 1, 28, 28)

# One-hot encode the labels
y_train = to_categorical(y_train, num_classes=10)
y_test = to_categorical(y_test, num_classes=10)

# create model
model = Sequential()
model.add(Conv2D(filters=6, kernel_size=(5, 5), padding='valid',
                 input_shape=(1, 28, 28), activation='tanh'))
model.add(MaxPool2D(pool_size=(2, 2)))
model.add(Conv2D(filters=16, kernel_size=(5, 5), padding='valid',
                 activation='tanh'))
model.add(MaxPool2D(pool_size=(2, 2)))

# After the second pooling layer there are 16 feature maps of 4x4;
# flatten them into a single vector of 256 units.
model.add(Flatten())

# Fully connected layers
model.add(Dense(120, activation='tanh'))
model.add(Dense(84, activation='tanh'))
model.add(Dense(10, activation='softmax'))

# compile model
# Experience shows that for classification problems, cross entropy
# works better as the loss function.
model.compile(
    loss='categorical_crossentropy',
    optimizer=SGD(lr=0.1),
    metrics=['accuracy'])

# train model
model.fit(x_train, y_train, batch_size=128, epochs=2)

# evaluate model
score = model.evaluate(x_test, y_test)
print("Total loss on Testing Set:", score[0])
print("Accuracy of Testing Set:", score[1])
```
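The claim that flattening yields 256 units can be checked by hand with the standard output-size arithmetic for 'valid' convolutions and non-overlapping pooling. This is a minimal sketch; the helper names `conv_out` and `pool_out` are illustrative, not part of Keras:

```python
# Output size of a 'valid' convolution with a k x k kernel on an n x n input
def conv_out(n, k):
    return n - k + 1

# Output size of non-overlapping p x p max pooling
def pool_out(n, p):
    return n // p

n = 28                   # MNIST input is 28x28
n = conv_out(n, 5)       # Conv2D(6, 5x5, 'valid')  -> 24
n = pool_out(n, 2)       # MaxPool2D(2x2)           -> 12
n = conv_out(n, 5)       # Conv2D(16, 5x5, 'valid') -> 8
n = pool_out(n, 2)       # MaxPool2D(2x2)           -> 4

flat_units = 16 * n * n  # 16 feature maps of 4x4
print(flat_units)        # -> 256
```

So the Flatten layer feeds 256 values into the first Dense(120) layer.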