deeplearning.ai Course 4, Week 2: a Keras walkthrough
Source: Internet | Editor: 程序博客网 | Date: 2024/06/01 10:27
1. Importing the libraries (the running example is the Happy House assignment):
```python
import numpy as np
from keras import layers
from keras.layers import Input, Dense, Activation, ZeroPadding2D, BatchNormalization, Flatten, Conv2D
from keras.layers import AveragePooling2D, MaxPooling2D, Dropout, GlobalMaxPooling2D, GlobalAveragePooling2D
from keras.models import Model
from keras.preprocessing import image
from keras.utils import layer_utils
from keras.utils.data_utils import get_file
from keras.applications.imagenet_utils import preprocess_input
import pydot
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot
from keras.utils import plot_model
from keras import losses
from kt_utils import *

import keras.backend as K
K.set_image_data_format('channels_last')
import matplotlib.pyplot as plt
from matplotlib.pyplot import imshow
%matplotlib inline
#help(Model.compile)
```
2. Loading the training data:
```python
X_train_orig, Y_train_orig, X_test_orig, Y_test_orig, classes = load_dataset()

# Normalize image vectors
X_train = X_train_orig/255.
X_test = X_test_orig/255.

# Reshape
Y_train = Y_train_orig.T
Y_test = Y_test_orig.T

print ("number of training examples = " + str(X_train.shape[0]))
print ("number of test examples = " + str(X_test.shape[0]))
print ("X_train shape: " + str(X_train.shape))
print ("Y_train shape: " + str(Y_train.shape))
print ("X_test shape: " + str(X_test.shape))
print ("Y_test shape: " + str(Y_test.shape))
```
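Since `load_dataset()` comes from `kt_utils` and is not shown here, a minimal sketch with random stand-in data (a hypothetical substitute, not the real dataset) illustrates what the normalization and label transpose do — the real data has the same layout, just more 64×64 RGB images:

```python
import numpy as np

# Hypothetical stand-in for load_dataset(): random 64x64 RGB images,
# with labels shaped (1, m) as in the assignment's .h5 files.
m_train = 8
X_train_orig = np.random.randint(0, 256, (m_train, 64, 64, 3)).astype(np.float64)
Y_train_orig = np.random.randint(0, 2, (1, m_train))

# Normalize pixel values from [0, 255] to [0, 1]
X_train = X_train_orig / 255.
# Transpose labels from (1, m) to (m, 1): one row per example
Y_train = Y_train_orig.T

print(X_train.shape)   # (8, 64, 64, 3)
print(Y_train.shape)   # (8, 1)
```

Keras expects the examples-first layout `(m, height, width, channels)` because `K.set_image_data_format('channels_last')` was set above.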
3. Defining HappyModel (the model follows a LeNet-5-style design: the convolution and pooling layers match LeNet-5, followed by fully connected layers of 100 and 20 units, and a final logistic-regression unit for binary classification):
```python
# GRADED FUNCTION: HappyModel

def HappyModel(input_shape):
    """
    Implementation of the HappyModel.

    Arguments:
    input_shape -- shape of the images of the dataset

    Returns:
    model -- a Model() instance in Keras
    """

    ### START CODE HERE ###
    # Feel free to use the suggested outline in the text above to get started, and run through the whole
    # exercise (including the later portions of this notebook) once. Then come back and try out other
    # network architectures as well.
    X_input = Input(input_shape)                                    # shape (None, 64, 64, 3)

    X = ZeroPadding2D((2, 2))(X_input)                              # shape (None, 68, 68, 3)

    # first convolutional layer of the model                        # shape (None, 64, 64, 6)
    X = Conv2D(6, (5, 5), strides=(1, 1), name='conv1')(X)
    X = BatchNormalization(axis=3, name='bn1')(X)
    X = Activation('relu')(X)

    # first pooling layer                                           # shape (None, 32, 32, 6)
    X = MaxPooling2D((2, 2), strides=(2, 2), name='max_pool_1')(X)

    # second convolutional layer of the model                       # shape (None, 28, 28, 16)
    X = Conv2D(16, (5, 5), strides=(1, 1), name='conv2')(X)
    X = BatchNormalization(axis=3, name='bn2')(X)
    X = Activation('relu')(X)

    # second pooling layer                                          # shape (None, 14, 14, 16)
    X = MaxPooling2D((2, 2), strides=(2, 2), name='max_pool_2')(X)

    # flatten the X values                                          # shape (None, 14*14*16)
    X = Flatten()(X)

    # fully connected layers
    X = Dense(100, activation='relu', name='fc1')(X)                # shape (None, 100)
    X = Dense(20, activation='relu', name='fc2')(X)                 # shape (None, 20)
    X = Dense(1, activation='sigmoid', name='fc3')(X)               # shape (None, 1)

    # create the Keras model instance
    model = Model(inputs=X_input, outputs=X, name='HappyModel')
    ### END CODE HERE ###

    return model
```
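The per-layer shape comments can be double-checked with the standard "valid" output-size formula, ⌊(n + 2p − f)/s⌋ + 1, where n is the input size, f the filter size, s the stride, and p the padding. A quick sketch of the arithmetic (not part of the graded code):

```python
def conv_out(n, f, stride=1, pad=0):
    # 'valid' convolution/pooling output size: floor((n + 2p - f)/s) + 1
    return (n + 2 * pad - f) // stride + 1

n = 64
n = n + 2 * 2            # ZeroPadding2D((2,2)): 64 -> 68
n = conv_out(n, 5)       # conv1, 5x5 stride 1:  68 -> 64
n = conv_out(n, 2, 2)    # max_pool_1, 2x2 s=2:  64 -> 32
n = conv_out(n, 5)       # conv2, 5x5 stride 1:  32 -> 28
n = conv_out(n, 2, 2)    # max_pool_2, 2x2 s=2:  28 -> 14
flat = n * n * 16        # Flatten: 14*14*16 = 3136
print(n, flat)           # 14 3136
```

So the first fully connected layer (`fc1`) receives a 3136-dimensional vector per example.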
4. Running the model
4.1 Instantiating the model:
```python
### START CODE HERE ### (1 line)
happyModel = HappyModel((64,64,3))
### END CODE HERE ###
```
4.2 Compiling the model: defining its learning process and the corresponding hyperparameters
```python
### START CODE HERE ### (1 line)
happyModel.compile(optimizer='Adam', loss='binary_crossentropy', metrics=["accuracy"])
### END CODE HERE ###
```
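`binary_crossentropy` is the natural loss for a single sigmoid output: for one example it is −[y·log(p) + (1−y)·log(1−p)]. A tiny pure-Python illustration of the formula (a sketch, not the Keras implementation, which also averages over the batch):

```python
import math

def binary_crossentropy(y, p, eps=1e-7):
    # -[y*log(p) + (1-y)*log(1-p)], with p clipped away from 0 and 1
    p = min(max(p, eps), 1 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# A confident correct prediction is cheap; a confident wrong one is expensive:
print(round(binary_crossentropy(1, 0.9), 4))   # 0.1054
print(round(binary_crossentropy(1, 0.1), 4))   # 2.3026
```

This asymmetry is what pushes the sigmoid output toward the correct side during training.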
4.3 Training the model:
```python
### START CODE HERE ### (1 line)
happyModel.fit(x=X_train, y=Y_train, epochs=10, batch_size=32)
### END CODE HERE ###
```
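With `batch_size=32`, each epoch runs ⌈m/32⌉ gradient updates over shuffled mini-batches. Assuming the 600 training examples this assignment's dataset is usually reported to contain (the exact count is whatever the data-loading cell above prints):

```python
import math

m_train = 600            # assumed training-set size; see the printout after load_dataset()
batch_size = 32
steps_per_epoch = math.ceil(m_train / batch_size)
print(steps_per_epoch)   # 19
```

So 10 epochs amount to roughly 190 parameter updates, which is enough for this small binary task.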
4.4 Evaluating the model:
```python
### START CODE HERE ### (1 line)
preds = happyModel.evaluate(x=X_test, y=Y_test)
### END CODE HERE ###
print()
print ("Loss = " + str(preds[0]))
print ("Test Accuracy = " + str(preds[1]))
```
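`evaluate()` returns the loss first, then the metrics listed in `compile()`, so here `preds` is `[loss, accuracy]`. The reported accuracy is just the fraction of sigmoid outputs that land on the correct side of 0.5, which can be sketched with hypothetical predictions:

```python
import numpy as np

# Hypothetical sigmoid outputs and true labels, for illustration only
probs  = np.array([0.9, 0.2, 0.7, 0.4])
labels = np.array([1,   0,   0,   1  ])

# Threshold at 0.5 and compare with the labels
acc = np.mean((probs > 0.5).astype(int) == labels)
print(acc)  # 0.5
```

Here two of the four thresholded predictions match their labels, hence 0.5.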