Functional (Generic) Model Example


This assumes some basic familiarity with Keras. Please work through the code alongside the figure; the implementation is not actually complicated.

This article originates from the Keras Chinese documentation site; the original tutorial there contains more detailed explanations. I have only reproduced and recorded it here.

  • Functional model example:
    (figure: diagram of the two-input, two-output model)
  • code:

```python
from keras.layers import Input, Embedding, LSTM, Dense, concatenate
from keras.models import Model

# Headline input: receives sequences of 100 integers, each between 1 and 10000.
# Note that any layer can be named by passing it a "name" argument.
main_input = Input(shape=(100,), dtype='int32', name='main_input')

# This embedding layer encodes the input sequence
# into a sequence of dense 512-dimensional vectors.
x = Embedding(output_dim=512, input_dim=10000, input_length=100)(main_input)

# An LSTM transforms the vector sequence into a single vector,
# containing information about the entire sequence.
lstm_out = LSTM(32)(x)

# Auxiliary output: an extra supervision signal applied directly to the LSTM.
auxiliary_output = Dense(1, activation='sigmoid', name='aux_output')(lstm_out)

# Auxiliary input of 5 extra features, concatenated with the LSTM output.
auxiliary_input = Input(shape=(5,), name='aux_input')
x = concatenate([lstm_out, auxiliary_input])

# We stack a deep fully-connected network on top.
x = Dense(64, activation='relu')(x)
x = Dense(64, activation='relu')(x)
x = Dense(64, activation='relu')(x)

# And finally we add the main logistic-regression layer.
main_output = Dense(1, activation='sigmoid', name='main_output')(x)

model = Model(inputs=[main_input, auxiliary_input],
              outputs=[main_output, auxiliary_output])

model.compile(optimizer='rmsprop', loss='binary_crossentropy',
              loss_weights=[1., 0.2])

model.fit([headline_data, additional_data], [labels, labels],
          epochs=50, batch_size=32)
```
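The `loss_weights=[1., 0.2]` argument means the optimizer minimizes a weighted sum of the two per-output losses, so the auxiliary output contributes only 20% as much as the main output. A minimal NumPy sketch of that computation, with dummy labels and predictions invented purely for illustration:

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy, as used for both sigmoid outputs."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Dummy labels and predictions for a batch of 4 samples (illustrative only).
labels = np.array([1., 0., 1., 1.])
main_pred = np.array([0.9, 0.2, 0.8, 0.7])   # from 'main_output'
aux_pred = np.array([0.6, 0.4, 0.5, 0.9])    # from 'aux_output'

main_loss = binary_crossentropy(labels, main_pred)
aux_loss = binary_crossentropy(labels, aux_pred)

# loss_weights=[1., 0.2]: the auxiliary loss is down-weighted.
total_loss = 1.0 * main_loss + 0.2 * aux_loss
```

Down-weighting the auxiliary loss keeps it as a regularizing signal on the LSTM without letting it dominate training of the main output.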