
Assembling Keras Models


When training a larger network, you often want to load a pretrained model, but if you then try to add pieces to its architecture, a problem or two can come up...

Below are several ways to make such additions, using a single-output regression task as the example:

# Append at the end:
from keras.applications.inception_v3 import InceptionV3
from keras.layers import Input, Conv2D, GlobalAveragePooling2D, Flatten, Dense, Activation, concatenate
from keras.models import Model

base_model = InceptionV3(weights='imagenet', include_top=False)
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1, activation='relu')(x)    # single-output regression head

model = Model(inputs=base_model.input, outputs=x)
model.summary()
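A common follow-up when appending a new head like this (not shown in the original snippet, so treat it as an assumption) is to freeze the pretrained InceptionV3 layers so that, at least at first, only the new Dense head is trained; a minimal sketch:

# Assumption, not from the original post: freeze the pretrained backbone
# so only the appended Dense head is updated during training.
for layer in base_model.layers:
    layer.trainable = False
model.compile(optimizer='adam', loss='mse')    # mean squared error for the single-output regression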
# Add at the beginning and at the end:
# Prepend 1x1 convolutions so the 4-channel input is reduced to 3 channels before it reaches InceptionV3
def build_head_model(input_shape=(150, 150, 4)):
    input_tensor = Input(input_shape)
    x = Conv2D(128, (1, 1), activation='relu')(input_tensor)
    x = Conv2D(3, (1, 1), activation='relu')(x)    # down to the 3 channels InceptionV3 expects
    model = Model(inputs=input_tensor, outputs=x, name='head')
    return model

head_model = build_head_model()
body_model = InceptionV3(weights='imagenet', include_top=False)
base_model = Model(head_model.input, body_model(head_model.output))    # chain head -> InceptionV3
base_model.summary()
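As a quick sanity check that the stitched model really accepts 4-channel input, you can push a dummy batch through it; the random data below is purely illustrative and not from the original post:

# Assumed usage: feed a fake batch of 4-channel 150x150 images through the stitched model.
import numpy as np
dummy = np.random.rand(2, 150, 150, 4).astype('float32')
features = base_model.predict(dummy)
print(features.shape)    # spatial InceptionV3 feature maps with 2048 channels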
# Merge two input streams at the end:
base_model = InceptionV3(weights='imagenet', include_top=False, input_shape=(150, 150, 3))
flat = Flatten()(base_model.output)
input_K = Input((100, ))    # another_input
K_flow = Activation(activation='linear')(input_K)    # identity op, just passes the auxiliary input through
x = concatenate([flat, K_flow])    # merge the two streams
x = Dense(1024, activation='relu')(x)
x = Dense(512, activation='relu')(x)
x = Dense(1, activation='relu')(x)
model = Model(inputs=[*base_model.inputs, input_K], outputs=x)    # the data generator should also yield batches in the form ([x_0, x_1], y)
model.summary()
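The trailing comment about the data generator is worth one concrete sketch: each batch has to pair the image tensor with the auxiliary 100-dimensional vector. The array names (X_img, X_aux, y) are hypothetical placeholders, not from the original post:

# Minimal generator sketch (assumed shapes and names):
import numpy as np

def two_input_generator(images, aux, targets, batch_size=32):
    n = len(targets)
    while True:
        idx = np.random.choice(n, batch_size)    # sample a random batch
        yield [images[idx], aux[idx]], targets[idx]    # ([x_0, x_1], y)

# model.fit_generator(two_input_generator(X_img, X_aux, y), steps_per_epoch=len(y) // 32, epochs=10)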

Original post: https://www.cnblogs.com/ZhengPeng7/p/9904771.html
