### Background and what I want to achieve
For a model built with Keras (pretrained EfficientNetB0 + fully connected layers), I want to achieve the following two things.
1. Show the fully connected layers in detail in model.summary()
When I call model.summary(), the EfficientNet structure is shown layer by layer, but the fully connected layers are lumped together as a single Sequential.
I would like them displayed at the same granularity (individual Conv2D, Activation, etc.).
2. Remove some of the output-side fully connected layers
As shown below, the fully connected part consists of Flatten → Dense → Dropout → Dense, and I want to remove the output-side Dropout and Dense.
Currently, calling model.layers.pop() removes the entire Sequential at once.
### Model
The model was created as follows.
from keras.models import Sequential, Model, load_model
from keras.layers import Flatten, Dense, Input, Dropout
from efficientnet.keras import EfficientNetB0
from keras import optimizers

input_shape = (9, 128, 3)
input_tensor = Input(shape=input_shape)

# EfficientNetB0
main_model = EfficientNetB0(include_top=False, weights='imagenet', input_tensor=input_tensor)

# Fully connected layers
top_model = Sequential()
top_model.add(Flatten(input_shape=main_model.output_shape[1:]))
top_model.add(Dense(256, activation='relu'))
top_model.add(Dropout(0.5))
top_model.add(Dense(2, activation='softmax'))

top_model = Model(inputs=main_model.input, outputs=top_model(main_model.output))
top_model.compile(loss='categorical_crossentropy',
                  optimizer=optimizers.SGD(lr=1e-3, momentum=0.9),
                  metrics=['accuracy'])

path_model = "./models/EfficientNetB0_FC_not_work.h5"
top_model.save(path_model)
### The problem / error output
Regarding 1
I loaded the model created above and called model.summary(); as shown below, the fully connected layers were lumped together as a single Sequential.
path_model = "./models/EfficientNetB0_FC_not_work.h5"
model = load_model(path_model, compile=False)
print(model.summary())
Model: "model_1"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, 9, 128, 3)    0
__________________________________________________________________________________________________
stem_conv (Conv2D)              (None, 5, 64, 32)    864         input_1[0][0]
__________________________________________________________________________________________________
stem_bn (BatchNormalization)    (None, 5, 64, 32)    128         stem_conv[0][0]
__________________________________________________________________________________________________
stem_activation (Activation)    (None, 5, 64, 32)    0           stem_bn[0][0]
__________________________________________________________________________________________________
(snip)
__________________________________________________________________________________________________
block7a_project_conv (Conv2D)   (None, 1, 4, 320)    368640      block7a_se_excite[0][0]
__________________________________________________________________________________________________
block7a_project_bn (BatchNormal (None, 1, 4, 320)    1280        block7a_project_conv[0][0]
__________________________________________________________________________________________________
top_conv (Conv2D)               (None, 1, 4, 1280)   409600      block7a_project_bn[0][0]
__________________________________________________________________________________________________
top_bn (BatchNormalization)     (None, 1, 4, 1280)   5120        top_conv[0][0]
__________________________________________________________________________________________________
top_activation (Activation)     (None, 1, 4, 1280)   0           top_bn[0][0]
__________________________________________________________________________________________________
sequential_1 (Sequential)       (None, 2)            1311490     top_activation[0][0]
==================================================================================================
Total params: 5,361,054
Trainable params: 5,319,038
Non-trainable params: 42,016
__________________________________________________________________________________________________
None
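(Editor's note: one possible approach for issue 1, sketched below. The nested Sequential is itself a layer of the outer model, so calling summary() on it directly reveals its sub-layers; newer Keras versions also accept summary(expand_nested=True). The sketch uses tf.keras and a toy Dense head in place of EfficientNetB0, since the idea does not depend on the backbone.)

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Toy stand-in for the question's setup: a Sequential head attached to a
# functional model shows up as a single "sequential" row in summary().
inp = tf.keras.Input(shape=(4,))
head = models.Sequential([
    layers.Dense(8, activation='relu'),
    layers.Dropout(0.5),
    layers.Dense(2, activation='softmax'),
])
model = models.Model(inputs=inp, outputs=head(inp))

model.summary()  # the head appears as one Sequential row

# A nested model keeps its own .layers, and its own summary() lists the
# Dense/Dropout layers individually.
for layer in model.layers:
    if isinstance(layer, tf.keras.Model):
        layer.summary()

# In recent Keras, this prints nested layers inline in one table:
model.summary(expand_nested=True)
```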
Regarding 2
After model.layers.pop() I called model.summary(); as shown below, the entire Sequential was removed.
model.layers.pop()
print(model.summary())
Model: "model_1"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, 9, 128, 3)    0
__________________________________________________________________________________________________
stem_conv (Conv2D)              (None, 5, 64, 32)    864         input_1[0][0]
__________________________________________________________________________________________________
stem_bn (BatchNormalization)    (None, 5, 64, 32)    128         stem_conv[0][0]
__________________________________________________________________________________________________
stem_activation (Activation)    (None, 5, 64, 32)    0           stem_bn[0][0]
__________________________________________________________________________________________________
(snip)
__________________________________________________________________________________________________
block7a_project_conv (Conv2D)   (None, 1, 4, 320)    368640      block7a_se_excite[0][0]
__________________________________________________________________________________________________
block7a_project_bn (BatchNormal (None, 1, 4, 320)    1280        block7a_project_conv[0][0]
__________________________________________________________________________________________________
top_conv (Conv2D)               (None, 1, 4, 1280)   409600      block7a_project_bn[0][0]
__________________________________________________________________________________________________
top_bn (BatchNormalization)     (None, 1, 4, 1280)   5120        top_conv[0][0]
__________________________________________________________________________________________________
top_activation (Activation)     (None, 1, 4, 1280)   0           top_bn[0][0]
==================================================================================================
Total params: 4,049,564
Trainable params: 4,007,548
Non-trainable params: 42,016
__________________________________________________________________________________________________
None
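(Editor's note: one possible approach for issue 2, sketched below. layers.pop() mutates the layer list but not the underlying graph, so it is not a reliable way to trim a model. A common alternative is to build the head functionally with named layers and then define a new Model that ends at the layer to keep. The sketch uses tf.keras, a small Conv2D standing in for EfficientNetB0, and a made-up layer name 'fc1'.)

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Small Conv2D backbone standing in for EfficientNetB0 so the sketch
# runs without the efficientnet package.
inp = tf.keras.Input(shape=(9, 128, 3))
feat = layers.Conv2D(8, 3, activation='relu')(inp)

# Head built functionally, layer by layer; every layer then appears
# individually in full.summary(), with no nested Sequential.
x = layers.Flatten()(feat)
x = layers.Dense(256, activation='relu', name='fc1')(x)
x = layers.Dropout(0.5)(x)
out = layers.Dense(2, activation='softmax')(x)
full = models.Model(inputs=inp, outputs=out)

# Instead of layers.pop(), define a new Model that ends at the layer to
# keep; this drops the trailing Dropout and Dense cleanly.
trimmed = models.Model(inputs=full.input, outputs=full.get_layer('fc1').output)
```

Because the head is no longer a nested Sequential, building it this way would also make every fully connected layer show up in summary(), addressing issue 1 at the same time.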
### Additional information
I am a machine learning beginner with little programming experience or computer science background, so I would greatly appreciate an explanation that is easy for a beginner to follow.