Bugs in source files

Hi =)

I have focused on Inception v3:
~/keras/applications/inception_v3.py

  1. [bug] line 240 : ‘mixed8’ should use MaxPooling2D instead of AveragePooling2D
  2. [bug] line 273 : GlobalAveragePooling2D instead of AveragePooling2D; also include the corresponding import
  3. [bug] line 274 : an additional Flatten layer causes include-top weight loading to fail, saying something like “You are trying to load a file with 188 layers into a model having 189 layers”
  4. [bug] line 276 : a missing ‘else’ statement adds one more pooling layer to the model, causing no-top weight loading to fail
  5. [feature] a newer version (v0.5) of the pretrained models with ‘tf’ weights is available, while the ‘th’ weights are still v0.2; maybe it matters.
  6. [feature] Keras under Windows fails to compute the hashsum of the model file, so on every run it downloads the *.h5 file again, saying that the file is obsolete or corrupted
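To illustrate item 6: Keras validates cached downloads by hashing the file on disk. A minimal sketch of that kind of check (the function name and chunk size are my own, not Keras internals):

```python
import hashlib

# Sketch of a file-hash check like the one Keras uses to validate
# cached model downloads (function name and chunk size are assumptions).
def file_md5(path, chunk_size=65536):
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read in chunks so large *.h5 files don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()  # 32-character hex digest
```

If this kind of computation fails or returns the wrong value on Windows, the cache check would always report the file as corrupted and re-download it, which matches the behaviour described above.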

Please refer to the updates in the official repository of François Chollet:

*If you have an official repo, I can leave comments on the source code there, as well as contribute bugfixes via pull requests and sign the corresponding agreement if you need one.

Thanks @Ambreaux. We will check this and include the fix in future versions.

@rajendra Great! Thank you =)

How to add layers on top of pre-trained model?

DLS tries to load weights into the whole model, including custom layers. I think loading weights into the pre-trained model “by_name” can solve this, but then the user needs to set a “name” for each layer, just like the other Advanced Options shown in the right panel, or at least unique names should be generated for the user’s layers.
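A minimal sketch of the kind of unique-name generation I mean (the prefix and scheme are my suggestion, not actual DLS behaviour): appending a counter to a fixed prefix keeps user-added layer names from colliding with the pre-trained model’s names, so “by_name” loading would skip them.

```python
import itertools

# Sketch: generate collision-free names for user-added layers so that
# by_name weight loading only matches the pre-trained model's layers.
# (Prefix and numbering scheme are assumptions, not DLS behaviour.)
_custom_ids = itertools.count(1)

def unique_layer_name(prefix="custom"):
    return "%s_%d" % (prefix, next(_custom_ids))
```

With names like custom_1, custom_2, … none of the custom layers would match a pre-trained layer name, so their (untrained) weights would simply be left at their initial values.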

@Ambreaux You should be able to treat the pre-trained model just like any other layer and add more layers on top of it.

If the weights argument is set to imagenet in the pre-trained model, only the pre-trained model’s layers will use ImageNet weights.

However, Keras-MXNet has a bug where training works fine but model weights are not saved correctly if the trainable parameter is not 100 (which effectively means: don’t use ImageNet’s weights).

We plan to look into it, but would appreciate it if someone from the community can find a fix (the bug exists in the keras-mxnet==1.2.2.1 PyPI version).

Here is a snapshot of the pre-trained model with custom layers.

Hi @rajendra
Thank you for the reply. Yes, exactly: weight loading works fine when layers are added on top of the model. But I have issues when adding layers to the bottom of the model:

  1. I have tried adding Cropping2D after the Input layer, in exactly the same model as in your example. The Cropping2D parameters are ((91,90),(171,170)), so the input shape (None, 3, 480, 640) should be cropped to (None, 3, 299, 299). DLS recalculates the output shapes shown in the model tab and it seems to work fine, but when compiling the model for training it reports a shape error saying that (None, None, 91, 171) is inconsistent with (None, None, -90, -170). How do I properly set the cropping?
  2. I have also tried to add a few more fully convolutional layers between the Input layer and the pre-trained model. DLS says something like “Variable uniformN already declared”, where N depends on the number of Convolution2D layers between the Input layer and the pre-trained model. I have tested up to 10 layers, and the number of the ‘uniform’ variable in the error message is consistent with the number of added layers.

=(
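For reference, the cropping arithmetic in item 1 above checks out; a small sketch (Cropping2D takes ((top, bottom), (left, right)); the helper name is mine):

```python
# Sketch: check the Cropping2D arithmetic from item 1 above.
# cropping = ((top, bottom), (left, right)) removes that many
# rows/columns from each side of the spatial dimensions.
def cropped_hw(height, width, cropping):
    (top, bottom), (left, right) = cropping
    return (height - top - bottom, width - left - right)

# (480, 640) cropped by ((91, 90), (171, 170)) gives (299, 299)
print(cropped_hw(480, 640, ((91, 90), (171, 170))))  # (299, 299)
```

So the parameters themselves produce the expected 299x299 output, which suggests the compile-time error comes from how DLS applies the cropping, not from the values.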

@Ambreaux I see. Yes, we have seen this issue earlier (adding layers before the model doesn’t work). We will file a bug for this.


Regarding the keras-mxnet 1.2.2.1 issue, it seems they are preparing a keras-mxnet 2.2 release with this issue solved:

For now the Python 3 version build is not ready:
Build failed for project keras2-mxnet-python3-pr-build