Export trained model to tflite format or convert .h5 to .tflite

I created a model with DLS and finished training it there. Then I downloaded the model and saw that it is in tensorflow.keras format ("model.h5"). Now I want to use the model on a mobile device or on my Raspberry Pi, so it is necessary to convert it to TensorFlow Lite format (tflite).

My problem is that the export to tflite doesn't work with TFLiteConverter (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/g3doc/r1/convert/python_api.md). It fails on my Raspberry Pi and also in a Google Colaboratory notebook. The latter runs TFLiteConverter but stops with an exception that I can't fix:
in ()
      1 from tensorflow import lite
----> 2 converter = lite.TFLiteConverter.from_keras_model_file("model.h5")
      3 tfmodel = converter.convert()
      4 open("model.tflite", "wb").write(tfmodel)

11 frames
/usr/local/lib/python3.6/dist-packages/tensorflow_core/lite/python/lite.py in from_keras_model_file(cls, model_file, input_arrays, input_shapes, output_arrays, custom_objects)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/saving/save.py in load_model(filepath, custom_objects, compile)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/saving/hdf5_format.py in load_model_from_hdf5(filepath, custom_objects, compile)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/saving/model_config.py in model_from_config(config, custom_objects)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/layers/serialization.py in deserialize(config, custom_objects)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/utils/generic_utils.py in deserialize_keras_object(identifier, module_objects, custom_objects, printable_module_name)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/engine/network.py in from_config(cls, config, custom_objects)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/engine/network.py in process_layer(layer_data)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/layers/serialization.py in deserialize(config, custom_objects)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/utils/generic_utils.py in deserialize_keras_object(identifier, module_objects, custom_objects, printable_module_name)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/engine/base_layer.py in from_config(cls, config)

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/engine/input_layer.py in __init__(self, input_shape, batch_size, dtype, input_tensor, sparse, name, ragged, **kwargs)

ValueError: ('Unrecognized keyword arguments:', dict_keys(['input_dtype']))

Can anybody help me fix this problem or help me create a tflite-formatted version of my model?

Problem solved:
The downloaded model is NOT in tensorflow.keras format, it's in (standalone) Keras format. The layer definitions in tf.keras differ from those in Keras and are not compatible, which is why from_keras_model_file fails on this .h5 file. If you want to create a .tflite file, search for "convert Keras to tflite" instead. The solution is to load the model with standalone Keras, get its TensorFlow session, and then convert the model from that session.
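For reference, here is a minimal sketch of that approach with the TF 1.x API, assuming the .h5 file loads cleanly with the standalone keras package and that model.inputs/model.outputs are the tensors you want to export (file names are just the ones from above):

```python
import keras              # standalone Keras, not tensorflow.keras
import tensorflow as tf   # TF 1.x API; on TF 2.x use tf.compat.v1.lite instead

# Load the .h5 file with standalone Keras so its layer config is understood
model = keras.models.load_model("model.h5")

# Get the TensorFlow session that Keras created behind the scenes
sess = keras.backend.get_session()

# Convert the graph held by that session to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_session(
    sess, model.inputs, model.outputs)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```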
