How to export Keras .h5 to tensorflow .pb?
Keras does not itself include any means of exporting a TensorFlow graph as a protocol buffers file, but you can do it using regular TensorFlow utilities. There is a blog post explaining how to do it using the utility script freeze_graph.py
included in TensorFlow, which is the "typical" way it is done.
However, I personally find it a nuisance to have to make a checkpoint and then run an external script to obtain a model, and instead prefer to do it from my own Python code, so I use a function like this:
import tensorflow as tf

def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
    """
    Freezes the state of a session into a pruned computation graph.

    Creates a new computation graph where variable nodes are replaced by
    constants taking their current value in the session. The new graph will be
    pruned so subgraphs that are not necessary to compute the requested
    outputs are removed.
    @param session The TensorFlow session to be frozen.
    @param keep_var_names A list of variable names that should not be frozen,
                          or None to freeze all the variables in the graph.
    @param output_names Names of the relevant graph outputs.
    @param clear_devices Remove the device directives from the graph for better portability.
    @return The frozen graph definition.
    """
    graph = session.graph
    with graph.as_default():
        freeze_var_names = list(set(v.op.name for v in tf.global_variables())
                                .difference(keep_var_names or []))
        output_names = output_names or []
        # Keep the variables' ops as outputs too, so they survive pruning
        output_names += [v.op.name for v in tf.global_variables()]
        input_graph_def = graph.as_graph_def()
        if clear_devices:
            for node in input_graph_def.node:
                node.device = ""
        frozen_graph = tf.graph_util.convert_variables_to_constants(
            session, input_graph_def, output_names, freeze_var_names)
        return frozen_graph
This is inspired by the implementation of freeze_graph.py, and the parameters are similar to the script's. session is the TensorFlow session object. keep_var_names is only needed if you want to keep some variables not frozen (e.g. for stateful models), so generally not. output_names is a list with the names of the operations that produce the outputs you want. clear_devices just removes any device directives to make the graph more portable. So, for a typical Keras model with one output, you would do something like:
from keras import backend as K
# Create, compile and train model...
frozen_graph = freeze_session(K.get_session(),
                              output_names=[out.op.name for out in model.outputs])
Then you can write the graph to a file as usual with tf.train.write_graph:
tf.train.write_graph(frozen_graph, "some_directory", "my_model.pb", as_text=False)
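To use the frozen graph later, you can read the .pb file back and import it with the TF1-style graph API. A minimal sketch, assuming the file written above; the "x:0" and "y:0" tensor names in the commented usage are hypothetical placeholders for your model's actual input and output tensor names:

```python
import tensorflow as tf

tf1 = tf.compat.v1  # TF1-style graph API; on TensorFlow 1.x use tf directly

def load_frozen_graph(pb_path):
    """Read a frozen GraphDef from disk and import it into a new graph."""
    graph_def = tf1.GraphDef()
    with tf1.gfile.GFile(pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())
    with tf1.Graph().as_default() as graph:
        # name="" avoids the default "import/" prefix on tensor names
        tf1.import_graph_def(graph_def, name="")
    return graph

# Hypothetical usage with the file written by tf.train.write_graph:
# graph = load_frozen_graph("some_directory/my_model.pb")
# with tf1.Session(graph=graph) as sess:
#     result = sess.run(graph.get_tensor_by_name("y:0"),
#                       feed_dict={graph.get_tensor_by_name("x:0"): data})
```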
TensorFlow 2.0: Convert a Keras model to a .pb file
Look at TensorFlow's tutorial on saving and loading models. You can use model.save("path"), and if you do not include an extension, the model will be saved in the SavedModel format.
import tensorflow as tf
pre_model = tf.keras.models.load_model("final_model.h5")
pre_model.save("saved_model")
TensorFlow (.pb) format to Keras (.h5)
In the latest TensorFlow version (2.2), when we save the model using tf.keras.models.save_model, the model is saved not as a single .pb file but as a folder comprising a variables folder and an assets folder in addition to the saved_model.pb file.
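A small helper can confirm that a directory has this SavedModel layout (the function name here is hypothetical, for illustration only):

```python
import os

def saved_model_contents(path):
    """Return which of the standard SavedModel entries exist under path."""
    expected = ["saved_model.pb", "variables", "assets"]
    return [name for name in expected if os.path.exists(os.path.join(path, name))]

# For a model exported to a folder named "Model", this would typically
# return ['saved_model.pb', 'variables', 'assets'].
```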
For example, if the model is saved with the name "Model", we have to load it using the name of the folder, "Model", instead of saved_model.pb, as shown below:
loaded_model = tf.keras.models.load_model('Model')
instead of
loaded_model = tf.keras.models.load_model('saved_model.pb')
One more change you can make is to replace tf.keras.models.save_keras_model with tf.keras.models.save_model.
Complete working code to convert a model from TensorFlow SavedModel format (.pb) to Keras H5 format (.h5) is shown below:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing import image
New_Model = tf.keras.models.load_model('Dogs_Vs_Cats_Model') # Loading the Tensorflow Saved Model (PB)
print(New_Model.summary())
The output of New_Model.summary() is:
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 148, 148, 32) 896
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 74, 74, 32) 0
_________________________________________________________________
conv2d_1 (Conv2D) (None, 72, 72, 64) 18496
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 36, 36, 64) 0
_________________________________________________________________
conv2d_2 (Conv2D) (None, 34, 34, 128) 73856
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 17, 17, 128) 0
_________________________________________________________________
conv2d_3 (Conv2D) (None, 15, 15, 128) 147584
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 7, 7, 128) 0
_________________________________________________________________
flatten (Flatten) (None, 6272) 0
_________________________________________________________________
dense (Dense) (None, 512) 3211776
_________________________________________________________________
dense_1 (Dense) (None, 1) 513
=================================================================
Total params: 3,453,121
Trainable params: 3,453,121
Non-trainable params: 0
_________________________________________________________________
None
Continuing the code:
# Saving the Model in H5 Format and Loading it (to check if it is same as PB Format)
tf.keras.models.save_model(New_Model, 'New_Model.h5') # Saving the Model in H5 Format
loaded_model_from_h5 = tf.keras.models.load_model('New_Model.h5') # Loading the H5 Saved Model
print(loaded_model_from_h5.summary())
The output of print(loaded_model_from_h5.summary()) is shown below:
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 148, 148, 32) 896
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 74, 74, 32) 0
_________________________________________________________________
conv2d_1 (Conv2D) (None, 72, 72, 64) 18496
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 36, 36, 64) 0
_________________________________________________________________
conv2d_2 (Conv2D) (None, 34, 34, 128) 73856
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 17, 17, 128) 0
_________________________________________________________________
conv2d_3 (Conv2D) (None, 15, 15, 128) 147584
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 7, 7, 128) 0
_________________________________________________________________
flatten (Flatten) (None, 6272) 0
_________________________________________________________________
dense (Dense) (None, 512) 3211776
_________________________________________________________________
dense_1 (Dense) (None, 1) 513
=================================================================
Total params: 3,453,121
Trainable params: 3,453,121
Non-trainable params: 0
_________________________________________________________________
As can be seen from the summaries above, both models are the same.
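To verify beyond the summaries, you can also compare the actual weights of both models. get_weights() returns a list of NumPy arrays, so the comparison logic can be sketched as below (plain arrays stand in for real layer weights here; with the models above you would pass New_Model.get_weights() and loaded_model_from_h5.get_weights()):

```python
import numpy as np

def weights_equal(weights_a, weights_b):
    """True if two lists of weight arrays match exactly in shape and value."""
    if len(weights_a) != len(weights_b):
        return False
    return all(np.array_equal(a, b) for a, b in zip(weights_a, weights_b))

# Stand-in example with dummy weight arrays:
w1 = [np.ones((3, 2)), np.zeros(2)]
w2 = [np.ones((3, 2)), np.zeros(2)]
print(weights_equal(w1, w2))  # True
```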