Save and Load Keras Deep Learning Model in Python
In this tutorial, we will learn how to save and load a Keras deep learning model in Python.
Once we train a deep learning model, the work done during training becomes worthless if we cannot save it, because training is a costly task. It is not practical to retrain the model every time we run the program, so Keras tackles this problem by letting us save the model's structure along with its weights.
Methods of saving and loading a model in Keras
Keras stores the weights in the HDF5 format, while the JSON or YAML format preserves the model structure. In this tutorial, we use the iris flower dataset to perform a classification task.
This tutorial shows how to save and load both the weights and the structure, using the JSON format as well as the YAML format.
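A quick note on versions: the listings below assume standalone Keras 2.3.x with the TensorFlow backend (the keras_version field in the saved files confirms this). If you are on a newer TensorFlow release where Keras ships as tensorflow.keras, the JSON and weight-saving calls work the same way, but to_yaml() and model_from_yaml() have been removed there, so the YAML part needs the older Keras. A minimal import sketch for the tf.keras case, assuming TensorFlow 2.x is installed:

# Minimal sketch, assuming TensorFlow 2.x where Keras is bundled as tf.keras.
# Note: to_yaml()/model_from_yaml() are not available in recent tf.keras releases.
from tensorflow.keras.models import Sequential, model_from_json
from tensorflow.keras.layers import Dense
from tensorflow.keras.utils import to_categorical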
Implementation in Python
Below is the basic model, before saving in either format. The model is trained with Keras, using TensorFlow as the backend.
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder
import numpy as np
import pandas as pd
import tensorflow as tf
from keras.models import Sequential, model_from_json, model_from_yaml
from keras.layers import Dense
from keras.utils import np_utils

# Load the iris flower dataset; header=None plus the [1:] slice below
# skips the header row of the CSV file.
dataframe = pd.read_csv("iris_flower.csv", header=None)
dataset = dataframe.values
X = dataset[1:, 0:4].astype(float)   # four numeric features
Y = dataset[1:, 4]                   # species label

seed = 7
np.random.seed(seed)

# Encode the string labels as integers, then as one-hot vectors
encoder = LabelEncoder()
encoder.fit(Y)
encoded = encoder.transform(Y)
dummy_y = np_utils.to_categorical(encoded)

Xtrain, Xtest, Ytrain, Ytest = train_test_split(
    X, dummy_y, stratify=dummy_y, random_state=7, test_size=0.3)
cvscores = []

def create_model():
    # 4 inputs -> two hidden layers of 8 units -> 3 output classes
    model = Sequential()
    model.add(Dense(8, input_dim=4, kernel_initializer="normal", activation="relu"))
    model.add(Dense(8, kernel_initializer="normal", activation="relu"))
    model.add(Dense(3, kernel_initializer="normal", activation="softmax"))
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = create_model()
model.fit(Xtrain, Ytrain, epochs=100, batch_size=5, verbose=0)
score = model.evaluate(Xtest, Ytest, verbose=0)
cvscores.append(score[1] * 100)
print("Accuracy of Model is =", np.mean(cvscores))
Output:
Accuracy of Model is = 97.77777791023254
Saving and Loading Using JSON
The to_json() function converts the model structure to JSON format, and json_file.write() writes that string to a file. model_from_json() loads the structure back into a Keras model. save_weights() and load_weights() respectively save and load the weights to and from an HDF5 file. The code for saving and loading with JSON is given below:
print("Accuracy before saving to disk =",np.mean(cvscores)) model_json = model.to_json() with open("model.json", "w") as json_file: json_file.write(model_json) # serializing weights model.save_weights("model.h5") print("Saved model to disk") # loading json json_file = open('model.json', 'r') loaded_model_json = json_file.read() json_file.close() loaded_model = model_from_json(loaded_model_json) # loading weights loaded_model.load_weights("model.h5") print("Loaded model from disk") print("Accuracy after loading from disk =",np.mean(cvscores))
Output:
Accuracy before saving to disk = 97.77777791023254
Saved model to disk
Loaded model from disk
Accuracy after loading from disk = 97.77777791023254
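Note that the accuracy printed after loading is simply np.mean(cvscores) again, so it does not by itself prove that the reloaded model works. To re-evaluate the loaded model directly, it first needs to be compiled, since the JSON and HDF5 files carry no optimizer or loss information. A small verification sketch, assuming Xtest and Ytest from the training script are still in scope:

# Compile the reloaded model before evaluating it; the JSON/HDF5 pair stores
# only the architecture and weights, not the optimizer or loss.
loaded_model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
loss, acc = loaded_model.evaluate(Xtest, Ytest, verbose=0)
print("Re-evaluated accuracy of loaded model = %.2f%%" % (acc * 100))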
The formatted JSON code from the file is:
{ "class_name":"Sequential", "config":{ "name":"sequential_2", "layers":[ { "class_name":"Dense", "config":{ "name":"dense_4", "trainable":true, "batch_input_shape":[ null, 8 ], "dtype":"float32", "units":12, "activation":"relu", "use_bias":true, "kernel_initializer":{ "class_name":"VarianceScaling", "config":{ "scale":1.0, "mode":"fan_avg", "distribution":"uniform", "seed":null } }, "bias_initializer":{ "class_name":"Zeros", "config":{ } }, "kernel_regularizer":null, "bias_regularizer":null, "activity_regularizer":null, "kernel_constraint":null, "bias_constraint":null } }, { "class_name":"Dense", "config":{ "name":"dense_5", "trainable":true, "dtype":"float32", "units":8, "activation":"relu", "use_bias":true, "kernel_initializer":{ "class_name":"VarianceScaling", "config":{ "scale":1.0, "mode":"fan_avg", "distribution":"uniform", "seed":null } }, "bias_initializer":{ "class_name":"Zeros", "config":{ } }, "kernel_regularizer":null, "bias_regularizer":null, "activity_regularizer":null, "kernel_constraint":null, "bias_constraint":null } }, { "class_name":"Dense", "config":{ "name":"dense_6", "trainable":true, "dtype":"float32", "units":1, "activation":"sigmoid", "use_bias":true, "kernel_initializer":{ "class_name":"VarianceScaling", "config":{ "scale":1.0, "mode":"fan_avg", "distribution":"uniform", "seed":null } }, "bias_initializer":{ "class_name":"Zeros", "config":{ } }, "kernel_regularizer":null, "bias_regularizer":null, "activity_regularizer":null, "kernel_constraint":null, "bias_constraint":null } } ] }, "keras_version":"2.3.1", "backend":"tensorflow" }
Saving and Loading Using YAML
The model structure is saved to YAML with the model.to_yaml() function, and the YAML file is loaded back into a model with model_from_yaml(). The weights are again stored in HDF5 via save_weights() and load_weights(). The code for saving and loading with YAML is as follows:
print("Accuracy before saving to disk =",np.mean(cvscores)) model_yaml = model.to_yaml() with open("model.yaml", "w") as yaml_file: yaml_file.write(model_yaml) # serialize weights to HDF5 model.save_weights("model.h5") print("Saved model to disk") # load YAML and create model yaml_file = open('model.yaml', 'r') loaded_model_yaml = yaml_file.read() yaml_file.close() loaded_model = model_from_yaml(loaded_model_yaml) # load weights into new model loaded_model.load_weights("model.h5") print("Loaded model from disk") print("Accuracy after loading from disk =",np.mean(cvscores))
Output:
Accuracy before saving to disk = 97.77777791023254
Saved model to disk
Loaded model from disk
Accuracy after loading from disk = 97.77777791023254
The YAML file is:
backend: tensorflow
class_name: Sequential
config:
  layers:
  - class_name: Dense
    config:
      activation: relu
      activity_regularizer: null
      batch_input_shape: !!python/tuple [null, 4]
      bias_constraint: null
      bias_initializer:
        class_name: Zeros
        config: {}
      bias_regularizer: null
      dtype: float32
      kernel_constraint: null
      kernel_initializer:
        class_name: RandomNormal
        config: {mean: 0.0, seed: null, stddev: 0.05}
      kernel_regularizer: null
      name: dense_16
      trainable: true
      units: 8
      use_bias: true
  - class_name: Dense
    config:
      activation: relu
      activity_regularizer: null
      bias_constraint: null
      bias_initializer:
        class_name: Zeros
        config: {}
      bias_regularizer: null
      dtype: float32
      kernel_constraint: null
      kernel_initializer:
        class_name: RandomNormal
        config: {mean: 0.0, seed: null, stddev: 0.05}
      kernel_regularizer: null
      name: dense_17
      trainable: true
      units: 8
      use_bias: true
  - class_name: Dense
    config:
      activation: softmax
      activity_regularizer: null
      bias_constraint: null
      bias_initializer:
        class_name: Zeros
        config: {}
      bias_regularizer: null
      dtype: float32
      kernel_constraint: null
      kernel_initializer:
        class_name: RandomNormal
        config: {mean: 0.0, seed: null, stddev: 0.05}
      kernel_regularizer: null
      name: dense_18
      trainable: true
      units: 3
      use_bias: true
  name: sequential_6
keras_version: 2.3.1
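Once the model has been loaded back from either format, it can be used for inference like any other Keras model. A small usage sketch, assuming Xtest and the fitted LabelEncoder encoder from the training script are still in scope:

# Predict class probabilities for a few test samples and map the argmax
# back to the original flower names via the fitted LabelEncoder.
probs = loaded_model.predict(Xtest[:5])
print(encoder.inverse_transform(np.argmax(probs, axis=1)))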
Summary
We can now save the model structure to a JSON or YAML file and load it back into a model. We have also seen how to serialize the weights into the HDF5 format while saving the network structure as JSON or YAML.
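As a side note, Keras can also bundle the architecture, weights and even the optimizer state into a single file with model.save() and load_model(). This is not what the tutorial above uses, but it is a handy alternative when you do not need the structure as readable JSON or YAML. A minimal sketch, assuming the model variable from the training script (the file name here is arbitrary):

from keras.models import load_model

# Save architecture + weights + optimizer state into one HDF5 file, then reload it.
model.save("full_model.h5")
restored = load_model("full_model.h5")
restored.summary()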