
0 votes

Dear all,

I'm facing an issue when trying to implement my own Keras layer. Training the model works, but it then crashes when loading the trained model. The load_model() call leads to the following error message:

File "/home/dataiku/dss_data/code-envs/python/google_api/lib/python2.7/site-packages/keras/utils/generic_utils.py", line 134, in deserialize_keras_object
    ': ' + class_name)

After some Google investigation, it seems the load_model() function should be given a second, optional argument which is a dictionary of the custom objects, e.g. custom_objects={'LayerCustom': LayerCustom}.
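For reference, here is a minimal sketch of what such a call would look like (the file path is only an example, and LayerCustom stands for my custom layer class):

from keras.models import load_model
from my_layers import LayerCustom  # example module holding the custom layer

# Map the class name stored in the saved model file to the actual Python class,
# so that Keras knows how to deserialize the custom layer.
model = load_model('model.h5', custom_objects={'LayerCustom': LayerCustom})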

Unfortunately, the load_model() function is called by Dataiku without this optional argument, as can be seen in the log file:

model = load_model(osp.join(run_folder, constants.KERAS_MODEL_FILENAME))

Have any of you already implemented your own Keras layers? If so, was it successful, and did you face this problem?

Thanks for your help,

Regards,


2 Answers

0 votes
Best answer

Hello,

We have added the possibility to use custom objects in the visual Deep Learning part of Visual Machine Learning in DSS, starting with release 5.1.3.

To use it, you need to register your custom object using the custom object handler, and then DSS will handle the serialization/deserialization for you.

For example, you can write a "MyDense" custom layer and put it in a "my_layers.py" file inside the libraries of the project in which you're going to build your DL model. It will look like:

# my_layers.py
from keras import backend as K
from keras.engine.topology import Layer

class MyDense(Layer):

    def __init__(self, output_dim=32, **kwargs):
        self.output_dim = output_dim
        super(MyDense, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create the layer's trainable weight matrix
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[1], self.output_dim),
                                      initializer='uniform',
                                      trainable=True)
        super(MyDense, self).build(input_shape)

    def get_config(self):
        # Describe the layer as a dict so that Keras can serialize and reload it
        config = {'name': self.name,
                  'trainable': self.trainable,
                  'output_dim': self.output_dim}
        if hasattr(self, 'batch_input_shape'):
            config['batch_input_shape'] = self.batch_input_shape
        if hasattr(self, 'dtype'):
            config['dtype'] = self.dtype
        return config

    def call(self, x):
        # Forward pass: multiply the input by the kernel
        y = K.dot(x, self.kernel)
        return y

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.output_dim)

Note that in this example you must implement a "get_config" method that converts your object into a dict; otherwise, Keras will not be able to figure out how to serialize it.
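As a quick sanity check, here is a minimal sketch (assuming the my_layers.py above is importable) showing that the layer can be rebuilt from its config, which is essentially what Keras does during deserialization:

from my_layers import MyDense

layer = MyDense(output_dim=64)
config = layer.get_config()

# By default, Layer.from_config(config) simply calls MyDense(**config)
restored = MyDense.from_config(config)
assert restored.output_dim == 64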

Then, in the "Architecture" tab of your DL algorithm, you can use the "MyDense" layer, but you need to register it so that DSS can save it later. The architecture code will look like:

from keras.layers import Input, Dense
from keras.models import Model

from my_layers import MyDense
import dataiku.doctor.deep_learning.custom_objects_handler as coh
coh.register_object("MyDense", MyDense)


def build_model(input_shapes, n_classes=None):
    inputs = Input(shape=input_shapes['main'], name='main')
    x = Dense(128, activation='relu')(inputs)
    x = Dense(128, activation='relu')(x)
    x = MyDense(64)(x)
    predictions = Dense(n_classes, activation="softmax")(x)
    model = Model(inputs=inputs, outputs=predictions)
    return model

def compile_model(model):            
    model.compile(optimizer='rmsprop',
                  loss='binary_crossentropy')
    return model

 

Don't hesitate to ask if you have further questions.

Best regards,

0 votes
Hello,

It is currently not possible to define custom_objects in the Deep Learning section of visual ML of DSS.

We will work on adding this functionality for a future release of DSS.

To make it work, you would need to use a Python recipe, in which you would handle the preprocessing and the training yourself.
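For illustration, a minimal sketch of such a recipe (the dataset, column and file names are only examples, and LayerCustom stands for your custom layer class) could look like:

# Python recipe: preprocessing and training are handled manually
import dataiku
from keras.models import Model, load_model
from keras.layers import Input, Dense

from my_layers import LayerCustom  # your custom layer, e.g. in the project libraries

# Load and preprocess the data yourself (names are examples)
df = dataiku.Dataset("train_data").get_dataframe()
X = df.drop("target", axis=1).values
y = df["target"].values

# Build and train a model that uses the custom layer
inputs = Input(shape=(X.shape[1],))
x = LayerCustom(64)(inputs)
outputs = Dense(1, activation="sigmoid")(x)
model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="rmsprop", loss="binary_crossentropy")
model.fit(X, y, epochs=10)

# Save the model, then reload it by passing the custom_objects mapping explicitly
model.save("model.h5")
reloaded = load_model("model.h5", custom_objects={"LayerCustom": LayerCustom})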

Regards,

Nicolas Servel
Thanks Nicolas,
I successfully implemented it using a Python recipe as you mentioned!
Good to know that it will be added in a future release of DSS.