NotImplementedError: Layers with arguments in `__init__` must override `get_config`


It's not a bug, it's a feature.

This error lets you know that TF can't save your model, because it won't be able to load it.

Specifically, it won't be able to reinstantiate your custom Layer classes: encoder and decoder.

To solve this, override their get_config methods to include the new arguments you've added to `__init__`.

A layer config is a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.
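To see the pattern in isolation, here is a minimal plain-Python sketch of the config round trip (a hypothetical MiniLayer class, no TensorFlow required; Keras layers follow the same get_config / from_config contract):

```python
class MiniLayer:
    def __init__(self, units, dropout):
        self.units = units
        self.dropout = dropout

    def get_config(self):
        # Every argument __init__ needs must appear in the config dict.
        return {"units": self.units, "dropout": self.dropout}

    @classmethod
    def from_config(cls, config):
        # Mirrors Keras' Layer.from_config: rebuild the layer from its config.
        return cls(**config)

layer = MiniLayer(units=64, dropout=0.1)
clone = MiniLayer.from_config(layer.get_config())
print(clone.units, clone.dropout)  # 64 0.1
```

The clone has the same configuration as the original, which is exactly what loading a saved model has to do for each custom layer.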


For example, if your encoder class looks something like this:

class encoder(tf.keras.layers.Layer):

    def __init__(
        self,
        vocab_size, num_layers, units, d_model, num_heads, dropout,
        **kwargs,
    ):
        super().__init__(**kwargs)
        self.vocab_size = vocab_size
        self.num_layers = num_layers
        self.units = units
        self.d_model = d_model
        self.num_heads = num_heads
        self.dropout = dropout

    # Other methods etc.

then you only need to override this method:

    def get_config(self):
        config = super().get_config().copy()
        config.update({
            'vocab_size': self.vocab_size,
            'num_layers': self.num_layers,
            'units': self.units,
            'd_model': self.d_model,
            'num_heads': self.num_heads,
            'dropout': self.dropout,
        })
        return config

When TF sees this (in both classes), you will be able to save the model, because on loading, TF will be able to reinstantiate the same layers from their configs.
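Since the config is a plain dictionary, it must be serializable. A quick sanity check that your get_config output will survive a save/load cycle is a JSON round trip (the config values below are illustrative, matching the encoder example):

```python
import json

# A config like the one returned by the overridden get_config above.
config = {
    "vocab_size": 8192, "num_layers": 2, "units": 512,
    "d_model": 128, "num_heads": 4, "dropout": 0.1,
}

# If json.dumps raises TypeError, the config contains non-serializable
# values (e.g. raw tensors) and saving the model will fail.
restored = json.loads(json.dumps(config))
assert restored == config
```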


Layer.from_config's source code may give a better sense of how it works:

@classmethod
def from_config(cls, config):
    return cls(**config)
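This is also why the error exists in the first place: if get_config omits an argument that `__init__` requires, `cls(**config)` cannot rebuild the layer. A plain-Python sketch with a hypothetical BadLayer class:

```python
class BadLayer:
    def __init__(self, units):
        self.units = units

    def get_config(self):
        return {}  # forgot to include 'units'

try:
    # Rebuild from config, the way loading a saved model would.
    BadLayer(**BadLayer(units=32).get_config())
except TypeError as e:
    print("reinstantiation failed:", e)
```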

NotImplementedError: Layer ModuleWrapper has arguments in `__init__` and therefore must override `get_config`

The ModuleWrapper layer name appears because you are mixing the standalone keras and tensorflow.keras libraries. Use just one of them (then Dense layers will get the name dense, and you won't need to implement get_config).

Change this line:

# from keras.layers import Dense           # comment this
from tensorflow.keras.layers import Dense  # add this

Also, note that the shapes of your dataset will cause an error, since they are incompatible with the model you have defined; you should remove the last axis from your data. Add these two lines before model.fit():

train_data = tf.squeeze(train_data)
test_data = tf.squeeze(test_data)

These lines change the shapes from (None, 1024, 1) to (None, 1024). Then you can feed the data to your model without any error.
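The same shape change can be seen with NumPy's squeeze, which behaves like tf.squeeze for size-1 axes (stand-in shapes with a batch size of 8 for illustration):

```python
import numpy as np

# Stand-in batch with the problematic trailing axis: (batch, 1024, 1).
train_data = np.zeros((8, 1024, 1))
squeezed = np.squeeze(train_data)  # drops all size-1 axes
print(train_data.shape, "->", squeezed.shape)  # (8, 1024, 1) -> (8, 1024)
```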

NotImplementedError: Layer attention has arguments in `__init__` and therefore must override `get_config`

You need a config method like this:

def get_config(self):
    config = super().get_config().copy()
    config.update({
        'return_sequences': self.return_sequences
    })
    return config

All the info needed was in the other post that you linked.


