
Bug with saving the model #6442

Closed
hamzamerzic opened this issue Apr 29, 2017 · 3 comments

hamzamerzic commented Apr 29, 2017

I want to rescale the outputs of my model, so I do the following.

from keras import backend as K
from keras.layers import Lambda
from keras.models import Model

state_in = ...
h = ...
scale_vector = K.variable(value=np_scale_vector)  # np_scale_vector is a numpy array
out = Lambda(lambda x: x * scale_vector, output_shape=(n_out,))(h)

model = Model(inputs=[state_in], outputs=[out])
model.save(filename)

I get:

  File "/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py", line 2434, in save
    save_model(self, filepath, overwrite, include_optimizer)
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 102, in save_model
    'config': model.get_config()
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py", line 2311, in get_config
    return copy.deepcopy(config)
  File "/usr/lib/python2.7/copy.py", line 163, in deepcopy
    y = copier(x, memo)
  File "/usr/lib/python2.7/copy.py", line 182, in deepcopy
    rv = reductor(2)
TypeError: can't pickle NotImplementedType objects

If, on the other hand, I do

out = Lambda(lambda x: x * np_scale_vector, output_shape=(n_out,))(h)

I get:

  File "/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py", line 2434, in save
    save_model(self, filepath, overwrite, include_optimizer)
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 103, in save_model
    }, default=get_json_type).encode('utf8')
  File "/usr/lib/python2.7/json/__init__.py", line 251, in dumps
    sort_keys=sort_keys, **kw).encode(obj)
  File "/usr/lib/python2.7/json/encoder.py", line 207, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/lib/python2.7/json/encoder.py", line 270, in iterencode
    return _iterencode(o, 0)
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 77, in get_json_type
    return obj.item()
ValueError: can only convert an array of size 1 to a Python scalar

I want to emphasize that the model compiles and runs correctly, and I can see that I am getting proper outputs, but the save fails.

@fchollet
Collaborator

It's likely that your issue is with the type of np_scale_vector.

In general, saving models that have Lambda layers can be error-prone, because the serialization of arbitrary code is brittle. It works for most expressions, but it can't be guaranteed to work in all situations.

I recommend that you write a custom layer with a get_config method (and potentially a from_config class method). It's safer.
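For reference, here is a minimal sketch of the kind of custom layer this suggests, assuming the Keras 2 API from around this time. The name ScaleLayer and its fixed-scale behaviour are illustrative, not part of Keras itself:

import numpy as np
from keras import backend as K
from keras.engine.topology import Layer

class ScaleLayer(Layer):
    """Multiplies its input element-wise by a fixed scale vector."""

    def __init__(self, scale, **kwargs):
        # Keep a numpy copy for computation; get_config will store a plain list.
        self.scale = np.asarray(scale, dtype='float32')
        super(ScaleLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Non-trainable backend variable holding the scale values.
        self.scale_tensor = K.variable(self.scale)
        super(ScaleLayer, self).build(input_shape)

    def call(self, x):
        return x * self.scale_tensor

    def compute_output_shape(self, input_shape):
        return input_shape

    def get_config(self):
        # Only JSON-serializable values go into the config, so saving works.
        # The default from_config (cls(**config)) is enough here.
        config = {'scale': self.scale.tolist()}
        base_config = super(ScaleLayer, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))

When loading such a model, the class would be passed through custom_objects, e.g. load_model(filename, custom_objects={'ScaleLayer': ScaleLayer}).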

@farizrahman4u
Contributor

In order to make it serializable, make sure that the numpy array is defined within your function. Any information required to create the numpy array can be passed through the arguments argument (see std and avg in the example below).

import numpy as np

def func(x, std, avg):
    shape = (n_out,)  # scale factor shape
    scale_factor = np.random.normal(size=shape) * std + avg
    return x * scale_factor


out = Lambda(func, output_shape=(n_out,), arguments={'std': 0.5, 'avg': 0.4})(h)
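Assuming the state_in, h, n_out, and filename definitions from the original report, a save/load round trip with this pattern would look roughly like the sketch below; passing the named module-level function through custom_objects is the usual way to let the Lambda layer be rebuilt on load:

from keras.models import Model, load_model

model = Model(inputs=[state_in], outputs=[out])
# The arguments dict ({'std': 0.5, 'avg': 0.4}) contains only plain JSON values,
# so serializing the model config succeeds.
model.save(filename)

# On load, supply the function so the Lambda layer can be reconstructed.
restored = load_model(filename, custom_objects={'func': func})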

@hamzamerzic
Author

Thanks a lot for the answer. Your example works like a charm, but unfortunately it is not quite what I intended to do. It did, however, help me figure out what the issue was! What I tried to do is the following:

def func(x, np_scale_array):
    # print np_scale_array.shape gives (17,)
    return x * np_scale_array

np_scale_array = np.array(...)  # shape (17,)
out = Lambda(func, output_shape=(n_out,), arguments={'np_scale_array': np_scale_array})(h)

This still gives me the error:
ValueError: can only convert an array of size 1 to a Python scalar

But when I converted the argument to a Python list and converted it back inside the function, it worked like a charm:

def func(x, scale_array):
    # scale_array arrives as a Python list; convert it back to a numpy array of shape (17,)
    np_scale_array = np.array(scale_array)
    return x * np_scale_array

np_scale_array = np.array(...)  # shape (17,)
out = Lambda(func, output_shape=(n_out,), arguments={'scale_array': np_scale_array.tolist()})(h)

This works!
But I still think this might be a bug, since the serialization of Lambda arguments assumes that any numpy value is a scalar.
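For context, the traceback points at get_json_type in keras/models.py, which calls obj.item() on any numpy value, and that only works for size-1 arrays. A sketch of the kind of handling that would avoid the ValueError (illustrative only, not the actual Keras code) is:

import numpy as np

def get_json_type(obj):
    """JSON fallback for model configs: handle numpy values explicitly (sketch)."""
    if type(obj).__module__ == np.__name__:
        if isinstance(obj, np.ndarray):
            return obj.tolist()  # arrays become nested Python lists
        return obj.item()        # numpy scalars become Python scalars
    raise TypeError('Not JSON serializable: ' + repr(obj))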
