Hi, I'm trying to do transfer learning from the provided English net to a custom data set. The README mentions that even languages with a different number of characters than English can be used by supplying the allowed_characters_for_loaded_model parameter; however, I'm getting an AssertionError upon execution:
```
Traceback (most recent call last):
  File "test.py", line 4, in <module>
    english = Configuration.english().load_model(load_name=Configuration.english_baseline[0], load_epoch=Configuration.english_baseline[1], allowed_characters_for_loaded_model=new_corpus.frequent_characters)
  File "/usr/local/lib/python3.5/dist-packages/speechless/configuration.py", line 184, in load_model
    reinitialize_trainable_loaded_layers=reinitialize_trainable_loaded_layers)
  File "/usr/local/lib/python3.5/dist-packages/speechless/net.py", line 182, in __init__
    loaded_first_layers_count=frozen_layer_count if reinitialize_trainable_loaded_layers else None)
  File "/usr/local/lib/python3.5/dist-packages/speechless/net.py", line 231, in load_weights
    asg_transition_probabilities=self.asg_transition_probabilities)
  File "/usr/local/lib/python3.5/dist-packages/speechless/net.py", line 182, in __init__
    loaded_first_layers_count=frozen_layer_count if reinitialize_trainable_loaded_layers else None)
  File "/usr/local/lib/python3.5/dist-packages/speechless/net.py", line 212, in load_weights
    self.predictive_net.load_weights(str(load_model_from_directory / self.model_file_name(load_epoch)))
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/network.py", line 1161, in load_weights
    f, self.layers, reshape=reshape)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/saving.py", line 915, in load_weights_from_hdf5_group
    reshape=reshape)
  File "/usr/local/lib/python3.5/dist-packages/keras/engine/saving.py", line 571, in preprocess_weights_for_loading
    assert shape[0] == layer.filters and shape[2:] == (layer.kernel_size[0], 1)
AssertionError
```
I'm starting the training with this script:
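Only the load_model call at line 4 of test.py is visible in the traceback, so here is a minimal sketch of the relevant part (the corpus setup is omitted; `new_corpus` stands for my custom corpus object, whose `frequent_characters` attribute holds the character set of the new language):

```python
from speechless.configuration import Configuration

# `new_corpus` is my custom corpus object, created earlier in the script
# (omitted here); its `frequent_characters` attribute provides the character
# set of the new language, which differs from the English one.
english = Configuration.english().load_model(
    load_name=Configuration.english_baseline[0],
    load_epoch=Configuration.english_baseline[1],
    allowed_characters_for_loaded_model=new_corpus.frequent_characters)
```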
Training from scratch has no problems. What am I doing wrong?