Error while converting #13
Comments
It's hard to help without knowing your Keras model definition.
Here is the model I used:

model = Sequential()
What are nb_classes and the input shape? I will try to reproduce this error.
Thank you for your help.
Could it be caused by the tool versions?
I had no problem converting your model with Keras 2.1.5. You should try it.
I have tried mine with 2.1.5 and still got the same error.
import sys
import keras2caffe
from keras.models import Sequential
# these layer imports were missing from the original snippet
from keras.layers import Convolution2D, Flatten

nb_classes = 2
model = Sequential()
model.add(Convolution2D(128, (3, 3), padding='same'))
model.add(Flatten())
model.compile(loss='binary_crossentropy', optimizer='adadelta', metrics=['accuracy'])

keras2caffe.convert(model, 'deploy.prototxt', 'weights.caffemodel')
I have the same problem @younkyul, and I found that if the MaxPooling2D layer contains padding='same', the model converts fine, but the conversion fails without padding='same'.
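The padding='same' observation is consistent with a known difference in how the two frameworks round pooling output sizes: Keras with padding='valid' floors the pooled dimension, while Caffe rounds it up, so an odd-sized feature map comes out one pixel larger in Caffe. A minimal, library-independent arithmetic sketch (the 15x15 input size is just an example, not taken from the model above):

```python
import math

def keras_valid_pool(size, pool, stride):
    # Keras with padding='valid' floors the pooled output size
    return math.floor((size - pool) / stride) + 1

def keras_same_pool(size, stride):
    # Keras with padding='same' rounds it up
    return math.ceil(size / stride)

def caffe_pool(size, pool, stride, pad=0):
    # Caffe always rounds the pooled output size up
    return math.ceil((size + 2 * pad - pool) / stride) + 1

# An odd-sized 15x15 feature map with 2x2 pooling, stride 2:
print(keras_valid_pool(15, 2, 2))  # 7 -> diverges from Caffe
print(keras_same_pool(15, 2))      # 8 -> matches Caffe
print(caffe_pool(15, 2, 2))        # 8
```

This would explain why adding padding='same' makes the conversion succeed: it forces Keras to round up the same way Caffe does. (Assumption: the model pools over at least one odd-sized feature map; on even sizes both roundings agree.)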
I got an error while converting as follows:
....
I0107 15:12:23.722321 7166 net.cpp:242] This network produces output dense_2
I0107 15:12:23.722332 7166 net.cpp:255] Network initialization done.
Traceback (most recent call last):
File "/home/sr5/younkyu.lee/keras2caffe/keras2caffe-master/convert_youn.py", line 91, in
keras2caffe.convert(keras_model, 'youn.prototxt', 'youn.caffemodel')
File "/home/sr5/younkyu.lee/keras2caffe/keras2caffe-master/keras2caffe/convert.py", line 399, in convert
caffe_model.params[layer][n].data[...] = net_params[layer][n]
ValueError: could not broadcast input array from shape (512,28800) into shape (512,1920)
Process finished with exit code 1
Please help me resolve this issue.
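The ValueError in the traceback means the converter tried to copy a Keras Dense weight array of shape (512, 28800) into a Caffe InnerProduct blob of shape (512, 1920). The flattened input length of a fully connected layer is channels x height x width, so if the preceding conv/pool stack produces different spatial sizes in the two frameworks, the products diverge. A purely illustrative sketch; the factorizations below are assumptions, since the model's actual channel count and feature-map sizes are not in the log:

```python
# Hypothetical factorization of the two shapes in the error message:
units = 512
channels = 128

keras_flat = channels * 15 * 15  # = 28800, the length Keras produced
caffe_flat = channels * 15       # = 1920, the length Caffe expected

# One mismatched spatial dimension changes the whole flattened length,
# which is why a pooling/padding discrepancy earlier in the network
# only surfaces here, at the first fully connected layer.
print((units, keras_flat), 'vs', (units, caffe_flat))
# (512, 28800) vs (512, 1920)
```

A practical way to debug this is to print model.summary() in Keras and compare each layer's output shape against the corresponding blob shapes Caffe logs during net initialization; the first layer where they disagree is the real culprit.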