add transformation to memory data layer #3100

Closed

Conversation

bwilbertz
Contributor

This PR adds transformations according to TransformParam to the MemoryDataLayer for the case where data is set directly via a pointer through Reset. The transformations (on the blobs) are applied on-the-fly during Forward. Test cases are included.
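
A minimal pycaffe sketch of the intended usage (the prototxt file name, array shapes, and label values here are illustrative assumptions, not taken from this PR):

```python
import numpy as np
import caffe

# Assumed: memory_net.prototxt begins with a MemoryData layer whose
# transform_param sets e.g. mean subtraction or mirroring.
net = caffe.Net('memory_net.prototxt', caffe.TEST)

# set_input_arrays is the pycaffe wrapper around MemoryDataLayer::Reset;
# the arrays are handed to the layer as raw pointers, not copied.
# The number of samples must be a multiple of the layer's batch_size.
data = np.random.rand(64, 3, 256, 256).astype(np.float32)
labels = np.zeros(64, dtype=np.float32)
net.set_input_arrays(data, labels)

# With this patch, the transform_param operations are applied on-the-fly
# to each batch during the forward pass.
net.forward()
```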

dfagnan commented Sep 25, 2015

This is exactly what I was looking for, and I'm not sure why it wasn't supported before. Can you elaborate on what you mean by "set by a pointer via reset"? Would this also work using python.set_input_arrays? (I got a cuBLAS error attempting to use it through Python and am trying to diagnose it.)

dfagnan commented Sep 25, 2015

@bwilbertz It looks like this code assumes height/width are preserved during transformation, which is usually not the case. Am I missing something?

@bwilbertz
Contributor Author

Hi dfagnan.

Yes, python.set_input_arrays uses this mechanism.

What exactly do you mean by "height/width are preserved during transformation"?

This PR uses DataTransformer::Transform(Blob* input_blob, Blob* transformed_blob) for the transformation, just as the DataLayer does.

That means that if your input pictures are 1024x768, you would specify that height/width in the proto file, and with a crop of 256 all of the output blobs would have size 256x256.
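
For example, under this patch a setup along the following lines should yield 256x256 top blobs from 1024x768 inputs (a hypothetical sketch: the file name and shapes are assumptions, and only crop_size is essential to the point):

```python
import numpy as np
import caffe

# Assumed file memory_crop.prototxt, containing a MemoryData layer such as:
#
#   layer {
#     name: "data"  type: "MemoryData"  top: "data"  top: "label"
#     memory_data_param { batch_size: 1 channels: 3 height: 1024 width: 768 }
#     transform_param { crop_size: 256 }
#   }
net = caffe.Net('memory_crop.prototxt', caffe.TEST)

data = np.random.rand(1, 3, 1024, 768).astype(np.float32)
labels = np.zeros(1, dtype=np.float32)
net.set_input_arrays(data, labels)
net.forward()

# The inputs are 1024x768, but the transformed top blob is cropped:
print(net.blobs['data'].data.shape)  # expected: (1, 3, 256, 256)
```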

dfagnan commented Sep 29, 2015

Sorry for the delay in following up, but this code worked exactly as I hoped. My error was an unrelated bug in the memory data layer, which has been addressed but not yet merged: #2334. Thanks for the great work. I still don't understand the details of the code, but I'll leave that for another day!

@bwilbertz
Contributor Author

Great :)

@shelhamer
Member

Closing according to #5528, but thanks @bwilbertz for proposing an improvement to the layer (in good form too, with a test).

shelhamer closed this Apr 14, 2017
beniz added a commit to jolibrain/caffe that referenced this pull request Apr 23, 2017
beniz added a commit to jolibrain/caffe that referenced this pull request Jun 6, 2017