In order to load only a portion of the Caffe net for GPU-based training, I am extracting features at layer L1 and re-using them in a model built from L1 onwards.
To load the extracted (sequential) features, I must restructure them to the blob dimensions at L1. Is there a better way to do this than manually setting the blob dimensions while extracting, like so:
// Snippet from tools/extract_features.cpp (path relative to the Caffe root)
int h = h1;          // Addition 1: hard-coded height of the L1 blob
int w = w1;          // Addition 2: hard-coded width of the L1 blob
int channels = c1;   // Addition 3: hard-coded channel count of the L1 blob
for (int n = 0; n < batch_size; ++n) {
  datum.set_height(h);
  datum.set_width(w);
  datum.set_channels(channels);
  datum.clear_data();
  // ... rest of the per-image Datum serialization is unchanged
}
I have added the top three lines to extract_features.cpp so that each extracted feature is written out as a Datum of dimensions h1 x w1 x c1, where h1 is the height, w1 the width, and c1 the number of channels.
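A possible alternative I am considering (just a sketch on my side, assuming the surrounding extract_features.cpp loop still has the extracted blob in scope as feature_blob, the shared_ptr<Blob<Dtype>> returned by feature_extraction_net->blob_by_name(...)) would be to read the dimensions from the blob itself instead of hard-coding h1, w1 and c1:

// Hypothetical sketch: take the Datum shape from the extracted blob itself,
// so no per-layer constants need to be edited into the tool.
// Assumes feature_blob and batch_size come from the existing extract_features.cpp code.
for (int n = 0; n < batch_size; ++n) {
  datum.set_height(feature_blob->height());
  datum.set_width(feature_blob->width());
  datum.set_channels(feature_blob->channels());
  datum.clear_data();
  // ... serialize feature_blob->cpu_data() + feature_blob->offset(n) as before
}

If that is valid, the stored entries would keep the correct blob shape automatically even if I later extract from a different layer.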
Please feel free to correct me or make suitable suggestions.
Thanks.