
Added CUB memory pool. CNMEM pool kept as default if USE_CNMEM set. #62

Merged: 2 commits into NVIDIA:caffe-0.14 on Nov 1, 2015
Conversation

@borisfom commented Nov 1, 2015

Checking in as a single squashed commit; trying to extract separate commits for the CNMEM un-integration and the CUB introduction turned out to be impractical.
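
For readers skimming the diff, the gist of the change is a compile-time choice of pooling allocator: CNMeM stays the default when USE_CNMEM is set, otherwise CUB's caching allocator is used. A rough sketch of that selection (the pool_malloc/pool_free wrappers are illustrative names, not the actual gpu_memory.cpp interface, and the include paths depend on how the bundled libraries are wired into the build):

// pool_sketch.cu -- illustrative only; compile with nvcc.
// Assumes the bundled CUB checkout is on the include path and, when USE_CNMEM
// is defined, that CNMeM has already been initialized elsewhere.
#include <cuda_runtime.h>

#ifdef USE_CNMEM
  #include "cnmem.h"                      // CNMeM pool stays the default
#else
  #include "cub/cub/util_allocator.cuh"   // otherwise fall back to CUB's caching pool
  static cub::CachingDeviceAllocator cub_pool;
#endif

// Hypothetical wrappers: route device allocations through whichever pool was built in.
inline cudaError_t pool_malloc(void** ptr, size_t size, cudaStream_t stream = 0) {
#ifdef USE_CNMEM
  return cnmemMalloc(ptr, size, stream) == CNMEM_STATUS_SUCCESS
             ? cudaSuccess : cudaErrorMemoryAllocation;
#else
  return cub_pool.DeviceAllocate(ptr, size, stream);
#endif
}

inline cudaError_t pool_free(void* ptr, cudaStream_t stream = 0) {
#ifdef USE_CNMEM
  return cnmemFree(ptr, stream) == CNMEM_STATUS_SUCCESS
             ? cudaSuccess : cudaErrorInvalidDevicePointer;
#else
  return cub_pool.DeviceFree(ptr);        // returned to the pool, not cudaFree'd
#endif
}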

borisfom added a commit that referenced this pull request Nov 1, 2015
Added CUB memory pool. CNMEM pool kept as default if USE_CNMEM set.
@borisfom merged commit fc52f18 into NVIDIA:caffe-0.14 on Nov 1, 2015
@lukeyeager (Member)

The Make build seems to work fine: it succeeds and all tests pass whether or not you build with CNMeM.

But with CMake I'm getting this error whether I tell it to use CNMeM or not:

$ mkdir build && cd build
$ cmake ..
$ make -j12
...
/home/lyeager/caffe/nv-0.14.0-rc.1/src/caffe/util/gpu_memory.cpp:14:38: fatal error: cub/cub/util_allocator.cuh: No such file or directory
 #include "cub/cub/util_allocator.cuh"
                                      ^
compilation terminated.
make[2]: *** [src/caffe/CMakeFiles/caffe.dir/util/gpu_memory.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[1]: *** [src/caffe/CMakeFiles/caffe.dir/all] Error 2
make: *** [all] Error 2

@lukeyeager (Member)

When I use the Make build, I can't tell that it's ever actually using a memory allocator, either CNMeM or CUB. I don't see either one in the ldd output of the executable, and when I try to train a model the memory utilization is low: about 14% instead of 95%.
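
(Side note on the ldd observation: CUB is header-only, so it would never appear as a shared library dependency, and CNMeM may well be compiled in statically, so ldd is not a reliable indicator either way. A quick in-process check of whether a pool actually pre-reserved memory, sketched here with plain CUDA runtime calls rather than anything from this PR, would be to print the device's used fraction right after allocator setup; nvidia-smi gives the same picture from outside the process.)

// mem_check.cpp -- sketch: report how much of the GPU the current process context sees as used.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
  size_t free_bytes = 0, total_bytes = 0;
  if (cudaMemGetInfo(&free_bytes, &total_bytes) != cudaSuccess) {
    std::printf("cudaMemGetInfo failed\n");
    return 1;
  }
  double used_pct = 100.0 * double(total_bytes - free_bytes) / double(total_bytes);
  std::printf("used: %.1f%% of %zu MiB\n", used_pct, total_bytes >> 20);
  return 0;
}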

@lukeyeager mentioned this pull request on Nov 3, 2015
@lukeyeager (Member)

Now that #69 is merged, I can see that CNMeM utilizes 80% of memory instead of 95%. Is that expected?
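
(Presumably the number depends on how large a pool gpu_memory asks CNMeM for at initialization. Purely as a hedged illustration, not taken from this PR or from #69, CNMeM pool sizing looks roughly like the sketch below; the 0.95 fraction and sizing against free rather than total memory are assumptions, and the reported utilization then depends on both the fraction and how much memory was already in use when the pool was created.)

// cnmem_init_sketch.cpp -- illustrative; assumes CNMeM's public API (cnmem.h).
#include <cuda_runtime.h>
#include <cstring>
#include "cnmem.h"

bool init_pool(int device, double fraction) {   // e.g. fraction = 0.95 (hypothetical)
  cudaSetDevice(device);
  size_t free_bytes = 0, total_bytes = 0;
  cudaMemGetInfo(&free_bytes, &total_bytes);

  cnmemDevice_t dev;
  std::memset(&dev, 0, sizeof(dev));
  dev.device = device;
  // Pool is sized as a fraction of *currently free* memory, so it comes out
  // smaller than fraction * total whenever other allocations already exist.
  dev.size = static_cast<size_t>(fraction * free_bytes);
  return cnmemInit(1, &dev, CNMEM_FLAGS_DEFAULT) == CNMEM_STATUS_SUCCESS;
}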
