
On-the-fly net resizing, without reallocation (where possible) #594

Merged
merged 19 commits on Sep 18, 2014

Commits on Sep 18, 2014

  1. use Blob directly instead of shared_ptr for EltwiseLayer::max_idx_

    This is in keeping with BVLC#742.
    longjon authored and shelhamer committed Sep 18, 2014 (69bf6b5)
  2. 3194bb1
  3. don't reallocate blobs when shrinking memory use

    This allows nets to be reshaped very quickly (essentially for free) as
    long as sufficient memory has been allocated. Calling Blob::Reshape in
    order to free up memory becomes impossible; however, this is not a
    normal use case (and deleting blobs does free memory).
    longjon authored and shelhamer committed Sep 18, 2014 (4fff966)
    (An illustrative sketch of this allocation strategy appears after the commit list.)
  4. enable reshaping in the forward pass

    Note that calling Reshape when no reshape is necessary should be
    effectively a no-op, so this is not a performance regression.
    longjon authored and shelhamer committed Sep 18, 2014 (87de5ed)
    (See the layer-interface sketch after the commit list.)
  5. separate setTensor4dDesc from createTensor4dDesc

    This will make it possible to add reshaping to cuDNN layers.
    longjon authored and shelhamer committed Sep 18, 2014 (5ce519c)
    (An illustrative descriptor sketch appears after the commit list.)
  6. d7e8f2a
  7. split off Reshape for data layers

    longjon authored and shelhamer committed Sep 18, 2014 (4b34c72)
  8. split off Reshape for loss layers

    longjon authored and shelhamer committed Sep 18, 2014 (62bc0a8)
  9. split off Reshape for neuron layers

    longjon authored and shelhamer committed Sep 18, 2014 (256209d)
  10. split off Reshape for common layers

    longjon authored and shelhamer committed Sep 18, 2014 (07d6246)
  11. split off Reshape for vision layers

    Note that we are dropping some checks from the LRN layer. However, these
    checks are fairly redundant; something is very wrong if these layers
    produce top blobs of different sizes than their inputs, and tests are
    the right place to catch that. The check that really should be made
    (but isn't) is that local_size must be odd; this will be added in a
    future commit.
    longjon authored and shelhamer committed Sep 18, 2014 (6c63b8c)
  12. call Reshape in Layer::SetUp

    Strictly speaking, Reshape doesn't need to be called until the first
    Forward call; however, much existing code (especially tests) assumes
    that top blobs will be set up in SetUp, so we may as well do it there.
    longjon authored and shelhamer committed Sep 18, 2014 (d2de2ee)
  13. default LayerSetUp to no-op instead of NOT_IMPLEMENTED

    Now that top blobs are set up in Layer::Reshape, it's Reshape that is
    mandatory, and simple layers often don't need to implement LayerSetUp.
    Reshape is (already) declared abstract, so not implementing it is a
    compile-time error.
    longjon authored and shelhamer committed Sep 18, 2014 (4f1b668)
    (See the layer-interface sketch after the commit list.)
  14. test net reshaping

    longjon authored and shelhamer committed Sep 18, 2014 (db5bb15)
  15. include Reshape in caffe time

    Since we are now calling Reshape in the Forward pass, it's only fair to
    include it when timing. Reshape calls should normally be four or so
    orders of magnitude faster than Forward calls; this change also makes it
    easy to notice a mistake that causes something slow to happen in
    Reshape.
    longjon authored and shelhamer committed Sep 18, 2014 (24350a6)
  16. add Net::Reshape for only reshaping

    Note that it is not normally necessary to call this function when using
    reshapable nets, but sometimes it can be useful to compute the sizes of
    intermediate layers without waiting for the forward pass.
    longjon authored and shelhamer committed Sep 18, 2014 (490077e)
    (A usage sketch appears after the commit list.)
  17. [pycaffe] expose Net::Reshape

    longjon authored and shelhamer committed Sep 18, 2014 (fdf2de1)
  18. 0b5e11d
  19. d833ab3
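
The sketches below expand on a few of the commits above. They are illustrative reconstructions written for this summary, not the PR's actual diffs; any helper names they introduce are assumptions unless a commit message mentions them.

Commit 4fff966 makes Blob::Reshape track a capacity separate from the logical element count, so the backing memory only grows and is never reallocated to shrink. A minimal sketch of that allocation strategy (real Caffe blobs wrap SyncedMemory and keep separate data and diff buffers; the capacity_ bookkeeping here conveys the idea, not the exact code):

    #include <memory>

    // Simplified stand-in for caffe::Blob<float>, showing only the
    // grow-only allocation policy described in the commit message.
    class Blob {
     public:
      Blob() : num_(0), channels_(0), height_(0), width_(0),
               count_(0), capacity_(0) {}

      // Record the new logical shape, but only enlarge the allocation.
      // Shrinking keeps the old buffer, so reshaping an already-large
      // blob is essentially free.
      void Reshape(int num, int channels, int height, int width) {
        num_ = num; channels_ = channels; height_ = height; width_ = width;
        count_ = num * channels * height * width;
        if (count_ > capacity_) {
          capacity_ = count_;
          data_.reset(new float[capacity_]);
        }
      }

      int count() const { return count_; }

     private:
      int num_, channels_, height_, width_;
      int count_;     // elements in use after the last Reshape
      int capacity_;  // elements actually allocated
      std::unique_ptr<float[]> data_;
    };

As the commit message notes, the trade-off is that Reshape can no longer be used to release memory; destroying the blob is the only way to give the allocation back.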
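Commits 87de5ed, d2de2ee, and 4f1b668 together rework the layer interface: Forward calls Reshape before computing, SetUp calls LayerSetUp and then Reshape so top blobs are sized as soon as setup finishes, LayerSetUp defaults to a no-op, and Reshape is the one hook every layer must implement. A stripped-down sketch of that interface (the real Layer also checks blob counts, dispatches between CPU and GPU, and handles loss weights; Forward_cpu below is a placeholder for the per-layer computation):

    #include <vector>

    template <typename Dtype> class Blob;  // data and shape omitted here

    template <typename Dtype>
    class Layer {
     public:
      virtual ~Layer() {}

      // One-time setup: layer-specific initialization, then an initial
      // Reshape so that top blobs are already sized when SetUp returns.
      void SetUp(const std::vector<Blob<Dtype>*>& bottom,
                 const std::vector<Blob<Dtype>*>& top) {
        LayerSetUp(bottom, top);
        Reshape(bottom, top);
      }

      // Every forward pass re-runs Reshape, so the net adapts to whatever
      // shape the input blobs currently have; when nothing has changed
      // this is effectively a no-op.
      void Forward(const std::vector<Blob<Dtype>*>& bottom,
                   const std::vector<Blob<Dtype>*>& top) {
        Reshape(bottom, top);
        Forward_cpu(bottom, top);
      }

      // Optional: defaults to an empty body instead of NOT_IMPLEMENTED,
      // since many simple layers need nothing beyond Reshape.
      virtual void LayerSetUp(const std::vector<Blob<Dtype>*>& bottom,
                              const std::vector<Blob<Dtype>*>& top) {}

      // Mandatory: size top blobs (and internal buffers) from the current
      // bottom shapes; forgetting to implement it is a compile-time error.
      virtual void Reshape(const std::vector<Blob<Dtype>*>& bottom,
                           const std::vector<Blob<Dtype>*>& top) = 0;

     protected:
      virtual void Forward_cpu(const std::vector<Blob<Dtype>*>& bottom,
                               const std::vector<Blob<Dtype>*>& top) = 0;
    };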
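Commit 5ce519c separates setTensor4dDesc from createTensor4dDesc in the cuDNN helpers, so a cuDNN layer can create its descriptors once during setup and merely re-set their dimensions on every Reshape. A sketch of the pattern using current cuDNN type names (the actual Caffe wrappers and error-checking macros differ; the function names below are illustrative only):

    #include <cudnn.h>

    // Done once, e.g. in the layer's LayerSetUp: allocate the descriptor.
    // (Status checks omitted for brevity.)
    void create_desc(cudnnTensorDescriptor_t* desc) {
      cudnnCreateTensorDescriptor(desc);
    }

    // Done on every Reshape: write the current n/c/h/w into the existing
    // descriptor. No cuDNN objects are created or destroyed, so resizing
    // the net stays cheap on the GPU path as well.
    void set_desc(cudnnTensorDescriptor_t desc, int n, int c, int h, int w) {
      cudnnSetTensor4dDescriptor(desc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                                 n, c, h, w);
    }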
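Commit 490077e adds Net::Reshape, which propagates the current input shapes through the net without running any computation; commit fdf2de1 then exposes the same call to pycaffe. A hypothetical C++ usage sketch, assuming a deploy-style net whose first input blob is the image (the blob dimensions here are placeholders):

    #include "caffe/net.hpp"

    void resize_inputs(caffe::Net<float>& net) {
      // Give the input blob a new batch/spatial size...
      net.input_blobs()[0]->Reshape(1, 3, 384, 512);
      // ...and propagate shapes through every layer without computing
      // anything; intermediate blobs (e.g. net.blob_by_name("conv1"))
      // then report their new sizes.
      net.Reshape();
    }

As the commit message says, calling Net::Reshape explicitly is not normally necessary, since a plain forward pass performs the same reshaping on the fly; it is only useful for inspecting intermediate sizes ahead of time.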