
minor memory leak in op creation #15339

Closed · jczaja opened this issue Jan 15, 2019 · 4 comments
jczaja commented Jan 15, 2019

While working on #15032 I noticed a potential memory leak when ops are created. It introduces a lot of noise when analysing memory problems in Paddle. The problem seems to be that the lambda assigned to creator_ allocates memory, but I cannot find where that memory is released.

info->creator_ = [](const std::string& type, const VariableNameMap& inputs,
                    const VariableNameMap& outputs,
                    const AttributeMap& attrs) {
  return new T(type, inputs, outputs, attrs);
};

Here is a callstack (from test_analyzer_small_dam) reported by valgrind/memcheck for the quoted code:

==2565== 4,557 (120 direct, 4,437 indirect) bytes in 1 blocks are definitely lost in loss record 18,636 of 18,684
==2565==    at 0x65C2255: operator new(unsigned long) (vg_replace_malloc.c:334)
==2565==    by 0x3B476F0: paddle::framework::details::OpInfoFiller<paddle::operators::BatchNormOpMaker, (paddle::framework::details::OpInfoFillType)1>::operator()(char const*, paddle::framework::OpInfo*) const (op_registry.h:96)
==2565==    by 0x3B42C98: paddle::framework::details::OperatorRegistrarRecursive<1ul, false, paddle::operators::BatchNormOp, paddle::operators::BatchNormOpMaker, paddle::operators::BatchNormOpInferVarType, paddle::operators::BatchNormGradMaker>::OperatorRegistrarRecursive(char const*, paddle::framework::OpInfo*) (op_registry.h:68)
==2565==    by 0x3B4272F: paddle::framework::details::OperatorRegistrarRecursive<0ul, false, paddle::operators::BatchNormOp, paddle::operators::BatchNormOpMaker, paddle::operators::BatchNormOpInferVarType, paddle::operators::BatchNormGradMaker>::OperatorRegistrarRecursive(char const*, paddle::framework::OpInfo*) (op_registry.h:71)
==2565==    by 0x3B4216A: paddle::framework::OperatorRegistrar<paddle::operators::BatchNormOp, paddle::operators::BatchNormOpMaker, paddle::operators::BatchNormOpInferVarType, paddle::operators::BatchNormGradMaker>::OperatorRegistrar(char const*) (op_registry.h:61)
==2565==    by 0x3B3A958: __static_initialization_and_destruction_0(int, int) (batch_norm_op.cc:609)
==2565==    by 0x3B3A9C2: _GLOBAL__sub_I_batch_norm_op.cc (batch_norm_op.cc:619)
==2565==    by 0x462211C: __libc_csu_init (in /home/jczaja/paddle/build-debug/paddle/fluid/inference/tests/api/test_analyzer_small_dam)
==2565==    by 0x82B5F6E: (below main) (in /usr/lib64/libc-2.20.so)

@reyoung, @luotao1
Please advise (explain where this memory is released), or if it is a memory leak, please consider fixing it.

luotao1 commented Jan 15, 2019

Is this the cause of #15032, or is it a separate problem?
Either way, we will look into it.


jczaja commented Jan 15, 2019

@luotao1 This problem came up while looking at #15032, but it is not the reason for the timeout being exceeded; it just makes the analysis a bit more difficult, so I reported it.

chengduoZH commented

@jczaja The return of OpRegistry::CreateOp is a std::unique_ptr, so there is no memory leak when ops are created.

std::unique_ptr<OperatorBase> OpRegistry::CreateOp(
    const std::string& type, const VariableNameMap& inputs,
    const VariableNameMap& outputs, AttributeMap attrs) {
  auto& info = OpInfoMap::Instance().Get(type);
  if (info.Checker() != nullptr) {
    info.Checker()->Check(&attrs);
  }
  auto op = info.Creator()(type, inputs, outputs, attrs);
  return std::unique_ptr<OperatorBase>(op);
}


jczaja commented Jan 16, 2019

@chengduoZH Thanks for the explanation. It looks like this was a false positive from my memcheck run.
