
PaddleOCR recognition: memory keeps rising and will not drop unless the process is killed #10062

Closed
feifaxiaoming opened this issue May 30, 2023 · 8 comments
Labels: expneeded (need extra experiment to fix issue), good first issue (Good for newcomers), status/close

Comments

@feifaxiaoming

Please provide the following complete information so we can quickly locate the problem:

  • System Environment:
  • Version: Paddle: PaddleOCR: Related components:
  • Command Code:
  • Complete Error Message:

Environment:
CentOS 7, 8 CPU cores, 16 GB RAM, 16 GB GPU memory

Paddle version: paddlepaddle-gpu 2.4.2.post112
CUDA version: 11.2
PaddleOCR version: PaddleOCR-release-2.3

When using it today, as long as the process is not killed the memory keeps rising, until it is exhausted.

Right after startup it already uses about 3 GB of memory, and it keeps climbing from there.
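
For reference, a minimal sketch of how this growth could be quantified while reproducing (illustrative only, not part of the original report; it assumes psutil is installed and that `image_paths` points at real test images):

    # Illustrative monitoring sketch (assumption: psutil installed, test images available locally).
    import os

    import psutil
    from paddleocr import PaddleOCR

    proc = psutil.Process(os.getpid())
    ocr = PaddleOCR(lang="ch")

    # Hypothetical test set: the same image repeated, so any growth is not data-dependent.
    image_paths = ["img_0001.jpg"] * 200

    for i, path in enumerate(image_paths):
        ocr.ocr(path)
        if i % 20 == 0:
            rss_mb = proc.memory_info().rss / 1024 / 1024
            # A steadily rising RSS across iterations points at a host-memory leak.
            print(f"iteration {i}: RSS = {rss_mb:.0f} MB")

Per-iteration RSS numbers like these would make the report easier to act on.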

@feifaxiaoming (Author)

[screenshot attachment]

@ToddBear added the labels "expneeded" (need extra experiment to fix issue) and "good first issue" (Good for newcomers) on Jun 30, 2023
@livingbody (Contributor)

We need to know your inference mode; please post it when you get a chance.

@shiyutang (Collaborator)

Hi, please post the command you ran so that we can reproduce the issue. Thanks!

@tfka commented Jul 10, 2023

@feifaxiaoming Has this been resolved? I'm running into the same problem.

1 similar comment
@zhh8689 commented Dec 15, 2023

@feifaxiaoming Has this been resolved? I'm running into the same problem.

@zhh8689 commented Dec 15, 2023

> Hi, please post the command you ran so that we can reproduce the issue. Thanks!

paddlepaddle 2.5.2
paddleocr 2.7.0.3

    from paddleocr import PaddleOCR

    ocr = PaddleOCR(lang="ch")
    for item in files:   # files: a list of image paths, defined elsewhere
        ocr.ocr(item)

On a GPU device with CUDA 11.2, running OCR inference in this loop, GPU memory keeps rising until it is exhausted.
Calling paddle.device.cuda.empty_cache() lowers the reported GPU memory somewhat, but actual usage still keeps growing and it eventually runs out anyway.
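
As an illustrative follow-up (not from the comment above): the same loop with periodic empty_cache() calls and memory logging, assuming a recent Paddle 2.x where paddle.device.cuda.memory_allocated() and paddle.device.cuda.memory_reserved() are available, and the same `files` list of image paths as in the snippet above:

    # Sketch only: log allocated vs. reserved GPU memory while clearing the cache periodically.
    import paddle
    from paddleocr import PaddleOCR

    ocr = PaddleOCR(lang="ch")

    for i, item in enumerate(files):  # `files`: list of image paths, as in the original snippet
        ocr.ocr(item)
        if i % 50 == 0:
            paddle.device.cuda.empty_cache()  # returns cached blocks to the allocator pool
            alloc_mb = paddle.device.cuda.memory_allocated() / 1024 / 1024
            reserved_mb = paddle.device.cuda.memory_reserved() / 1024 / 1024
            # If `allocated` keeps climbing even right after empty_cache(), tensors are still
            # being held alive somewhere rather than merely cached by the allocator.
            print(f"iter {i}: allocated={alloc_mb:.0f} MB, reserved={reserved_mb:.0f} MB")

Separating allocated from reserved memory helps distinguish a genuine leak from allocator caching.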

@UserWangZz (Collaborator)

This issue has not been updated for a long time, so it is being closed for now. Feel free to reopen it if needed.

@Yins11 commented Nov 12, 2024

    python tools/infer/predict_rec.py \
        --rec_model_dir=./inference/rec_ppocr_v4/ \
        --image_dir="xxxxxxxxxxxxxx" \
        --rec_algorithm="SVTR_LCNet" \
        --rec_image_shape="3, 48, 320" \
        --rec_batch_num=1 \
        --rec_char_dict_path="./ppocr/utils/en_dict.txt" \
        --benchmark=True \
        --use_gpu=True \
        --use_tensorrt=True \
        --warmup=True
The command I run is shown above. On the server, memory keeps rising until the process dies.
paddleocr: v2.8.1
tensorrt: 8.6.1.6
python: 3.10
paddle: v2.6.1
This problem only appears when using TensorRT for inference. What exactly is the cause?
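
One way to confirm that TensorRT is the variable (an illustrative sketch, not from this thread; it assumes psutil is installed, the working directory is the PaddleOCR repo root, and the paths match the command above) is to run the same predict_rec.py command with and without --use_tensorrt and sample the process RSS over time:

    # Sketch: A/B memory check for the predict_rec.py command, with and without TensorRT.
    import subprocess
    import time

    import psutil

    BASE_CMD = [
        "python", "tools/infer/predict_rec.py",
        "--rec_model_dir=./inference/rec_ppocr_v4/",
        "--image_dir=xxxxxxxxxxxxxx",          # keep your real image directory here
        "--rec_algorithm=SVTR_LCNet",
        "--rec_image_shape=3, 48, 320",
        "--rec_batch_num=1",
        "--rec_char_dict_path=./ppocr/utils/en_dict.txt",
        "--benchmark=True",
        "--use_gpu=True",
        "--warmup=True",
    ]

    def run_and_sample(extra_args, interval=5):
        """Launch the command and sample its resident set size (MB) until it exits."""
        proc = subprocess.Popen(BASE_CMD + extra_args)
        ps = psutil.Process(proc.pid)
        samples = []
        while proc.poll() is None:
            try:
                samples.append(ps.memory_info().rss / 1024 / 1024)
            except psutil.NoSuchProcess:
                break
            time.sleep(interval)
        return samples

    for extra in (["--use_tensorrt=True"], ["--use_tensorrt=False"]):
        print(extra, [round(m) for m in run_and_sample(extra)], "MB")

If RSS only climbs in the --use_tensorrt=True run, that narrows the growth down to the TensorRT path of the inference engine.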

@PaddlePaddle locked as resolved and limited conversation to collaborators on Nov 12, 2024