
inference question #44

Open
sevenandseven opened this issue Sep 18, 2024 · 5 comments
Labels
question Further information is requested

Comments

@sevenandseven

Question

After training my model, I ran the inference script and tried to visualize the results using the command below. However, it only created the output folder; there are no visualization results inside it. What could be causing this? While debugging I found that execution never reaches the visualization statements.
python inference.py --image-dir Salience-DETR/coco/testdata --model-config Salience-DETR/configs/salience_detr/salience_detr_resnet50_800_1333.py --checkpoint Salience-DETR/checkpoints/salience_detr_resnet50_800_1333/train/2024-09-13-18_09_23/best_ap50.pth --show-dir test_result

I also used absolute paths everywhere, so I don't understand why there are no visualization results.

Additional information

No response

sevenandseven added the question (Further information is requested) label on Sep 18, 2024
@xiuqhou
Owner

xiuqhou commented Sep 18, 2024

Strange. I can't reproduce your problem here with a similar command; inference.py outputs detection results normally on my end. Could you provide more details, such as the output or any error messages? When debugging, multiprocessing may prevent breakpoints inside the visualization code from triggering. Try setting the workers argument to 0 and debugging again.
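The suggestion above can be demonstrated with a minimal sketch (not the project's code; the dataset here is just a list of ints): with `num_workers > 0`, `collate_fn` executes inside worker subprocesses, so breakpoints and edits made in the main process never fire there, while `num_workers=0` runs it inline.

```python
import os
from torch.utils.data import DataLoader

executed_in = []

def record_collate(batch):
    # Runs wherever the DataLoader executes collate_fn. With
    # num_workers > 0 this is a worker subprocess, so breakpoints set in
    # the main process never trigger here; with num_workers=0 it runs
    # inline in the main process and is easy to step through.
    executed_in.append(os.getpid())
    return batch

loader = DataLoader(list(range(4)), batch_size=2, num_workers=0,
                    collate_fn=record_collate)
for _ in loader:
    pass

# With num_workers=0, every collate call happened in this process.
print(executed_in == [os.getpid()] * len(executed_in))  # True
```

With `num_workers > 0`, `executed_in` would stay empty in the main process, because the appends happen in worker subprocesses whose memory is not shared back.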

@sevenandseven
Author

Strange. I can't reproduce your problem here with a similar command; inference.py outputs detection results normally on my end. Could you provide more details, such as the output or any error messages? When debugging, multiprocessing may prevent breakpoints inside the visualization code from triggering. Try setting the workers argument to 0 and debugging again.

Thank you very much for your reply. Here is the result. I suspect the problem is with my trained weights: nothing is detected, so there is no output at all.
[screenshot: Image_20240918142453]

@sevenandseven
Author


I added print statements to the code below and found that every print executes, except for the ones inside the _visualize_batch_for_infer function, which never print. I suspect execution never enters that function, but I don't know what is causing it.

    with torch.inference_mode():
        predictions = []
        for index, images in enumerate(tqdm(data_loader)):
            prediction = model(images)[0]
            # move tensors from GPU to CPU
            for key in prediction:
                prediction[key] = prediction[key].to("cpu", non_blocking=True)
            image_name = data_loader.dataset.images[index]
            image = images[0].to("cpu", non_blocking=True)
            prediction = {"image_name": image_name, "image": image, "output": prediction}
            predictions.append(prediction)
            print("the prediction is:", prediction)

    #print("inference done*********show results", args.show_dir)
    # save visualization results
    if args.show_dir:
        os.makedirs(args.show_dir, exist_ok=True)
        #print("begin show_dir******************")
        # create a dummy dataset for visualization with multi-workers
        data_loader = create_test_data_loader(
            predictions, accelerator=accelerator, batch_size=1, num_workers=args.workers
        )
        print("data_loader*******************", model.CLASSES)
        data_loader.collate_fn = partial(_visualize_batch_for_infer, classes=model.CLASSES, **vars(args))
        [None for _ in tqdm(data_loader)]
        print("**************end of plot results")

@xiuqhou
Owner

xiuqhou commented Sep 18, 2024

I don't know why execution never enters the _visualize_batch_for_infer function; my guess is still multiprocessing. Can you try calling _visualize_batch_for_infer directly and see whether it prints results? Only the following lines need to change.

Before:

    # save visualization results
    if args.show_dir:
        os.makedirs(args.show_dir, exist_ok=True)

        # create a dummy dataset for visualization with multi-workers
        data_loader = create_test_data_loader(
            predictions, accelerator=accelerator, batch_size=1, num_workers=args.workers
        )
        data_loader.collate_fn = partial(_visualize_batch_for_infer, classes=model.CLASSES, **vars(args))
        [None for _ in tqdm(data_loader)]

After:

    # save visualization results
    if args.show_dir:
        os.makedirs(args.show_dir, exist_ok=True)

        # create a dummy dataset for visualization with multi-workers
        for prediction in tqdm(predictions):
            _visualize_batch_for_infer([prediction], classes=model.CLASSES, **vars(args))
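The change above swaps the DataLoader-as-executor trick for a plain loop, so the visualization calls happen in the main process. A minimal sketch of the same pattern, with a hypothetical `visualize_batch` standing in for `_visualize_batch_for_infer`:

```python
from functools import partial

def visualize_batch(batch, classes=None, show_dir="out", **kwargs):
    # Hypothetical stand-in for _visualize_batch_for_infer: instead of
    # writing image files, return the paths it would have written.
    return [f"{show_dir}/{item['image_name']}" for item in batch]

predictions = [{"image_name": "a.jpg"}, {"image_name": "b.jpg"}]

# Original route: bind the extra arguments with functools.partial and let
# DataLoader workers invoke it as collate_fn (calls happen in worker
# processes). Workaround route: call it directly in a loop, so everything
# stays in the main process and prints/breakpoints behave normally:
fn = partial(visualize_batch, classes=["cat"], show_dir="out")
results = [fn([p]) for p in predictions]
print(results)  # [['out/a.jpg'], ['out/b.jpg']]
```

The direct loop gives up the multi-worker parallelism the original code was after, but for debugging (and for small test sets) that trade-off is usually fine.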

@sevenandseven
Author

        # create a dummy dataset for visualization with multi-workers
        for prediction in tqdm(predictions):
            _visualize_batch_for_infer([prediction], classes=model.CLASSES, **vars(args))

The modified code generates the visualization results. Thank you!
