convert this project's .pth to darknet's .weights #292

Open
LifeIsSoSolong opened this issue Sep 2, 2019 · 8 comments · May be fixed by #842

@LifeIsSoSolong

I want to convert this project's .pth to darknet's .weights. Does anyone have advice?
I tried loading the .pth into the model and then saving the model's parameters to .weights, but when I use the result in darknet I get no detections.
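
For background: darknet's .weights is a flat binary dump (a short header followed by raw float32 tensors in layer order), not a Python pickle, so a file produced by torch.save cannot be read by darknet. A minimal sketch of the layout, assuming every convolution is immediately followed by batch norm (as in YOLOv3's backbone; the naive pairing below is illustrative only, not this repo's API):

```python
import numpy as np
import torch.nn as nn

def dump_darknet_weights(model: nn.Module, path: str) -> None:
    """Write parameters in darknet's binary layout: a 3x int32 version header,
    an int64 images-seen counter, then per conv block the BN beta, BN gamma,
    BN running_mean, BN running_var, and finally the conv weights."""
    mods = [m for m in model.modules() if isinstance(m, (nn.Conv2d, nn.BatchNorm2d))]
    with open(path, "wb") as f:
        np.array([0, 2, 0], dtype=np.int32).tofile(f)  # major, minor, revision
        np.array([0], dtype=np.int64).tofile(f)        # images seen during training
        for conv, bn in zip(mods[::2], mods[1::2]):    # assumes strict conv/bn alternation
            bn.bias.data.cpu().numpy().tofile(f)
            bn.weight.data.cpu().numpy().tofile(f)
            bn.running_mean.cpu().numpy().tofile(f)
            bn.running_var.cpu().numpy().tofile(f)
            conv.weight.data.cpu().numpy().tofile(f)
```

Getting this ordering wrong, or writing a pickle instead, is the usual reason darknet loads the file without error but detects nothing.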

@JsBlueCat

maybe onnx?
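
For reference, PyTorch's built-in ONNX exporter produces an .onnx file for other runtimes rather than a darknet .weights file. A minimal sketch, assuming a 416x416 input and the paths from this thread (whether the full YOLO detection head exports cleanly depends on the ops your config uses):

```python
import torch
from pytorchyolo.models import Darknet

model = Darknet("your .cfg file")
model.load_state_dict(torch.load("your_ckpt.pth", map_location="cpu"))
model.eval()

# the exporter traces the model with a dummy input at the training resolution
dummy = torch.randn(1, 3, 416, 416)
torch.onnx.export(model, dummy, "model.onnx", opset_version=11)
```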

@blackCmd

blackCmd commented Oct 27, 2020

Here is the answer.
https://kevin970401.github.io/etc/2018/12/26/converting-models.html

@aarkey81

aarkey81 commented Nov 9, 2020

Create a new Python file, import the requirements, and copy the following lines into it:

```python
import torch
from pytorchyolo.models import Darknet  # in older checkouts: from models import Darknet

model = Darknet("your .cfg file")

# map_location=torch.device('cpu') loads the checkpoint on the CPU, so no GPU is needed
model.load_state_dict(torch.load("your_ckpt.pth", map_location=torch.device('cpu')))

# save_darknet_weights writes darknet's binary format; cutoff=-1 saves all layers
model.save_darknet_weights("your_ckpt.weights", cutoff=-1)
print("successfully converted .pth to .weights")
```

Hope it helps you
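
A quick sanity check on the result (a sketch based on darknet's header layout, not part of the original comment): the file should start with three int32 version fields and an images-seen counter, followed by one float32 per stored value.

```python
import numpy as np

with open("your_ckpt.weights", "rb") as f:
    major, minor, revision = np.fromfile(f, dtype=np.int32, count=3)
    seen = np.fromfile(f, dtype=np.int64, count=1)[0]  # int64 for header versions >= 0.2
    n_floats = np.fromfile(f, dtype=np.float32).size

# n_floats should equal the model's parameter count plus the BN running statistics
print(major, minor, revision, seen, n_floats)
```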


@Flova
Collaborator

Flova commented Jan 11, 2021

We should add this as a script to this repo, with some command-line arguments.
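
A sketch of what such a script could look like (the file name and flags are made up, assuming the pytorchyolo package layout):

```python
#!/usr/bin/env python3
"""convert_to_darknet.py: convert a PyTorch .pth checkpoint to darknet .weights."""
import argparse

import torch
from pytorchyolo.models import Darknet


def main():
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument("-m", "--model", required=True, help="path to the model .cfg file")
    parser.add_argument("-w", "--weights", required=True, help="path to the .pth checkpoint")
    parser.add_argument("-o", "--output", default="converted.weights", help="output .weights path")
    args = parser.parse_args()

    model = Darknet(args.model)
    model.load_state_dict(torch.load(args.weights, map_location="cpu"))
    model.save_darknet_weights(args.output, cutoff=-1)  # -1 saves every layer
    print(f"Wrote {args.output}")


if __name__ == "__main__":
    main()
```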

@ktanay20

ktanay20 commented Mar 6, 2023

Hi, which model should I import while doing this? Can anyone guide me on this?
I am unable to import Darknet; is there anything else I can try?
@Flova @aarkey81

@Flova
Collaborator

Flova commented Mar 6, 2023

I think you mean that the Darknet Python module import fails, right?

Make sure you ran `poetry shell` in your terminal if you use poetry.

Afterward, you should be able to import it like this:

```python
from pytorchyolo.models import Darknet
```

zjuyzj linked a pull request (#842) on Sep 8, 2023 that will close this issue.