Stuck on Queued #218

Open
ralghamdi opened this issue May 28, 2024 · 5 comments

Comments


ralghamdi commented May 28, 2024

I have installed TRAM locally and uploaded three different reports for analysis. The newest one was uploaded almost two hours ago, and all of them still show the Queued status, as seen below:
[screenshot: report list with every report in Queued status]

I've been monitoring the container's logs, and there is no indication of progress or errors.

mehaase (Contributor) commented Jun 22, 2024

Hi @ralghamdi, sorry for the delay in replying. When you run TRAM locally, you need to run this command in a separate terminal tab:

tram pipeline run --model bert --run-forever &

That will run a background job that processes the reports. Please note that the BERT model is a bit slow when running on CPU (a few minutes per report), but the background process will log a message when it starts processing a report.
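As a minimal sketch (assuming the same Python environment in which TRAM was installed is active in that second tab), the workflow looks roughly like this:

# In a second terminal, start the background pipeline worker:
tram pipeline run --model bert --run-forever &

# Confirm the background job is still running in this shell:
jobs

Dropping the trailing & and running the command in the foreground also works, and makes its log output easier to watch while a report is being processed.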

@hAnguyen1517

Hello @mehaase.
When I run the program and upload the documents, their status shows as Queued:
[screenshot: documents listed with Queued status]

But when I run tram pipeline run --model bert --run-forever & in a new terminal, the files show the Error status:
[screenshot: documents listed with Error status]

Error from the terminal:
[screenshot: error output in the terminal]

mehaase (Contributor) commented Dec 18, 2024

Hi @hAnguyen1517. Are you running this locally? Did you download the BERT model? Step 9 in these instructions: https://github.com/center-for-threat-informed-defense/tram/wiki/Developers#developer-setup

@hAnguyen1517

Hello @mehaase. Thank you so much for taking the time to respond to my question.

Yes, I am running TRAM locally using VS Code. I downloaded the BERT model and completed everything in step 9 as well, but it still won't work, and we are still having issues with reports loading and being processed from the queue. I wonder if others are having the same issues.

mehaase (Contributor) commented Dec 19, 2024

Can you please check that the following files exist in your installation and are approximately the correct file size?

$ ls -lah data/ml-models/bert_model/
total 886352
drwxr-xr-x  5 mhaase  staff   160B Dec 19 15:34 ./
drwxr-xr-x  4 mhaase  staff   128B Aug 31  2023 ../
-rw-r--r--@ 1 mhaase  staff   395B Aug 31  2023 classes.txt
-rw-r--r--  1 mhaase  staff   2.7K Dec 19 15:34 config.json
-rw-r--r--  1 mhaase  staff   420M Dec 19 15:35 pytorch_model.bin
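If any of these are missing or much smaller than shown above, the model download likely did not complete. A quick scripted check along these lines (a sketch; the ~400 MB threshold and the path are assumptions based on the listing above) can flag a missing or truncated pytorch_model.bin:

# Warn if the BERT weights look missing or truncated (expected ~420 MB).
MODEL=data/ml-models/bert_model/pytorch_model.bin
SIZE=$(stat -c%s "$MODEL" 2>/dev/null || stat -f%z "$MODEL" 2>/dev/null || echo 0)
if [ "$SIZE" -lt 400000000 ]; then
  echo "pytorch_model.bin is missing or too small; re-download the BERT model (developer setup step 9)."
fi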
