
write to closed file #89

Open
dmd opened this issue Dec 18, 2023 · 5 comments

Comments

dmd commented Dec 18, 2023

I am very often getting this:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/NDATools/Download.py", line 49, in run
    func(*args, **kargs)
  File "/opt/conda/lib/python3.11/site-packages/NDATools/Download.py", line 308, in write_to_download_progress_report_file
    download_progress_report.flush()
ValueError: write to closed file

If I run the program again, it works fine - usually.
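For reference, this ValueError is what Python raises when a file object is used after it has been closed. A minimal standalone sketch (not NDATools code; the names are illustrative) that triggers the same failure mode from a background thread:

# A worker thread writes/flushes after the main thread has already closed the file.
import threading
import time

def flush_later(fh):
    time.sleep(0.1)     # worker is still pending when the file gets closed
    fh.write("row\n")   # raises ValueError because the file is already closed
    fh.flush()

fh = open("progress.csv", "w")
t = threading.Thread(target=flush_later, args=(fh,))
t.start()
fh.close()              # main thread closes the file before the worker runs
t.join()                # the worker's traceback is printed, much like the one above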

gregmagdits (Contributor) commented Dec 18, 2023

Are you running multiple instances of the downloader at the same time when you see this error?

dmd commented Dec 18, 2023

Nope.

dmd commented Dec 18, 2023

But I'm also failing to reproduce it any more, so maybe something was weird about my container. I'll reopen if it recurs.

dmd closed this as completed Dec 18, 2023

liningpan commented Dec 18, 2023

I think the reason is that the download_progress_file_writer_pool thread pool is never joined:

download_progress_file_writer_pool.add_task(write_to_download_progress_report_file, download_record)
download_pool = ThreadPool(self.thread_num, self.thread_num*6)
download_progress_file_writer_pool = ThreadPool(1, 1000)

You should only close the file handle after joining download_progress_file_writer_pool; otherwise the handle can be invalidated while writer tasks are still queued.

This also means that the download report may be incomplete.
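A generic illustration of that ordering, using the standard library rather than the project's own ThreadPool class (the names below are illustrative, not NDATools' API): queued writes are drained before the report file is closed.

from concurrent.futures import ThreadPoolExecutor

def write_progress_row(report_file, row):
    # single writer thread appends one record to the progress report
    report_file.write(row + "\n")
    report_file.flush()

report_file = open("download_progress_report.csv", "w")
writer_pool = ThreadPoolExecutor(max_workers=1)

for record in ("file1,complete", "file2,complete"):
    writer_pool.submit(write_progress_row, report_file, record)

# The important ordering: drain/join the writer pool first, then close the file.
writer_pool.shutdown(wait=True)
report_file.close()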

gregmagdits (Contributor) commented Dec 18, 2023

Yeah, I think that makes sense. Thank you for pointing that out. Eventually I think we are going to use a SQLite database instead of writing to CSV files, but this should be a quick fix that we can include in the next release. Thanks again.
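A hypothetical sketch of that SQLite direction, just to illustrate the design choice (the table and column names are made up, not the project's schema): each progress record is committed as it arrives, so there is no long-lived CSV file handle whose close has to be ordered against a writer pool.

import sqlite3

conn = sqlite3.connect("download_progress.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS download_progress ("
    " package_file_id TEXT PRIMARY KEY, status TEXT, bytes_written INTEGER)"
)

def record_progress(package_file_id, status, bytes_written):
    # each record is committed immediately; a partial run still leaves a consistent report
    with conn:
        conn.execute(
            "INSERT OR REPLACE INTO download_progress VALUES (?, ?, ?)",
            (package_file_id, status, bytes_written),
        )

record_progress("12345", "completed", 1024)
conn.close()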

gregmagdits reopened this Dec 18, 2023