I think (but could not check because of Gitpod) that the problem is this piece of code:
    r = requests.get(container, allow_redirects=True, stream=True, timeout=60 * 5)
    filesize = r.headers.get("Content-length")
    if filesize:
        progress.update(task, total=int(filesize))
        progress.start_task(task)

    # Stream download
    for data in r.iter_content(chunk_size=io.DEFAULT_BUFFER_SIZE):
        # Check that the user didn't hit ctrl-c
        if self.kill_with_fire:
            raise KeyboardInterrupt
        progress.update(task, advance=len(data))
        fh.write(data)
At some iteration, progress.update(advance=...) fails with a KeyError. I presume this is because the respective self._tasks[task.id] has already terminated?
A plausible reason could be that the total filesize set for this task via the request header (filesize = r.headers.get("Content-length")) is smaller than what is actually sent later as data. It might be just a few extra bytes, but they would then crash nf-core download?
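If that hypothesis holds, a defensive guard around the progress update would at least avoid the crash. This is only a minimal sketch under that assumption, not the actual nf-core code, and safe_advance is a hypothetical helper:

    from rich.progress import Progress, TaskID

    def safe_advance(progress: Progress, task_id: TaskID, nbytes: int) -> None:
        # Sketch only: updating a task that Rich no longer tracks is what raises
        # the KeyError, so skip the update if the task has already been removed.
        if task_id in progress.task_ids:
            progress.update(task_id, advance=nbytes)

The downside is that this would silently hide the mismatch between Content-length and the bytes actually streamed rather than explain it.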
I did a little additional digging on a branch, commented out one progress logging line that seemed to be triggering the KeyError, and added a few additional debug statements.
This generates a new error:
ERROR [Errno 2] No such file or directory: './singularity_container_images/blobs-sha256-22-22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1-data.img.partial' -> './singularity_container_images/blobs-sha256-22-22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1-data.img' __main__.py:131
In my debug statements it almost looks like this could be a repeated action error. There are two processes that use this image (not sure if that could be relevant).
In my debug statements I can see that the file is created and opened.
DEBUG Opened output file, ./singularity_container_images/blobs-sha256-22-22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1-data.img.partial download.py:1352
It looks like the Singularity and Docker images are found.
DEBUG https://community-cr-prod.seqera.io:443 "GET /docker/registry/v2/blobs/sha256/22/22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1/data HTTP/1.1" 200 825028608 connectionpool.py:546
DEBUG https://community-cr-prod.seqera.io:443 "GET /docker/registry/v2/blobs/sha256/22/22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1/data HTTP/1.1" 200 825028608 connectionpool.py:546
DEBUG Request made for https://community-cr-prod.seqera.io/docker/registry/v2/blobs/sha256/22/22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1/data download.py:1355
DEBUG File size of https://community-cr-prod.seqera.io/docker/registry/v2/blobs/sha256/22/22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1/data is 825028608 download.py:1358
DEBUG Request made for https://community-cr-prod.seqera.io/docker/registry/v2/blobs/sha256/22/22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1/data download.py:1355
DEBUG File size of https://community-cr-prod.seqera.io/docker/registry/v2/blobs/sha256/22/22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01fff1/data is 825028608 download.py:1358
Extra weirdness. I think the final expected file exists and is the correct size, 825028608 == 825028608.
-rw-r--r--. 1 ec2-user ec2-user 825028608 Nov 18 17:06 singularity_container_images/community-cr-prod.seqera.io-docker-registry-v2-blobs-sha256-22-22e054c20192395e0e143df6c36fbed6ce4bd404feba05793aff16819e01ff>
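For illustration, here is a minimal sketch of the race I suspect (hypothetical helper and paths, not the nf-core implementation): two workers stream the same blob to the same .partial path, the first to finish renames it to the final name, and the second worker's rename then raises the FileNotFoundError seen above:

    import os

    def finalize_download(partial_path: str, final_path: str) -> None:
        # Sketch only: tolerate a second worker arriving after the first one has
        # already promoted (or cleaned up) the shared .partial file.
        try:
            os.rename(partial_path, final_path)
        except FileNotFoundError:
            # If another worker already produced the final file, treat it as done.
            if not os.path.exists(final_path):
                raise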
I need to explore this further, but if it is indeed an issue with two processes competing for the same cached blob, the download should work if you additionally provide the argument -d 1 / --parallel-downloads 1.
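An alternative to limiting parallelism would be to queue each unique blob URL only once, so two pipeline processes that resolve to the same cached image never race at all. A rough sketch of that idea (hypothetical, not how nf-core/tools currently schedules downloads):

    from concurrent.futures import ThreadPoolExecutor

    def download_all(urls, fetch, parallel=4):
        # Sketch only: dict.fromkeys() drops duplicate URLs while keeping order,
        # so each blob is fetched exactly once regardless of --parallel-downloads.
        unique_urls = list(dict.fromkeys(urls))
        with ThreadPoolExecutor(max_workers=parallel) as pool:
            futures = [pool.submit(fetch, url) for url in unique_urls]
            for future in futures:
                future.result()  # re-raise any download error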
Description of the bug
Error encountered when downloading the dev branch of fastquorum with the latest nf-core/tools build.
@MatthiasZepper helped with a little digging.
Command used and terminal output
System information
Nextflow version: 24.10.1
Hardware: AWS t2-micro
Executor: NA
OS: Amazon Linux
nf-core/tools version: dev
Python version: 3.12