I have hundreds of jobs to run, but my server only has enough resources to run 5 jobs at a time.
It would be nice to create a queue of jobs where 5 are always running.
For context, it's all the same task/logic, just for a different input file.
How do I solve my problem right now?
I have a CSV file that keeps track of which files have and haven't been processed. I run a notebook that fetches 5 unprocessed files and feeds them through a series of multiprocessing Pools. It works, but I don't want to wrap the entire queue of jobs in a while loop, because a single failed job could stall everything behind it. So I have to check back every few hours, and I miss the chance to launch jobs while I sleep.
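A minimal sketch of one way to do this with the standard library: `ProcessPoolExecutor(max_workers=5)` keeps at most 5 jobs running and starts the next one as soon as a slot frees up, and a per-job `try/except` means one failure doesn't stop the rest. The `process_file` function, the `queue.csv` filename, and its column names are assumptions standing in for your own task and tracking file.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed
import csv

def process_file(path):
    # Placeholder for your per-file task; raise an exception on failure.
    return path

def main():
    # Hypothetical CSV with columns: path, processed
    with open("queue.csv", newline="") as f:
        rows = list(csv.DictReader(f))
    pending = [r["path"] for r in rows if r["processed"] != "yes"]

    # max_workers=5 caps concurrency at 5; the executor pulls the next
    # pending file automatically whenever a worker finishes.
    with ProcessPoolExecutor(max_workers=5) as executor:
        futures = {executor.submit(process_file, p): p for p in pending}
        for fut in as_completed(futures):
            path = futures[fut]
            try:
                fut.result()
                print(f"done: {path}")  # mark as processed in your CSV here
            except Exception as exc:
                # A failed job is logged and skipped; the queue keeps running.
                print(f"failed: {path}: {exc}")

if __name__ == "__main__":
    main()
```

Run once over the full list and it will chew through all files unattended, leaving failures recorded for a later retry.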
Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! 🤗
If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template, as it helps other community members to contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! 👋