Per-project job limit #140
Conversation
…_jobs_per_project
Hi, what is your use case for this feature? By the way, try running the tests locally before opening a PR.
Hello Digenis. I want to use Scrapyd in a production environment with a lot of spider projects. Some of them run only occasionally (monthly) but take about 3 days to complete all of their jobs, roughly 500 jobs in total. So I don't want the other projects' jobs to be blocked when such a project starts. I found other users that need this kind of feature too, like this one: I will work on fixing the tests today, if I have time, and push the code to this branch.
I will need someone who's been more involved in the poller/launcher to review this when ready.
Sorry about the mess in the Travis history. I fixed the unit test with the mock module (https://pypi.python.org/pypi/mock). I am looking into how to add this egg to the Travis build.
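For context, mock lets a unit test replace a collaborator with a stub so the logic under test can be exercised in isolation. A minimal sketch of the pattern, with purely illustrative names (poll_queue is not the actual test from this branch):

```python
from unittest import mock  # at the time of this PR, the standalone "mock" package from PyPI


def poll_queue(queue):
    # Illustrative function under test: return the next pending message, if any.
    return queue.pop() if queue.has_pending() else None


def test_poll_queue_returns_pending_message():
    queue = mock.Mock()
    queue.has_pending.return_value = True
    queue.pop.return_value = {"spider": "somespider"}

    assert poll_queue(queue) == {"spider": "somespider"}
    queue.pop.assert_called_once_with()
```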
# Conflicts:
#	scrapyd/poller.py
schedules in postgres; this behavior can be enabled through the 'enable_postgres_persist = true' setting. This will make it possible to reprocess jobs.
start and end of a run.
request_count) in the database at the end of a run. In addition, parameters were added to the configuration file for the S3 buckets used to store the log file and the items.
…with a postgres database.
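Taken together, the commit messages above describe new configuration knobs for postgres persistence and S3 storage. A hedged sketch of what such a scrapyd.conf section could look like; only enable_postgres_persist is named in the commits, the remaining option names are assumptions for illustration:

```ini
[scrapyd]
# Persist schedules and run metadata (start/end time, request_count) to postgres,
# making it possible to reprocess jobs later. Named in the commit messages above.
enable_postgres_persist = true

# Illustrative names only: the commits mention S3 buckets for the log file and
# the items, but do not show the exact option names used in this branch.
logs_bucket  = my-scrapyd-logs
items_bucket = my-scrapyd-items
```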
This introduces postgres and rabbitmq as dependencies and will increase technical debt. Also, some of the things added here were already done in a simpler way using sqlite in #359, which was merged. So in this form this PR cannot be merged. Ideally we should just allow configurable pollers: right now we load the QueuePoller class by default, but we could make it possible for people to write any sort of complex poller themselves. The same goes for the scheduler. I think Scrapyd should stay basic and simple, but should provide building blocks to extend it with the functionality you need. The functionality desired in this PR could be added as a custom extension of a specific Scrapyd project, and Scrapyd should just make it easy to integrate by making all core components configurable.
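A rough sketch of the extension point being suggested here; the `poller` option and the load_component helper are hypothetical illustrations, not existing Scrapyd configuration:

```python
from importlib import import_module

DEFAULT_POLLER = "scrapyd.poller.QueuePoller"  # the class the comment says is loaded by default


def load_component(dotted_path):
    # Resolve a dotted path such as "myproject.poller.ProjectLimitPoller" to a class.
    module_path, _, class_name = dotted_path.rpartition(".")
    return getattr(import_module(module_path), class_name)


def make_poller(config):
    # Hypothetical wiring: read an (assumed) "poller" option from scrapyd.conf and
    # fall back to the built-in QueuePoller, so users can plug in their own poller
    # without patching Scrapyd itself.
    poller_cls = load_component(config.get("poller", DEFAULT_POLLER))
    return poller_cls(config)
```

The same pattern would apply to the scheduler and the other core components mentioned in the comment.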
I developed this new feature to allow limiting the maximum number of jobs per project. Please check whether it is interesting.
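To make the idea concrete, here is a minimal, hedged sketch of the kind of check a per-project job limit implies; the names can_start_job, running_jobs_by_project and max_jobs_per_project are illustrative and not necessarily the identifiers used in this branch:

```python
from collections import Counter


def can_start_job(project, running_jobs_by_project, max_jobs_per_project):
    """Return True if `project` may start another job.

    A limit of 0 (or less) is treated as "no limit" in this sketch.
    """
    if max_jobs_per_project <= 0:
        return True
    return running_jobs_by_project[project] < max_jobs_per_project


# Example: with a limit of 2, a long monthly crawl with ~500 queued jobs cannot
# monopolise the available slots, so other projects keep getting scheduled.
running = Counter({"monthly_crawl": 2, "daily_news": 1})
assert not can_start_job("monthly_crawl", running, max_jobs_per_project=2)
assert can_start_job("daily_news", running, max_jobs_per_project=2)
```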