
Does crawlee-python have a task distribution or management system like scrapyd and scrapyd-web? #679

Answered by vdusek
QThans asked this question in Q&A

Crawlee does not include native support for "daemonization" or task distribution, but you can integrate it with other libraries that provide these features. For instance, to run a Crawlee project as a daemon, you can embed it in a web framework like FastAPI and communicate with it over HTTP. If you need distributed task management, Celery could be an option; it handles task scheduling, queuing, and scaling. Both FastAPI and Celery are free, open-source tools that you can integrate with Crawlee.
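
As a minimal sketch of the FastAPI approach, the snippet below wraps a one-off crawl in an HTTP endpoint. The `/crawl` route, the request payload shape, and the import paths (which follow the crawlee-python layout at the time of writing) are assumptions for illustration, not an official API; a real daemon would also need job tracking and storage configuration, which this omits.

```python
# Sketch: a Crawlee crawl exposed as a FastAPI endpoint.
# Run with: uvicorn server:app  (assuming this file is server.py)
from fastapi import FastAPI
from pydantic import BaseModel

from crawlee.beautifulsoup_crawler import BeautifulSoupCrawler, BeautifulSoupCrawlingContext

app = FastAPI()


class CrawlRequest(BaseModel):
    start_urls: list[str]  # URLs to seed the crawl with


@app.post('/crawl')
async def crawl(req: CrawlRequest) -> dict:
    crawler = BeautifulSoupCrawler(max_requests_per_crawl=50)

    @crawler.router.default_handler
    async def handler(context: BeautifulSoupCrawlingContext) -> None:
        # Store the page title for each crawled URL.
        title = context.soup.title.string if context.soup.title else None
        await context.push_data({'url': context.request.url, 'title': title})

    await crawler.run(req.start_urls)
    data = await crawler.get_data()  # read back the default dataset
    return {'items': data.items}
```

With the server running, a `POST /crawl` request carrying `{"start_urls": ["https://crawlee.dev"]}` would trigger a crawl and return the scraped items in the response.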
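
For the Celery route, a similarly hedged sketch: the broker URL, the task name, and the module layout are placeholders. Celery task bodies are synchronous, so the async crawler is driven with `asyncio.run()` inside the task; a worker started with `celery -A tasks worker` would then pick up crawl jobs from the queue.

```python
# Sketch: dispatching Crawlee crawls to Celery workers (tasks.py).
# Assumes a Redis broker/result backend at redis://localhost:6379/0.
import asyncio

from celery import Celery
from crawlee.beautifulsoup_crawler import BeautifulSoupCrawler, BeautifulSoupCrawlingContext

celery_app = Celery(
    'crawls',
    broker='redis://localhost:6379/0',
    backend='redis://localhost:6379/0',
)


async def _crawl(start_urls: list[str]) -> list[dict]:
    crawler = BeautifulSoupCrawler(max_requests_per_crawl=50)

    @crawler.router.default_handler
    async def handler(context: BeautifulSoupCrawlingContext) -> None:
        await context.push_data({'url': context.request.url})

    await crawler.run(start_urls)
    data = await crawler.get_data()
    return data.items


@celery_app.task(name='run_crawl')
def run_crawl(start_urls: list[str]) -> list[dict]:
    # Celery tasks are sync; run the async crawl to completion here.
    return asyncio.run(_crawl(start_urls))
```

Calling `run_crawl.delay(['https://crawlee.dev'])` from any process connected to the same broker queues the crawl; Celery then takes care of scheduling, retries, and scaling across workers.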

If you’re interested in hosting your crawlers on a managed platform, you can use Apify. Check out our Deploy to Apify guide for more details.


This discussion was converted from issue #677 on November 11, 2024 11:58.