This project handles long-running tasks and sends notifications to users through different channels (desktop, Android). Tasks execute in the background via a distributed task queue, ensuring they are processed efficiently and asynchronously.
Modern applications often need to perform background operations such as sending notifications, processing large datasets, or handling user-generated tasks. These tasks can be resource-intensive and time-consuming, impacting the application's overall performance if executed in the main thread. This project was built to:
- Offload long-running tasks to background workers.
- Send notifications through multiple channels asynchronously.
- Improve scalability and performance for high-demand applications.
- Demonstrate advanced concepts like distributed task queues and message brokering.
Task Queueing with Celery:
- When a user triggers a task (e.g., via an API request), the task is added to a Redis queue.
- Celery workers continuously monitor this queue and process tasks in the background without blocking the main application.
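For illustration, a minimal sketch of that flow, assuming a FastAPI endpoint and a Celery task wired to a local Redis broker (the endpoint path, task body, and broker URL are placeholders, not the repo's actual code):

```python
from celery import Celery
from fastapi import FastAPI

api = FastAPI()
celery_app = Celery("tasks", broker="redis://localhost:6379/0")

@celery_app.task
def long_running_task(payload: str) -> str:
    # Stand-in for the real long-running work.
    return f"processed {payload}"

@api.post("/tasks")
def trigger_task(payload: str):
    # .delay() serializes the call onto the Redis queue and returns at once;
    # a Celery worker picks the task up in the background.
    result = long_running_task.delay(payload)
    return {"task_id": result.id}
```

The HTTP response returns immediately with the task id, while a worker processes the job off the main thread.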
Notification System:
- The project can send notifications to users through multiple channels, such as desktop notifications and Pushbullet.
- Each task can trigger a notification after execution, allowing users to receive updates on the task's completion.
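One common way to wire this up is Celery's `link` option, which invokes a callback task with the finished task's result; the task bodies below are illustrative stand-ins, not the repo's actual services:

```python
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def long_running_task(data: str) -> str:
    return f"done: {data}"

@app.task
def notify_user(result: str) -> None:
    # Stand-in: the real sender would live in Services/notification.
    print(f"Notification: {result}")

# link= registers notify_user to run with long_running_task's return value.
long_running_task.apply_async(args=["report"], link=notify_user.s())
```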
Task Scheduling:
- The system can schedule tasks at specific intervals (e.g., every minute) using the Schedule library. This enables automation and periodic task execution.
Logging and Monitoring:
- The system uses Python’s built-in `logging` module to log every function call and execution, ensuring transparency and easy debugging.
- Every task is logged with detailed information about success or failure, helping track and troubleshoot issues.
Project Structure:

```
Celery Redis Queue
│   main.py
│   README.md
│   .env
│   .gitignore
│   requirements.txt
│
└───Services
    ├───api
    │   └───api_client.py
    ├───config
    │   └───celery_config.py
    ├───database
    │   └───json_save.py
    ├───logger
    │   └───logger.py
    ├───notification
    │   ├───notification_service.py
    │   └───send_notification.py
    ├───scheduler
    │   └───scheduler.py
    ├───task
    │   ├───long_running_task.py
    │   └───short_running_task.py
    └───task_executor
        └───execute_task.py
```
Asynchronous Task Queue with Celery:
- Celery allows the system to handle multiple tasks concurrently, distributing work across workers and improving the system's efficiency. Tasks are processed in the background, enabling the main application to remain responsive.
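As a sketch of that fan-out, Celery's `group` primitive queues several task signatures at once so available workers consume them in parallel (the app name, broker URL, and task body are assumptions for illustration):

```python
from celery import Celery, group

app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",  # a result backend is needed for .get()
)

@app.task
def crunch(n: int) -> int:
    return n * n  # stand-in for real work

job = group(crunch.s(i) for i in range(5))  # five independent task signatures
result = job.apply_async()                  # all queued at once, caller never blocks
print(result.get(timeout=30))               # [0, 1, 4, 9, 16] once workers finish
```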
Redis as a Message Broker:
- Redis is used as the message broker for Celery. It queues tasks, ensuring they are handled properly even under heavy load or if the system experiences temporary outages.
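For reference, a minimal sketch of a Celery app pointed at Redis; the URLs and app name are assumptions, while the repo's actual configuration lives in Services/config/celery_config.py:

```python
from celery import Celery

# Redis database 0 brokers (queues) the tasks; database 1 optionally stores results.
app = Celery(
    "celery_redis_queue",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)
```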
Task Scheduling:
- The system uses Schedule to automate task execution at fixed intervals (e.g., running tasks every minute). This is useful for periodic maintenance tasks or sending recurring notifications.
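A minimal sketch with the `schedule` package; the job body is a placeholder for whatever main.py actually enqueues:

```python
import time

import schedule

def enqueue_job() -> None:
    # Placeholder: here the project would enqueue a Celery task.
    print("job enqueued")

schedule.every(1).minutes.do(enqueue_job)  # register a job to run every minute

while True:
    schedule.run_pending()  # fire any job whose interval has elapsed
    time.sleep(1)
```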
Multi-channel Notifications:
- The notification system is built to support multiple channels (desktop, Pushbullet). This flexibility allows the system to adapt to different user preferences and use cases.
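A hedged sketch of a channel dispatcher; `plyer` (desktop) and the `pushbullet.py` package are common choices for these two channels, though the repo's actual helpers in Services/notification may differ:

```python
import os

from plyer import notification      # cross-platform desktop notifications
from pushbullet import Pushbullet   # from the pushbullet.py package

def send_notification(channel: str, title: str, body: str) -> None:
    if channel == "desktop":
        notification.notify(title=title, message=body)
    elif channel == "pushbullet":
        pb = Pushbullet(os.environ["PUSHBULLET_TOKEN"])
        pb.push_note(title, body)
    else:
        raise ValueError(f"Unknown channel: {channel}")

send_notification("desktop", "Task finished", "Your long-running task completed.")
```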
Logging with Decorators:
- A logging decorator is applied to functions to automatically log when they are called, their success, or any errors. This improves debugging and monitoring without cluttering the business logic.
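A minimal sketch of such a decorator (the repo's version in Services/logger may differ in details):

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def log_calls(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        logger.info("Calling %s", func.__name__)
        try:
            result = func(*args, **kwargs)
            logger.info("%s succeeded", func.__name__)
            return result
        except Exception:
            logger.exception("%s failed", func.__name__)
            raise
    return wrapper

@log_calls
def short_task() -> str:
    return "done"
```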
Environment Variable Management:
- The dotenv package is used to securely manage API tokens and other sensitive information, ensuring they are not hard-coded into the application.
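For illustration, loading the token with `python-dotenv`; the variable name matches the `.env` example in the setup steps below:

```python
import os

from dotenv import load_dotenv

load_dotenv()  # read key=value pairs from .env into the process environment
token = os.getenv("PUSHBULLET_TOKEN")
if token is None:
    raise RuntimeError("PUSHBULLET_TOKEN is not set; add it to your .env file")
```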
Prerequisites:
- Python 3.9+
- Redis installed and running locally
- Pushbullet API key (if Pushbullet notifications are required)
Setup:

Clone the repository:

```
git clone https://github.com/AnamolZ/celery_redis_queue.git
cd celery_redis_queue
```
Create and activate a virtual environment:

```
python -m venv venv
venv\Scripts\activate
```

(On macOS/Linux, activate with `source venv/bin/activate` instead.)
Install dependencies:

```
pip install -r requirements.txt
```
Set up environment variables: Create a `.env` file in the root directory:

```
PUSHBULLET_TOKEN=<Your_Pushbullet_Token>
```
Start Redis (if not already running):

```
redis-server
```
Run the Celery worker: In a new terminal, navigate to the project directory and run:

```
celery -A Services.config.celery_config.app worker --loglevel=info
```
Start the FastAPI server:

```
uvicorn main:app --host 127.0.0.1 --port 8000 --reload
```
Schedule tasks: In another terminal, run the task scheduler:

```
python main.py
```
Contributing:

If you have ideas for new features, improvements, or advanced system integrations, feel free to contribute. Whether it's enhancing task management, adding new notification channels, or optimizing performance, your expertise is valued.