- Use the official installer for Poetry:
curl -sSL https://install.python-poetry.org | python3 -
- On Windows (using PowerShell):
(Invoke-WebRequest -Uri https://install.python-poetry.org -UseBasicParsing).Content | py -
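- To verify that Poetry was installed correctly (prints the installed version):
poetry --version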
- To install all dependencies for an already created project:
poetry install --no-root
- To initialize a new Poetry project:
poetry init
- To add a new dependency:
poetry add <package-name>
- To remove a dependency:
poetry remove <package-name>
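- Running poetry add records the package in pyproject.toml. As a rough illustration (the package versions below are made up, not taken from this repo), the dependencies section might look like:

```toml
[tool.poetry.dependencies]
python = "^3.11"
fastapi = { extras = ["standard"], version = "^0.115.0" }
uvicorn = "^0.30.0"
```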
- Add FastAPI with standard dependencies:
poetry add "fastapi[standard]"
- Add Uvicorn (ASGI server):
poetry add uvicorn
- Use poetry run to execute Python scripts:
poetry run python <relative-path-to-file>
- Example:
poetry run python 07-llm-and-prompt-engineering/01_gemeni_llm.py
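- The exact contents of that script are specific to this repo; as a rough sketch, a minimal Gemini call using the google-generativeai package (assuming a GOOGLE_API_KEY environment variable is set) might look like:

```python
import os

import google.generativeai as genai

# Configure the client with an API key from the environment (assumed setup).
genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# Create a model handle and send a single prompt.
model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content("Explain prompt engineering in one sentence.")
print(response.text)
```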
- Use poetry run with Uvicorn to serve FastAPI applications:
poetry run uvicorn <relative-path-to-module>:app --reload
- Replace / with . in the path, and change .py to :app.
- Example:
poetry run uvicorn 09-langgraph.websocket-agent.ws_agent_server_gemini:app --reload
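- For reference, a minimal module exposing the app object that Uvicorn serves might look like the sketch below. This is a hypothetical example, not the actual ws_agent_server_gemini module; it includes a simple WebSocket echo endpoint since the example above is a WebSocket agent server:

```python
from fastapi import FastAPI, WebSocket

app = FastAPI()


@app.get("/")
async def root():
    # Simple health-check style endpoint.
    return {"status": "ok"}


@app.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    # Accept the connection and echo each message back to the client.
    await websocket.accept()
    while True:
        message = await websocket.receive_text()
        await websocket.send_text(f"echo: {message}")
```

Serve it with poetry run uvicorn <module-path>:app --reload as shown above.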
- Python Foundations
  - Overview of Python for AI development
  - Essential libraries and best practices
- AI Theory & Terminologies
  - Key AI concepts and definitions
  - Understanding machine learning, deep learning, and reinforcement learning
  - Ethics and bias in AI
- FastAPI
  - Introduction to FastAPI for building APIs
  - Designing, implementing, and testing AI-powered APIs
- Databases
  - SQL Databases: Basics and advanced queries
  - NoSQL Databases: Understanding document, key-value, and graph databases
- Third-Party Libraries
  - NumPy: Numerical computing
  - Pandas: Data manipulation and analysis
  - OpenCV: Computer vision basics and applications
- Model Development Lifecycle
  - Model building and training using Keras
  - Data preprocessing, validation, and evaluation
  - Deploying AI models
- Large Language Models (LLMs)
  - Overview of LLMs: Gemini, OpenAI, and Ollama
  - Selecting the right LLM for applications
- LLM Framework: LangChain
  - Building applications with LangChain
  - Advanced techniques for chaining and managing LLMs
- Agentic Framework: LangGraph
  - Overview and integration with LangChain
  - Developing agentic systems for AI workflows
- Agentic Framework: CrewAI
  - Overview and integration of multiple AI agents
  - Developing agentic systems for AI workflows
- Cloud Computing & DevOps
  - Docker: Containerizing AI applications
  - Kubernetes: Orchestrating containers at scale
  - Managing deployments in cloud-native environments
- Frontend Development with Next.js
  - Designing chatbot UIs
  - Building interactive agent frontends for a seamless user experience
This structured course ensures a comprehensive journey from fundamental concepts to advanced cloud-native AI development, emphasizing both backend and frontend technologies.