Merged
5 changes: 4 additions & 1 deletion .gitignore
@@ -86,4 +86,7 @@ conductor_ports.lock
line_item_*_response.json

# Test database files (PostgreSQL test databases)
test_*
# Only ignore database files/dirs, not test_*.py files
test_*.db
test_*.db-*
test_*[0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f]/
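
These patterns can be sanity-checked with `git check-ignore` in a throwaway repository (this assumes `git` is on `PATH`; the file names are made up for illustration):

```shell
# Verify that database artifacts are ignored but test_*.py files are not
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
cat > .gitignore <<'EOF'
test_*.db
test_*.db-*
test_*[0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f]/
EOF
git check-ignore -q test_orders.db       # database file: ignored (exit 0)
git check-ignore -q test_orders.db-wal   # sidecar file: ignored (exit 0)
! git check-ignore -q test_models.py     # Python test file: NOT ignored
echo "patterns behave as intended"
```

The trailing slash on the third pattern restricts it to directories, which is what keeps `test_*.py` files safe from the hex-suffix rule.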
71 changes: 45 additions & 26 deletions CLAUDE.md
@@ -2,16 +2,30 @@

## 🚨 CRITICAL ARCHITECTURE PATTERNS

### Database JSON Fields Pattern (SQLAlchemy 2.0)
**🚨 MANDATORY**: All JSON columns MUST use `JSONType` for cross-database compatibility.
### PostgreSQL-Only Architecture
**🚨 DECISION**: This codebase uses PostgreSQL exclusively. No SQLite support.

**Why:**
- Production uses PostgreSQL exclusively
- SQLite hides bugs (different JSONB behavior, no connection pooling, single-threaded)
- "No fallbacks - if it's in our control, make it work" (core principle)
- One database. One source of truth. No hidden bugs.

**What this means:**
- All tests require PostgreSQL container (run via `./run_all_tests.sh ci`)
- `db_config.py` only supports PostgreSQL connections
- Unit tests should mock database access, not use real connections
- Integration tests require `ADCP_TEST_DB_URL`, or they will be skipped

**Migration note:** We removed SQLite support to eliminate cross-database bugs. If you see SQLite references in old docs/code, they're outdated.
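
A minimal sketch of the fail-fast behavior this implies. `require_postgres_url` is a hypothetical helper, not the real `db_config.py` API, and the accepted URL prefixes are assumptions:

```python
import os


def require_postgres_url(env_var: str = "DATABASE_URL") -> str:
    """Fail fast on anything that is not a PostgreSQL connection string."""
    url = os.environ.get(env_var, "")
    # Reject SQLite/MySQL/empty URLs up front instead of falling back
    if not url.startswith(("postgresql://", "postgresql+psycopg2://")):
        raise ValueError(f"Only PostgreSQL is supported, got: {url!r}")
    return url
```

This mirrors the "no fallbacks" principle: a misconfigured environment errors immediately rather than silently running against the wrong database.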

---

**The Problem:**
- SQLite stores JSON as text strings → requires manual `json.loads()`
- PostgreSQL uses native JSONB → returns Python dicts/lists automatically
- This inconsistency causes bugs (e.g., iterating over strings character-by-character)
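
The character-by-character iteration bug is easy to reproduce with the stdlib `sqlite3` module (PostgreSQL's JSONB, by contrast, deserializes to Python objects at the driver level):

```python
import json
import sqlite3

# SQLite returns JSON as a plain string, so naive iteration
# walks characters instead of list items.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (payload TEXT)")
conn.execute("INSERT INTO t VALUES (?)", (json.dumps(["alpha", "beta"]),))
raw = conn.execute("SELECT payload FROM t").fetchone()[0]

assert isinstance(raw, str)                   # text, not a list
assert next(iter(raw)) == "["                 # iterating yields characters
assert json.loads(raw) == ["alpha", "beta"]   # manual decode required
```

Code written against SQLite must remember the `json.loads()` step; code written against PostgreSQL must not double-decode. Supporting both is exactly the inconsistency this pattern eliminates.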
### Database JSON Fields Pattern (SQLAlchemy 2.0)
**🚨 MANDATORY**: All JSON columns MUST use `JSONType` for PostgreSQL JSONB handling.

**The Solution: JSONType**
We have a custom `JSONType` TypeDecorator that handles this automatically:
We have a custom `JSONType` TypeDecorator for PostgreSQL JSONB:

```python
# ✅ CORRECT - Use JSONType for ALL JSON columns
@@ -165,7 +179,7 @@ def create_media_buy_raw(promoted_offering: str, ...) -> CreateMediaBuyResponse:
**Your Deployment**: You can host this anywhere that supports:
- Docker containers (recommended)
- Python 3.11+
- PostgreSQL (production) or SQLite (dev/testing)
- PostgreSQL (production and testing)
- We'll support your deployment approach as best we can

### Git Workflow - MANDATORY (Reference Implementation)
@@ -386,13 +400,16 @@ docker-compose down

### Testing
```bash
# Run all tests
uv run pytest
# Recommended: Full test suite with PostgreSQL (matches CI/production)
./run_all_tests.sh ci

# Fast iteration during development (skips database tests)
./run_all_tests.sh quick

# By category
uv run pytest tests/unit/
uv run pytest tests/integration/
uv run pytest tests/e2e/
# Manual pytest commands
uv run pytest tests/unit/ # Unit tests only
uv run pytest tests/integration/ # Integration tests (needs database)
uv run pytest tests/e2e/ # E2E tests (needs database)

# With coverage
uv run pytest --cov=. --cov-report=html
@@ -593,31 +610,33 @@ If the hook isn't installed or you want to update it:

**Test Modes:**

**CI Mode (runs automatically on push):**
**CI Mode (DEFAULT - runs automatically on push):**
- Starts PostgreSQL container automatically (postgres:15)
- Runs ALL tests including database-dependent tests
- Exactly matches GitHub Actions environment
- Exactly matches GitHub Actions and production environment
- Catches database issues before CI does
- Automatically cleans up container
- ~3-5 minutes

**Quick Mode (for fast development iteration):**
- Fast validation: unit tests + integration tests (no database)
- Skips database-dependent tests
- Skips database-dependent tests (marked with `@pytest.mark.requires_db`)
- Good for rapid testing during development
- Run manually: `./run_all_tests.sh quick`

**Full Mode (comprehensive, no Docker):**
- All tests with SQLite instead of PostgreSQL
- Good for development without Docker
- Run manually: `./run_all_tests.sh full`
- ~1 minute

**Command Reference:**
```bash
./run_all_tests.sh ci # Like CI (PostgreSQL container) - USE THIS!
./run_all_tests.sh quick # Fast pre-push validation (automatic)
./run_all_tests.sh full # Full suite (SQLite, no Docker)
./run_all_tests.sh # CI mode (default) - PostgreSQL container
./run_all_tests.sh ci # CI mode (explicit) - USE THIS before pushing!
./run_all_tests.sh quick # Quick mode - fast iteration
```

**Why PostgreSQL-only?**
- Production uses PostgreSQL exclusively
- SQLite hides bugs (different JSONB behavior, no connection pooling, single-threaded)
- "No fallbacks - if it's in our control, make it work" (core principle)
- One database. One source of truth. No hidden bugs.

See `docs/testing/` for detailed patterns and case studies.
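
The `@pytest.mark.requires_db` marker mentioned above could be defined along these lines; this is a hypothetical `conftest.py` sketch, and the real project may define it differently:

```python
import os

import pytest

# Skip database-dependent tests unless a PostgreSQL test DB is configured
requires_db = pytest.mark.skipif(
    "ADCP_TEST_DB_URL" not in os.environ,
    reason="needs a PostgreSQL test database (set ADCP_TEST_DB_URL)",
)


@requires_db
def test_media_buy_roundtrip():
    """Hypothetical integration test: only runs when the test DB exists."""
    ...
```

In quick mode the marker causes these tests to skip cleanly; in ci mode the container provides the database and they all run.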

## Pre-Commit Hooks
Expand Down
71 changes: 11 additions & 60 deletions run_all_tests.sh
@@ -20,13 +20,9 @@ BLUE='\033[0;34m'
NC='\033[0m' # No Color

# Determine test mode
MODE=${1:-full} # Default to full if no argument
USE_DOCKER=${USE_DOCKER:-false} # Set USE_DOCKER=true to run with PostgreSQL
MODE=${1:-ci} # Default to ci if no argument

echo "🧪 Running tests in '$MODE' mode..."
if [ "$USE_DOCKER" = "true" ]; then
echo -e "${BLUE}🐳 Docker mode enabled - using PostgreSQL container${NC}"
fi
echo ""
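
The new default relies on shell default-value expansion: `${1:-ci}` substitutes `ci` whenever the first positional argument is unset or empty. A minimal illustration:

```shell
set -e
# ${1:-ci} inside the function sees the function's own $1
mode_of() { echo "${1:-ci}"; }

[ "$(mode_of)" = "ci" ]          # no argument -> default mode
[ "$(mode_of quick)" = "quick" ] # explicit argument wins
echo ok
```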

# Docker setup function (like CI does)
@@ -197,70 +193,25 @@ if [ "$MODE" == "ci" ]; then
exit 0
fi

# Full mode: all tests
if [ "$MODE" == "full" ]; then
echo "📦 Step 1/4: Validating imports..."

# Check all critical imports
if ! uv run python -c "from src.core.tools import get_products_raw, create_media_buy_raw, get_media_buy_delivery_raw, sync_creatives_raw, list_creatives_raw, list_creative_formats_raw, list_authorized_properties_raw" 2>/dev/null; then
echo -e "${RED}❌ Import validation failed!${NC}"
exit 1
fi

if ! uv run python -c "from src.core.main import _get_products_impl, _create_media_buy_impl, _get_media_buy_delivery_impl, _sync_creatives_impl, _list_creatives_impl, _list_creative_formats_impl, _list_authorized_properties_impl" 2>/dev/null; then
echo -e "${RED}❌ Import validation failed!${NC}"
exit 1
fi

echo -e "${GREEN}✅ Imports validated${NC}"
echo ""

echo "🧪 Step 2/4: Running unit tests..."
if ! uv run pytest tests/unit/ -x --tb=short; then
echo -e "${RED}❌ Unit tests failed!${NC}"
exit 1
fi
echo -e "${GREEN}✅ Unit tests passed${NC}"
echo ""

echo "🔗 Step 3/4: Running integration tests..."
if ! uv run pytest tests/integration/ -x --tb=short; then
echo -e "${RED}❌ Integration tests failed!${NC}"
exit 1
fi
echo -e "${GREEN}✅ Integration tests passed${NC}"
echo ""

echo "🌍 Step 4/4: Running e2e tests..."
if ! uv run pytest tests/e2e/ -x --tb=short; then
echo -e "${RED}❌ E2E tests failed!${NC}"
exit 1
fi
echo -e "${GREEN}✅ E2E tests passed${NC}"
echo ""

echo -e "${GREEN}✅ All tests passed!${NC}"
exit 0
fi

# Unknown mode
echo -e "${RED}❌ Unknown test mode: $MODE${NC}"
echo ""
echo "Usage: ./run_all_tests.sh [quick|ci|full]"
echo "Usage: ./run_all_tests.sh [quick|ci]"
echo ""
echo "Modes:"
echo " quick - Unit tests + integration tests (no database)"
echo " Fast validation for pre-push hook (~1 min)"
echo " Fast validation for rapid iteration (~1 min)"
echo " Skips database-dependent tests"
echo ""
echo " ci - Like GitHub Actions: PostgreSQL container + all tests"
echo " ci - Full test suite with PostgreSQL (DEFAULT)"
echo " Runs unit + integration + e2e with real database (~3-5 min)"
echo " Automatically starts/stops PostgreSQL container"
echo ""
echo " full - All tests with SQLite (no container needed)"
echo " Unit + integration + e2e tests (~3-5 min)"
echo " Matches production environment and GitHub Actions"
echo ""
echo "Examples:"
echo " ./run_all_tests.sh quick # Fast pre-push validation"
echo " ./run_all_tests.sh ci # Test like CI does (with PostgreSQL)"
echo " ./run_all_tests.sh full # Full test suite (SQLite)"
echo " ./run_all_tests.sh # Run CI mode (default, recommended)"
echo " ./run_all_tests.sh quick # Fast iteration during development"
echo " ./run_all_tests.sh ci # Explicit CI mode (same as default)"
echo ""
echo "💡 Tip: Use 'quick' for rapid development, 'ci' before pushing to catch all bugs"
exit 1
110 changes: 74 additions & 36 deletions src/core/database/database_session.py
@@ -5,6 +5,8 @@
management across the entire application.
"""

import logging
import os
from collections.abc import Generator
from contextlib import contextmanager
from typing import Any
@@ -15,41 +17,73 @@

from src.core.database.db_config import DatabaseConfig

# Create engine and session factory with production-ready settings
connection_string = DatabaseConfig.get_connection_string()

# Configure engine with appropriate pooling and retry settings
if "postgresql" in connection_string:
# PostgreSQL production settings
engine = create_engine(
connection_string,
pool_size=10, # Base connections in pool
max_overflow=20, # Additional connections beyond pool_size
pool_timeout=30, # Seconds to wait for connection
pool_recycle=3600, # Recycle connections after 1 hour
pool_pre_ping=True, # Test connections before use
echo=False, # Set to True for SQL logging in debug
)
else:
# SQLite settings (development) - SQLite doesn't support all pool options
engine = create_engine(
connection_string,
pool_pre_ping=True,
echo=False,
)

SessionLocal = sessionmaker(bind=engine)
import logging

logger = logging.getLogger(__name__)

# Thread-safe session factory
db_session = scoped_session(SessionLocal)
# Module-level globals for lazy initialization
_engine = None
_session_factory = None
_scoped_session = None


def get_engine():
"""Get the current database engine."""
return engine
"""Get or create the database engine (lazy initialization)."""
global _engine, _session_factory, _scoped_session

if _engine is None:
# In test mode without DATABASE_URL, we should NOT create a real connection
# Unit tests should mock database access, not use real connections
if os.environ.get("ADCP_TESTING") and not os.environ.get("DATABASE_URL"):
raise RuntimeError(
"Unit tests should not create real database connections. "
"Either mock get_db_session() or set DATABASE_URL for integration tests. "
"Use @pytest.mark.requires_db for integration tests."
)

# Get connection string from config
connection_string = DatabaseConfig.get_connection_string()

if "postgresql" not in connection_string:
raise ValueError("Only PostgreSQL is supported. Use DATABASE_URL=postgresql://...")

# Create engine with production-ready settings
_engine = create_engine(
connection_string,
pool_size=10, # Base connections in pool
max_overflow=20, # Additional connections beyond pool_size
pool_timeout=30, # Seconds to wait for connection
pool_recycle=3600, # Recycle connections after 1 hour
pool_pre_ping=True, # Test connections before use
echo=False, # Set to True for SQL logging in debug
connect_args={"connect_timeout": 5} if os.environ.get("ADCP_TESTING") else {}, # Fast timeout for tests
)

# Create session factory
_session_factory = sessionmaker(bind=_engine)
_scoped_session = scoped_session(_session_factory)

return _engine


def reset_engine():
"""Reset engine for testing - closes existing connections and clears global state."""
global _engine, _session_factory, _scoped_session

if _scoped_session is not None:
_scoped_session.remove()
_scoped_session = None

if _engine is not None:
_engine.dispose()
_engine = None

_session_factory = None


def get_scoped_session():
"""Get the scoped session factory (lazy initialization)."""
# Calling get_engine() ensures all globals are initialized
get_engine()
return _scoped_session


@contextmanager
Expand All @@ -67,22 +101,23 @@ def get_db_session() -> Generator[Session, None, None]:
The session will automatically rollback on exception and
always be properly closed. Connection errors are logged with more detail.
"""
session = db_session()
scoped = get_scoped_session()
session = scoped()
try:
yield session
except (OperationalError, DisconnectionError) as e:
logger.error(f"Database connection error: {e}")
session.rollback()
# Remove session from registry to force reconnection
db_session.remove()
scoped.remove()
raise
except SQLAlchemyError as e:
logger.error(f"Database error: {e}")
session.rollback()
raise
finally:
session.close()
db_session.remove()
scoped.remove()
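
The rollback-then-close discipline in `get_db_session` can be isolated into a generic sketch. `FakeSession` here is a hypothetical stand-in used only to show the control flow, not the project's API:

```python
from contextlib import contextmanager


class FakeSession:
    """Stand-in for a SQLAlchemy session, recording what was called."""

    def __init__(self):
        self.rolled_back = False
        self.closed = False

    def rollback(self):
        self.rolled_back = True

    def close(self):
        self.closed = True


@contextmanager
def session_scope(session):
    try:
        yield session
    except Exception:
        session.rollback()  # undo partial work on any error
        raise               # re-raise so callers still see the failure
    finally:
        session.close()     # always release the connection
```

The `finally` block guarantees the connection returns to the pool on both paths, which is what keeps a long-running server from leaking connections after errors.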


def execute_with_retry(func, max_retries: int = 3, retry_on: tuple = (OperationalError, DisconnectionError)) -> Any:
@@ -115,7 +150,8 @@ def execute_with_retry(func, max_retries: int = 3, retry_on: tuple = (Operationa
wait_time = 0.5 * (2**attempt)
logger.info(f"Waiting {wait_time}s before retry...")
time.sleep(wait_time)
db_session.remove() # Clear the session registry
scoped = get_scoped_session()
scoped.remove() # Clear the session registry
continue
raise
except SQLAlchemyError as e:
Expand All @@ -142,7 +178,8 @@ def __init__(self):
def session(self) -> Session:
"""Get or create a session."""
if self._session is None:
self._session = db_session()
scoped = get_scoped_session()
self._session = scoped()
return self._session

def commit(self):
Expand All @@ -163,7 +200,8 @@ def close(self):
"""Close and cleanup the session."""
if self._session:
self._session.close()
db_session.remove()
scoped = get_scoped_session()
scoped.remove()
self._session = None

def __enter__(self):