
Commit 0b69bee

Lint and fix issues in example projects
- Add test file pattern to ruff ignore rules for S101
- Fix logging issues in web scraper example (use module logger)
- Fix try/except pattern in web scraper cache_page method
- Auto-format code with ruff (import sorting, blank lines)
- Fix markdown formatting issues (code block languages, list spacing)

Co-authored-by: William Easton <strawgate@users.noreply.github.com>
1 parent 6b94cb8 commit 0b69bee
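
Two of the fixes listed above are general Python idioms worth spelling out: a module-level logger obtained via `logging.getLogger(__name__)` instead of calls on the root `logging` module, and a `try`/`except`/`else` layout where the success path returns from the `else` block. The sketch below is illustrative only (the names are made up); the project's actual change is in the `examples/web_scraper_cache/scraper.py` diff further down.

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)  # module-level logger, not the root logger


def cache_value(store: dict[str, str], key: str, value: str) -> bool:
    """Illustrative stand-in for the scraper's cache_page method."""
    try:
        store[key] = value  # stand-in for the real async put() call
    except Exception:
        # logger.exception records the traceback automatically
        logger.exception("Failed to cache %s", key)
        return False
    else:
        # Returning from `else` keeps the happy path out of the try block,
        # which is what ruff's TRY300 rule nudges code toward.
        return True
```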

File tree: 11 files changed (+31, -18 lines)

11 files changed

+31
-18
lines changed

examples/README.md

Lines changed: 5 additions & 1 deletion
````diff
@@ -47,6 +47,7 @@ pytest test_chat_app.py -v
 **Use Case**: Simple chat message storage with automatic expiration
 
 **Demonstrates**:
+
 - Type-safe message storage with PydanticAdapter
 - Automatic message expiration using TTLClampWrapper
 - Operation statistics tracking with StatisticsWrapper
@@ -66,6 +67,7 @@ pytest test_chat_app.py -v
 caching
 
 **Demonstrates**:
+
 - Multi-tier caching (memory + disk) with PassthroughCacheWrapper
 - Data compression with CompressionWrapper
 - Automatic retry with RetryWrapper
@@ -85,6 +87,7 @@ optimization
 **Use Case**: Cache scraped web pages with encryption and size limits
 
 **Demonstrates**:
+
 - Encrypted storage with FernetEncryptionWrapper
 - Size limits with LimitSizeWrapper (reject pages >5MB)
 - TTL enforcement with TTLClampWrapper
@@ -185,6 +188,7 @@ async def cache(self, tmp_path) -> MyCache:
 - pydantic
 
 Additional dependencies per example:
+
 - **trading_data**: None (uses built-in stores)
 - **web_scraper_cache**: cryptography (for encryption)
 
@@ -194,7 +198,7 @@ Additional dependencies per example:
 
 Each example follows this structure:
 
-```
+```text
 example_name/
 ├── README.md # Detailed documentation
 ├── pyproject.toml # Project metadata and dependencies
````
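
For context on the "Demonstrates" bullets above, here is a rough sketch of how such a wrapper stack composes. The import paths and the `put(collection=..., key=..., value=..., ttl=...)` call match what appears elsewhere in this commit; the `ChatMessage` model, the `min_ttl`/`max_ttl`/`pydantic_model` keyword names, and the `get()` call are assumptions, not taken from the example code.

```python
import asyncio

from key_value.aio.adapters.pydantic import PydanticAdapter
from key_value.aio.stores.memory.store import MemoryStore
from key_value.aio.wrappers.statistics.wrapper import StatisticsWrapper
from key_value.aio.wrappers.ttl_clamp.wrapper import TTLClampWrapper
from pydantic import BaseModel


class ChatMessage(BaseModel):  # illustrative model, not the example's actual one
    sender: str
    text: str


async def main() -> None:
    store = MemoryStore()
    clamped = TTLClampWrapper(key_value=store, min_ttl=60, max_ttl=3600)  # keyword names assumed
    stats = StatisticsWrapper(key_value=clamped)
    adapter = PydanticAdapter(key_value=stats, pydantic_model=ChatMessage)  # keyword name assumed

    await adapter.put(collection="room-1", key="msg-1", value=ChatMessage(sender="a", text="hi"), ttl=300)
    print(await adapter.get(collection="room-1", key="msg-1"))  # get() assumed symmetric with put()


asyncio.run(main())
```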

examples/chat_app/chat_app.py

Lines changed: 1 addition & 2 deletions
````diff
@@ -12,13 +12,12 @@
 import logging
 from datetime import datetime, timezone
 
-from pydantic import BaseModel
-
 from key_value.aio.adapters.pydantic import PydanticAdapter
 from key_value.aio.stores.memory.store import MemoryStore
 from key_value.aio.wrappers.logging.wrapper import LoggingWrapper
 from key_value.aio.wrappers.statistics.wrapper import StatisticsWrapper
 from key_value.aio.wrappers.ttl_clamp.wrapper import TTLClampWrapper
+from pydantic import BaseModel
 
 # Configure logging
 logging.basicConfig(level=logging.INFO, format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")
````

examples/trading_data/README.md

Lines changed: 4 additions & 0 deletions
````diff
@@ -31,6 +31,7 @@ The wrapper stack (applied inside-out):
 persistence
 
 Data flow:
+
 - **Write**: Data → Memory cache → Compressed → Disk storage
 - **Read**: Check memory cache → If miss, load from disk → Decompress → Cache in
   memory
@@ -114,6 +115,7 @@ cache_store = PassthroughCacheWrapper(
 ```
 
 Benefits:
+
 - **Fast reads**: Recent data served from memory
 - **Persistence**: All data backed by disk storage
 - **Automatic promotion**: Disk data cached in memory on read
@@ -127,6 +129,7 @@ compressed_store = CompressionWrapper(key_value=disk_cache)
 ```
 
 Especially effective for:
+
 - Historical price data with many data points
 - JSON-serialized objects with repeated keys
 - Text-heavy data structures
@@ -144,6 +147,7 @@ retry_store = RetryWrapper(
 ```
 
 Automatically retries on:
+
 - Network timeouts
 - Temporary unavailability
 - Rate limiting
````
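
The "repeated keys" point in the README text above is easy to verify with nothing but the standard library. This is not how CompressionWrapper is implemented (its codec isn't shown in this diff); it only illustrates why JSON-serialized price data compresses so well.

```python
import json
import zlib

# 500 price points whose dict keys repeat on every record
prices = [{"symbol": "AAPL", "price": 189.5 + i * 0.01, "volume": 1_000 + i} for i in range(500)]

raw = json.dumps(prices).encode()
compressed = zlib.compress(raw)
print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")
```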

examples/trading_data/__init__.py

Lines changed: 1 addition & 1 deletion
````diff
@@ -2,4 +2,4 @@
 
 from trading_app import PriceData, TradingDataCache
 
-__all__ = ["TradingDataCache", "PriceData"]
+__all__ = ["PriceData", "TradingDataCache"]
````

examples/trading_data/test_trading_app.py

Lines changed: 0 additions & 1 deletion
````diff
@@ -1,7 +1,6 @@
 """Tests for the trading data cache example."""
 
 import pytest
-
 from trading_app import PriceData, TradingDataCache
 
 
````
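The commit message also mentions adding a test-file pattern to the ruff ignore rules for S101, which flags bare `assert` statements; pytest tests rely on them, so test modules like the one above are typically exempted (the pyproject.toml change itself is not among the diffs shown here). A hypothetical test in the same style, with `PriceData` field names assumed for illustration:

```python
import pytest

from trading_app import PriceData  # import path as in the diff above


def test_price_data_round_trip() -> None:
    price = PriceData(symbol="AAPL", price=189.5)  # field names assumed for illustration
    # Bare asserts like these are exactly what S101 flags outside of test files.
    assert price.symbol == "AAPL"
    assert price.price == pytest.approx(189.5)
```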

examples/trading_data/trading_app.py

Lines changed: 1 addition & 3 deletions
````diff
@@ -14,15 +14,14 @@
 from datetime import datetime, timezone
 from pathlib import Path
 
-from pydantic import BaseModel
-
 from key_value.aio.adapters.pydantic import PydanticAdapter
 from key_value.aio.stores.disk.store import DiskStore
 from key_value.aio.stores.memory.store import MemoryStore
 from key_value.aio.wrappers.compression.wrapper import CompressionWrapper
 from key_value.aio.wrappers.passthrough_cache.wrapper import PassthroughCacheWrapper
 from key_value.aio.wrappers.retry.wrapper import RetryWrapper
 from key_value.aio.wrappers.statistics.wrapper import StatisticsWrapper
+from pydantic import BaseModel
 
 # Configure logging
 logging.basicConfig(level=logging.INFO, format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")
@@ -167,7 +166,6 @@ def get_cache_statistics(self) -> dict[str, int]:
     async def cleanup(self):
         """Clean up resources (close stores, etc.)."""
         # In a real application, you'd close any open connections here
-        pass
 
 
 async def main():
````

examples/web_scraper_cache/README.md

Lines changed: 4 additions & 0 deletions
````diff
@@ -30,6 +30,7 @@ The wrapper stack (applied inside-out):
 4. **FallbackWrapper** - Falls back to memory storage if disk operations fail
 
 Data flow:
+
 - **Write**: Data → Size check → Encrypt → Store (disk or memory fallback)
 - **Read**: Retrieve → Decrypt → Return data
 
@@ -149,6 +150,7 @@ fallback_store = FallbackWrapper(
 ```
 
 Use cases:
+
 - Disk failures → Memory fallback
 - Remote failures → Local fallback
 - Complex store → Simple store fallback
@@ -166,6 +168,7 @@ clamped_store = TTLClampWrapper(
 ```
 
 Benefits:
+
 - Prevents too-short TTL (cache thrashing)
 - Prevents too-long TTL (stale data)
 - Enforces consistent caching policy
@@ -182,6 +185,7 @@ def url_to_key(url: str) -> str:
 ```
 
 Advantages:
+
 - Filesystem-safe keys
 - Consistent key generation
 - Privacy (URLs not stored in plaintext as keys)
````
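
The last hunk header above shows only the signature of `url_to_key`. A minimal sketch with the properties the README lists (filesystem-safe, deterministic, URL not stored in plaintext) could simply hash the URL; the SHA-256 choice here is an assumption, not necessarily what the example uses.

```python
import hashlib


def url_to_key(url: str) -> str:
    """Map a URL to a deterministic, filesystem-safe cache key."""
    return hashlib.sha256(url.encode("utf-8")).hexdigest()


print(url_to_key("https://example.com/page"))  # same URL -> same key; the plaintext URL never appears as a key
```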

examples/web_scraper_cache/__init__.py

Lines changed: 1 addition & 1 deletion
````diff
@@ -2,4 +2,4 @@
 
 from scraper import ScrapedPage, WebScraperCache
 
-__all__ = ["WebScraperCache", "ScrapedPage"]
+__all__ = ["ScrapedPage", "WebScraperCache"]
````

examples/web_scraper_cache/scraper.py

Lines changed: 8 additions & 8 deletions
````diff
@@ -16,18 +16,18 @@
 from pathlib import Path
 
 from cryptography.fernet import Fernet
-from pydantic import BaseModel
-
 from key_value.aio.adapters.pydantic import PydanticAdapter
 from key_value.aio.stores.disk.store import DiskStore
 from key_value.aio.stores.memory.store import MemoryStore
 from key_value.aio.wrappers.encryption.wrapper import FernetEncryptionWrapper
 from key_value.aio.wrappers.fallback.wrapper import FallbackWrapper
 from key_value.aio.wrappers.limit_size.wrapper import LimitSizeWrapper
 from key_value.aio.wrappers.ttl_clamp.wrapper import TTLClampWrapper
+from pydantic import BaseModel
 
 # Configure logging
 logging.basicConfig(level=logging.INFO, format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")
+logger = logging.getLogger(__name__)
 
 
 class ScrapedPage(BaseModel):
@@ -54,8 +54,8 @@ def __init__(self, cache_dir: str = ".scraper_cache", encryption_key: bytes | No
         # Generate or use provided encryption key
         if encryption_key is None:
             encryption_key = Fernet.generate_key()
-            logging.warning(f"Generated new encryption key: {encryption_key.decode()}")
-            logging.warning("Store this key securely! Data encrypted with different keys cannot be decrypted.")
+            logger.warning(f"Generated new encryption key: {encryption_key.decode()}")
+            logger.warning("Store this key securely! Data encrypted with different keys cannot be decrypted.")
 
         self.encryption_key = encryption_key
 
@@ -123,10 +123,11 @@ async def cache_page(self, url: str, content: str, headers: dict[str, str] | Non
 
         try:
             await self.adapter.put(collection="pages", key=key, value=page, ttl=ttl)
-            return True
-        except Exception as e:
-            logging.error(f"Failed to cache page {url}: {e}")
+        except Exception:
+            logger.exception(f"Failed to cache page {url}")
             return False
+        else:
+            return True
 
     async def get_cached_page(self, url: str) -> ScrapedPage | None:
         """
@@ -170,7 +171,6 @@ async def is_cached(self, url: str) -> bool:
     async def cleanup(self):
         """Clean up resources."""
         # In a real application, you'd close any open connections here
-        pass
 
 
 async def simulate_scrape(url: str) -> tuple[str, dict[str, str]]:
````
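
For reference, a usage sketch built from the signatures visible in this diff (`WebScraperCache(cache_dir=..., encryption_key=...)`, `cache_page(url, content, headers=...)`, `get_cached_page`, `is_cached`, `cleanup`). Argument values are illustrative, and running it without an explicit key triggers the generated-key warning shown above.

```python
import asyncio

from scraper import WebScraperCache


async def main() -> None:
    cache = WebScraperCache(cache_dir=".scraper_cache")  # no key given, so one is generated and logged
    await cache.cache_page("https://example.com", "<html>...</html>", headers={"content-type": "text/html"})
    if await cache.is_cached("https://example.com"):
        page = await cache.get_cached_page("https://example.com")
        print(page)
    await cache.cleanup()


asyncio.run(main())
```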

examples/web_scraper_cache/test_scraper.py

Lines changed: 0 additions & 1 deletion
````diff
@@ -2,7 +2,6 @@
 
 import pytest
 from cryptography.fernet import Fernet
-
 from scraper import ScrapedPage, WebScraperCache
 
 
````