Draft · Changes from all commits · 109 commits
3590c43
docs: LLM profiles design + example profile
openhands-agent Oct 18, 2025
9b1e3db
llm: add profile_id field to LLM (profile filename identifier)\n\nCo-…
openhands-agent Oct 18, 2025
21efefe
feat(llm): add ProfileManager and eagerly register profiles at conver…
openhands-agent Oct 18, 2025
46ca1b7
chore: stop tracking local runtime and worktree files; add to .gitignore
openhands-agent Oct 18, 2025
5efdaee
chore: only ignore bead databases
enyst Oct 18, 2025
9cbf67f
test: cover llm profile manager
enyst Oct 18, 2025
dfab517
Update .gitignore
enyst Oct 18, 2025
441eb25
Improve LLM profile manager persistence
enyst Oct 18, 2025
e7cd039
Add example for managing LLM profiles
enyst Oct 18, 2025
269610a
Document plan for profile references
enyst Oct 18, 2025
d0ab952
Integrate profile-aware persistence
enyst Oct 19, 2025
f74d050
Simplify profile registration logging
enyst Oct 19, 2025
df308fb
Normalize inline_mode naming
enyst Oct 19, 2025
4d293db
Simplify profile_id sync in ProfileManager
enyst Oct 19, 2025
7d1a525
Rename profile sync helper
enyst Oct 19, 2025
ec45ed5
LLMRegistry handles profile management
enyst Oct 19, 2025
1566df4
docs: clarify LLMRegistry profile guidance
enyst Oct 19, 2025
8f8b5b9
refactor: rename profile persistence helpers
enyst Oct 19, 2025
a3efa6e
refactor: split profile transform helpers
enyst Oct 19, 2025
17617aa
style: use f-strings in LLMRegistry logging
enyst Oct 19, 2025
9134aa1
Update openhands/sdk/llm/llm_registry.py
enyst Oct 19, 2025
36ab580
chore: stop tracking scripts/worktree.sh
enyst Oct 19, 2025
cea6a0d
Merge upstream main into agent-sdk-18-profile-manager
enyst Oct 21, 2025
12eec55
fix: remove runtime llm switching
enyst Oct 21, 2025
03b4600
style: use f-string for registry logging
enyst Oct 21, 2025
acf67e3
docs: expand LLM profile example
enyst Oct 21, 2025
218728e
Refine LLM profile persistence
enyst Oct 21, 2025
75e8ecd
Update LLM profile docs for usage_id semantics
enyst Oct 22, 2025
8511524
Merge remote-tracking branch 'upstream/main' into agent-sdk-18-profil…
enyst Oct 23, 2025
1f3adab
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Oct 24, 2025
96ba8e9
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Oct 25, 2025
142faee
fix LLM mutation for profiles to respect immutability; add docstring;…
enyst Oct 25, 2025
82138dd
refactor: keep LLM profile expansion at persistence layer
enyst Oct 25, 2025
b6511a9
Merge branch 'main' of github.com:All-Hands-AI/agent-sdk into agent-s…
enyst Oct 25, 2025
f5404b6
fix: restore LLM profile validation behavior
enyst Oct 26, 2025
85bc698
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Oct 26, 2025
ba4bd50
harden profile handling
enyst Oct 26, 2025
99a422c
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Nov 6, 2025
5c52fa5
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Nov 20, 2025
b69db09
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Nov 28, 2025
5dc94c1
update to current state
enyst Nov 28, 2025
69d3a7d
remove deprecated from llm
enyst Nov 28, 2025
61f5b77
ruff
enyst Nov 28, 2025
2381da7
restore gitignore
enyst Nov 28, 2025
b2f80d3
Delete .openhands/microagents/vscode.md
enyst Nov 28, 2025
8a95dac
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 1, 2025
0aa1164
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 12, 2025
744f171
fix(llm): tolerate legacy profile fields
enyst Dec 12, 2025
24d59bd
fix(llm): keep profile loading strict
enyst Dec 12, 2025
a4d6cd4
fix(llm): reduce profile side effects
enyst Dec 12, 2025
075c9b2
test(utils): stabilize discriminated union suite
enyst Dec 12, 2025
ab3a265
single source of truth for persistence behavior
enyst Dec 13, 2025
82549cc
Merge branch 'main' of github.com:OpenHands/software-agent-sdk into a…
enyst Dec 13, 2025
f400d7d
Update openhands-sdk/openhands/sdk/persistence/__init__.py
enyst Dec 14, 2025
a112ddc
feat(llm): save API keys in LLM profiles by default and set 0600 perm…
enyst Dec 15, 2025
60bfbb2
Merge branch 'main' of github.com:OpenHands/software-agent-sdk into a…
enyst Dec 16, 2025
0d01065
Merge branch 'agent-sdk-18-profile-manager' of github.com:OpenHands/s…
enyst Dec 16, 2025
bc94774
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 16, 2025
ad07b05
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 17, 2025
2464633
Delete docs/llm_profiles.md
enyst Dec 18, 2025
db10002
Update openhands-sdk/openhands/sdk/llm/llm.py
enyst Dec 18, 2025
ce31e79
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 18, 2025
1fe3929
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Dec 19, 2025
8625ff2
Merge remote-tracking branch 'upstream/main' into agent-sdk-18-profil…
enyst Dec 29, 2025
95d94c3
Merge remote-tracking branch 'upstream/main' into agent-sdk-18-profil…
enyst Dec 29, 2025
67ab2c0
ci: detect nested examples in docs check
enyst Dec 29, 2025
fab1d57
ci: fix nested examples regex
enyst Dec 29, 2025
926fb90
ci(docs): clarify example skip rationale
enyst Dec 30, 2025
5676592
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Jan 1, 2026
b2ea371
Merge main into agent-sdk-18-profile-manager
enyst Jan 6, 2026
2aa320d
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Jan 7, 2026
b5a01ad
fix(llm): reject unknown fields when loading profiles
enyst Jan 7, 2026
90257c5
Revert "fix(llm): reject unknown fields when loading profiles"
enyst Jan 7, 2026
7a83b34
refactor(persistence): default to LLM profiles, drop inline env toggle
enyst Jan 7, 2026
9530155
Update .gitignore
enyst Jan 7, 2026
69e259b
chore(examples): make llm profiles example last
enyst Jan 8, 2026
c6f5db7
Update examples/01_standalone_sdk/34_llm_profiles.py
enyst Jan 8, 2026
9ecab27
Merge branch 'main' into agent-sdk-18-profile-manager
xingyaoww Jan 8, 2026
cd3ab89
Merge branch 'main' into agent-sdk-18-profile-manager
xingyaoww Jan 8, 2026
9859f21
chore(examples): inline llm profiles script body
enyst Jan 8, 2026
23cb159
feat(llm): default profile persistence and drop inline key
enyst Jan 8, 2026
dcc83f5
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Jan 8, 2026
6807c99
test: fix workflow model resolver + session api key env
enyst Jan 27, 2026
498fd80
Merge main into agent-sdk-18-profile-manager
enyst Jan 27, 2026
1383003
chore: cleanup from PR review
enyst Jan 27, 2026
4b34b5f
Merge upstream/main into agent-sdk-18-profile-manager
enyst Jan 27, 2026
5e3caab
Update examples/01_standalone_sdk/34_llm_profiles.py
enyst Jan 27, 2026
66411b3
fix(tests): correct broken assertions in test_resolve_model_config.py
enyst Jan 27, 2026
897392c
fix(tests): update test_conversation_basic_creation for profile feature
enyst Jan 27, 2026
1c1be88
feat(profiles): use usage_id as profile name instead of 'default'
enyst Jan 27, 2026
affa51c
chore: rename llm_profiles example to 35 (34 was taken)
enyst Jan 27, 2026
f496755
chore: address AI review bot suggestions
enyst Jan 27, 2026
8e10e46
refactor(llm): encapsulate profile persistence in store
enyst Jan 27, 2026
7f19ae3
feat(llm): support LLM_PROFILES_DIR env var
enyst Jan 27, 2026
ffaeb7b
feat(conversation): persist explicit LLM profile refs
enyst Jan 27, 2026
f8b113a
docs(llm): clarify profile_id persistence contract
enyst Jan 27, 2026
bd7c2ce
docs(examples): align llm profiles example with profile_ref schema
enyst Jan 27, 2026
f754097
fix(llm): disallow profile_ref payloads in profile validation
enyst Jan 27, 2026
813645a
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Jan 28, 2026
a742c61
Merge branch 'main' into agent-sdk-18-profile-manager
enyst Feb 2, 2026
63b30c5
refactor(llm): extract profile ref expansion
enyst Feb 3, 2026
a625b95
Update openhands-sdk/openhands/sdk/conversation/impl/local_conversati…
enyst Feb 3, 2026
86ae7cc
Update openhands-sdk/openhands/sdk/llm/llm.py
enyst Feb 3, 2026
46b36ee
refactor: use pydantic serializer for llm profile refs
enyst Feb 3, 2026
449e790
feat(llm): log secret scrubbing on save
enyst Feb 3, 2026
eacd4e3
test: cover missing profile refs in state resume
enyst Feb 3, 2026
e0dc785
chore(examples): renumber llm profiles example
enyst Feb 3, 2026
f3b58c1
fix(llm): persist profile refs only for profile_ref inputs
enyst Feb 3, 2026
572e719
refactor(examples): remove if __name__ guard from llm_profiles example
openhands-agent Feb 4, 2026
12 changes: 9 additions & 3 deletions .github/scripts/check_documented_examples.py
@@ -29,7 +29,12 @@ def find_documented_examples(docs_path: Path) -> set[str]:
     """
     documented_examples: set[str] = set()
 
-    # Pattern to match example file references with arbitrary nesting depth.
+    # Pattern to match example file references.
+    #
+    # The agent-sdk examples tree includes nested modules (e.g.
+    # examples/02_remote_agent_server/05_custom_tool/custom_tools/log_data.py),
+    # so we intentionally support *arbitrary* nesting depth under examples/.
+    #
+    # Matches: examples/<dir>/.../<file>.py
     pattern = r"examples/(?:[-\w]+/)+[-\w]+\.py"
@@ -81,8 +86,9 @@ def find_agent_sdk_examples(agent_sdk_path: Path) -> set[str]:
         if relative_path_str.startswith("examples/03_github_workflows/"):
             continue
 
-        # Skip LLM-specific tools examples: these are intentionally not
-        # enforced by the docs check. See discussion in PR #1486.
+        # Skip LLM-specific tools examples: these depend on external
+        # model/provider availability and are intentionally excluded from
+        # docs example enforcement.
         if relative_path_str.startswith("examples/04_llm_specific_tools/"):
             continue
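The nesting-depth behavior the new comment describes can be sanity-checked directly against the regex in this hunk. A quick sketch using paths mentioned in this PR (the snippet is illustrative and not part of the diff):

```python
import re

# The pattern from check_documented_examples.py: one or more nested
# directories under examples/, then a .py filename.
pattern = re.compile(r"examples/(?:[-\w]+/)+[-\w]+\.py")

docs_text = (
    "See examples/01_standalone_sdk/36_llm_profiles.py for a walkthrough, "
    "examples/02_remote_agent_server/05_custom_tool/custom_tools/log_data.py "
    "for a nested module, and examples/README.md (not matched: no .py suffix)."
)

matches = pattern.findall(docs_text)
print(matches)
# Both the shallow and the deeply nested example paths are captured;
# the non-Python reference is ignored.
```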
3 changes: 3 additions & 0 deletions AGENTS.md
@@ -167,6 +167,9 @@ mkdir -p .pr
 - Avoid getattr/hasattr guards and instead enforce type correctness by relying on explicit type assertions and proper object usage, ensuring functions only receive the expected Pydantic models or typed inputs. Prefer type hints and validated models over runtime shape checks.
 - Prefer accessing typed attributes directly. If necessary, convert inputs up front into a canonical shape; avoid purely hypothetical fallbacks.
 - Use real newlines in commit messages; do not write literal "\n".
+
+## Example Scripts
+- Example scripts in `examples/` should run code directly at module level without wrapping in `if __name__ == "__main__":` guards. This saves a level of indentation and keeps examples concise.
 </CODE>
 
 <TESTING>
132 changes: 132 additions & 0 deletions examples/01_standalone_sdk/36_llm_profiles.py
@@ -0,0 +1,132 @@
"""Create and use an LLM profile with :class:`LLMRegistry`.

Run with::

uv run python examples/01_standalone_sdk/36_llm_profiles.py

Profiles are stored under ``$LLM_PROFILES_DIR/<name>.json`` when the env var is
set, otherwise ``~/.openhands/llm-profiles/<name>.json``.

Set ``LLM_PROFILE_NAME`` to choose which profile file to load.

Notes on credentials:
- New profiles include API keys by default when saved
- To omit secrets on disk, pass include_secrets=False to LLMRegistry.save_profile
Comment on lines +10 to +14

Collaborator review:

🔴 Critical: This documentation is misleading and dangerous. The current default behavior (include_secrets=True) could lead users to accidentally commit API keys.

Suggested change:

-Set ``LLM_PROFILE_NAME`` to choose which profile file to load.
-
-Notes on credentials:
-- New profiles include API keys by default when saved
-- To omit secrets on disk, pass include_secrets=False to LLMRegistry.save_profile
+Set ``LLM_PROFILE_NAME`` to choose which profile file to load.
+
+Security Best Practice:
+- Profiles should be saved WITHOUT secrets (include_secrets=False)
+- Provide API keys via environment variables (LLM_API_KEY, AWS_ACCESS_KEY_ID, etc.)
+- Never commit profile files containing secrets to version control
+- Add *.json to .gitignore if storing profiles in your project directory

Reply from @enyst (Collaborator, Author), Feb 3, 2026:

Same as below
"""

import json
import os
from pathlib import Path

from pydantic import SecretStr

from openhands.sdk import (
    LLM,
    Agent,
    Conversation,
    LLMRegistry,
    Tool,
)
from openhands.tools.terminal import TerminalTool


PROFILE_NAME = os.getenv("LLM_PROFILE_NAME", "gpt-5-mini")


def ensure_profile_exists(registry: LLMRegistry, name: str) -> None:
    """Create a starter profile in the default directory when missing."""

    if name in registry.list_profiles():
        return

    model = os.getenv("LLM_MODEL", "anthropic/claude-sonnet-4-5-20250929")
    base_url = os.getenv("LLM_BASE_URL")
    api_key = os.getenv("LLM_API_KEY")

    profile_defaults = LLM(
        usage_id="agent",
        model=model,
        base_url=base_url,
        api_key=SecretStr(api_key) if api_key else None,
        temperature=0.2,
        max_output_tokens=4096,
    )
    path = registry.save_profile(name, profile_defaults)
    print(f"Created profile '{name}' at {path}")


def load_profile(registry: LLMRegistry, name: str) -> LLM:
    """Load profile and merge credentials from environment if needed.

    Note: Profiles should be saved without secrets (include_secrets=False)
    and credentials provided via environment variables for better security.
    """
    llm = registry.load_profile(name)
    # If the profile was saved without secrets, allow providing an API key
    # via the environment.
    if llm.api_key is None:
        api_key = os.getenv("LLM_API_KEY")
        if api_key:
            llm = llm.model_copy(update={"api_key": SecretStr(api_key)})
    return llm


registry = LLMRegistry()
ensure_profile_exists(registry, PROFILE_NAME)

llm = load_profile(registry, PROFILE_NAME)

tools = [Tool(name=TerminalTool.name)]
agent = Agent(llm=llm, tools=tools)

workspace_dir = Path(os.getcwd())
summary_path = workspace_dir / "summary_readme.md"
if summary_path.exists():
    summary_path.unlink()

persistence_root = workspace_dir / ".conversations_llm_profiles"
conversation = Conversation(
    agent=agent,
    workspace=str(workspace_dir),
    persistence_dir=str(persistence_root),
    visualizer=None,
)

conversation.send_message(
    "Read README.md in this workspace, create a concise summary in "
    "summary_readme.md (overwrite it if it exists), and respond with "
    "SUMMARY_READY when the file is written."
)
conversation.run()

if summary_path.exists():
    print(f"summary_readme.md written to {summary_path}")
else:
    print("summary_readme.md not found after first run")

conversation.send_message(
    "Thanks! Delete summary_readme.md from the workspace and respond with "
    "SUMMARY_REMOVED once it is gone."
)
conversation.run()

if summary_path.exists():
    print("summary_readme.md still present after deletion request")
else:
    print("summary_readme.md removed")

persistence_dir = conversation.state.persistence_dir
if persistence_dir is None:
    raise RuntimeError("Conversation did not persist base state to disk")

base_state_path = Path(persistence_dir) / "base_state.json"
state_payload = json.loads(base_state_path.read_text())
llm_entry = state_payload.get("agent", {}).get("llm", {})
profile_in_state = llm_entry.get("profile_id")
kind_in_state = llm_entry.get("kind")
print(f"Profile recorded in base_state.json: {kind_in_state} / {profile_in_state}")
if kind_in_state != "profile_ref" or profile_in_state != PROFILE_NAME:
    print(
        "Warning: base_state.json did not persist the expected profile_ref payload."
        " This likely means your runtime LLM did not have profile_id set,"
        " or persistence was configured differently."
    )
7 changes: 7 additions & 0 deletions examples/llm-profiles/gpt-5-mini.json
@@ -0,0 +1,7 @@
{
  "model": "litellm_proxy/openai/gpt-5-mini",
  "base_url": "https://llm-proxy.eval.all-hands.dev",
  "temperature": 0.2,
  "max_output_tokens": 4096,
  "usage_id": "agent"
}
1 change: 1 addition & 0 deletions openhands-sdk/openhands/sdk/agent/base.py
@@ -420,6 +420,7 @@ def model_dump_succint(self, **kwargs):
         """Like model_dump, but excludes None fields by default."""
         if "exclude_none" not in kwargs:
             kwargs["exclude_none"] = True
+
         dumped = super().model_dump(**kwargs)
         # remove tool schema details for brevity
         if "tools" in dumped and isinstance(dumped["tools"], dict):
@@ -133,6 +133,12 @@ def __init__(
                 decrypted when loading. If not provided, secrets are redacted
                 (lost) on serialization.
         """
+        # Initialize the registry early so profile references resolve during resume.
+        # The registry must exist before ConversationState.create() attempts to load
+        # persisted state that may contain profile_ref payloads.
+        self.llm_registry = LLMRegistry()
+
         super().__init__()  # Initialize with span tracking
         # Mark cleanup as initiated as early as possible to avoid races or partially
         # initialized instances during interpreter shutdown.
@@ -169,6 +175,7 @@ def __init__(
             else None,
             max_iterations=max_iteration_per_run,
             stuck_detection=stuck_detection,
+            llm_registry=self.llm_registry,
             cipher=cipher,
         )

@@ -234,7 +241,6 @@ def _default_callback(e):
 
         # Agent initialization is deferred to _ensure_agent_ready() for lazy loading
         # This ensures plugins are loaded before agent initialization
-        self.llm_registry = LLMRegistry()
 
         # Initialize secrets if provided
         if secrets:
94 changes: 74 additions & 20 deletions openhands-sdk/openhands/sdk/conversation/state.py
@@ -3,7 +3,7 @@
 from collections.abc import Sequence
 from enum import Enum
 from pathlib import Path
-from typing import Any, Self
+from typing import TYPE_CHECKING, Any, Self
 
 from pydantic import Field, PrivateAttr, model_validator

@@ -18,6 +18,12 @@
 from openhands.sdk.event.base import Event
 from openhands.sdk.io import FileStore, InMemoryFileStore, LocalFileStore
 from openhands.sdk.logger import get_logger
+
+
+if TYPE_CHECKING:
+    from openhands.sdk.llm.llm_registry import LLMRegistry
+
+
 from openhands.sdk.security.analyzer import SecurityAnalyzerBase
 from openhands.sdk.security.confirmation_policy import (
     ConfirmationPolicyBase,
@@ -181,8 +187,15 @@ def _save_base_state(self, fs: FileStore) -> None:
                 "redacted and lost on restore. Consider providing a cipher to "
                 "preserve secrets."
             )
-        payload = self.model_dump_json(exclude_none=True, context=context)
-        fs.write(BASE_STATE, payload)
+        payload = self.model_dump(
+            mode="json",
+            exclude_none=True,
+            context={**(context or {}), "persist_profile_ref": True},
+        )
+        if self.agent.llm.profile_id and self.agent.llm.profile_ref:
+            payload["agent"]["llm"] = self.agent.llm.to_profile_ref()
+
+        fs.write(BASE_STATE, json.dumps(payload))
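The save-side transform in `_save_base_state` — dump the state, then swap the inline LLM payload for a reference when the runtime LLM carries an explicit `profile_id` — can be sketched with plain dicts. The helper names and the exact fields kept in the reference are assumptions for illustration; only `kind` and `profile_id` are confirmed by the example's `base_state.json` check:

```python
import json


def to_profile_ref(llm_payload: dict) -> dict:
    """Collapse an inline LLM payload into a reference record."""
    return {"kind": "profile_ref", "profile_id": llm_payload["profile_id"]}


def save_base_state(state_payload: dict) -> str:
    llm = state_payload["agent"]["llm"]
    # Persist a reference only when an explicit profile_id is set; otherwise
    # the inline payload is written unchanged.
    if llm.get("profile_id"):
        state_payload["agent"]["llm"] = to_profile_ref(llm)
    return json.dumps(state_payload)


state = {
    "agent": {
        "llm": {
            "model": "litellm_proxy/openai/gpt-5-mini",
            "api_key": "sk-demo",
            "profile_id": "gpt-5-mini",
        }
    }
}
serialized = save_base_state(state)
print("api_key" in serialized)  # → False
```

The credential never reaches the persisted payload; it lives only in the profile file (or the environment).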

# ===== Factory: open-or-create (no load/save methods needed) =====
@classmethod
@@ -194,6 +207,7 @@ def create(
         persistence_dir: str | None = None,
         max_iterations: int = 500,
         stuck_detection: bool = True,
+        llm_registry: "LLMRegistry | None" = None,
         cipher: Cipher | None = None,
     ) -> "ConversationState":
         """Create a new conversation state or resume from persistence.
@@ -211,13 +225,19 @@
             history), but all other configuration can be freely changed: LLM,
             agent_context, condenser, system prompts, etc.
 
+        When conversation state is persisted with LLM profile references (instead
+        of inlined credentials), pass an ``llm_registry`` so profile IDs can be
+        expanded during restore.
+
         Args:
             id: Unique conversation identifier
             agent: The Agent to use (tools must match persisted on restore)
             workspace: Working directory for agent operations
             persistence_dir: Directory for persisting state and events
             max_iterations: Maximum iterations per run
             stuck_detection: Whether to enable stuck detection
+            llm_registry: Optional registry used to expand profile references when
+                conversations persist profile IDs instead of inline credentials.
             cipher: Optional cipher for encrypting/decrypting secrets in
                 persisted state. If provided, secrets are encrypted when
                 saving and decrypted when loading. If not provided, secrets
@@ -241,35 +261,69 @@
         except FileNotFoundError:
             base_text = None
 
+        context: dict[str, object] = {}
+        registry = llm_registry
+        if registry is None:
+            from openhands.sdk.llm.llm_registry import LLMRegistry
+
+            registry = LLMRegistry()
+        context["llm_registry"] = registry
+
+        # Ensure we have a registry available during both dump and validate.
+        #
+        # We do NOT implicitly write profile files here. Instead, persistence will
+        # store a profile reference only when the runtime LLM already has an
+        # explicit ``profile_id``.
+
         # ---- Resume path ----
         if base_text:
-            # Use cipher context for decrypting secrets if provided
-            context = {"cipher": cipher} if cipher else None
-            state = cls.model_validate(json.loads(base_text), context=context)
+            base_payload = json.loads(base_text)
+            # Add cipher context for decrypting secrets if provided
+            if cipher:
+                context["cipher"] = cipher
 
             # Restore the conversation with the same id
-            if state.id != id:
+            persisted_id = ConversationID(base_payload.get("id"))
+            if persisted_id != id:
                 raise ValueError(
                     f"Conversation ID mismatch: provided {id}, "
-                    f"but persisted state has {state.id}"
+                    f"but persisted state has {persisted_id}"
                 )
 
+            persisted_agent_payload = base_payload.get("agent")
+            if persisted_agent_payload is None:
+                raise ValueError("Persisted conversation is missing agent state")
+
+            # Attach event log early so we can read history for tool verification
+            event_log = EventLog(file_store, dir_path=EVENTS_DIR)
+
+            persisted_agent = AgentBase.model_validate(
+                persisted_agent_payload,
+                context={"llm_registry": registry},
+            )
+            agent.verify(persisted_agent, events=event_log)
+
+            # Use runtime-provided Agent directly (PR #1542 / issue #1451)
+            #
+            # Persist LLMs as profile references only when an explicit profile_id is
+            # set on the runtime LLM.
+            agent_payload = agent.model_dump(
+                mode="json",
+                exclude_none=True,
+                context={"expose_secrets": True, "persist_profile_ref": True},
+            )
+            if agent.llm.profile_id and agent.llm.profile_ref:
+                agent_payload["llm"] = agent.llm.to_profile_ref()
+
+            base_payload["agent"] = agent_payload
+            base_payload["workspace"] = workspace.model_dump(mode="json")
+            base_payload["max_iterations"] = max_iterations
Comment on lines 261 to 319

Collaborator review:

🟠 Important: The create() method has become quite complex with the profile reference logic. Consider extracting the resume logic into a separate _resume_from_persistence() method to improve readability.

The multiple payload mutations (expanding profile refs, injecting runtime agent, converting back to profile refs) make this hard to follow and maintain.
+            state = cls.model_validate(base_payload, context=context)
             state._fs = file_store
-            state._events = EventLog(file_store, dir_path=EVENTS_DIR)
+            state._events = event_log
             state._cipher = cipher
 
-            # Verify compatibility (agent class + tools)
-            agent.verify(state.agent, events=state._events)
-
-            # Commit runtime-provided values (may autosave)
             state._autosave_enabled = True
-            state.agent = agent
-            state.workspace = workspace
-            state.max_iterations = max_iterations
 
             # Note: stats are already deserialized from base_state.json above.
             # Do NOT reset stats here - this would lose accumulated metrics.
 
             logger.info(
                 f"Resumed conversation {state.id} from persistent storage.\n"
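On resume, the hunk above validates the persisted agent with a registry in the validation context so that `profile_ref` payloads expand back into full LLM configurations. A dict-based sketch of that expansion, with a plain mapping standing in for the registry (function and field names beyond `kind`/`profile_id` are illustrative):

```python
def expand_profile_refs(agent_payload: dict, profiles: dict[str, dict]) -> dict:
    """Expand a persisted profile_ref back into a full LLM payload."""
    llm = agent_payload.get("llm", {})
    if llm.get("kind") != "profile_ref":
        return agent_payload  # already inline; nothing to expand

    profile_id = llm["profile_id"]
    if profile_id not in profiles:
        # Mirror the PR's hardening: a missing profile on resume is a loud
        # error rather than a silently misconfigured LLM.
        raise KeyError(f"Unknown LLM profile: {profile_id!r}")

    expanded = dict(profiles[profile_id])
    expanded["profile_id"] = profile_id  # keep the link for the next save
    result = dict(agent_payload)
    result["llm"] = expanded
    return result


profiles = {
    "gpt-5-mini": {"model": "litellm_proxy/openai/gpt-5-mini", "usage_id": "agent"}
}
resumed = expand_profile_refs(
    {"llm": {"kind": "profile_ref", "profile_id": "gpt-5-mini"}}, profiles
)
print(resumed["llm"]["model"])  # → litellm_proxy/openai/gpt-5-mini
```

Re-attaching `profile_id` to the expanded payload is what lets the next `_save_base_state` call collapse it back into a reference, closing the round trip.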