Document LLM overrides and refresh provisioning workflows #13
💡 Codex Review
Here are some automated review suggestions for this pull request.
```python
return LLMClient(
    self.config.override(
        models=models, base_urls=base_urls, api_key=api_key
    ),
    client_kwargs=self._default_kwargs,
```
**`with_overrides` ignores base URL and API key changes**

The new override helper creates a fresh `LLMClient`, but it passes the existing `_default_kwargs` as `client_kwargs`. Inside `LLMClient.__init__` those kwargs are only merged with the new config via `setdefault`, so any previously cached `api_key` or `base_url` entries remain unchanged and the override never takes effect. Calling `client.with_overrides(api_key=...)` or `client.with_overrides(base_urls={...})` therefore still talks to the original endpoint and credential. The constructor should rebuild `client_kwargs` from the overridden config (or omit the old defaults) so that the new key/URL is actually used.
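A minimal sketch of the failure mode and one possible fix. The names here (`build_kwargs`, the cached-defaults dict) are illustrative stand-ins for the real `LLMClient` internals, not the actual API:

```python
def build_kwargs(config: dict, cached_defaults: dict) -> dict:
    """Mimic the setdefault-style merge described in the review comment."""
    merged = dict(cached_defaults)
    for key, value in config.items():
        # setdefault only fills in *missing* keys, so any key already
        # cached in the defaults silently wins over the override.
        merged.setdefault(key, value)
    return merged

cached = {"api_key": "old-key", "base_url": "https://old.example"}
overridden = {"api_key": "new-key", "base_url": "https://new.example"}

# Buggy merge: the stale credential survives the override.
assert build_kwargs(overridden, cached)["api_key"] == "old-key"

# One fix: rebuild the kwargs from the overridden config so the
# new values take precedence over anything previously cached.
rebuilt = {**cached, **overridden}
assert rebuilt["api_key"] == "new-key"
```

The same effect can be had by not passing the old `_default_kwargs` at all and letting the constructor derive fresh kwargs from the overridden config.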
| "label": payload.get("entity_label", label), | ||
| "name": payload.get("name", name), | ||
| "namespace": namespace, | ||
| "props": json.dumps(payload), | ||
| }, |
**SQLite driver fails to persist entities with datetimes/UUIDs**

Entity properties are stored by dumping the entire payload with `json.dumps(payload)`. Typical `Memory` instances contain `uuid.UUID`, `datetime`, and other non-JSON-serializable objects, so `json.dumps` raises `TypeError` before any insert occurs. As a result the SQLite backend cannot store memories out of the box. Properties should be converted to JSON-safe values (e.g. via Pydantic's `model_dump(mode="json")` or explicit `isoformat()`/`str()`) before calling `json.dumps`.
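One way to make the payload JSON-safe before dumping, along the lines the comment suggests. This is a sketch using a `default=` hook; the real `Memory` payload fields may differ, and with Pydantic models `model_dump(mode="json")` achieves the same result upstream:

```python
import json
import uuid
from datetime import datetime, timezone

def to_json_safe(value):
    # Fallback serializer for types json.dumps cannot handle natively.
    if isinstance(value, datetime):
        return value.isoformat()
    if isinstance(value, uuid.UUID):
        return str(value)
    raise TypeError(f"Unserializable type: {type(value)!r}")

# Hypothetical payload shaped like a Memory entity's properties.
payload = {
    "id": uuid.uuid4(),
    "created_at": datetime.now(timezone.utc),
    "name": "example",
}

# Plain json.dumps(payload) would raise TypeError on the UUID;
# the default hook converts each offending value first.
props = json.dumps(payload, default=to_json_safe)
```

The resulting `props` string round-trips cleanly through `json.loads`, so the SQLite insert sees only plain strings.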
Summary

- `meshmind/llm_client.py`: thread LLM override plumbing through the client, pipelines, CLI, and service layers

Testing

- https://chatgpt.com/codex/tasks/task_b_68ee1c9ee92c83218cbbebce8b0667b8