
feat: add LLM wrapper and interceptors for LLM calls #131

Merged
ankaisen merged 2 commits into main from feat/llm_wrapper on Jan 2, 2026
Conversation

@ankaisen (Collaborator)

No description provided.

Copilot AI review requested due to automatic review settings December 30, 2025 16:59
Copilot AI (Contributor) left a comment


Pull request overview

This PR introduces an LLM wrapper system with interceptors for LLM calls, enabling observability and control over LLM interactions. The implementation adds infrastructure for monitoring, tracing, and intercepting LLM operations before, after, and on error.

Key changes:

  • New LLMClientWrapper and LLMInterceptorRegistry for wrapping LLM clients and managing interceptors
  • Modified workflow step execution to include step_id in context for better traceability
  • Enhanced service layer to integrate LLM wrapper with interceptor registration methods

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.

File descriptions:

  • src/memu/llm/wrapper.py — Implements the core LLM wrapper infrastructure, including the interceptor registry, client wrapper, and associated dataclasses for call context, request/response views, and usage tracking
  • src/memu/app/service.py — Integrates the LLM wrapper into the service layer by adding an interceptor registry, wrapping LLM clients with metadata, and exposing interceptor registration methods
  • src/memu/workflow/step.py — Ensures step_id is always included in the step context for LLM call traceability




```python
def _build_embedding_response_view(response: Sequence[Sequence[float]]) -> LLMResponseView:
    vector_dim = len(response[0]) if response else 0
```
Copilot AI commented on Dec 30, 2025:

Potential IndexError if response is an empty sequence of sequences. The condition checks if response is truthy, but an empty list evaluates to False, while a list containing empty sequences like [[]] would pass the check but fail on response[0]. Consider checking response and response[0] or handling the case where response contains empty sequences.

```python
await self._run_before(snapshot.before, call_ctx, request_view)
start_time = time.perf_counter()
try:
    result = call_fn()
```
Copilot AI commented on Dec 30, 2025:

The call_fn() invocation is missing await. Since call_fn is expected to return an awaitable (as indicated by lines 363-364), this should be result = await call_fn() to properly await the coroutine. The current code relies on checking inspect.isawaitable(result) afterwards, but this creates an unawaited coroutine that may trigger runtime warnings.
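
If the wrapper is meant to accept both sync and async callables (which the `inspect.isawaitable` check hints at), one way to invoke `call_fn` without leaking an un-awaited coroutine is to await only when the result is actually awaitable. This is a sketch of that pattern, not the code as merged.

```python
import asyncio
import inspect
from typing import Any, Callable


async def invoke(call_fn: Callable[[], Any]) -> Any:
    # Call first, then await only if call_fn returned an awaitable,
    # so both plain functions and coroutine functions are supported.
    result = call_fn()
    if inspect.isawaitable(result):
        result = await result
    return result
```

If `call_fn` is always a coroutine function, the simpler `result = await call_fn()` that Copilot proposes is the cleaner fix.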

ankaisen merged commit 416e102 into main on Jan 2, 2026
2 checks passed
ankaisen deleted the feat/llm_wrapper branch on January 2, 2026 at 12:42