Commit 4e4cd18

Authored by: daibhin (David Newell), andrewm4894, claude
feat: llma / error tracking integration (#376)
* feat: llma / error tracking integration
* capture all metadata in llm event
* instrument with contexts
* bump version
* indentation
* linting
* tests
* raise
* test: add exception capture integration tests for langchain

  Add 6 tests covering the new LLMA + error tracking integration:
  - capture_exception called on span/generation errors
  - $exception_event_id added to AI events
  - No capture when autocapture disabled
  - AI properties passed to exception event
  - Handles None return from capture_exception

  🤖 Generated with [Claude Code](https://claude.com/claude-code)

* fix: pass context tags to capture() for test compatibility

  - Export get_tags() from posthog module
  - Explicitly pass context tags to capture() in AI utils
  - Fix $ai_model fallback to extract from response.model
  - Fix ruff formatting in langchain test_callbacks.py

* fix: disable auto-capture exceptions in LLM context

  new_context() defaults to capture_exceptions=True, which would auto-capture any exception regardless of the enable_exception_autocapture setting. This was inconsistent with the LangChain callbacks, which explicitly check the setting. Pass capture_exceptions=False so exception handling is controlled explicitly by enable_exception_autocapture.

* fix: isolate LLM context with fresh=True to avoid tag inheritance

  Use fresh=True to start with a clean context for each LLM call. This avoids inheriting $ai_* tags from parent contexts, which could cause mismatched AI metadata due to the tag merge order bug in contexts.py (parent tags incorrectly override child tags).

* fix: correct tag merge order so child tags take precedence

  The collect_tags() method had a bug where parent tags would overwrite child tags, despite the comment saying the opposite. This fix ensures child context tags properly override parent tags.

* refactor: remove fresh=True now that tag merge order is fixed

  With the collect_tags() bug fixed, child tags properly override parent tags. LLM events can now inherit useful parent context tags (request_id, user info, etc.) while their $ai_* tags still take precedence.

* test: add test for child tags overriding parent tags

  Verifies that in non-fresh contexts, child tags override parent tags with the same key while still inheriting other parent tags.

* chore: add TODO for OpenAI/Anthropic/Gemini exception capture

  Document that exception capture still needs to be added for the direct SDK wrappers, similar to how it is implemented in the LangChain callbacks.

---

Co-authored-by: David Newell <david@Mac.communityfibre.co.uk>
Co-authored-by: Andrew Maguire <andrewm4894@gmail.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
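The tag merge order fix described above can be illustrated with a minimal, self-contained sketch. This is not PostHog's actual `collect_tags()` implementation, only an illustration of the intended semantics: contexts are merged parent-first so that child tags with the same key win, while other parent tags (e.g. `request_id`) are still inherited.

```python
from typing import Any, Dict, List


def collect_tags(context_stack: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Merge tags from a stack of contexts ordered parent-first, child-last.

    Because dict.update() lets later values overwrite earlier ones, iterating
    parent -> child gives child tags precedence over parent tags with the
    same key, which is the behavior the fix above restores.
    """
    merged: Dict[str, Any] = {}
    for tags in context_stack:  # parent first, child last
        merged.update(tags)  # later (child) values win on key collisions
    return merged


# Hypothetical example data, mirroring the commit's scenario:
parent = {"request_id": "req-1", "$ai_model": "stale-model"}
child = {"$ai_model": "gpt-4.1", "$ai_provider": "openai"}

tags = collect_tags([parent, child])
# child's $ai_model overrides the parent's; request_id is still inherited
```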
Parent: c1548c4 · commit 4e4cd18

9 files changed: 534 additions & 210 deletions

File tree

CHANGELOG.md

Lines changed: 5 additions & 0 deletions
```diff
@@ -1,3 +1,8 @@
+# 7.5.0 - 2026-01-06
+
+feat: Capture Langchain, OpenAI and Anthropic errors as exceptions (if exception autocapture is enabled)
+feat: Add reference to exception in LLMA trace and span events
+
 # 7.4.3 - 2026-01-02

 Fixes cache creation cost for Langchain with Anthropic
```

posthog/__init__.py

Lines changed: 16 additions & 0 deletions
```diff
@@ -29,6 +29,9 @@
 from posthog.contexts import (
     tag as inner_tag,
 )
+from posthog.contexts import (
+    get_tags as inner_get_tags,
+)
 from posthog.exception_utils import (
     DEFAULT_CODE_VARIABLES_IGNORE_PATTERNS,
     DEFAULT_CODE_VARIABLES_MASK_PATTERNS,
@@ -190,6 +193,19 @@ def tag(name: str, value: Any):
     return inner_tag(name, value)


+def get_tags() -> Dict[str, Any]:
+    """
+    Get all tags from the current context.
+
+    Returns:
+        Dict of all tags in the current context
+
+    Category:
+        Contexts
+    """
+    return inner_get_tags()
+
+
 """Settings."""
 api_key = None  # type: Optional[str]
 host = None  # type: Optional[str]
```
posthog/ai/langchain/callbacks.py

Lines changed: 46 additions & 9 deletions
```diff
@@ -22,8 +22,8 @@

 try:
     # LangChain 1.0+ and modern 0.x with langchain-core
-    from langchain_core.callbacks.base import BaseCallbackHandler
     from langchain_core.agents import AgentAction, AgentFinish
+    from langchain_core.callbacks.base import BaseCallbackHandler
 except (ImportError, ModuleNotFoundError):
     # Fallback for older LangChain versions
     from langchain.callbacks.base import BaseCallbackHandler
@@ -35,15 +35,15 @@
     FunctionMessage,
     HumanMessage,
     SystemMessage,
-    ToolMessage,
     ToolCall,
+    ToolMessage,
 )
 from langchain_core.outputs import ChatGeneration, LLMResult
 from pydantic import BaseModel

 from posthog import setup
-from posthog.ai.utils import get_model_params, with_privacy_mode
 from posthog.ai.sanitization import sanitize_langchain
+from posthog.ai.utils import get_model_params, with_privacy_mode
 from posthog.client import Client

 log = logging.getLogger("posthog")
@@ -506,6 +506,14 @@ def _capture_trace_or_span(
         if isinstance(outputs, BaseException):
             event_properties["$ai_error"] = _stringify_exception(outputs)
             event_properties["$ai_is_error"] = True
+            event_properties = _capture_exception_and_update_properties(
+                self._ph_client,
+                outputs,
+                self._distinct_id,
+                self._groups,
+                event_properties,
+            )
+
         elif outputs is not None:
             event_properties["$ai_output_state"] = with_privacy_mode(
                 self._ph_client, self._privacy_mode, outputs
@@ -576,10 +584,24 @@ def _capture_generation(
         if run.tools:
             event_properties["$ai_tools"] = run.tools

+        if self._properties:
+            event_properties.update(self._properties)
+
+        if self._distinct_id is None:
+            event_properties["$process_person_profile"] = False
+
         if isinstance(output, BaseException):
             event_properties["$ai_http_status"] = _get_http_status(output)
             event_properties["$ai_error"] = _stringify_exception(output)
             event_properties["$ai_is_error"] = True
+
+            event_properties = _capture_exception_and_update_properties(
+                self._ph_client,
+                output,
+                self._distinct_id,
+                self._groups,
+                event_properties,
+            )
         else:
             # Add usage
             usage = _parse_usage(output, run.provider, run.model)
@@ -607,12 +629,6 @@ def _capture_generation(
                 self._ph_client, self._privacy_mode, completions
             )

-        if self._properties:
-            event_properties.update(self._properties)
-
-        if self._distinct_id is None:
-            event_properties["$process_person_profile"] = False
-
         self._ph_client.capture(
             distinct_id=self._distinct_id or trace_id,
             event="$ai_generation",
@@ -863,6 +879,27 @@ def _parse_usage(
     return llm_usage


+def _capture_exception_and_update_properties(
+    client: Client,
+    exception: BaseException,
+    distinct_id: Optional[Union[str, int, UUID]],
+    groups: Optional[Dict[str, Any]],
+    event_properties: Dict[str, Any],
+):
+    if client.enable_exception_autocapture:
+        exception_id = client.capture_exception(
+            exception,
+            distinct_id=distinct_id,
+            groups=groups,
+            properties=event_properties,
+        )
+
+        if exception_id:
+            event_properties["$exception_event_id"] = exception_id
+
+    return event_properties
+
+
 def _get_http_status(error: BaseException) -> int:
     # OpenAI: https://github.com/openai/openai-python/blob/main/src/openai/_exceptions.py
     # Anthropic: https://github.com/anthropics/anthropic-sdk-python/blob/main/src/anthropic/_exceptions.py
```
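The behavior of the new helper can be exercised in isolation with a stand-in client. This sketch uses a hypothetical `FakeClient` in place of `posthog.client.Client`: when exception autocapture is enabled, the exception is captured and its event id is linked back onto the AI event via `$exception_event_id`; when disabled, or when `capture_exception` returns `None`, the properties pass through unchanged.

```python
from typing import Any, Dict, List, Optional


class FakeClient:
    """Hypothetical stand-in for posthog.client.Client, for illustration only."""

    def __init__(self, enable_exception_autocapture: bool):
        self.enable_exception_autocapture = enable_exception_autocapture
        self.captured: List[BaseException] = []

    def capture_exception(self, exception, distinct_id=None, groups=None, properties=None):
        # The real client sends an $exception event and returns its event id
        # (or None); here we just record the call and return a fixed id.
        self.captured.append(exception)
        return "evt-123"


def _capture_exception_and_update_properties(
    client, exception, distinct_id, groups, event_properties: Dict[str, Any]
) -> Dict[str, Any]:
    # Mirrors the helper added in the diff above.
    if client.enable_exception_autocapture:
        exception_id = client.capture_exception(
            exception, distinct_id=distinct_id, groups=groups, properties=event_properties
        )
        if exception_id:  # handles None return from capture_exception
            event_properties["$exception_event_id"] = exception_id
    return event_properties


enabled = FakeClient(enable_exception_autocapture=True)
props = _capture_exception_and_update_properties(
    enabled, ValueError("boom"), "user-1", None, {"$ai_is_error": True}
)
# props now carries "$exception_event_id", linking the AI event to the exception event

disabled = FakeClient(enable_exception_autocapture=False)
untouched = _capture_exception_and_update_properties(
    disabled, ValueError("boom"), "user-1", None, {"$ai_is_error": True}
)
# no capture happened; untouched has no "$exception_event_id"
```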
