LangGraph integration¶
If you use LangGraph / LangChain, assistant-stream provides helpers for:

- appending LangGraph events into a state object
- extracting/initializing tool subgraph state for streamed tool execution

These helpers are available only if the LangGraph/LangChain dependencies are installed.
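Because the helpers are optional, a guarded import lets the rest of your code run even when the extras are not installed. A minimal sketch (the `HAS_LANGGRAPH_HELPERS` flag name is ours, not part of the library):

```python
# Guarded import (sketch): the langgraph module only imports successfully
# when the optional LangGraph/LangChain dependencies are present.
try:
    from assistant_stream_ce.modules.langgraph import append_langgraph_event
    HAS_LANGGRAPH_HELPERS = True
except ImportError:  # raised when the optional dependencies are missing
    append_langgraph_event = None
    HAS_LANGGRAPH_HELPERS = False
```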
Append LangGraph events¶
Tool subgraph state helper¶
- assistant_stream_ce.modules.langgraph.get_tool_call_subgraph_state(controller, namespace, subgraph_node, default_state, *, artifact_field_name=None, tool_name=None)[source]¶
Gets the state for a tool call subgraph by traversing the namespace and checking for subgraph nodes. Ensures the last message is a ToolMessage and returns the value of its artifact field.
- Parameters:
controller (RunController) – The run controller managing the state
namespace (Tuple[str, ...]) – Tuple of strings in the format 'node_name:task_id'
subgraph_node (str | List[str] | Callable[[List[str]], bool]) – Node name(s) to check against, or a function that checks node names
default_state (Dict[str, Any]) – Default state to use if the artifact field is None
artifact_field_name (str | None) – Optional field name to extract from the artifact
- Returns:
The artifact field value from the ToolMessage. If the last message is already a ToolMessage, returns its artifact field. If it is an AI message with tool calls, creates a ToolMessage and returns the appropriate artifact field value.
- Return type:
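The three accepted forms of `subgraph_node` (a single name, a list of names, or a predicate over node names) can be illustrated with a small stand-in matcher. This is only a sketch of the matching idea against `'node_name:task_id'` namespace entries, not the library's implementation, and `namespace_matches` is a hypothetical name:

```python
from typing import Callable, List, Tuple, Union

def namespace_matches(
    namespace: Tuple[str, ...],
    subgraph_node: Union[str, List[str], Callable[[List[str]], bool]],
) -> bool:
    # Namespace entries look like 'node_name:task_id'; keep only the node name.
    node_names = [entry.split(":", 1)[0] for entry in namespace]
    if callable(subgraph_node):
        # Predicate form: the caller decides based on the node names.
        return subgraph_node(node_names)
    if isinstance(subgraph_node, str):
        # Single node name to look for.
        return subgraph_node in node_names
    # List form: match if any candidate node name appears.
    return any(name in node_names for name in subgraph_node)
```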
Typical usage¶
```python
from assistant_stream_ce import create_run
from assistant_stream_ce.modules.langgraph import append_langgraph_event

async def run(controller):
    # Ensure state has a place for messages
    controller.state = {"messages": []}

    # In your LangGraph event handler:
    append_langgraph_event(controller.state, "ns", "messages", payload=[...])
```
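Conceptually, appending an event extends the matching channel list inside the state object. A rough stand-in for that effect (`append_event` is a hypothetical name with a simplified signature, not the library's API, which also takes a namespace argument):

```python
def append_event(state: dict, channel: str, payload: list) -> None:
    # Sketch of the append semantics: extend the channel's list,
    # creating the list if the channel is not present yet.
    state.setdefault(channel, []).extend(payload)

state = {"messages": []}
append_event(state, "messages", [{"type": "ai", "content": "hello"}])
```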