Extraction LLMs
TypeGraph uses LLMs for query rewriting, summarization, and - when graph extraction is enabled - automatic entity and relationship extraction during document, event, and thread ingestion. Any model supported by the Vercel AI SDK works. Pass an llm to use the default extractor, or pass a custom extractor when you want full control over extraction behavior.
Supported providers include, each via its AI SDK package:
- OpenAI: @ai-sdk/openai
- xAI: @ai-sdk/xai
- Google: @ai-sdk/google
- AI Gateway (OpenAI): @ai-sdk/gateway
- AI Gateway (xAI): @ai-sdk/gateway

Default Extractor
Pass an AI SDK language model as llm. TypeGraph builds its default extractor internally and uses the configured ontology when one is supplied:
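A minimal configuration sketch of what this might look like, assuming a TypeGraph constructor that accepts llm and ontology options (the exact option names here are illustrative, not confirmed API):

```typescript
import { openai } from "@ai-sdk/openai";

// Hypothetical TypeGraph import and constructor shape — check the
// actual package for the real names and option structure.
import { TypeGraph } from "typegraph";

const graph = new TypeGraph({
  // Any Vercel AI SDK language model works here.
  llm: openai("gpt-4o-mini"),
  // When an ontology is supplied, the default extractor uses it
  // to constrain entity and relationship types.
  ontology: myOntology,
});
```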
Custom Extractor
TypeGraph does not expose single-pass or two-pass extraction knobs. Those staging choices are private to the default extractor. For self-hosted deployments, provide your own extractor when you need a different prompt, model stack, constrained-decoding pipeline, or deterministic parser:
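As an illustration of the deterministic-parser case, here is a self-contained sketch of a custom extractor that uses no LLM at all. The Extractor type and result shapes are assumptions for this example; TypeGraph's real extractor interface may differ:

```typescript
// Hypothetical shapes — TypeGraph's actual extractor contract may differ.
interface Entity { name: string; type: string; }
interface Relationship { source: string; target: string; label: string; }
interface ExtractionResult { entities: Entity[]; relationships: Relationship[]; }

type Extractor = (text: string) => Promise<ExtractionResult>;

// A deterministic, regex-based extractor: capitalized tokens become
// entities, and "X works at Y" becomes a works_at relationship.
// Same inputs always produce same outputs — no model call involved.
const deterministicExtractor: Extractor = async (text) => {
  const entities: Entity[] = [];
  for (const m of text.matchAll(/\b[A-Z][a-zA-Z]+\b/g)) {
    if (!entities.some((e) => e.name === m[0])) {
      entities.push({ name: m[0], type: "Unknown" });
    }
  }
  const relationships: Relationship[] = [];
  for (const m of text.matchAll(/\b([A-Z][a-zA-Z]+) works at ([A-Z][a-zA-Z]+)\b/g)) {
    relationships.push({ source: m[1], target: m[2], label: "works_at" });
  }
  return { entities, relationships };
};
```

The same shape applies if you swap the body for a different prompt pipeline or a constrained-decoding stack: the extractor receives text and returns structured entities and relationships, and TypeGraph stays agnostic about how they were produced.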