# CausalIQ Knowledge API Reference
API documentation for causaliq-knowledge, organised by module.
## Import Patterns
Graph generation classes are available from the `graph` submodule:
```python
from causaliq_knowledge.graph import (
    # Network context (main model)
    NetworkContext,
    NetworkLoadError,
    # Variable specification
    VariableSpec,
    VariableType,
    VariableRole,
    # Supporting models
    ViewDefinition,
    Provenance,
    LLMGuidance,
    Constraints,
    CausalPrinciple,
    GroundTruth,
    PromptDetails,
    # Filtering
    ViewFilter,
    PromptDetail,
    # Generation
    GraphGenerator,
    GraphGeneratorConfig,
    GeneratedGraph,
    ProposedEdge,
    GenerationMetadata,
    # Parameters
    GenerateGraphParams,
    # Prompts
    GraphQueryPrompt,
    OutputFormat,
    # Cache
    GraphCompressor,
)
```
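To illustrate how these classes relate, here is a toy sketch of the core data shapes using plain dataclasses. These are illustrative stand-ins only: the real classes are Pydantic models with richer fields and validation, and the `edge_pairs` helper is hypothetical, not part of the library's API.

```python
from dataclasses import dataclass, field


@dataclass
class VariableSpec:
    """Stand-in for the library's VariableSpec: one variable in the network."""
    name: str
    description: str = ""


@dataclass
class ProposedEdge:
    """Stand-in for ProposedEdge: one directed edge proposed by the LLM."""
    source: str
    target: str
    confidence: float = 0.0


@dataclass
class GeneratedGraph:
    """Stand-in for GeneratedGraph: the set of proposed edges."""
    edges: list[ProposedEdge] = field(default_factory=list)

    def edge_pairs(self) -> list[tuple[str, str]]:
        """Hypothetical helper: edges as (source, target) pairs."""
        return [(e.source, e.target) for e in self.edges]
```

The real models add validation and metadata on top of this basic shape; consult the class reference pages below for actual fields and signatures.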
Cache infrastructure is provided by causaliq-core; see that package's documentation for its API.
LLM clients should be imported from the `llm` submodule:
```python
from causaliq_knowledge.llm import (
    # Abstract base interface
    BaseLLMClient,
    LLMConfig,
    LLMResponse,
    # Vendor clients
    GroqClient,
    GroqConfig,
    GeminiClient,
    GeminiConfig,
    OpenAIClient,
    OpenAIConfig,
    AnthropicClient,
    AnthropicConfig,
    DeepSeekClient,
    DeepSeekConfig,
    MistralClient,
    MistralConfig,
    OllamaClient,
    OllamaConfig,
)
```
## Modules
### Graph Module
LLM-based causal graph generation from network context specifications:
- Graph Generator - generate complete causal graphs
    - `GraphGenerator`, `GraphGeneratorConfig`
    - `GeneratedGraph`, `ProposedEdge`, `GenerationMetadata`
- Network Context - Pydantic models for network context
    - `NetworkContext`, `NetworkLoadError`
    - `VariableSpec`, `VariableType`, `VariableRole`
    - `PromptDetails`, `ViewDefinition`, `Provenance`, `Constraints`
- View Filter - extract context levels
    - `ViewFilter`, `PromptDetail` (MINIMAL, STANDARD, RICH)
- Graph Prompts - prompt builders
    - `GraphQueryPrompt`, `OutputFormat`
- Response Models - response parsing
    - `ProposedEdge`, `GeneratedGraph`, `GenerationMetadata`
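The three prompt-detail levels control how much network context is included when building a prompt. A self-contained sketch of the idea (this is not the library's actual `PromptDetail` class, and the field mapping shown is an assumption for illustration only):

```python
from enum import Enum


class DetailLevel(Enum):
    """Illustrative stand-in for the library's PromptDetail levels."""
    MINIMAL = "minimal"
    STANDARD = "standard"
    RICH = "rich"


def fields_for(level: DetailLevel) -> list[str]:
    """Hypothetical mapping from detail level to included context fields."""
    fields = ["name"]
    if level in (DetailLevel.STANDARD, DetailLevel.RICH):
        fields.append("description")
    if level is DetailLevel.RICH:
        fields += ["llm_guidance", "constraints"]
    return fields
```

See the View Filter reference page for what each level actually includes.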
### LLM Client Interface
Abstract base class and common types for LLM vendor clients:
- `BaseLLMClient` - abstract interface all vendor clients implement
- `LLMConfig` - base configuration dataclass
- `LLMResponse` - unified response format
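This is a standard abstract-base-class design; a self-contained sketch of the pattern, with names and fields that are illustrative stand-ins rather than the actual causaliq-knowledge signatures:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Config:
    """Illustrative stand-in for LLMConfig: settings shared by all vendors."""
    model: str
    temperature: float = 0.0


@dataclass
class Response:
    """Illustrative stand-in for LLMResponse: one shape for every vendor."""
    text: str
    model: str


class BaseClient(ABC):
    """Illustrative stand-in for BaseLLMClient."""

    def __init__(self, config: Config) -> None:
        self.config = config

    @abstractmethod
    def complete(self, prompt: str) -> Response:
        """Each vendor client implements this against its own API."""


class EchoClient(BaseClient):
    """Toy vendor client; a real one would call the provider's HTTP API."""

    def complete(self, prompt: str) -> Response:
        return Response(text=f"echo: {prompt}", model=self.config.model)
```

Because every vendor client presents the same interface and returns the same response shape, callers can switch providers by changing only which client they construct.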
### Vendor API Clients
Direct API clients for specific LLM providers. All implement the
`BaseLLMClient` interface.
- Groq Client - Fast inference via Groq API
- Gemini Client - Google Gemini API
- OpenAI Client - OpenAI GPT models
- Anthropic Client - Anthropic Claude models
- DeepSeek Client - DeepSeek models
- Mistral Client - Mistral AI models
- Ollama Client - Local LLMs via Ollama
### CLI
Command-line interface for graph generation and cache management.