# CausalIQ Knowledge API Reference
API documentation for causaliq-knowledge, organized by module.
## Import Patterns
Core models are available from the top-level package:
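A likely pattern, assuming the models listed under Models and Base below are re-exported at the top level (the exact re-exports may differ):

```python
# Hypothetical top-level imports; check the package's __init__ for
# the actual re-exports.
from causaliq_knowledge import (
    EdgeDirection,
    EdgeKnowledge,
    KnowledgeProvider,
)
```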
Cache infrastructure is available from the cache submodule:
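A sketch of the likely import, assuming `TokenCache` (documented under Cache below) is exposed directly by the submodule:

```python
# Hypothetical import path; the submodule layout may differ.
from causaliq_knowledge.cache import TokenCache
```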
LLM-specific classes should be imported from the llm submodule:
```python
from causaliq_knowledge.llm import (
    # Abstract base interface
    BaseLLMClient,
    LLMConfig,
    LLMResponse,
    # Main provider
    LLMKnowledge,
    # Vendor clients
    GroqClient,
    GroqConfig,
    GeminiClient,
    GeminiConfig,
    OpenAIClient,
    OpenAIConfig,
    AnthropicClient,
    AnthropicConfig,
    DeepSeekClient,
    DeepSeekConfig,
    MistralClient,
    MistralConfig,
    OllamaClient,
    OllamaConfig,
    # Prompts
    EdgeQueryPrompt,
    parse_edge_response,
)
```
## Modules
### Models
Core Pydantic models for representing causal knowledge:
- EdgeDirection - Enum for causal edge direction (a_to_b, b_to_a, undirected)
- EdgeKnowledge - Structured knowledge about a potential causal edge
### Cache
SQLite-backed caching infrastructure:
- TokenCache - Cache with connection management and transaction support
### Base
Abstract interfaces for knowledge providers:
- KnowledgeProvider - Abstract base class that all knowledge sources implement
### LLM Provider
Main entry point for LLM-based knowledge queries:
- LLMKnowledge - KnowledgeProvider implementation using vendor-specific API clients
- weighted_vote - Multi-model consensus by weighted voting
- highest_confidence - Select the single response with the highest confidence
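The two consensus strategies can be illustrated with a minimal, self-contained sketch. This is not the library's implementation: the response shape here (a direction string paired with a confidence score) is an assumption for illustration, standing in for the library's LLMResponse objects.

```python
from collections import defaultdict

def weighted_vote(responses):
    """Pick the direction with the largest total confidence.

    `responses` is a list of (direction, confidence) pairs -- a
    simplified stand-in for the library's response objects.
    """
    totals = defaultdict(float)
    for direction, confidence in responses:
        totals[direction] += confidence
    return max(totals, key=totals.get)

def highest_confidence(responses):
    """Pick the direction of the single most confident response."""
    return max(responses, key=lambda r: r[1])[0]

votes = [("a_to_b", 0.6), ("b_to_a", 0.9), ("a_to_b", 0.5)]
print(weighted_vote(votes))       # "a_to_b": 0.6 + 0.5 = 1.1 beats 0.9
print(highest_confidence(votes))  # "b_to_a": single highest score is 0.9
```

Weighted voting rewards agreement across models, while highest-confidence selection trusts the single most certain model; the example above shows they can disagree on the same inputs.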
### LLM Client Interface
Abstract base class and common types for LLM vendor clients:
- BaseLLMClient - Abstract interface all vendor clients implement
- LLMConfig - Base configuration dataclass
- LLMResponse - Unified response format
### Vendor API Clients
Direct API clients for specific LLM providers. All implement the BaseLLMClient interface.
- Groq Client - Fast inference via Groq API
- Gemini Client - Google Gemini API
- OpenAI Client - OpenAI GPT models
- Anthropic Client - Anthropic Claude models
- DeepSeek Client - DeepSeek models
- Mistral Client - Mistral AI models
- Ollama Client - Local LLMs via Ollama
### Prompts
Prompt templates for LLM edge queries:
- EdgeQueryPrompt - Builder for edge existence/orientation prompts
- parse_edge_response - Parse LLM JSON responses to EdgeKnowledge
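A minimal sketch of the parsing step. The JSON keys used here (`exists`, `direction`, `confidence`) are assumptions for illustration; the library's `parse_edge_response` returns an `EdgeKnowledge` model against its own schema.

```python
import json

def parse_edge_response_sketch(raw: str) -> dict:
    """Parse an LLM JSON reply into a plain dict.

    The keys below are hypothetical; the real parse_edge_response
    validates against the EdgeKnowledge Pydantic model instead.
    """
    data = json.loads(raw)
    return {
        "exists": bool(data["exists"]),
        "direction": data.get("direction", "undirected"),
        "confidence": float(data.get("confidence", 0.0)),
    }

reply = '{"exists": true, "direction": "a_to_b", "confidence": 0.85}'
print(parse_edge_response_sketch(reply))
```

Parsing into a validated model (rather than using the raw dict) is what lets downstream consensus functions rely on well-typed direction and confidence values.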
### CLI
Command-line interface for testing and querying.