AI Models
ToolFront supports various AI model providers through Pydantic AI. You can use models from OpenAI, Anthropic, Google, Mistral, Cohere, and more.
Basic Usage
All models are specified using the provider prefix format:
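For example (the model names below are illustrative; any model supported by Pydantic AI works the same way):

openai:gpt-4o
anthropic:claude-3-5-sonnet-latest
google-gla:gemini-1.5-flash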
Export your provider's API key (for example OPENAI_API_KEY or ANTHROPIC_API_KEY), then pass the model string when asking:
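A minimal sketch using OpenAI (the connection string and question are placeholders; the same pattern applies to any other provider prefix):

from toolfront import Database

# Assumes OPENAI_API_KEY is set in the environment
db = Database("postgres://user:pass@localhost:5432/mydb")
answer = db.ask("What are our best-sellers?", model="openai:gpt-4o")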
Setting Default Model
You can set a default model using the TOOLFRONT_MODEL environment variable:
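For example, in your shell (the model string is illustrative):

export TOOLFRONT_MODEL="openai:gpt-4o"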
Then use ToolFront directly, without specifying a model:
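Continuing the sketch above (connection string and question are placeholders):

from toolfront import Database

db = Database("postgres://user:pass@localhost:5432/mydb")
answer = db.ask("What are our best-sellers?")  # uses the model from TOOLFRONT_MODEL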
Custom Model Providers
You can use any Pydantic AI model provider and configuration with ToolFront:
from toolfront import Database
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openrouter import OpenRouterProvider

# Route requests through OpenRouter's OpenAI-compatible API
openrouter_model = OpenAIChatModel(
    'anthropic/claude-3.5-sonnet',
    provider=OpenRouterProvider(api_key='your-openrouter-api-key'),
)

db = Database("postgres://user:pass@localhost:5432/mydb")
answer = db.ask("What are our best-sellers?", model=openrouter_model)
from toolfront import Database
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

# Point an OpenAI-compatible model at a local Ollama server
ollama_model = OpenAIChatModel(
    'llama3.2',
    provider=OpenAIProvider(base_url='http://localhost:11434/v1', api_key='ignored'),
)

db = Database("postgres://user:pass@localhost:5432/mydb")
answer = db.ask("What's the revenue of our top-5 products?", model=ollama_model)
from toolfront import Database
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

# Call OpenAI through an AI gateway's OpenAI-compatible endpoint
# (here Cloudflare AI Gateway; substitute your account ID and gateway slug)
gateway_model = OpenAIChatModel(
    'gpt-4o',
    provider=OpenAIProvider(
        base_url='https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_slug}/openai',
        api_key='your-openai-api-key',
    ),
)

db = Database("postgres://user:pass@localhost:5432/mydb")
answer = db.ask("What's the revenue of our top-5 products?", model=gateway_model)
For reference, see the Pydantic AI model documentation, which includes the full list of OpenAI-compatible providers.