The Models Registry connects LLM providers to the Alquimia Runtime. Each entry tells the Runtime which provider class to use, what model to call, and how to authenticate.

Adding a model

Click Add Model and fill in the form:
| Field | Description |
| --- | --- |
| Name | Display name shown in the agent creation wizard (e.g., "Claude Sonnet", "GPT-4o") |
| Provider | The LangChain integration class the Runtime uses. Determines the API format. |
| Model ID | The exact identifier used by the provider API (e.g., claude-sonnet-4-20250514, gpt-4o) |
| API Key | Your provider API key. Stored as a secret automatically. |
| Base URL | Optional. Override the default API endpoint. |
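The form fields above can be pictured as a simple registry entry. This is an illustrative sketch assuming a plain dict-based schema; the field names mirror the form, not the actual Alquimia config format.

```python
def make_model_entry(name, provider, model_id, api_key, base_url=None):
    """Build a hypothetical registry entry from the Add Model form fields."""
    entry = {
        "name": name,          # display name shown in the agent wizard
        "provider": provider,  # LangChain integration class to use
        "model_id": model_id,  # exact identifier sent to the provider API
        "api_key": api_key,    # stored as a secret on save (see below)
    }
    if base_url is not None:
        entry["base_url"] = base_url  # optional endpoint override
    return entry

entry = make_model_entry(
    name="Claude Sonnet",
    provider="anthropic",
    model_id="claude-sonnet-4-20250514",
    api_key="sk-ant-...",
)
```

Base URL is omitted here, so the Runtime would fall back to the provider's default endpoint.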

Supported providers

| Provider | Example Model IDs |
| --- | --- |
| anthropic | claude-sonnet-4-20250514, claude-haiku-4-5-20251001 |
| openai | gpt-4o, gpt-4o-mini, o1-preview |
| groq | llama-3.3-70b-versatile, mixtral-8x7b-32768 |
| google | gemini-1.5-pro, gemini-1.5-flash |
| mistral | mistral-large-latest, mistral-small-latest |
| cohere | command-r-plus, command-r |
| meta | meta-llama/Llama-3.3-70B-Instruct |
All providers use the OpenAI-compatible interface under the hood. If your provider supports OpenAI-compatible endpoints, you can register it by setting the Base URL to your provider’s endpoint and using the openai provider type.
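For example, a provider not in the table above can be registered through the openai provider type, as described. This sketch reuses the same hypothetical dict schema; the model name and the vLLM endpoint URL are assumptions for illustration.

```python
# Registering a non-listed, OpenAI-compatible provider: set the provider
# type to "openai" and point Base URL at the custom endpoint.
custom_model = {
    "name": "Self-hosted Llama",                       # display name (example)
    "provider": "openai",                              # speak the OpenAI API format
    "model_id": "meta-llama/Llama-3.3-70B-Instruct",   # ID the server expects
    "api_key": "not-needed-locally",                   # many local servers ignore it
    "base_url": "http://localhost:8000/v1",            # assumed self-hosted vLLM URL
}
```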

API keys become secrets

When you save a model with an API key, Studio automatically:
  1. Stores the key in the Registry secrets store with a name based on the provider (e.g., ANTHROPIC_API_KEY)
  2. Replaces the raw key in the model config with $ANTHROPIC_API_KEY
The raw key is never stored in plain text after this point. See Secrets for how this works.

To rotate a key, edit the model and enter the new API key. The secret is updated, and all agents using that model pick up the new key automatically.
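The two save-time steps above can be sketched as a single substitution. The secrets store is modeled here as a plain dict, and the naming convention is the one described above; the actual storage backend is an assumption.

```python
def store_api_key(model_config, secrets_store):
    """Move a raw API key into the secrets store and leave a $NAME reference."""
    secret_name = f"{model_config['provider'].upper()}_API_KEY"
    secrets_store[secret_name] = model_config["api_key"]  # raw key lives only here
    model_config["api_key"] = f"${secret_name}"           # config keeps a reference
    return model_config

secrets = {}
config = {"provider": "anthropic", "api_key": "sk-ant-raw-key"}
store_api_key(config, secrets)
# config["api_key"] is now "$ANTHROPIC_API_KEY"
```

Rotation then amounts to overwriting secrets["ANTHROPIC_API_KEY"]; the $-reference in the model config never changes, which is why agents pick up the new key without edits.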

Base URL use cases

The Base URL field lets you use:
  • OpenAI-compatible proxies
  • Local endpoints — Ollama (http://localhost:11434/v1), LM Studio
  • Custom deployments — self-hosted vLLM, TGI
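To see why a Base URL override is all a local endpoint needs, here is a sketch of how an OpenAI-compatible chat request is assembled against Ollama's local endpoint. It builds the request without sending it; the /chat/completions route is the standard OpenAI path, and the model ID is an example.

```python
import json
import urllib.request

def build_chat_request(base_url, model_id, api_key, prompt):
    """Assemble (but do not send) an OpenAI-compatible chat completion request."""
    payload = {
        "model": model_id,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",  # standard OpenAI route
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # local servers often ignore this
        },
        method="POST",
    )

req = build_chat_request(
    "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    "llama3.3",                   # example local model ID
    "dummy-key",
    "Hello",
)
```

Swapping the base URL for a vLLM, TGI, or proxy endpoint changes nothing else about the request, which is what makes the openai provider type reusable across all of them.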

Next steps

Once you have a model, you're ready to create your first agent. Continue to Creating an Agent.