QuantAgent supports three LLM providers. Each analysis run uses two separate models: an `agent_llm` for the Indicator, Pattern, and Trend agents, and a `graph_llm` for the Decision agent that synthesises the final LONG/SHORT trade directive.

The Pattern and Trend agents pass chart images to the model. Your chosen provider and model must support vision (image) input. All default models listed below satisfy this requirement.
## Supported providers
### OpenAI

Default provider. Uses the OpenAI Chat Completions API.

| Role | Default model |
|---|---|
| Agent LLM | `gpt-4o-mini` |
| Graph LLM | `gpt-4o` |
**Set the API key**

Environment variable (recommended):

```bash
export OPENAI_API_KEY="sk-..."
```

Config dict:

```python
config = {
    "agent_llm_provider": "openai",
    "graph_llm_provider": "openai",
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
    "api_key": "sk-...",
}
```

Web UI: open Settings, select OpenAI, paste your key, and click Save. The key is applied immediately via `POST /api/update-api-key`.

Obtain a key at platform.openai.com/api-keys.

### Anthropic

Uses the Anthropic Messages API via `langchain-anthropic`.

| Role | Default model |
|---|---|
| Agent LLM | `claude-haiku-4-5-20251001` |
| Graph LLM | `claude-haiku-4-5-20251001` |
**Set the API key**

Environment variable (recommended):

```bash
export ANTHROPIC_API_KEY="sk-ant-..."
```

Config dict:

```python
config = {
    "agent_llm_provider": "anthropic",
    "graph_llm_provider": "anthropic",
    "agent_llm_model": "claude-haiku-4-5-20251001",
    "graph_llm_model": "claude-haiku-4-5-20251001",
    "anthropic_api_key": "sk-ant-...",
}
```

Web UI: open Settings, select Anthropic, paste your key, and click Save.

Obtain a key at console.anthropic.com.

### Qwen

Uses Alibaba Cloud's DashScope API via `langchain-qwq`. The graph model (`qwen3-vl-plus`) is a vision-language model required for chart analysis.

| Role | Default model |
|---|---|
| Agent LLM | `qwen3-max` |
| Graph LLM | `qwen3-vl-plus` |
DashScope inference endpoints are hosted in Singapore. Depending on your location, you may experience higher latency or occasional timeouts compared to OpenAI or Anthropic. The `TradingGraph` client is configured with `max_retries=4` to handle transient failures automatically.
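The retry behaviour that `max_retries=4` provides can be sketched as a plain retry loop. This is an illustrative helper under assumed semantics (retry on any exception, with optional exponential backoff), not QuantAgent's actual client code:

```python
import time


def with_retries(fn, max_retries=4, base_delay=0.0):
    """Call fn, retrying up to max_retries additional times on failure.

    Sketch of what a max_retries=4 setting means in practice: the call
    is attempted at most 1 + max_retries times before the error is
    re-raised to the caller.
    """
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_retries:
                raise  # retries exhausted; surface the original error
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff
```

With transient timeouts, a flaky call that fails twice and then succeeds completes on the third attempt without the caller ever seeing an error.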
**Set the API key**

Environment variable (recommended):

```bash
export DASHSCOPE_API_KEY="sk-..."
```

Config dict:

```python
config = {
    "agent_llm_provider": "qwen",
    "graph_llm_provider": "qwen",
    "agent_llm_model": "qwen3-max",
    "graph_llm_model": "qwen3-vl-plus",
    "qwen_api_key": "sk-...",
}
```

Web UI: open Settings, select Qwen, paste your DashScope key, and click Save.

Obtain a key at dashscope.console.aliyun.com.
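The three key sections above can be condensed into a lookup table. `PROVIDER_KEYS` and `resolve_api_key` are illustrative names, and the config-before-environment resolution order is an assumption, not documented QuantAgent behaviour:

```python
import os

# Per-provider config key and environment variable, as documented above.
PROVIDER_KEYS = {
    "openai":    {"config_key": "api_key",           "env_var": "OPENAI_API_KEY"},
    "anthropic": {"config_key": "anthropic_api_key", "env_var": "ANTHROPIC_API_KEY"},
    "qwen":      {"config_key": "qwen_api_key",      "env_var": "DASHSCOPE_API_KEY"},
}


def resolve_api_key(provider, config):
    """Return the API key for a provider.

    Assumed resolution order: an explicit config value wins, otherwise
    fall back to the environment variable.
    """
    keys = PROVIDER_KEYS[provider]
    return config.get(keys["config_key"]) or os.environ.get(keys["env_var"])
```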
## Switching providers

**Web UI**
Open Settings, choose the provider from the dropdown, and click Apply. This sends a `POST /api/update-provider` request that updates both `agent_llm_provider` and `graph_llm_provider` simultaneously and swaps the model names to the defaults for that provider.
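The update that `/api/update-provider` performs can be sketched in plain Python using the default models from the tables above. `switch_provider` and `DEFAULT_MODELS` are illustrative names, not part of the QuantAgent API:

```python
# (agent_llm_model, graph_llm_model) defaults per provider, from the
# tables in "Supported providers" above.
DEFAULT_MODELS = {
    "openai":    ("gpt-4o-mini", "gpt-4o"),
    "anthropic": ("claude-haiku-4-5-20251001", "claude-haiku-4-5-20251001"),
    "qwen":      ("qwen3-max", "qwen3-vl-plus"),
}


def switch_provider(config, provider):
    """Set both provider fields and swap models to that provider's defaults."""
    agent_model, graph_model = DEFAULT_MODELS[provider]
    config.update(
        agent_llm_provider=provider,
        graph_llm_provider=provider,
        agent_llm_model=agent_model,
        graph_llm_model=graph_model,
    )
    return config
```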
**Programmatically**

Update the config and call `refresh_llms()` to rebuild the agent and graph LLMs without recreating the full `TradingGraph` object:
```python
from trading_graph import TradingGraph

tg = TradingGraph()

# Switch to Anthropic
tg.config["agent_llm_provider"] = "anthropic"
tg.config["graph_llm_provider"] = "anthropic"
tg.config["agent_llm_model"] = "claude-haiku-4-5-20251001"
tg.config["graph_llm_model"] = "claude-haiku-4-5-20251001"
tg.config["anthropic_api_key"] = "sk-ant-..."
tg.refresh_llms()
```
Or use `update_api_key()`, which calls `refresh_llms()` for you:

```python
tg.update_api_key("sk-ant-...", provider="anthropic")
```
## Configuration reference

All provider settings live in `default_config.py`:
```python
DEFAULT_CONFIG = {
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
    "agent_llm_provider": "openai",  # "openai", "anthropic", or "qwen"
    "graph_llm_provider": "openai",
    "agent_llm_temperature": 0.1,
    "graph_llm_temperature": 0.1,
    "api_key": "sk-",            # OpenAI key
    "anthropic_api_key": "sk-",  # Anthropic key
    "qwen_api_key": "sk-",       # Qwen / DashScope key
}
```
You can pass a full or partial override dict to `TradingGraph(config=...)`. Any key not present in your dict falls back to the default.
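The fallback semantics amount to a dict merge where your keys win. A minimal sketch, using a trimmed copy of `DEFAULT_CONFIG` for illustration:

```python
# Trimmed copy of the defaults, for illustration only.
DEFAULT_CONFIG = {
    "agent_llm_model": "gpt-4o-mini",
    "graph_llm_model": "gpt-4o",
    "agent_llm_provider": "openai",
    "graph_llm_provider": "openai",
    "agent_llm_temperature": 0.1,
    "graph_llm_temperature": 0.1,
}

# Partial override: only the keys you care about.
overrides = {
    "graph_llm_model": "gpt-4o-mini",
    "graph_llm_temperature": 0.0,
}

# Keys in `overrides` replace the defaults; everything else falls back.
config = {**DEFAULT_CONFIG, **overrides}
```

Here `graph_llm_model` and `graph_llm_temperature` take the overridden values, while `agent_llm_model` and the provider fields keep their defaults.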