# Model Setup

How to connect AI models to Skycode.
Skycode works with any model through your own API key. Requests go directly to the provider; no proxy servers sit in between.
## Supported Providers
| Provider | Models | Note |
|---|---|---|
| OpenRouter | Claude, GPT, Gemini, DeepSeek, Qwen and 200+ | Recommended — one key for all models |
| Anthropic | Claude 4, Claude 3.5 Sonnet | Direct |
| OpenAI | GPT-4o, o1, o3 | Direct |
| Google AI | Gemini 2.5 Pro, Flash | Direct |
| GigaChat | GigaChat-2 Pro, Max | Russian provider |
| YandexGPT | YandexGPT 5 Pro, Lite | Russian provider |
| Ollama | Any local model | Fully offline |
## Quick Start with OpenRouter
- Sign up at openrouter.ai
- Create an API key
- In Skycode settings: provider → OpenRouter
- Paste the key
- Select a model (recommended: `anthropic/claude-sonnet-4`)
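To sanity-check the key outside the editor, the same call Skycode makes can be sketched against OpenRouter's OpenAI-compatible chat endpoint. A minimal sketch; the key and prompt below are placeholders:

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for OpenRouter."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        OPENROUTER_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",  # your OpenRouter key
            "Content-Type": "application/json",
        },
    )

# Sending requires a valid key and network access:
# with urllib.request.urlopen(build_request("sk-or-...", "anthropic/claude-sonnet-4", "ping")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because OpenRouter mirrors the OpenAI request format, the same sketch works for any of the 200+ models by changing only the model ID.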
## Local Models (Ollama)
For fully offline work:
- Install Ollama
- Download a model: `ollama pull qwen2.5-coder:32b`
- In Skycode settings: provider → Ollama
- URL: `http://localhost:11434`
- Model: `qwen2.5-coder:32b`
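Once the model is pulled, Ollama serves an HTTP API on that URL, which is what Skycode talks to. A minimal sketch of a request against the `/api/generate` endpoint, assuming the default port:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON response instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# Requires a running `ollama serve`; nothing leaves your machine:
# with urllib.request.urlopen(build_generate_request("qwen2.5-coder:32b", "hello")) as resp:
#     print(json.load(resp)["response"])
```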
## Choosing a Model
| Task | Model | Why |
|---|---|---|
| Daily work | Claude 3.5 Sonnet | Best price/quality balance |
| Complex tasks | Claude 4 Opus | Best quality |
| Budget option | DeepSeek V3 | Cheap and good |
| Offline | Qwen Coder (latest version) | Best local model for code |