Skycode

Model Setup

How to connect AI models to Skycode

Skycode works with any model through your own API key. Data goes directly to the provider — no proxy servers.

Supported Providers

| Provider | Models | Note |
|---|---|---|
| OpenRouter | Claude, GPT, Gemini, DeepSeek, Qwen, and 200+ more | Recommended: one key for all models |
| Anthropic | Claude 4, Claude 3.5 Sonnet | Direct |
| OpenAI | GPT-4o, o1, o3 | Direct |
| Google AI | Gemini 2.5 Pro, Flash | Direct |
| GigaChat | GigaChat-2 Pro, Max | Russian provider |
| YandexGPT | YandexGPT 5 Pro, Lite | Russian provider |
| Ollama | Any local model | Fully offline |

Quick Start with OpenRouter

  1. Sign up at openrouter.ai
  2. Create an API key
  3. In Skycode settings: provider → OpenRouter
  4. Paste the key
  5. Select a model (recommended: `anthropic/claude-sonnet-4`)
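OpenRouter exposes an OpenAI-compatible chat-completions endpoint, so you can sanity-check a key outside Skycode with a few lines of Python. This is a minimal sketch: the endpoint URL and model id are OpenRouter's public values, and `OPENROUTER_API_KEY` is an assumed environment variable holding the key from step 2.

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat-completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat-completions request for OpenRouter."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # The API key is passed as a standard Bearer token.
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("anthropic/claude-sonnet-4", "Say hello in one word.")
# Sending it once the key is set is a few more lines:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request goes straight to openrouter.ai with your own key, this also illustrates the no-proxy claim above: nothing passes through Skycode's servers.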

Local Models (Ollama)

For fully offline work:

  1. Install Ollama
  2. Download a model: `ollama pull qwen2.5-coder:32b`
  3. In Skycode settings: provider → Ollama
  4. URL: `http://localhost:11434`
  5. Model: `qwen2.5-coder:32b`

Choosing a Model

| Task | Model | Why |
|---|---|---|
| Daily work | Claude 3.5 Sonnet | Best price/quality balance |
| Complex tasks | Claude 4 Opus | Best quality |
| Budget option | DeepSeek V3 | Cheap and good |
| Offline | Qwen Coder (latest version) | Best local model for code |
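If you switch models per task in scripts or tooling, the table above reduces to a small lookup. The model ids below are assumptions written in OpenRouter/Ollama naming style; check your provider's model list for the exact names.

```python
# Illustrative task-to-model mapping; the ids are assumed, not canonical.
MODEL_BY_TASK = {
    "daily": "anthropic/claude-3.5-sonnet",
    "complex": "anthropic/claude-opus-4",
    "budget": "deepseek/deepseek-chat",
    "offline": "qwen2.5-coder:32b",  # served locally via Ollama
}

def pick_model(task: str) -> str:
    """Fall back to the daily-work model for unrecognized task types."""
    return MODEL_BY_TASK.get(task, MODEL_BY_TASK["daily"])
```

A fallback to the daily-work model keeps the lookup total, so a typo in the task name degrades to a sensible default instead of an error.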