Transparent, easy-to-understand pricing: pay only for the volume you actually use. No subscriptions or hidden fees.
Prices are calculated per 1,000 "tokens" (the units of measure for language models). A token can be a whole word, part of a word, or even a punctuation mark. The price differs for "prompt" tokens (what you send to the model) and "completion" tokens (what the model returns).
Why not simply count characters?
Modern models are optimized to work with tokens, which yields significant savings. For example, when generating the phrase «Привет! Как я могу помочь?» ("Hi! How can I help?"), GPT-4o consumes only 8 tokens, even though the string contains 26 characters. As a result, token-based billing works out more than three times cheaper than per-character billing.
Detailed pricing for each plan
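As a quick sanity check on the arithmetic above, the character count and the token-to-character ratio can be verified in a few lines (the token count of 8 is taken from the text; treating the ratio as the price difference assumes the same unit rate per character as per token):

```python
# Verify the character count of the example phrase.
phrase = "Привет! Как я могу помочь?"
num_chars = len(phrase)  # 26 characters
num_tokens = 8           # GPT-4o token count for this phrase, as stated above

# At the same unit rate, per-character billing would cost this many
# times more than per-token billing:
ratio = num_chars / num_tokens
print(num_chars, ratio)  # 26 3.25
```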
Service | Price |
---|---|
Vector Store | 10.00 ₽ per 1 GB per day (first 10 MB free) |
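A minimal sketch of the Vector Store billing rule above (the function name and the 1 GB = 1024 MB conversion are assumptions; actual metering may differ):

```python
FREE_MB = 10              # first 10 MB are free
RATE_RUB_PER_GB_DAY = 10.00

def vector_store_cost(size_mb: float, days: int) -> float:
    """Cost in rubles for a store of `size_mb` megabytes held for `days` days."""
    billable_gb = max(0.0, size_mb - FREE_MB) / 1024  # assuming 1 GB = 1024 MB
    return billable_gb * RATE_RUB_PER_GB_DAY * days

# Exactly 1 GB above the free allowance, stored for 30 days:
print(vector_store_cost(10 + 1024, 30))  # 300.0
```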
Language models (LLM)
Prices are quoted per 1,000 tokens. For audio models, the same 1,000-token "portion" corresponds to roughly one minute of audio.
Provider | Model | Prompt | Completion |
---|---|---|---|
gigachat | Features | | |
GigaChat | 1.0000 ₽ | 1.0000 ₽ | |
GigaChat-2 | 1.0000 ₽ | 1.0000 ₽ | |
GigaChat-2-Max | 1.0000 ₽ | 1.0000 ₽ | |
GigaChat-Plus | 1.0000 ₽ | 1.0000 ₽ | |
GigaChat-Pro | 1.0000 ₽ | 1.0000 ₽ | |
openai | Features | | |
OpenAI: o1-pro | 14.1900 ₽ | 56.7600 ₽ | |
OpenAI: GPT-4.5 (Preview) | 7.0950 ₽ | 14.1900 ₽ | |
OpenAI: GPT-4 32k | 5.6760 ₽ | 11.3520 ₽ | |
OpenAI: GPT-4 32k (older v0314) | 5.6760 ₽ | 11.3520 ₽ | |
OpenAI: o1 | 1.4190 ₽ | 5.6760 ₽ | |
OpenAI: o1-preview (2024-09-12) | 1.4190 ₽ | 5.6760 ₽ | |
OpenAI: GPT-4 | 2.8380 ₽ | 5.6760 ₽ | |
OpenAI: GPT-4 (older v0314) | 2.8380 ₽ | 5.6760 ₽ | |
OpenAI: o3 | 0.9460 ₽ | 3.7840 ₽ | |
OpenAI: GPT-4 Turbo | 0.9460 ₽ | 2.8380 ₽ | |
OpenAI: GPT-4 Turbo Preview | 0.9460 ₽ | 2.8380 ₽ | |
OpenAI: GPT-4 Turbo (older v1106) | 0.9460 ₽ | 2.8380 ₽ | |
OpenAI: GPT-4o (extended) | 0.5676 ₽ | 1.7028 ₽ | |
OpenAI: ChatGPT-4o | 0.4730 ₽ | 1.4190 ₽ | |
OpenAI: GPT-4o (2024-05-13) | 0.4730 ₽ | 1.4190 ₽ | |
o1-preview | 1.0000 ₽ | 1.0000 ₽ | |
gpt-4o | 1.0000 ₽ | 1.0000 ₽ | |
OpenAI: GPT-4o Search Preview | 0.2365 ₽ | 0.9460 ₽ | |
OpenAI: GPT-4o (2024-11-20) | 0.2365 ₽ | 0.9460 ₽ | |
OpenAI: GPT-4o (2024-08-06) | 0.2365 ₽ | 0.9460 ₽ | |
OpenAI: GPT-4.1 | 0.1892 ₽ | 0.7568 ₽ | |
OpenAI: o4 Mini High | 0.1041 ₽ | 0.4162 ₽ | |
OpenAI: o4 Mini | 0.1041 ₽ | 0.4162 ₽ | |
OpenAI: o3 Mini High | 0.1041 ₽ | 0.4162 ₽ | |
OpenAI: o3 Mini | 0.1041 ₽ | 0.4162 ₽ | |
OpenAI: o1-mini (2024-09-12) | 0.1041 ₽ | 0.4162 ₽ | |
OpenAI: GPT-3.5 Turbo 16k | 0.2838 ₽ | 0.3784 ₽ | |
OpenAI: GPT-3.5 Turbo (older v0613) | 0.0946 ₽ | 0.1892 ₽ | |
OpenAI: GPT-3.5 Turbo 16k (older v1106) | 0.0946 ₽ | 0.1892 ₽ | |
OpenAI: GPT-3.5 Turbo Instruct | 0.1419 ₽ | 0.1892 ₽ | |
OpenAI: GPT-4.1 Mini | 0.0378 ₽ | 0.1514 ₽ | |
OpenAI: GPT-3.5 Turbo | 0.0473 ₽ | 0.1419 ₽ | |
OpenAI: GPT-3.5 Turbo 16k | 0.0473 ₽ | 0.1419 ₽ | |
o1-mini | 0.1000 ₽ | 0.1000 ₽ | |
gpt-4o-mini | 0.1000 ₽ | 0.1000 ₽ | |
OpenAI: GPT-4o-mini Search Preview | 0.0142 ₽ | 0.0568 ₽ | |
OpenAI: GPT-4o-mini (2024-07-18) | 0.0142 ₽ | 0.0568 ₽ | |
OpenAI: GPT-4.1 Nano | 0.0095 ₽ | 0.0378 ₽ | |
anthropic | Features | | |
Anthropic: Claude 3 Opus (self-moderated) | 1.4190 ₽ | 7.0950 ₽ | |
Anthropic: Claude 3 Opus | 1.4190 ₽ | 7.0950 ₽ | |
Anthropic: Claude v2.1 (self-moderated) | 0.7568 ₽ | 2.2704 ₽ | |
Anthropic: Claude v2.1 | 0.7568 ₽ | 2.2704 ₽ | |
Anthropic: Claude v2 (self-moderated) | 0.7568 ₽ | 2.2704 ₽ | |
Anthropic: Claude v2 | 0.7568 ₽ | 2.2704 ₽ | |
Anthropic: Claude v2.0 (self-moderated) | 0.7568 ₽ | 2.2704 ₽ | |
Anthropic: Claude v2.0 | 0.7568 ₽ | 2.2704 ₽ | |
Anthropic: Claude 3.7 Sonnet | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3.7 Sonnet (thinking) | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3.7 Sonnet (self-moderated) | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3.5 Sonnet (self-moderated) | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3.5 Sonnet | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3.5 Sonnet (2024-06-20) (self-moderated) | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3.5 Sonnet (2024-06-20) | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3 Sonnet (self-moderated) | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3 Sonnet | 0.2838 ₽ | 1.4190 ₽ | |
Anthropic: Claude 3.5 Haiku (self-moderated) | 0.0757 ₽ | 0.3784 ₽ | |
Anthropic: Claude 3.5 Haiku | 0.0757 ₽ | 0.3784 ₽ | |
Anthropic: Claude 3.5 Haiku (2024-10-22) (self-moderated) | 0.0757 ₽ | 0.3784 ₽ | |
Anthropic: Claude 3.5 Haiku (2024-10-22) | 0.0757 ₽ | 0.3784 ₽ | |
Anthropic: Claude 3 Haiku (self-moderated) | 0.0237 ₽ | 0.1182 ₽ | |
Anthropic: Claude 3 Haiku | 0.0237 ₽ | 0.1182 ₽ | |
google | Features | | |
Google: Gemini 2.5 Pro Preview | 0.1182 ₽ | 0.9460 ₽ | |
Google: Gemini 1.5 Pro | 0.1182 ₽ | 0.4730 ₽ | |
Google: Gemini 2.5 Flash Preview (thinking) | 0.0142 ₽ | 0.3311 ₽ | |
Google: Gemini Pro Vision 1.0 | 0.0473 ₽ | 0.1419 ₽ | |
Google: Gemini 2.5 Flash Preview | 0.0142 ₽ | 0.0568 ₽ | |
gemini-1.5-pro | 0.0500 ₽ | 0.0500 ₽ | |
Google: Gemini 2.0 Flash | 0.0095 ₽ | 0.0378 ₽ | |
Google: Gemini 2.0 Flash Lite | 0.0071 ₽ | 0.0284 ₽ | |
Google: Gemma 2 27B | 0.0095 ₽ | 0.0284 ₽ | |
Google: Gemini 1.5 Flash | 0.0071 ₽ | 0.0284 ₽ | |
Google: Gemma 3 27B | 0.0095 ₽ | 0.0189 ₽ | |
Google: Gemini 1.5 Flash 8B | 0.0035 ₽ | 0.0142 ₽ | |
gemini-2.0-flash-exp | 0.0100 ₽ | 0.0100 ₽ | |
gemini-1.5-flash | 0.0100 ₽ | 0.0100 ₽ | |
Google: Gemma 3 12B | 0.0047 ₽ | 0.0095 ₽ | |
Google: Gemma 2 9B | 0.0019 ₽ | 0.0057 ₽ | |
Google: Gemma 3 4B | 0.0019 ₽ | 0.0038 ₽ | |
deepseek | Features | | |
deepseek-chat | 1.0000 ₽ | 1.0000 ₽ | |
deepseek-reasoner | 1.0000 ₽ | 1.0000 ₽ | |
DeepSeek: DeepSeek Prover V2 | 0.0473 ₽ | 0.2062 ₽ | |
DeepSeek: R1 | 0.0473 ₽ | 0.2062 ₽ | |
DeepSeek: DeepSeek V3 0324 | 0.0255 ₽ | 0.1041 ₽ | |
DeepSeek: R1 Distill Llama 70B | 0.0095 ₽ | 0.0378 ₽ | |
DeepSeek: R1 Distill Qwen 1.5B | 0.0170 ₽ | 0.0170 ₽ | |
DeepSeek: R1 Distill Qwen 32B | 0.0114 ₽ | 0.0170 ₽ | |
DeepSeek: R1 Distill Qwen 14B | 0.0142 ₽ | 0.0142 ₽ | |
DeepSeek-Coder-V2 | 0.0038 ₽ | 0.0114 ₽ | |
DeepSeek: R1 Distill Llama 8B | 0.0038 ₽ | 0.0038 ₽ | |
01-ai | |||
01.AI: Yi Large | 0.2838 ₽ | 0.2838 ₽ | |
aetherwiing | |||
Aetherwiing: Starcannon 12B | 0.0757 ₽ | 0.1135 ₽ | |
ai21 | |||
AI21: Jamba 1.6 Large | 0.1892 ₽ | 0.7568 ₽ | |
AI21: Jamba 1.5 Large | 0.1892 ₽ | 0.7568 ₽ | |
AI21: Jamba Instruct | 0.0473 ₽ | 0.0662 ₽ | |
AI21: Jamba Mini 1.6 | 0.0189 ₽ | 0.0378 ₽ | |
AI21: Jamba 1.5 Mini | 0.0189 ₽ | 0.0378 ₽ | |
aion-labs | |||
AionLabs: Aion-1.0 | 0.3784 ₽ | 0.7568 ₽ | |
AionLabs: Aion-1.0-Mini | 0.0662 ₽ | 0.1324 ₽ | |
AionLabs: Aion-RP 1.0 (8B) | 0.0189 ₽ | 0.0189 ₽ | |
alfredpros | |||
AlfredPros: CodeLLaMa 7B Instruct Solidity | 0.0757 ₽ | 0.1135 ₽ | |
all-hands | |||
OpenHands LM 32B V0.1 | 0.2460 ₽ | 0.3216 ₽ | |
allenai | |||
OLMo 7B Instruct | 0.0076 ₽ | 0.0227 ₽ | |
alpindale | |||
Goliath 120B | 0.6208 ₽ | 0.8869 ₽ | |
Magnum 72B | 0.3784 ₽ | 0.5676 ₽ | |
amazon | |||
Amazon: Nova Pro 1.0 | 0.0757 ₽ | 0.3027 ₽ | |
Amazon: Nova Lite 1.0 | 0.0057 ₽ | 0.0227 ₽ | |
Amazon: Nova Micro 1.0 | 0.0033 ₽ | 0.0132 ₽ | |
anthracite-org | |||
Magnum v2 72B | 0.2838 ₽ | 0.2838 ₽ | |
Magnum v4 72B | 0.1419 ₽ | 0.2129 ₽ | |
arcee-ai | |||
Arcee AI: Maestro Reasoning | 0.0851 ₽ | 0.3122 ₽ | |
Arcee AI: Virtuoso Large | 0.0709 ₽ | 0.1135 ₽ | |
Arcee AI: Caller Large | 0.0520 ₽ | 0.0804 ₽ | |
Arcee AI: Coder Large | 0.0473 ₽ | 0.0757 ₽ | |
Arcee AI: Virtuoso Medium V2 | 0.0473 ₽ | 0.0757 ₽ | |
Arcee AI: Arcee Blitz | 0.0426 ₽ | 0.0709 ₽ | |
Arcee AI: Spotlight | 0.0170 ₽ | 0.0170 ₽ | |
cognitivecomputations | |||
Dolphin 2.9.2 Mixtral 8x22B 🐬 | 0.0851 ₽ | 0.0851 ₽ | |
cohere | |||
Cohere: Command R+ | 0.2838 ₽ | 1.4190 ₽ | |
Cohere: Command R+ (04-2024) | 0.2838 ₽ | 1.4190 ₽ | |
Cohere: Command A | 0.2365 ₽ | 0.9460 ₽ | |
Cohere: Command R+ (08-2024) | 0.2365 ₽ | 0.9460 ₽ | |
Cohere: Command | 0.0946 ₽ | 0.1892 ₽ | |
Cohere: Command R | 0.0473 ₽ | 0.1419 ₽ | |
Cohere: Command R (03-2024) | 0.0473 ₽ | 0.1419 ₽ | |
Cohere: Command R (08-2024) | 0.0142 ₽ | 0.0568 ₽ | |
Cohere: Command R7B (12-2024) | 0.0035 ₽ | 0.0142 ₽ | |
eleutherai | |||
EleutherAI: Llemma 7b | 0.0757 ₽ | 0.1135 ₽ | |
eva-unit-01 | |||
EVA Llama 3.33 70B | 0.3784 ₽ | 0.5676 ₽ | |
EVA Qwen2.5 72B | 0.3784 ₽ | 0.5676 ₽ | |
EVA Qwen2.5 32B | 0.2460 ₽ | 0.3216 ₽ | |
gryphe | |||
MythoMax 13B | 0.0061 ₽ | 0.0061 ₽ | |
inception | |||
Inception: Mercury Coder Small Beta | 0.0237 ₽ | 0.0946 ₽ | |
infermatic | |||
Infermatic: Mistral Nemo Inferor 12B | 0.0757 ₽ | 0.1135 ₽ | |
inflection | |||
Inflection: Inflection 3 Productivity | 0.2365 ₽ | 0.9460 ₽ | |
Inflection: Inflection 3 Pi | 0.2365 ₽ | 0.9460 ₽ | |
jondurbin | |||
Airoboros 70B | 0.0473 ₽ | 0.0473 ₽ | |
liquid | |||
Liquid: LFM 40B MoE | 0.0142 ₽ | 0.0142 ₽ | |
Liquid: LFM 3B | 0.0019 ₽ | 0.0019 ₽ | |
Liquid: LFM 7B | 0.0009 ₽ | 0.0009 ₽ | |
mancer | |||
Mancer: Weaver (alpha) | 0.1064 ₽ | 0.1064 ₽ | |
meta-llama | |||
Meta: Llama 3.1 405B (base) | 0.1892 ₽ | 0.1892 ₽ | |
Meta: Llama 3.2 90B Vision Instruct | 0.1135 ₽ | 0.1135 ₽ | |
Meta: Llama 2 70B Chat | 0.0851 ₽ | 0.0851 ₽ | |
Meta: Llama 3.1 405B Instruct | 0.0757 ₽ | 0.0757 ₽ | |
Meta: Llama 4 Maverick | 0.0161 ₽ | 0.0568 ₽ | |
Meta: Llama 3 70B Instruct | 0.0284 ₽ | 0.0378 ₽ | |
Meta: Llama 4 Scout | 0.0076 ₽ | 0.0284 ₽ | |
Meta: Llama 3.1 70B Instruct | 0.0095 ₽ | 0.0265 ₽ | |
Meta: Llama 3.3 70B Instruct | 0.0095 ₽ | 0.0237 ₽ | |
Meta: LlamaGuard 2 8B | 0.0189 ₽ | 0.0189 ₽ | |
Llama Guard 3 8B | 0.0019 ₽ | 0.0057 ₽ | |
Meta: Llama 3 8B Instruct | 0.0028 ₽ | 0.0057 ₽ | |
Meta: Llama Guard 4 12B | 0.0047 ₽ | 0.0047 ₽ | |
Meta: Llama 3.2 11B Vision Instruct | 0.0046 ₽ | 0.0046 ₽ | |
Meta: Llama 3.1 8B Instruct | 0.0019 ₽ | 0.0028 ₽ | |
Meta: Llama 3.2 3B Instruct | 0.0009 ₽ | 0.0019 ₽ | |
Meta: Llama 3.2 1B Instruct | 0.0005 ₽ | 0.0009 ₽ | |
microsoft | |||
WizardLM-2 8x22B | 0.0473 ₽ | 0.0473 ₽ | |
Microsoft: Phi 4 Reasoning Plus | 0.0066 ₽ | 0.0331 ₽ | |
Microsoft: Phi-3 Medium 128K Instruct | 0.0095 ₽ | 0.0284 ₽ | |
Microsoft: Phi 4 | 0.0066 ₽ | 0.0132 ₽ | |
Microsoft: Phi 4 Multimodal Instruct | 0.0047 ₽ | 0.0095 ₽ | |
Microsoft: Phi-3 Mini 128K Instruct | 0.0095 ₽ | 0.0095 ₽ | |
Microsoft: Phi-3.5 Mini 128K Instruct | 0.0028 ₽ | 0.0085 ₽ | |
minimax | |||
MiniMax: MiniMax-01 | 0.0189 ₽ | 0.1041 ₽ | |
mistral | |||
Mistral: Ministral 8B | 0.0095 ₽ | 0.0095 ₽ | |
mistralai | |||
Mistral Medium | 0.2601 ₽ | 0.7663 ₽ | |
Mistral Large 2411 | 0.1892 ₽ | 0.5676 ₽ | |
Mistral Large 2407 | 0.1892 ₽ | 0.5676 ₽ | |
Mistral: Pixtral Large 2411 | 0.1892 ₽ | 0.5676 ₽ | |
Mistral Large | 0.1892 ₽ | 0.5676 ₽ | |
Mistral: Mixtral 8x22B Instruct | 0.0378 ₽ | 0.1135 ₽ | |
Mistral: Codestral 2501 | 0.0284 ₽ | 0.0851 ₽ | |
Mistral: Saba | 0.0189 ₽ | 0.0568 ₽ | |
Mistral Small | 0.0189 ₽ | 0.0568 ₽ | |
Mistral: Codestral Mamba | 0.0237 ₽ | 0.0237 ₽ | |
Mistral Tiny | 0.0237 ₽ | 0.0237 ₽ | |
Mistral: Mixtral 8x7B Instruct | 0.0076 ₽ | 0.0227 ₽ | |
Mistral: Mistral 7B Instruct v0.2 | 0.0189 ₽ | 0.0189 ₽ | |
Mistral: Mistral 7B Instruct v0.1 | 0.0104 ₽ | 0.0180 ₽ | |
Mistral: Mistral Small 3.1 24B | 0.0047 ₽ | 0.0142 ₽ | |
Mistral: Mistral Small 3 | 0.0057 ₽ | 0.0114 ₽ | |
Mistral: Ministral 8B | 0.0095 ₽ | 0.0095 ₽ | |
Mistral: Pixtral 12B | 0.0095 ₽ | 0.0095 ₽ | |
Mistral: Mistral Nemo | 0.0028 ₽ | 0.0066 ₽ | |
Mistral: Mistral 7B Instruct | 0.0026 ₽ | 0.0051 ₽ | |
Mistral: Mistral 7B Instruct v0.3 | 0.0026 ₽ | 0.0051 ₽ | |
Mistral: Ministral 3B | 0.0038 ₽ | 0.0038 ₽ | |
neversleep | |||
NeverSleep: Llama 3 Lumimaid 70B | 0.3784 ₽ | 0.5676 ₽ | |
NeverSleep: Lumimaid v0.2 70B | 0.1419 ₽ | 0.2129 ₽ | |
Noromaid 20B | 0.0709 ₽ | 0.1419 ₽ | |
NeverSleep: Lumimaid v0.2 8B | 0.0089 ₽ | 0.0709 ₽ | |
NeverSleep: Llama 3 Lumimaid 8B (extended) | 0.0089 ₽ | 0.0709 ₽ | |
NeverSleep: Llama 3 Lumimaid 8B | 0.0089 ₽ | 0.0709 ₽ | |
nothingiisreal | |||
Mistral Nemo 12B Celeste | 0.0757 ₽ | 0.1135 ₽ | |
nousresearch | |||
Nous: Hermes 3 405B Instruct | 0.0757 ₽ | 0.0757 ₽ | |
Nous: Hermes 2 Mixtral 8x7B DPO | 0.0568 ₽ | 0.0568 ₽ | |
Nous: Hermes 3 70B Instruct | 0.0114 ₽ | 0.0284 ₽ | |
NousResearch: Hermes 2 Pro - Llama-3 8B | 0.0024 ₽ | 0.0038 ₽ | |
nvidia | |||
NVIDIA: Llama 3.3 Nemotron Super 49B v1 | 0.0123 ₽ | 0.0378 ₽ | |
NVIDIA: Llama 3.1 Nemotron 70B Instruct | 0.0114 ₽ | 0.0284 ₽ | |
perplexity | |||
Perplexity: Sonar Pro | 0.2838 ₽ | 1.4190 ₽ | |
Perplexity: Sonar Reasoning Pro | 0.1892 ₽ | 0.7568 ₽ | |
Perplexity: Sonar Deep Research | 0.1892 ₽ | 0.7568 ₽ | |
Perplexity: R1 1776 | 0.1892 ₽ | 0.7568 ₽ | |
Perplexity: Sonar Reasoning | 0.0946 ₽ | 0.4730 ₽ | |
Perplexity: Sonar | 0.0946 ₽ | 0.0946 ₽ | |
Perplexity: Llama 3.1 Sonar 70B Online | 0.0946 ₽ | 0.0946 ₽ | |
Perplexity: Llama 3.1 Sonar 8B Online | 0.0189 ₽ | 0.0189 ₽ | |
pygmalionai | |||
Pygmalion: Mythalion 13B | 0.0532 ₽ | 0.1064 ₽ | |
qwen | |||
Qwen: Qwen-Max | 0.1514 ₽ | 0.6054 ₽ | |
Qwen: Qwen VL Max | 0.0757 ₽ | 0.3027 ₽ | |
Qwen: Qwen-Plus | 0.0378 ₽ | 0.1135 ₽ | |
Qwen: Qwen2.5 VL 32B Instruct | 0.0851 ₽ | 0.0851 ₽ | |
Qwen 2 72B Instruct | 0.0851 ₽ | 0.0851 ₽ | |
Qwen: Qwen2.5 VL 72B Instruct | 0.0237 ₽ | 0.0709 ₽ | |
Qwen: Qwen VL Plus | 0.0199 ₽ | 0.0596 ₽ | |
Qwen: Qwen2.5-VL 72B Instruct | 0.0568 ₽ | 0.0568 ₽ | |
Qwen2.5 72B Instruct | 0.0114 ₽ | 0.0369 ₽ | |
Qwen: Qwen3 30B A3B | 0.0095 ₽ | 0.0284 ₽ | |
Qwen: Qwen3 32B | 0.0095 ₽ | 0.0284 ₽ | |
Qwen: QwQ 32B Preview | 0.0085 ₽ | 0.0255 ₽ | |
Qwen: Qwen3 14B | 0.0066 ₽ | 0.0227 ₽ | |
Qwen: QwQ 32B | 0.0142 ₽ | 0.0189 ₽ | |
Qwen: Qwen-Turbo | 0.0047 ₽ | 0.0189 ₽ | |
Qwen: Qwen2.5-VL 7B Instruct | 0.0189 ₽ | 0.0189 ₽ | |
Qwen2.5 Coder 32B Instruct | 0.0057 ₽ | 0.0170 ₽ | |
Qwen: Qwen3 8B | 0.0033 ₽ | 0.0131 ₽ | |
Qwen: Qwen3 235B A22B | 0.0095 ₽ | 0.0095 ₽ | |
Qwen2.5 7B Instruct | 0.0047 ₽ | 0.0095 ₽ | |
Qwen: Qwen2.5 Coder 7B Instruct | 0.0009 ₽ | 0.0028 ₽ | |
raifle | |||
SorcererLM 8x22B | 0.4257 ₽ | 0.4257 ₽ | |
sao10k | |||
Sao10k: Llama 3 Euryale 70B v2.1 | 0.1400 ₽ | 0.1400 ₽ | |
Fimbulvetr 11B v2 | 0.0757 ₽ | 0.1135 ₽ | |
Sao10K: Llama 3.3 Euryale 70B | 0.0662 ₽ | 0.0757 ₽ | |
Sao10K: Llama 3.1 Euryale 70B v2.2 | 0.0662 ₽ | 0.0757 ₽ | |
Sao10K: Llama 3 8B Lunaris | 0.0019 ₽ | 0.0047 ₽ | |
scb10x | |||
Typhoon2 70B Instruct | 0.0832 ₽ | 0.0832 ₽ | |
Typhoon2 8B Instruct | 0.0170 ₽ | 0.0170 ₽ | |
sophosympatheia | |||
Midnight Rose 70B | 0.0757 ₽ | 0.0757 ₽ | |
thedrummer | |||
TheDrummer: Anubis Pro 105B V1 | 0.0757 ₽ | 0.0946 ₽ | |
TheDrummer: Skyfall 36B V2 | 0.0473 ₽ | 0.0757 ₽ | |
Rocinante 12B | 0.0237 ₽ | 0.0473 ₽ | |
Unslopnemo 12B | 0.0426 ₽ | 0.0426 ₽ | |
thudm | |||
THUDM: GLM Z1 Rumination 32B | 0.0227 ₽ | 0.0227 ₽ | |
THUDM: GLM Z1 32B | 0.0227 ₽ | 0.0227 ₽ | |
THUDM: GLM 4 32B | 0.0227 ₽ | 0.0227 ₽ | |
undi95 | |||
Toppy M 7B | 0.0757 ₽ | 0.1135 ₽ | |
ReMM SLERP 13B | 0.0532 ₽ | 0.1064 ₽ | |
x-ai | |||
xAI: Grok 3 Beta | 0.2838 ₽ | 1.4190 ₽ | |
xAI: Grok Vision Beta | 0.4730 ₽ | 1.4190 ₽ | |
xAI: Grok Beta | 0.4730 ₽ | 1.4190 ₽ | |
xAI: Grok 2 Vision 1212 | 0.1892 ₽ | 0.9460 ₽ | |
xAI: Grok 2 1212 | 0.1892 ₽ | 0.9460 ₽ | |
xAI: Grok 3 Mini Beta | 0.0284 ₽ | 0.0473 ₽ |
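The per-1,000-token billing described above can be sketched as follows (rates taken from the OpenAI: GPT-4.1 row of the table; the function name is illustrative):

```python
# Per-1,000-token rates in rubles, from the GPT-4.1 row above.
PROMPT_RATE = 0.1892      # per 1,000 prompt tokens
COMPLETION_RATE = 0.7568  # per 1,000 completion tokens

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Cost of one request in rubles: each side is billed per 1,000 tokens."""
    return (prompt_tokens / 1000 * PROMPT_RATE
            + completion_tokens / 1000 * COMPLETION_RATE)

# A request with a 1,500-token prompt and a 500-token reply:
print(round(request_cost(1500, 500), 4))  # 0.6622
```

Note that completion tokens are billed at a higher rate than prompt tokens for most models, so long replies dominate the cost of a typical request.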