Google Gemini

LearnLM 1.5 Pro Experimental

LearnLM 1.5 Pro Experimental is available via Google Gemini with a 32,767-token (33K) context window and up to 8,192 output tokens. Pricing: $0.00 per 1M input tokens and $0.00 per 1M output tokens.

LearnLM 1.5 Pro Experimental Pricing & Specifications

Input Price: $0.00 per 1M tokens
Output Price: $0.00 per 1M tokens
Context Window: 32,767 tokens (33K)
Max Output: 8,192 tokens
Provider: Google Gemini

What is LearnLM 1.5 Pro Experimental?

LearnLM 1.5 Pro Experimental is a large language model served via Google Gemini with a 32,767-token (33K) context window and up to 8,192 output tokens. It costs $0.00 per 1M input tokens and $0.00 per 1M output tokens.

Capabilities

text, vision, function calling, JSON mode

LearnLM 1.5 Pro Experimental Cost Examples

Short prompt (500 tokens): $0.00
Medium prompt (2K tokens): $0.00
Long output (4K tokens): $0.00
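The per-request figures above follow directly from the per-1M-token prices. A minimal sketch of the arithmetic, with the (zero) prices taken from the table above; the same formula generalizes to paid models:

```python
# Prices from the specification table above (USD per 1M tokens).
# LearnLM 1.5 Pro Experimental is listed at $0.00 in both directions,
# so every example below evaluates to zero.
INPUT_PRICE_PER_1M = 0.0
OUTPUT_PRICE_PER_1M = 0.0

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for a single API call."""
    return (input_tokens * INPUT_PRICE_PER_1M
            + output_tokens * OUTPUT_PRICE_PER_1M) / 1_000_000

short = request_cost(500, 0)      # short prompt example   -> 0.0
medium = request_cost(2_000, 0)   # medium prompt example  -> 0.0
long_out = request_cost(0, 4_000) # long output example    -> 0.0
```

Swapping in a paid model's input/output prices gives its per-request cost with no other changes.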


Similar Models to LearnLM 1.5 Pro Experimental

Gemini Exp 1114 (Google Gemini): $0.00/1M input, 1.0M context
Gemini Exp 1206 (Google Gemini): $0.00/1M input, 2.1M context
Gemma 3 27B IT (Google Gemini): $0.00/1M input, 131K context
Lyria 3 Clip Preview (Google Gemini): $0.00/1M input, 131K context

Frequently Asked Questions

How much does LearnLM 1.5 Pro Experimental cost per token?
LearnLM 1.5 Pro Experimental costs $0.00 per 1M input tokens and $0.00 per 1M output tokens. For a typical 1,000-token request with a 500-token response, that works out to $0.00.
What is the context window for LearnLM 1.5 Pro Experimental?
LearnLM 1.5 Pro Experimental supports a context window of 32,767 tokens (33K). This determines the maximum combined length of your prompt and conversation history in a single API call.
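A rough pre-flight check against that limit can be sketched as below. The 4-characters-per-token ratio is a crude heuristic, not the model's actual tokenizer; for exact counts, use the provider's token-counting endpoint:

```python
# Sketch: verify prompt + conversation history fit the 32,767-token
# context window before making an API call.
CONTEXT_WINDOW = 32_767  # from the specification table above

def rough_token_count(text: str) -> int:
    # Heuristic only: ~4 characters per token for English-like text.
    return max(1, len(text) // 4)

def fits_context(history: list[str], prompt: str) -> bool:
    total = sum(rough_token_count(m) for m in history) + rough_token_count(prompt)
    return total <= CONTEXT_WINDOW
```

If the check fails, typical remedies are truncating or summarizing the oldest history turns before sending the request.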
What is the maximum output length for LearnLM 1.5 Pro Experimental?
LearnLM 1.5 Pro Experimental can generate up to 8,192 tokens in a single response. If you need longer outputs, you can make multiple API calls and concatenate the results.
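That multi-call approach can be sketched as follows. Here `generate` is a hypothetical stand-in for a single API call capped at 8,192 output tokens (the real Gemini SDK call is not shown), and the continuation-prompt wording is an assumption:

```python
# Sketch: work around the 8,192-token output cap by looping and
# concatenating, feeding prior output back in as context.
MAX_OUTPUT_TOKENS = 8_192  # per-response cap from the table above

def generate(prompt: str) -> str:
    """Hypothetical placeholder for one capped API call."""
    return "chunk "

def long_generation(prompt: str, rounds: int = 3) -> str:
    parts: list[str] = []
    for _ in range(rounds):
        # Ask the model to pick up where the previous response stopped.
        chunk = generate(prompt + "".join(parts)
                         + "\nContinue from where you left off.")
        parts.append(chunk)
    return "".join(parts)
```

In practice you would also stop early when a response ends naturally (e.g., the finish reason is "stop" rather than a length cutoff), and keep the accumulated text within the 32,767-token context window.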
Is LearnLM 1.5 Pro Experimental good for coding tasks?
LearnLM 1.5 Pro Experimental supports capabilities suited to coding tasks, including code generation, debugging, and refactoring.