Function Calling Token Calculator
Calculate how many tokens your function/tool definitions consume in an LLM's context window. Paste a schema or build one, then compare overhead across models.
Example readout: Tokens per tool: 89 · Total tool overhead: 445
Tips to Reduce Tool Token Overhead
- Keep descriptions concise -- every character counts toward tokens.
- Use short, descriptive parameter names instead of verbose ones.
- Remove optional parameters that are rarely used.
- Combine related tools into a single tool with a mode/action parameter.
- Use enums sparingly -- each enum value adds tokens.
- Avoid deeply nested object parameters; flatten when possible.
- Only send the tools relevant to the current conversation turn.
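The first few tips above can be made concrete with a small sketch. The schemas below are hypothetical examples (not from any real API), and `rough_tokens` uses the same ~4 characters-per-token approximation this calculator describes:

```python
import json

def rough_tokens(schema: dict) -> int:
    # Approximation: compact JSON length / ~4 characters per token
    return round(len(json.dumps(schema, separators=(",", ":"))) / 4)

# Hypothetical verbose tool definition
verbose = {
    "name": "retrieve_current_weather_conditions_for_specified_location",
    "description": (
        "This function retrieves the current weather conditions for the "
        "location that the user has specified in their request."
    ),
    "parameters": {
        "type": "object",
        "properties": {
            "location_name_provided_by_user": {
                "type": "string",
                "description": "The name of the location the user asked about.",
            }
        },
        "required": ["location_name_provided_by_user"],
    },
}

# The same tool after trimming descriptions and shortening names
concise = {
    "name": "get_weather",
    "description": "Current weather for a location.",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}

print(rough_tokens(verbose), rough_tokens(concise))
```

The trimmed version carries the same information to the model at a fraction of the token cost, and the savings multiply across every request that includes the tool.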
Token counts are estimated using a character-based approximation (~4 characters per token for JSON). Actual token counts vary by model tokenizer. Each provider also adds different overhead per tool definition (system-prompt formatting, delimiters, etc.). Results are approximate.
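To see how per-tool provider overhead affects the total, here is a sketch with clearly made-up placeholder overhead values (real per-tool overhead is provider-specific and not published as a single number):

```python
def total_overhead(tokens_per_tool: int, tool_count: int, per_tool_overhead: int) -> int:
    # Each provider wraps tool definitions in its own scaffolding
    # (system-prompt formatting, delimiters), adding tokens per tool.
    return (tokens_per_tool + per_tool_overhead) * tool_count

# Placeholder overhead values for illustration only -- not measured figures
for label, overhead in {"low": 5, "medium": 15, "high": 30}.items():
    print(label, total_overhead(80, 5, overhead))
```

Even a modest fixed overhead per tool adds up quickly when many tools are sent on every turn, which is why trimming the tool list matters as much as trimming each schema.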
How to Use the Function Calling Token Calculator
1. Input your schema: Paste a JSON tool/function schema, or use the builder form to create one with parameters.
2. Set tool count: Enter how many tools you define in a typical API call.
3. Compare across models: See tokens per tool, total overhead, and the percentage of the context window used for GPT-4o, Claude, and Gemini.
4. Optimize: Use the tips section to reduce your tool definition token count and save on costs.
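The comparison in step 3 can be sketched as a short function. The context-window sizes below (128K for GPT-4o, 200K for Claude, 1M for Gemini 1.5 Pro) reflect commonly published figures; check current provider documentation before relying on them:

```python
def context_usage(tokens_per_tool: int, tool_count: int) -> dict:
    # Context-window sizes in tokens; verify against current provider docs
    context_windows = {"GPT-4o": 128_000, "Claude": 200_000, "Gemini": 1_000_000}
    total = tokens_per_tool * tool_count
    return {
        model: {
            "total_overhead": total,
            "pct_of_context": round(100 * total / window, 2),
        }
        for model, window in context_windows.items()
    }

# e.g. 89 tokens per tool across 5 tools
print(context_usage(89, 5))
```

Even a few hundred tokens of tool overhead is a small fraction of these context windows, but it is paid on every request, so the cost impact compounds with traffic.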