TokenTranslation sits between your agents and your LLM — it translates every prompt to the most token-efficient language, then translates responses back. Now with Tokinensis v2: a cross-lingual constructed language for maximum BPE efficiency.
Copy one block of text into your skill's system prompt. Works with Claude Desktop, MCP, CrewAI, AutoGPT, LangChain, and any agent framework.
Your API key: tk_xxxx… — copy it.

python install_models.py
```python
# 1. Copy skill_client.py into your project
from skill_client import TokenTranslationClient

tt = TokenTranslationClient("https://tokenstree.eu", api_key="tk_your_key")

optimized, meta = await tt.translate_in(user_prompt)
final = await tt.translate_out(llm_response, meta["source_lang"])
```
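The snippet above awaits at top level; in a plain script you would wrap the round-trip in an async entry point. Here is a minimal runnable sketch of that flow using a stand-in client (the stub methods and the `round_trip` helper are illustrative, not part of the real library):

```python
import asyncio

class StubClient:
    """Stand-in for TokenTranslationClient so the round-trip runs offline.
    It mimics the two awaitable calls shown in the snippet above."""

    async def translate_in(self, prompt: str):
        # Pretend-optimize the prompt; the real service would translate it
        # to a token-cheaper language and report metadata.
        return prompt.lower(), {"source_lang": "en"}

    async def translate_out(self, llm_response: str, source_lang: str):
        # Translate the model's answer back to the user's language.
        return f"({source_lang}) {llm_response}"

async def round_trip(user_prompt: str) -> str:
    tt = StubClient()
    optimized, meta = await tt.translate_in(user_prompt)
    llm_response = f"echo: {optimized}"  # stand-in for the actual LLM call
    return await tt.translate_out(llm_response, meta["source_lang"])

print(asyncio.run(round_trip("Hello World")))  # -> (en) echo: hello world
```

Swap `StubClient` for `TokenTranslationClient(...)` and the echo line for your real LLM call to get the production flow.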
Lower multiplier = fewer tokens = lower LLM cost. English baseline = 1.00×. ⚡ Tokinensis v2 achieves 0.65× via cross-lingual root compression.
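To make the multiplier concrete, this sketch estimates per-prompt input cost. The token count and the per-1k-token price are hypothetical example values, not published rates:

```python
def estimate_cost(tokens_en: int, multiplier: float, price_per_1k: float) -> float:
    """Estimate LLM input cost after translation.

    tokens_en     -- token count of the English prompt (the 1.00x baseline)
    multiplier    -- language multiplier, e.g. 0.65 for Tokinensis v2
    price_per_1k  -- price per 1,000 input tokens (hypothetical rate)
    """
    return tokens_en * multiplier * price_per_1k / 1000

baseline = estimate_cost(10_000, 1.00, 0.003)  # English:      $0.0300
v2 = estimate_cost(10_000, 0.65, 0.003)        # Tokinensis v2: $0.0195
savings = baseline - v2                        # $0.0105 saved per 10k-token prompt
```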
Real-time stats · per-language breakdown · interactive v1/v2 encoder · optimal token mapper