OmniPoly
Released under GPL-3.0, OmniPoly provides a frontend for LanguageTool and LibreTranslate, with optional AI-powered insights, on self-hosted infrastructure.
Self-hosted language tooling, honestly reviewed. Built by one developer who wanted a better UI and ended up shipping something genuinely useful.
TL;DR
- What it is: A Docker-based frontend that unifies LibreTranslate (translation), LanguageTool (grammar checking), and optionally Ollama (AI insights) into one browser interface [README].
- Who it’s for: Developers and privacy-minded individuals who already run or want to run self-hosted translation and grammar tools, and don’t want to manage three separate browser tabs to use them [README].
- Cost savings: DeepL Pro starts at ~$8.99/month; Google Translate API costs $20 per 1M characters. OmniPoly is $0 software on top of free self-hosted backends — your only cost is the VPS hosting all four services.
- Key strength: It’s modular — if you don’t configure Ollama, the AI tab simply doesn’t appear. You get exactly the surface area you need without dead UI cluttering the interface [README].
- Key weakness: OmniPoly is a UI shell, not a translation engine. Every backend (LibreTranslate, LanguageTool, Ollama) must be separately deployed, configured, and maintained. This is a project with 104 stars and one primary maintainer — not production software with an SLA behind it [GitHub].
What is OmniPoly
OmniPoly is a self-hosted web frontend that wraps three open-source language services into a single interface: LibreTranslate for text and file translation, LanguageTool for grammar and style checking, and Ollama for AI-powered insights like sentiment analysis and sentence extraction [README].
The origin story is worth quoting directly from the README: “The project started because I wasn’t satisfied with the standard app that comes with Libre Translate (it didn’t remember my previous choices). I decided to create my own solution. Eventually, I discovered self-hosted LanguageTool and learned that it lacked a frontend interface.” [README]
That’s an honest pitch. This is a developer scratch-your-own-itch project that grew into something broader. It’s not a company, not venture-backed, not targeting enterprise buyers. It’s a TypeScript/Vite application with 101 commits, 6 forks, 9 releases, and a GPL-3.0 license [GitHub].
What it does well is the integration layer. LibreTranslate’s default UI doesn’t save your language preferences. LanguageTool’s self-hosted instance ships with no web UI at all — just an API. OmniPoly fixes both problems in one Docker container, and adds optional Ollama capabilities on top [README].
Why People Choose It
No independent third-party reviews of OmniPoly were available at time of writing — the project is too small to have attracted mainstream tech press coverage. The following analysis is based entirely on the README, GitHub repository, and direct comparison to the tools it replaces.
The choice to use OmniPoly comes down to a specific frustration: you’re already self-hosting LibreTranslate or LanguageTool, and the native interfaces are inadequate. LibreTranslate’s stock UI forgets your source and target language on every page load. LanguageTool’s self-hosted API has no UI whatsoever — you have to hit endpoints manually or install browser extensions that phone home. OmniPoly solves both with a single lightweight container [README].
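To make "hit endpoints manually" concrete: a self-hosted LanguageTool exposes its documented `/v2/check` HTTP endpoint, which takes form-encoded `text` and `language` fields and returns JSON grammar matches. A minimal sketch, assuming a default local deployment (the host and port here are illustrative, not from the README):

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

def build_check_request(text, language="en-US",
                        base="http://localhost:8010"):
    """Build the form-encoded POST that /v2/check expects."""
    body = urlencode({"text": text, "language": language}).encode()
    return Request(f"{base}/v2/check", data=body)

def check(text, language="en-US"):
    """Send the request and return the list of grammar matches."""
    with urlopen(build_check_request(text, language)) as resp:
        return json.load(resp)["matches"]
```

Workable, but exactly the kind of manual plumbing a frontend exists to hide.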
The other reason is consolidation. Switching between a translation tab, a grammar tab, and an AI summarization tool creates friction. OmniPoly’s tabbed interface reduces that to one URL. For someone doing regular document translation or non-native language writing, this isn’t trivial.
The Ollama integration is the most ambitious part. Plugging a local LLM into the translation workflow for sentiment analysis and “interesting sentence extraction” goes beyond what commercial tools like DeepL expose in their base plans. Whether you’ll use it depends heavily on whether you already run an Ollama instance — if you don’t, it’s another service to deploy [README].
Features
From the README and Docker Compose configuration:
Translation (LibreTranslate backend):
- Text translation across all languages LibreTranslate supports [README]
- File upload and download — paste text or upload a document [README]
- Language filter configuration via `LIBRETRANSLATE_LANGUAGES` — limit to only the pairs you care about [README]
- Default target language pre-configured per deployment [README]
- Auto-expand display for shorter translations [README]
- Clear buttons to reset text boxes [README]
Grammar checking (LanguageTool backend):
- Inline error highlighting with correction suggestions [README]
- “Picky mode” via `LANGUAGE_TOOL_PICKY: true` for stricter style checks [README]
- Language filter configuration similar to translation [README]
- Add-to-dictionary functionality — teach LanguageTool your terminology [README]
- Optional: can be disabled by omitting the `LANGUAGE_TOOL` environment variable [README]
AI insights (Ollama backend):
- Sentiment analysis on translated or input text [README]
- Extraction of “interesting sentences” — useful for summarization workflows [README]
- Model is configurable (`OLLAMA_MODEL: model_name`) — any Ollama-compatible model works [README]
- Entirely optional — if the `OLLAMA` env var is absent, the AI tab doesn’t render [README]
Harper integration:
- The Docker Compose `DEFAULT_TAB` option includes `harper` as a valid value, suggesting an additional grammar backend (Harper) is supported alongside LanguageTool [README]
UI / configuration:
- Three themes: `pole`, `light`, and `dark` [README]
- Per-deployment defaults for language and active tab [README]
- Debug mode logs raw API traffic for troubleshooting [README]
- User preferences persist (the original motivation for building this) [README]
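Taken together, the options above translate into an environment block along these lines. The variable names come from the README; the values, and the URL-style format for the backend addresses, are illustrative assumptions:

```yaml
services:
  omnipoly:
    environment:
      LIBRETRANSLATE_LANGUAGES: "en,de,pl"       # limit offered language pairs
      LANGUAGE_TOOL: "http://languagetool:8010"  # omit to hide the grammar tab
      LANGUAGE_TOOL_PICKY: "true"                # stricter style checks
      OLLAMA: "http://ollama:11434"              # omit to hide the AI tab
      OLLAMA_MODEL: "model_name"                 # must match an installed model
      DEFAULT_TAB: "translation"                 # assumed value; "harper" is also valid
```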
What’s missing: no authentication, no multi-user support, no audit logging, no API of its own. This is a personal-use tool. If you’re deploying it internally for a team, you’ll want a reverse proxy with auth in front of it.
Pricing: SaaS vs Self-Hosted Math
OmniPoly itself is free under GPL-3.0. The economics are about the backends it wraps and the commercial alternatives those replace.
Commercial alternatives:
- DeepL Pro: from ~$8.99/month for individuals, ~$57/month for teams. Translation only, no grammar bundled at this tier.
- DeepL + Grammarly bundle: DeepL Pro Individual + Grammarly Premium runs roughly $20–25/month combined.
- Google Translate API: $20 per 1M characters — cheap for low volume, meaningful at scale.
- Microsoft Translator: ~$10 per 1M characters standard, higher for custom model tiers.
- LanguageTool Premium (cloud): €4.92/month for individuals via annual plan.
Self-hosted stack cost:
- OmniPoly: $0
- LibreTranslate: $0 (MIT license, though language model downloads are large — 1–3GB per language pair)
- LanguageTool: $0 (LGPL, though the full n-gram dataset for enhanced accuracy is ~8GB)
- Ollama: $0 (MIT)
- VPS to run all four: realistically $10–20/month for a machine with enough RAM — LibreTranslate needs at least 4GB RAM with multiple language models loaded; LanguageTool with n-grams wants another 4–8GB
Honest math: If you only need English↔one other language, DeepL’s free tier (500K characters/month) may cover you with zero setup. The self-hosted stack makes sense when you’re translating at volume, need offline operation, can’t send content to third-party APIs for compliance reasons, or want grammar checking bundled without a second subscription.
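The break-even point against metered pricing is easy to sanity-check. Using the Google Translate API rate and the VPS estimate from above:

```python
# Rough break-even: at what monthly volume does a self-hosted VPS
# undercut Google Translate's metered API? (Prices from the comparison above.)
GOOGLE_PER_MILLION = 20.0   # USD per 1M characters
VPS_MONTHLY = 15.0          # USD, midpoint of the $10-20 estimate above

break_even_chars = VPS_MONTHLY / GOOGLE_PER_MILLION * 1_000_000
print(f"{break_even_chars:,.0f} characters/month")  # 750,000
```

Below roughly three-quarters of a million characters a month, the metered API is the cheaper option on raw cost alone; the self-hosted case then rests on privacy, offline use, or the bundled grammar checking.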
Pricing data for LibreTranslate and LanguageTool hosting requirements: no authoritative third-party benchmarks available — numbers above are based on documented system requirements.
Deployment Reality Check
This is where the project description needs the most supplementing, because “run docker-compose up” undersells the actual work involved.
What you’re actually deploying (four services):
- OmniPoly itself — lightweight, minimal RAM
- LibreTranslate — requires downloading language model files at startup; each language pair is 200MB–1GB
- LanguageTool — Java-based; the base image is manageable, but the optional n-gram data for better accuracy is a 6–8GB download
- Optionally Ollama — with whichever LLM model you choose (7B models run 4–8GB on disk)
The README provides separate docker-compose snippets for each service but doesn’t assemble them into a single production-ready compose file. You’ll need to wire the networking yourself so OmniPoly can reach the other containers by service name [README].
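The wiring itself is standard Compose behavior: services declared in one file share a default network where each container is reachable by its service name. A sketch of one possible assembly, in which the image tags, ports, and the OmniPoly environment value are assumptions rather than README facts:

```yaml
services:
  omnipoly:
    image: kweglinski/omnipoly        # hypothetical tag
    ports:
      - "8080:80"                     # assumed container port
    depends_on:
      - libretranslate
      - languagetool
    environment:
      LANGUAGE_TOOL: "http://languagetool:8010"  # reached by service name
  libretranslate:
    image: libretranslate/libretranslate
  languagetool:
    image: meyay/languagetool         # image named in the README sample
```

Note this omits the tmpfs and capability settings the README's LanguageTool snippet carries; copy those from the sample, not from memory.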
Minimum realistic machine:
- 4GB RAM if using OmniPoly + LibreTranslate only (one or two language pairs)
- 8GB RAM to add LanguageTool comfortably
- 16GB+ if you’re running Ollama with a 7B model alongside the rest
What can go sideways:
- LibreTranslate’s startup time varies with how many language models are configured — the container may appear healthy before translation is actually ready
- LanguageTool’s Docker image (`meyay/languagetool`) has specific tmpfs and capability requirements in the sample compose; copying them incorrectly produces silent failures
- Ollama model names are case-sensitive and must match exactly what’s installed in your Ollama instance
- No built-in HTTPS — you need a reverse proxy (Caddy or nginx) in front of OmniPoly for anything beyond local testing
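For the reverse-proxy point, a minimal nginx sketch with TLS and HTTP basic auth in front of OmniPoly might look like this; the hostname, certificate paths, and upstream port are placeholders, not project defaults:

```nginx
server {
    listen 443 ssl;
    server_name translate.example.com;              # placeholder hostname

    ssl_certificate     /etc/ssl/certs/omnipoly.pem;    # placeholder paths
    ssl_certificate_key /etc/ssl/private/omnipoly.key;

    auth_basic           "OmniPoly";
    auth_basic_user_file /etc/nginx/.htpasswd;      # create with htpasswd(1)

    location / {
        proxy_pass http://omnipoly:80;              # assumed container port
        proxy_set_header Host $host;
    }
}
```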
Time estimate for someone comfortable with Docker: 1–2 hours to get OmniPoly + LibreTranslate working. Another hour for LanguageTool if you skip the n-gram data. Ollama integration is fast if you already run it. N-gram download and LanguageTool optimization adds a half-day the first time.
Pros and Cons
Pros
- Genuinely solves a real gap. LibreTranslate’s default UI is limited; LanguageTool self-hosted has no UI at all. OmniPoly fills both gaps with one container [README].
- Modular by design. Unconfigured backends are invisible in the UI — no dead tabs, no broken API calls [README]. You deploy what you use.
- Local-only by nature. No text leaves your infrastructure. For legal, compliance, or personal privacy reasons, this is the architecture that works — nothing phones home.
- File translation support. Upload a document, download the translated version. Not available in LibreTranslate’s basic web UI out of the box [README].
- Persistent preferences. The original motivation for building it — language choices are remembered between sessions [README].
- Ollama integration. AI-assisted sentence extraction and sentiment on top of translation is a combination commercial tools don’t offer at self-hosted price points [README].
- Active development. 9 releases and recent commits suggest the project isn’t abandoned [GitHub].
Cons
- Not a standalone tool. OmniPoly does nothing without at least one backend deployed. Users who see the Docker image and don’t read the docs will be confused immediately [README].
- GPL-3.0 license. Copyleft. If you embed OmniPoly in a product you distribute, your modifications must be open-sourced. MIT this is not. For personal use it’s irrelevant; for commercial embedding it matters [GitHub].
- One maintainer, 104 stars. This is a personal project. There’s no SLA, no issue response commitment, no long-term roadmap beyond what the author publishes. If the author stops maintaining it, you’re on your own [GitHub].
- No authentication. Deploy this on a public-facing port without a reverse proxy auth layer and anyone can use your translation backends. This is a personal-use tool without multi-user access controls [README].
- High RAM requirements at scale. Running all four services (OmniPoly + LibreTranslate with several language pairs + LanguageTool + Ollama) realistically needs 16GB RAM, which is not a $5/month VPS [system requirements].
- Translation quality ceiling. LibreTranslate’s translation quality is meaningfully below DeepL for most language pairs. OmniPoly can’t fix that — it’s a UI layer, not a translation engine. If quality is the primary requirement, DeepL’s API may be better value even with its cost.
- No REST API of its own. OmniPoly is a browser UI, not a service you can call programmatically. For pipeline automation, you’d call the underlying backends directly.
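Calling the backends directly is straightforward, since LibreTranslate's documented `/translate` endpoint accepts a JSON body with `q`, `source`, `target`, and `format` fields. A minimal sketch, assuming a default local instance (host and port are illustrative):

```python
import json
from urllib.request import Request, urlopen

def build_translate_request(text, source="en", target="de",
                            base="http://localhost:5000"):
    """Build the JSON POST that LibreTranslate's /translate endpoint expects."""
    payload = json.dumps({"q": text, "source": source,
                          "target": target, "format": "text"}).encode()
    return Request(f"{base}/translate", data=payload,
                   headers={"Content-Type": "application/json"})

def translate(text, source="en", target="de"):
    """Send the request and return the translated string."""
    with urlopen(build_translate_request(text, source, target)) as resp:
        return json.load(resp)["translatedText"]
```

So pipelines talk to LibreTranslate or LanguageTool directly; OmniPoly only adds value where a human is looking at a screen.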
Who Should Use This / Who Shouldn’t
Use OmniPoly if:
- You already self-host LibreTranslate and/or LanguageTool and want a unified interface for both.
- You translate documents regularly and need file upload/download alongside text translation.
- Your content cannot leave your infrastructure — legal, compliance, or confidentiality requirements eliminate cloud APIs.
- You run Ollama and want AI-assisted analysis of translated content without paying for a commercial tier.
- You value persistent language preferences and a cleaner UI than LibreTranslate’s default.
Skip it (use LibreTranslate directly) if:
- You only need translation — no grammar checking, no AI. LibreTranslate’s API is well-documented and its own UI has improved. OmniPoly adds operational complexity you don’t need.
Skip it (use DeepL) if:
- Translation quality is the primary concern. DeepL’s neural models outperform LibreTranslate on most language pairs, especially for nuanced professional content. At low volume, DeepL’s free tier beats the infrastructure cost of self-hosting.
Skip it (use LanguageTool Premium cloud) if:
- You don’t need translation — just grammar checking. LanguageTool’s cloud product at ~€5/month is cheaper than hosting a full VPS for the self-hosted stack and includes their best models.
Skip it entirely if:
- You’re not comfortable with Docker and don’t have someone to deploy for you. The multi-service dependency chain will produce a frustrating experience without basic container troubleshooting skills.
Alternatives Worth Considering
- LibreTranslate standalone — if translation only, no grammar. MIT license, Docker-deployable, same backend OmniPoly calls. https://github.com/LibreTranslate/LibreTranslate
- LanguageTool self-hosted — if grammar only, no translation. The API is well-documented even without OmniPoly’s UI. https://github.com/languagetool-org/languagetool
- Argos Translate — alternative self-hosted translation engine with offline-first design and different language model coverage. https://github.com/argosopentech/argostranslate
- DeepL API — commercial, but free tier covers 500K characters/month. Substantially better quality than LibreTranslate for European languages.
- Lingva Translate — lightweight LibreTranslate frontend, simpler than OmniPoly but translation-only. https://github.com/thedaviddelta/lingva-translate
- SimplyTranslate — multi-engine frontend (Google, DeepL, LibreTranslate, ICIBA) in one UI. Different architecture — proxies commercial APIs rather than hosting open-source backends.
For the OmniPoly target audience — someone who wants translation + grammar in one self-hosted interface — there isn’t a direct competitor. The choice is either OmniPoly or managing LibreTranslate and LanguageTool as separate browser tabs.
Bottom Line
OmniPoly is a focused, honest project that solves a real problem: LibreTranslate needs a better UI, LanguageTool needs any UI, and there’s no off-the-shelf solution that combines both. The author built it out of personal frustration, added Ollama support as a bonus, and ships it as GPL-3.0 software with Docker Compose configs and reasonable documentation. At 104 stars it’s not a mainstream tool, and that’s fine — it doesn’t need to be. If you’re already running self-hosted language backends and want a single interface to manage them, OmniPoly does exactly that. The caveats are real: multi-service deployment isn’t beginner territory, translation quality is bounded by LibreTranslate’s ceiling, and one-maintainer projects carry inherent continuity risk. But for the specific use case — privacy-first, infrastructure-owned language tooling — there’s nothing else that glues these three services together as cleanly.
If deploying this stack sounds right but setting it up doesn’t, upready.dev handles exactly this kind of one-time self-hosted deployment.
Sources
Primary sources:
- GitHub repository README — OmniPoly (kWeglinski/OmniPoly, 104 stars, GPL-3.0). https://github.com/kweglinski/omnipoly
- LibreTranslate repository — translation backend referenced throughout. https://github.com/LibreTranslate/LibreTranslate
- LanguageTool repository — grammar backend referenced throughout. https://github.com/languagetool-org/languagetool
- Ollama — LLM backend referenced throughout. https://github.com/ollama/ollama
Note: No independent third-party reviews of OmniPoly were available at time of writing. All feature and deployment claims in this article are sourced from the project’s own README and repository. Pricing comparisons for DeepL, Google Translate, and LanguageTool Premium are based on those services’ published pricing pages.