Lost Pixel
Released under the MIT license, Lost Pixel is an automated visual regression testing tool that runs on self-hosted infrastructure.
Open-source visual regression testing, honestly reviewed. No marketing fluff, just what you get when you self-host it.
TL;DR
- What it is: Open-source (MIT) visual regression testing tool — take screenshots of your Storybook stories and app pages, diff them against baselines, catch visual bugs before users do [README][5].
- Who it’s for: Frontend teams and indie devs who want Percy- or Chromatic-style visual testing without the per-screenshot SaaS bills. Especially relevant if you run Storybook, Ladle, or Histoire, or need to test Next.js pages and Playwright flows in one pipeline.
- Cost savings: Chromatic’s paid plans start at $149/mo for 5,000 snapshots; Percy (BrowserStack) runs similar numbers. Lost Pixel self-hosted runs on a CI runner you already pay for, with zero per-screenshot fees.
- Key strength: First-class support for Storybook, Ladle, and Histoire in a single run alongside app pages and Playwright tests — holistic coverage competitors charge enterprise rates for [README][5].
- Key weakness: GitHub is the only supported Git provider for the managed platform tier [website]. GitLab integration exists but requires manual baseline management in the repository itself [3]. The open-source core puts infrastructure on you.
What is Lost Pixel
Lost Pixel is a visual regression testing engine. The loop is simple: capture screenshots of your UI (Storybook stories, application pages, Playwright test states), store them as baselines, and on every subsequent run diff the new screenshots against the baseline. Any visual change — shifted layout, broken font, wrong color after a CSS refactor — shows up as a flagged difference. You approve or reject it through a web UI.
The project describes itself as “an open source alternative to Percy, Chromatic, Applitools” in its GitHub README. That’s an accurate and useful pitch. It sits at 1,648 GitHub stars as of this review, which puts it firmly in the “serious tool with a real community” category without being the dominant player in the space.
What separates Lost Pixel from the old-school pixel-diff approach is the threshold system. Every screenshot can have its own tolerance — absolute pixel count or relative percentage — so a one-pixel antialiasing difference doesn’t halt your pipeline [website]. You also get element masking to ignore dynamic content like timestamps, and the ability to execute custom CSS/JavaScript before snapshotting to normalize state [website].
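As a rough sketch of how those knobs appear in lostpixel.config.ts (field names follow the public docs, but the page path and selector here are hypothetical — verify against the docs for your version):

```typescript
import { CustomProjectConfig } from 'lost-pixel';

export const config: CustomProjectConfig = {
  pageShots: {
    pages: [
      {
        path: '/dashboard',        // hypothetical page
        name: 'dashboard',
        // Mask a dynamic region (e.g. a "last updated" timestamp)
        // so it is ignored during comparison
        mask: [{ selector: '[data-testid="last-updated"]' }],
        // Per-shot tolerance: values below 1 are read as a relative
        // percentage, values of 1 or more as an absolute pixel count
        threshold: 0.05,
      },
    ],
    baseUrl: 'http://localhost:3000',
  },
};
```

The per-shot `threshold` is what keeps one-pixel antialiasing noise from failing the pipeline while a genuinely shifted layout still gets flagged.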
The business model mirrors the standard open-core playbook: the core engine is MIT-licensed and self-hostable. The Lost Pixel Platform adds managed cloud hosting, a review UI, GitHub Actions integration, and team collaboration on top. The open-source mode runs entirely without their servers.
Why people choose it over Chromatic, Percy, and Loki
Versus Chromatic. Chromatic is built and maintained by the Storybook team, which gives it obvious advantages: native Storybook support, sophisticated anti-flakiness infrastructure (they claim 7.3 billion tests run), and 99.9% uptime SLA [1]. But Chromatic is closed-source SaaS. You can’t self-host it. If pricing goes up, you absorb the increase or migrate. Lost Pixel is the escape hatch — the MIT license means the engine keeps running regardless of what nineLemon (the company behind Lost Pixel) does next [4].
The Chromatic team’s own comparison page [1] concedes Lost Pixel’s free open-source mode but frames the infrastructure burden as a reason to stay on Chromatic: “With the open source version of Lost Pixel, your team is responsible for managing parallel test runs… your team is responsible for maintaining the infrastructure end to end.” That’s true, and it’s a real trade-off. The question is whether the cost of that ownership beats a $149–$400/mo SaaS bill for growing teams.
Versus Percy (BrowserStack). Percy was the category originator and is now part of the BrowserStack suite. Enterprise pricing, closed source, usage-based billing. Lost Pixel explicitly positions itself as the open-source replacement. No direct feature-level comparison data was available in the sources, but the positioning is consistent: data sovereignty and cost predictability versus managed convenience.
Versus Loki.js. This is the most interesting comparison because Loki is also open source. Lost Pixel’s own comparison page [5] shows the gaps: Loki has no hosted tier at all — no free screenshot allowance, no paid plans, no collaborative review UI. Lost Pixel adds 7,000 free screenshots per month in its platform tier, multi-source holistic runs, and team-level approval workflows. For teams that have outgrown Loki’s basic diff workflow, Lost Pixel is a natural upgrade without jumping to SaaS pricing [5].
Versus Applitools. Applitools is the AI-powered visual testing platform with smart element-based comparison (not pixel diff). It’s priced for enterprise and targets QA teams rather than frontend developers. Lost Pixel is a direct code-level integration aimed at frontend engineers and CI pipelines. Different buyer, different price point.
The practical translation: Chromatic is the production-hardened SaaS with Storybook pedigree. Lost Pixel is the option you choose when you want the core engine for free and are willing to manage the infrastructure or pay a more transparent platform fee.
Features
Core engine (MIT, self-hostable):
- Storybook integration — first-class, no config beyond pointing at the build [README][5]
- Ladle integration — first-class [README]
- Histoire integration — covered in the Lost Pixel + Histoire stack [4][5]
- Page shots — full-page screenshots from arbitrary URLs [3]
- Playwright integration — capture visual state from E2E test runs [website]
- Multiple breakpoints per test (320, 640, 1024, 1200, 1920 are common) [3]
- Multiple browser support [5]
- Per-screenshot thresholds — absolute pixel count or relative percentage [3][website]
- Element masking — mark areas to ignore during comparison [website]
- Pre-screenshot JavaScript/CSS injection — normalize dynamic content before snapshotting [website]
- Parallel test execution [website]
- Holistic runs — combine Storybook, page shots, and Playwright in a single pipeline pass [5]
- Monorepo support — a specific documented configuration mode [5]
- Docker image for reproducible baseline generation [3]
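A holistic run is just one config that names several sources. A plausible sketch, assuming the storybookShots, pageShots, and breakpoints fields from the docs (the paths and URLs are placeholders):

```typescript
import { CustomProjectConfig } from 'lost-pixel';

export const config: CustomProjectConfig = {
  // Component screenshots from the static Storybook build
  storybookShots: {
    storybookUrl: './storybook-static',
  },
  // Full-page screenshots of the running app in the same pass
  pageShots: {
    pages: [
      { path: '/', name: 'home' },
      { path: '/pricing', name: 'pricing' },
    ],
    baseUrl: 'http://localhost:3000',
  },
  // Capture every shot at multiple viewport widths
  breakpoints: [320, 1024, 1920],
};
```

One config, one CI step, one diff report across components and pages — this is the "holistic run" the comparison pages refer to.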
Platform tier (managed cloud, paid above free tier):
- GitHub Actions native integration [website][1]
- Web UI for reviewing, approving, or rejecting diffs [3][website]
- Before/after screenshot comparison view [3]
- Team collaboration on approvals [5]
- Baseline management hosted on Lost Pixel infrastructure [3]
What’s not there:
- GitLab CI has no first-class integration — baselines must be stored in the repository itself, updated manually via Docker [3]
- No Bitbucket support mentioned in any source
- No AI-powered visual comparison — diffing is pixel-based, not element-based, so thresholds are the mitigation rather than smart rendering analysis [2]
Pricing: SaaS vs self-hosted math
Lost Pixel Platform (managed cloud):
- Hobby: Free — 7,000 shots/month, 5 repos/projects [website]
- Startup: $100/mo — 40,000 shots/month, unlimited repos, $0.006/additional shot [website]
- Business: $250/mo — 100,000 shots/month, unlimited repos, $0.005/additional shot [website]
- Scale: $670/mo — 300,000 shots/month, unlimited repos, $0.004/additional shot [website]
- Open Source projects: free, contact for access [website]
- Enterprise: custom pricing [website]
Self-hosted (open-source mode):
- Software cost: $0 (MIT) [README]
- CI cost: whatever your CI runner bills you — no per-screenshot fee
- Infrastructure: none required — the runner takes and compares screenshots locally
- Baseline storage: in your repository or an artifact store you manage
Chromatic for comparison (public pricing):
- Free: 5,000 snapshots/month
- Starter: $149/mo for 35,000 snapshots
- Business and above: scales to several hundred dollars per month for larger teams
- All plans: per-snapshot billing once you exceed the included quota
Concrete math for a mid-size frontend team:
Say you have 800 Storybook components with 3 stories each, tested at 2 breakpoints: 4,800 screenshots per run. Weekly releases mean roughly 4 runs per month: ~19,200 screenshots per month.
- Chromatic Starter ($149/mo): 35,000 shots included — you’d fit in this tier, just.
- Lost Pixel Startup ($100/mo): 40,000 shots included — comfortable, $49/mo cheaper.
- Lost Pixel self-hosted: $0 per shot. You pay only for CI minutes.
Over a year: Chromatic ≈ $1,788. Lost Pixel Platform ≈ $1,200. Lost Pixel self-hosted ≈ $0 beyond existing CI costs. The gap widens as your component library grows.
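The arithmetic above, spelled out (list prices as quoted in this review):

```typescript
// Screenshot volume for the hypothetical mid-size team
const components = 800;
const storiesPerComponent = 3;
const breakpoints = 2;
const runsPerMonth = 4;

const shotsPerRun = components * storiesPerComponent * breakpoints;
const shotsPerMonth = shotsPerRun * runsPerMonth;

// Annual SaaS cost at the quoted tiers (both cover 19,200 shots/month)
const chromaticAnnual = 149 * 12; // Starter: 35,000 shots included
const lostPixelAnnual = 100 * 12; // Startup: 40,000 shots included

console.log(shotsPerRun, shotsPerMonth, chromaticAnnual, lostPixelAnnual);
// shotsPerRun = 4800, shotsPerMonth = 19200
// chromaticAnnual = 1788, lostPixelAnnual = 1200
```

Note that neither quota is close to exhausted at this volume; the decision is driven by the flat fee and by who manages the baselines, not by overage charges.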
The platform’s free Hobby tier at 7,000 shots/month covers small projects entirely — a design system with ~150 components at one breakpoint fits comfortably.
Deployment reality check
Open-source mode (self-hosted):
Lost Pixel runs as an npm package in your CI pipeline. There’s no server to stand up, no Docker Compose file to maintain. You install it, write a lostpixel.config.ts, point it at your Storybook build or app URLs, and run it as a CI step. The screenshots are diff’d locally. Baselines live in your repository as committed images.
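The minimal Storybook setup follows the shape shown in the project README (adapt the build path to your project):

```typescript
import { CustomProjectConfig } from 'lost-pixel';

export const config: CustomProjectConfig = {
  // Point at the static build produced by `storybook build`
  storybookShots: {
    storybookUrl: './storybook-static',
  },
  // Open-source mode: generate and compare locally instead of
  // talking to the Lost Pixel Platform
  generateOnly: true,
  // Fail the CI step when a diff exceeds its threshold
  failOnDifference: true,
};
```

With this in place, `npx lost-pixel` in CI does the capture-and-diff loop against the committed baselines.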
The GitLab integration documented by Blueshoe [3] uses Docker to normalize rendering environments:
```shell
docker run --rm -v $PWD:$PWD -e WORKSPACE=$PWD -e DOCKER=1 \
  -e LOST_PIXEL_DISABLE_TELEMETRY=0 \
  -e LOST_PIXEL_MODE=update --network="host" \
  lostpixel/lost-pixel:v3.22.0
```
This works, but it adds complexity: you’re pinning a Docker image version, managing the baseline images in Git (they can be large), and manually triggering updates. The Blueshoe team notes this is workable but not elegant [3].
Platform mode (managed cloud):
You add the Lost Pixel GitHub Action to your CI workflow, authenticate with the platform, and Lost Pixel takes over baseline hosting and the review UI. Setup time is measured in minutes. The friction disappears, and so does your self-management burden — but GitHub is the only supported provider [website].
What can go sideways:
- Baseline drift in self-hosted mode: If your CI environment changes (Node version, OS fonts, browser version), all your baselines potentially need regenerating. Chromatic’s “upgrade builds” feature handles this automatically [1] — Lost Pixel open source does not. You’ll get false positives until you regenerate.
- GitLab users: The baseline-in-Git approach works, but binary image files accumulating in your repository is a maintenance concern over time [3].
- Flakiness: Lost Pixel provides retry utilities and custom wait times to fight flakiness, but the burden of diagnosing and configuring them is on you in open-source mode [1][website].
- Parallelism in open-source mode: The platform claims “max possible” parallel runs; open source is limited to what your CI allocates and how you configure it [5][1].
Realistic time estimate for the platform mode on GitHub: under 30 minutes for a project already on Storybook. For GitLab or self-hosted baseline management: 2–4 hours including understanding the Docker workflow and getting baselines committed cleanly.
Pros and Cons
Pros
- Genuinely MIT-licensed core. You can run the engine entirely without Lost Pixel’s servers. The tool doesn’t stop working if the company pivots or shuts down [4][5]. This is a real differentiator from Chromatic and Percy, both closed-source.
- Holistic visual runs. Storybook, Ladle, Histoire, app pages, and Playwright tests all combined in one pipeline pass [5][README]. Competitors tend to silo these — you pay per scope.
- Honest free tier. 7,000 shots/month on the platform for free, including the review UI [website]. Not a crippled demo — usable for real small projects.
- Monorepo-aware. First-class configuration for design systems that feed into product apps, with cross-repo change visibility [5].
- Per-screenshot threshold control. Fine-grained tolerance settings prevent noisy alerts from font rendering differences while catching real layout regressions [website][3].
- Element masking. Ignore dynamic content (timestamps, ads, user avatars) without maintaining brittle exclusion regions [website].
- Transparent pricing. Flat tiers with overage rates published on the pricing page. No “contact sales” until the Enterprise tier [website].
Cons
- GitHub-only for the platform. GitLab, Bitbucket, and other providers have no managed integration. GitLab users are on their own with baselines in Git [3][website].
- Baseline management in self-hosted mode is manual. No equivalent to Chromatic’s “upgrade builds” for infrastructure changes. A Node or browser version bump can invalidate all your baselines [1].
- No AI-based comparison. Pixel diffing with thresholds is the mechanism. For dynamically rendered content with natural variation, you’re configuring tolerances, not benefiting from smart element-level analysis [2].
- 1,648 GitHub stars. Respectable but modest compared to Chromatic’s backing from the Storybook team. Community and ecosystem are thinner; fewer tutorials, fewer Stack Overflow answers.
- Uptime and reliability data not public. Chromatic publishes 99.9% uptime claims and has run 7.3 billion tests [1]. Lost Pixel has no equivalent public numbers. For teams relying on visual testing to gate production deploys, this is an unknown.
- Platform tier UI is adequate, not exceptional. The Blueshoe review notes the before/after comparison view works well [3], but there are no published feature comparisons on interaction testing, accessibility integration, or advanced filtering that Chromatic’s addon ecosystem provides [4].
- Small company. nineLemon is the company behind Lost Pixel. Size and runway are not disclosed. Vendor risk is higher than tools backed by larger companies or large open-source foundations.
Who should use this / who shouldn’t
Use Lost Pixel if:
- You’re running Storybook, Ladle, or Histoire and paying Chromatic bills that are growing faster than your headcount.
- You’re on GitHub and want a managed visual testing platform at a lower price point than Percy or Chromatic.
- You need to combine component, page, and Playwright visual testing in one run — not three separate tools.
- You’re building a design system that feeds into multiple apps in a monorepo.
- Data sovereignty matters: you don’t want screenshots of your proprietary UI leaving your infrastructure.
- You’re an open-source project looking for free visual regression coverage.
Skip it (use Chromatic) if:
- You want the tool with the best Storybook integration, the most infrastructure investment, and the largest community knowledge base.
- You need anti-flakiness features that self-configure and require zero maintenance.
- Your team relies on the broader Storybook addon ecosystem (interaction testing, accessibility, Figma embedding).
- You need published reliability SLAs for visual testing as a deployment gate.
Skip it (use Percy/Applitools) if:
- You’re on Bitbucket or Azure DevOps and need first-class CI integration.
- Your testing team, not your frontend developers, owns visual testing, and they need enterprise-grade tooling with support contracts.
Skip the platform (self-host with open-source mode only) if:
- You’re on GitLab and willing to accept baselines in Git — this works but requires ongoing attention.
- You want zero external service dependencies and control every byte of the pipeline.
Alternatives worth considering
- Chromatic — the category leader for Storybook-first teams. Closed source, managed infrastructure, anti-flakiness engineering, 99.9% uptime claim. Priced at $149/mo and up [1][4].
- Percy (BrowserStack) — original visual testing SaaS, now bundled with BrowserStack. Enterprise pricing, broadest CI integrations, no self-hosting option.
- Applitools — AI-based element comparison rather than pixel diffing. Smart about dynamic content. Enterprise pricing and audience.
- Loki.js — open-source, no managed platform tier, no collaborative review. Functional but limited [5].
- Playwright visual comparisons — Playwright ships built-in screenshot comparison. No separate tool required, but no collaboration UI, no baseline hosting, and it’s per-test rather than holistic across your component library.
- Storyshots (deprecated in Storybook 8) — was the old way to do Storybook snapshot testing. Storybook now officially recommends the Storybook test-runner or Chromatic as replacements.
- Histoire — a Vite-native Storybook alternative for Vue teams, pairs well with Lost Pixel as reviewed by alisoueidan.com [4].
For a frontend team on GitHub that wants to escape Chromatic bills, the real shortlist is Lost Pixel Platform vs. Chromatic. The cost difference is real; the infrastructure and reliability gaps are also real. Pick Lost Pixel if the savings justify accepting more self-management. Pick Chromatic if visual testing reliability is non-negotiable and $50–$100/mo is worth not thinking about it.
Bottom line
Lost Pixel is the most credible open-source alternative to Chromatic for teams that want visual regression testing without the managed SaaS dependency. The MIT license is genuine, the holistic test runs are a real feature advantage, and the pricing is transparent. The gaps are equally real: GitHub-only managed platform, no AI-based comparison, manual baseline management when things shift under you, and a smaller company with less published reliability data than Chromatic. For a solo developer or small team running Storybook on GitHub and watching a growing Chromatic bill, Lost Pixel Platform at $100/mo is a legitimate replacement. For teams where visual regressions blocking production deploys is a hard requirement and engineering time for maintenance is expensive, Chromatic’s overhead is worth it. The open-source core remains available either way — that alone is reason enough to try it before signing anything.
Sources
- [1] Chromatic — “Lost Pixel vs Chromatic” (comparison page, updated March 2026). https://www.chromatic.com/compare/lost-pixel
- [2] Rishabh Kumar, Virtuoso QA — “AI Visual Testing: How It Works, Top Tools & Practices” (April 18, 2026). https://www.virtuosoqa.com/post/ai-visual-testing
- [3] Robert Stein, Blueshoe — “Visual Regression with Lost Pixel and Gitlab” (updated March 7, 2025). https://www.blueshoe.io/blog/visual-regression-in-gitlab-with-lost-pixel/
- [4] Ali Soueidan — “Comparing Visual Regression Testing Solutions: Chromatic + Storybook vs. Lost Pixel + Histoire”. https://alisoueidan.com/blog/comparing-chromatic-storybook-with-lost-pixel-hostoire
- [5] Lost Pixel — “Loki.js vs Lost Pixel” (official comparison page). https://www.lost-pixel.com/loki-js-vs-lost-pixel
Primary sources:
- GitHub repository and README: https://github.com/lost-pixel/lost-pixel (1,648 stars, MIT license)
- Official website: https://lost-pixel.com
- Pricing page: https://lost-pixel.com (pricing section)
- Documentation: https://docs.lost-pixel.com/user-docs