unsubbed.co

Open-Meteo

Released under AGPL-3.0, Open-Meteo provides a weather API with open-data forecasts that can run on self-hosted infrastructure.

Open-source weather data, honestly reviewed. No API key, no vendor lock-in, just what you actually get.

TL;DR

  • What it is: Open-source (AGPLv3) weather API that aggregates data from national weather services — NOAA, DWD, ECMWF, Météo-France, JMA, and others — and exposes it as a clean JSON REST endpoint [1].
  • Who it’s for: Developers building weather-dependent apps, ML engineers training models on historical climate data, and self-hosters who need reliable weather data without per-call pricing [1][2].
  • Cost savings: Non-commercial use is free with no API key, no registration, no credit card. Commercial users can self-host on their own infrastructure instead of paying per-call to proprietary providers [1][2].
  • Key strength: Zero-friction access — one curl command, sub-10ms response, 80 years of historical hourly data included at no additional cost [1].
  • Key weakness: AGPLv3 license means code modifications must be open-sourced if deployed; the hosted API has a non-commercial restriction that pushes commercial apps toward self-hosting or a commercial agreement; self-hosting at full scale requires serious storage infrastructure [1].

What is Open-Meteo

Open-Meteo is a weather API that does something unusual: it shows its work. Most weather APIs are black boxes — you send coordinates, you get a forecast, and you trust that the vendor’s model is good. Open-Meteo publishes its full source code on GitHub, explicitly credits every national weather service whose data it processes (NOAA GFS/HRRR, DWD ICON, Météo-France Arome and Arpège, ECMWF IFS, JMA, GEM HRDPS, MET Norway), and explains exactly how it ingests, stores, and serves that data [1].

The pitch is direct: the same national weather models that back expensive commercial APIs are freely downloadable. Open-Meteo does the hard part — parsing binary meteorological file formats, handling grid projections, storing 50TB of time-series data in a custom format optimized for fast retrieval — and then exposes the result through a REST API that returns JSON in under 10ms [1][2].

Over 2TB of weather model data are downloaded and processed daily. The historical archive covers 80 years at hourly resolution. For non-commercial use, none of this requires an account. The API is available at api.open-meteo.com and routes via GeoDNS to servers in Europe and North America to minimize latency [1].

The project has 5,007 GitHub stars and is licensed AGPLv3 for the code and CC BY 4.0 for the underlying data, a split that has meaningful implications for commercial use [1].


Why people choose it

Third-party review coverage of Open-Meteo is sparse in available sources — most discovery happens through GitHub, developer forums, and word of mouth in the ML and self-hosting communities. Based on what the project itself documents and the problems it explicitly solves, three use cases dominate:

The no-friction developer integration. The weather API market typically requires account creation, API key management, quota dashboards, and rate limit anxiety before you write a single line of code. Open-Meteo removes all of that for non-commercial use. There is no signup. You send a GET request with latitude and longitude and you receive a forecast. That matters for prototyping, open-source projects, and side projects that don’t need vendor relationships [1][2].
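That "one GET request" claim can be sketched with nothing but the standard library. The endpoint and the `latitude`/`longitude`/`hourly` parameter names are taken from the official docs [1][2]; the helper function is ours:

```python
from urllib.parse import urlencode

# Public forecast endpoint -- no API key or account for non-commercial use.
BASE = "https://api.open-meteo.com/v1/forecast"

def forecast_url(latitude: float, longitude: float, hourly: str = "temperature_2m") -> str:
    """Build the GET URL for an hourly forecast at the given coordinates."""
    query = urlencode({"latitude": latitude, "longitude": longitude, "hourly": hourly})
    return f"{BASE}?{query}"

# Fetching is one more call -- uncomment to hit the live API:
# import json, urllib.request
# with urllib.request.urlopen(forecast_url(52.52, 13.41)) as resp:
#     data = json.load(resp)
#     print(data["hourly"]["temperature_2m"][:3])

print(forecast_url(52.52, 13.41))
```

That is the entire integration surface: a URL. There is no token refresh, no SDK, and no quota dashboard to wire up first.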

The ML and data science use case. The 80-year historical weather archive is the feature that separates Open-Meteo from most alternatives. Hourly resolution, global coverage, 10km grid, 50TB of data — accessible through the same simple API endpoint as the live forecast. For training models that need weather as a feature (energy demand, agriculture, logistics, e-commerce), this is either unavailable or expensive from commercial providers. Open-Meteo makes it free and accessible programmatically [1].
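The archive returns the same JSON shape as the live forecast: parallel arrays under an `hourly` key, which zip directly into feature rows. The sketch below parses a sample payload shaped like the documented response; the archive host name (`archive-api.open-meteo.com`) is taken from the project's docs and should be verified before production use:

```python
import json

# Sample payload standing in for a live Historical Weather API response, e.g.
# https://archive-api.open-meteo.com/v1/archive?latitude=52.52&longitude=13.41
#     &start_date=2020-01-01&end_date=2020-01-01&hourly=temperature_2m
sample = json.loads("""{
  "hourly": {
    "time": ["2020-01-01T00:00", "2020-01-01T01:00", "2020-01-01T02:00"],
    "temperature_2m": [1.4, 1.2, 0.9]
  }
}""")

def to_feature_rows(payload: dict, variable: str = "temperature_2m") -> list[tuple[str, float]]:
    """Zip the parallel time/value arrays into (timestamp, value) rows for an ML pipeline."""
    hourly = payload["hourly"]
    return list(zip(hourly["time"], hourly[variable]))

rows = to_feature_rows(sample)
print(rows[0])  # ('2020-01-01T00:00', 1.4)
```

Because forecast and archive share one response shape, the same parsing code serves both training (historical) and inference (live) paths.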

The self-hosting path for commercial use. Because the AGPLv3 code is deployable via Docker or prebuilt Ubuntu packages, commercial users can run their own instance and sidestep both the non-commercial restriction on the hosted API and the per-call billing model of proprietary alternatives. The project explicitly frames self-hosting as the path for “demanding applications like machine learning or large language models” that need “practically unlimited API calls” [1][2].

What’s notable is what Open-Meteo does not do: it doesn’t try to compete on hyperlocal precision with proprietary sensors, doesn’t offer a dashboard for non-technical users, and doesn’t bundle enterprise sales. It is, unambiguously, a developer tool.


Features

Core forecast endpoints:

  • Hourly weather forecast up to 16 days [1]
  • Current conditions endpoint [2]
  • Intelligent model blending: high-resolution regional models (down to 1.5km) for near-term forecasts, global models (11km) for the extended range [1]
  • Weather model updates every hour for Europe and North America [1]

Specialized APIs (same interface, no additional authentication):

  • Historical Weather API: 80 years of hourly data at 10km resolution; recent weeks at 1km via archiving of current regional models [1]
  • Marine Forecast API: wave height, swell direction, water temperature, ocean currents [1]
  • Air Quality API: PM2.5, PM10, ozone, nitrogen dioxide, UV index [1]
  • Flood API: river discharge forecasts from hydrological models [1]
  • Geocoding API: city and location name lookup to get coordinates for the forecast API [1]
  • Elevation API: terrain elevation at any coordinate [1]
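"Same interface" is concrete: each specialized API lives on its own host but accepts the same coordinate query, so one URL builder covers all of them. The host names below follow the official docs [1][2] but are assumptions to verify; the variable names (`wave_height`, `pm2_5`) are illustrative:

```python
from urllib.parse import urlencode

# One host per specialized API, one shared query shape, no extra auth.
ENDPOINTS = {
    "forecast":    "https://api.open-meteo.com/v1/forecast",
    "marine":      "https://marine-api.open-meteo.com/v1/marine",
    "air_quality": "https://air-quality-api.open-meteo.com/v1/air-quality",
}

def api_url(kind: str, latitude: float, longitude: float, hourly: str) -> str:
    """Build a GET URL for any of the endpoints above."""
    query = urlencode({"latitude": latitude, "longitude": longitude, "hourly": hourly})
    return f"{ENDPOINTS[kind]}?{query}"

print(api_url("marine", 54.3, 10.1, "wave_height"))
```

Switching from forecast to marine or air-quality data is a dictionary lookup, not a new vendor integration.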

Weather models available individually: NOAA GFS + HRRR, DWD ICON, Météo-France Arome + Arpège, ECMWF IFS, JMA, GEM HRDPS, MET Norway — accessible as separate endpoints if you need a specific model rather than the blended forecast [1].

Infrastructure:

  • No API key, no authentication for non-commercial use [1][2]
  • CORS supported — works directly from browser JavaScript [1]
  • Sub-10ms response times via GeoDNS-routed servers [1]
  • No ads, no tracking, no cookies [1][2]
  • Docker and prebuilt Ubuntu packages for self-hosting [1]

Licensing split worth understanding: The code is AGPLv3 (if you deploy a modified version, you must open-source those modifications). The data is CC BY 4.0, meaning weather data from Open-Meteo can be used commercially with attribution, even on the hosted API. The non-commercial restriction applies specifically to free use of the hosted infrastructure; the data license itself is permissive [1].


Pricing: SaaS vs self-hosted math

Open-Meteo hosted API:

  • Non-commercial: free, no API key, no stated rate limits [1][2]
  • Commercial: the public website does not publish explicit pricing tiers in available sources — the implication is self-hosting or a direct commercial arrangement with the project [1]

Self-hosted:

  • Software: $0 (AGPLv3)
  • Server costs: depends heavily on scope. A minimal deployment serving regional forecasts for recent days is manageable on a mid-tier VPS ($20–40/month). A full deployment with the complete 80-year historical archive requires storage at the 50TB scale — that’s dedicated server territory, not shared hosting [1]

The ML use case math is unambiguous. If a machine learning pipeline needs to fetch weather features for millions of coordinate-timestamp pairs from a commercial API charging per call, the bill compounds fast. A self-hosted Open-Meteo instance, once configured, serves those requests from local storage at sub-10ms with no incremental cost per query [1]. The hosted non-commercial API also works here if the project isn’t commercial — which covers most academic research and open-source ML projects.
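The compounding is easy to make concrete. The per-call price below is invented purely for illustration (real commercial rates vary by provider and tier); the point is the shape of the curve, not the exact figure:

```python
# Hypothetical metered-API bill for one ML training run.
# PRICE_PER_CALL is an illustrative assumption, not any vendor's real rate.
CALLS = 10_000_000        # coordinate-timestamp pairs fetched per run
PRICE_PER_CALL = 0.0001   # hypothetical commercial rate, USD

metered_cost = CALLS * PRICE_PER_CALL
print(f"metered API bill: ${metered_cost:,.0f} per run")  # metered API bill: $1,000 per run

# A self-hosted instance serves the same requests from local storage with no
# per-query charge, trading the metered bill for fixed server + storage spend.
```

Repeat that run weekly during model development and the metered bill scales linearly while the self-hosted cost stays flat.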

Commercial pricing data for alternatives (OpenWeatherMap, Tomorrow.io, WeatherAPI, Meteomatics) is not in available sources and won’t be invented here. The general pattern in the weather API market is per-call or per-month pricing that scales with volume, which is exactly the cost structure Open-Meteo eliminates via self-hosting.


Deployment reality check

Open-Meteo is not a standard self-hosted application. It is a data ingestion and processing system.

What running it actually involves:

  • Installing Docker or Ubuntu packages for the API server [1]
  • Configuring automated downloads of weather model files from national weather services — over 2TB per day in binary meteorological formats (GRIB/NetCDF) [1]
  • Sufficient disk for the data you want to cache. A subset (recent forecasts, one or two regions) is manageable. The full historical archive is 50TB [1]
  • Reliable bandwidth for continuous upstream data ingestion
  • Familiarity with Linux systems administration
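The disk line item above can be sized with back-of-envelope arithmetic. The 2TB/day ingest figure is the project's own [1]; the retention window and regional-subset fraction below are our assumptions, chosen only to show why a typical VPS falls short:

```python
# Back-of-envelope storage sizing for a partial self-hosted instance.
DAILY_INGEST_TB = 2.0   # stated upstream download volume [1]
RETENTION_DAYS = 14     # assumption: keep two weeks of model runs
KEPT_FRACTION = 0.25    # assumption: store only a regional subset

needed_tb = DAILY_INGEST_TB * RETENTION_DAYS * KEPT_FRACTION
print(f"~{needed_tb:.0f} TB of disk")  # ~7 TB of disk
```

Even this deliberately trimmed configuration lands at single-digit terabytes, an order of magnitude beyond the 100-200GB disks most VPS plans ship with.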

What can go sideways:

  • The storage requirements are the primary blocker. Most self-hosting guides assume a VPS with tens or hundreds of gigabytes. A full Open-Meteo deployment needs infrastructure planning, not just a Hetzner CAX11.
  • The project processes binary weather model files using a custom file format and compression technique — there’s no relational database here, and debugging data ingestion issues requires understanding NWP model formats [1].
  • No third-party deployment guides or managed hosting options appeared in available sources. You’re working from official documentation.

Realistic time estimates:

  • A developer comfortable with Docker and Linux: a few hours to a working minimal instance for recent forecast data in a limited region.
  • A full production deployment with historical archive: a multi-day infrastructure project.
  • For most developers building apps: use the hosted non-commercial API during development, decide on the self-hosting path only when commercial scale demands it.

Pros and cons

Pros

  • Truly zero-friction access. No account, no API key, no credit card. One GET request to a public endpoint returns a full weather forecast in JSON. Nothing in the weather API market matches this onboarding experience for non-commercial use [1][2].
  • 80-year historical archive included. Hourly resolution, global coverage, 10km grid, accessible via the same API as live forecasts. For ML applications that need historical weather as a feature, this is the clearest differentiation [1].
  • Six specialized APIs under one roof. Marine, air quality, flood, geocoding, and elevation alongside the core forecast — same interface, no additional setup [1].
  • Transparent data sourcing. Every upstream weather service is credited and documented. You know exactly which model powers your Berlin forecast and which powers your Tokyo forecast [1].
  • Sub-10ms response times. The custom time-series file format and GeoDNS routing deliver performance that would require significant infrastructure spend to replicate commercially [1].
  • No tracking, no ads, no cookies. The API treats your requests as requests, not as behavioral data [1][2].
  • Data is CC BY 4.0 — commercially usable with attribution. Even if you need to self-host for commercial use of the infrastructure, the data itself can power commercial products with proper credit [1].
  • Self-hostable for unlimited commercial throughput. No per-call metering once you run your own instance [1].

Cons

  • AGPLv3 license on the code. Deploy a modified version and you must open-source your changes. For proprietary weather technology built on top of Open-Meteo internals, this is a hard constraint [1].
  • Non-commercial restriction on the hosted API. Using api.open-meteo.com in a commercial product without a commercial agreement is out of compliance. Enforcement relies on the honor system, but the restriction is real [1][2].
  • Self-hosting at scale is serious infrastructure. The 50TB historical archive and 2TB/day ingestion requirements put full deployment well beyond typical VPS self-hosting [1].
  • No SLA, no uptime guarantee on the free tier. The hosted API is best-effort. For production commercial apps, availability risk is unmanaged [2].
  • No UI or dashboard. This is an API-only tool. Non-technical users cannot explore or download data without a developer in the loop.
  • Commercial pricing is opaque. The path from non-commercial free to commercial paid is not documented in available public sources — discovery requires direct contact [1][2].
  • Limited public third-party review coverage. Unlike more established tools, there are few independent production case studies available to validate real-world performance and reliability claims.

Who should use this / who shouldn’t

Use Open-Meteo if:

  • You’re building a non-commercial app, open-source project, or side project and need weather data without vendor friction.
  • You’re a data scientist or ML engineer who needs historical hourly weather data at scale — this is Open-Meteo’s sharpest use case.
  • You’re running commercial weather-dependent infrastructure at high query volumes and can invest in the self-hosting overhead to eliminate per-call costs.
  • You want to know exactly where your weather data comes from and have the ability to audit the processing pipeline.

Skip it (consider commercial APIs) if:

  • You need a service-level agreement and uptime guarantee for a production commercial application. Open-Meteo’s hosted tier offers no formal SLA.
  • You need enterprise support, incident response, or a contractual data quality commitment.
  • Your compliance team requires a vendor relationship with signed data processing agreements.

Self-hosting isn’t practical if:

  • You’re on shared hosting or a small VPS — the storage requirements for anything beyond recent regional forecasts are prohibitive.
  • You need a quick commercial deployment without operational overhead. The non-commercial hosted API works for prototyping; commercial self-hosting requires infrastructure planning.

Open-Meteo isn’t the right category if:

  • You need hyperlocal precision from proprietary ground sensors below the 1km resolution that national weather models provide.
  • You need a weather data dashboard for non-technical team members — there is no UI.

Alternatives worth considering

OpenWeatherMap — the most common starting point for weather API integration. Free tier for low volume (1,000 calls/day), commercial plans scale by call volume. Closed-source, no self-hosting, but the widest integration ecosystem and extensive community documentation. Easier commercial onboarding than Open-Meteo.

Meteostat — open-source historical weather and climate data project with a similar open-data philosophy. Complementary rather than competing: Meteostat is strong on station-based historical observations, Open-Meteo on model-based forecasts and very long historical archives.

Tomorrow.io — commercial provider focused on hyperlocal precision and real-time nowcasting, with proprietary model technology. Appropriate for logistics, agriculture, and enterprise applications requiring contractual data quality guarantees. Not self-hostable.

Meteomatics — commercial weather API with SLA, enterprise support, and broader variable coverage. The choice when you need a vendor relationship and Open-Meteo’s terms don’t fit.

Direct national weather service APIs (NOAA API, DWD Open Data, ECMWF MARS) — Open-Meteo is effectively a usability layer over these. If you need raw model output for scientific research, going directly to the source is valid — but you inherit all the binary format and grid-processing complexity that Open-Meteo abstracts away.

For a developer building a non-commercial app or ML pipeline, the realistic comparison is Open-Meteo (free, no account) versus OpenWeatherMap (easier commercial path, smaller free tier). For commercial applications needing SLAs, Open-Meteo’s hosted tier doesn’t fit regardless of price.


Bottom line

Open-Meteo occupies a narrow but clearly defined position: it’s the weather data API for developers who need to start immediately without creating an account, and for ML practitioners who need 80 years of historical hourly data without filling out a contact form. For those two audiences, nothing comparable exists at this price point. The tradeoffs — AGPL on the code, non-commercial restriction on the hosted API, serious storage overhead for full self-hosting — are real but don’t touch the core use cases. The audience that finds those tradeoffs painful (commercial apps needing SLAs, teams needing non-technical dashboards) isn’t the target anyway. If you’re paying per-call for weather data in a high-volume ML pipeline or a non-commercial application, there’s no justification for that expense once you’ve seen what Open-Meteo offers for free.


Sources

  1. Open-Meteo GitHub Repository — README, license information, feature documentation, deployment instructions. https://github.com/open-meteo/open-meteo (5,007 stars, AGPLv3 + CC BY 4.0)
  2. Open-Meteo Official Website — homepage, API documentation, feature overview, data licensing. https://open-meteo.com/
