Hosting the simplest of maps (a static map) costs $2 per 1,000 requests. Imagine you integrate such a map into content. Even for smaller projects it's not uncommon to get traffic in the hundreds of thousands or even millions per month. I'm talking page views, not unique visitors. That would set you back hundreds per month, or even $1,000. Just to show a bloody map. Actually, not a map. It's a static picture of a map.
Add some basic map interactivity, even just things like pins using the JavaScript API, and it becomes 7 times more expensive than that. I'm not kidding.
Say you make a map with pins showing current wildfires. That costs $14 per 1,000 requests. Now do the math for your page gaining some basic popularity. 100,000 page hits? Not a big deal. That'll be $1,400, please.
In most projects you want your map to be more useful. Add (reverse) geocoding, place details, etc. That's 3 more APIs. Assuming equal usage, that's over $5,000.
In one month. For a smallish audience. To enable fairly basic functionality. You can rent a goddamn Mercedes AMG for that kind of money.
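To sanity-check the arithmetic, here is a quick sketch using the per-1,000-request rates quoted above ($2 static, $14 dynamic via the JavaScript API), with the extra APIs assumed to cost a similar rate:

```python
# Back-of-the-envelope monthly cost at the quoted per-1,000-request rates.
def monthly_cost(page_views: int, rate_per_1000: float) -> float:
    """Dollars per month for a given number of map-loading page views."""
    return page_views / 1000 * rate_per_1000

views = 100_000                                   # a modestly popular page
static_map = monthly_cost(views, 2.0)             # $200 for static images
dynamic_map = monthly_cost(views, 14.0)           # $1,400 for the JS map
# Add geocoding, reverse geocoding and place details, assumed ~$14 each:
with_extras = dynamic_map + 3 * monthly_cost(views, 14.0)  # $5,600

print(static_map, dynamic_map, with_extras)
```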
Also, it appears each reload is chargeable, so anyone who wants to make your life hard can trivially run up your bill (alternatively, it pushes the cost and complexity of avoiding such attacks onto anyone trying to provide a service built on Google Maps).
Or you could use a reasonable API like Mapbox and get better quality data for much, much cheaper.
Google Maps data quality has been tailing off. I encounter missing roads that are on the county parcel viewer and in OpenStreetMaps yet Google couldn't be bothered to ingest this public data. These roads aren't new, you can go back and look at county GIS exports from a decade ago and the road is there in the exact same place.
Google is still the king of points of interest and local businesses, the places a tourist would actually care about. Mapbox is not within one or two orders of magnitude when it comes to this data.
You're missing the point entirely. It's not about the Mercedes. If anything, Google makes it impossible and unaffordable to serve up the useful information you desire.
I’m relatively excited about the pollen API. I suffer from severe seasonal allergies. I know exactly what I’m allergic to because I’ve seen an allergist and I’ve undergone immunotherapy, and while that helped it didn’t cure it.
I’ve struggled to find a good source of location specific pollen forecasting and historical data. The historical data is probably the most important for me because it would allow me to start taking medication weeks before the first pollen event, allowing for maximum therapeutic effect.
I’m probably going to try to build something for myself later today and see if I can get the kind of data I’d like to see out of it.
Just don't disregard the warning in the other comments about google not being a reliable API provider. Especially with maps and weather APIs Google's track record is abysmal, and I personally had to swap out APIs for a FOSS project of mine because their API went away.
As an independent hacker, Google Maps and Mapbox have been out of reach for years given their high cost. Sure, you get a free tier, but as soon as you go just a little above that it can quickly cost a small fortune. It's a bit sad for the developer community, because there would without a doubt be a lot of smaller, innovative projects utilizing maps if the pricing made that possible.
I remember a few years ago when the address autocomplete API went from free to thousands of dollars. That was a surprise bill I won’t be forgetting anytime soon.
I remember at a previous job someone cut the cost by only doing look-ups after the first three characters had been typed, and at another job someone got an unexpected five-figure bill (they had some explaining to do).
Esri has a free address auto-complete API, and unlike Google, which suggests addresses that aren't legal, USPS-deliverable addresses, you nearly always get real, valid addresses from Esri, since they pull from USPS's dataset (USPS being the only org that is legally required to be notified when a city or county renames a road, renumbers a block of buildings, etc.).
That's a neat project. A key difference is the Google dataset resolves down to the address level (at least in the US). So it is able to account for tree cover over individual buildings, which can be determinative to the economics of a solar install.
> Google launches applications based on BreezoMeter acquisition
> BreezoMeter, acquired by Google last year for $225 million, develops technology for predicting environmental hazards related to air quality and its impact on health.
I really wish Google would offer better lower-level APIs, and stop trying to do high-level APIs like "solar potential" and "pollen prediction".
I basically just want a data dump of their whole map, like OSM provides, but I understand why they don't want to do that...
So instead give me access to the whole maps database to run bigquery queries over it, and make me pay per record I touch or record in the result set.
I'd love to be able to answer questions like "How far, on average, are my customers' home addresses from the nearest footpath?", or "What percentage of residential addresses in the UK have an ATM within 1 mile?", or "Give me a route from A to B, but not via any unpaved streets", or "Give me a list of the 100 biggest cities that have no ice rinks".
I wonder if it's because there isn't that much map data to be had, not like search data, so no matter what price they set their competitors could siphon off all their data for relatively cheap.
And sure, a user agreement is a way to litigate after the fact (assuming you catch them in the act) but once they have the data it doesn't matter.
I suspect the reason they don't do this is because they want more pricing control.
For their solar API for example, they know how much time solar surveyors spend getting roof profiles, and they can therefore price their product to maximize revenue.
However, if they just let you access all the data for any use case, there will be plenty of people who pay Google a few bucks for data that might cost millions of dollars to obtain any other way.
I get why this feels like a good idea, and I've wanted similar myself, but having built APIs, ensuring that users use them correctly is hard. Designing APIs that are hard to use wrongly is tricky, but it's a much better approach than relying on documentation.
There are lots of other reasons and it’s easy to jump to Google wanting to protect their data, but creating correctly usable APIs is a generally applicable reason that is perhaps more charitable.
What's wrong with "all the maps stuff is in this huge BigQuery table; here are a bunch of example queries to get you going"?
SELECT business_name, business_phone FROM pois WHERE business_type = 'bank' AND place_review_count > 10 ORDER BY distance(Location('New York'), location) ASC LIMIT 10;
Doesn't seem particularly hard for users to use 'right' - most people using an API will have some technical background and be able to use that.
From a business perspective, presumably they would like to charge radically different amounts for only slightly different queries.
A realtor who stands to make $10,000 on a house sale might happily pay a few bucks to know the nearest schools, shops, bus stops, broadband availability etc for a property listing, but only want to do one or two searches per week.
A lawyer closing the sale of a house might happily pay $50 for a report on flood risks and nearby planned developments, if it's the most reliable data available.
At the other end of the spectrum, finding the nearest EV chargers for a vehicle navigation system? Users don't pay per search in vehicle navigation systems, even a tenth of a cent per search is too much.
Companies who are willing to talk to their customers resolve this by the sales team drawing up a different contract, and maybe even a completely different pricing scheme, for every customer. But as Google wouldn't deign to talk to a customer over a mere $100k/year they don't have this option.
From a practical perspective, structured map data often ends up with a ridiculously complicated schema, to accurately represent a reality where rules can be arbitrarily complicated. There are roads which don't allow 'trade or business vehicles except permit holders and taxis'; other roads change direction depending on the time of day; a junction might have some legal turn restrictions which don't apply to emergency vehicles, but some physical or logical turn restrictions which do. A schema complex enough to represent that sort of thing correctly will be hard to query.
"map stuff" is much harder to understand than your query implies. As an example:
Have you tried understanding all the possible things you can get in 'address_components' depending on the input params, the region you are querying from, the region your results are coming from, the geopolitical situation around that data, the entity that was matched, etc.?
Don't forget entities inside other entities, like businesses within a mall.
Don't forget that the user might want a specific service from the bank. The "bank" label doesn't tell you whether it accepts street traffic, has live tellers, is just an ATM, or is a corporate office, etc.
This is easy to solve... just publish 'views' of the data simplified for common use cases, and also offer the raw data for those who like joining 50 tables to answer extreme corner cases, like whether the bar down the road's wheelchair-accessible toilet will be open during the summertime hour shift.
Example, purely hypothetical for this scenario, but based on a real experience I had recently...
What if there was a data source that went into this from a third party? What if that third party had requirements about the timeline of removing data? This could be because it contained user-generated content and something was found to be offensive or even illegal, or it could be because it contained something proprietary that needed to be correctly licenced, all sorts of possibilities.
If you provide a data dump, you then need any integrators to add a data refresh process. That's an additional step that many won't do. Now you can't honour contracts or potentially legal requirements.
Sure you can document this, sure you can put it in contracts, but that may not be enough.
> but having built APIs, ensuring that users use them correctly is hard
Having used APIs, I don't really care if the organization who built it thinks I'm using it "incorrectly" as long as I'm getting the data out of it that I want. As long as I'm paying for it, I honestly don't think they have the right to say I'm doing it "wrong."
I built a checkout API. Using it incorrectly could easily result in a user ordering items they didn't want to, paying an amount they didn't expect, selecting the wrong shipping, billing the wrong card, and so on.
Building the API in such a way that it was hard for clients to get this wrong was a very key concern for us.
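As an illustrative sketch only (not the actual API described above; every name here is hypothetical): one common way to make a checkout call hard to misuse is to reject raw floats for money and require an explicit idempotency key, so a double-submitted request can't double-charge.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    cents: int       # integer cents only, so no float rounding surprises
    currency: str    # e.g. "USD"

def charge(amount: Money, card_token: str, idempotency_key: str) -> dict:
    """Charge a card; the idempotency key makes accidental retries safe."""
    if amount.cents <= 0:
        raise ValueError("amount must be positive")
    # ... a real implementation would call the payment processor here ...
    return {"charged": amount, "idempotency_key": idempotency_key}

# charge(19.99, ...) is now a type error in a checked codebase; callers
# must spell out the unit and currency explicitly:
receipt = charge(Money(cents=1999, currency="USD"), "tok_x", "order-42-try-1")
```

The point is that the shape of the call makes the dangerous mistakes (wrong units, silent retries) unrepresentable, rather than documenting them away.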
Separately though, I gave a better motivating example in my other reply here. tldr: what you might be paying for may come with stipulations that are not clear in the data – such as how up to date data is when used. Those stipulations may come through many layers of data source, not necessarily from the layer you pay.
> I don't really care if the organization who built it thinks I'm using it "incorrectly" as long as I'm getting the data out of it that I want
That's the way I think about it for stable APIs. But for APIs that are under constant development, using them "incorrectly" often results in your feature being inadvertently broken by developers who are unaware of your use case.
Neither can Google maps. Many countries have official maps, some are even open data, but even those don't have perfect coverage of "unofficial" features like footpaths.
Here in the Czech Republic, a combination of OSM and official data (ZABAGED) is the best bet, somewhere else it could be Google's data, if they made it available.
It depends where you're looking. In the US, I think Google would do pretty well, better than OSM, for e.g. this ATM question. Obviously, no map is always 100% accurate.
True, but that's kind of a chicken-egg problem. If OSM were more popular the resulting influx of contributors and other resources would solve the data quality problem pretty quickly.
Click on the gear button in the top left to see, edit and disable the customization.
Also, the Isochrone or Shortest Path Tree (SPT) API can be used to answer "How far, on average, are my customers' home addresses from the nearest footpath?" or "What percentage of residential addresses in the UK have an ATM within 1 mile?" (fetch the ATM locations e.g. via Overpass or a reverse geocoding API).
Note that I'm one of the founders of GraphHopper.
Yeah, and then someone would just export all of the data if they offered low-level access. I can't think of any reason Google would take that risk for minimal API fees.
For most of the data, it's already possible to siphon it off. Just browse with an instrumented browser and you can download all the 3D map data for a city in a few minutes from a single session, simply emulating a user zooming around the city.
Business data can likewise be methodically downloaded; thousands of businesses per user account per day doesn't hit their rate limits.
The polygons of roads and stuff are also easily extracted from a browser.
And the Android/iOS mobile apps allow you to 'download for offline use', which is obfuscated but not properly encrypted.
Charging $1 per 1,000 records for API access would make it massively expensive for a competitor to download all the data; besides, it's fairly easy to hide a few erroneous entries in maps data and catch anyone using it to start a clone of Google Maps.
> Business data can likewise be methodically downloaded
Google doesn't allow you to store the data permanently. Just because it's technically possible doesn't make it legal.
From the Google Places API terms "you must not pre-fetch, index, store, or cache any Content except under the limited conditions stated in the terms"
"You can display Places API results on a Google Map, or without a map. If you want to display Places API results on a map, then these results must be displayed on a Google Map. It is prohibited to use Places API data on a map that is not a Google map."
I was thinking of users who just want to take all the data and ignore any T&Cs. Since the LinkedIn web-scraping case, T&Cs on data on a public website effectively don't apply anyway.
I don't think mapping data from Google Maps is accurate enough to do this. Perhaps you're better off using a service that uses OSM data. But it depends on the region and use case.
Can we just get an API to request saved lists such as “favorites” and “want to go”?
My motivation is that there seem to be some consistency issues with the backend. I save a spot, and then it gets reverted. Given that I have over 6,000 spots saved, it's impossible to keep track of when Google didn't commit or, worse, reverted a past pin.
Is Solar API basically Project Sunroof but open to developers to put in any geo boundaries? This feels huge for scouting and conceptualizing solar projects.
We have a small backyard operation that I want to scale to support a lifestyle, then have passive income as well.
Anybody know if there's a way to add the pollution layer to regular Google Maps? I'm asking after reading about the new NASA TEMPO satellite. Or how to get the map from NASA? I see no links in their announcement.
What is the source of the data though? If I want the latest AQI for Mumbai, Delhi, and Bengaluru, how do I know where Google is getting the data from? Interesting product spin though ...
If google really cared about the environment, they wouldn't have mandated return to office. Anything they say from then on regarding the environment is null and void.
That's nice, but it would've been better if it included basic features like the ability to extract and list places overlapping custom area polygons as well.
This is great, but it doesn't work for applications that don't have the luxury of the internet. What are some decent map alternatives for air-gapped (no internet access) applications?
On Android (and iPhone I guess) OsmAnd is a navigation app that works offline (with map data from OpenStreetMap) and exposes an API as well as examples on how to use its core in other apps [1].
Unless it's internal/admin-only I can't see myself building on top of Google Maps APIs again. It's just too dang expensive. Maybe if some of the products I wrote were making money per-user (directly to me at least, I'm mainly B2B and so I could pass on the cost but the businesses don't care enough to pay it) I could justify it but just showing an interactive map of restaurant locations for a local food week ran up a couple hundred dollars. I spent a few hours and switched over to use ProtonMaps [0] and I've been very happy. I still use the Google API on the admin side to aid in looking up addresses but that usage is tiny compared to all the people viewing the data.
That's all for a personal project but I've seen Google Maps costs spiral out of control at 2 different companies I've worked at. I pushed for OSM/etc at one company but was essentially told "Nobody ever got fired for buying Google Maps" (we were just drawing polygons on a map) and I think I might be successful at pushing for ProtonMaps (OSM under the covers) at my current.
Google Maps lets you get your foot in the door "for free" but once you pass the free tier it's insane.
One place I worked had been using a Google Map, but switched to Bing when the price went up. The Bing map had a banner on it saying they needed a license key, so they just hid the banner with CSS! I came in a few years later and replaced the whole thing with Leaflet+Carto, but it was pretty funny code archeology to see "Hmm, why is this crazy CSS class here. Oh…"
Which is funny because it seems misaligned with Google’s business model. Shouldn’t they be the cheapest option, and generating revenue by ad placements in 3rd party apps?
I suppose the ad value of static images in tiles is low, and there’s no way to ensure that developers honor JavaScript or other interactivity for ads?
I was once on a project that started out with OSM, but then realised there was nothing that compared to Google's Places API (might be different now) in terms of completeness and correctness. And you are not (or at least were not) allowed to use the Places API on top of / in combination with OSM.
That's what you get when your consumer-directed products are 'free' and thus widely used, and thus lots of stakeholders ensure their place is correct in the database (probably in combination with other, possibly community-based, efforts).
We relied heavily on Places API and thus were bound to Google Maps API, we did just not see a reasonable way around that.
> That's what you get when your consumer-directed products are 'free' and thus widely used, and thus lots of stakeholders ensure their place is correct in the database (probably in combination with other, possibly community-based, efforts)
Would it be crazy to have the government maintain such a database and make it available to anyone? It seems like the ideal place to join forces and avoid data duplication.
Well, we were looking worldwide, so that would be a lot of governments. Having said that, here in the Netherlands I could see it happening; the government does have (or had) quite a few open data initiatives.
>Do you want those people managing something like maps or places on maps
Yes. The USPS is actually pretty good at it too. They even have their own API you can use.
>Think about your interactions with the government
My public roads, building codes, food safety, food affordability, stable power grids, advanced medical care, space exploration.
The government is plenty capable of doing good things. However, people letting carefully crafted legislation that supports private interests get passed, or even created, is a societal illness, not a fundamental issue with government capabilities.
In my opinion, the USPS is mostly an advertisement delivery system, and every once in a while they deliver a very important letter from the IRS so that you have to look through all of it.
I don’t know how we tolerate this federal system that is literally 80% spam advertisements
Yes, all digital interactions I have with the government, like renewing my license or paying a parking ticket, are generally very quick and easy. Filling out my taxes is not particularly fun, but that isn't the IRS IT department's fault, and the actual act of submitting them online is again very quick and easy. I have also had very good experiences with government public datasets (examples at https://catalog.data.gov/dataset).
At my job I use APIs from the EPA and EIA, and they are stable, well-documented, and generally pleasant to use. I can also email an actual human with questions and get quick responses, which I can't do with Google. I have no concerns about the APIs ever disappearing because the PM got bored or wanted a promotion, or because they couldn't extract enough money from users.
I think whoever manages these services would do an excellent job running a mapping service or Places API, although obviously it would be hard to get traction now that Google Maps is so established.
Some of the features that make Google Maps useful, like GPS and public transit arrival times, are already based on government services.
I have indeed paid my federal, state, and local taxes online, and it was not too terrible.
On the other hand, the USPTO patent database is nearly impossible for anyone other than a professional to navigate. Their search sucks.
Since you mentioned taxes: Intuit and H&R Block have successfully prevented the IRS from doing what nearly everyone wants: filing your taxes for you, for free. For the vast majority of taxpayers they could do that, yet political pressure from paid tax preparers has prevented it, at least up to now.
And this illustrates the flaw with your starry-eyed Pollyanna-ism: no matter how honest and hard-working the bureaucrats might be, they're still subject to corrupt pressures.
The 1950 census data, recently released, also sucks. To find one family whose exact address I wasn't sure of, I had to manually open about 50 strips of census taker daily records and decipher the hand-written names & addresses. Merely finding these 50 strips took more effort than nearly anyone would exert.
Their search failed to find the family. I don't think anyone could figure this out unless they were extremely motivated.
And last but certainly not least, let's look at the Obamacare website rollout, which got a Cabinet secretary fired.
So don't give us a couple good anecdotes and say, "look, the government works great!"
Those services work very well in many countries. In the United States we have one party that likes to show government doesn't work by crippling, defunding, and generally making government programs worse when they get power, so that you pay private companies for those same services. Oftentimes they are companies they have a connection with.
NASA, NOAA, etc. do weather; AccuWeather and others use federal weather data as the underlying source (and lobby, and even install themselves in power, to try and keep the government side less user-friendly).
I mean, when it comes to interactive maps you have to load their JS to display them. Unlike OSM, you don't have (legit) access to the raw tiles. You would think that base maps would be a loss leader for them, but with its current pricing that doesn't seem to be the case. The Places API and Directions API are even crazier in pricing; I never even consider using them. I'd love to give exact driving/walking distances, but the cost to do so would be astronomical, so instead I just use the haversine formula to give "as the crow flies" distances.
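For reference, the haversine ("as the crow flies") fallback described above is only a few lines; a minimal stdlib-only sketch assuming a spherical Earth:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; fine for rough distances

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Paris to London, roughly 344 km as the crow flies:
print(haversine_km(48.8566, 2.3522, 51.5074, -0.1278))
```

It undershoots real driving/walking distances, but it costs nothing per request.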
I would love to see some of the more complicated Google Maps APIs unseated by 3rd party providers (Places/Distance, I guess Solar/Pollen/etc too but I care less about those).
That’s not really “raw tiles” in the OSM or other 3rd-party map provider sense, especially since you can’t really save/cache them (and you can’t legally use them with something like Leaflet).
> generating revenue by ad placements in 3rd party apps
What worked for websites doesn't work for apps. Have you seen apps with ads built into them? They look horrendous. I usually uninstall them within 5 seconds of seeing them.
Google must diversify its sources of revenue; that's the purpose of hardware, Cloud, YouTube Premium and others. It is also why they sell access to APIs.
When I worked in a startup with location-based apps we also were hit by that Google Maps API price increase.
However, my understanding is that they are still free for use on mobile, and maybe that's the place Google really cares to be, at any cost? As for the web, they already control all the data they want with their analytics, ads, and Chrome browser.
I mean, this one I actually get. The cost under the hood must be immense. They run Street View cars, buy satellite data, have developers, QA, support. All for a product used by most for free.
I mean, charging a price for it, yes, but expecting everyone to be able to pay it is a fantasy.
But they also run a first-party maps app; the price of the Street View cars, satellite data, development, and QA isn't all just for the API.
I'm not really criticizing; they can charge whatever they want if they don't want people to use their Maps API. And I suspect they don't really want people to use it: the pricing seems targeted to milk the orgs who can't be bothered to implement a cheaper option.
Does anyone else have a solution for the satellite layer? I was using MapLibre [0], then needed direct-on-the-ground images, which made me convert to Google Maps.
If you're willing to pay for it, a company I worked for switched from Google Maps to TomTom maps, and TomTom does offer aerial image maps. I don't remember the specifics of exactly how much cheaper TomTom was vs Google Maps, but I remember it being on the order of an 80% discount.
For a potential USA-only solution, (I don't know if it will fit or be usable for your use case) the USDA provides aerial imagery under the "National Agricultural Imagery Program." [1]
I was the Ads rep to the Maps Monetization team at Google in 2010, and then I actually worked in Maps. Back then, Maps was a huge money loser. Placing ads outside the map data was a dismal failure financially, and they weren't yet selling "ads" inside the map (where you pay to have your business shown).
API access was charged for, but as people have pointed out, it was a bargain compared to now.
In any large company, sooner or later the CFO and minions will notice that you're losing money and demand you fix it. Thus, YouTube has all these ads that interrupt your viewing, besides coming up before you start. And Maps is raising API prices.
If you read these articles, you'll see that ArcGIS has a fairly massive footprint in the geo space. Google is not the only game in town for geo information. I haven't done much with Apple Maps, but that seems to be improving, too.
You can find details of what I did, including how to use a feature I did that's still there, at:
I wish YouTube would use some AI to pick more appropriate points in the video to insert their ads. The current viewing experience with ads is simply too jarring. Maybe that's the point :-)?
Beyond the free tier and $200 credit, prices are so insane that they would make Musk and that Reddit CEO blush. Have a look yourself:
https://mapsplatform.google.com/pricing/#pricing-grid