WebGPU hits 40% availability in the subset of websites that intentionally put a WebGPU testing iframe running bleeding-edge JS in their sites. Even the JS used in collector.js relies on vanilla JS features that aren't in all browsers, so it will simply collect no information about those browsers, or about browsers that don't run its JS at all.
Do you have better stats on its adoption? I am basing this on collecting capabilities from roughly 30K browsers per day, which is at least decent. You can argue it is biased, but that is what the breakdowns are for.
Also, which features in collector.js do not have wide browser support? If you let me know, I can address it. Is the number of browsers that cannot run this script statistically significant, i.e. more than 1% of devices?
I do have error reporting on the script, so I know that if the script runs, I catch all the errors. I guess I am missing the cases where the script fails to compile in the first place.
BTW I just pushed a new version of collector.js that uses ES6 rather than ES2020 as the TypeScript compile target.
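For reference, that change is a one-line compiler option; roughly this, assuming a standard tsconfig.json (a sketch of the relevant fragment, not the project's actual config):

    {
      "compilerOptions": {
        "target": "ES6"  // was "ES2020"; ES6/ES2015 output runs on far more browsers
      }
    }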
You need error reporting as a separate script injected into the HTML page early on, before your main script is ever loaded. This error-reporting script should be as bare-bones and backwards-compatible as possible.
It's from a checkout page (the most important page where fancy JS isn't worth a single lost sale) so it goes to great pains to catch all errors and not be itself a source of JS errors. It intercepts errors and POSTs them to the server with a good-faith attempt at getting the browser, faulting script, line number, column number, error message, and stack trace.
It was actually in a <script> tag and not in an external js, but that shouldn't matter too much. You can see the comments and code to support actually ancient browsers (predating IE6 and Firefox 24).
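For anyone who wants to replicate the approach, here is a minimal sketch of such an early-loaded reporter. The /js-error endpoint and field names are placeholders, not the actual script, and truly ancient IE would additionally need an ActiveXObject('Microsoft.XMLHTTP') fallback for XMLHttpRequest:

    <script>
    // Bare-bones error reporter: inline, first in the page, ES3-era syntax only.
    window.onerror = function (msg, url, line, col, err) {
      try {
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/js-error', true);
        xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        xhr.send(
          'msg=' + encodeURIComponent(msg || '') +
          '&url=' + encodeURIComponent(url || '') +
          '&line=' + line + '&col=' + (col || 0) +   // col/err are newer onerror args
          '&ua=' + encodeURIComponent(navigator.userAgent) +
          '&stack=' + encodeURIComponent(err && err.stack ? err.stack : '')
        );
      } catch (e) { /* the reporter must never throw itself */ }
      return false; // do not suppress the default error handling
    };
    </script>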
Chrome (including the original Chromium and forks like Edge and Brave) holds 80-90% of the browser market share, and the vast majority of those installations either autoupdate or are manually updated on a regular basis.
The logical conclusion is thus that whatever features Chrome introduces will be widely available for common use within a short period of time.
This is the positive side to a monopolized browser market: Whatever Chrome supports is what the commons can support.
Naturally it depends on where you are: by Statcounter’s figures, Australia and USA are both under 62%, and India is over 95%.
(Note that Statcounter is far from reliable: its data comes from trackers that are blocked by most ad/content blockers, which means it is likely to significantly undercount Firefox especially.)
And two other factors to consider here:
• On a feature like this, browser support is only one of the gating factors: you also need graphics card/driver support. For a long time, WebGL support was way lower than the browser support charts suggested for this reason. (Subpoint: WebGPU is not a cohesive whole; devices may support some features and not others, which will always make life more difficult, and that's about the graphics card hardware and driver, not the browser. A probing sketch follows this list.)
• At this time, Chromium hasn’t shipped WebGPU on all platforms—only desktop platforms. Mobile platforms will lag, and I imagine that low-end devices will just not support WebGPU for many years to come.
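To make the gating concrete: browser API, usable adapter, and optional features are three separate checks, and any of them can fail independently. A hedged sketch (the function name and the shape of the result object are my own):

    // Three independent gates: browser support, hardware/driver, optional features.
    async function probeWebGPU() {
      if (!navigator.gpu) return { supported: false };   // browser doesn't ship the API
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) return { supported: false };         // no usable GPU/driver
      return {
        supported: true,
        features: Array.from(adapter.features)           // optional, per-device capabilities
      };
    }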
It is not a positive but a negative side of a monopolized browser market. Unrestrained power to define communication standards and protocols will eventually be used for malicious purposes, and there will be no players left in the market to oppose it.
I too would be more than happy for Mozilla to be replaced, but thinking that Google is any less cancerous is simply foolish. In fact, one of the biggest problems with Mozilla (though not the only one) is that it is effectively a Google vassal that can't afford to challenge its master. As for all the problems unrelated to Google reliance, you can bet that as another SV tech corp, Google has them all too, and then some.
As someone else pointed out, you're overestimating Chrome/ium's market share.
Regardless, after the web.dev/baseline announcement, I looked at Browserslist and one of our sites' analytics, and it is shocking how many people are not using the last two versions of evergreen browsers. There is a long tail of browser versions in those stats.
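If you want to put a number on that long tail, the browserslist package can report how much of global usage a query actually covers (the usage numbers come from caniuse data and vary a lot by region, so treat this as a rough check):

    // How much of the audience do "the last two versions" really cover?
    const browserslist = require('browserslist');
    const browsers = browserslist('last 2 versions');
    console.log(browsers);                         // e.g. ['chrome 126', 'chrome 125', ...]
    console.log(browserslist.coverage(browsers));  // global usage share, in percent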
If you only use JS to collect statistics, you are only collecting statistics about JS users. And that is not everyone. And I'm not just talking about weird nerds like myself who intentionally disable it, but also the underserved billions of the world without the latest year's smartphone.
It's a fatal flaw in pretty much all modern web stats collection that makes JS and bleeding-edge JS support seem like far more of the pie than they are. And any time it's brought up, those same people say it's not worth actually collecting information about all hits, say via the webserver logs, because the difference is small. But they never connect the dots...
What do you mean by "the underserved billions of the world without the latest year's smartphone"?
Sure, there is a small set of people who choose not to run JS, or who use something like lynx that doesn't support it, but almost every browser version released in the last 20 years, certainly every graphical one, has by default good enough JS support to gather basic statistics and send them back to the server.
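Even a pre-XHR browser can phone home with an image beacon; a minimal sketch against a hypothetical /beacon endpoint:

    // GET "beacon": works in essentially every graphical browser ever shipped.
    var img = new Image();
    img.src = '/beacon?sw=' + screen.width + '&sh=' + screen.height +
              '&ua=' + encodeURIComponent(navigator.userAgent);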
Look at collector.js: within the first dozen characters it's already using ECMAScript 6 features like promises. At the very least you need to change that "last 20 years" to the last 5 years.
You are gently omitting IE, which doesn't have this at all. It doesn't matter when the others were released; at that time IE had a market share that was non-negligible.
I get what you're trying to say, but a lot of these ES6 things have been supported for longer than you'd think, and are near-universal.
Near-universal: not universal. But people still running very old devices or browsers are not going to have a fun time on many sites regardless, and WebGPU is not likely to be used casually on random sites: it will be used for the same stuff that WebGL and canvas are today, mostly special-purpose things like games, some visualisations, etc. – the sort of stuff that very old and slow devices would have trouble with in the first place.
Promises have been supported since 2014 in both Chrome for Android and Safari on iOS. Even pretty old phones generally run web browsers more recent than that — and that's not "the last five years," that's nearly twice as long.
If you are going by the dates that Google and Apple browsers first supported it, sure. But if you tried to use ECMAScript 6 features back in 2014? Hah! The support was not there. 5 years is a realistic number.
You argued that using these features today blocks "billions" of people using old smartphones. Promises have been supported by smartphone browsers for over nine years. I do not think that using promises today prevents anywhere near billions of people from running your code, nor do I think that "5 years is a realistic number" for a feature that has been supported for nine. The question is whether using promises today impacts a significant number of web users.
> you are only collecting statistics about JS users. And that is not everyone.
Okay. I can count anyone who has JS disabled (and I know how to detect that in the iframe) as also not having WebGPU available, because WebGPU is exposed exclusively through JS: if JS is disabled, WebGPU is effectively disabled too. I'll add that.
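The usual trick is a <noscript> image beacon inside the iframe: the image is only fetched when scripting is off, so every hit can be logged as a no-JS (and therefore no-WebGPU) visitor. A sketch with a hypothetical endpoint:

    <noscript>
      <!-- Only requested when JS is disabled; log it as js=0, webgpu=0. -->
      <img src="/collect?js=0&webgpu=0" alt="" width="1" height="1">
    </noscript>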
> Users with an older smartphone on a slow connection could fail to run the script even if it's not disabled, if it's a separate network request.
Hmm... technically correct, but I wonder if this is statistically relevant? Remember that I am currently collecting 30K browser samples per day. I'd have to miss 300 samples in a day this way for it to cause an error of 1%.
From what kind of sites? The kind that know and care about WebGPU support enough to include your iframe? That's some heavy selection bias, if my assumption is true. Not a lot of low-end smartphones visiting the kind of site run by that kind of person.
That's what I figured. So that supports the idea that it's only measuring a very skewed set of visitors from WebGL, JS, and gfx-designer type sites. I'd put the iframe on my very non-JS, non-WebGL, non-gfx sites to skew the other way, but I don't put JS scripts on my sites.
But then it wouldn't report anything, and assuming those cases are roughly representative of the cases where it does work, it doesn't really skew the results.
It would be great if browser makers published their own statistics on this kind of thing. They presumably have the real data. But they don't share it, so the next best thing is for a community of interested developers to try to collect it themselves. Of course that's not as good as the real data that the browser makers have, but what better option is there if you need data like this?
WebGPU availability factoid is actually just statistical error. Noscript Georg, who lives in a cave and browses over 10 sites a day using Emacs, is not an outlier and should have been counted.