I've been doing mobile professionally for well over a decade, and every three months or so something happens in that world which annoys me into thinking that the web is the future again. Every time I've actually done anything to look into it, though - well, mobile's a mess, but the web is a whole other level of horrific.
Guess what I spent this afternoon doing.
This is why backend server devs seem to actually make it to middle age, whereas front end types go to burning man one time and never come back.
I'm not quite sure why there's such a negative opinion of the front-end web stack around these parts.
Yeah, HTML/CSS/JS have lots of warts and hangovers from their respective document-oriented origins. But recent developments have been great, and there are myriad tools to make working with these technologies pretty enjoyable (SASS in particular changes the whole business of styling.)
I think that ultimately, building cross-platform, cross-device apps is always going to involve a fair bit of work. The web's still the best tool for doing that, and it's getting better.
I'm confused why WHATWG is concerned about W3C plagiarizing their work on an HTML5 spec, when the alternative is competing specs which drift further apart. Surely interoperability is key here?
I'm very much a distant observer here so I might not even know what I'm talking about, but my understanding is that a large part of the problem is that W3C declares a spec as final and stops updating it, while WHATWG is still working on it. So we do end up with competing specs that drift further apart over time. It would be preferable for W3C to just link to the WHATWG docs which are actually maintained.
Exactly. Unfortunately they can't even bring themselves to do this.
See e.g. http://www.w3.org/TR/html5/references.html#refsURL wherein they add, in a non-normative note, a link to the WHATWG URL spec, but in the normative reference text instead link to an old, outdated working draft from 2012 that specifies an incorrect algorithm and APIs, like getParameterNames(), that do not exist in any browser.
It's very tragic, and not at all good for the health of the web.
"...anything that slows the improvement of the Web means programmers are more likely to devote their energies to writing apps for smartphones and tablets running on Apple's iOS and Google's Android operating systems instead of HTML5..."
And rightly so. Being locked into a store worries me much, much less than being locked into using JS and HTML.
Well, when I want the internet I launch a browser. When I need something else, I launch an app. More likely than not it gets data from some web server over HTTP as JSON. It may even use bits of HTML for some views. Who cares.
I don't get this obsession with defeating native on mobile. Like, I don't get it at all. Some dark thoughts start to creep in: maybe it's just lazy webdevs who see mobile as the new hotness but can't be bothered to learn native, trying to pull the only thing they know into that space? If embedded programming becomes the new hotness, will we see the same?
Why complain about developers developing for Android and iOS, but not mention those developing for Windows or OS X?
And throwing in half-baked (at best) features just so some capabilities can be checked off the HTML-on-mobile laundry list won't make them any more fun to work with.
How's that offline web apps thing going?
Essentially every Android phone or tablet permits side-loading of apps and installing third-party app directories ("stores"). People may not use them very much, but the freedom is still there.
It annoys me when the internet and the www so obviously get conflated. The internet is pretty much built on open standards, and open TCP/IP implementations exist for the most obscure of hardware, while HTML and JS are what I'd like to call de facto non-standard.
I'm not conflating the two by any means. Two of the internet's shining sources of freedom are HTML and JavaScript, especially when compared to the walled gardens of app stores.
That doesn't preclude other sources, and as you've noted, HTML and JavaScript are Web technologies, which are of course a subset of... the internet! Besides, these days not all IP packets are transited equally.
> I'm not conflating the two by any means. Two of the internet's shining sources of freedom are HTML and JavaScript, especially when compared to the walled gardens of app stores.
As far as I'm concerned, you are comparing two relatively small and often overlapping subsets of internet based technology, and you are making assumptions about the internet as a whole based on that comparison. I don't mean to say that you don't understand the difference between the words, but saying that HTML/JS offers the most freedom seems to ignore the less restrictive technologies it is built on.
Surely, the ability to use an open source software stack to create a socket connection and send arbitrary data to a different machine across the network offers more freedom than HTML/JS in a general sense of the word.
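Just to make that concrete, here's roughly what that baseline looks like in a few lines of Node.js (the host, port and payload are placeholder values, not anything real):

    // Open a raw TCP socket and send arbitrary bytes; no HTML or JS needed on
    // the other end, just something listening on the port.
    var net = require('net');

    var socket = net.connect(7000, 'example.org', function () {
      socket.write(new Buffer([0xde, 0xad, 0xbe, 0xef]));  // any bytes at all
    });

    socket.on('data', function (chunk) {
      console.log('got', chunk.length, 'bytes back');
      socket.end();
    });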
As for not all IP packets being transited equally, I'd say that it's more the result of internet freedom than it is of any sort of restrictive policy. The way to stop it is either a more restrictive law or an inconvenience great enough to get people to vote with their feet.
Why not? Phones and tablets offer beautiful and very usable platforms that make it easy and painless for developers to create apps.
Contrast that to the preferred emacs/vim + terminal tools of your average developer. No one is trying to make this easier for us, and those that are get ignored (like Adobe Muse, maybe?).
Is there actually a secure way to add support for other languages to the web? No matter what, it will need to run in a hardware-agnostic VM. So let's say, for the sake of argument, they added built-in support for Java (just the language, not the APIs) in browsers. Would that come free of any security costs, or would the added semantics grow the attack surface or make vulnerability testing more difficult?
Yep. Just compile it to javascript/asm.js and you get basically native speed, the same security sandbox as javascript, and all the same problems with the web, the vast majority of which have absolutely nothing to do with javascript.
I believe you're correct that supporting more languages won't fix the issues people have with developing for the web. What they really want is the APIs/frameworks for those languages, and in that regard compile-to-JavaScript tools would serve them better, since the only secure way to give developers multiple APIs to work with is through a translation tool.
But several people seem to insist that's a poor solution. It sounds like they want direct support for other languages in the browser, but how can browsers securely support other languages directly?
Yes, wanting some other language in the browser is a very common wish on hacker news. It's just completely and utterly impractical in real life.
Even if you found some way to get some other language into the browser without translating to javascript (and this has been done before; remember VBScript?)...
the new guest language has to co-exist with any javascript that might also run on the page, share the same memory, share the same DOM objects, share the same GC, and avoid all the potential nasty browser-crashing bugs that could result from trying to do that. And you have to ship the runtime for that language along with the browser, or as (sigh) another plugin. And guess what: no backwards compatibility.
Given that, you then have to cope with the fact that javascript's GC might not totally make sense for your language, and do all the workarounds that requires.
Compiling to asm.js gets around this by essentially giving you a low-level VM whose bytecode just happens to look like javascript, and whose behavior when interpreted as javascript happens to be correct. When run in Firefox, the code is short-circuited and ahead-of-time compiled. It doesn't have to deal with the javascript GC, since it allocates a heap for the guest program ahead of time as well.
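For anyone who hasn't looked at what that actually means, here's a minimal hand-written sketch shaped like an asm.js module (real asm.js is normally generated by Emscripten, and this toy may not pass the strict validator, but it still runs as ordinary javascript, which is exactly the fallback behaviour described above):

    // The host allocates the heap up front; the guest code manages it itself
    // instead of leaning on the javascript GC.
    function MiniModule(stdlib, foreign, heap) {
      "use asm";
      var HEAPU8 = new stdlib.Uint8Array(heap);  // typed view over the heap

      function fill(start, len, value) {
        start = start | 0;                       // coerce arguments to int
        len = len | 0;
        value = value | 0;
        var i = 0;
        for (i = start; (i | 0) < ((start + len) | 0); i = (i + 1) | 0) {
          HEAPU8[i >> 0] = value;                // write into the guest heap
        }
      }

      return { fill: fill };
    }

    var heap = new ArrayBuffer(64 * 1024);       // pre-allocated guest heap
    var mod = MiniModule(this, {}, heap);
    mod.fill(0, 16, 42);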
This is really cool if you think about it. You get javascript's more or less secure sandboxing, you get access to all the APIs, and you can interact just fine with existing javascript libraries, with about the lowest amount of overhead you can get short of the NaCl approach (which is basically just a revisit of ActiveX).
The only downside to all this:
it's not the web.
This is not the web. It runs in a web browser, but it's downloading a blob up front, ahead-of-time compiling it, and running a raw executable inside a browser host.
There are considerable advantages to zero-install programs and games. But you know, exchanging one kind of opaque, inaccessible blob in a webpage for a different kind that just happens not to need plugins is not that great.
Accessibility is a good thing, and we shouldn't be too eager to throw it away for the shiny.
I don't care about this in the slightest. I would gladly see something that replaces the http/html/css/js combination with something more dynamic that involves less text parsing...
compiled html with something that resembles protocol buffer would make webapps much smoother.
> compiled html with something that resembles protocol buffer would make webapps much smoother.
no no no no no no no no no
You know what a compiled web app looks like? Here:
Content-Encoding: gzip
And presto! It works! It is human readable, and yet the representation the machine sees is compact!
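If you want to see that whole "compiler" in action, a minimal Node.js sketch (the file name and port are made up) looks something like this: the HTML on disk stays human readable, and the representation on the wire is the compact one.

    var http = require('http');
    var zlib = require('zlib');
    var fs = require('fs');

    http.createServer(function (req, res) {
      fs.readFile('index.html', function (err, html) {        // readable source
        if (err) { res.writeHead(500); return res.end(); }
        var accepts = req.headers['accept-encoding'] || '';
        if (accepts.indexOf('gzip') === -1) {
          res.writeHead(200, { 'Content-Type': 'text/html' });
          return res.end(html);
        }
        zlib.gzip(html, function (err, zipped) {               // compact on the wire
          res.writeHead(200, {
            'Content-Type': 'text/html',
            'Content-Encoding': 'gzip'
          });
          res.end(zipped);
        });
      });
    }).listen(8080);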
You know what makes webapps slow? It's badly-written Javascript. It's bad router and proxy connections. It's creating forty-odd connections to thirty-odd 3rd party CDN and analytics and ad platforms. It's gigantic images that aren't sized correctly. It's badly-written CSS that misuses transforms and graphics commands.
It's not JS implementations being slow. It's not HTML documents being hard to parse. It's not a shortcoming in HTTP.
Your comment suggests a lack of familiarity with the problem domain.
We had compiled web apps. First there were Java applets, which were a buggy, insecure, ugly PoS. Then we had ActiveX, which was even worse. By comparison html/js/css have produced fairly good results. Parsing text is not a bottleneck for today's machines, as any low-level nerd would know.
>By comparison html/js/css have produced fairly good results.
Um, based on what? My browser routinely takes up gigabytes of memory just to show me 'one PDF' worth of content. Analytics scripts steal CPU time and power that I pay for. HTML/CSS's shitty, ambiguous spec without a reference implementation means that no two browsers will ever work alike. And if you're on mobile, all that means your battery life is screwed. And um... security?
So you are arguing java and activeX are better than web apps? Because that is the comparison you are responding to.
The 1/100 of a penny worth of electricity that analytic scripts cost you a year is worse than the security nightmare of java applets and activeX? Come on.
>So you are arguing java and activeX are better than web apps? Because that is the comparison you are responding to.
Sure, I personally don't see why that's controversial. They're no different from running JS or flash/silverlight plugins or what have you. My problem with HTML/JS/CSS is that they suck right from the design down to the implementation. With Java and ActiveX, the suckage exists, but mostly at the configuration, deployment and implementation steps. They are redeemable in my view.
>The 1/100 of a penny worth of electricity that analytic scripts cost you a year is worse than the security nightmare of java applets and activeX? Come on.
I see the nightmare coming from IE, FF and Chrome in terms of browser vulnerabilities. Java's installed base is tiny compared to those three.
Also, my CPU usage routinely goes over 50% (E8400) when browsing. I don't think reading a bunch of text should require your CPU to spike like that. I admit the analytics part is an aside, because you can bolt it onto Java or ActiveX as well.
The fact that Java, ActiveX, Flash, and JavaScript became goto (sic) technologies only underlines how much the web has always been the poster child for 'worse is better.'
The irony is that the web stack has become so complex you may as well build apps using one of the many mainstream compiled languages for app logic, and work with an improved DSL for styling and markup. (Which is more or less what Go+SASS/etc are becoming anyway.)
The inevitable next stage will happen when the W3C discovers functional programming - I'm guessing around 2020 - and we'll have Greenspun our way to the 10th law again.
If you want it to change, develop something to change it.
I'm not aware of a single technology with a goal to fundamentally upset any of the things you listed.
Personally, I'm satisfied with javascript development. Is it perfect? No. Are there lots of things that could be done better? Absolutely.
But it frustrates me how many people complain about how much it sucks when there are no projects (with any support) attempting to really change things. If my opinion of javascript is wrong and it really is that bad - I'd think there would be more of a movement to move away from it.
> But it frustrates me how many people complain about how much it sucks when there are no projects (with any support) attempting to really change things. If my opinion of javascript is wrong and it really is that bad - I'd think there would be more of a movement to move away from it.
Programmers can't always just start something "new"; they have to work on existing platforms to be able to deploy their software quickly and efficiently. This video explains that even though javascript was never intended to run a 3D engine, it now is able to. As he explains in the video, it's a hack, and it doesn't work that well for most binaries.
If I could, I would start a new browser, without html, with more dynamic languages, with a clang VM, with protocol buffers, etc. But in this age of patents and market shares in IT, I don't expect it would get enough exposure for users to install this future browser. If a software company can't deploy its app on the dominant systems, it's screwed. But that doesn't mean JS fits every possible job.
Compiled html is not a compiled web app. The Java VM, ActiveX, those things are not open and can hardly be extended, so they're not going to become standards or be federated.
Also, http relies on tcp, so it will always be synchronous; there is hardly any going back and forth between the client and server. Ajax is not something that's easy to deploy.
I don't have data on this, but it feels like solving the wrong problem. Is text parsing overhead a significant drain on modern systems?
The size difference over a wire between gzipped/deflated text and protocol buffers is also usually dwarfed by almost any other asset being loaded on the page (images, fonts, videos etc).
The benefits of human readability and simplicity probably outweigh those size differences, I think.
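Easy enough to sanity-check, assuming a Node.js version with zlib.gzipSync; the file names here are placeholders and the numbers will obviously vary per page:

    // Compare the wire size of gzipped HTML against a typical image asset.
    var zlib = require('zlib');
    var fs = require('fs');

    var html = fs.readFileSync('page.html');
    console.log('raw HTML bytes:    ', html.length);
    console.log('gzipped HTML bytes:', zlib.gzipSync(html).length);
    console.log('hero image bytes:  ', fs.statSync('hero.jpg').size);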
I do agree there's a lot of room for improvement in the html/css/js combination, though. They were designed for far less dynamic interfaces than those we're trying to build today.
The CSS model in particular is very conceptually complicated for a layout system. I say that having used it for 13 years.
You're right: replacing HTML with protocol buffers wouldn't do much to make the web smoother. HTML is immediately parsed into a tree stored in memory, and the initial parsing overhead is negligible compared to asset load times (on this site, for example, Chrome reports parsing took ~0.4ms, whereas loading the tiny upvote arrow took a whopping 45ms).

Once loaded, pages wouldn't be any smoother than they are today if they used a binary encoding: regardless of whether you ship trees as binary or text, after the initial parse the in-memory data structures used by the browser will be essentially the same. GZIP is fairly efficient at encoding HTML tags, and protocol buffers (or any other binary format) would have to encode all of the non-tag text on the page anyway. The gains from binary formats there are minimal.
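You can get rough numbers for your own pages from the devtools console with the standard Navigation and Resource Timing APIs; domInteractive minus responseEnd is only a crude proxy for parse time (it also includes synchronous scripts), but it's enough to show the proportions:

    var t = performance.timing;
    console.log('parse + DOM build (ms):', t.domInteractive - t.responseEnd);

    // The five slowest subresources on the page, for comparison.
    performance.getEntriesByType('resource')
      .sort(function (a, b) { return b.duration - a.duration; })
      .slice(0, 5)
      .forEach(function (r) {
        console.log(Math.round(r.duration) + 'ms', r.name);
      });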
Controlling browser prefetching and load ordering would do much more than rewriting to use a binary format, as would alternative layout options, as would better image formats, as would more performant VMs (e.g. better asm.js support), as would... A near-endless number of things. Text vs. binary for the initial document tree doesn't make an appreciable difference.
The performance and memory bottleneck on the web is not, as is commonly believed, javascript or html.
It's the increasingly complicated DOM and the CSS3 layout model.
Web apps go slow, and this gets blamed on javascript because that's the language you happen to be writing in (or wronging in). But the slowness usually comes from the constant triggering and retriggering of giant, byzantine relayout and compositing algorithms by what you might think is reasonably written code.
It's THAT problem that Facebook's React library is aimed at... fixing? No, reducing. Write reasonable JS, and let the library optimise DOM interactions.
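A tiny illustration of the churn in question (the .box class and the +10px are made up; the read/write interleaving is the point):

    // Thrashing: each iteration reads offsetWidth after the previous
    // iteration's style write, forcing a synchronous relayout every time.
    var boxes = document.querySelectorAll('.box');
    for (var i = 0; i < boxes.length; i++) {
      boxes[i].style.width = (boxes[i].offsetWidth + 10) + 'px';
    }

    // Batched: all reads first, then all writes, so layout is recomputed
    // once. React gets a similar effect by diffing a virtual tree and
    // applying the DOM writes in one pass.
    var widths = [];
    for (var j = 0; j < boxes.length; j++) {
      widths.push(boxes[j].offsetWidth);
    }
    for (var k = 0; k < boxes.length; k++) {
      boxes[k].style.width = (widths[k] + 10) + 'px';
    }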
Every new feature of HTML5 and CSS adds some weight to those enormous piles of sand the browser has to shift around.
Binary formats would not fix that. Bytecodes wouldn't fix that. Different languages wouldn't fix that. Javascript is fine. It's the DOM and CSS that need to be fixed.
Is it as fast on low-end to mid-priced smartphones? HN is much less complex than other websites you can visit; it's a very minimal design.
Even if you compress it, how costly is it really to parse a big html page? How many CPU cycles do you need, depending on the size of the page? Couldn't you optimize a webpage by removing unnecessary tags, by compiling or pre-parsing it?
I'm talking about parsing performance though, not the size of one page.
The web as it exists today would never have happened in binary form. Everything that's terrible about HTML/HTTP/js, such as tag soup, liberal acceptance of data, and being text-based, has been an avenue for evolving the platform to its current state.
Compiling HTML, if done properly, would effectively freeze the design permanently.
Backwards compatibility is not the issue; forwards compatibility is. The entire evolution of HTML is based on forward compatibility and progressive enhancement.
Many modern HTML features exist because old parsers were extremely forgiving or even buggy. Even something as simple as the HTML5 doctype isn't a valid SGML doctype expected by HTML 4 browsers. It is a hack which just happens to produce the correct results on older browsers.
Javascript used to have to be placed in HTML comments so that old browsers wouldn't display the source code as content because the earliest browsers had no concept of client side scripting.
Obviously a frozen platform is better to work on -- that's why people continually suggest such things for HTML. Make it compiled! Make it binary! The current situation is far from ideal. But if someone had done that with HTML 1.0 it would have been a lot harder to make it version 5.0.
I agree. I think it's really inertia that keeps us here. People look up to the W3C and the garbage they come up with.
I just saw my coworker using Axure for prototyping and I was impressed - it creates working prototypes with clickable buttons that perform functions and calculations. It's basically what making webpages should be. Except it isn't. Why don't they just make that compilable and distribute a viewer to compete with web browsers? Inertia.
> Except it isn't. Why don't they just make that compilable and distribute a viewer to compete with web browsers? Inertia.
That's been done - with Flash, for example.
Axure is suitable for building prototypes. It's not a replacement for the arbitrarily flexible set of visual, semantic and development tools that HTML, CSS and Javascript offer.