Computers Aren’t Fast Anymore (2013) (evincarofautumn.blogspot.com)
97 points by buremba on Aug 28, 2014 | 110 comments



I say this as a web developer: web apps are worse than native desktop apps in almost every way. They are slower, less usable, less accessible, less integrated with other software, and harder to develop (although because most web developers never wrote desktop apps, they don't realize how pitiful their tools are).

This is all true (or at least debatable), so why are so many apps web apps? Well, in my opinion, it's this: they are easier to deploy, easier to adopt, easier to bill for, and much harder to pirate. Web apps are worse for end users and developers, but better for everyone else: the people who sell them, the people who buy them, the people who deploy them, and basically anyone who holds purchasing power.


You're overlooking the fundamental quality of the web and why a lowly document format and simple protocol eventually became a poor man's GUI platform that has been slowly rolling forward and crushing the desktop app market in its wake.

The web is a fully cross-platform/cross-device, installation-free, and open-standards-based platform.

Although traditional GUI toolkits can chip away at different aspects of this, it is actually a much more intractable problem for them to solve than the aspects you cite native desktop apps as being better at. For example, interoperability between web apps has gotten steadily better as HTTP APIs have evolved, and today's generation of web apps often makes interoperability a selling point (Slack is a good example). To a traditional single-platform native developer, the mechanism for this looks crufty and ill-defined, but that is the price you pay for a truly open standard not controlled by a monolithic entity. For instance, for all the beauty and elegance of Microsoft Visual Studio, there is no way they can bring that experience to OS X and Linux, let alone all the mobile platforms.

Native app developers have been wringing their hands over the "inferiority" of the web for 15 years now, but all the while ignoring that the web does something which no technology has ever done in terms of ubiquity. And what enabled this was the trojan horse of HTML/HTTP's initial simplicity. Hell, the simple HTML documents for which the format is ideally suited still outnumber apps by an order of magnitude, but the beauty is you have one platform that can support everything on the complexity continuum from the simplest documents all the way up to sophisticated business apps.


Agreed, especially about integration. Many web apps expose their functionality with RESTful JSON APIs or similar. It seems to me that integrating data & functionality across web apps is far more common than in the desktop setting. I've never built a desktop app, but just from a user's perspective, most of my desktop apps are isolated pieces of functionality while web apps are entangled in the greater web community.

My background is mostly web app development, and when I started I was using open technologies such as PHP and NodeJS. This summer I'm doing some work in Visual Studio with a C# .NET MVC 4 web app. Visual Studio is a slick interface _mostly_ with some great conveniences _usually_, but overall it feels suffocating compared to slapping together random tangles of open web technologies. I like the messy open standards...


You can even compare web standards to the most ubiquitous formats used in the most ubiquitous desktop programs. MS Office formats still cause people problems when they try to open complex files in older versions of Office or compatible suites. Whereas if data is stored as JSON, you can be sure that anybody will be able to access the data correctly in any of the thousands of apps that handle JSON, and they'll be able to handle it indefinitely.


I don't often agree with Richard Stallman, but ...

http://www.gnu.org/philosophy/javascript-trap.html

http://www.theguardian.com/technology/2008/sep/29/cloud.comp...

The TLDR is simple: web apps make Unisys, Microsoft, Apple, ... look like heroes of Freedom, even during their most anticompetitive days.

The worst behaviour Microsoft ever exhibited is now standard practice. That's what web apps have gotten us.

That, of course, is in addition to the fact that saying the tools suck does not quite do justice to just how bad web development tools are.


I do often agree with Richard Stallman, but...

These are problems with particular web apps, not web apps in general. I host lots of web apps on my own servers, and they are just as free as any software, but I can access them anywhere from any device. In cases where web apps are non-free, I would have to say that's still better than having non-free software installed locally. Web browsers do quite a bit to keep apps in jails where they can't access and affect the rest of the system the way a local program can.


I disagree. The difference being:

locally installed non-free web app

- app: not yours (BUT: communication between the author and the app is impossible if you want it to be)

- data: yours (meaning you can delete it)

remotely installed non-free app

- app: not yours, and you can't prevent the author from updating their app under your feet. And the author can do nearly anything, meaning any encryption on your data is useless.

- data: not yours (meaning the author can read, change, and delete it, and you CANNOT unless the author lets you). And anyone with a global root certificate (like Saudi Arabia, dozens of companies that have committed breaches of trust, ...) can MITM you and gain the author's access to your data.


Non-free web apps definitely come with their own range of problems. Data doesn't need to be out of your control though. JavaScript is usually run locally, so there are lots of calculations being performed by your computer before anything reaches the remote server. For example, Mega encrypts data client-side before it is archived online. Just like with regular non-free software, you have to trust what it is doing. The best way to deal with these problems without throwing away non-free software would be to have security functions like encryption performed client-side with free software.
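
For illustration, here's a minimal sketch of that client-side-encryption idea using the standard Web Crypto API (the "/upload" endpoint is made up, and a real app would also need to export the key and keep it, plus the IV, somewhere the user controls):

    // Minimal sketch: encrypt in the browser, ship only ciphertext.
    // The "/upload" endpoint is hypothetical.
    async function encryptAndUpload(plaintext) {
      const key = await crypto.subtle.generateKey(
        { name: "AES-GCM", length: 256 }, true, ["encrypt", "decrypt"]);
      const iv = crypto.getRandomValues(new Uint8Array(12));
      const ciphertext = await crypto.subtle.encrypt(
        { name: "AES-GCM", iv: iv }, key, new TextEncoder().encode(plaintext));
      // Only the ciphertext leaves the machine; the key stays local.
      await fetch("/upload", { method: "POST", body: ciphertext });
      return { key: key, iv: iv };
    }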

I agree that server-run non-free software is neither safe nor private, but I still believe that this is the flaw of particular programs, not web-based software in general. It only emphasizes the need for further development of open standards in web apps.

But the ability to use a cross-platform browser as a universal client and run software that is built on the advantages of networking is a huge bonus for software in general. Free software just needs to catch up in a few areas, but in general it is dominating the backbone of the web. Now we just need to push that freedom forward to the user.


> For example, Mega encrypts data client-side before it is archived online. Just like with regular non-free software, you have to trust what it is doing.

This is, sadly, not true at all. You have to trust

1) that the actual author of the site is playing fair

2) that you are not being MITM-attacked by anyone on this list [1]. Note that 3 organisations on this list are known to have issued false certificates with the express purpose of stealing login credentials. They did this by sending through "amended" login JavaScript bundles.

> I agree that server-run non-free software is neither safe nor private, but I still believe that this is the flaw of particular programs, not web-based software in general. It only emphasizes what a need there is for further development of open standards in web apps.

No. Web apps can be replaced by malicious software every time you use them, and there is nothing you can do to prevent this. It is a fundamental design flaw of web-based systems. And, of course, "cert pinning" simply means that a few organisations (Google, Facebook) get isolated from a few kinds of attacks.

The flaw is that control is placed entirely in the hands of the remote side. Needless to say, this is not secure.

I don't get where this idea of open standards being the solution to privacy problems comes from. Cookies are an open standard, the web is an open standard, TPMs are an open standard, the SSL certificate chain principle is an open standard. Hell, Microsoft Palladium is an open standard. All are complete disasters for privacy and freedom.

> But the ability to use a cross-platform browser as a universal client and run software that is built on the advantages of networking is a huge bonus for software in general. Free software just needs to catch up in a few areas, but in general it is dominating the backbone of the web. Now we just need to push that freedom forward to the user.

I disagree. The web has brought back the "freedoms" of the mainframe era, only with a much bigger dependency on the mainframe system. Mainframes, in many cases, also ran free software. Can you claim with a straight face that a non-root account on a mainframe system is in any way free and private?

If you don't decide what software runs on your machine, like on the web, you have ZERO security guarantees. Zero. Nothing, nada, zilch, ... no matter how secure anything built on top of that is. I don't get why this is even the slightest bit controversial.

[1] https://www.mozilla.org/en-US/about/governance/policies/secu...


You bring up legitimate problems, but again, they aren't unique to web apps. Physical computers can be compromised too. They can be compromised in manufacturing before they even reach you. There's never a guarantee of security, just trust that you are using a secure product.

Improving the trust model has almost nothing to do with how much of the computer you have administrative access to. Secure computing means trusting those who build and maintain your computers and the software that runs on them. It's less possible than ever to do everything yourself.

There are big privacy and security advances happening because people have lost so much control of their data. People are trusting others more than ever with control of their computers, and that means certain demands for trustworthiness that didn't manifest when people felt better about their data because they knew where it was physically stored.

Moving the easiest point of attack on a system to the external network just means we have to be more explicit about what we do with each other's data. We have to learn to trust each other, and that means developing systems that are transparent, auditable, and free, but it also means developing cultures that promote trust and proof of trustworthiness, because that's where real security will come from.


I'll just say this: your data is not private. Take a divorce proceeding (which is a CIVIL proceeding) from the last 5 years. Press CTRL-F, "facebook", and recoil in horror.

Basically all your cloud data will be used against you in any civil dispute in the US. So remember when you use web apps: anything you type in there is accessible to anybody who enters into a serious court case with you.

Another example: any Office 365 document (esp. spreadsheets) WILL be read by the IRS if they ever decide to sue you (and you'll pay the wage of the person doing it, to make matters worse, whether or not they find any wrongdoing). Again, the evidence is plain to see in court transcripts.

And, lastly, sometimes your accounts will be compromised in petty legal disputes.

Therefore my policy is:

1) As Microsoft has publicly demonstrated, they will use information stored in your Hotmail account to take action against you. If you work for a company that has a cloud platform, or a company that has a significant relationship with one of the cloud platform companies, you're taking unacceptable risks.

2) Any dollar sign in any email to me will immediately result in dead silence; I'll call you up and warn you never to do that again. If it's important enough, I'll call. And if it's really important, I'll drop by. Both kinds of interaction have vastly superior legal protection.

3) I will NEVER negotiate or store any contract over email, not even my freaking cell phone bill. I have them on my (encrypted) hard drive, of course, even indexed. But keeping contracts on online services is just stupid.

Note that this behaviour is NOT illegal: the purpose here is to safeguard my personal information, which is a normal thing to do that is in fact encouraged by the relevant departments. I am trying to hide personal information from everybody and everything, which is my right.


Yes, you have that right, but I still think we're talking about different things. When you're talking about web apps, you're talking about apps hosted by Microsoft, Google, etc. I'm saying that those have their own issues, but the issues are issues with Microsoft and Google, not server-hosted applications made with HTML5, JavaScript, and PHP.

The right way to do web apps is to have something like a Debian FreedomBox, where you have your own server running free software sitting in your living room and you can access it from anywhere. Another pretty good option is to buy hosting from someone you trust with your data and run your server in their data center, preferably encrypting your data client-side before it's sent to the data center. These privacy issues you mention with Microsoft are due to using their particular implementation of web apps.


And many non-web apps happily use RESTful JSON APIs too.


This seems very biased.

Web developers have been "wringing their hands" for the last 5 years, swearing blind that any day now everyone's going to give up on native smartphone app development and go HTML5.

All the web has managed to prove is how badly MS cocked up app deployment on Windows, nothing more. The web 'won' against one platform: Windows deployment, MSIs, and the rubbish they've been putting out with WPF and "one click" deployment. Smartphones have shown us how the web actually doesn't compete very well against native once a decent deployment solution is in place.


I agree with everything you've said (with the exception of speed in many cases), and I would add: easier to maintain. Even supporting multiple browsers is NOTHING in comparison to supporting millions of potential OS/machine configuration installs.

That, and the fact that your app can be available everywhere without having to have it on every machine you want to use it on.

The 'speed' thing is mostly because we have so many inexperienced programmers writing web apps in the world - it's easy to publish these days, but I don't think there is that much focus on making your app run really well; the focus is mostly on UI. For example, you can use Bootstrap and it handles a lot of UI, but most coders don't know what that UI framework is actually doing. Not slamming Bootstrap at all, just using it as an example that many people would know.


I never had much trouble supporting those 'millions of potential OS/machine configurations' but keeping a web-app on the rails for multiple browsers + mobile is the stuff of nightmares. It is almost as if they break things on purpose.


They are "slower" in that actions take longer, mostly because so much data has to be serialized, encoded, shipped long distances, decoded and deserialized before rendering, but they're "faster" in that updates happen instantly and transparently.

Desktop software used to be very quick to respond locally, but updates would take weeks, months, or even years to get pushed out. Some applications, once sold, would never, ever get an update. You'd just buy the new version whenever that came out.


Flip side of that last one, sometimes you would even buy a program, and it would still work long after the company that wrote it went out of business. These days, when a company pulls the plug on a web app, you are SOL.


I personally prefer the latter.

When you buy a license for a desktop application, the price you pay includes tech support and often times future upgrades. When the developer goes out of business, you lose out on all of that. In contrast, the vast majority of web apps are on a subscription model - as long as you continue to pay, you're getting your money's full worth. Sure, it can be a bit disruptive when the company pulls the plug, but in most cases there are reasonable alternatives that are both easy to discover and easy to start using (usually within the span of minutes).


Which is, btw, another incentive for web companies: it makes large companies that decide to build a web product into their operations willing to pay a lot more to keep the web company alive.


I'd take a software-as-a-service solution that I can replicate the data from in an open standard format like JSON or XML over a closed-source application with an impenetrable binary file format any day.


That seems like a false dichotomy. There are SaaS applications with no data export option, and there are closed-source locally installed applications that use textual data formats (e.g. XML), if only as a matter of convenience for their developers.


I mean this by way of comparison to applications that were distributed on floppy such as WordPerfect, dBase, FoxPro and so on, where the data format was proprietary.

It's only relatively recently that we've seen open-source and open-data applications. These are arguably better but relatively rare.


Unless it's Free Software, like Newsbeuter.


I think you're right, and I like web applications and think they are a good avenue. However, looking at mobile apps and how updates work there, they are a nice middle ground. No doubt it's more essential as we're talking about phones, not desktops. But I think this sort of computing has come back in some ways.


One advantage of web apps is that they don't require an installation process. It's easier to convince someone to use your software if all they have to do is visit a link. But as the author said, your code is now running at a layer far from the hardware.


And running in a sandbox, they are (mostly) unable to harm the user's computer, which is a big part of why it's easier to convince someone to use it.


Other advantages for the user:

- They can run 24/7 without having to keep your machines on. You can fix that by having a native app with a server backend, but then you lose some of the advantages of native apps.

- They're cross-platform. For Linux users, but even for OS X users, how many times have we seen a Windows app that is a pain/impossible to install even on Wine?

- They run without admin privileges (e.g. work machines) and even without permission to execute unknown binaries.

- They're less likely to infect your system.

- The frontend is usually more customizable (e.g. it's much harder to make something like RES[1] for a native app)

[1] http://redditenhancementsuite.com/


It amazes me how this is always overlooked: why are web apps used over desktop apps?

Because it's easier for the end user.

Have you ever tried to get a user to install a piece of software? It is hard. They don't know what is going on: is it going to harm their system? Is it a virus? What are these terms I'm agreeing to? Whereas with a web app, they go to the site and BOOM, it's done, loaded, and ready to go.

Having to download and install a program makes a lot of things a non-starter for many users. A lot of us struggle with that concept due to a lack of empathy for the general computer user. I have downloaded three programs today alone and didn't think twice about it. But that is just not going to happen for the vast majority of users, and it is one of the areas where we (software developers in general) have a hard time empathizing with our end users.


Sure would be better if desktop apps could be sandboxed like web apps. Installation would be less risky, with fewer questions asked.


The Windows world needs a standard application sandbox for Win32 applications, like iOS, Android, Linux (Docker) and FreeBSD (jails) have. Windows NT has had the functionality built in since NT 3.1, but the Win32 shell doesn't expose it.


There are programs that do this, such as Sandboxie [0], or even a VM. But none of them are very user-friendly for computer novices.

[0]: http://www.sandboxie.com/


Mac OS X has done this since the Mac App Store was introduced. Doesn't Windows 8 do it for Market apps?


You know, Gmail was once the web app. Now I don't use it in its web form anymore, but through native clients.


Web apps could be better if, for some X, you could write <script src="/myapp.X" type="text/X">

where X != "javascript". But only Google, MS, Apple, and Mozilla working together can make this happen.
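
To be fair, you can approximate this in user land today, the way text/coffeescript loaders used to work. A rough sketch, where compileToJS is a hypothetical compiler for language "X":

    // Rough sketch of a user-land loader for a non-JS script type,
    // in the spirit of the old text/coffeescript loaders.
    // compileToJS() is a hypothetical compiler for language "X".
    document.querySelectorAll('script[type="text/X"]').forEach(function (tag) {
      fetch(tag.src)
        .then(function (res) { return res.text(); })
        .then(function (source) {
          var runner = document.createElement("script");
          runner.textContent = compileToJS(source); // emit plain JS
          document.head.appendChild(runner);        // browser runs it
        });
    });

Of course that still bottoms out in JavaScript, which is exactly the point: only the browser vendors can make X a first-class citizen.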


All well and good, but which of my desktops would you target?

I'm typing this on Android 4.4, though I might have picked my 4.1 tablet. Or my Debian laptop running 64-bit Xfce. Or my Debian ARM netbook running Plan 9 from User Space. Or I might be on a VNC session to my virtual servers, either Plan 9 or a CentOS VM. Or I might be at my Windows 7 desktop. All of those are happy with web apps.


I think I have achieved the same ease of development as the very old FoxPro, that is, having one form for both creating and editing an item, with the combination of Laravel and AngularJS.

But of course much more is needed. FoxPro is ancient, forgotten tech.


Web apps are better for developers. Write once, run on multiple OSes/browsers. They're easier to maintain; you don't need to go through an approval process or worry about your users running a version that's 3 months behind, meaning you can iterate a lot faster.


> less usable

In what way?

> less accessible

Nope.

> harder to develop

Not when you're trying to develop for every platform, mobile and desktop.

Your other points still stand.


> In what way?

Slower. Network-bound.

> Nope.

Require internet connections. Require vendors to stay in business indefinitely.

The other is a matter of taste.


We built tools that allow us to create and distribute applications and information at an incredible pace. The web, and the technologies surrounding it, emerged quickly, with bitter fights (Netscape v. Microsoft) and competing standards. Now we pay the price for that speed of innovation.

Fortunately, things are looking up: As the web stabilizes as a platform, and browser developers have time to pay back their code debt, applications will respond more quickly. Similarly, we're starting to see better toolkits like React and Om, which treat the browser more as an application platform and less like a document viewer.

I wholeheartedly agree with the sentiment behind this post. We developed the web quickly; as the web stabilizes, we can step back and optimize the slow parts. And we are: HTTP 2.0, Servo, WebGL, asm.js, better JavaScript engines, better rendering engines, and better applications.

The browser isn't inherently a slow layer of abstraction; we just haven't refined the technology to a point where it's simple to create performant webpages. I think we're on the right track, but progress is slow when you're trying to upgrade everyone in the world.


We're still in the middle of a shift. The killer feature of the internet is that you can give everyone the same version of your software at the same time, and even give them all different versions (eg A/B test buckets) at will. That was why we threw out decades of infrastructure and rewrote it all again in the browser. We're still learning how to make that fast. Windows 2000, believe it or not, was a kind of apotheosis of desktop programming. By that time the era was closing.

Maybe we'll go back, but I kind of doubt it.


It sounds like all we really ever needed was a universal package manager that could auto update your app.

I'd rather a native app speak the network than my network-speaking browser try to be a native app.


You still need to build native versions for every platform your app will ever run on. If you had that, the package manager would be easy.


That's not so bad - all we need is some kind of consistent abstraction layer we can target on every platform/OS.

wait.

right.

I see the problem.



> That's not so bad - all we need is some kind of consistent abstraction layer we can target on every platform/OS.

Qt works fine for me!


Like the back-end of a compiler? JVM or LLVM?


Or you could use JVM apps. Distributing LLVM intermediate representation code and doing the final compile on end hardware could also work.

Building native versions on the developing end is a foolish idea not suitable for programs in general, and it is time we abandoned the approach. It made sense at the time, but much time has passed since then. Some software might benefit from the increased obfuscation, but other than that it's just shit.


Unfortunately, LLVM IR is platform-specific: http://llvm.org/docs/FAQ.html#can-i-compile-c-or-c-code-to-p...

An IR-esque approach is good, just not LLVM. Then you're back to something like the JVM.


Google tried to make a platform-independent LLVM IR [1], but they ended up needing to specify a bunch of additional things, like a virtual 32-bit processor target, their own ABI, their own API (Pepper), and to prevent things like assembly language instructions and intrinsics (like SIMD) from passing into the bitstream. It also got locked at one particular LLVM version. This basically ends up bringing it to the level of asm.js, except with less JavaScript API interoperability.

[1] https://developer.chrome.com/native-client/reference/pnacl-b...


LLVM helps, but it's not nearly enough. You need a standard library including networking and a GUI at least. (Edit: or statically compile everything.) Java probably would have worked, but as I recall, Microsoft killed it.


Yes, that's true, it can't be done tomorrow, but I think it's the right approach. I'm not very familiar with Qt, but turning the more essential parts of it into part of an LLVM standard library shouldn't take that many years, should it? And don't most standard libraries have networking now?

I'm curious how Microsoft killed Java. I assume you mean Java on the desktop?


Yeah, I mean "killed" specifically for writing web apps.

And by "library" maybe I meant "runtime"? I mean you just have to send the HTML and JS to the client, and everything else for running the app is already there. I guess now you could send a Python file written to use Qt and it would work equivalently on many platforms.


Ah, yeah, well, as long as Oracle is the steward of the most popular JVM I think we shouldn't let it run non-trusted code.

I was thinking more desktop (non-web) apps, really.


I'm OK with that. What does cross-platform even mean? Windows, Linux, OS X? Maybe Android and iOS?

It's still easier than rewriting all of the development tools, libraries, etc. in JavaScript and for the browser, and your app will be responsive and fast.


A single HTML document will load on Win95 through Win8, old Mac systems, OS X on PPC and Intel, a huge range of software and hardware running flavors of Unix, and every smartphone OS ever, and be styled "natively" for every one of them (flat, shaded, rounded, even textured). That's only if you're coding for lowest-common-denominator features, but even that isn't possible for most other kinds of distributed files.

If you want to get fancy, and you're willing to sacrifice compatibility with less-cool platforms, JS is generally within 2x of native performance, which is plenty fast enough to feel "responsive". http://asmjs.org/faq.html
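
(For the curious, that near-native speed comes from a statically-typable subset of JS. A minimal sketch of the asm.js coercion style, with made-up module and function names:)

    // Minimal asm.js-style module: the "|0" coercions mark values as
    // 32-bit integers, so the engine can compile this to machine code
    // ahead of time instead of guessing types at runtime.
    function FastMath(stdlib, foreign, heap) {
      "use asm";
      function add(a, b) {
        a = a | 0;              // parameter is an int
        b = b | 0;
        return (a + b) | 0;     // result is an int
      }
      return { add: add };
    }
    // Usage: FastMath(window).add(2, 3) === 5, in any browser --
    // engines without asm.js support just run it as ordinary JS.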


We also needed tighter security for running untrusted code. And a way to run small snippets of code quickly.

Tight security can be solved by some combination of sandboxing that takes a performance hit and a web of trust: computers that have already run the program can confirm that it is not malicious, and then the app can be run without the performance hit. This is more or less the model of Apple's App Store, except it's not a web of trust but a centralized authority, and there is no way to run untrusted code.

And it makes a lot of sense. Most apps that are worth using will be used often; there's no reason they should take a constant performance hit after trust has been established.


We could have stub apps that download the latest code from the web (or use a locally cached copy), compile it for the local machine, and pop up a native GUI. The local app could communicate with a server in the same way JavaScript client code does now. No need for the web browser. It just needs to be made easy and truly cross-platform, but nobody has written the infrastructure and programming languages needed to support this in a sane, manageable and unified way (yet).


You also need the sandboxing and zero-install properties of web software, though.

Imagine if, say, we had a docker:// URL scheme, where you could click a link and it'd stream you a container image that would then start an ephemeral instance of itself with a linked persistent data container, taking the rest of the URL as its command line.


I'd rather have good software that just works. Don't fix it if it's not broken. Updates.. what are they? If it's not broken, it doesn't need updates. If it's broken, I probably won't be using it in the first place. Test before you release. Don't outsource testing to your users.

Just like in the 90s...

Web developers might think what they get to do is fun, but as a user the idea that software I use is going to change overnight without my permission is crazy. I shy away from web software. (Also, all of it tends to run far too slow on my system anyway)


... and for everyone to be on the same OS configured the same way. Which, given the current business climate, is non-negotiably Windows.


We are in the process of "going back."

Mobile devices are designed to host a new generation of apps. The Web is great for expedient universal access, but it won't be a universal multi-implementation runtime that works better than platform-native runtimes. The Web is a pre-touch legacy platform as much as Windows is. Take any tablet and try using it as a touch Web device without using native apps to get a feeling for the scope of the problem.


> Hello, doctor. I am only 22 years old, but I think computers used to be faster.

As someone who has been using a computer almost daily since the mid 80s, I don't relate to this whatsoever. I remember waiting minutes for programs to load in the 80s, and many seconds even in the late 90s. Even fast DOS apps would hang for seconds to save something to disk.

I remember even early, simple versions of Photoshop taking a minute to load and then forever to run simple effects.. now it's about 2 seconds to load from scratch (and then milliseconds once in cache) and almost everything I do in the app feels instant despite being significantly more complex.

Even software operating way lower down the abstraction tree decades ago felt and acted slower to me than most JavaScript apps today. Things are only getting better IMHO; about the only limit on today's apps seems to be network speed, which in some cases is not much slower than disk speeds just over a decade ago.


Some stuff feels faster to me, especially compute-heavy stuff like you mention (waiting forever for a Photoshop filter to complete while it chews up your RAM and starts swapping). But quite a few things feel slower, particularly anything where latency for normal, lightweight interactions is the most noticeable aspect. For example Gmail feels quite sluggish to me compared to what I used to use for years, Pegasus Mail, at least for basic operations like opening a mail, opening a reply window, etc. (Gmail probably does scale to large number of messages better, and its search is faster). Anything that involves serious text manipulation in a browser also feels just high-latency enough to be painful, at least on my connection; I spent much of today writing in writeLaTeX and the experience was somewhat frustrating.

It's not unprecedented slowness, but a lot of things about webapps remind me of running remote X apps on an underpowered X terminal connected to a powerful server: compute-heavy stuff is generally handled nicely, but clicking on anything at all has a little bit of latency attached to it.


I remember when there was a hack you could enable so that, when dragging windows, the whole window contents would move instead of just a black-and-white outline. If the window was bigger than a quarter of the screen it would lag to death, but it was so cool.

Now we have web page content re-laying out live at 60fps as you resize the window, and people are saying computers are slower?


I agree that everything has gotten faster than 30, 20, or even 10 years ago.

But I am a little annoyed in the past year or two that web pages seem to have gotten slower. I suspect with the explosion in large complicated web pages, rendering overhead outpaced hardware development for a couple years.


Hang on, I just need to load my browser from this C60 cassette tape.


So right about everything. Web apps are often unresponsive and temperamental; only their mothers could love them. Mouse responses are idiosyncratic (does the pointer change to a wait cursor when waiting for the server? almost never). Pages change drastically with a simple button press, instead of fluidly updating. Pages get drawn in pieces, so you have to check the status bar or tab to see if they're done yet.

Any of these problems in a game, for instance, would keep it from releasing - Quality Assurance would never pass it! Yet web apps are given a pass on everything. Why? Because there's no simple way to fix them - the tools are too far from the behavior on-screen. You're not in control of your mouse, or your widget behaviors, or your server responsiveness.


It is utterly stunning how badly Sun screwed the pooch with Java.

Java could have won the web-app war, with a little more foresight. But a terrible GUI framework, terrible updater, terrible deployment, etc. killed it.

They knew exactly what they needed to make back when the idea of web-apps was still nascent - you can see the scaffolds laid in the history of Java. A zero-installation platform for cross-platform GUI apps.

But they just made so many painful errors that their platform was unusable.


Java got such a bad reputation for slowness when it came out. And it was rightly deserved. The VM took forever to start up, especially on a consumer machine. Download speeds were still too slow. Performance was just terrible from start to finish.


They kept saying "it runs fast" which completely missed the point. The GUI toolkit was painfully slow, as was the JVM start-up. If they'd compromised on run-time performance in exchange for fast start-up and they'd re-thought the terrible GUI, it could have worked.


To be fair, executing today's Javascript web apps on that hardware would be much worse.


That's not true. Gmail was always JS-heavy and ran fine back in IE 6 and Firebird 0.7 (now Firefox) in 2004/05.

Whereas Java 1.4 consumed too much memory, and every now and then a lot of CPU cycles for its garbage collection. 512 MB of RAM was the minimum for the Java VM to execute a single Java app, and it would consume almost all of that memory.


"web-app war" threw me off for a second, because the file type for java server applications is .war for Web Application aRchive.


The problem of software feeling slow even though hardware gets faster is not limited to web applications. Wirth's law (http://en.wikipedia.org/wiki/Wirth%27s_law) precedes the widespread use of the Web as an application platform. Also, Stanislav Datskovskiy's rant "Going Nowhere Really Fast" (http://www.loper-os.org/?p=300) doesn't even mention the browser. So the web stack is not the root of the problem.


Yes, the web layer just happens to be the one people notice slowing things down right now.

Go back to the early '90s and you will find lots of complaints about how moving to Windows and a GUI was a backwards step from character-mode DOS because of the extra layers Windows added, the sluggish UI response, the general slowness of Windows apps, and the increased complexity and difficulty of developing for Windows. Of course, despite this, there were lots of obvious advantages of moving to Windows from DOS. Eventually the slowdown added by Windows abstractions became pretty much a non-issue, shortly before the move to web development gained critical mass and started the cycle all over again.


No one is pointing out that software today is doing more. Did any apps in the old days run spellchequers constantly in every window? No. Did apps in the old days automatically gentle, genevieve, generate typeahead suggestions in nearly every window? No. Did apps in the old days check for new versions on startup and in the background? No. (well okay one or two of them did).

I'm a cranky EE with a background in routers and even I don't blame "teh web" for making apps clunky. It's just that we're doing more, more colors, more background stuff, more compositing, more buttons on GUIs, more simultaneous file formats - it used to be every program had approximately ONE file format it supported, maybe two. What about running with many different library stacks/APIs underneath? All that shit takes time and memory.

And we are in a transitional period. We're maximalists with respect to the amount of features and compatibility we cram into every app (and platform), but maybe soon, hopefully, eventually, evolution will cool a bit and things will start to run as tightly again as Deluxe Paint III on the Amiga 500.


They may be doing more, but they're not solving more problems or adding much more value or productivity.

Most new 'apps' just re-brand the wheel and make it more 'fashionable'.

I used to blame the web, but now the same 'push software out to make monies fast' mentality is even hitting desktop software, which is far more bug-ridden.

Software engineering is largely dead. We're in a world of throwing apps at the wall to see what generates money as quick as possible and move on to the next thing.

The web (and mobile devices) will kill personal computing, as it allows corporations to take all of the control away. Sure, you can customise your UI, but it's already been dumbed down.

It's evolution I suppose. I don't have to like it.


> I write this very psychiatric confessional blog post within a WYSIWYG editor running inside a web browser. (Sorry, doctor, but you’re just a figment of my overactive rhetorical imagination.) In the browser lies the problem. Almost all of the erstwhile proper applications that I use on a daily basis are now web applications.

Interestingly enough, this is one reason I still write blog posts in TextMate and longer documents in Word (laugh all you want, but Word for OS X on an SSD is actually very fast). I still use Mail.app, which is very fast for composition and searchable even offline. I'm running 10.6 instead of 10.9 or 10.10 (can't remember the latest number); allegedly the most recent versions are slow on the hardware I'm using (http://jseliger.wordpress.com/2011/07/20/mac-os-10-7-is-out-...).

Many things can still be done on fast hardware. Most people choose not to do them, however. Do users drive developers or do developers drive users?


I remember a time when Java tried to be the alternative to native applications, when the power of Java was the bytecode that promised write once, run anywhere. This was about the time the web was becoming big, and I guess in the battle between HTML+JS and Java... well, Java lost. I wish someone was still pursuing that dream though. Oh, wait, Unity.

In any case the browser is intrinsically easier to operate than a package manager. I mean really, it can go from displaying a simple static content site with no formatting or CSS all the way to impressively complex web-apps like Google Docs. Not to mention the work of Three.js and the WebGL community. And all I have to provide is a single well-formatted string of no more than 2,083 characters.


Rants that boil down to implementation concerns always come across as whiny. Web apps aren't fundamentally slow; we're just ironing out the details of this free, basically ubiquitous, basically compatible platform that no one is forcing you to use.


>Web apps aren't fundamentally slow

Computers aren't fundamentally slow, either. The point is that they are slow in practice.


We're actually undergoing massive and dramatic improvements in the tooling, speed, and capabilities of web apps. Even the gap between now and two years ago is startling.

If you haven't given any of the newest tools a spin, you may not have noticed yet. For me, we crossed a tipping point in the past several months where an environment like Ember (with ember-cli as the toolchain) is actually nicer in every way than writing a more traditional server-rendered app backed by Rails or equivalent. And I say that as someone who suffered through the early days when there were serious growing pains as the whole ecosystem matured.

Now, the experience for both developer and end-user is a major improvement. It makes you appreciate how crazy and convoluted traditional web applications are, given the way they need to thread state across pages. Rails is a testament to how far you can go despite those crazy constraints. But once those constraints are gone and you're writing a real, persistent client that's only loosely coupled with the server, you achieve a new clarity.

Up until very recently, the tooling all sucked hard, and so the net benefit wasn't very compelling. That has changed, and even if you're still not impressed just give it another year, because the improvements are still coming fast.


I completely understand the author's point, but I always find myself annoyed at these kinds of posts. Maybe it's simply a matter of emphasis. Yes, we have backtracked in so many ways from the speed and functionality of native applications. However, I'd prefer to accept that the web will be the dominant programming platform from now on, and use the capabilities that we had with native applications as inspiration to make the web stack better. Yes, it can be annoying to hear people breathlessly describe the responsiveness of some new JavaScript framework as if it's something that native apps haven't had for ages. However, I don't think that should make us want to go back. We need to work hard to bring that speed and functionality to the web while taking advantage of the unbelievable capabilities that the web offers that we didn't have before.

Besides, the browser as a user interface is only going to last a little while longer. Have you used Janus VR [1]? Have you used it with an Oculus Rift? It is crude. The graphics are simplistic. But it is AMAZING. It represents the future. The metaverse is almost here, and this is a very exciting time.

[1] http://janusvr.com/


The problem of web apps being, in general, less responsive than desktop apps is just one of the many instances of a unifying abstraction slowing things down.

We witnessed that with virtual machines: the industry invented multiple OSes, and it turned out we needed to run different apps on different OSes at the same time, so we invented (process) virtual machines to bridge the gap, like the Java Virtual Machine, or even VMware or Hyper-V.

And then we had different databases, so we built data-access abstraction layers on top of them, making it hard to use the best bits of each database vendor.

And then we invented phones, and had different types of phones, and then again we invented PhoneGap to unify the different phones, and paid the price for that.

Who knows what's next - history is simply going to repeat itself: we create variations, and then we create performance-losing unifying abstractions...

And this trend seems to exist in this industry only; you never see someone build different buildings and then find a way to generalize them into a worse master building.

Just knowing about this trend is rather useless; there is nothing we can do to stop it either. As long as there are multiple big vendors around, it just won't stop.


The browser is the common client app runtime that the JVM/Swing platform always wanted to be.

Native will always feel better, but the browser gives us a way to express cross-platform designs much better/faster/cheaper than most native toolkits do, and far better than any cross-platform toolkit that I've seen.

The tradeoff is performance in a big way.

Think of it this way: the web is like using OnLive or Gaikai to stream a game, and native is like playing a game on your local device. Streaming game performance is inherently worse, but at some point it might be good enough/great for most people, such that the investment in native games on local hardware no longer makes sense for certain games.

I think most people don't realize or think through the fact that they are making that performance tradeoff in exchange for other things.

Maybe people should have that discussion.


It's partly, or even mostly, a problem of trust. Back in the wild days of the internet you just couldn't trust a website to have a lot of control over your computer. You could argue that this is no longer the case since we're slowly building up the 'web of trust', but I think we're not quite there yet. Since there used to be fewer facilities for writing speedy online software (secure offline storage, closer-to-the-metal 3D APIs), people were stuck with the old HTML+JavaScript and never really evolved out of it. It's not the right way to go, but it's a necessary step to develop the infrastructure that'll be necessary for the next big leap.


My big pet peeve is that many mobile apps don't keep any local data; every time you load up the app is like an ST:TNG episode reset, as if nothing happened before. Save the data locally, serve that to me when I load up the app, and only once you've done that, check the backend to see if the data needs to be refreshed. Even though most apps have many clients/endpoints nowadays, most users still access them from only one endpoint, so loading data locally first makes things much faster. The same can be done with web apps and localStorage.
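
A minimal sketch of that cache-then-refresh pattern for a web app (the "/api/items" endpoint and the render callback are made up):

    // Show cached data immediately, then refresh from the backend.
    // "/api/items" and render() are hypothetical.
    function loadItems(render) {
      var cached = localStorage.getItem("items");
      if (cached) {
        render(JSON.parse(cached));          // instant, from the last visit
      }
      fetch("/api/items")                    // then check for fresh data
        .then(function (res) { return res.json(); })
        .then(function (items) {
          localStorage.setItem("items", JSON.stringify(items));
          render(items);                     // re-render with fresh data
        });
    }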


It takes a lot of work to get that right, but I agree with you it is essential to making a quality mobile app.


Most of the apps that run on my Android phone are some of the clunkiest and most unresponsive applications I have ever had the displeasure to use, so I don't think you can place this blame on web apps.

Still, grumpy old non-web developers love to blame their favorite whipping-boy. If pretending that web dev is an inherently poor quality technology makes you feel better, please carry on. Whatever gets you through your day, man. Just please don't start thinking your thoughts are clever or constructive.


Many of your Android apps are web apps that have been wrapped in Cordova :)


Pretending? What bushel are you hiding under? Web apps even on the desktop are demonstrably slow, error-prone and idiosyncratic. The straw-man of a phone (tiny memory, tiny low-power processor) doesn't mean much if my 3GHz 8-hyperthread 16GB desktop with graphics acceleration runs web apps that look like crashy alpha releases.


So who or what is to blame then?

If web applications run like shit and give me a poor user experience compared to native software? I can't blame anyone or anything? Is the web so perfect despite all its flaws?

Oh sure, there is poorly performing "native" software out there too. But in my experience, pretty much all web software performs poorly.


There is a lot of anger in response to my comment, but it raises legitimate points too. Here are the facts as I know them:

1. Poor software can be written using either web or native technology.

2. It is possible to write fast, responsive applications that run in the browser (up to a certain level of complexity). I know because this is what I do for a living, and I am pretty good at it. Sadly, many web developers are not very good at it or simply don't care enough about preventing a slow, clunky experience.

3. Some cutting-edge web tech (WebGL is the best example I can think of) is not yet ready for prime time. I love proofs of concept that demonstrate the possibilities of canvas, but commercial software should not be relying on it yet.

4. The barrier to entry is lower for the web. The double edge of this sword is that web applications are more likely to be built by inexperienced amateurs.

These are real problems, but they do not indicate that small applications will inherently run slowly in a browser. That is caused by poor design and architecture.


> 2. It is possible to write fast, responsive applications that run in the browser (up to a certain level of complexity.) I know because this is what I do for a living, and I am pretty good at it.

But if I tried your web application and found it to be slow, would you blame my computer, my OS, my browser, me, or something else?

I don't want to be angry, but it gets frustrating when things don't work. Things that -- in this day and age -- should be a complete non-problem for pretty much any PC system. Things that ran fine on grandma's old toaster when implemented the way they were fifteen years ago. And I get attacked a lot for complaining about it. Some people just don't see a problem with it when displaying some text and a handful of pictures requires millions of lines of code and a few hundred million bytes of RAM.. code that can completely bog down a 64-bit CPU running at nearly 2 GHz. I'm the victim, blame me. I'm the slave, no freedom for me..

I admit, I am grumpy, perhaps even angry. But it's not directed at you personally; I have no hard feelings against anyone here. Sorry if it came across that way :-)


If you tried my app and found it slow, I'd blame myself. Then I would find the reason it sucked and fix it. I have never found a performance problem in a web app that could not be traced to a mistake and be corrected to work fast.

I am as frustrated with this situation as anyone else, even more so because it is my career. The amount of lousy work being done makes it appear that the web application platform is to blame, when it is all the lousy developers' fault.

Another major problem I didn't mention before is JavaScript libraries. These are a disease that has only one cure: more JavaScript libraries! Pointless JS bloat destroys an app's performance and is in high fashion right now (especially with the recent ubiquity of front-end frameworks.) The use of Angular and Bootstrap should be considered computer crime.


What irritates me about this "soon it will be great" attitude is the utter passivity implied. We are supposed to wait for the Great Browser Makers to bestow upon us:

* performance

* decent tooling

* native capabilities

This has all happened before with Swing, and will all happen again. You won't understand or empathize with this comment until you've had your chain jerked around enough in the past. You probably feel like the pace of progress is so fast and furious that soon we'll fix all these problems. You might feel that the Web is the ultimate platform that cannot be upstaged ever. Except it will be, and it might require a new medium, but it will. All platforms are transient.

In a way, I can't be too mad at the Web because it made me realize I want nothing to do with consumer-grade tech. That side of the industry seems to revel in half-solutions and commoditizing developers.


> In a way, I can't be too mad at the Web because it made me realize I want nothing to do with consumer-grade tech. That side of the industry seems to revel in half-solutions and commoditizing developers.

I had a somewhat similar realization. Actually it made me drop out of high school and completely forget about going for comp-sci and a career in IT. Working in IT is still a possibility but if I do it, I will do it on my own terms (or at least in very good company). In the meanwhile, I'll keep programming as a dear hobby so I can focus on what I think is good and right.


Well, some of the browsers are open source, so in theory, we don't have to be passive; we can help make them great. So I guess I should find an area of Gecko, Blink, or WebKit that needs work, and dive in.


I think it misses the point that we use thousands of computers every day that we connect to through the internet. So although the CPU in your mobile phone is not that fast, the search engine it talks to is very, very fast. So if you take the network effect into account, things have sped up a lot.


It's not just web apps.

I sometimes switch to a tty, and am always surprised how responsive it is compared to my gnome-terminal, which never feels slow on its own. In the tty, characters seem to appear before I even fully press the key.


Another fun thing to do is to use old software on a new box; it is now amazingly fast. We can use an old text editor, or Word, or even Windows XP.


The horrible thing is that the pain of user interface latency never gets any respect, and everything has either: 1. network latency (the "cloud"!), 2. stupid GPU eye-candy, or 3. plain old bad design (everyone should read _Programming Pearls_ and the sequel sometime).


>> We programmed on them in VB6. And let me tell you, for all its downsides, VB6 was screaming fast.

>> These crappy amateur applications were downright speedy.

Compared to what? Visual Basic performance was a joke, even when you compiled it to native code.


This is about user perception of performance: startup time, UI responsiveness, etc. Not raw CPU speed. It did fine on all those counts, even if you'd never let VB do your DSP work for you.


On this topic, Chad Austin's essay "Logic vs. Array Processing" was quite illuminating for me.

http://chadaustin.me/2009/02/logic-vs-array-processing/

Granted, the platform surrounding VB6 arguably had too much hairy logic related to COM and ActiveX, and the fragile system that existed for registering such components before Windows XP introduced side-by-side assemblies. If you really want to pine for the old days, consider Delphi, where you could compile everything into one executable; just map that hunk of code into memory and go.


You wouldn't crunch numbers with it, but it was responsive and fast enough for most purposes, and had simple interfaces to faster native libraries for a lot of the rest.


It was much faster than Python.

smirk


It's just the trust you have that the doctor will invent a pill for every problem you face. But sadly that's not the case anymore, neither in real human life nor in the world of technology.



