
I really, really hate it when people point at a web browser and a modern game engine and then say "Look how sloppy and horrible and inefficient the web is!"

Seriously, just stop for a second and use your head.

First, that game? Probably using precompiled logic--especially as games in the last decade have gotten objectively harder to mod. The browser must deal with any arbitrary code shoved at it, and handle modifications to its scene graph (the DOM) at any time. These modifications may include creating more script code, pulling in networked resources, or any other damn fool thing.

Second, that game is only going to run on a narrow selection of hardware. It's not going to run on a machine from ten years ago, probably. It's not going to run on a machine ten years from now, probably.

Third, that game is built to use files and formats specifically made for itself. It's not dealing with old goofy image formats. It's not dealing with potentially malformed XML documents. It's not dealing with any number of things, because those have been trimmed away and pre-verified.

Fourth, that game is never going to have to scale from a multiple-socket workstation all the way down to a handheld phone or shell script.

It's really silly to point at a hyperoptimized purpose-built tool and claim it is somehow massively better than a platform for distributing massively-varied media and documents.

EDIT:

Downvote away--but first, write a purpose-built pipeline for deferred rendering, and then a real-time app in Angular, for example. If you haven't done both of these things, you probably don't know what you're raging about.



Have an upvote; dissenting opinions should at least be legible.

That said, I really can't agree with you.

Browsers can re-compile their input into whatever intermediate format they desire, subject to the usual space/speed trade-offs. Judging by the space they consume, they use that trade-off rather liberally in the 'speed' direction, and they still run quite slow. I just can't imagine that laying out some text and images in 2D (which I've done, in limited contexts) is that much harder than rendering a 3D scene in software, without a bunch of GPU power doing the heavy lifting (which I've also done, though not recently and definitely not with this kind of detail: roughly at the level of the original 'Doom', with a slightly better lighting model, right around the time it came out).

Games may not have security issues in the input data that they consume when it comes to graphics and such, so that's a valid point, but games do have to run on a wide variety of hardware, and they typically adjust really gracefully.

The one thing I think really differentiates game rendering engines from browsers is that game engines tend to model some physical process, whereas browsers attempt to implement a massive spec that allows any server they choose to contact to send a bunch of bytes whose rendering result is not known ahead of time. So, arguably, the games people have a relatively limited space of possible outputs they can generate, whereas browsers can theoretically display anything, including that game.

That still doesn't excuse them for the crappy performance they deliver; it simply means we've moved too fast in adding layers before getting the lower layers to perform adequately. The original web did not seem sluggish to me: it seemed about as fast as what you could expect from the hardware of the day, whereas the 'modern' web seems (to me) terribly slow and inefficient.

The ratio of content to markup-plus-eye-candy has deteriorated, and I suspect that is part of the answer here (and of course that would not be the browser's fault per se).


Notice that webapps using famo.us actually break the stereotype you described, by using the GPU and a physics engine.


The web doesn't deliver crappy performance, though--apps written by bad developers deliver crappy performance.

The complaints about the abstractions and everything honestly remind me of the complaints about Java, and look how that turned out over the same time period.

As for the engine vs. browser thing--you can totally render a DOM super efficiently if you are sure that it remains relatively static. That's very much, in fact, how the game engines do so well. They have the task of "throw as much as we can in triangle buffers onto the card, and occasionally draw with uniforms set to our desired transform matrices, and let the Z-buffer and pixel shaders sort it out."
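
In browser terms, that fast path looks something like this WebGL sketch (hedged: `gl`, `program`, and `vertexData` are assumed to already exist, attribute setup is omitted, and the `u_mvp` uniform name is made up):

  // Upload geometry once; per-frame work is just a uniform update and a draw.
  gl.useProgram(program);
  const buf = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  gl.bufferData(gl.ARRAY_BUFFER, vertexData, gl.STATIC_DRAW); // static scene
  const mvpLoc = gl.getUniformLocation(program, 'u_mvp');
  function frame(mvpMatrix) {
    gl.uniformMatrix4fv(mvpLoc, false, mvpMatrix); // desired transform matrix
    gl.drawArrays(gl.TRIANGLES, 0, vertexData.length / 3); // 3 floats per vertex
    // ...and the Z-buffer and pixel shaders sort it out.
  }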

By contrast, the DOM may have any number of properties twiddled at any time, and the layout engine is forced to deal with that. A random float or something could cause hundreds and thousands of nodes to reflow.
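
A hedged sketch of that failure mode (the selector is made up; the APIs are standard DOM):

  // One style write plus one layout read, and the engine must reflow
  // synchronously, no matter how many nodes sit under this one.
  const node = document.querySelector('.sidebar'); // hypothetical node
  node.style.cssFloat = 'left';                    // the "random float"
  const h = node.offsetHeight;                     // reading layout forces reflow now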

There is a cost to such flexibility, to be sure, but it's not inefficient for no good reason--the tradeoff was made to lower the barrier to entry and to allow maximum malleability.


> By contrast, the DOM may have any number of properties twiddled at any time, and the layout engine is forced to deal with that. A random float or something could cause hundreds and thousands of nodes to reflow.

So the problem is that the DOM is a bad abstraction for building GUIs, as it is necessarily inefficient.

Desktop apps have been more efficient than web apps forever. The reason is that desktop GUI toolkits were, y'know, actually designed to be GUI toolkits.


Inefficient != bad.

Frankly, I think that the design tools and document model for the web knocks the stuffing out of any native apps, especially if you are trying to make something work quickly or on multiple platforms. I'd take CSS over Qt any day.


Amen. It's like people think the existence of a facade undermines the merit of the underlying architecture. There is no "moving too fast". If you're looking for a curated set of panacea tech, then get off the frontier and wait for it to be taught as a class. The people designing what will eventually be its content need a place to iterate. If you're butthurt about the web dominating app distribution, you can use emscripten and canvas, or circumvent the web entirely and use Steam. The highest-level interfaces of the web aren't trying to be everything for everyone. They're just standardized solutions to common problems. Look at how lodash improved performance relative to native browser methods by implementing only a useful subset of their behavior. It's not all or nothing.
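
For the curious, the lodash trick is roughly this (a hedged sketch, not lodash's actual source): implement only the dense-array fast path and skip the sparse-array bookkeeping the spec demands of Array.prototype.map.

  function fastMap(array, fn) {
    const result = new Array(array.length);
    for (let i = 0; i < array.length; i++) {
      result[i] = fn(array[i], i); // no hole checks, no `this` argument
    }
    return result;
  }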


I don't care about application development personally (although I'm sad that the dominant programming environments are much worse than they were 20 years ago), but I do care about the performance of the applications that I use every day. When you consider what my browser actually does, it uses a tremendous amount of resources. Computers are so fast, and yet most applications feel slower than the ones I used on my 100MHz Pentium in the 90s. That's broken.


They were developed with significantly less effort than applications from the 90s. Many are just experiments someone published after mere hours of work. A large number of people would trade performance for more content. That's not broken; it's just a difference in preference. There are high-performance webapps you can use, albeit fewer of them. And higher-performance apps, if you don't mind waiting to download and install.

I'd love to have the dynamism of the web and the performance of native all on one platform. We're not there yet.


The web apps I'm talking about are best-of-breed; not just some random crap on a web site somewhere. They're still slower than what I was using on Windows 95.

I also disagree that app development was harder back then. Back then we had IDEs that included online help and APIs that were actually _designed_. It wasn't a perfect world but it was at least coherent.


You're gonna need to list some examples of both, 'cause you kinda sound like a crank here.

Also, I don't really get what you're complaining about in terms of IDEs or API design here--go spend some time on the Mozilla Developer Network and come back and tell me it's just slapdash and thoughtless work.


I didn't say that it was slapdash or thoughtless; don't put those words into my mouth. The effort that people at Mozilla, Google, and Microsoft have put into improving the web is truly admirable.

What I'm saying is that if you sat down to design a nice cross-platform, distributed environment for building applications it would not look like the web.


To paraphrase: If you sat down to build something other than the web you wouldn't end up with the web.


Computers are so fast, and yet most applications feel slower than the ones I used on my 100MHz Pentium in the 90s. That's broken.

How many multiple-megabyte images could you display in 24-bit color at 1080p on that Pentium while scrolling? How good was the support for Unicode and non-English languages? How many streaming videos and ads could you see at one time? When you downloaded kilobytes of markup, and in turn requested a dozen more things from a server on the other side of the planet, how long exactly did it take to parse and load? How many times did you chat with people and embed pictures in your conversation, while listening to streaming internet radio?

Hint: computers are actually doing quite a bit more in apps these days than they used to, and it's only through the good luck of hiding this fact most of the time that you can make such silly complaints.


Computers are close to two orders of magnitude faster than they were 20 years ago. It would be shocking if the stuff you mention wasn't possible.

But it's a silly complaint that some web apps can't keep up with my typing? Please.


There are many websites. Frankly, it is silly. It should be obvious that ∃site∈internet.slow(site) is true and further that it's boring.


> The complaints about the abstractions and everything honestly remind me of the complaints about Java, and look how that turned out over the same time period.

I don't think that supports your argument that poor performance is the developers' fault.

Java got fast as its VM improved and it got JIT compilation.


I have done both of those things (though my real-time web app is written in React/cljs), and I agree entirely with the grandparent.

The difference isn't in the technical details; the difference is that a chat app with a 1 second redraw can succeed in the marketplace, whereas a game with a 1 second redraw will never be played. The bloat and inefficient abstractions grow to the limit of our tolerance.

If CPUs had hit the wall at 66MHz single-core, that chat app would somehow still take the same 1s to redraw. Of course, it would probably be written in something like NaCl and have significantly more engineering effort put into optimization, with a browser stack engineered for high performance.


The bloat and inefficient abstractions grow to the limit of our tolerance.

Which is tragic in some ways, IMO. It means every system is always optimized to the level of misery where you are just barely not willing to do something about it. Which, when you think about it, sounds kind of like what you would expect from a lesser circle of hell.


Apparently in the early days of HDTV, cable/satellite companies would slowly reduce the quality of broadcasts until people started complaining. At which point they stopped.

When BBC HD was a channel, the difference in quality between it and everything else was black and white; now even they are in on the reduced-bandwidth game :(


Written both. Not sure why it needs to be deferred rendering specifically lol. A forward renderer (or a light deferred, or tiled deferred, or forward plus, etc) would easily prove the same point.

The precompiled logic thing is nonsense. This happens on the fly in games too. Shaders get compiled on the device, assets streamed in from disk or the network. Game behavior is typically scripted and patched, etc. Narrow selection of hardware? 10 years ago? Wrong. I work on a game engine that fully supports a 10 year old machine, and most state of the art engines can degrade to that level too.

Goofy image formats? Please. DXT1-5, BC1-7, crunched assets, raw DDS, etc. We deal with tons of different formats, lossy and otherwise. Texture arrays, cubemaps, etc too. Malformed XML documents are trivial with a good lexer and we need tons of validation too on our side. Many games patch themselves.

The game can scale to a mobile device, believe it or not. It just doesn't make sense to. Throttling draw calls or streaming only the lowest mip levels is trivial for a good engine.

The web is absolutely 100% less efficient than a commercial game engine and does far far less. The flip side is that the barrier to entry is way lower as well. It takes less experience to write a web app than it takes to author a game. I would argue that authoring a web renderer is easier than authoring a state of the art game renderer too, although there are certainly complexities there. The DOM scene graph compared to a game scene graph? Come on, is there even a comparison? I could dump a scene graph for you which is totally dynamically generated in real time that dwarfs the most complicated web page you could find. I mean, heck, we use BVHs, KD trees, octrees, and more to represent our scene graphs. You think the DOM has even a fraction of that complexity?

To quote you, "seriously, just stop for a second and use your head."


"The web is absolutely 100% less efficient than a commercial game engine and does far far less"

I think you underestimate how much work web browsers are doing. The browser rendering engine lays out and renders thousands of character glyphs out of arbitrary vector typefaces (they might even have been downloaded over the network at runtime) at subpixel resolution, compositing over arbitrary backgrounds, keeping track of their precise location on the screen so you can, for example, select a chunk of text and copy it.

That's the equivalent level to the pixel-level shenanigans in a game engine. HTML and CSS descriptions of a webpage, with all its embedded images, are the equivalent of a complete description of a level's geometry and textures being loaded into a game. And most games, let's be honest, spend a lot of time displaying a 'Loading please wait' graphic while they pull all of that content into RAM and prepare it for rendering.


In graphics, everything is done at subpixel resolution. We also have to handle text and the aliasing that occurs not just from screen-resolution downsampling but from projective aliasing as well, for text that is not camera-facing or text that is wrapped on a surface. Keeping track of text on a screen is, to put it simply, not hard. There's a lot more state to a level's geometry and textures than just position. Momentum, for one thing, but also tint, wind, pathing, any other behaviors, key frames, etc. Some of the meshes get skinned or dynamically deformed if it's cloth or hair. Some meshes get generated on the fly. Still others move according to a script or analytic equation, like particle quads. Not to mention audio that may be keyed per item, and the physics tick.

The loading is sometimes more than just pulling stuff into RAM. Decompression, shader compilation, and more. Maybe textures are created dynamically, or the level is randomly generated (terrain, creatures, and all). Usually, loading is fully threaded and all reading happens asynchronously. Depending on whether you're on a console or not, different optimizations happen, since the disk drives have different buffered-reading characteristics.

I've read plenty of stuff on web browser tech and architecture, particularly because I've been following the Servo project with some interest. I'm not saying it isn't a lot of work (it's a lot of work, like anything worth doing), but I do think it's easier than you're making it out to be.


> Goofy image formats? Please. DXT1-5, BC1-7, crunched assets, raw DDS, etc. We deal with tons of different formats, lossy and otherwise. Texture arrays, cubemaps, etc too. Malformed XML documents are trivial with a good lexer and we need tons of validation too on our side. Many games patch themselves.

This is the crux of why you're wrong. You get to /choose/ which formats you use and support. You can completely avoid any and all of those, if you like, just by limiting your asset support. The web? It has to support all of its cruft, at all times. It's not optional.


Gosh, and that's ... hard? Image processing and compression are the easiest type of problem to solve. The algorithms are well documented and available.

Also, for your benefit, the formats listed all have different usages, and encode the different channels with varying precisions. They also have different compression artifacts and perform better or worse using different samplers.


I'll stick with my precompiled logic assertion. How many AAA games ship with a fully dynamic language like JS? Of those, how many then expose low-level APIs to those languages instead of just an operating environment suitable only for scripting? At any point, do those games allow trivial fetching of new code from the network from an unspecified source, parsing and executing that code in or out of a sandbox?

As for "goofy image formats"--I'm going to be somewhat surprised if you aren't doing a preprocessing step to normalize your textures and images into either an atlas or at least some standard format (with texture compression or whatnot). I know that across the entire space of game engines there are bajillions of formats, but again, when you're building a custom engine for a game, I'm willing to bet that you can pick your battles in ways that browsers simply can't.

The "malformed XML" isn't a simple matter of just "Oh, a missing closing tag!". It's "quirks" mode. It's supporting weird old behavior, bug-for-bug, across browser versions (thank you Microsoft).

"The web is absolutely 100% less efficient than a commercial game engine and does far far less."

See, from a performance standpoint, I wouldn't hesitate to agree that the web engines are less efficient. Similarly, I wouldn't claim that a gas turbine is less efficient than a piston engine. However, piston engines are a lot more common because they're more flexible, easier to make, and can put up with a lot more nonsense.

You lost me, though, when you said that the web does "far far less" than the specialized work a game engine does.

"Come on, is there even a comparison?"

Look at the spec:

http://www.w3.org/TR/html5/rendering.html#rendering

You're absolutely right--there's no comparison. Show me a modern game engine whose rendering pipeline is one tenth as complicated or well-specified!

Sure, bang around about your scene-management structures for visibility tests. I'm sure that'll impress your friend at GDC who is similarly rediscovering techniques already mined out twenty years ago--better than thirty in the case of octrees.

At the end of the day, game engines (from a display standpoint) are just rendering lots of triangles, with optional intermediate targets and clever shading tricks. There may be some animation, there may be some physics, but all of that is relatively straightforward in implementation compared with the downright byzantine rules for, say, CSS:

http://www.w3.org/TR/css-2010/

I'm not saying that game engines aren't impressive feats of engineering--they are. From anything other than a narrowly technical perspective, however, they are nowhere near as complicated as a modern web browser.


> How many AAA games ship with a fully dynamic language like JS?

Every single game that lets you use Lua to write game logic? AIUI, every Elder Scrolls game since Morrowind? The Crash Bandicoot games? The Jak and Daxter games? (The Jak and Daxter games use http://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp and the Crash Bandicoot games use its predecessor.)

> At any point, do those games allow trivial fetching of new code from the network from an unspecified source, parsing and executing that code in or out of a sandbox?

Every single game that permits custom levels that can include scripted elements fits this bill. See, though, this is a strawman. Almost every competently designed website is just like a AAA game in that all of the code that that site runs and all of the assets that it loads were explicitly selected by the dev team and tested by its QA team. Moreover, any user-supplied data is carefully handled and constrained so that processing of said data is low-impact and non-disruptive; just like in a AAA game.

I also note that you're refusing to engage with folks who call you out on the fact that your argument that "Having to handle a wide array of obscure image formats and potentially invalid XML makes page renders and reflows slow!" is bunk.


I've engaged specifically on the image formats--consider both GIF and XBM, for example, or WebM or whatever else. Do those show up frequently in games? Nope. Games have other things, like DDS and whatnot, but those are actually kind of designed for their use case (storing mipmaps, cubemaps, etc.), and game engine developers have the luxury of saying "We use this blessed format and anything else runs through our tools pipeline first to become this blessed format". Web browsers have no such luck.

The XML/HTML stuff I mentioned elsewhere is still the same: it's not just simple missing character stuff, it's dealing with old pages that are semantically (not syntactically) malformed and still rendering them properly. See also quirks mode.

By contrast, game engines have level and script formats that they support, with simpler document models, and they get to deprecate things whenever it suits the developers.

As for scripting, just look at some docs for the Morrowind stuff you bring up: http://www.quicenter.com/morrowind/files/Morrowind_Scripting.... That's hardly in the same class of utility as Javascript--it's basically a procedural wrapper for stepping through basic scripted sequences. And that's fine--it's built to do that and only that.

Every single game that permits custom levels that can include scripted elements fits this bill.

"Scripted elements" is very, very different from, say, loading an arbitrary IDE into the browser: http://repl.it/languages

On the average, most game scripting languages don't support running Ruby, Python, or Erlang interpreters.


> I've engaged specifically on the image formats [and malformed documents]...

Yeah, you haven't. The comment which spawned your first statement was one that complained about rendering speed. Many people in this thread have noted that by the time an image or document (XML or otherwise) gets to the do-layout-and-render-pixels-on-the-screen stage of the process, it DOESN'T MATTER how malformed that document was or what format that image was when it came over the wire.

Why? Because when we're at this stage of the process, that data has been converted into whatever representation is most convenient for the web browser to use while rendering its screen. Just like in a video game, support for a new image type or document type is as "simple" as adding on a converter from $NEW_FORMAT to $BROWSER_INTERNAL_FORMAT.
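
The platform even exposes that decode-once step to scripts these days; a hedged sketch using the standard createImageBitmap API:

  // Whatever format the bytes arrived in (GIF, PNG, WebP...), this resolves
  // to one uniform, pre-decoded bitmap that rendering code can consume.
  async function toInternal(bytes, mimeType) {
    return createImageBitmap(new Blob([bytes], { type: mimeType }));
  }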

> "Scripted elements" is very, very different from, say, loading an arbitrary IDE into the browser

No. It's a difference of size, not of complexity. The interpretation and validation tasks are the same in both examples.

> On the average, most game scripting languages don't support running Ruby, Python, or Erlang interpreters.

Heh. Nice of you to ignore my Crash Bandicoot, Jak & Daxter, and every-game-which-supports-Lua-scripting examples. Lisp, Scheme, and Lua are every bit as complex as Ruby, Python, JavaScript and -to some degree- Erlang.

I've read your other replies in this thread. Your initial arguments are weak, and you continually choose to ignore evidence and assertions that contradict your initial argument. This is a real pity, because you're pretty well-written. If your behavior matched your tagline "Strong opinions, weakly held", you'd be far better off. :)


"It's a difference of size, not of complexity."

Again, no. "Scripted elements" is waaaay too broad, and includes elements for purely procedural work (move here, say these lines, move here, etc.).

Your example of Crash Bandicoot, by the way, isn't great--GOOL was compiled down into a form that shipped with the game. It wasn't dynamically compiled from external sources by the engine during runtime, unlike what we have to deal with with Javascript. Same criticism applies to Jak and Daxter with GOAL.

"Lisp, Scheme, and Lua are every bit as complex as Ruby, Python, JavaScript"

Scheme and Lua, certainly not--just compare their BNF grammars, much less their actual use. If by Lisp you mean Common Lisp, well, you have a point, but otherwise nope.

As for the rest, I don't ignore the assertions; it's just that the fine difference between "can support" and "must support" is apparently lost on most folks. Additionally, points on loading are all sidestepped by you folks ignoring that, in the browser, that flexible loading must happen at runtime, whereas in the engine you can assume the tools pipeline (you know, the 3DS plugins or whatever) has already done the work.

And then, for all that, people are still complaining about the platform of the browsers when all of their examples are of shitty and slow web apps running on it. It's like they've never seen badly-performing maps in a game engine, or a shitty Unity game which isn't optimized properly.

I'm trying to help educate folks here by clearing up misconceptions, but it's rather uphill.


Do you actually think it'd be hard to put GIF and XBM support in a commercial engine? I really wish you would stop commenting, and I also really wish I could ignore you, but given your last of experience, I'd appreciate it if you stopped talking out of your depth.

I'm not sure your last point makes sense to me. I can't run Ruby, Python, or Erlang in a web browser either.


but given your last of experience

s/last/lack

So, other than the annoyance of actually handling animating the GIF, I don't think that adding the image formats to a commercial engine would be that hard. Hell, I'd just link in FreeImage and be done with it. The point, though, is that a web browser pretty much has to support that, whereas a game engine doesn't. For any particular feature X that a browser has and an engine doesn't, sure, you could absolutely add that feature to the engine. Or just embed Awesomium and have a browser in your engine. Doing so, though, is more of an "anything you can do we can do" argument than an engagement with the issue at hand: that browsers are required to be more complicated than engines.

To explain my last point: both Python and Ruby are available in interactive REPLs inside the browser, implemented in JS:

http://www.repl.it/languages/Ruby

http://www.repl.it/languages/Python3

Erlang has also been implemented in Javascript:

http://svahne.github.io/browserl/

The thing we keep disagreeing on is the basic premise that browsers are more complex than engines because they do more stuff, albeit less efficiently. Game engines, especially commercial ones like Unity or Source or whatever, are very nifty feats of engineering. I'm not disputing that. However, they aren't specified, they aren't standards conformant, they don't really have legacy stuff they have to put up with, they don't have security concerns in the same way browsers do, etc.

If you want to keep claiming that engines are more complicated than browsers, that's fine, but realize that that's a position far from settled. You have yet to, for example, answer any of the points about how much more complicated the operating and compliance environment of a browser codebase is.


Or PyPy, "compiled for the web via emscripten, with a custom JIT backend that emits asm.js code at runtime." http://pypyjs.org/

Or Windows 95 running in DOSBox, compiled for the web with emscripten: http://win95.ajf.me/


Preprocessing, sure. But the different formats are necessary. Depending on the data in the texture, the compression algorithm may alter the data in a way that behaves better or worse (for example, textures representing normals vs material vs albedo).

Before I discuss rendering with you, can you first disclose your experience with game engines or rendering? It's sort of pointless to go into that discussion unless you understand how things have changed in the last 5-10 years or so.

edit:

Forgot to mention that shipping with a scripting engine is sort of easy. It's just a matter of whether it's exposed to the customer or not. Tons of games (even decade-old ones) have shipped with Turing-complete editors for modding or making custom levels or what have you. I've written a custom VM for a homebrew project that can do precisely this.


Sure. Started out with some school projects that were beautiful object-oriented, single-threaded, visitor-pattern rendering in fixed-function OpenGL. This predictably led to terrible performance. At the end of school, a friend and I decided to extract that code and try to build a real engine around it.

Upgraded to thread-safe rendering in OpenGL, one thread handling rendering and windowing tasks, other threads game logic and submitting draw requests, and ultimately other threads handling resource loading onto the GPU (though that came later towards the end).

Upgraded that to actually use the programmable pipeline, and finally added support for FBOs and uniforms and buffers and everything so we could actually start on deferred rendering as outlined in Engel's blog and other places. Never did transparency, but if I remember correctly the goal was to probably do something like screen-door+deferred. For the life of me I can't remember the paper we were looking at.

Eventual goal was to do full deferred lighting and PBR, but by that point I'd moved over to web development. Every time I went back to that codebase, clean and organized as it was, it was just such a pain in the ass to do anything compared to splatting some JS and Three onto the screen and being done with it.

Concurrent with that, I was doing 3D CAD software development in Java with Ardor3D. So, big annoying retained-mode scenegraph with all the BVH nonsense you could ever want, and a bunch of legacy files for things that, when one rendered a building all at once, would kill the graphics card. And don't even get me started on trying to sort out order-independent transparency for that...one of my great failures.

By contrast, nowadays I do multiple web workers with websockets to provide 30-60 fps display of waveform data on just bare 2D canvases, with Angular wrapping the components.
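
That setup is roughly this (a hedged sketch; 'waveform-worker.js' and the canvas markup are hypothetical stand-ins):

  const ctx = document.querySelector('canvas').getContext('2d');
  const worker = new Worker('waveform-worker.js'); // owns the websocket
  let samples = new Float32Array(0);
  worker.onmessage = (e) => { samples = e.data; }; // worker posts Float32Arrays
  (function draw() {
    const { width, height } = ctx.canvas;
    ctx.clearRect(0, 0, width, height);
    ctx.beginPath();
    for (let i = 0; i < samples.length; i++) {
      const x = (i / samples.length) * width;
      const y = (0.5 - samples[i] * 0.5) * height; // samples assumed in [-1, 1]
      i ? ctx.lineTo(x, y) : ctx.moveTo(x, y);
    }
    ctx.stroke();
    requestAnimationFrame(draw); // 30-60 fps, browser willing
  })();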

EDIT:

I'm aware of the scripting thing. Note that, until perhaps Far Cry, it was pretty well held that scripting languages were "too slow" for most games, never mind the success of such things as JIT'd Quake 1 bytecode, or the stuff in Out Of This World, or what have you. Even UnrealScript was considered godawful dog slow and to be avoided if possible, and was finally shown the door. But, in the end, scripting was discovered not to be the super worst thing ever.

I think Eve finally hammered that point home to everyone.

The scripting languages we have now have come a long way in terms of ease of embedding, with a possible exception for Lua, which has been easy as hell for a while now.

Then again, I don't think that many scripting languages have the sheer size of interface with native code that, say, JS in the browser does.


It's a shame you didn't continue. Where you left off (prior to deferred ... shading, I'm assuming, and PBR) is still a ways from where things get a bit more interesting. Physically correct rendering changes the game a good deal. We can't haphazardly blur textures and add things and lerp like we used to.


I don't think you can compare them.

The web feels clunky but if you think about it, it's the ultimate sandbox. Hey, it even supports scriptable accelerated 3d graphics :)

So yeah, two different things for two different purposes. They are both state of the art and pushing limits.


I think many would agree with you that most modern browsers with the expected features are pretty bloody efficient pieces of software. I believe what OP was saying is that the layers of abstraction used in building web applications on top of the foundation the browser provides -- while really useful -- are also inefficient. This is to be expected, since they do not usually have access to the full capabilities of the operating system and hardware.


I suspect you're being downvoted because most of your claims are false. It's not hard to find counterexamples for all of them.

And also because web developers made their own bed. It was web developers who decided it was a good idea to cram everything into the browser and make web pages into "web apps." They could have said, "Hey, this is dumb, doing this will be slow and inefficient, and here are the other technical reasons why it's a bad idea," but they never did that. Instead it's always, "Hey, look how I can make a low-quality Doom clone in a browser that runs slower than the original Doom did 22 years ago!" and hack after hack to get things running at a speed comparable to regular applications.


The reason for that is not 'web developers' but rather the fact that a single delivery device for a huge variety of content without intermediaries is a thing no corporation will be able to ignore.

That's what caused the envelope to be pushed as far and as fast as it did, and we all both enjoy the fruits of that and suffer because of it. It's like forcing a kid to grow from age 3 to age 30 in a few weeks time, it's bound to give trouble, no matter how impressive the feat may be technologically speaking.


There is a ton of unbelievably bad web code out there. jQuery is a great example of a 'huge' and popular slowdown for a minor gain.

Sure, it's 'worth it' because nobody cares if a web app takes 1/2 second to render, but as soon as you trade speed for anything else you quickly end up with code that's as slow as you're willing to live with.
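
To make the trade concrete (a hedged sketch; '.item' is a made-up selector):

  // The jQuery line is shorter, but pays for a wrapper object and
  // library dispatch on every call:
  $('.item').addClass('active');

  // The direct-DOM equivalent does the same work with nothing in between:
  for (const el of document.querySelectorAll('.item')) {
    el.classList.add('active');
  }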

Having said that, there are a few modern web-apps that really got 10x faster while sticking with HTML, CSS, and JavaScript, so it's possible. The trick is to stick with competent people and reasonable designs.


I believe it's been discussed many times before. Developers who use jQuery do so because it provides a good API / abstraction layer, as dealing with browser specific issues could be non-trivial.


> The reason for that is not 'web developers' but rather the fact that a single delivery device for a huge variety of content without intermediaries is a thing no corporation will be able to ignore.

That argument doesn't work because that was already possible with TCP, and it doesn't explain the rise of "apps".

The reality is, corporate decision makers saw cheesy demos of web stuff, and said, "Hey, lets use that in production," and instead of saying, "Hey, that's a bad idea let's just use TCP for this application!" the web devs said, "Okay, I guess we can make it work." It's clever and maybe technologically impressive, but that still doesn't make it a great idea, IMO.


> That argument doesn't work because that was already possible with TCP

TCP: low level network protocol, HTTP: high level network protocol, HTML: high level layout description (ok, maybe not that high ;) ).

> and it doesn't explain the rise of "apps".

Actually, you can explain apps (and silos) as a way to undo all that open goodness and to bring control back to the large companies that are (rightly) frightened out of their wits by what an open internet and a peer-to-peer world actually means.

Now if we could get providers to play ball and to allow servers to co-exist with residential services on ports 80 and 25 (somebody think of the spam...) and hand out symmetric bandwidth by default we might be able to undo some of the damage.


Oh goody, so let's go and each roll our own custom brand of query-response and framing over TCP again. That'll be great. And fuck, it's so much faster and more flexible and debuggable to write our tools in window toolkit $whatever than to just use CSS and HTML.

Those who don't learn from history, etc. etc.

There's a reason the web won.


The funny thing is that all of those problems had been solved and made into libraries for native apps a long, long time ago. And none of them were really solved by moving to the web as a platform.

How do you explain WebSockets and HTTP2?

And instead of creating a GUI in Gtk or Qt, we have IE, Chrome, Firefox, Opera, Amaya, and myriad mobile browsers, all of which behave differently, especially on more advanced JS/CSS/HTML.

The people not learning from history are the web developers. It's pretty clear they're the ones reinventing the wheels...


If, in 10 years, I'm unable to play Deus Ex HR but all of these bloated, insecure, backdoored, "social" web sites are still supported, I might just quit this industry for good.


Vote at http://www.gog.com/wishlist/games/deus_ex_human_revolution, buy the DRM-free version if it eventually comes out, then store it on good quality archival media.


Don't worry, you will be able to emulate it through a browser on your watch in 10 years :)


Certainly, getting to market early, being stable, compatible, secure, accessible etc. is way more important than performance, but I think the hate is justified.

Browsers are super-performant these days: JS and CSS engines are highly optimised, internet connections are very fast, server-side caching makes for an instant response, and yet web apps are slower and less compatible than they've ever been.

You mention Angular, which is a big, slow, complex, and horrible framework. It's a fine example of the nonsense that is perpetuated all the time in the web dev world (I'd call it an extreme example if it wasn't used everywhere).

I often say on here that all you need for most applications (that are currently adopting Angular) is a bit of jQuery, and I get pointed and laughed at every single time, but my jQuery app would be smaller, simpler, faster, and easier to maintain than its Angular equivalent for just about every use-case going.

P.S. It makes me angry that you got down-voted for having an opinion - it seems that we've all got to think the same these days. Thanks for making a valuable contribution to this discussion.


Here's the thing. You're replying to

"...re-drawing when resizing the Slack chat window can take up to a second."

0) I agree that working with the DOM can get very complicated, and that layout can also be a hard problem[0]. Keep this in mind while you read the rest of my response.

1) When a browser goes to render a page, it does so from assets that have already been loaded and validated. All the fiddling with goofy image formats and malformed inputs is over and done with. All of those resources should have been processed by the browser into a format that's specifically made for the browser. So, when you resize the browser window, all[1] you're asking the browser to do is to reflow the page using the resources that some other code has specially prepared for rendering.

2) I bet you ten dollars that -if it had been compiled for a 64-bit machine- Deus Ex: HR would run on a machine manufactured in August, 2021. (Indeed, look at how well Starcraft 1 runs on machines built in 2008 and later. :) ) (Oh, BTW. Deus Ex: HR runs on the XBox 360; a machine that was released in November 2005... just about ten years ago. :) )

3) Have you used Mobile Chrome or Mobile Firefox on a Nexus S (a phone released in December 2010)? I have and -when I have no other choice- do. They are slow as balls for web pages of any appreciable complexity. Hell, for rather complicated pages, they consistently get reaped by the Android OOM killer.

4) I assume that by "That game is never going to have to scale .. down ... to a ... shell script" you mean something like "The web is browsable through lynx. You will never have a text-mode version of Deus Ex: HR.". If the game dev was willing to let folks make Deus Ex: HR look like a game from the early 2000's (and run on machines from the mid-to-late 2000's), they could have inserted a checkbox that switched to pre-computed lighting and turned off (or greatly reduced the complexity of) all shaders. However, I bet you fifty dollars that a wide section of The Web is completely unusable on lynx (or even links, for that matter).

Edit: Know that I've built soft-realtime networked GUI software with both OpenGL (using FLTK as the base) and a modern browser (with JQuery as the base). :)

[0] However, TeX has been doing complicated page layout since the late 1970's. :)

[1] Yes, I'm quite aware that browsers can do all sorts of things -including loading and processing new resources- when the viewport size changes. If you've written a web page that does so much work on viewport size change that redraw takes 1000ms or more, guess what? You've written a slow web page. :) If you build a game that requires 1000ms to render a single frame, guess what? You've built a slow game. :)
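
For what it's worth, the usual fix for that slow-resize page is a few lines (a hedged sketch; relayout() stands in for whatever expensive work the page does on viewport change):

  // Debounce: run the expensive work once after resizing settles,
  // not on every intermediate viewport size.
  let timer = null;
  window.addEventListener('resize', () => {
    clearTimeout(timer);
    timer = setTimeout(relayout, 150);
  });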


I think your intuition, that game engines are a poor comparison to a virtual machine and GUI targeting general-purpose computing, is fundamentally correct, since they have mostly different design and technological constraints. Your detailed rationales are a bit off, though. The parsing and image-processing bit should affect only preprocessing of data, not runtime performance, and should not take that long anyhow.

The visible stuff a web browser does should not take exhaustive resources to get running nice and smooth. Also, since the web is now first and foremost a UI technology, in a sane world it would have the explicit design constraint of helping developers write fluid interfaces.

I'm not sure what the root cause of the slowness of web tech is in general, but I'm guessing there is a megaton of accidental complexity in the standards, and implementing them is probably a small hell in itself.

From the purely user-facing side, and estimating intuitively the intrinsic complexity of the stuff a browser does, I would compare the experience to the one you get on a modern desktop.


The thing about the parsing and image processing is this: browser JS is completely allowed to, say, every few frames in a setTimeout() or whatever, dynamically generate a pile of bytes, create an image from that pile, attach it to the DOM, randomly remove some other node from the DOM, and finally change the position property on a third node and cause everything to be reflowed--all from substrings that it randomly splices together and then evals.
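
Spelled out, the legal-but-awful pattern is something like this (a hedged sketch using only standard DOM APIs; the bytes here are garbage, which is rather the point):

  setTimeout(function tick() {
    const bytes = crypto.getRandomValues(new Uint8Array(64)); // fresh "image" data
    const img = document.createElement('img');
    img.src = URL.createObjectURL(new Blob([bytes])); // image from raw bytes
    document.body.appendChild(img);                   // grow the DOM...
    if (document.body.children.length > 2) {
      document.body.children[0].remove();             // ...shrink it elsewhere...
    }
    const third = document.body.children[1];
    if (third) third.style.position = 'relative';     // ...and force a reflow.
    setTimeout(tick, 50);                             // every few frames
  }, 50);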

That level of pathological douchebaggery is rare (well, sorta, kinda) and yet browser vendors still have to write code that supports it as well as it can.

I'm not sure what the root cause of the slowness of web tech is in general, but I'm guessing there is a megaton of accidental complexity in the standards, and implementing them is probably a small hell in itself.

Quite right! The problem with "slowness in web tech" is that there are just so many things that can be going wrong--slow CDNs, bad Angular code, whatever--that simply aren't the fault of the web, just the fault of sloppy developers. If the game engine bro who I've been going back-and-forth with wanted to claim that web developers were, in general, rather shit, I wouldn't disagree at all.

I think that having design constraints might help with this, but honestly we've all gotten so much from the sheer flexibility of these janky sandboxes that every time I see people pining for native apps I die a little inside.



