Hacker News

It may be common when starting out, but we do have paths to optimize out of it.

We can do code splitting, eagerly fetch JS while the page is idle, render optimistically when a request is taking time, etc. Contrary to what a lot of people like to believe, not every SPA runs 20 MB of JS on page load.
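As a minimal sketch of the first two techniques: with a bundler that supports code splitting, a dynamic `import()` only downloads a route's chunk when called, and `requestIdleCallback` lets you start that download while the page is idle. The `prefetchRoute` helper and the `./settings.js` path here are illustrative, not from any particular framework.

```javascript
// Idle-time prefetch for a code-split route chunk.
// `requestIdleCallback` is browser-only, so fall back to setTimeout
// where it is missing (Node, older Safari).
const onIdle =
  typeof requestIdleCallback === 'function'
    ? requestIdleCallback
    : (cb) => setTimeout(cb, 200);

// `loader` is any function returning a module promise, e.g.
// () => import('./settings.js') under a code-splitting bundler.
function prefetchRoute(loader) {
  let modulePromise = null;
  const load = () => (modulePromise = modulePromise || loader());
  onIdle(load); // start fetching while the page is idle...
  return load;  // ...or on first demand, whichever comes first
}
```

Whichever of the idle callback or the user's click fires first kicks off the single download; the other just reuses the same promise, so the chunk is fetched at most once.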

Also, an initial load of a few seconds followed by a snappy, interactive app is an acceptable compromise for a lot of apps (not everything is an e-commerce site).

When most fragments need to be server rendered, it manifests as a general slowness throughout the interaction lifecycle that you can't do much about without adopting a different paradigm. The Hey-style service-worker caching hits clear limits when the UI is not mostly read-only and the output of one step depends closely on previous interactions.
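To make that limit concrete, here is a sketch of one common service-worker policy for cached fragments, stale-while-revalidate (I'm assuming this is roughly what's meant by Hey-style caching; `isReadOnlyFragment` is a hypothetical predicate). Serving the cached copy first is exactly what breaks when a fragment's content depends on what the user just did.

```javascript
// Stale-while-revalidate for read-only HTML fragments: answer from the
// cache immediately, refresh the cached copy in the background.
// `cache` has the browser Cache API shape (match/put); `fetchFn` is
// normally the global fetch.
async function staleWhileRevalidate(cache, request, fetchFn) {
  const cached = await cache.match(request);
  const network = fetchFn(request).then((response) => {
    cache.put(request, response.clone());
    return response;
  });
  // Preferring the cached copy is why this only works for fragments
  // whose content does not depend on the user's previous interactions.
  return cached || network;
}

// Wired into a service worker it would look roughly like:
//
// self.addEventListener('fetch', (event) => {
//   if (isReadOnlyFragment(event.request)) {   // hypothetical predicate
//     event.respondWith(
//       caches.open('fragments').then((cache) =>
//         staleWhileRevalidate(cache, event.request, fetch)));
//   }
// });
```

For a fragment whose output feeds the next step of a workflow, the stale copy is wrong by definition, and you end up bypassing the cache for most of the UI, which is the boundary described above.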

I joined a place working on a larger Rails + Unpoly + Stimulus app that started off as server-rendered fragments with some JS sprinkled in. After two years it had devolved into spaghetti: to figure out any bug I'd typically need to hunt down which template was originally rendered, whether it was updated via Unpoly, whether what Unpoly swapped in used the same template as the original (often it didn't), and whether some JS interacted with it before or after the swap. All in all, I felt that if you push this approach to use cases where a lot of client-side interactivity is needed, it's better to opt for a framework that provides more structure and encapsulation on the client side.

I am sure good, disciplined engineers can build maintainable applications with these combinations, but in my experience incrementally optimizing a messy SPA is generally more straightforward than untangling a server-rendered-client-enhanced mishmash. YMMV.



> Contrary to what a lot of people like to believe, not every SPA runs 20 MB of JS on page load

This is not a new take; it's exactly what every die-hard SPA dev says. While 20 MB is an exaggeration, the average web page size has ballooned over the past decade, from roughly 500 KB in 2010 to around 4 MB today. And the vast majority of those pages are just text; there is usually nothing truly interactive in them that would require a client-side framework.

Others will say 2 MB or 4 MB is not that bad, but that just shows how far out of touch with the reality of mobile internet they are. Start measuring the actual download speeds your users get and you'll be terribly disappointed, even in major urban centers.



