
The rich snippet inspection tool (Google's Rich Results Test) will give you an idea of how Googlebot renders JS.

Although they will happily crawl and render JS-heavy content, I strongly suspect bloat negatively impacts the "crawl budget", though in 2024 that factor probably weighs far less than overall request latency. If Googlebot can process orders of magnitude more sanely built pages within the memory footprint of a single React page, it isn't unreasonable to assume they would economize.

Another consideration: "properly" used, a JS-heavy page would most likely be an application of some kind living at a single URL, whereas purely informative pages, such as blog articles or tables of data, would exist across a larger number of URLs. Of course, there are always exceptions.

Overall, bloated pages are a bad practice. If you can produce your content as classic "prerendered" HTML and use JS only for interactive content, both bots and users will appreciate you.
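
A minimal sketch of what that can look like (the server, markup, and handler names here are illustrative, not anything Google documents): the content ships as complete HTML, and a small inline script only wires up the one interactive element.

    // server.ts -- hypothetical sketch: content as prerendered HTML,
    // with JS reserved for interactivity.
    import { createServer } from "node:http";

    const page = `<!doctype html>
    <article>
      <h1>Crawlable content</h1>
      <p>Everything a bot needs is already in the markup.</p>
      <button id="share">Share</button>
    </article>
    <script>
      // Interactivity is a thin layer on top of complete HTML.
      document.getElementById("share")?.addEventListener("click", () => {
        navigator.clipboard.writeText(location.href);
      });
    </script>`;

    createServer((_req, res) => {
      res.writeHead(200, { "content-type": "text/html" });
      res.end(page);
    }).listen(8080);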

HN has already debated the merits of React and other frameworks. Let's not rehash this classic.



> If you can produce your content as classic "prerendered" HTML and use JS only for interactive content, both bots and users will appreciate you.

Definitely -- as someone who's spent quite a lot of time in the JavaScript ecosystem, I can say we tend to subject ourselves to far more complexity than is warranted. This, of course, leads to [mostly valid] complaints about toolchain pain[0], etc.

> HN has already debated the merits of React and other frameworks.

I'll note, though, that while React isn't a cure-all, we shouldn't be afraid to reach for it. In larger codebases it can genuinely make development substantially easier than plain HTML+JS (has anyone here maintained a large jQuery codebase?).

The ecosystem alone has definitely played into React's overall success. That said, in some cases I've found the complexity of hooks unwarranted and have struggled to use them; perhaps I'm just not clever enough, or perhaps the paradigm does have a few rough edges (useEffect in particular).
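
To make that rough edge concrete, here's a hedged sketch (hypothetical component, not from any real codebase) of the classic useEffect dependency pitfall: the empty deps array freezes the closure, so the interval logs the initial count forever.

    // Counter.tsx -- illustrative useEffect stale-closure pitfall.
    import { useEffect, useState } from "react";

    export function Counter() {
      const [count, setCount] = useState(0);

      useEffect(() => {
        // Bug: with [] as deps, this closure captures count === 0
        // permanently, so the interval logs 0 on every tick.
        const id = setInterval(() => console.log(count), 1000);
        return () => clearInterval(id);
      }, []); // should be [count], which tears down and recreates the interval

      return <button onClick={() => setCount((c) => c + 1)}>{count}</button>;
    }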

[0]: Toolchain pain is definitely a thing. I absolutely hate setting toolchains up. I spent several hours trying to set up an Expo app; curiously, one of the issues I hit (which I may be misremembering) was that the .tsx [TypeScript React] extension wasn't actually supported. I found that odd, as you'd assume a React toolkit would support it out of the box.


HTML alone is often not flexible or capable enough. JS exists for a reason and is an integral part of the web. Without it, you will struggle to express certain things online, and lean JS sites can be really quite nice to use (and are generally indexed well by Google).

Bloated JS sites are a horrible thing, but they almost sideline themselves. I rarely visit a bloated site after an initial bad experience, unless I'm forced.


For documents, you can absolutely have all the structured content in HTML and add JS to improve things. That way, you get your feature-rich experience, the bot can build its index without having to run the extra JS, and I get my lightweight experience.

Progressive enhancement :-)
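
As an illustrative sketch of that split (the data-sortable attribute is my own convention, not a standard): the table is complete and indexable as plain HTML, and this script only layers click-to-sort on top for clients that run JS.

    // enhance.ts -- hypothetical progressive-enhancement layer:
    // the <table> works without JS; this only adds column sorting.
    document
      .querySelectorAll<HTMLTableElement>("table[data-sortable]")
      .forEach((table) => {
        table.querySelectorAll("th").forEach((th, col) => {
          th.addEventListener("click", () => {
            const body = table.tBodies[0];
            const rows = Array.from(body.rows);
            rows.sort((a, b) =>
              (a.cells[col].textContent ?? "").localeCompare(
                b.cells[col].textContent ?? ""
              )
            );
            // Re-appending moves each row, leaving them in sorted order.
            rows.forEach((row) => body.appendChild(row));
          });
        });
      });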



