Hacker News

I think this "thing" is insane.

First of all, battery life. He specifically calls out phones ("not just phones, they could be wearables and other...") as targets for this. Every bit of computing you do on my device is battery life I lose. You're welcome, in theory, to use some of my CPU, but stay the hell away from my battery life, which in practice means stay off my CPU. So there's that.

Second, latency is a big thing in user experience. Go ahead, follow this author's advice, and do your JSON-to-HTML rendering on the client. See how it affects your latency. See how it affects your user experience. See how the latency affects your SEO standings. Try it out.

So once you realize you don't want to use client battery life, and you don't want to use client computing anywhere it would make the user experience perceptibly slower, what're you left with? Yeah, sure, you could use some background computing power in the style of SETI@home and so on... but if you want users' explicit consent, you're competing with those existing for-the-betterment-of-humanity projects, and if you don't get explicit consent, you'd better tread mighty carefully.



"Second, latency is a big thing in user experience. Go ahead, follow this author's advice, and do your JSON-to-HTML rendering on the client. See how it affects your latency. See how it affects your user experience. See how the latency affects your SEO standings. Try it out."

I think this _is_ actually worth trying out (albeit as an experiment). If you can send JSON to the client (assuming the templates are already cached) rather than fully rendered, uncacheable HTML, you can hopefully reduce the amount of data being transmitted. This could save you on:

* latency - downloading a small JSON file will take less time than downloading a large HTML file (although with 4G and later high-bandwidth mobile data this becomes less relevant) - at what point does the additional download time offset the template-rendering CPU time?

* CPU usage (and hence battery life) - if we assume HTTPS for the download, the TLS decryption isn't free - at what point does it use less CPU to render your JSON client-side than to download a big file?

* radio usage (and hence battery life) - downloading more content means your radio must be on for longer, which is likely to use more power - at what point does the additional radio usage offset the CPU usage?

In each case, I don't know where the balance lies, but I don't think it's clear-cut that server-side HTML rendering is always better on mobile devices.
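To make the experiment concrete, here's a minimal sketch of the JSON-plus-cached-template approach being discussed. The payload shape and the `renderItem`/`renderList` helpers are hypothetical, purely illustrative; the point is that only the small JSON payload crosses the wire on each request, while the template logic ships once in the cached JS bundle.

```typescript
// Hypothetical payload shape for a list of links.
interface Item {
  title: string;
  url: string;
}

// The "template" lives in the cached client bundle, not in each response.
function renderItem(item: Item): string {
  return `<li><a href="${item.url}">${item.title}</a></li>`;
}

// Turn a small JSON payload into the HTML the server would otherwise
// have rendered and shipped in full.
function renderList(json: string): string {
  const items: Item[] = JSON.parse(json);
  return `<ul>${items.map(renderItem).join("")}</ul>`;
}
```

The bandwidth saving grows with how much markup each item expands into, which is exactly the tradeoff the bullets above are asking about: at some payload size, the extra client CPU stops paying for itself.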

Having said that, I definitely agree with you on the battery life for general computation point - I'm not going to be mining Bitcoin on my cellphone! ;)


For web stuff, and for using mobile phones as "dumb" computing devices generally, I agree. However, there are other cases where computing on the phone makes more sense than computing in the cloud:

A) Whenever data volume is large and it would take forever to shove it up to some server.

B) Whenever offline service availability is crucial (i.e. you don't want to be dependent on network availability).

C) Whenever you want to be in control over where your data ends up.

For example, computer vision is a domain where these criteria are usually all met.
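Criterion (A) is easy to sanity-check with a back-of-envelope calculation. The numbers below are illustrative assumptions, not measurements, but they show why shipping raw camera data to a server can dwarf any on-device processing cost:

```typescript
// Back-of-envelope sketch for criterion (A): time to upload raw data.
// All figures are assumed/illustrative, not measured.
function uploadSeconds(dataMB: number, uplinkMbps: number): number {
  // megabytes * 8 bits per byte / megabits per second = seconds
  return (dataMB * 8) / uplinkMbps;
}

// One minute of 30 fps camera frames at an assumed ~0.5 MB per frame:
const framesMB = 30 * 60 * 0.5; // 900 MB
console.log(uploadSeconds(framesMB, 10)); // 720 seconds on a 10 Mbit/s uplink
```

Twelve minutes of radio time to upload one minute of frames is a non-starter, which is why vision workloads tend to run (at least partially) on the device and send only compact results upstream.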



