
In what way is tech difficult?



How is this even a question?

If you want E2E encryption, you sacrifice good full-text search, because you have to build the index on a computer that has access to the plaintext, which means doing it on an endpoint. It’s not as nice as doing it on the server.
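
A minimal sketch of what that endpoint-side indexing looks like (message ids and contents below are made up for illustration; a real client would also want to persist the index encrypted at rest):

    # With E2E encryption the server only ever sees ciphertext, so the
    # inverted index has to be built on the client, from plaintext it has
    # already decrypted.
    from collections import defaultdict

    class ClientSearchIndex:
        def __init__(self):
            self.postings = defaultdict(set)  # term -> set of message ids

        def add(self, msg_id, plaintext):
            # Runs on the endpoint, after decryption; the server never sees this.
            for term in plaintext.lower().split():
                self.postings[term].add(msg_id)

        def search(self, term):
            return self.postings.get(term.lower(), set())

    index = ClientSearchIndex()
    index.add("m1", "meet at the station at noon")
    index.add("m2", "the station is being watched")
    print(index.search("station"))  # {'m1', 'm2'}, computed locally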

If you want to hide metadata, the state of the art is onion routing, and that adds a lot of latency. The only other approach that's even been attempted is enclave computing, which basically just moves the trust from the service operator to the enclave vendor. Enclave computing is better than nothing, but it belongs in a defense-in-depth approach; it isn't a privacy strategy in itself.
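
A rough sketch of where the onion-routing latency comes from (relay names are invented and Fernet just stands in for the real circuit crypto): the sender wraps the message once per relay, and each relay peels exactly one layer, adding a full network hop each time.

    # Toy onion routing: one encryption layer per relay, peeled hop by hop.
    from cryptography.fernet import Fernet

    relays = {name: Fernet(Fernet.generate_key())
              for name in ("entry", "middle", "exit")}

    def wrap(message: bytes) -> bytes:
        # Encrypt for the exit first, so the entry layer ends up outermost.
        for name in ("exit", "middle", "entry"):
            message = relays[name].encrypt(message)
        return message

    def route(onion: bytes) -> bytes:
        # Each hop removes only its own layer and adds its own round trip.
        for name in ("entry", "middle", "exit"):
            onion = relays[name].decrypt(onion)
        return onion

    print(route(wrap(b"hello")))  # b'hello', after three sequential hops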

And if you want to avoid fingerprinting, you lose responsive design: adapting the layout means exposing screen size, pixel ratio, and similar device details, which are exactly the signals fingerprinters collect. Those two things are directly in conflict.
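
To make the conflict concrete (the values below are made up): a responsive layout has to read device characteristics like viewport width and pixel ratio, and a tracker can hash those same characteristics into a quasi-identifier.

    # The same details a responsive layout adapts to also make a fingerprint.
    import hashlib

    device = {"viewport_width": 1440, "pixel_ratio": 2.0,
              "prefers_dark": True, "installed_fonts": 212}

    def pick_layout(d):
        # Responsive design: adapt to what the device reports about itself.
        return "mobile" if d["viewport_width"] < 768 else "desktop"

    def fingerprint(d):
        # Tracking: hash those same reports into a quasi-identifier.
        return hashlib.sha256(repr(sorted(d.items())).encode()).hexdigest()[:16]

    print(pick_layout(device), fingerprint(device))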


> If you want E2E encryption, you sacrifice good full-text search, because you have to build the index on a computer that has access to the plaintext, which means doing it on an endpoint. It’s not as nice as doing it on the server.

I'd argue it's better than on the server. Bring back local computing, please. It's more private, more performant, and more energy-efficient. The drive to centralize computing onto big server nodes is not reasonable.


I'd say it's probably the single most important question.

Engineering is about trade-offs. Understanding which trade-offs are acceptable means understanding what it is you're trying to solve, then assessing whether the problem is being looked at from the correct point of view (and, for that matter, whether it's even a technical one).

For example: Journalists within hostile nations are risking their lives every day. Because of this, they're keen on keeping their communication private and away from prying eyes.

So far, the compromises have largely involved zero-days and social engineering. The former is really due to the shaky foundations of today's software; every single "best practice" I see is just awful. Social engineering, on the other hand, is an ongoing policy/procedural/cultural problem, not a technical one. The combination of bad software and tolerant social protocols is what makes privacy difficult.

Going back to your objection: for use cases where your life is on the line, is a few seconds or even minutes of latency really a problem? Extend the question to cases where it's not life-threatening but life-altering: what trade-offs are most users willing to make then?

If you're not willing to make any trade-offs no matter what, then yes, it's difficult or even impossible; it's also a sign of a poor engineering process.


in a collective way

i.e. it's 'easy' for an individual (even a smallish, similar group). But at society scale? Difficult.


I would say that's backwards. Sure, politics is slow, but we've seen it succeed exceptionally with aerospace safety (meaning planes built to protocol and maintained to prevent crashes, not TSA security theater), which is an insanely impressive global feat.

Tech has the most impact on user privacy, but there's no financial incentive to uphold it, so the slow political stick is required.

Meanwhile users can do their best: install ad blockers, navigate the snake-oil fields of VPN vendors, estrange themselves by quitting social media, set a different search engine on all their devices (which probably still use Google/Microsoft), maybe look into Tor and Whonix, use email masks and unique usernames for every service, buy Twilio numbers for account sign-ups, get their contacts onto E2E-encrypted messengers and PGP, get a dumbphone that doesn't have GPS enabled at all times, and a laptop with camera/mic killswitches. Aside from ad/tracker blockers, it's not clear which of these steps are excessive or impractical for the layman.

So the tooling to uphold user privacy is there, but it's nuts to think the solution is for everyone to adopt better privacy-preserving habits rather than slowly killing the business model of the personalized ad industry. A great byproduct of this is that governments will see fewer surveillance tech vendors to buy from.


> it's nuts to think the solution is for everyone to adopt better privacy-preserving habits rather than slowly killing the business model of the personalized ad industry. A great byproduct of this is that governments will see fewer surveillance tech vendors to buy from.

Right to the heart. The personalized ad industry has exacerbated the privacy problem, because it's in its interest to do so.

Getting enforceable ground rules in place, in the form of laws, will be difficult because of the money involved, and also because of the sizeable number of software engineers who feel they're above politics.


Is that really about tech?

I remember when both Intel's CPU ID and Microsoft's system updates (because they sent your info to their systems) caused an uproar.

Now we don’t blink when talking about telemetry.



