This is genius in that it starts by seeming to be a portentous talk about the evil other and turns the tables to show that the enemy may be us: How do we let ourselves be robots? Do we really like social control? Why do we expect the technology that got us into these problems to get us out of them?
I don't think we are really in a terrible place. In so many ways the world is better than it was. But we are kind of blind to the problems we are creating. Read this.
I don't think we're blind to the problems being created. It's more that there are no obvious solutions beyond stasis, which is no real solution as it just fixes us with today's set of problems instead of tomorrow's.
I like Maciej's talks, they're always entertaining, but at the end of them I always feel not quite satisfied. He is good at presenting well known problems in amusing ways but rarely identifies solutions beyond the hopelessly vague (like "we should all think about stuff more").
He is also utterly resistant to giving large companies any credit at all. Google and Facebook have moved the needle on strong cryptography and anti-surveillance tools more than any other groups, but in these talks they're always the bad guys because they make money through advertising instead of credit cards (which are legally linked to your real name and address: far far worse for privacy than ads).
A big part of this talk boils down to this: the US Government can order American companies to undo any protections they themselves create. But this is hardly worth commenting on. It is a political problem, and the only solution is for political candidates to appear who manage to appeal to large numbers of voters while also being strong on civil rights and shutting down surveillance systems. Neither Clinton nor Trump seems very likely to do that, but nor did any other candidates. The issue matters to people, but it matters a lot less than other issues.
There are two points I would dispute. First, I do give large tech companies (and the NSA!) credit for being internally heterogeneous, and doing good things along with bad.
Second, it would be ridiculous to give Facebook or Google credit for making anti-surveillance tools. That would be like praising tobacco companies for inventing a better cigarette filter.
My beef is that you sort of point the finger of blame at big companies for the concentration of data in the cloud, even though that appears to be a natural evolution and what users want. It's not like Google or Facebook engaged in an evil master plan to force users to give them lots of data. Users willingly did this because they didn't want to manage that data themselves, and if Google/Facebook hadn't offered them that service they'd simply have gone to another company that did.
Given that this is the way technology has evolved independent of any one firm, is it really fair to compare them to tobacco companies?
Raising awareness is important - and urgent - work. These problems are not "well known" outside of the HN/tech bubble.
> in amusing ways
Also important, as it is much more memorable than the exhaustive clinical delineation offered by (e.g.) ProPublica.
> but rarely identifies solutions beyond the hopelessly vague
If you don't have a solution to what you see as a critical problem, do you wallow in despair? Or do you raise awareness of the problem in the hope that, individually or collectively, others will devise solutions?
I yield to no HN commenter in sympathy for the Google security team, as my comment history will surely indicate, but it's possible to take that sympathy too far. Yes, Google's security team has done important things for the privacy of all Internet users. However, for its own users, it has also created the largest subpoenable body of private information in the history of the world.
Google's security team has its hands full just shipping a browser that doesn't unlock the computers of all its users for anyone who can write a heap overflow exploit (which is why you should probably prefer Chrome over other browsers). They are not in fact moving the needle on the problem of protecting their own data from hostile governments.
There's no "we" about it. Broadly, there are people who like having social control, people who like feeling that others are under control, and people who don't like control.
To be technical, that'd be the second group if it was phrased as "people who like feeling that others are in control", using the word "in" instead of "under".
I believe there is another group: People who like control, when everyone involved understands and agrees with it.
It can be very hard, and it involves a consistent questioning of authority. Including the authority of ourselves, and the authority we hold over others.
We can have modern medicine, but we have to trust the people researching it.
We can have modern robots, but we have to trust the people building them.
Fear is a great way to prevent people from questioning. But if we learn to question our fear itself, it can be useful to have been afraid.