> And that, folks, is why we as engineers and hackers have a moral duty to be very selective in the types of work we take on and for whom.
Computer technology work is still a wild west. Other fields which have similar ethical considerations - medicine, chemistry, engineering - have ethics courses which make it clear to graduates that their professional decisions have consequences.
There is no such standard for computer people. It still baffles me that people would voluntarily work at implementing something like PRISM - I would literally quit my job over something like this; similar considerations have popped up in the research I do for my Master's thesis - but ethics is definitely something we should talk more about.
Yet there is no shortage of doctors willing to assist in torture of prisoners at Guantanamo, assist in executions of prisoners, or conduct non-consensual and medically unnecessary procedures (inserting catheters, rectal examinations, etc.) upon demand of the police. The problem isn't lack of ethics courses but a fundamental flaw in human nature, which is that people easily forget all their personal ethics when someone in authority (whether their boss or the police) tells them to do something.
> people easily forget all their personal ethics when someone in authority (whether their boss or the police) tells them to do something.
I don't know whether you're aware of this or not, but you just described a famous experiment which essentially arrived at the same conclusion. It's worth a read for those not familiar with it.
I know of no better or more accessible review of the psychology of evil than "The Lucifer effect" (2007) by Philip Zimbardo, a psychologist who, in spite of a long and illustrious career, probably is still best known for the infamous Stanford prison study.
The companion website to the book is still up on http://www.lucifereffect.com/
Zimbardo has stated that he is still pained at the memories of the Stanford prison study, which had to be aborted prematurely because the vile characteristics it exposed in normal, healthy participants were getting out of control. He reluctantly revisited the study in detail because he saw striking parallels to the torture and humiliation in the Abu Ghraib prison under US occupation. Along the way, he also reviews much other research into the psychology of authoritarianism, obedience, dehumanization and other facets of social psychology that add up to what may very reasonably be called Evil.
I might add that organizations like the CIA are well aware of these principles, and appear to have been working on ways of weaponizing them.
It is my opinion that they are using them to corrupt individuals and whole social environments when that suits their ends. In fact, in spite of "The Lucifer Effect" being a fascinating read from beginning to end, one of the parts I found most interesting was the evidence indicating that US intelligence actively created the psychological conditions for the abuses at Abu Ghraib.
Highly recommended reading, and very relevant for much of the potential consequences of mass surveillance.
I haven't looked into it myself, but I did hear that there is an alternative interpretation of that experiment. Apparently the test subjects did as they were asked, up until the very last prompt used, which was the strongest - less a request and more a demand. As soon as the prompts stopped appealing to the test subjects' desire to further science or not fuck up the experiment, but began making demands, the test subjects almost universally balked.
This interpretation isn't really any less disturbing, since it suggests that people will commit atrocities and simultaneously believe that they are doing the right thing. Perhaps it also says something about the general public's trust, not necessarily in authority figures, but in scientists.
Reading up on this has been on my back-burner for a while; this interpretation may well have been considered and discarded, but I thought I'd throw it out there.
> people easily forget all their personal ethics when someone in authority (whether their boss or the police) tells them to do something.
Now if only we could get some prominent figures in tech with some authority to get more involved in these matters - but no, they're all a bit too comfortable in their well-paid, US-based jobs to risk anything.
The PRISM people thought they were doing right. All of the NSA - based on all the official and unofficial responses - think they are doing right. That they are helping to build a safer America. That they have to sacrifice privacy for safety. Etc., etc.
I don't think NSA people are evil. They believe that the things they do are for good.
Evil like in the movies doesn't exist. In the real world, evil is simply a lack of perspective (some would call it empathy instead).
No one ever wakes up and decides they want to be a villain. They always have some sort of logic that rationalizes their actions as being reasonable if not outright good. The more they act on that lack of perspective the greater the evil they perpetrate.
"There is no conspiracy. Nobody is in charge. It's a headless blunder operating under the illusion of a master plan."
"It's all the same machine, right? The Pentagon, multinational corporations, the police. If you do one little job, you build a widget in Saskatoon, and the next thing you know, it's two miles under the desert, the essential component of a death machine."
Quentin: But why put people in it?
Worth: Because it's here. You have to use it, or you admit that it's pointless.
Ok, if ever there was a time to bring out this particularly good industrial/ebm album, it is right now. In particular, this track, which includes samples of the above quotes:
I agree that normal people are generally not evil, but sociopaths can very much be indistinguishable from some particularly evil movie-characters.
I worked with a sociopath in the past and have studied the subject in the literature, and a sociopath by definition sees meeting their own needs as the priority, regardless of the effect on those around them.
A sociopath is willing to inflict suffering on others for even minor gains of their own. Sociopaths seek out high-power positions and often thrive in them, and those working with them often suffer as a consequence.
You know, people talk about sociopaths a lot, and I get why, but I have a problem with the definition. The idea is that a sociopath is someone who lacks empathy (or, so I've read, is able to switch off their empathy) and prioritises their own needs. The behaviour is characterised as selfish, as in putting themselves first. This doesn't really make sense to me, because there's an implicit assumption about what actually benefits the sociopath.
Taking a step back, as far as I can see, a sociopath is someone who (either by choice or by nature) prioritises certain social drives over other social drives. The drives to be empathic and obey social expectations, norms and rules get ignored. However, the drives for status, money and power are prioritised above all others. These drives are still social in nature. They don't actually convey a fundamental biological advantage.
I personally choose to aim to be a warm and caring human being, and, as a consequence of that, I have a really great relationship with my girlfriend. When we have kids, our kids will grow up in a loving supportive environment and so will have a good chance of growing up strong and well balanced. Being a sociopath would probably get me more material possessions, but I would have had to settle for an emotionally weaker partner (who I could dominate) and I would end up with messed up kids with a lower chance of success and survival.
From my own personal experience, people who fit the sociopathic archetype aren't really like evil villains. They're more like computer game addicts, fixated on goals that don't bring them happiness, and that get in the way of forming genuine connections with other human beings. I can see why people who are the victims of their behaviour characterise it as selfish, because they see the world as a competition for money, status and power and they think they are losing out. However, that competition is just a game, and the grand prize is not happiness.
I would be very curious to see a study of what you point out. Are some people exhibiting negative sociopathic traits simply because of their upbringing or social context?
Although I find these points interesting, it does seem to me like this leads us into the age-old philosophical discussion on ethics. It seems like ethics can be argued to be based on intention, effect, something else, or all of these. I have read about sociopaths who are experienced by others as good people. However, a sociopath seems to have a stunted emotional life, and I am sure this disability will always have numerous subtle negative effects on people who have an emotional relationship with them (e.g. wife, kids, friends).
That's not entirely true. Like 4% of people are sociopaths with an inability to feel empathy for others or feel guilt for their actions. There really are evil people who know what they are doing is wrong and just don't care.
You are the second person to bring up sociopaths as if they don't match the definition of evil I wrote above. The thing is, they are the very personification of lacking empathy. They believe that their own welfare is more important than anyone else's. They don't see that as evil, they see it as the way of the world.
The point is they don't rationalize or justify what they do; they simply don't care. They know it's wrong (by society's/normal people's standards), but it doesn't violate their own.
This is what I consider evil, and I think it's important to separate it from people who think what they are doing is ok. I.e. "Someone else would have done it anyways", a thief who steals from a bank because "they have insurance" or "they are rich assholes who don't deserve it", or a dictator who tries to do what he thinks is best for the country even though his policies are bad.
Indeed. But also imagine that as an engineer on any secret NSA project, being shown real information regarding atrocities that have never been publicised and knowing that you can help to make the world a better place. You would have the passion and motivations to do these things and have meaning in your life and work.
Evil in this case is an aggregation of actions by many parties, not individuals, and is rarely committed by the tool makers. However, it's always justified by the perpetrators.
imagine that as an engineer on any secret NSA project, being shown real information regarding atrocities that have never been publicised and knowing that you can help to make the world a better place.
Yep. Of course if that engineer had perspective he would have to wonder if there was more to the story than just what he was being shown. It is easy to doubt the people we already think are wrong, the hard thing is to doubt the people we agree with.
Thanks for pointing out "ambiguous". I believe it is very important to look at the situation and social pressures in addition to personal character.
I also believe you are, to a significant extent, wrong about the people at the NSA believing they're doing good, however.
Bill Binney, Jesselyn Radack and Thomas Drake all have personal experience with the matter, and here's what they had to say about it:
https://www.youtube.com/watch?v=qBp-1Br_OEs&t=1h37m59s
I would add that I think much of the material Ed Snowden released, to the extent that it shows personal expressions, to me points more to these people viewing the Intelligence organization as the new "we", and /everybody/ outside, US citizen or not, as a bit less than fully human.
That cheeky smiley on the sketches detailing how they broke Google's SSL?
"TOR Stinks" and similar flippant expressions in the documents about attacking the integrity of TOR?
And so on.
To me that isn't the look of someone who thinks they're making difficult, serious choices to protect the greater good. It's the look of someone who thinks they're better than everyone else out there and they can do whatever they want, including having "a little fun" toying with those other inferior creatures, because they think no one can touch them.
That, I think, is the general mindset we're dealing with. Allow for a fair deal of individual variation of course.
I get a very icky feeling when I read academic papers that explore how to mine network traffic for information on how to discover terrorists. And sure enough, when reading the funding information, the money for the projects came straight from the DHS. This icky feeling was there long before I heard about Prism.
You don't need an ethical guidebook to consider the implications of the work you're doing. I don't think lack of explicit instructions is an excuse. I don't want to invoke Godwin just yet...but suffice to say, a lot of bad things have been done in the name of good. And there were people in the loop who had the ability to see what was going on.
"Your enemy is never a villain in his own eyes. Keep this in mind; it may offer a way to make him your friend. If not, you can kill him without hate — and quickly." -Heinlein
Indeed. There's also a huge spectrum of ethical issues in computing, from black hat hacking to mundane issues that many may not even recognize as having an ethical dimension. For example, I turned down an offer from a major company (won't name here) to work on analytics for their advertising platform. It was a difficult decision and involved multiple factors, but part of it had to do with my discomfort regarding the ethical issues of search-engine advertising. I found it hard to garner support for my viewpoint when discussing my decision with others, and it was striking to see how many people consider search advertising to be relatively benign and not worth turning down a job over. I learned an important lesson about how much moral views/priorities can differ, even amongst people whose worldviews largely overlap.
So, the standards exist but people choose to ignore them. Which, I guess since you're not forced to be a member of either professional society as a "computer person", isn't all that unreasonable. But personal ethics should win out in these situations without someone else having to tell you that it's wrong.
I took a look at the ACM Code of Ethics, and I don't see anything that would be violated by working on PRISM. Specifically:
1.7 Respect the privacy of others.
...
User data observed during the normal duties of system operation and maintenance must be treated with strictest confidentiality, except in cases where it is evidence for the violation of law, organizational regulations, or this Code. In these cases, the nature or contents of that information must be disclosed only to proper authorities.
That specifically allows the kind of work done by PRISM.
2.7 Improve public understanding of computing and its consequences.
Computing professionals have a responsibility to share technical knowledge with the public by encouraging understanding of computing, including the impacts of computer systems and their limitations. This imperative implies an obligation to counter any false views related to computing.
3.1 Articulate social responsibilities of members of an organizational unit and encourage full acceptance of those responsibilities.
Because organizations of all kinds have impacts on the public, they must accept responsibilities to society. Organizational procedures and attitudes oriented toward quality and the welfare of society will reduce harm to members of the public, thereby serving public interest and fulfilling social responsibility. Therefore, organizational leaders must encourage full participation in meeting social responsibilities as well as quality performance.
Or number one on IEEE?
1. to accept responsibility in making decisions consistent with the safety, health, and welfare of the public, and to disclose promptly factors that might endanger the public or the environment;
Granted, what's "best for the public" is up for debate (at least according to the people doing the collecting).
Prism may have been built by people holding their noses for the paycheck, but it could also have been built by people who genuinely believe that its social benefits outweigh its costs. Even if only 0.1% of software professionals believe this, it's enough to staff the project.
I agree. Just look at who Palantir hires. I imagine it's a pretty small number of nose-holders, and more often people who think they are doing good, and people who think spying is a necessary evil so let's put fine-grained controls on it.
I'm pretty sure there's also a good number of people who actually enjoy the craft. How many geeks do you know who wouldn't enjoy the power-trip? From LOVEINT to immature Googlers, there's an abundance of literature on the joys of doing evil.
Slightly OT, but I remember feeling QUITE shocked when I found that one of my favorite bloggers, Rands (aka Michael Lopp), was some sort of honcho over at Palantir. He's written some good stuff on management of 'geeks' and high-performance people in general, it was a sad day when I found out about his employer and removed him from my RSS feed.
While in school for Computer Engineering (2002-2007 or so), I actually had two ethics classes, one specifically for engineering and one for information ethics. I don't know how standard that is, but all the engineering disciplines had to take the engineering one, and computer science-y students had to take info ethics.
The extent of computer ethics to me was "here is Therac-25, you have an ethical responsibility to do your job well; here are various laws, don't break them." There was never talk of "consider the ramifications of your project, assuming it works".
I took a couple engineering ethics classes at the University of Virginia (class of '06), and found them valuable and thought-provoking. The curriculum was called "Science, Technology, and Society". More here: http://en.wikipedia.org/wiki/Science,_technology_and_society
Dayton, Ohio. We talked about a lot of interesting topics, like whistleblowing (particularly relevant nowadays) and what sort of ethical obligations you have, in addition to stuff like "don't cut corners, don't be lazy", the normal engineering things.
I graduated with a degree in Computer Science in 1997, and we also had to take an ethics class. However, it was more along the lines of not knowingly creating software or hardware that would cause harm. To be honest, the majority of the tests we had were common sense... I didn't even have to look over the material.
That does not make our decisions inconsequential. And yes, we should talk a lot more about ethics in compsci courses.
As for people who work on things like PRISM, it often happens in small steps, one ethical shade of gray at a time. By the time you get to a PRISM, your worldview is so completely distorted that you think it's a natural thing to do.
I'll never forget when I was taking CS in high school and we were forced out of the lab for a week and had class in the library. Our teacher spent the week teaching computer programming ethics. We talked about a lot of stuff, as you can probably imagine, but I was glad to have someone show me what he believed a good path to take with development was. In the end we all had to write a paper that he graded pretty damn aggressively for a CS class. That's the only time in all my CS education that computer programming ethics came up in this context. Definitely agree it deserves more conversation.
Is it possible that the development of these technologies is compartmentalized? That individual engineers only know the part they are supposed to work on, but don't really understand what larger project their slice will be a part of? It would probably be hard to pull this development methodology off.