Anonymous Google Employees Voted Arms Control Persons of the Year (armscontrol.org)
178 points by Nemant on Jan 10, 2019 | 143 comments



Unfortunately, the divide between Silicon Valley and the Pentagon will be bad for national security in the long run. Recent news cycles keep revealing yet another piece of military equipment suffering from software weaknesses. Technology does not stand still; if Russia and China have more of their most talented engineers willing to work on national projects than we do, that will inevitably lead to a competence gap.

Here is an interesting article about the cultural divide and how it can be mediated:

https://www.defenseone.com/ideas/2018/12/divide-between-sili...


The article makes it feel like these changes happened because fewer people are enlisting or know people who are enlisting. I think that’s just another symptom of how some military programs lost a lot of legitimacy in the minds of some of my generation who were coming of age around the war on terror. It’s hard to feel like the good guys in a lot of these issues, like drone warfare against non-combatants, and so the military was never a career I considered as a result. I suspect I am not alone.

Edit: I’ll add, my generation clearly remembers 9/11 and the response to it. I think it’s not a lack of awareness of the current mission but _because_ of awareness of parts of it that many people write off the military.

Disclaimer: I am a Google engineer, but with no work relationship to their AI efforts and my opinions are my own.


I think the media has a lot to do with that.

They run stories about how the military bombs some civilians while fighting jihadists, but not about how those same jihadists routinely round up and torture children, kill scores of locals, etc.

Outside of Iraq, which fuck Bush, can you name something about the "War on Terror" you think we'd be better off not doing?

I think it's a really shallow view to say it's better to not bomb a few civilians... then leave them to be preyed on by violent locals instead.

I'm a millennial, and didn't enlist at 18 because I didn't want to go to Iraq (and still don't support starting that war), but I'd have gone to Afghanistan, and as I get older, I find myself much more willing to support the military.

I suspect I'm not alone in that, either.


You seem to be implying that it's perfectly fine for the US to bomb a family home if a terrorist is in there, accepting civilian casualties, because they might harm the population?

Who gets to make that determination? A random drone pilot? An officer?

Also please don't pretend any of the US military engagements since WW2 had any other purpose than serving national interest.


> Who gets to make that determination?

We do. Because we can. That is how the world works. If you don't make the decision, it gets made for you in the other direction.

> Also please don't pretend any of the US military engagements since WW2 had any other purpose than serving national interest.

This is a pretty childish sentiment. Reasons for war are complex. The world is actually a complicated place, and admits multiple concurrent motivations.


> Reasons for war are complex.

Yes they are. I only said that the motivation always was national self-interest, not benevolence.

> We do. Because we can.

I think that exhausted my tolerance for further discussion with you.


> Yes they are. I only said that the motivation always was national self-interest, not benevolence.

I'm not even sure what that is supposed to mean. Nobody has ever fought a war for 'benevolence'. It's a silly concept. If you know anything of history, you'd also know that WWII was not fought for 'benevolence'.


He's a millennial. All of us Millennials know that it's 2019, and there isn't a reason for war in 2019, anywhere...


You just claimed that if we don't get them they'll get us, and then called someone childish.


> You just claimed that if we don't get them they'll get us

No I didn't. Read what I said again.


I read this - "If you don't make the decision, it gets made for you in the other direction" - and I understand that just as GP wrote.


And how exactly does that mean "if we don't get them, they'll get us"? What it means is that not acting is as much an action as doing so.


Not acting is always as much an action as doing so, if you consider actions abstractly. In this case I don't think it deserves stating.


It did in the context of what I was replying to, which is why I stated it.


So we haven't moved an inch past the old lie that "might makes right", then? How utterly depressing.


> So we haven't moved an inch past the old lie that "might makes right", then? How utterly depressing.

Might makes right to decide what is right. That is now and always will be the case.


I guess that depends on how you define "right". Might lets you impose your will on others. Being successful in doing so does not speak to whether or not your will is right.


Indeed. Being successful doesn't mean you were right to do what you did. However, having the ability to impose your will and choosing not to is a moral choice, just as much as choosing to impose it is. If we have the power to stop terrible things from happening in other parts of the world, at some point, it is our duty to. This is complicated and wrapped up in issues of sovereignty, of course.


I agree. I'm not a pacifist, and I certainly think there are situations where violently attacking people is, on the whole, the right thing to do.

I'm merely objecting to the notion that being the toughest kid on the block implies it's always right to use force. Might does not equal right. Sufficient might only equals military victory, nothing more.


> I'm merely objecting to the notion that being the toughest kid on the block implies it's always right to use force. Might does not equal right. Sufficient might only equals military victory, nothing more

Absolutely. I did not mean to imply that in any way. What I meant was that might gives you the power, and therefore responsibility, to make moral decisions about when and where to impose your will on others when they are acting 'sufficiently immorally', however you choose to define that.


> implies it's always right to use force.

Ever notice that people arguing for war are always anxious that the opportunity will slip away if we don't attack _them_ this instant?

Ever also notice that _them_ look pretty much like a lot of erstwhile allies that we aren't at loggerheads with?


> We do. Because we can. That is how the world works.

That's all well and good, but I don't want to hear a lot of whining from your corner when the next 9/11 happens.


Indeed. While I understand the shock in the USA at 9/11, I also understand where opinions that "America deserved it" had their merit.


> any other purpose than serving national interest

Spreading democratic and egalitarian values is in the national interest.


As much as I'm extremely reluctant to endorse the actions of the US government and military, I don't think the right answer in cases of moral uncertainty is "I guess we'll just leave things as they are and hope they turn out for the best".


> You seem to be implying that it's perfectly fine for the US to bomb a family home if a terrorist is in there, accepting civilian casualties, because they might harm the population?

Yes, just like you should throw the switch on the trolley car to kill one person instead of letting it kill five.

Other people don't agree.


The US's military engagement in WW2 and prior was also primarily to serve national interest.


Part of it is also how we teach history. US schools have a tendency to only focus on the wars in which we are clearly the "good guys" until you get to the post-WWII wars, in which things become much more clouded morally. This gives off the impression that the US and its military have sort of lost their way in the last half century or so. The truth is that all war is generally bad and only our perception of it has changed. Look through the Wikipedia page of US conflicts and you will see a long history of wars in which our actions were morally questionable that we are never taught about in school. And even our "heroic wars" like WWII include plenty of war crimes at the hands of the US.


You’d be surprised then by how history is taught in other places. Even when people look at themselves introspectively, most people will see the good over the bad.

I don’t think we’re much worse than any other major actor in this regard. I have an inkling we do a fair bit better than most at castigating ourselves.


Bad wars are a standard part of history education, from the Trail of Tears to Jim Crow to the Spanish-American war.

People are willfully ignorant.


Or we could stop arming our proxy armies who are out there killing civilians on the daily.


> can you name something about the "War on Terror" you think we'd be better off not doing?

Blowing up people's weddings via drone? Corroding civil liberties? Equipping small-town police forces with military equipment and mentalities?

I could go on. There is a lot more to dislike than just the invasion and occupation of Iraq.


Yes, as a country we have to stop playing into the fear that is being peddled to us.

Terrorism is not about fostering abuse of constitutional rights and responsibilities, it is about fostering fear of our constitutional rights and responsibilities.


Almost everything about the war on terror I think we'd be better off not doing. I supported going into Afghanistan, but nation building/occupation was a non-starter. We started the civil war in Syria (the CIA pushed the Arab Spring), which has already cost half a million lives and will likely exceed a million. So that was a terrible idea. We're still selling weapons to Saudi Arabia so they can commit genocide in Yemen, where there are now open-air slave markets.

It's really hard to name a single good thing to come out of the war on terror.


> I think that’s just another symptom of how some military programs lost a lot of legitimacy in the minds of some of my generation who were coming of age around the war on terror.

It's not just your generation. I'm in my 50s and this opinion isn't rare in my age group either.


I'm in my 50s. All my friends have similar feelings to yours. We have endless wars, bombing, drone strikes with no goal other than vaguely stopping terrorism. We are in Afghanistan, Iraq, Syria, Africa now, Pakistan. We kill militants and/or terrorists that wish us harm, and we annually bomb a number of weddings and kill a lot of innocents.

We don't have a strategy or goal or even a clear enemy. We just do it because we are powerful. Of course there are people that want to kill us and blow up our airliners. But it's not clear we are doing any good.

We have to set actual goals, not just being there and bombing targets of opportunity, saying sorry when it goes wrong. What's our goal? We don't have one.


You might be just too young to remember it in real time, depending on where you are in your 50s, but people slightly older than you got really jaded with the US military due to the Vietnam War around the time you were born.


Yes, you're right. I was a kid (in a military family) at that time, but not so young that I don't remember it!


> Unfortunately, the divide between Silicon Valley and the Pentagon will be bad for national security in the long run

I work for Google, opinions are my own.

I agree. However, I think in the case of Google and most other companies, the issue is a little more... awkward? Besides the fact that we have offices in other countries, we also employ many people in the US from other countries. Obviously those people may have very different feelings about working on a project for US national security. I don't think I've been on a single team at Google which did not have people from abroad.


I'm sure this isn't the case for every project Google does with the US government, but do foreigners usually get placed on defense projects? Security clearances are generally limited to Americans, and exceptions are only given out on a case-by-case basis.


> [at Google] we also employ many people in the US from other countries. Obviously those people may have very different feelings about working on a project for US national security. I don't think I've been on a single team at Google which did not have people from abroad.

I honestly think the US military would have more problems with having foreign nationals working on a military project than many of the foreign nationals themselves. I think it's difficult to get a security clearance if you're not a citizen, and it may even be hard if you're a dual citizen.

I would think that naturalized US citizens would have the same interest in US national security as native-born Americans. It's a little racist to suggest that they don't, and it would be a little disingenuous for them not to.


A hostile foreign power has access to millions or billions of its own citizens, but only a tiny number of agents who are US citizens, because acquiring such agents is comparatively difficult and risky for both parties.

Given the substantial financial resources of a state actor, getting one of its own people to naturalized status seems relatively trivial.

It may simultaneously be true that, on average, natural-born and naturalized citizens are equally loyal, moral, and trustworthy, and that naturalized citizens in sensitive areas are a risk, because they make it easier for foreign powers to infiltrate you given the comparative difficulty of acquiring the services of a natural-born citizen.


You're thinking of xenophobia, not racism.


Transparent de-escalation is the solution, not the age-old rhetoric of 'staying ahead in the arms-race'.

_Most_ people regardless of country of residence or origin want a peaceful and fulfilled, free life. You can never please everyone but arguing from basis of 'fear of the other' is a bit disingenuous and leads to these literal arms races.

That's not to say defense spending isn't important, but it has to be stressed with the word 'defense' and with an inherent bias towards security and safety. (I don't believe technology products are inherently neutral; as a result of design they are better or worse suited towards particular usages, and that needs focus too)


The big fish is always going to have challengers from somewhere. You may be right about the average case but the world is quite large.


(also, fwiw - maybe we're not collectively ready for this yet, and sure, waiting until 'the other' is in fact ready to de-escalate themselves too is important in that kind of scenario)


> _Most_ people regardless of country of residence or origin want a peaceful and fulfilled, free life. You can never please everyone but arguing from basis of 'fear of the other' is a bit disingenuous and leads to these literal arms races.

This is all fine and dandy, but some of the most powerful militaries in the world are run by authoritarians who optimize for their own interests rather than what's good for their population. Russia's people generally gain nothing from Putin's belligerence, yet the belligerence happens. The CCP rolled the tanks on its own people; do you really think they'll hesitate to roll them on foreigners if it suits their interests?

I don't worry about the aggregate opinion of Russians or Chinese; I worry about Putin and the CCP.


You're potentially being a bit unfair to militaries here; good standing forces have separation from their governance, their own chains of command, and ideally ways to ignore and report inappropriate orders (especially ones which violate the rules of engagement and the laws of war).

Yes individual cases will always overstep those, often due to extreme power dynamics, but those should be taken as lessons to learn from rather than taught as the normal order of things.


The U.S. arms industry is huge, and so is defense spending. If the DoD wants better talent, then budget accordingly and pay high salaries and fix up their recruitment messages and surely some smart CS grads will go to them. Why should our military be dependent upon SV civilian companies that have workers who don’t want to work on military applications? Then just hire the ones who do, fund new military-oriented startups. This is a free market and the Pentagon needs to pay to play.


They are paying to play - that's how outsourcing works. The high salary of Google devs is priced into the contracts the Pentagon pays. So it's not an issue of salary level but of their inability to build up an internal structure (to which their public image among the SV crowd contributes), and that is what causes them problems. Can this be solved? Maybe, but not simply by raising salaries.


True, developing sustainable institutions with vision and effective knowledge transfer/accumulation mechanisms requires more than cash. It requires cultural attraction and perhaps some measure of prestige. NASA and the major military branches have this. Some don't. Those that don't revert to the lowest common denominator for bureaucracies - a short-sighted transactional entity that shuffles around one-off projects.


IMO the big problem is that many of the big SV companies build end-user products. And to protect that they shouldn't want to get into arms contracts.


That's exactly the scenario we'll see over the next ~20 years, as parts of Silicon Valley refuse to go along with the Pentagon.

The military industrial complex will put tens of billions of dollars into backing new start-ups that will cooperate over that time. Those companies will receive favoritism from the state over time, to Google's detriment. That will include technology transfer and regulatory favoritism (AI will be a regulated industry 10-15 years from now, and will be regulated forever thereafter). For every worker in Silicon Valley that refuses to do military work, the Pentagon will find 100 outside of Silicon Valley that are more than happy to do so. Numerous large companies up and down the tech chain will cooperate as well, including Microsoft, Amazon, Intel, IBM, Oracle, HP, Dell, Cisco, TI, Micron, nVidia, and dozens of others. The contracts they receive over time will be a lucrative part of their business (it already is in many cases).


You do realize Larry and Sergey's initial funding to create a search engine came from DARPA, a military program, right?


> Recent news cycles keep revealing yet another piece of military equipment suffering from software weaknesses

It's difficult to say this isn't a net moral good, as things currently stand. Competent tools put to immoral ends aren't somehow more moral than incompetent tools. The only fix is to change the ends.


That only holds true if you consider the military rivals of the United States to be greater forces for good in the world than the US.

I would consider that an indefensible position.


The stance held by Google engineers seems to be more subtle than that, although it is fairly clear: a weapon produced by an American company may go either to kill a civilian in a pointless war in a country nobody has ever heard of (high-probability), or to kill a Chinese soldier in a future war for the survival of our country (low-probability). The US government could address this by improving the mix ratio between pointless wars that benefit nobody and wars for the survival of our country, so that a weapon would be more likely to end up used for the latter than the former.


> The US government could address this

Yes, the US government could. Interestingly, the US military couldn't. Remember to vote, kids!


Sure, but it is defensible that a greater balance of power might be a net good for the world. (Also the devil is in defining net good). And it is also quite defensible to restrict hard power to ethical channels. There's a reason we don't use chemical weapons even though they would definitely give us an edge: The moral harm outweighs the tactical benefit. (Until it doesn't, as we saw in Syria.)


I don't want a balance of power. I want the most moral actor (however flawed it may be) to have a technical and tactical advantage over those that would behave immorally.

The world didn't get together and unanimously decide to stop using chemical weapons. The few strongest forces in the world decided that no one was to use chemical weapons, and by that threat of force the world became a better place.


Exactly, I don't see how setting up another cold war for the sake of proportionality will be good for the world. In the cold war nations still engaged in brutal proxy wars, and everyone lived in fear of a massive hot war. Doesn't seem preferable or more stable.


Flawed? Is that it? "Most moral" according to US citizens is a pretty significant blind spot.


Who watches the watcher?


Militaries may be a necessary evil, but in no way a force for good.


I think you have it twisted around backwards. I believe goodness itself includes an obligation of strength.


I think self-preservation is good. Am I wrong? Do we not have the right to fight for our existence?


> It's difficult to say this isn't a net moral good, as things currently stand.

I disagree but am open to being persuaded otherwise.

I don't consider national defense to be an immoral goal. I believe it's unrealistic for there to exist a world without weapons. That being the case, I believe it's important for us to constantly improve our military capability otherwise we will be left behind.


>I don't consider national defense to be an immoral goal

My personal opinion is that what the United States has been doing for over 20 years has not been national defense


I agree, but I consider improving military capability to be orthogonal to decision-making about what the military does. If we wait until the military does only that with which we agree, then it may already be too late.


You raise that in an interesting way. I'm all for national defense. But we aren't using our military for national defense. I'm all for having military strength such that we don't have to have wars with Russia and China, powerful, belligerent countries. Strength is a deterrent.

But the capabilities being built, like drone bombing are primarily used in our endless wars, and used by our 'allies' like Saudi Arabia in places like Yemen.

Have you read the book "Forever Peace" by the author of the more famous "Forever War", Joe Haldeman? F.P. depicts a world with an endless battle against people who could stand in for the Islamic terrorists we fight in our endless wars today. In this world they have remotely controlled robots that can just walk up to people and kill them. It really reminds me of what technology could make wars look like in the future.


> I believe it's important for us to constantly improve our military capability otherwise we will be left behind.

Looking at the amount you spend on the military, it would be ridiculous if you weren't 100x further ahead of any other nation already.

As an Australian, I'm personally happy with the US being ahead over the alternatives - but if I was a US citizen I would be seriously questioning how far ahead you need to be if it's really about "defense".


That's well and good until China is invading Taiwan and Russia is invading Ukraine and there is no deterrence from the west to make them think twice.


But we do have enough strength for deterrence. Our military budget is equal to the next 8 biggest countries' budgets combined. The three biggest air forces in the world are, in order, the US Air Force, the US Navy (its aircraft, etc.), and the US Army. Each is bigger than any other country's air force.


I don't understand why that's necessarily such a bad thing. Should we be worried that other countries have more strength in this area than we do? Plenty of other countries are not the top in this category and they seem to get along just fine. I'm talking specifically about weaponry, not really defense infrastructure/innovation generally, which at least has some chance of application outside the military.


Google is actively helping the Chinese military by opening their AI center in China. Every bit of the IP they develop in China will be in the hands of the government and thus weaponized by the Chinese military. Yet I haven't heard a peep of protest from the same people who criticized Project Maven.

https://taskandpurpose.com/google-china-artificial-intellige...

https://www.cnbc.com/2017/12/13/alphabets-google-opens-china...


That's a basic AI research center, and it's hiring people away from other projects they could be doing in China.


SV devs don't contribute significant volumes of code to military equipment. They're mostly being used to build the panopticon. The incumbent contractors have plenty of staff who will do the work others refuse.


You're making the classic mistake of generalizing from your anecdotal experience. Lockheed, Boeing, Raytheon, Palantir, Thales, SSL, Red Hat (IBM) and more definitely make a good living off the MIC in the Bay Area. USARL, DARPA and the national labs also do a bit of work in the Valley. Heck, Stanford and SRI grew out of the MIC.


You assume military power has a significant role to play in the future.

Building secure civilian products, countering corporate espionage, and building products used worldwide all probably play a bigger part.

As does diplomacy. Disagreements between the big powers haven't been solved with military power for a long time... why would that change?


It could change if belligerent, idiotic leaders get in charge. Remember back in WW1, at every step of the way leaders of multiple countries thought "surely the other side won't go to that step, they'll stop if we just increase the pressure a bit". Tomorrow a Chinese pilot could bomb or crash into a US ship, or try to sink a carrier near Taiwan. That could start the war. Someone in Russian-occupied Ukraine could take it upon themselves to move on to the rest of the country, or bomb Poland. Or the bomber flights the US and Russia send to each other's borders could crash into the other country's fighters.

We have acting leaders at many levels of the military, and a president who talks the big talk but probably hasn't ever dealt with a real crisis where the other side won't fold or de-escalate. He's got no reasonable advisers left.


If I were an oligarch in a non-U.S.-allied industrial nation, would I not love having a comment like this sitting where it is, at the top of the thread? I get to point to American tech industry culture and say "they are trying to wipe us out!"


A lot of Google employees supported Project Maven, but with a liberally controlled company and a witch-hunt mentality, most of these people chose to keep quiet.


That's probably true, but the blame belongs squarely on the government.

Edit: Not clear why this is getting so many downvotes. If the government is not trustworthy, people won't want to cooperate with the government. Demanding that they put aside their ethics for the sake of patriotism/national security/etc., while ignoring their legitimate concerns, is pure propaganda.


> The runners-up in the vote for the 2018 Arms Control Persons of the Year were the founders and co-chairs of the International Gender Champions Disarmament Impact Group... The impact group developed specific aims for expanding knowledge about the importance of gender issues and practical actions for bringing gendered perspectives into disarmament discussions.

I find it just a little bit hard to take them seriously if they feel this is the second most significant group in arms control last year.


The world must be a wonderfully prosperous place if these are the remaining sorts of problems we have chosen to organize around.


No one is expected to take an "Arms Control Persons of the Year" award seriously as an objective ranking. It's about raising awareness and driving the discussion.


I don't understand why you don't think gendered perspectives regarding disarmament discussions are useful. Feminism has a long and entrenched history within peace and conflict studies and international relations, and its insights are often useful, particularly with respect to things like disarmament and post-conflict peacekeeping.


I'll bite. Got any concrete examples of this in actual application? To say it sounds like a parody of the current mess of 'gender studies' would be an understatement.


There is a lot of cynicism here in response to a strictly positive social effort. If we (nationally) are actually behind in some kind of "AI" drone-targeting arms race, I suspect a great many of these objectors would be willing to do work to, say, disable incoming drones, or develop systems to scramble/mislead automated targeting systems.

I feel that it is the responsibility of every moral and conscious agent to oppose dark patterns and negative trends within their place of work whenever possible, and while it is easy (and apropos) to accuse Google of perpetrating malicious patterns, I think we ought to laud and publicly encourage internal currents that oppose that trend, not smirk at them or deride them for not doing enough.


Isn't it inevitable that AI will be used to improve targeting technology for weaponry? Am I wrong in assuming that if U.S. doesn't develop this tech some other country will?


It’s not inevitable if we aren’t willing to let it happen. It used to be “inevitable” that poison gas, napalm, cluster mines, etc. were the future of war, but as a species we decided we weren’t okay with the effects of those technologies, and we’ve subsequently been reasonably successful at not using them.


I mean, all of those things are still being used, though....


Don't let the perfect stand in the way of the good. If the treaties help prevent 90% of the usage, they seem worth having.


That doesn't work in the real world where the remaining 10% of countries are willing to use their superior military technology to dominate others.

The only way for a country to be truly free is to be either equal in power to other countries - or barring that, the most powerful country on the block. Since the former is impossible due to the nature of reality, all countries aim for the latter.


In practice, that top 10% is generally the one holding itself accountable to arms treaties; it's normally smaller groups that go for the cheap and horrible.


Imagine if all engineers had boycotted weapons development due to the heinous nature of napalm firebombing in WW2. US military competency stagnates at the stage of napalm. Would that actually change the behavior of the government's military, or would they continue to use napalm in Vietnam and beyond? As history has borne out, I think the latter is the case. The gov will assume it is the best tool so far to do the job, and preferable to the prior alternative of mass cannon bombardment and infantry deployment.

Development of smart munitions, better sensors/intel, and targeting precision has reduced the scale of military operations, entrenchment, and collateral damage. I think that was a form of technological disruption that was overall for the better.

There's a valid counter-argument that making war smaller and easier will lubricate the willingness for politicians (and the public) to enter into war, or maintain a state of pseudo-war. That is certainly a drawback.


This is the arms manufacturer's argument. Sure, smarter, more efficient killing tools may seem like a benefit, but it's always framed as an "us vs. them" argument.

What happens when you're the "them" at the receiving end of these smart weapons? Weapons tech is a pandora's box, once opened, everyone has it and you can't close it.


The progression was always towards more targeted weapons, because they have less collateral damage.


And a much higher chance of actually destroying the target. The US dropped more tonnage of bombs on Vietnam than in the entirety of WWII, and yet random destruction is rather ineffective.


> Am I wrong in assuming that if U.S. doesn't develop this tech some other country will?

You are not wrong. Alibaba, Baidu, etc. work heavily with the Chinese government in this area.


Is that the tired old "if you won't help build tools to butcher people, the evil Chinese will" argument?

This zero-sum, jingoistic outlook on the world has caused the most damaging and bloody wars in history. World policy is not a zero-sum game, and every human does not have to help kill other humans for the world to be at peace.


> "if you won't help build tools to butcher people, evil chinese will?"

It's not that they will. It's that they are.

When someone who most certainly does not have your best interests at heart is building technology capable of crippling or dominating you, what response do you suggest? Passivity?

The only recourse here is to either convince them to step down or to - at the very least - match them. Anything else puts you at a disadvantage and puts your citizens in danger.


> It's not that they will. It's that they are.

[Citation needed]

> When someone who most certainly does not have your best interests at heart is building technology capable of crippling or dominating you, what response do you suggest? Passivity?

> The only recourse here is to either convince them to step down or to - at the very least - match them. Anything else puts you at a disadvantage and puts your citizens in danger.

You are presenting a false dichotomy where the only two options are "do nothing" and "invest all possible resources available to the country into killing other people". A casual look at a history book would quickly show you that escalating arms races are not beneficial in the long run - even to the country winning them, since they degenerate into building murder tools at the expense of their own citizens. Much like the modern US, which is incapable of providing healthcare to its citizens.

That does not mean that no resources should be put into defense, but every civilian IS NOT morally obligated to help kill other people.


Do you really need a citation for what the Chinese government is up to with classified military technology? (As if the Chinese government willingly publishes this info)

I would argue that the very raw destructive power that military weaponry has, in particular with regards to nuclear capabilities, has actually reduced the odds of another world war.


> [Citation needed]

The Chinese government has imprisoned 1 million+ members of an ethnic minority. I'm sure you've heard of this; if not, you need to do some research...

> You are presenting a false dichotomy where the only two options are "do nothing" and "invest all possible resources available to the country into killing other people".

No one is presenting that.

> A casual look at a history book would quickly show you that escalating arms races are not beneficial in the long run

[Citation needed]

From my perspective, it sure did help, as nuclear weapons resulted in the inability for large countries to go to physical wars (for now, at least).


> World policy is not a zero sum game

The Pentagon does not deal in World Policy, it deals in martial policy. As General Mattis said, "If you don't fund the State Department fully, then I need to buy more ammunition ultimately".


And as we've seen, it'll eat all the money you can throw at it and even more, even if it means no healthcare, education, or basic infrastructure in the country it's supposed to defend. The military's appetite for money is limitless.


> and even more

I find it hard to believe the military spends more than Congress gives it. Military doesn't decide the budget either. Of course a lot of pork is reps spending money in their districts to "create jobs", even when the military doesn't want what they're making [0]. Clearly the Pentagon isn't in charge of that spending, or they wouldn't be spending money on what they don't want.

[0] https://www.military.com/daily-news/2014/12/18/congress-agai...


>Military doesn't decide the budget either.

This relies on the (wrong) assumption that military leadership doesn't hold sway in the government. Historically, when the Pentagon asks for more funds, the USG provides.


No, there is a difference: whether you aid the US or not, the US will butcher its enemies. Whether or not you aid the US, companies are helping China butcher its enemies... but who are China's enemies? The US is one of them.


Face recognition is categorically inevitable for small targeting systems.

The danger with AI in particular is that eventually the technology gets trivially accessible. You don't need to hire specialists, just wait a bit and download the library and get the how-to book from Amazon or Barnes and Noble.
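
To illustrate how low the bar already is, here is a minimal sketch (assuming the opencv-python package and a hypothetical image file; nothing resembling what an actual targeting system would use) that detects faces with a classifier that ships with the library:

    import cv2

    # Haar cascade face detector bundled with opencv-python.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    # "crowd.jpg" is a hypothetical input image.
    gray = cv2.cvtColor(cv2.imread("crowd.jpg"), cv2.COLOR_BGR2GRAY)

    # detectMultiScale returns (x, y, w, h) boxes for each detected face.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"detected {len(faces)} faces")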


That’s a rather common argument. The glib answer is a reference to the Eichmann trial. To (slightly) elaborate: in such a scenario it would at least be only the second (or third, fourth...) best team working on the technology.

There are also instances of successfully preventing technology from becoming commonly used in warfare. Nuclear tech is one, but not a perfect fit because it requires huge investments. But chemical weapons are quite comparable to AI, in that any chemistry grad could create such weapons, yet they have been used far less often than mere feasibility would suggest.


If AI can reduce friendly-fire or weapons striking the wrong target, isn't that saving lives where both of those examples currently kill an extraordinary number of innocent people?


>The second runner-up was South Korean president Moon Jae-in. He was nominated for promoting improved Inter-Korean relations and a renewed dialogue between Washington and Pyongyang on denuclearization and peace that has led to a number of significant steps to decrease tensions, including a North Korean moratorium on long-range missile and nuclear testing, a halt to U.S.-South Korean military exercises, and steps to avoid military incidents along the demilitarized zone that divides North Korea and South Korea.

IMO this is the only one that matters in the list. The Pentagon will get another US-based tech company to aid it in the pursuit of new weapons because there is just too much money to ignore. Meanwhile, peace talks between South and North Korea actually reduce the chance of a thermonuclear war between two nations. We can only hope that other nations follow suit.


I think younger engineers' morality towards the military distinguishes between technology for great-power warfare, and technology for bombing weddings in Yemen. Aside from some real outliers, nobody of any age wants to lose a war with China. However, it's the "wars of choice" that are problematic.


Why are you talking about a war with China? Is USA preparing to attack China in the near future? There has been a very concerning rise of targeted anti-Chinese articles on this site, is that a preparation for war?


I'm cautiously optimistic that increased AI use in the military is going to be a good thing for everyone.

Realistically we're not going to solve the problem of nation states wanting to protect their interests halfway across the world, which'll among other things mean killing some "combatants" from a drone.

But we can hope to do things like improve targeting and reduce civilian or collateral casualties. Right now the "AI" is some group of 20-somethings sitting behind a computer in Nevada; what if we trained an AI instead, and could e.g. hold legislative audits on what that software was configured to target?


> I'm cautiously optimistic that increased AI use in the military is going to be a good thing for everyone.

> what if we trained an AI instead, and could e.g. hold legislative audits on what that software was configured to target?

You are much more optimistic than me.

If those audits ever happened, they would be held in secret, and have very different goals than most people would consider moral.


Perhaps for political purposes the businesses which bid on government contracts ought not also be consumer-facing ones.


They're perfectly ok with scanning emails and providing that data to hostile countries and getting someone shot in the face, but if the person is shot in the face with an AI-powered weapon, they're not?

Wow, the faux ethical reach-around they're giving each other over this is comical.


I wonder how many of these employees are actually US citizens.


Beware the hubris of those surrounding Peter Thiel, who are war hawks and backers of pro-military startups of all sorts... including Palantir, which is used to target and eliminate dissidents. Even more troubling are a few of the Ukrainian and Russian "entrepreneurs" who come to the Valley as flag-waving capitalists; it's difficult to ascertain their actual allegiances because the Valley lets people pop up out of nowhere without references, and hands them money and influence.


The title confused me to no end.


But yet all 4,000 of them still work for the largest surveillance corporation that has ever existed. At least now they think they have the moral high ground.


They track us in real-time down to a meter, predict when and where we will travel, know who we interact with, know our DNA[1], know what food we like, when we are sick, what our political leanings are, can predict what our children will like / dislike and probably can predict when we'll die...

If any of that isn't completely true yet, it will be.

[1] https://www.scientificamerican.com/article/23andme-is-terrif...


That's true and it's terrible. But they are not building actual weapons that kill people. That's worth something.


Being in China and calling Xi Jinping Winnie the Pooh over gmail would be a good test of this, no?


They are building the infrastructure for the people who want to use weapons.


It occurs to me that attacking people who are probably on your side, and are in a position to influence Google's policies, is probably not the most effective way of enacting change.

Unless your real argument is "Everyone's evil so don't bother trying".


Ah, the ole "sitting on the sidelines is helpful" argument. It's morally defensible for sure, and neutral, but pretending it is actually helpful is wrong. It doesn't help in any meaningful way to change things (in any direction) if you sit on the sidelines. Ironically, these people are the ones who always seem to think they have the moral high ground.

Of course that doesn't mean you have to join google to make a difference, but pretending that you have to not be working at Google to be helping change things is just silly nonsense.


No it's different. It's closer to this: https://en.wikipedia.org/wiki/Non-cooperation_movement

It has an effect, you just need to have a large mass of people to make it work.


I agree you can have an active non-cooperation movement. That does not appear to be suggested here.

It's the difference between "i won't participate in patenting software" vs "i'm actively avoiding any companies or software that file patents"

The former I see a lot, and it does not help in any meaningful way, in part because their participation is not required. It doesn't help; it's just something people do to pretend they are helping without having to do anything real.

The latter would be something useful, though it does take large groups.

Not working for Google "as a way of helping" is clearly the former. Google doesn't need their help; they will do no good by leaving. They have plenty of other job options, so it's not hard either. They likely can do more good by staying and agitating than by leaving and being ignored.

Additionally, the argument that they must leave google to have an impact is also clearly silly.


>"Ah, the ole "sitting on the sidelines is helpful" argument."

That's not even the OP's argument, nor is it some "ole" argument. You have both put words in their mouth and framed it as some classic, well-known fallacy, which it is not.

The OP is making the distinction between "voting with your feet" which takes real commitment and has immediate effects versus "signing a letter"[1] which involves nothing more than a few seconds of your time without having to leave your desk.

If Google has trouble attracting talent due to matters of conscience, it directly impacts its ability to build new services as well as improve existing services in order to increase revenue.

[1] https://www.nytimes.com/2018/04/04/technology/google-letter-...


I have done neither, nor has OP retorted, so ...

Also, voting with your feet takes no real commitment when tech jobs for Googlers are plentiful and easy to get. Avoiding working for Google does precisely nothing on its own.

The closest you get is the non-cooperation type of movement the other reply mentions, which is an active thing, but that's not being suggested here.


[flagged]


Can you explain more?


Please don't respond to flamewar comments by encouraging more.

https://news.ycombinator.com/newsguidelines.html


[flagged]


Huh, there was certainly "a fit" inside Google around Dragonfly as well - and the only big corporation that actively assists China at this point is Apple. So I'm not quite sure what the OP meant?


I feel like Maven got at least an order of magnitude more saber rattling from employees than Dragonfly. At least I saw a LOT more press and general discussion.


Or perhaps Maven got more media coverage than Dragonfly. Selection bias can be a real thing.


It looks like Dragonfly is cancelled too. https://theintercept.com/2018/12/17/google-china-censored-se...


The only reason we knew about the first attempt was because it was leaked; who's to say they won't just try again in the future...


They also stood against Dragonfly so that complaint holds no water.


Meanwhile Amazon makes $$$ and increases their overwhelming user base from D.C.



