Inmates in Finland are training AI as part of prison labor (theverge.com)
157 points by cpeterso on April 1, 2019 | 104 comments


Why should I be outraged at this? How is this much worse than any other prison job?


It's not worse, but none of them pay anywhere near market wages, which offends some people. I have mixed feelings about all prison labor. I'd prefer we paid a more reasonable wage, but it's nowhere near the top of my list of things I'd change if I were to wake up king one day.


Considering that housing and food are "free", the Finnish prisoner pay of around 5 € per hour is actually quite close to market wage, if not above it...

(There's no legal minimum wage in the country, though collective agreements have legally binding minimum wage clauses in branches of business where they are applicable.)


Ah, that's orders of magnitude more than US prisoners are paid. It looks like the average is below $1/hour.[0] Which is actually lower than I had imagined, and makes me want to retract my previous comment about mixed feelings. :|

0 - https://www.prisonpolicy.org/blog/2017/04/10/wages/


I don't think there's much to be outraged about in a new type of prison labor being added to the palette. But the marketing efforts of the company sound deceptive to me. They pretend to be delivering some sort of social or moral value, but actually they simply use prison labor to reduce costs.


Exactly; the only outrage is that they're being paid and that money isn't used to help the victims of their actions.


Depriving them of pay is not required. They are still humans themselves, and they're paying for their actions with a prison term.

Societies that confuse prison with vengeance have more crime, and end up worse for victims themselves.


Not true: even in feudal times there was less crime because of harsher punishments, even though times were much harder. By the way, I don't think prisoners should be abused while in jail either (and this happens mostly at the hands of other prisoners); that should be strictly prohibited.


Sounds like you're pulling stuff out of your ass. The real distinction here is corrective vs. punitive: corrective systems lead to a lot less recidivism than punitive ones. All you have to do is look at recidivism rates in the US vs. northern European countries that have corrective prison systems in place. Norway has a 20% recidivism rate, compared to the US's 67.8% within 3 years and 76.6% within 5 years.


>not true, even in feudal times there were less crimes because of harsher punishments

In feudal times violent crimes, thefts, etc., and especially murders and rapes, occurred at rates far beyond today's.


>Exactly, the only outrage is them being paid...

From the article:

>...though the CSA is responsible for figuring out how much of that goes to the prisoners...

Are you sure they're getting paid, right now?

>...not using that money to help the victims...

And at what point would this vengeance-based form of justice stop? How much would be "adequate" compensation to their victims? Are they still to be penalised even after they've "paid their debt to society"? Because, from what I last recall of the prison system in the United States, the punitive punishment doesn't really end: from being unable to vote, to being unable to get a job because of a record, to a number of other rules/laws maliciously designed to further punish them, yeah?


I don't think the US system works well or is a model to look up to at all; if anything, it's too soft on them and creates hardened criminals. There are too few characters in a comment to explain fully, but look at the Japanese system as a non-perfect system that works much better, though it's quite harsh.


The US system is too soft? Wow, that's not something I hear very often. The US has one of the most regressive, punitive, ineffective prison systems in the western world.


Exactly, it's too soft and allows inmates to be harassed and assaulted by other inmates, creating prison gangs and things like that. So it hardens the criminals and creates even more problems. I would prefer a system like the one in Japan, which again is not perfect at all but is much better than the US system. It's criticized for being very harsh, but it has great results, and Japan has among the lowest, if not the lowest, crime rates in the world.


Their victims may already have been awarded money, and if a prisoner makes money in prison it may end up being used towards paying that down, just like any other money they make would be.

Earning "okay" money for prison labor is also pretty normal in Europe, and it can give prisoners something to fall back upon once they are released, as well as new skillsets.

If you didn't pay them, prisoners might simply not work at all. Unlike in the US, slavery ("penal labor") is completely illegal there, even in prisons; you can't force them to work by punishing them.

There is nothing to be outraged about, unless you're being blinded by a reprehensible and base thirst for third-party revenge.


Prison slavery is explicitly legal under the US Constitution, and it is widely practiced. https://en.m.wikipedia.org/wiki/Penal_labor_in_the_United_St...


Payment for labor is important on pragmatic grounds alone, given the incentives. With only negative reinforcement, the optimal strategy is to do just enough to avoid punishment, which requires more paid overseers. With some positive reinforcement, incentives align well enough that "pretend to work whenever you can get away with it" is no longer the optimal strategy.

Not to mention the rehabilitative effects - you want convicts looking for delayed gratification and honest work instead of big scores.


Also, giving prisoners privileges such as television or the chance to earn a little money makes it easier to prevent misbehavior by threatening to remove those privileges. This has been shown to be as effective (in most cases) as harsher punishment, but without making the prisoners as angry and resentful.


Good intent from the labellers isn't always required for a good model, but it certainly helps, especially if the model training process is unsophisticated. I'm sceptical enough that this holds for poorly-paid annotators; surely for prisoners it could be even worse?


Prisoners have a minimum pay of 4.7 or 5 EUR, and prisons can't take money for necessities. 5 EUR/hour with no rent, travel costs, food, etc. seems like a decent low wage.


I think with labelers what you really need isn't intent "the right way" so much as consistency, since you can correct "always wrong" to "always right" for binary labels, or use other tricks for larger label sets (rough sketch at the end of this comment).

There was the talking banana story where the sheer determination to make the robot racist provided intense testing. https://www.google.com/amp/s/kotaku.com/racist-twitch-trolls...

Really, so long as you know how to handle it, such labelers can be useful. If you want your system to be robust, you need to see how it handles "abuse".
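
To illustrate the consistency point, here's a rough sketch of my own (nothing the company described): with a small trusted "gold" set you can detect a labeler who is consistently wrong on binary labels and simply flip everything they produce, which makes them as useful as a labeler who is consistently right.

  # Hypothetical illustration: detect a consistently-inverted binary labeler
  # against a small trusted "gold" set and flip their labels.
  def annotator_accuracy(labels, gold):
      overlap = [item for item in gold if item in labels]
      if not overlap:
          return None  # can't estimate without any gold overlap
      return sum(labels[i] == gold[i] for i in overlap) / len(overlap)

  def corrected_labels(labels, gold):
      acc = annotator_accuracy(labels, gold)
      if acc is not None and acc < 0.5:  # consistently wrong: flip every label
          return {item: 1 - label for item, label in labels.items()}
      return dict(labels)

  gold = {"doc1": 1, "doc2": 0, "doc3": 1}              # trusted labels
  troll = {"doc1": 0, "doc2": 1, "doc3": 0, "doc4": 1}  # consistently inverted
  print(corrected_labels(troll, gold))                  # doc4 flips to 0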


If we trust for profit private prisons to fight wildfires I don't see why we wouldn't trust them for other things.

https://www.cnbc.com/2018/08/14/california-is-paying-inmates...


Because one has a fuckton of karma and social recognition and the other doesn't?


>for profit private prisons

Who trusts "for profit private prisons"? Other than "libertarians" and people making money off of running them?


What is with people thinking all libertarians are idiots? I'm libertarian and I despise for-profit prisons. They have perverse incentives.


Not sure where you're writing from, but there's some muddying of the term in the US. While libertarianism outside the US has roots in anarchist anti-authoritarianism, in the US right-wing libertarianism has a bit of a reputation for using anti-statist language to advocate for stripping away any kind of limitation on capitalist enterprise, regardless of the incentives.


We don't trust for-profit private prisons. They're universally reviled.


There are different uses of the word 'trust'.

Portions of the U.S. do, in fact, trust prisoners of for-profit institutions to fight fires. They have no choice in the matter. (A related definition of 'trusted' used in security is people who cannot be denied access to information.)


>They have no choice in the matter...

???

No dog in this fight really, but if they have no choice, can they really be said to trust? I mean, maybe through some kind of rhetorical gymnastics. Historically speaking, as soon as humans who lack choice actually get "choice", they tend to sprint right off the reservation as it were.


> Historically speaking, as soon as humans who lack choice actually get "choice", they tend to sprint right off the reservation as it were.

My uncle used to do this work, and it was purely voluntary; he liked it because he got out into the great outdoors instead of sitting around the prison yard being bored.

They don't let the "hardcore convicts" out into the forests, only the minimum-security prisoners who could easily escape if they really wanted to. My uncle used to get caught doing stupid stuff (mostly theft to feed his drug habit) and would do his time without causing trouble, so he was trusted enough not to run off into the woods and cause a statewide manhunt.


It's getting so hard to spot the April Fools' Day stories.

After reading, I'm guessing this isn't one of them.


It is not a hoax/joke.

For the record, the article is older than that. The CSA press release is over two weeks old.


I know. I was scanning the front page for likely fools day stories, and this seemed the most likely.


Unconventional, but as a means to fill a "Mechanical Turk gap" it seems like a fine approach as the results can be isolated and tested. Inmates read and summarize with the benefit of being paid (and perhaps learning something in the process based upon the material read), Vainu gets a training set.

That said, the headline just begs for an "Alexa, shank the snitch" kind of joke.


As I've been saying, the "AI threat" is not that it'll autonomously decide to hate humans but that it will, under the control and guidance of actual humans, be used for anti-human purposes.


The major, implemented, commercial use cases for AI have so far been anti-human:

- Youtube's deep learning recommendation engine. Thanks to Guillaume Chaslot for publicly stating what I suspected was going on at Google https://www.nytimes.com/2019/02/19/technology/youtube-conspi...

- Google, Facebook, etc. using machine learning for advertising optimization.

- The presumably widespread use of facial recognition cameras in China (an area I don't know a lot about).

How are we supposed to expect the leaders in AI to perform ethically in the future, or today, when their first foray into machine learning commercialization has been sloppy and ethics-free?


> How are we supposed to expect the leaders in AI to perform ethically in the future, or today, when their first foray into machine learning commercialization has been sloppy and ethics-free?

Remove the billions and billions of dollars in profit to be reaped from making humans obsolete.


Those are the ones you hear about, because they're controversial and easily relatable to most consumers. Machine learning has had a major impact in "blasé" industries that you are omitting, such as healthcare, finance, and manufacturing.


Which part of this story is about that? I'm not seeing the connection. Are you worried that the inmates might be manipulating data?


I think he means that humans will abuse the AI (the tech) and then use it for bad purposes.


And what's new here? Any technology can be used both ways.


Well, you can kill someone with a lawnmower if you try enough, sure.

But some technologies are more apt to be used in bad ways than others, and more nightmarish in the uses they enable.

(E.g. nuclear technology, in the form of bombs, can enable total annihilation of the planet. Glass-making technology, not so much -- what, we'll fill the planet with jars?)


Yes, nothing new: you can use AI for bad or good things, use guns to save people from bad people or to do bad things, etc.


Guns are a bit different because they are designed to kill. I fail to see how you can save lives with machine guns unless you are in a war zone.


> I fail to see how you can save lives with machine guns unless you are in a war zone

Being attacked by a pack of wild animals. Not that I care about the politics, but it immediately jumped into my mind.


Do you really need machine guns to scare off a pack of wild animals?


Defending against a [lynch] mob


It's not that it would decide to hate humans; it merely needs to be indifferent to them.


A lot of humans hate other humans. Training the robots with the same biases as those humans perpetuates the hate and gives it a veneer of objectivity and science. Interpretability of machine learning algorithms is key; otherwise you just have a black box that made some decision for reasons, and then it's hard to demonstrate the bias.


I hope they're not asked to label if specific actions are morally correct or not :)


That's a stereotyped characterization of why people go to prison that HN should be wise enough not to propagate.


So, we won't find a correlation between people who go to prison and propensity to immoral behavior? Or even intentionally mislabeling immoral behavior as moral.

HN should be wise enough to know that correlation is not causation, and that statistical differences in behaviour between groups do not necessarily describe an individual from the group. But that does not invalidate OP's point.


I think inmates or criminals have a higher rate of mental and physical diseases, as well as bitterness, grievances, and other factors that skew their vision of reality and make them judge situations differently from people who are not inmates.

It does not mean that criminals or inmates are incapable of judging whether a situation is moral or not. It's more like on a statistical basis, inmates/criminals are more likely to give ambiguous or erroneous moral answers to situations that seem obvious to others.


The main driver of incarceration is income. The main driver of where you end up is the wealth of your parents.

https://www.vox.com/identities/2018/3/14/17114226/incarcerat...


Yes, because without appropriate guidance and modeling, people create Hobbesian dystopias. Countries like Finland and even the US are anomalies which need costly, active, intervention to maintain.


I'm sorry, I don't see the contradiction between:

a) Convicts make morally-bad choices at a higher rate than the general population.

b) Being convicted of a crime is correlated with low income.

You can sympathize with criminals and want to help them get back on their feet without also looking at them for moral guidance.


I never suggested looking to them for moral guidance. I was responding to op's quote "inmates or criminals have a higher rate of mental and physical diseases, as well as bitterness, grievances and other factors that skews their vision of reality, and make them judge situations differently from people who are not inmates".

According to the ACLU, over half of arrests in 2010 were for marijuana possession (https://www.aclu.org/gallery/marijuana-arrests-numbers). I can guarantee that most of those arrested were not upper-class trust-fund kids, even though those same kids likely used drugs at at least an equal rate to those arrested. In addition, once arrested, those without the means to hire a good attorney are much more likely to either be convicted and sentenced to jail time or become stuck in the criminal justice system. Should we doubt the moral character of all those that used marijuana, or only those that were arrested and incarcerated?

I did plenty of drugs when I was younger but was never arrested. I was even pulled over while underage and released without being charged even though I had a ton of beer in my vehicle (I was not drinking yet). Is my sense of morality better than those that were arrested for the same crime? Sometimes the system is unfair and those without resources live much harsher versions of it. It does not make them any more or less moral than those that are able to live the nicer version of the same system due to having richer parents or a more desirable skin tone.

With that said I am not defending people who commit violent crimes. Another interesting avenue to explore is why only one person was jailed in the US on charges related to the 2008 collapse. Were the people responsible for that more moral than the people jailed for drug possession?


>I never suggested looking to them for moral guidance.

Then I'm not sure you were aware of the context of the thread you joined, which started with this remark:

>>>I hope they're not asked to label if specific actions are morally correct or not :)

The poster who you were responding to affirmed that they were defending that claim:

>>It's more like on a statistical basis, inmates/criminals are more likely to give ambiguous or erroneous moral answers to situations that seem obvious to others.

The only part you're responding to is saying that crime correlates with income, which doesn't contradict the broader or the narrower point: even if that correlation itself is true, it can be via the correlation with the other factors the parent mentioned.


Alright, I’ll bite. Are you saying that most inmates are serving time for alleged violations of law that have not actually occurred?


Not OP, but just because you've broken the law once doesn't mean you're a bad person, much less lack moral agency.

People make mistakes. Sure, some deserve to rot in prison for irredeemable crimes, but many (at least, in my context as an American citizen) are there for nonviolent drug offenses or failing to pay fines.


That's irrelevant. All that the OP's point requires is that, on average, the moral choices of such people are worse. The existence of outliers -- or even common exceptions -- does not refute that.


You break a lot of laws every morning while driving to work; that doesn't mean your moral choices on average are worse than someone else's.


My frequency of lawbreaking would indeed (anti-)correlate with the quality of moral decisionmaking, and my frequency of lawbreaking is likely lower than those who have been convicted (at least if severity-weighted).

Further, the comparison was against convicts, who do it frequently and severely enough that someone finds it worth the money to prosecute and get a conviction. And at that point, yes, a correlation appears.

You're still making the same fallacy: "I can find an exception, so the correlation doesn't hold." That doesn't follow.

(And, FWIW, I don't drive to work.)


I think he is saying that the reason people do bad stuff is not that they don't know what's moral; it's that they made a decision to ignore morals.


While I would personally wager that those in prison are on average less moral, there was nonetheless a valid point above that those who commit crimes don't necessarily lack morality. The argument here would be that law does not perfectly equate to morality, and separately, that people will commit crimes as the lesser of two evils in some scenarios.

Additionally, despite the reasonable conclusion we can make that those in prison are less moral, it probably does no good for recidivism rates for us to talk about how immoral prisoners must be. It stigmatizes them, and puts more barriers on the path to recovery and reintegration into society.


Theft because you're poor and hungry is a really shitty situation to be in and is done out of desperation, not malice.


Well, in Finland you should have no need to steal, as social services generally attempt to keep people above bare biological poverty (i.e., they pay for food and shelter).


Not everyone is protected by social services. Finland still has a homeless population, and not all of them are mentally or psychologically challenged.


Illegal and immoral behavior are not the same thing, especially when you consider bias in the legal system as to whom the law is enforced against (poor, male, minority).


"It was a morally ambiguous stabbing!"


It's quite common for drinking buddies to quarrel, and once the perpetrator has sobered up they are sorry and turn themselves in.


Straw man.


On the other hand pretending that people who go to prison are usually righteous is really unbelievable.


This was literally a plot point in _The Dark Knight_: https://www.youtube.com/watch?v=K4GAQtGtd_0


The Dark Knight isn't actually a rigorous academic study on virtue of prisoners.


Likewise for The Terminator and AIs. People almost treat these movies like they're historical documentaries, just so they can toss out cheap relatable memes.


It’s even more wrong to portray prisoners as a bunch of hapless victims. I’ve had to interact with prisoners in a previous career and less than 10% are people I would allow in my home.


Anyone use VOTT for labelling training data? Man that is just a joy to use compared to labelImg, especially the template stamp feature and label hotkey.


I gave up on VOTT and ended up going with RectLabel on macOS. IMO it has a better workflow, and being able to copy and paste annotations from one image to another (to then go fine-tune them) blows both VOTT and labelImg out of the water.


I checked VoTT and RectLabel and they seem to be desktop apps. Can they be used by a tagging team (more than one person), especially if some of the taggers work from home?


Not really, which is another big problem for all of them (including RectLabel). No great solutions all around :(


Skynet labor camps


On the other hand, the prisoners can proudly write "Machine Learning" on their CV instead of having a gap!


My thoughts, exactly. Or even like The Matrix where humans are farmed for energy. Here, they are farmed for their minds to train the next AI, which might in a worst-case scenario rule them someday. Somehow reminds me of this beautiful piece [1] by Maciej Cegłowski.

[1] https://idlewords.com/talks/superintelligence.htm


I once found what looked like an early draft of The Matrix. Instead of body heat being used for power, part of each enslaved human’s brain was used as an organic supercomputer to control the fusion reactors.


The story goes that the studio executives wanted it changed to energy because they thought that viewers wouldn't understand using brains for computing power. Solid references to this are hard to find, though https://scifi.stackexchange.com/questions/19817/was-executiv... seems to collect a couple of secondary sources.


Strange that viewers would struggle to understand using captive brains for computing power, but would have no problems with a whole lot of other concepts from the movie...

Then again, the plot is the usual Hero's Journey (Campbell) like the majority of Hollywood blockbusters tends to be, but I guess the devil is in the details (illusory matrix/maya, recursive resurrection/enlightenment of Neo, the Christian theme of consuming an apple (pill) from the tree of knowledge and the subsequent expulsion from "paradise", etc.)

The Matrix trilogy is good on many levels, but I can't buy that "people used as batteries is easier to understand than people as compute accelerators". There are plenty of other, perhaps more relevant, places to dumb things down in the Matrix if one goes down that road.


That looks like it could be the version I read. Thanks for the link!


> like The Matrix where humans are farmed for energy

That's how Morpheus was told or wanted to present the reality. You cannot produce energy by breeding and feeding humans due to the law of conservation of energy.

As the Wachowski sisters later showed in The Animatrix, the machines were simply conserving humanity.


I forget where I read it, but supposedly in the original script, the machines were using humans for the computing power of their brain. They went with body heat instead because that was something audiences would more easily understand.


>which might in a worst case scenario rule them someday

Or, in a more realistic scenario, might learn how to discern between an article about a tech company and an article about an orchard.


This is a LitRPG novel waiting to happen


What could go wrong?


Last week it was the poor women in India who were making a living out of labeling data; now it's the inmates in Finland. I wonder who will be next? Will we have to boycott the likes of Google or FB for how they label their data, the same way some people used to boycott clothes made in Bangladesh?


> poor women in India who were making a living out of labeling data

Is there something unethical about hiring people to label data?

> now it’s the inmates in Finland

Forced labour in general is always up for debate, but is there something about labeling data that is inherently worse than other forms of prison labour, such that tech companies deserve a boycott?


I've spent weeks labelling data as part of my CV projects. It's much better than scraping graffiti off a wall...


> Is there something unethical about hiring people to label data?

Inherently, no. The question I'd ask is: if labeled data is so important, would we pay them an equitable share of the benefits of the outcomes we achieve from using this data? Or do we simply fire them once they've fueled our algorithms? It reminds me of "blood diamond"-like practices.


Screws are also very important; I imagine millions and millions of pieces of IKEA furniture falling down if screws magically disappeared. More seriously, for some structures it's much better and easier to use screws than nails. (Nails are very important too.) Anyway, I don't imagine that the operators in a screw or nail factory are getting paid millions per year.

There are hundreds of economics books explaining how much you get paid, with different theories and interpretations, but if you can be replaced by a person who will do your work for a minimal wage, you will probably get only a minimal wage.

Also, each individual label is not so valuable: some are wrong, some are unclear, and you must aggregate and process millions of labels (rough sketch below). Imagine how little value they get out of each single Captcha reply.
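
As a rough sketch of my own (not anything from the article) of why a single label carries little value, noisy labels only become useful once you aggregate many of them, e.g. by simple majority vote:

  # Hypothetical illustration: collapse several noisy labels per item into one.
  from collections import Counter

  def majority_vote(labels_per_item):
      return {item: Counter(votes).most_common(1)[0][0]
              for item, votes in labels_per_item.items()}

  # Annotators disagree on "b" and "c"; aggregation smooths out the noise.
  labels_per_item = {"a": [1, 1, 1], "b": [0, 1, 0], "c": [0, 0, 1]}
  print(majority_vote(labels_per_item))  # {'a': 1, 'b': 0, 'c': 0}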

> Or, do we simply fire them once they fuel our algorithms.

Most work is like this. Imagine you are the main architect/engineer in charge of building the Empire State Building. They pay you for a few years to do the plans of the building, hire the workers, buy the materials, and whatever else is necessary. After the building is finished, they thank you and now you are unemployed. (The workers are unemployed too.) You don't get a cut of the profits of the building.


The funny thing is that this is exactly the sort of "low-tier, low-skill, high-demand, high-supply, low-wage" job that automation was demonized for getting rid of, and now they are facing complaints over creating it.


> I wonder who will be next?

You do that every few days when you solve captchas.


My point was about putting AI knowledge in the hands of people with a morally dubious history, but then I thought about tech executives and IT management, and maybe nothing's different...


So they'll get an AI with a criminal mindset as a result? See where this leads?


They were classifying news articles. How could that possibly have a "criminal mindset"?


I saw the Robo-Cop movie.



