Hacker News

Predictive policing does nothing more than send policing resources where crime is more likely, based on statistical models.

It does not racially discriminate and it is not racist. It does not harm anyone.

I feel that this is posturing and shooting the messenger. If crime is statistically higher in "black neighbourhoods" the issue will not be solved by pretending it isn't.

If the technology does not work, then of course there is no point spending more money on it. So, does it work or not? This feels political, not pragmatic.




Everyone does something criminal from time to time. Policing A more than B will lead to more arrests at place A, increasing its crime statistics, leading to more policing ...

For example, drug use is about equal between white and black Americans, but since black people are more heavily policed (stopped more frequently in traffic, etc.) they are arrested and sentenced for drug use far more often than white Americans. [0]

0: https://www.hamiltonproject.org/charts/rates_of_drug_use_and...


> Everyone does something criminal from time to time.

Everyone? Maybe this would be true if you included non-criminal offenses (the kind you'd get a ticket for). Most people will probably jaywalk, or speed, or get a parking ticket, or do something else in this category at some point in their lives.

But you're saying that everyone (I'll read this as "most people") will do something that would get them arrested if caught from time to time? That's an outlandish statement.


Well... I haven't read the book, but it is widely cited:

https://www.amazon.com/dp/B00505UZ4G/

The author claims the average adult commits three federal crimes a day.


I've had this conversation before and no one has been able to defend their position outside of "there's a book with this title" or "but weed though".

Even looking at the reviews of that book, it seems like the author doesn't even argue the title. It's just specific cases of people getting screwed over in court. (Which of course happens, I'm not arguing against that.)


In statistics there is a thing called bias, which can cause a lot of problems if not correctly handled.

An example of bias: suppose that, historically, most black borrowers defaulted on their loans. ML is deployed to predict whether someone will default. Because the model does not understand bias, it sees that an applicant is black and denies them for that, purely off the fact that black borrowers have historically defaulted more often.

Bias is when ML sees something not relevant as a pattern and uses it as a feature to determine the future. If race were filtered out instead, the model might have seen that, historically, most black applicants who defaulted were weak in areas that actually matter, like income or income stability. It could then predict the future with a higher level of accuracy.
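To make that concrete, here is a toy sketch with entirely made-up data (not any real lending model): defaults are caused only by income stability, but group membership happens to correlate with stability in the historical sample. A naive rule keyed on group looks decent, which is exactly why a model latches onto it, yet the actual causal feature predicts far better.

```python
import random

random.seed(0)

# Made-up historical sample: default is caused entirely by income stability,
# but group membership correlates with stability in this sample.
people = []
for _ in range(10_000):
    group = random.choice(["X", "Y"])
    # assumed historical inequity: group X has lower average income stability
    stability = random.gauss(0.4 if group == "X" else 0.6, 0.15)
    defaulted = stability < 0.45  # the actual causal factor
    people.append((group, stability, defaulted))

def accuracy(predict):
    """Fraction of applicants whose default status the rule gets right."""
    return sum(predict(p) == p[2] for p in people) / len(people)

proxy_rule = lambda p: p[0] == "X"   # "deny group X" -- group as a proxy
causal_rule = lambda p: p[1] < 0.45  # look at the real driver instead

print(f"proxy rule:  {accuracy(proxy_rule):.1%}")   # well above chance, far below perfect
print(f"causal rule: {accuracy(causal_rule):.1%}")  # perfect by construction
```

The proxy rule scores well enough to fool a naive training process, but everyone in group X with solid income stability is wrongly denied; the causal feature avoids that entirely.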

Police bias is worse than other industries, because it creates a feedback loop. If you think a black person is more likely to commit a crime, and you put more resources into that, then you're going to find more crime. This increases bias and it feeds on itself.
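That loop can be shown with a toy simulation (all numbers assumed, and the "predictive model" is just a crude arrest-chasing rule, not any real deployment). Both neighborhoods have identical true offense rates, but A starts with a slight patrol surplus, and patrols are reallocated each year toward wherever more arrests were recorded:

```python
# Toy feedback-loop simulation with assumed numbers (not any real system).
# Behavior is IDENTICAL in both neighborhoods by construction.
ARRESTS_PER_PATROL = 5  # same everywhere: arrests track presence, not crime

patrols = {"A": 6, "B": 4}  # initial allocation, slightly skewed toward A
for year in range(10):
    # recorded arrests depend only on where the patrols are
    arrests = {hood: patrols[hood] * ARRESTS_PER_PATROL for hood in patrols}
    busier = max(arrests, key=arrests.get)
    quieter = min(arrests, key=arrests.get)
    if patrols[quieter] > 0:  # move a unit toward the recorded "hot spot"
        patrols[busier] += 1
        patrols[quieter] -= 1

print(patrols)  # {'A': 10, 'B': 0}: all recorded "crime" now comes from A
```

Within a few iterations every patrol sits in A, all recorded crime comes from A, and the statistics "confirm" the allocation, even though the underlying behavior was equal the whole time.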

It seems the common fear on YC is that the algorithms in predictive policing have a strong bias, causing problems. This is a legitimate risk, but imho not because of the algorithms themselves but because of how they're used. They blindly surface insights, and police officers act on them in ways that increase bias, amplifying the issues we currently have.

At the NSA level, the algorithms (which are not predictive policing) deal with bias much better and work quite well. They're scary good, better than having someone watch you at all times. Though I guess that's a bit off topic.


> Bias is when ML sees something not relevant as a pattern and uses it as a feature to determine the future.

This alone does not fit the definition of statistical bias. Statistics cannot prove causation, so a completely incidental fact or feature can nonetheless be a great estimator. Of course, if the system changes, then you have a problem (with or without a causal model).

If the historic data is warped by policing, then of course you're using a biased sample and will indeed get a biased estimate that way. But if you correctly use randomly sampled data and a method that is mathematically proven to be unbiased, then if your model says to deny loans based on race, you would likely make fewer bad loans by following it than by ignoring that information.
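The warped-sample point can be illustrated with a toy sketch (all rates assumed): both groups offend at the same true rate, but group A is stopped three times as often, so a rate estimated from recorded incidents makes A look roughly three times worse.

```python
import random

random.seed(1)

# Assumed numbers for illustration only: identical underlying behavior,
# unequal observation. Only *observed* offenses enter the data.
TRUE_RATE = 0.10                     # same true offense rate in both groups
STOP_PROB = {"A": 0.30, "B": 0.10}   # policing intensity, not behavior
POP = 100_000

recorded = {"A": 0, "B": 0}
for group in ("A", "B"):
    for _ in range(POP):
        offended = random.random() < TRUE_RATE
        stopped = random.random() < STOP_PROB[group]
        if offended and stopped:     # an offense is recorded only if observed
            recorded[group] += 1

for group in ("A", "B"):
    # naive "crime rate" estimated from the records alone
    print(group, recorded[group] / POP)
```

The naive estimate reflects the ratio of stop rates, not any difference in behavior, which is exactly what "biased sample" means here.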


> It does not racially discriminate and it is not racist. It does not harm anyone.

This is a dangerously irresponsible statement.

If police disproportionately spend time in specific neighborhoods, and arrests are disproportionately drawn from that population relative to actual crimes committed, this will be captured and reinforced in the predictions.

So yes, it's racist, and it harms people.


So you're saying that more police presence is harming people. How? If there's no crime there are no arrests or reports after all...

Your reply is rather aggressive for no apparent reason. As I said, this is highly political in the middle of the current hysteria.


> So you're saying that more police presence is harming people. How?

Ask the disproportionate number of Black men in prison. Ask why a white judge wanted to let white rapist Brock Turner off "because he's a good boy", or ask the Central Park Five or the Georgetown Jacket Three, who spent decades in prison because white cops and prosecutors assumed their guilt based on the color of their skin.

If my posts sound aggressive, it's because your posts are dismissive of the terrible culture that produced the biased data that would be used to make predictions, and you have the gall to claim it's all fine and dandy.

> current hysteria

There is no hysteria going on right now; this is a quite reasonable response to decades of bad decision-making on the part of police departments across the U.S.


You moved from police presence to sending innocents to prison... That's quite a step.

Predictive policing is nothing more than police presence. If police arrest innocent people and the justice system sends them to prison, that's quite another issue. On the whole, I suspect the number of innocents sent to prison is rather low.

There's hysteria, all right, on "racial issues" at the moment.


> You moved from police presence to sending innocents to prison... That's quite a step.

How do you think people get arrested? They show up at a police precinct and turn themselves in?

> There's hysteria alright on 'racial issues' at the moment.

Yeah, I'm sure the Civil Rights & Suffrage movements were just "hysteria" too right?


> So you're saying that more police presence is harming people. How?

If the police were trained well, then I might agree with you. But they're not. They're trained to expect every encounter to result in an attempt on their lives. They're trained to escalate instead of de-escalate, meaning that a run-of-the-mill interaction is more likely to result in violence than it needs to be.

Labeling what's going on now as "the current hysteria" is painfully dismissive of the real harm that police are doing to people.


Any sort of bias present in the training data will be replicated in the model. If the police are biased in whom they target, that group will naturally show a higher recorded crime rate, which any statistical model would easily pick up, leading to a biased model.


> It does not racially discriminate and it is not racist. It does not harm anyone.

False. I agree that the software itself is not racist, but the decisions it makes are only as good as the data you feed it. If you feed it racist data, then you will get racist decisions from it. And given that policing has had racial biases for centuries, all we have is racist data.

And the more you act on its racist decisions, the more racist feedback it will have to act on, giving you more racist decisions in the future.


> where crime is more likely

This is the fundamental disconnect in your argument, because historical recorded arrest rates are not necessarily reflective of actual crime rates.



