So, suppose we live in a fantasy world where a smart gun is developed that has extremely high reliability. Would you agree that would represent an overall improvement in firearm technology?

I'm trying to understand if the objections are rational, or religious.

I honestly can't tell.




If you can make a smart gun that is as reliable and foolproof as a Glock, without destroying the ergonomics, you'd have something.

Failure is not an option. Five nines (99.999%) isn't good enough, and we can't manage that in software even with redundant systems. Beating that, in the meager amount of free space a gun offers, is a pretty tall order.
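For scale, here's a rough back-of-the-envelope calculation (mine, not from the thread, and it assumes failures are independent per trigger pull, which is a simplification): even at five nines, a 1-in-100,000 per-round failure rate compounds over a firearm's service life.

    # Rough sketch: what "five nines" per-round reliability means over many rounds.
    # Assumes each trigger pull fails independently (a simplifying assumption).
    per_round_failure = 1e-5  # 99.999% chance each round works

    for rounds in (1_000, 10_000, 100_000):
        p_any_failure = 1 - (1 - per_round_failure) ** rounds
        print(f"{rounds:>7} rounds: {p_any_failure:.1%} chance of at least one failure")

A gun that sees ten thousand rounds of range time would have roughly a one-in-ten chance of at least one failure somewhere along the way, which is why "five nines isn't good enough" is not hyperbole.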

I'm not being religious, I'm being realistic about just how far away "good enough" is for a smart gun.


Thank you! I appreciate that you took my comment seriously and decided to respond in kind.

I totally agree with you. It's an extremely difficult ask, and I'd never claim that the current iteration of this technology is anywhere near workable.

Another commenter here used the analogy of self-driving cars, and I think it's an apt one (if imperfect... the problem spaces are profoundly different), though perhaps not in the way they intended.

In both cases, the technology problems are enormous, and the failure modes could result in the loss of human life. So it better be pretty damn foolproof if it's to gain any kind of widespread adoption. Yet, you don't get the same level of negativity levelled at efforts to build self-driving cars... that it's impossible and car companies should have never tried.

The original question was: "Are there any gun- and tech-literate people who actually think smart guns like this are a good idea?"

I think the nuanced answer, here, is: Not yet. But perhaps some day.

It seems we might agree on that!


The only people touting "smart guns" as a solution are people who seem to think that firearms are the actual problem. Let that sink in for a moment.


The same argument could be made for child safety caps on bottles of Tylenol or bleach.

That's not a rational argument. It's an appeal to emotion, and a bit of a dog whistle.


Bottles are meant to be left unsupervised; guns and other deadly implements are not.

See, traditional gun safeties are designed to prevent unintentional discharge from an accidental trigger pull and/or during maintenance.


Understood.

My point, which seems to be getting misunderstood, is that exploring additional safety features is not "blaming" guns or somehow diverting attention away from societal issues.

Mandating seatbelts, smoke detectors in residential homes, safety features on power tools, etc... there are countless examples where technology has been used to increase safety, not as an excuse to avoid solving societal issues, but as a complementary tool alongside education.

So the following statement:

The only people touting "smart guns" as a solution are people who seem to think that firearms are the actual problem.

is just not an honest argument or debating tactic. It's an appeal to emotion, an ad hominem attack on those who disagree, and, I maintain, a dog whistle for those who think it's just Big Government yadda yadda yadda.

Of course, given the maturity of the technology, mandating it now is likely the wrong move. I get that government wants to pressure industry to push this technology forward, but there are alternative ways to incentivize that without government regulation.

Still, IMO, having government push for this kind of technology makes plenty of sense. And I believe that does not require thinking that "firearms are the actual problem".


Lol. As a parent, I can tell you that's incorrect.


Have you ever fumbled around unlocking your phone? Can you do it instantly every time you pick it up?

That would be unacceptable to most gun owners even if it happened only once every thousand rounds.

It's rational.


You're already making a ton of assumptions about the entire possible solution space (specifically that it requires an active effort to unlock the gun that necessitates "fumbling").

How is that rational?


There are two failure modes of "smart guns":

1) fails to unlock: "wild animal or another human killed me first"

2) too easy to unlock: "a child undid the lock and shot another child"

Just like self-driving cars, even if you get a 10x or 100x net reduction in deaths, the first time someone dies because your "smart gun" failed, there will be a huge press cycle and mass hysteria. And probably lawsuits.



