
I was quibbling about semantics. I am aware that cars can be dangerous.

> and the damage it can cause on accident

You are conflating (un)safety of a tool used in the way it is intended (kitchen knife = cutting a steak) with accidents (cutting a finger) and with malicious use (stabbing people). Those three categories are not the same for object-that-may-act-as-weapon and object-designed-as-weapon.

Conflating them collapses the number of useful things we can communicate.

So are you concerned about Tesla intentionally building killing instruments? Or potential for accidents? Or the potential for intentional misuse?

> How dangerous cars are is very much underappreciated by people in general, as evidenced by the number of morons on the road.

Cars also provide immense utility. If all they did was provide the thrill of speeding, they would probably be banned as too dangerous. One of the tradeoffs is the overhead of enabling people to drive. We could drive down the number of morons by requiring astronaut training for vehicle operators, but again, that tradeoff seems too harsh: it's more efficient to occasionally let people die in traffic accidents than to let them die because nobody qualified as an ambulance driver.

> We have a for-profit race by companies, many of which can't be trusted with getting software right

In the short term this may cause more deaths than necessary. But on the other hand it might be the quickest way to find a winner and then hold the rest to the same standard. As long as the experimental fleets are small they are just a blip in the statistics. Right now they should be equated to the yearly batch of first-year drivers who have an inherently higher risk profile due to lack of experience. We still accept them on our roads in the expectation that they improve.

What is important is to make sure that they are as good as or better than humans once they roll out in large fleets.



> So are you concerned about Tesla intentionally building killing instruments? Or potential for accidents? Or the potential for intentional misuse?

The latter two.

> We could drive down the number of morons by requiring astronaut training for vehicle operators, but again, that tradeoff seems too harsh: it's more efficient to occasionally let people die in traffic accidents than to let them die because nobody qualified as an ambulance driver.

I don't think this is the real reason. You don't need astronaut-level training for vehicle operators, just more than the ridiculously low standard of today, and more importantly, much stronger and harsher enforcement of traffic laws. I doubt that this will reduce the number of qualified ambulance drivers.

I suspect the real reason we tolerate so many morons on the road is path dependence. When cars first appeared, they were rare, slow and safe. In the couple of decades it took to get to the present density and speed of cars, it became a social status symbol, and something politically impossible to rein in.

> What is important is to make sure that they are as good as or better than humans once they roll out in large fleets.

I'm afraid that with self-driving tech based on neural networks, with no ability to inspect and verify what's going on, we'll eventually have to eat the risk and roll them out in large numbers before we know they're as good as humans.


I agree with your general sentiment overall.

I do not agree that neural networks are a "black box" with "no ability to inspect and verify". Even putting aside the many methods to understand what a neural network is doing without running it, at their core, neural networks are well-tested instruments. That's how they learn -- by testing themselves.

Obviously it's possible for a neural network to have odd behavior in circumstances not accounted for, but that was always going to be possible at the level of complexity we're talking about here.

We're talking about cutting-edge technology here -- and I agree with your general sentiment. I just don't agree with pinning the blame on "... based on neural networks". The same factors would apply to any codebase of this complexity.


> Even putting aside the many methods to understand what a neural network is doing without running it

Name three :).

> neural networks are well-tested instruments. That's how they learn -- by testing themselves.

Last I checked, neural networks are well-tested in the sense that if you throw a big database and a shit ton of compute at them, they'll learn to work accurately within that database. Step out of it, and all bets are off. We're better at this than we were 30 years ago - good enough to apply this technology to consumer-level products in which mistakes don't really matter. I'd be wary of applying even current neural networks to safety-critical tasks.
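That "step out of the database and all bets are off" point can be sketched with a stand-in model -- a flexible polynomial fit playing the role of a neural network; the training range and all numbers below are invented purely for illustration:

```python
import numpy as np

# Stand-in for a learned model: fit a flexible function on a narrow input
# range, then query it outside that range. All numbers are invented.
rng = np.random.default_rng(0)
x_train = rng.uniform(0, np.pi, 200)
y_train = np.sin(x_train)

# A degree-9 polynomial plays the role of the trained network.
model = np.polynomial.Polynomial.fit(x_train, y_train, deg=9)

x_in = np.linspace(0, np.pi, 50)                # inside the training range
x_out = np.linspace(2 * np.pi, 3 * np.pi, 50)   # well outside it

err_in = np.max(np.abs(model(x_in) - np.sin(x_in)))
err_out = np.max(np.abs(model(x_out) - np.sin(x_out)))
print(err_in, err_out)   # tiny in-range error, huge out-of-range error
```

Inside the training range the fit looks excellent; a few units outside it, the error is off by orders of magnitude -- and nothing in the fitted coefficients warns you where the valid region ends.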

> Obviously it's possible for a neural network to have odd behavior in circumstances not accounted for, but that was always going to be possible at the level of complexity we're talking about here.

The problem is that with NNs, the odd behavior is usually totally unexpected, and you can't really inspect the network beforehand to discover the possible ranges of error-generating inputs. Everything works fine, but every now and then you get a patterned sofa classified as a zebra, or a car plus a little noise classified as a toaster. And then there's no obvious relation between multiple misclassifications, because the reasoning structure of the neural network is implicitly encoded in its weights.
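The sofa/zebra and toaster examples come from deep image models, but the mechanism -- many tiny, individually innocuous input changes adding up to a flipped decision -- can be sketched on a stand-in linear classifier; every dimension and number here is invented for illustration:

```python
import numpy as np

# Stand-in for the "car + little noise = toaster" effect: a linear
# classifier in high dimensions (published examples used deep image
# networks; this only illustrates the mechanism).
rng = np.random.default_rng(1)
dim = 1000
w = rng.normal(size=dim)    # "learned" weights
x = rng.normal(size=dim)    # an input; the model's label is sign(score)
score = w @ x

# Fast-gradient-style step: move every coordinate a tiny amount against
# the current decision. The per-coordinate change eps is small, but the
# score shift eps * sum(|w|) grows with dimension.
eps = 1.1 * abs(score) / np.sum(np.abs(w))
x_adv = x - eps * np.sign(score) * np.sign(w)

flipped = np.sign(w @ x_adv) != np.sign(score)
print(eps, flipped)   # a tiny per-coordinate change, yet the label flips
```

Each coordinate moves by a small fraction of a unit, yet the decision flips; for deep networks the perturbation is similarly invisible to a human, just found by gradient methods rather than in closed form.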

> The same factors would apply to any codebase of this complexity.

I think there's a fundamental qualitative difference here. A codebase can be complex, but ultimately it has a structure, and usually (in case of ML) represents a well-understood mathematical structure. Neural networks have simple code, and the whole complexity is hidden in opaque matrices of numbers, where even a single change usually has global effects.
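The "single changes have global effects" point can be sketched with a toy two-layer network (all shapes and values invented for illustration): editing one entry of the first weight matrix shifts every output, which is not how a local edit in an ordinary codebase behaves.

```python
import numpy as np

# Toy two-layer network; all shapes and values are made up for illustration.
rng = np.random.default_rng(3)
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 5))

def forward(x, W1, W2):
    # The code is simple -- the complexity lives in the weight matrices.
    return np.tanh(x @ W1) @ W2

x = rng.normal(size=(1, 4))
y_before = forward(x, W1, W2)

W1_edited = W1.copy()
W1_edited[0, 0] += 1.0          # "edit" one single weight
y_after = forward(x, W1_edited, W2)

changed = int(np.sum(np.abs(y_after - y_before) > 1e-12))
print(changed, "of", y_before.size, "outputs moved")
```

One scalar edit propagates through the hidden layer to every output, and there is no module boundary to contain it -- which is part of why weight-level inspection tells you so little.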

I'm not trying to dismiss NNs in general; I just don't trust them in applications where health and safety are at stake.


>Much stronger and harsher enforcement of traffic laws.

I'm sure that making the primary means of long (greater than walking) distance transportation for the majority of the population more expensive and higher-stakes is going to work out great in the long term. I can see the parallels with healthcare. Creating yet another part of life where a single screw-up can ruin the financial well-being of someone living slightly better than paycheck to paycheck is not going to do positive things in the long run.


I'm not thinking about screwups, I'm thinking about reckless behaviour and endangering people. For example, treating speed limits as suggestions instead of hard constraints, or overtaking in places where it's not allowed.


>I'm not thinking about screwups, I'm thinking about reckless behaviour and endangering people

But who determines which is which? The letter of the law really, really, really sucks when it comes to traffic law. I haven't collected data, but I'd wager that pretty much nobody follows the letter of the law for an entire drive from A to B.

>For example, treating speed limits as suggestions instead of hard constraints

That's more human than reckless. Outside of places with Orwellian enforcement (I'm looking at you, Europe, with all your cameras) they really are. The vast majority of people go at a speed they feel comfortable with in the conditions. This is why (conditions permitting) traffic flows at 80 even when the sign might say 55. People only follow the speed limit when they feel it's a comfortable speed. This is why the general recommendation is to set speed limits at the 85th percentile speed. If you don't do this on highways, you get people doing the (inappropriately low) speed limit in the wrong lane, passing on the right, tailgating, and all the other things caused by traffic friction -- more stuff for drivers to keep tabs on, which decreases safety for all. There have been studies on this (Google "traffic friction" and filter out everything that has to do with literal friction). If anything, speed limits on multi-lane roads should be raised to reflect the speeds people actually drive. I hope we'll see more dynamic speed limits in the future, since they'd help a lot.
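The percentile rule is easy to state concretely. A hedged sketch with invented spot-speed measurements (traffic engineering references usually cite the 85th percentile):

```python
import numpy as np

# Invented spot-speed study: the percentile rule sets the posted limit near
# the speed that most free-flowing drivers do not exceed.
rng = np.random.default_rng(2)
observed_mph = rng.normal(loc=63, scale=5, size=500)   # made-up measurements

limit = np.percentile(observed_mph, 85)    # 85th percentile speed
share_over = np.mean(observed_mph > limit)
print(limit, share_over)   # only ~15% of drivers exceed the chosen limit
```

The idea is descriptive rather than prescriptive: the limit is derived from what drivers already do in the prevailing conditions, so most traffic is legal by construction.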

>overtaking in places where it's not allowed.

If every 100th instance of an illegal pass (usually on the shoulder when waiting for someone who's stopped to take a left turn, or on the right on the highway) in my state resulted in a ticket, it would probably be about a year before half the state's drivers hit the three-strikes cutoff and had their licenses revoked.

The picture I'm trying to paint here is that aggressive enforcement of existing laws would probably be bad for the population at large, because people break traffic laws in inconsequential ways all the time; stronger enforcement would just screw people over (unless, of course, you enforce the laws so well that important people get screwed, which would result in the laws changing), and discretion isn't an answer because that just results in profiling.


Whenever someone labels something "Politically impossible" I laugh.

America banned alcohol by constitutional amendment, pretty much the hardest political barrier we have.

It's just an excuse IMO


Prohibition was adopted by statute first.



