
There are numerous references to increased congestion as a result of the reduced speed limits in built-up areas, but the UK has a fairly strong "you should always be 2 seconds behind the car in front" element to its driving rules. Naively, if there are two seconds between every car, then the number of cars passing a particular point in a given amount of time is the same regardless of how fast they're travelling, so is there a real model where a lower speed limit reduces throughput in a meaningful way without violating this constraint?
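For what it's worth, here's the back-of-envelope model behind that intuition as a rough Python sketch, treating cars as points that each hold the 2-second gap (the 2-second figure is the only thing taken from the question; the rest is just my own illustration): the spacing scales with speed, but cars-per-hour past a point does not.

    # Point-car model: every car stays 2 s behind the one in front.
    GAP_S = 2.0
    MPH_TO_MS = 0.44704

    for mph in (10, 20, 30, 70):
        spacing_m = mph * MPH_TO_MS * GAP_S   # gap distance at this speed
        flow_per_hr = 3600.0 / GAP_S          # speed cancels out entirely
        print(f"{mph:>2} mph: gap ~{spacing_m:5.1f} m, {flow_per_hr:.0f} cars/hour")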


This is asymptotically true at high speeds, but false at low speeds, where the distance covered in 2 seconds is comparable to the average length of a vehicle.

If the speed limit were (magically) 0.001 mph, then raising it to 0.002 mph would roughly double throughput.
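A rough sketch of that correction, assuming an average vehicle length of about 4.5 m (my own ballpark, not a figure from the thread): the time headway becomes 2 s plus the time a car takes to cover its own length, so at tiny speeds throughput is roughly proportional to speed.

    # Gap-plus-length model: headway = 2 s + (car length / speed).
    GAP_S = 2.0
    CAR_LEN_M = 4.5      # assumed average vehicle length
    MPH_TO_MS = 0.44704

    def flow_per_hour(speed_mph):
        v = speed_mph * MPH_TO_MS            # speed in m/s
        return 3600.0 / (GAP_S + CAR_LEN_M / v)

    # Doubling a tiny limit roughly doubles throughput:
    print(flow_per_hour(0.002) / flow_per_hour(0.001))   # ~2.0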


But 20mph is not low enough for that to be a problem.

2 seconds at 20mph is ~18m.


It is not an overwhelming factor, but if, for example, you were to go from 20mph to 10mph, you would see an effect.

Overall, lower speed limits (when streets are designed to take advantage of them) have a lot of benefits; this is in the category of minor drawbacks.
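For a rough sense of scale, using the same gap-plus-length model and again assuming a ~4.5 m car: at 20mph (8.94 m/s) the headway is 2 + 4.5/8.94 ≈ 2.5 s, about 1440 cars per lane per hour; at 10mph (4.47 m/s) it's 2 + 4.5/4.47 ≈ 3.0 s, about 1200, a drop of roughly 17%.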





