Great points. And I'd add that the other implementation issue here is the human system that any company's intentions get filtered through. So the interests modeled are not those of the corporation as a whole, but C-suite intentions passed through executive intentions, manager intentions, and then worker intentions.

E.g., it might be in the long-term interest of the company to not run over pedestrians. If nothing else, the PR cost is very high. But when an exec wants to be first in the field to demonstrate personal success, then hitting a made-up date becomes the priority, which implicitly pushes "not kill people" lower. Middle managers don't want to get blamed, so they'll favor a more complex, muddled organizational structure. Per Conway's law, that means muddled code. So instead of the car's software reflecting "don't run over pedestrians" as a key goal, its true priorities are things like "gives a good demo", "was delivered 'on time'", and "reflects an architecture clever enough to get that architect promoted that quarter".

It's not impossible that good software comes out of this system, but the deck is certainly stacked against it.
