Personally, I find it astonishing that the manufacturer hasn't caught this.
Here's my (probably) unpopular opinion: In the very near future, you might have to become 'certified' as a programmer to work on anything that touches the internet. EDIT: AND has the possibility of killing someone.
Much in the same way that a regular Joe can't build a bridge over a river without a few civil engineering certifications, a programmer wouldn't be able to write an internet-connected program that also controls a 2-ton moving vehicle without the equivalent credential.
There are rookie mistakes, and there are systematic failures; a LOT of these recent issues seem to be systematic in nature. The easiest fix in that case is to fix the system.
Certifying the people only really allows for punishing the people responsible for not anticipating problems in their systems. To make the certification worth anything you'll need a set of guidelines against which these kinds of systems should be engineered. Formally verifying the system as a whole against those guidelines seems like a far more effective approach, as it (theoretically) results in the system itself being verified as safe.
When Ford released the Pinto, they determined the cost of simply paying out potential lawsuits for driver deaths would be less than the cost of actually fixing the cars' tendency to explode when rear-ended, so they just let people die until the numbers balanced out[0].
Granted, that was a hardware issue and not a software issue, but if the risk of a class-action lawsuit is involved, and that risk is magnified by a recall or public awareness of the problem, then the motivation may be the same.
> In the very near future, you might have to become 'certified' as a programmer for anything that touches the internet.
I think a better option would be to mandate that certain things can't touch the internet. It would be absurd if I, someone who writes PHP/SQL and C# CRUD apps for a living, had to be certified as a systems programmer (or the other way around).
The problem to me is the insidious way we as a society have sort of collectively accepted that connecting things to the internet is inevitable and always a good idea. Instead of making sure the engineers who connect the car's brakes and steering to the internet through the radio are competent, how about not making that decision to begin with?
The Pinto analogy is dangerous. The flaw in the Pinto was that when it got into a certain, common type of collision, there was a higher-than-normal chance that its fuel tank might explode.
The risk was NOT that because of this vulnerability, there was a chance that bad guys might go around ramming into Pintos to make them explode. Who would even do that? Or maybe that if you rammed a knitting needle into the back of a Pinto it would trigger a fuel leak. Well so what? Probably similar issues exist in any number of cars, but it's not something people would actually do (and if they did it would obviously be criminal), so... that's not really an engineering flaw.
So the insecurity of infotainment systems in cars is NOT like the Pinto issue. Nobody is suggesting that when the infotainment system plays a Mariah Carey song it will, because of a bug, shut off the brakes. Or that during a common kind of collision, the infotainment system causes the airbags to fail to fire. No, this is only a risk if someone actively attacks the infotainment system and deliberately shuts off the brakes, or triggers an acceleration, or whatever.
There are already ways to physically sabotage a car to make it dangerous to drive - cut the brake lines, puncture the fuel tank, whatever. Those are, to a certain way of thinking, security vulnerabilities. If you're someone who is concerned about people trying to kill you, you probably take extra precautions to secure your car to mitigate those risks. Ordinary people just park their car on the street and assume assholes are not going to trigger a slow leak in their brake fluid.
So why should ordinary people be concerned that someone might install a trojan in their infotainment system which is going to shut off their brakes on the highway?
The chance may be low, but I think people should be concerned because it exists at all. It doesn't have to be possible to install a trojan that shuts off a car's brakes on the highway - that possibility is the result of engineering choices and compromises which could have, and still could, be avoided. It doesn't have to be an acceptable level of risk for drivers to take on themselves, either. That there's little chance of someone taking advantage of it is irrelevant.
My point is that we're used to thinking of computer security as being an absolute necessity because, for example, it is of value to a bad guy to simply take control of a network-connected computer to become part of a botnet. Or to use it to obtain valuable personal data which passes through the computer. But I think that leads us to overestimate the risk created by the possibility that hacker activity could extend to the physical realm, because if a hacker crosses that line, there are real-world consequences for the attacker, not just the victim.
My car's infotainment system doesn't know my credit card number. It probably makes a pretty poor botnet member (but even if someone does hack it and turn it to churning out spam, they're not going to be trying to get to the CAN bus). And there's little value to a bad guy in just breaking my car for no reason. If someone just wants to break cars and hurt people, they can go throw rocks off an overpass. People do do that, admittedly, but they're not revealing an engineering flaw in cars by doing so.
If you can use this hack to pop the door locks and override the ignition, then maybe we should be talking.
You can build a bridge over a river if you own the property it traverses.
The civil engineering certifications are only to absolve you of personal liability if any person crossing the bridge is injured due to its failure. That is sort of important, but the same could also be achieved by the owner posting it against trespass, and requiring that any authorized users sign a liability waiver beforehand.
If you own the rights to code that ends up causing injury due to a known flaw in it, you're liable for the harm. That's all there is to it.
The certification-based "out" only applies if the auto manufacturer hired exclusively certified software professionals and allowed them to do their jobs without coercing them to cut corners. This is something that will clearly not happen until well after the first huge damage award for software malpractice occurs.
I am emphatically against any prior licensing restraint on individuals seeking to enter or innovate in this industry. That's just protectionist bullshit. It doesn't make the code better. A legal regime that rewards following software best practices by limiting product liability, on the other hand, that seems rather useful.
NCEES has introduced a software engineering PE exam.[1] Not many people are taking it yet (18 total test takers for April 2015) [2] but that isn't surprising since it's brand new, many "software engineers" don't qualify for licensure, and only a bit more than half the states even license for software PEs. Personally, I don't see how much of a difference licensure will make for safety-critical projects given that regulated industries are already, well, regulated. Nevertheless, forcing someone to take professional and personal liability for their work, as well as establishing a common body of knowledge, both strike me as good in general for safety-critical work.
Respectfully, I think programmer licensure is an oversimplification of the problem. Far more goes into the design of software systems than just the software. Product Management often wants things done a certain way. Hardware guys want things done their way.
Take this hacking example. Couldn't you imagine a situation in which a licensed software engineer is pleading with PM to separate the entertainment system from the controls? But PM pushes back, and wins, on account of cost savings?
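To make that concrete, here is roughly what "separating" could look like at the boundary between the two buses. This is only a sketch: the message IDs, type names, and the assumption that speed-request frames are the only legitimate infotainment traffic are all made up for illustration, not taken from any real vehicle platform.

    /* Sketch of a gateway ECU that sits between the infotainment bus and
     * the drivetrain bus and forwards only a short whitelist of frames,
     * so a compromised head unit has no path to brake or steering
     * commands. All IDs below are hypothetical. */
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        uint32_t id;      /* CAN arbitration ID */
        uint8_t  len;     /* payload length in bytes */
        uint8_t  data[8]; /* payload */
    } can_frame_t;

    /* The only frames the head unit may send toward the drivetrain bus,
     * e.g. a request for current speed to scale the map display. */
    static const uint32_t allowed_from_infotainment[] = { 0x3E8, 0x3E9 };

    static bool gateway_permits(const can_frame_t *frame)
    {
        for (size_t i = 0;
             i < sizeof(allowed_from_infotainment) / sizeof(allowed_from_infotainment[0]);
             i++) {
            if (frame->id == allowed_from_infotainment[i])
                return true;
        }
        return false; /* everything else from the infotainment side is dropped */
    }

    int main(void)
    {
        can_frame_t speed_request = { .id = 0x3E8, .len = 0, .data = {0} };
        can_frame_t brake_command = { .id = 0x220, .len = 2, .data = {0} }; /* hypothetical */

        printf("speed request forwarded: %d\n", gateway_permits(&speed_request));
        printf("brake command forwarded: %d\n", gateway_permits(&brake_command));
        return 0;
    }

The point is that whether that boundary gets enforced at all is an architectural decision made long before any individual programmer writes a line of code, which is exactly where PM pressure comes in.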
> Take this hacking example. Couldn't you imagine a situation in which a licensed software engineer is pleading with PM to separate the entertainment system from the controls? But PM pushes back, and wins, on account of cost savings?
Then the engineer, being professionally and personally liable for their work product, notes that they are ethically obligated to "hold paramount the safety, health, and welfare of the public" and that "if engineers' judgment is overruled under circumstances that endanger life or property, they shall notify their employer or client and such other authority as may be appropriate."[1]
> I have no trouble requiring a certification for developing software that runs on systems upon which lives depend, but... the internet?
Yeah, you're right. I suppose I was a bit overzealous with my first comment. Systems that can cause immediate harm should require the certifications, as I mentioned.
The internet in general would be very difficult to manage.
These stories probably need to be reframed as, e.g., "Infotainment Flaw Allows Complete Vehicle Takeover", rather than being hacker-centric. It may be difficult, ego-wise, to step out of the spotlight (like the giddy hackers in the Wired video [1]), but it's necessary.
Personally, I would blame both. Manufacturers should be criminally liable if anyone gets injured, and they should be required to issue a recall for defects. Security research should be legal and encouraged. However, anyone who maliciously uses a defect to injure someone (or attempts to) should be treated as if they had used a physical weapon.