I have limited web development experience, so maybe someone can fill me in. Why is securing a web server so difficult? I understand they're complicated systems built on many technologies, but even large companies with lots of smart engineers can't seem to solve this problem. Can most of these hacks be attributed to user error?
It's a combination of things, but mostly, every piece of software expands your attack surface. Most systems are not written with defence-in-depth and most have all-or-nothing authentication/authorisation.
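To illustrate the all-or-nothing point, here's a hypothetical sketch (the names and permission sets are invented, not from any real system):

    def do(action):
        # Stand-in for the actual work (a query, a file write, etc.).
        return f"performed {action}"

    # All-or-nothing: one boolean gates everything, so any bug that
    # flips it (or any code path that forgets the check) yields full access.
    def handle_all_or_nothing(user, action):
        if user.get("is_admin"):
            return do(action)
        raise PermissionError("denied")

    # Defence-in-depth flavour: each action is checked against an explicit
    # per-user permission set, so one compromised path exposes only what
    # that user could already do.
    PERMISSIONS = {
        "alice": {"read_posts", "write_posts"},
        "bob": {"read_posts"},
    }

    def handle_scoped(username, action):
        if action not in PERMISSIONS.get(username, set()):
            raise PermissionError(f"{username} may not {action}")
        return do(action)

    print(handle_scoped("alice", "write_posts"))  # allowed
    # handle_scoped("bob", "write_posts")  -> raises PermissionError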
For shared hosts it's worse: a flaw that punches through can affect many customers at once. And there's often a larger attack surface because of the complexity of hosting multiple customers in a single system.
Take, for example, WordPress. It uses a single login to its MySQL database, and plugin code has the same level of access to MySQL as the core WordPress code. If there's a defect in either, your MySQL database is wide open to arbitrary SQL statements.
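Concretely, the classic failure mode is SQL injection. A minimal sketch in Python (using the stdlib sqlite3 as a stand-in for MySQL; the table and inputs are made up):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
    conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

    # Plugin-style bug: building the query by string concatenation means
    # attacker input becomes part of the SQL itself.
    evil = "' OR '1'='1"
    rows = conn.execute(
        "SELECT * FROM users WHERE name = '" + evil + "'").fetchall()
    print(rows)  # every row comes back, not just the requested user

    # Parameterised queries keep input as data, never code.
    rows = conn.execute(
        "SELECT * FROM users WHERE name = ?", (evil,)).fetchall()
    print(rows)  # []

One such defect in any of the dozens of plugins a typical install runs, and the whole database is exposed.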
If you didn't configure the database properly, that wide-open pathway now leads to other databases, or to shell access with interesting privileges, and so on.
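"Configured properly" here mostly means least privilege. A hedged sketch, assuming the pymysql driver and made-up account/database names, of how you'd contain the blast radius:

    import pymysql  # assumed driver; any MySQL client works the same way

    # Connect as an administrative user solely to set up the app account.
    admin = pymysql.connect(host="localhost", user="root", password="...")
    with admin.cursor() as cur:
        # One account per application, locked to one schema. Even a full
        # SQL-injection hole then can't touch other customers' databases,
        # read files off disk, or mint new users.
        cur.execute(
            "CREATE USER 'wp_app'@'localhost' "
            "IDENTIFIED BY 'long-random-secret'")
        cur.execute(
            "GRANT SELECT, INSERT, UPDATE, DELETE "
            "ON wp_db.* TO 'wp_app'@'localhost'")
        # Deliberately not granted: FILE, PROCESS, SUPER, GRANT OPTION,
        # or anything on other schemas.
    admin.commit()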
So the design of WordPress (and in fairness, pretty much all web apps follow a similar pattern) both increases the attack surface and decreases the depth of defence.
It can happen for a number of reasons, and it isn't necessarily specific to securing a web server.
Sometimes security issues get the "well, it isn't an issue right now" treatment, get put in a backlog, and are never resolved. In other cases it's a lack of good code reviews or security reviews, or inexperienced developers (at least inexperienced with that kind of security).
The truly sad part is the number of examples of this kind of thing blowing up, yet companies are still making the same mistakes. At what point does someone think of checking their own app's password practices to ensure they aren't next?
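For anyone doing that check: the baseline is salted, deliberately slow hashing rather than plaintext or a bare MD5/SHA-1. A minimal sketch with Python's standard library (PBKDF2; the iteration count is illustrative, tune it to your hardware):

    import hashlib, hmac, os

    def hash_password(password: str) -> tuple[bytes, bytes]:
        # Store the salt and digest, never the password itself.
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return salt, digest

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return hmac.compare_digest(candidate, digest)  # constant-time compare

    salt, digest = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, digest)
    assert not verify_password("hunter2", salt, digest)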
One practice that's starting to help, a bit, and it pains me to say it, is security audits. There are a few standards emerging for SaaS services, especially at the B2B level. And as startups (or non-startups) start intersecting with large, regulated, publicly held businesses, these requirements propagate and, with audits, get reviewed repeatedly.
The process is far from perfect, and many of the standards are laughably lax (and yet ... they still aren't met). Because they're standards, the requests are fairly uniform. A number of issues regarding Dreamhost have been on my own back burner for months, and I'm finally getting that Round Tuit this afternoon.
Pushing for good, solid standards would be a net benefit. That's the silver lining here.
This sounds promising. I've seen the answers on Stack Exchange and Quora about web security, but until my site is tested by a hacker it's difficult to know whether I've done everything properly. So I like the idea of some kind of audit to help with this process. Maybe hiring some white hats to pound on my site would be a good idea.
Standards and security audits in their current state make me cringe. I certainly agree with you that they'll help, but many are draconian: things like physical access lists when you use cloud services, or policies on tape backups. :)
I'm hoping that they'll evolve toward slightly more sanity than insanity.
Some institutional laws suggest that this is wildly optimistic on my part. Security theater exists because bureaucrats and legislators want to appear to be doing something. Vendor-based solutions are mandated because vendors have political clout. Effective tools are difficult to deploy in real-world scenarios: environments filled with failing devices, intermittent communications, poor training, and worse end-user understanding.
Securing them properly is pretty straightforward, but it takes time and skill, and time + skill = money. Web hosting companies compete primarily on price, because it's the only thing that's easily compared. How does one measure a service provider's competence? As with used cars, the quality ones get pushed out of the market by the dirt-cheap lemons.
It's more cat and mouse: as complexity grows with computing power, a gap opens between the state of the art and the commodity systems sold for profit in global markets. Because UNIX works so close to the hardware, it's especially prone to novel methods of attack if left unattended.