
I'll attempt an answer.

1/ Imagine running >1000 legacy applications, some never updated in 20 years.
2/ Imagine a byzantine mix of local data centers and VPCs in AWS/GCP/Azure.
3/ Imagine an IT department run by a lot of people who have never learned anything new since they were hired.

That would be your typical large, boring entity: a bank, a public utility, or one of the many big public companies.

Yeah, there is no law of physics preventing this, but it's actually nearly impossible to disentangle an organization from decades of mess.



That's why we've invented emulators, sandboxing, ...

People have continued to run old management systems inside virtual machines and similar environments. You can sandbox them, reset them, and do all kinds of wondrous things if you use modern technology in an era-appropriate way: run your old Windows software inside a VM, or tweak it to run well under Wine if you have the source. The reason this mess happened is that all of that software is literally running on a desktop OS in mission-critical applications.
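The snapshot-and-reset workflow is genuinely a few commands. Here is a minimal Python sketch driving qemu-img against a qcow2 disk image; the disk path and snapshot name are hypothetical, and it assumes the VM is powered off while the image is modified (internal snapshots also require the qcow2 format):

    #!/usr/bin/env python3
    # Minimal sketch: snapshot and reset a legacy VM's disk with qemu-img.
    # Assumes a qcow2 image and a powered-off VM; paths and names are
    # hypothetical.
    import subprocess

    DISK = "legacy-xp.qcow2"  # hypothetical disk image for the old machine

    def snapshot(name: str) -> None:
        # Record the current disk state as an internal snapshot.
        subprocess.run(["qemu-img", "snapshot", "-c", name, DISK], check=True)

    def reset(name: str) -> None:
        # Roll the disk back to a previously recorded snapshot.
        subprocess.run(["qemu-img", "snapshot", "-a", name, DISK], check=True)

    if __name__ == "__main__":
        snapshot("known-good")  # baseline before anyone touches the VM
        # ... boot the VM, let the fragile legacy app do its thing ...
        reset("known-good")     # wipe whatever happened; back to a clean state

Rolling back to a known-good state takes seconds, which is exactly the kind of thing you can't do when the legacy app owns the bare metal.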

I worked as an embedded engineer for a while, and I can't count the amount of nonsensical stuff I've seen incompetent people running on unpatched, obsolete Windows XP and 7 machines. This mess is 100% self-inflicted.


I think these are just technical excuses; the real answer lies somewhere in politics and economics. If the people in charge decide to act, then we tech nerds will migrate and refactor those 1000 applications and untangle 20 years of byzantine code mess. I've seen entities so large and boring they could barely take one step suddenly change and evolve rapidly once their economic stability was at stake, and this is a great example of the kind of disruption that can push them into the chasm of change.



