It's not the title that's the problem. It's the part where people go and write software that gets deployed at scale in an environment where bugs can cause very real and significant damage (monetary or otherwise).
The title is part of the problem, because it reveals the culture: slapping on cool titles without living up to what those titles actually mean.
As for the rest, anything that brings computing up to the level of other professions has my signature.
A Software Engineering professor of mine used to say that many applications are akin to buying shoes that randomly explode when you tie the shoelaces, whereas a minor defect in real shoes gets you a full refund.
> The title is part of the problem, because it reveals the culture: slapping on cool titles without living up to what those titles actually mean.
The irony is that there are actual disciplines in software that are worthy of being called "engineering": how the hell does an engine ECU work with the level of precision that it does? ABS systems? Hell, how about most of the electronic control systems on an airplane?
These are some of the most impressive feats in software development, and I've heard near 0 about any of them.
Yet the "industry" is hyper-focused on mashing together "containerized" monstrosities to put strings in databases, or to find a new way to add a chatbot to something that doesn't need it.
I don't know you, but I bet that if you and I were locked in a room together for a month, we wouldn't be able to agree 100% on "codified standards for how to do things" :)
The industry isn't mature enough for that, and it's doubtful it ever will be. See the halting problem.
I don’t think it’s necessary to agree completely. You could start by codifying a minimal set of things that the majority of people agree on (user data sanitisation, authentication handling, etc.) and then build on it over time.
The standards could also help codify more meta things, like vulnerability policies, reporting and outages. This would be helpful to form a dataset which you can use to properly codify best practices later.
The main problem is that this raises the bar for doing software development, but you can get around that by distinguishing serious software industries from others (software revenue over a certain size, industries like fintech, user data handling, etc.).
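To make the "user data sanitisation" item concrete (my illustration, not the commenter's): one practice nearly everyone already agrees on is parameterised queries instead of string interpolation when user input touches SQL. A minimal sketch with Python's stdlib `sqlite3`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'a@example.com')")

user_input = "alice' OR '1'='1"  # hostile input attempting injection

# Unsafe: f-string interpolation lets the input rewrite the query itself.
# conn.execute(f"SELECT * FROM users WHERE name = '{user_input}'")

# Safe: the driver binds the input strictly as data, never as SQL.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the injection attempt matches no row
```

A standards body wouldn't need to solve anything deep to mandate this kind of thing; it's exactly the sort of "bare minimum" that could be codified first.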
ye gods, can we come up with a "law" to describe appealing to the halting problem?
Just because there are unanswered questions doesn't mean we can't have bare-minimum codified standards.
Furthermore, standards aren't invalidated just because practitioners disagree with them. Plenty of <insert engineer type>s disagree with the standards body of their respective field; they still follow the standards, out of fear of prosecution or simply as the path of least resistance, and when those standards are found to be defective, they (generally) evolve.
We don't need to solve the halting problem. We just need to come up with a sensible set of practices that, if followed, make the risks small enough to be considered acceptable. Then we can point at that list and say, "this is what the reasonable expectation of due diligence in software engineering is" - and legally enforce that.
Every other discipline has education requirements, codified standards for how to do things, etc.