It's less about risk aversion than about position, size, and complexity. As these grow, the incentives change, and understanding what the organization even is becomes impossible.
A startup starts at the bottom. It begs for investors, customers, and employees from a position of optimism and humility. The organization enthusiastically changes itself to find a good balance between those three, or it dies. As the organization grows, it starts demanding everyone else change for it instead. Google's interviews are an example. Its infamous customer service is another.
Then we get to size and complexity. Thanks to Dunbar's number, we know there are numerical limits on a human's ability to know people. This makes sense. I can know everything about 6 people, most things about 50, and keep track of about 250 well enough. As the organization grows, your ability to know it disappears. You begin making abstractions. Instead of knowing exactly what Susan does, you say she works in X Department, for Y Initiative, in Z role.
Google is so big that no one person can understand it anymore. The inevitable reduction to a corporate abstraction occurs, and people treat it like X Company, which is just like Y Company except it makes X instead of Y. In the end, short-term revenue and expenses are the only measures left.
And into this faceless abstraction the professional parasite class moves, extracting resources and morale. Eventually the C-suite stops fighting it and joins in, until only the sheer size and momentum of the company keep it going. Maybe an investor group will come along and force a rework of the company, but not before it is a shadow of a shadow of its former self.
I explain it in the third paragraph, but to illustrate it further: consider a function. A function that is one line long is immediately understandable. At 10 lines, it is readable within a minute or two. At 100 lines, it is maybe legible to someone who lives in that function. At 1,000 lines, it is a black box. Human organizations are the same way.
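To make the contrast concrete, here's a sketch (names invented for illustration): two functions with identical behavior, one a single expression, the other the same logic stretched across mutable state and branches. Scale the second pattern by a couple orders of magnitude and you get the black box.

```python
# One line: the whole contract is visible at a glance.
def total_short(xs):
    return sum(x * x for x in xs)

# Identical behavior, spread across state and control flow.
# Still tiny, yet you already have to simulate it in your head
# to trust it; at 1,000 lines of this, nobody can.
def total_long(xs):
    acc = 0
    i = 0
    while True:
        if i >= len(xs):
            break
        v = xs[i]
        sq = v * v
        acc = acc + sq
        i += 1
    return acc
```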
You might suggest refactoring, which is what companies do too. They create departments, promotion ladders, org charts, and mission statements. The problem is that abstractions leak by design. As your abstractions accumulate and change outside your view, your ability to understand the whole shrinks.
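The code-side version of a leaky abstraction, as a hedged sketch: two containers expose the same "push to front" operation, but the hidden cost differs wildly, so the abstraction leaks the moment you care about scale. (Function names here are made up for illustration.)

```python
from collections import deque

# Same abstract operation, very different hidden costs:
# list.insert(0, x) shifts every element (O(n) per push),
# deque.appendleft(x) is O(1). The interface hides this,
# which is exactly where the abstraction leaks.
def push_front_list(items: list, x) -> list:
    items.insert(0, x)
    return items

def push_front_deque(items: deque, x) -> deque:
    items.appendleft(x)
    return items
```

Both look interchangeable at the call site; only the one who knows what's underneath sees the difference.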
But that has to do with the capabilities of the executives involved; it doesn't make it impossible. Just like in your example, there are many, many developers who can perfectly understand large functions or code bases without issue.
If you have such a code base and you hire people who are not equipped, whether through inexperience or lack of ability, to manage it, that is a resourcing issue.
If your executives cannot understand and control the organization they are tasked with controlling and cultivating, then they should be fired.
Except large code bases do the same. They regularly die when their ability to be understood drops too low. Even well-organized code gets pushed to add features until it isn't understood at the deepest level. Once you hit millions of lines, even after decades in that code base, you forget changes you made yourself, even if you keep an overarching picture. And that's ignoring everyone else working on it all the time. Understanding gets reduced to contracts, types, and interfaces.
And most importantly, humans are more complicated than code. With enough time and knowledge, I can accurately tell you what any piece of code does down to a single expression or statement. Humans regularly do things they don't even know they're doing, for purposes they don't understand.
Do you have any resources for learning this? How do you untangle the situation? And what if resources aren't actually the problem to tackle, but rather complexity that is hard to untangle?
Too many things are going on, involving too many people, and nobody can possibly keep track of it all in their head. You have to split it up. But by splitting it up, the left hand no longer knows what the right hand is doing.
So controls and processes are put in place to ensure no bad outcomes are possible, but this also prevents good, innovative outcomes from sprouting.
Fundamentally, it's a loss of the trust that can exist in a smaller organization.