A rule of thumb I've heard from various companies is that employees should be worth at least three times their salary to the company. That is, your turnover should exceed sum(3*salary_i) over all i employees.
A nice Fermi problem: IBM's revenue (2014) is around $90bn; by this rule of thumb, that could support around 300,000 developers at a mean salary of $100k. Conversely, from the model above, IBM's 400k employees should be producing a turnover of $120bn. The shortfall of $30bn equates rather nicely to the 100k employees they're laying off.
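The back-of-envelope arithmetic can be sketched in a few lines (the figures are the ones above: ~$90bn revenue, 400k headcount, an assumed $100k mean salary, and the 3x multiplier):

```python
# Fermi estimate: an employee should return ~3x their salary in revenue.
MULTIPLIER = 3
MEAN_SALARY = 100_000        # assumed mean salary, USD
REVENUE = 90e9               # IBM 2014 revenue, ~$90bn
HEADCOUNT = 400_000          # approximate IBM headcount

# How many employees can the revenue support at the rule-of-thumb rate?
supported = REVENUE / (MULTIPLIER * MEAN_SALARY)
print(f"supported headcount: {supported:,.0f}")       # 300,000

# Conversely, what turnover should 400k employees be producing?
required = HEADCOUNT * MULTIPLIER * MEAN_SALARY
print(f"required turnover: ${required / 1e9:.0f}bn")  # $120bn

shortfall = required - REVENUE
excess = shortfall / (MULTIPLIER * MEAN_SALARY)
print(f"shortfall: ${shortfall / 1e9:.0f}bn, "
      f"or about {excess:,.0f} employees")            # $30bn, ~100,000
```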
This seems oversimplified. IBM is big internationally; they have as many, perhaps even more, employees in India as in the US. Using a "mean" salary of $100k seems like an oversimplification: not just India and the US, but most other countries have different pay scales. Though it's interesting that you've made the numbers work.
Of course, it's a gross oversimplification; I was just surprised at how well the numbers worked. I would argue, though, that it's a good rule of thumb, i.e. their revenue is not sustainable given the number of employees they have.
IBM, like any company, will employ support staff, admin/HR, cleaners, managers and so on. Plenty of those people earn over $100k and plenty earn a lot less. Ultimately we don't know the mean salary of the people they're (maybe) laying off.
But hey, it's a Fermi problem, order of magnitude. I would guess it's not too far off.