My own understanding of the difference between AI and AGI is that the former is just a bag of tools we build to solve specific problems efficiently, while the latter would be the unifying tool, and perhaps, even though we don't understand how, some consciousness would arise from its complexity. So until we get to AGI, human software developers will very much be needed to do the plumbing and maintenance of these interconnected AI systems. Once/if AGI comes to fruition, it should be able to reach a meta-understanding beyond the sum of its parts and to grow and maintain itself without supervision from us (maybe not at the beginning, but eventually). It would still need engineers to maintain its physical side, but that would no longer be programming as we know it now. And AGI is far, far away, if not an illusion we can never pull off, so I wouldn't worry much for now.