Sure, but I assumed he was referring to the biggest computational bottleneck. All of those others are just tooling problems or data-quality problems. For DNNs, the hours-to-weeks-long training times make it hard to iterate manually or do any kind of rational optimization of architecture and hyperparameters.
Sure, if we just had stationary, perfect data and perfect objective functions, it would only be computational, operational, monitoring, and maintenance complexity holding us back.
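To make the iteration problem concrete: a back-of-the-envelope sketch of why long training runs cripple hyperparameter search. All the numbers here (`TRAIN_HOURS_PER_RUN`, `BUDGET_HOURS`) and the search-space bounds are hypothetical assumptions, not from the thread.

```python
import random

# Assumed costs, purely illustrative: each training run takes hours,
# so a fixed compute budget only covers a handful of trials.
TRAIN_HOURS_PER_RUN = 6   # assumed cost of one full training run
BUDGET_HOURS = 24         # assumed compute budget for the search

def sample_config(rng):
    """Draw one hyperparameter configuration at random."""
    return {
        "learning_rate": 10 ** rng.uniform(-5, -2),
        "num_layers": rng.randint(2, 12),
        "batch_size": rng.choice([32, 64, 128, 256]),
    }

def random_search(rng):
    """Sample as many configurations as the budget allows."""
    n_trials = BUDGET_HOURS // TRAIN_HOURS_PER_RUN
    return [sample_config(rng) for _ in range(n_trials)]

configs = random_search(random.Random(0))
print(len(configs))  # -> 4: only four points in a huge search space
```

With four samples against a search space of thousands of plausible configurations, the "optimization" is little more than guessing, which is the iteration bottleneck the comment describes.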