Both Android and iOS, but especially iOS, have introduced large accessibility regressions by switching to a heavily gesture-based interface, in which many quite similar swipes do different things, changing the screen context and revealing hidden modes. For people with, e.g., Parkinson's disease, this is catastrophic. Voice commands can partly compensate, but I think we're all familiar with the many ways voice assistants fail at even simple requests, and with how sensitive they are to changes in both the speed and the volume of speech. While iOS does offer some good accessibility settings, they are not comprehensive, and overall usability has declined substantially in some major ways, even as more and more tasks must be done online.