Most Python projects aren't this tough. I suspect they're using wonky libraries like Pandas, NumPy, or some such that prioritize raw power over ease of installation.
I've not touched Python in a couple of years, but Pandas/NumPy used to be the de facto libs for anything to do with data science. Are they considered "wonky" now?
I mean, there are at least two different companies (Enthought and Continuum) that were founded on making the major scientific Python packages easier to install.
Just to be clear, pandas and numpy are not the "wonky" libraries. They are, in my experience, two of the most easily installed and dependency-managed libraries in Python, given their ubiquity and maturity. Maybe there are machine configurations I'm not familiar with where they don't install cleanly, but I've never seen them cause issues. Usually it's CUDA or other GPU stuff, or conflicts in less regularly maintained packages.
To be honest, NumPy is easy to install on all major platforms. In deep learning I've almost never seen pandas used, but deep learning projects do have problems with PyTorch: some of them pin an old PyTorch version that just doesn't work with newer (or older) Python versions.
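For example, a pin like this in a project's requirements.txt (version numbers are illustrative, not taken from any particular project) is the kind of thing that bites you:

    # requirements.txt (hypothetical example)
    torch==1.4.0     # old release; prebuilt wheels only cover a narrow range of Python versions
    numpy==1.18.1    # pinned to match the old torch build

If your system Python is newer than what those wheels were built for, pip has nothing compatible to install, and you end up juggling a separate interpreter (pyenv, conda, etc.) just for that one project.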