Right? As I was reading GP, all I could think was that there are a million other instances of an issue having two interested but opposed parties, one larger than the other, and in advanced pluralistic democracies we generally find ways to balance the two interests that don’t involve completely hosing one party. I recognize that the Electoral College was developed in a context where people were very concerned about tyranny of the majority, but it’s a drastic solution to a not altogether exceptional problem.
It’s unfortunate that it’s third party, but conda has the unquestionable advantage of being the only Python-centric packaging system that has a reasonable shared binary library story.
I'm curious, do you not find wheels + manylinux reasonable? I agree that until recently Conda definitely had that advantage, but now that you can `pip install scipy` and have that get you a working library without trying to compile things on your machine, what does Conda offer beyond that?
I guess one thing Conda has that the pip ecosystem doesn't is that it supports installing non-Python shared libraries like libcurl on their own. Is that an advantage? (We absolutely could replicate that in the pip ecosystem if it were worth doing, and it's not even totally unprecedented to have non-Python binaries on PyPI.)
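Concretely, what that looks like (libcurl is just the example, and I'm assuming whatever channel you use actually carries it, as conda-forge does):

    # conda treats the C library itself as a first-class package, installed
    # into the environment alongside everything else there
    conda install -c conda-forge libcurl

    # there's no direct pip equivalent: PyPI packages that need libcurl either
    # bundle a copy inside the wheel or assume the system already provides it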
I think it would definitely be great if pip could install non-python dependencies. One problem right now is that many projects will tell you to just pip install xyz. You execute that, things start building, and the process fails partway with some cryptic message because you're missing an external dependency. You figure out which one, you install it, start again, and another dependency is missing. Rinse and repeat. It's definitely not a turnkey solution, and this issue trips up newcomers all the time.
With respect to versioning, I think pip should be way more strict. It should force you to freeze dependency versions before uploading to PyPI: not accept "libxyz > 3.5", but require a tight range or a single pinned version. That would make packages much less likely to break later because newer versions of their dependencies no longer work the same way.
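To make the contrast concrete, in requirements/install_requires specifier syntax (libxyz and the numbers are just my hypothetical):

    # open-ended range: the sort of thing I'd want rejected at upload time
    libxyz > 3.5

    # what I'd want required instead: an exact pin, or at least a bounded range
    libxyz == 3.6.2
    libxyz >= 3.5, < 3.7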
Does pip allow versioned dependencies? Conda is able to upgrade/downgrade packages to resolve conflicts, whereas pip just seems to check whether a package exists and shrugs when there's a version conflict.
pip does handle versioned dependencies and ranges, and knows enough to upgrade existing packages when needed to satisfy an install or upgrade. Its resolver isn't currently as complete as Conda's - see https://github.com/pypa/pip/issues/988 . (On the other hand, the fact that Conda uses a real constraint solver has apparently been causing trouble at my day job when it gets stuck exploring some area of the solution space and doesn't install your packages... so for both pip and conda you're probably better off not relying too hard on dependency resolution and specifying the versions of as many things as you can.)
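In practice that ends up looking something like this (the package name is just an example):

    # pip understands version specifiers and will upgrade an already-installed
    # package if that's what the requirement demands
    pip install 'requests>=2.20,<3'

    # and the pin-everything advice: once an environment works, capture exact
    # versions and install from that list elsewhere instead of trusting the resolver
    pip freeze > requirements.txt
    pip install -r requirements.txt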
I guess you'd have to see historical margins to make any sort of reasonable projection, but I think GP is saying that they've increased their unit margin on ride-hailing to near zero, so it's dishonest to say they're running it at a massive loss with no route to profitability.
I live in lower Manhattan, and there have been many shootings within a mile of me in the past year, even a few within two blocks, but because the per-capita numbers are so low I'm pretty unconcerned about being shot, so being near the shootings doesn't really bother me. I don't really see how standardizing by (people * area) does you any good in capturing public safety, or even perceived public safety.
I’m not aware of any evidence that that’s actually the motivation behind preventing deflation. Just about every economist will tell you that deflation is undesirable because it incentivizes hoarding accumulated currency rather than investing it in productive capital, since the value of production keeps falling relative to the currency.
In addition to the points mentioned, upkeep is really continued development that must be performed within historically set constraints. I found this short video about the work at Euston Hall interesting—though Grafton inherited the estate, he came from the music industry in Nashville as a relative outsider with an eye for costs and revenue. https://m.youtube.com/watch?v=hypmo53fWfw
Live-in housekeeping is around £50k a year for a couple - not bad for living rent-free on a nice estate - although I used to know someone who provided housekeeping for much less in return for not having to pay for rent and food. So cheaper options are possible.
Gardening is maybe £25-30k/yr, unless the estate is absolutely huge.
Heating will cost a lot in the UK. So will maintenance, especially if there's significant land involved. Roads, fences, walls, drains, sewers, and so on are all more expensive than most people realise, and large estates are usually listed, so there are expensive restrictions and maintenance obligations.
But I'm not quite seeing how that would add up to £500k for a £1.5m estate - which would be fairly small, even with the way prices were back then. I'd have expected something closer to £100k to £200k.
A friend of mine (who lives in a house of a much smaller scale) spends £2000 a year on gravel alone for his drive. Then he either needs to spend weeks spreading it or pay someone to do the same. Once you have a lot of something, things sure get expensive.
On a similar note - say he wanted to replace his doors. He won't just want a £100 one from the hardware shop, so realistically he'll be looking at £600+ for nice handmade bespoke internal doors. Then a further £100 for the door furniture, maybe £100 to get each painted and prepared, and maybe £100 for the fitting. Before you know it you're looking at £1000-ish a door. He had over 25 doors, and this was 'only' a 6-bed house... Yes, you aren't replacing these every year, but it's just one example of the costs.
If anything, the "best and brightest" folks in Hertz management who drafted the policies that both the phone assistant and the person behind the counter had to try to navigate (both of whom acknowledged the policies they were struggling against were dumb) are the ones who failed big time here, not the everymen GP is insulting.
To be clear though, I doubt that someone in Hertz management sat down and devised this scenario deliberately to save a few bucks, hoping that nobody with social media clout would ever run into it. My guess is that it's the result of a web of misaligned incentives and organizational cruft that resulted in a creaky but mostly-working system that nobody ever fixed.
You can use your imagination to fill in the details for fun - maybe 8 years ago some stressed-out shift manager without enough people to work the rental desk (due to a freak corporate-mandated hiring freeze) got a call asking for updated contact information for his location and he hesitated a moment and the caller, a temp on loan for the day, helpfully suggested that they just leave the extension blank for now, and marked the location done. And over the next 8 years the company was making its numbers and there was no need to rock the boat and dump a bunch of money into management consultants to tear everything up and start over fresh, and the VP of airports just had a grandkid and was letting his lieutenants run things on autopilot, and the CEO was happy as a clam with his 2pm golf games and expensed steak dinners and had no idea that anything was wrong, etc etc etc etc. The WSJ exposé following Hertz's surprise Chapter 11 filing practically writes itself.
In what sense was Martin "doing the right thing" by hoarding documents, seemingly for its own sake or to satisfy some tic? I don't think it's fair to the others you listed to lump Martin in with them.
The specifics of his case are irrelevant; they are using any and all opportunities to create an atmosphere of extreme fear.
Assange isn’t even subject to US law and doesn’t/didn’t have a clearance. They’re going after him on whatever grounds they can scare up simply for publishing.
It’s part of a wider plan to shore up against the fact that they are vulnerable, which Snowden most effectively demonstrated.
Can you expand on point 4? Conda-forge releases perfectly fine tensorflow-gpu builds, with the caveat that they don’t ship stubs or the actual NVIDIA driver with them, so it’s not truly standalone; but the same can be said of pytorch or really any GPU-enabled package.
A particular build of TensorFlow X requires version A of the CUDA library, version B of the cuDNN library, etc.
If you work on a data science team or want to play with models you find on GitHub, it's a common situation that some of them require X1, A1, B1 and others require X2, A2, B2.
The CUDA and cuDNN libraries are ordinary userspace libraries, so if you package them for Anaconda you can install them into separate environments, have different versions of the libraries sitting side by side, and never get an error because the library versions don't match. I've done that on both Windows and Linux.
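Roughly what that looks like in practice, assuming you've packaged the CUDA/cuDNN builds into a channel you control (call it my-channel; the package names and version pairings here are purely illustrative):

    # one environment per toolchain; each gets its own copies of the GPU libraries
    conda create -n tf-old -c my-channel python=3.7 cudatoolkit=10.0 cudnn=7.6 tensorflow-gpu=1.15
    conda create -n tf-new -c my-channel python=3.8 cudatoolkit=11.0 cudnn=8.0 tensorflow-gpu=2.4

    # switch stacks per model; neither environment ever sees the other's library versions
    conda activate tf-old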
Anaconda can't ship conda packages like the ones I describe because NVIDIA insists that you download the libraries from their website, register to get senseless spam, screw around with installers, etc.