"The instances are launched as EC2 Dedicated Hosts with a minimum tenancy of 24 hours"
These are just rentable Mac Minis, not VMs. This will have only one use case and that's for build servers. Unless anyone has scalable AppleScript jobs to run?
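For what it's worth, they surface through the standard Dedicated Host APIs rather than anything Mac-specific. A minimal boto3 sketch (region, AZ, and AMI ID are placeholders, not real values):

```python
# Minimal sketch, assuming boto3 credentials are configured; the AZ and the
# AMI ID below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# A Mac instance requires allocating a whole Dedicated Host first --
# this is the piece carrying the 24-hour minimum tenancy.
host = ec2.allocate_hosts(
    AvailabilityZone="us-east-1a",
    InstanceType="mac1.metal",
    Quantity=1,
)
host_id = host["HostIds"][0]

# Then you launch onto that specific host; ImageId is a placeholder for
# whatever macOS AMI you use.
ec2.run_instances(
    ImageId="ami-xxxxxxxx",  # placeholder macOS AMI
    InstanceType="mac1.metal",
    MinCount=1,
    MaxCount=1,
    Placement={"Tenancy": "host", "HostId": host_id},
)
```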
But they will have to update to Big Sur in the future. So if they don't set the limitation now, they will have to "worsen" the service later; it's best to set the limitation now even if it isn't required yet.
They still think of themselves as a hardware firm. This restricts usage elasticity to the point where scaling the hardware, rather than its performance, becomes the dominant cost driver.
Besides the inefficiencies of physical space at AWS sites, it is also a fine business for the cloud provider, as it generates a less volatile usage pattern than normal. And since Apple acts as a monopolist, dictating the same terms to everyone, it removes any competitive advantage a provider could gain by offering anything different.
I vividly remember a video interview with Steve Jobs from a long time back when he was clear that Apple is not a hardware firm, it’s a software firm, and admitting that it took him far too long to realise that.
I think you are missing the point intentionally. You can't run those apps on an Apple iPhone, but I suspect you knew this.
It's ok to take the literal definition of the word monopoly and take it to its logical conclusion, I suppose. The U.S. government will not do that, however.
If you don't like it then leave. That's how Apple operates. I still can't comprehend why people insist on licking Apple's shoe soles but then complain that it tastes awful.
Wow. That is insane and stupid and incredibly fucked up. I've occasionally had to rent Macs in the cloud to test certain latest-version Safari issues, and it's always a crap connection and a miserable experience. But is it even enforceable for them to stipulate a minimum lease length for a VM?! That's your machine. You paid for it. If you want to rent it out to someone for 3 hours, that's your prerogative. I don't see how they can possibly enforce that, or how it could be legal for a seller to tell someone how to use their own hardware.
If you ran a webserver hosting Great Cat Pictures on some Macs, for instance, your customers might pay a subscription fee to access your site, meaning they are paying for you to do processing and storage on their behalf. So that's an example that's clearly not leasing or subletting the devices.
Likewise, you might read it as Circle simply providing a service to users, with the users paying a fee for that service.
But a CI service will set up your dev account and your container image to run your jobs. It'll even let you shell in, and it tears it all down when you're done.
And it really starts to look like leasing when they also charge for time used on various hosts. (IIRC, Circle charges for this a bit obliquely as max parallelism.)
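To make that shape concrete, here's a rough sketch of the lifecycle with invented names (not Circle's actual API or infrastructure):

```python
# Hypothetical sketch of an ephemeral CI runner's lifecycle -- not any real
# provider's API, just the provision/run/teardown shape described above.
import subprocess
import uuid


def run_ci_job(image: str, commands: list[str]) -> int:
    """Provision a throwaway environment, run the job, tear it all down."""
    container = f"ci-{uuid.uuid4().hex[:8]}"  # short-lived, per-job identity
    subprocess.run(["docker", "run", "-d", "--name", container,
                    image, "sleep", "infinity"], check=True)
    try:
        for cmd in commands:
            # Each step runs inside the per-job environment; billing meters
            # the minutes this loop takes, not a lease on a machine.
            result = subprocess.run(["docker", "exec", container,
                                     "sh", "-c", cmd])
            if result.returncode != 0:
                return result.returncode
        return 0
    finally:
        # Teardown is unconditional: nothing of "your" host survives the job.
        subprocess.run(["docker", "rm", "-f", container], check=False)


if __name__ == "__main__":
    run_ci_job("alpine:3", ["echo building...", "echo testing..."])
```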
If it went to court, Circle might argue the hosts can only be used within their larger CI system, that they don't guarantee a particular task will complete on a given host, and that they're not providing other requirements for virtual hosts, e.g. dedicated routing or names. And then Apple's lawyers might counter all that.
So this is where I think lawyers would start digging through case law to figure out where providing a service ends and leasing begins.
It's reasonable to think that running a CI service is similar to just running httpd. But where's the dividing line? A CI service runs Apple software like Xcode, but then even httpd uses Apple's TCP stack.
I've done iOS builds using GitHub Actions. The build minutes cost 10x as much as Linux. Anyone out there offering macOS in the cloud is doing the same thing: just racking Mac Minis. It means you can't slice CPUs or memory like a VM, so it's less efficient and more expensive.
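A quick back-of-the-envelope, with the per-minute rates as I remember them (so treat as approximate):

```python
# GitHub-hosted runner per-minute rates as I recall them (approximate;
# check current pricing): Linux $0.008/min, macOS $0.08/min.
LINUX_PER_MIN = 0.008
MACOS_PER_MIN = 0.08    # the 10x multiplier

build_minutes = 30
print(f"Linux: ${build_minutes * LINUX_PER_MIN:.2f}")   # Linux: $0.24
print(f"macOS: ${build_minutes * MACOS_PER_MIN:.2f}")   # macOS: $2.40
```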
AWS would need to buy a lot more Mac machines from Apple. Without the limit, everybody would start a VM for a few minutes to run a CI job and then stop the machine. So there is no "sharing" of a machine within a 24-hour window if you only need it briefly.
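The arithmetic is stark, with illustrative numbers:

```python
# Rough arithmetic: with a one-tenant-per-24-hours rule, a short CI job
# pins a whole host for a day, however few minutes it actually runs.
job_minutes = 10          # a typical short CI job (illustrative)
jobs_per_day = 144        # demand: 144 ten-minute jobs per day

# With per-minute sharing, a single host could absorb all of that demand:
print(jobs_per_day * job_minutes / (24 * 60))   # 1.0 host

# With one tenant per 24-hour window, every job needs its own host:
print(jobs_per_day)                             # 144 hosts
```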
I doubt Apple supports this in any way. It means that if you want to develop Mac/iPhone apps, you no longer have to buy Mac hardware. I'm surprised they aren't making this harder.
It's going to be out of their hands on this one. Apple have pretty strong EULAs.
Microsoft's licensing for clouds is also a pain in the arse. You (the cloud platform) have to pay for a full month's license the moment an instance is created. The way it's structured you have some wriggle room, e.g. you have your placement algorithm land new instances on machines where Windows instances have already run in a given month, so you don't incur additional licenses (because the license "transfers"). It gets worse if you start wanting to do things like run SQL Server, where it's the entire month's license outright with no prorating, and it applies to a specific instance instead of the machine/VM slot.
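Roughly this heuristic, with all names invented just to show the shape of it:

```python
# Hypothetical sketch of the placement heuristic described above: prefer
# machines that already incurred a Windows license charge this month, so a
# new Windows instance "transfers" onto an already-paid license.
from dataclasses import dataclass


@dataclass
class Machine:
    name: str
    free_slots: int                    # remaining VM capacity
    windows_licensed_this_month: bool  # monthly license already paid?


def place_windows_instance(machines: list[Machine]) -> Machine | None:
    """Prefer hosts whose monthly Windows license is already sunk."""
    candidates = [m for m in machines if m.free_slots > 0]
    licensed = [m for m in candidates if m.windows_licensed_this_month]
    if licensed:
        return licensed[0]       # license "transfers": no new charge
    if candidates:
        chosen = candidates[0]   # otherwise eat a fresh full-month license
        chosen.windows_licensed_this_month = True
        return chosen
    return None                  # no capacity anywhere
```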
Strangely enough, despite all the trends in the market, operating system vendors are determined to make it harder for people to pay them to use their stuff, rather than easier.
Strategically it makes sense. Apple is a hardware company, so making it difficult to run their OS in the cloud means more people need to keep buying their hardware. Microsoft has its own cloud, where it doesn't have to deal with the license requirements it forces on other cloud providers, which gives it an advantage over competing clouds (when it comes to Windows servers). The key is that these companies don't just sell operating systems. Red Hat, on the other hand, which doesn't have competing market pressures, is much friendlier about licensing in a cloud environment.
But is using legal mechanisms to reduce a competitor's advantage (where the competitor licenses your product) considered an antitrust violation? In the Internet Explorer case, one of the major disputes was whether Microsoft manipulated its APIs to favour Explorer over other software.
IANAL, but I think only if you have monopoly power (at least as I understand US antitrust law), and MS doesn't really have a monopoly on the server OS market.
> Microsoft's licensing for clouds is also a pain in the arse. You (the cloud platform) have to pay for a full month's license the moment an instance is created.
This is exactly how I would expect the licensing to work.
Can't you also bring your own licenses for AWS/Windows? This all makes sense to me too... if you want to run a Windows instance, you need a license for Windows (on top of the hardware fee). Microsoft says that AWS can loan you a license (if you need one), but AWS would still need to hold a full license.
In terms of "other people's computers" it seems to fit well. Nice to be able to build/CI test against macOS without having to provision and maintain apple hardware.
You “can”, in a technical sense, but not legally—Apple’s licensing for macOS only allows it to be run on Apple hardware. (You can run a macOS domU on another OS’s hypervisor, but that hypervisor has to be running on a Mac.) This matters when you’re a corporation.
I don’t know about the legalities, but according to Apple you are only allowed to run macOS on real Mac hardware. Anything else is probably an EULA violation or something.
As an aside, the script you link to contains this line
It's against Apple's T&Cs, though. If you're doing it at home by yourself, who cares. But a company that's interested in automated testing etc. isn't going to see it as an answer.
Get an old old old Mac Pro from ages ago. Upgrade its motherboard. Then upgrade its RAM. Then upgrade its hard drive. Then upgrade its power supply. It will still be an "Apple-branded computer" because of its case. It will just have received a bunch of upgrades. Big "haha" to them.
(Or if you're actually a corporation just buy a Mac.)
You can't really test apps if there's no graphics acceleration. If you want graphics acceleration, you'll need to run a macOS VM with GPU passthrough, using VFIO under something like QEMU/KVM or Proxmox. I did that for a bit until I just gave up and bought a Mac.
"These are just rentable Mac Minis, not VMs. This will have only one use case and that's for build servers."
It's difficult for me to believe that a hyperscale cloud operation like AWS would rent individual Mac Minis to people. It sounds like a business idea I would have, and then come to realize how labor-intensive and inefficient the entire thing was.
At the same time, I wonder why there isn't a well-developed, well-documented cross-compiling toolchain available for (whatever you are doing with a Mac build server)? Why not use your local (laptop) Mac to do the dirty work, and then run a (very complicated) cross-compiling chain on a much cheaper, non-Mac cloud instance?
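The invocation itself isn't the hard part; a rough sketch, where the SDK path and target triple are assumptions (and the SDK's license terms are the real catch):

```python
# Sketch of cross-compiling a C file for macOS from a Linux host, assuming
# you have clang plus a macOS SDK on disk (Apple's SDK licensing is the
# legal catch) and a Mach-O-capable linker -- the "(very complicated)" part.
import subprocess

MACOS_SDK = "/opt/MacOSX11.sdk"  # assumed SDK location

subprocess.run(
    [
        "clang",
        "-target", "arm64-apple-macos11",  # Apple Silicon macOS target triple
        "-isysroot", MACOS_SDK,            # macOS headers and libraries
        "-fuse-ld=lld",                    # lld can emit Mach-O binaries
        "-o", "hello-macos",
        "hello.c",
    ],
    check=True,
)
```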
In the past, when we needed iOS jobs to run from Jenkins, we literally plugged a Mac Mini into the wall and forwarded a port from the router so it was reachable as a build agent by our cloud instance of Jenkins.
What else is worth doing on macOS at scale in an off-premises cloud? I'm genuinely curious if there isn't something I'm missing out on. Build-and-test for Apple-related stuff is the only thing that comes to mind.
You can get more than 10 times as many ARM cores on a server chip compared to an M1 Mac. There is no way it is more energy-efficient or cost-effective to run 10+ Macs versus one server. Amazon designs their own chips, too.
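Rough numbers, from memory:

```python
# Core-count arithmetic (figures from memory, so treat as approximate).
M1_CORES = 8            # 4 performance + 4 efficiency cores
GRAVITON2_CORES = 64    # Amazon's own ARM server chip
ALTRA_MAX_CORES = 128   # Ampere's top ARM server part

print(GRAVITON2_CORES / M1_CORES)   # 8.0
print(ALTRA_MAX_CORES / M1_CORES)   # 16.0
```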
But until now, nobody was able to build an ARM CPU as fast as Apple's CPU. Every server ARM CPU in the past was still a lot slower than Intel CPUs, despite packing a lot of cores on a chip.