There are tools, though, through which we can mitigate verticals, and we have used them in the past:
- regulated markets (e.g. limiting service bundling such as free email financed by targeted ads, the tying of devices to social logins, and predatory social-network effects; mandating availability of profile data for ad targeting to third parties; stopping ad-platform Ponzi schemes, e.g. those without objective third-party efficacy measurement; creating strong consumer laws against entertainment devices that want to track you and serve you ads; rethinking 5G networking)
- mandate standard formats in which your data/text providers must, at your request, export your data so you can move to another provider
- stop hw/sw bundling and/or tax closed hw/sw platforms differently from general-purpose computers (as has been done with the PlayStation)
- net neutrality
- return to public sponsorship of the development of standards in information processing, and demand alignment with such standards at least in public tenders, or beyond (as in the construction industry)
- mandate that e-commerce transactions be represented through standard, signed order and billing manifests (rather than allowing proprietary ad-hoc/pseudo-currencies, which would also be problematic for tax authorities), or create/mandate digital currencies
The challenges we're facing as a civilization due to digitalisation aren't entirely new; they have been addressed before, e.g. in the telco and financial industries. It's only in the last decade that we're being brainwashed into becoming slaves to "the Cloud".
There should also be a ban on IP holding companies - that is, the scheme where a big company creates a parent company in a tax haven that holds the IP, and the subsidiaries then pay it "license fees" to remove as much taxable revenue as possible, gaining a competitive advantage over companies that pay the right amount of taxes and don't use any "tricks".
Collection of personal data should also be restricted to what serves the purpose of the product and to legal requirements. For example, a company providing an email service should be forbidden from selling analytics data to advertising companies or using it to fuel its own ad services.
Companies should also be required to release schematics of their products and documentation for any file formats etc. they use, so that when you save a project from a given program you are able to convert it to another format.
Companies should also be prohibited from using AI for customer service.
This is a fair comment, but it's a bit out of context. The article is addressing the increasing power that software (tech) companies have in the manufacture and control of hardware.
But why is this a problem? In the past it was Intel, and occasionally AMD or one of the older x86 vendors that fell by the wayside. PowerPC was there for a while too, I suppose.
In many ways it's actually more diverse now than it's ever been; it's just that some of these players are not making their innovations generally available, but rather keeping them in-house.
I feel the barrier has been high for quite some time. Intel was/is a monster that is occasionally caught napping. Maybe they'll wake up again and pull something big out of the hat - turn into an IaaS company as well, maybe.
To be honest, all of this seems like a reasonably decent competitive market compared to the old days; we seem to have collectively forgotten the decades of Wintel.
The problem is that when your M1 CPU goes bad, you cannot buy the chip from Apple to replace it (the tools to do that are fairly inexpensive now and within hobbyist reach). Something like this should be regulated - companies should be required to provide spare parts for x period of time after a product is phased out, and forbidden from telling suppliers not to sell spare parts to third parties.
We need regulation for the right to repair.
Sure, but why do consumers need the government to tell them this is a valuable aspect of the technology they've purchased? If they don't value it, they don't value it; having the government mandate and tell you what's good for you helps no one in the long run.
The big companies have no real problem complying with the regulations, it's quite clear that consumers don't actually give a damn. All you've now succeeded in doing is making sure any startup has a tonne of bureaucratic nonsense to wade through to get it right.
What you need to do is identify where a transaction has not been made between two parties for fair recompense. One potential area is to say that without adequate repair cover you are now forced to dispose of your electronics and pay for their safe disposal, so that they do not pollute the environment for everyone else. Then a consumer might see the need for repair rights and make an informed value decision.
Assuming that top-down legislation will not have some unforeseen interaction that drives terrible outcomes is the fallacy of big government; it just consistently fails. I've seen it so many times over my lifetime that it's farcical.
As an anecdotal example: circa 2011 the Australian government brought in legislation to make portable A/C units more efficient. Great, right? Wrong. The government, in its infinite ineptitude, managed to word it so that it only affected dual-vent portable A/C units and left single-vent units unaffected. Instead of making dual vents more energy efficient, the companies pulled all dual vents from the market, and now you can only get single vents, which are woefully inefficient. This is not the companies' fault; they did exactly what you would expect them to do. The government fixed the legislation 11 months ago, but unfortunately we still cannot get dual-vent A/C at all in Australia.
Are you saying that being able to repair something independently has no value for a consumer?
If the free market cannot solve the problem, then regulation is needed. You can see that the market is moving towards non-repairable, single-use devices, or devices that can only be repaired at an extortionate price by authorised repair shops.
If there is no regulation, all companies will move in that direction chasing that extra profit, and consumers will not be able to vote with their wallets if everyone does it.
Your example is comparing apples to oranges. I am not saying that regulation should influence product design, but rather that producers should give access to spare parts and schematics.
I agree that regulation like in the A/C example isn't great.
He's saying that when consumers routinely buy products that need to be serviced by first parties, over devices that can be opened and repaired by the end user, the market is voicing its opinion on whether repairability has value to them.
I personally prefer the latter, but I also don't think that people loving MacBooks or integrated ARM SoCs is going to change whether I can get a ThinkPad where I can upgrade the RAM with a Phillips head.
Sure, it might cost a little more further down the line, maybe Intel won't be the behemoth it's been for the last 25 years, but if most people are willing to pay for their black box to get repaired, why does that mean the government needs to step in?
It's been happening in the car market too: even entry-level sedans are becoming black boxes more and more, but people snatch 'em up.
I should also say I'm very much in favor of the "right to repair", but more in the vein of "you can't legally forbid me to repair my John Deere tractor" than in the vein of "Apple shall hand over detailed schematics and replacement parts to any repair shop that demands them".
I think you're misconstruing his point by bundling it with the overused interrogative of "are you saying..."; he is saying it has value if the people so decide. A government regulation forcing people to do something they already aren't doing leads to the sort of didactic nonsense like running government-sponsored commercials every 20 minutes telling you to wash your hands and use hot water to combat covid.
> why do consumers need the government to tell them this is a valuable aspect of the technology they've purchased? If they don't value it, they don't value it,
I would argue that a lot of consumers _do_ value it, but they don't know enough to ask about that up front.
Most people want to buy a thing, then use that thing, and when it breaks they reasonably assume that they should be able to get it fixed if they'd like. Cars work this way, appliances work this way, etc.
I think the reason why people don't ask for this up front is that
(1) most people would assume they have the right to repair (and thus don't think to ask) and/or
(2) life is short, and we shouldn't need an advanced degree specializing in a particular category of products in order to know all the ins and outs of buying one :)
> Sure, but why do consumers need the government to tell them this is a valuable aspect of the technology they've purchased? If they don't value it, they don't value it; having the government mandate and tell you what's good for you helps no one in the long run.
I don't think it's that simple at all. There are lots of small reasons people right now act the way they do, which in the aggregate makes it look as though they don't value repairability. That doesn't mean that they wouldn't prefer a world where repair was easy and cheap. Nor does it mean that a world where repairs are easy and cheap isn't preferable to one where they aren't, from a bigger perspective (environmental, for one).
> Assuming that top-down legislation will not have some unforeseen interaction that drives terrible outcomes is the fallacy of big government; it just consistently fails. I've seen it so many times over my lifetime that it's farcical.
Hmmm... Ok... I think I get it now! Regulations are always bad and never good. Wow, it's so simple. Thank you for this valuable and insightful information.
Because eventually all other values besides profit for 5 companies will be sacrificed if you go down that road.
Why do consumers need the government to enforce child labor laws? FDA regulation? OSHA? These lessons are written in blood.
Yes, you might need to work on the regulation in order to get it just right but "who cares, whatever" is not acceptable unless we want to live in an effectively-communist world with a few vertically-integrated corporations setting all the rules.
It would be great if every conversation about regulation on HN wasn't dominated by posts re-litigating if the concept of regulations should exist in the first place.
Interestingly enough, these contexts form (imo) a single example set.
Chip-OS dyads like Wintel. Platform duopolies & oligopolies. Neutrality & commodification vs lock-in dynamics. Public or free standards as an alternative to platforms. Etc.
A lot of these dynamics have been recognisably repeating in the computing industry (also telecom & media) for 4 decades.
OTOH, scale is a quality unto itself, and as this article suggests... Vertically integrated computing is very much a megacorp endeavour. I think the difference here is the scale of these companies.
The thing is that governments are generally influenced more by the commercial interests of large corporations and capital-owners, than by popular interests. There needs to be a very strong "pull", with a lot of organizing and pressure on politicians and agencies, to counteract corporate influence - and it's not enough that the reality is obvious.
Also, in most of the world's states, large corporations are able to threaten to leave - stopping activity in the country - which can be quite a hit to the local economy, especially for smaller states facing larger corporations.
Another point which hinders your suggestions is the imbalance between world powers, especially the US (and to some extent the EU and China) - so that if one or several of those are beholden to corporate interests, then the others, or world-wide bodies, will find it very difficult to mandate things, apply taxation etc.
The elimination of corporations as special legal entities would make governments more directly controlled by people rather than wealthy megacorps. It is an unpopular line of thought because of the perceived economic benefits of megacorporations, but I'm not so sure they really are a good use of the talent that gets sucked up in them. Maybe we'd have a more innovative society if our culture took back personal responsibility.
> elimination of corporations as special legal entities would make governments more directly controlled by people rather than wealthy megacorps
Unlikely. Prior to incorporation, families were the operating unit of commercial and thus military and political activity. This pattern would resurrect itself: families, like corporations, can be immortal. That means they can plan and be trusted for the long term.
Note, too, that incorporation gives rise to charities, unions and independent lobbying groups like the Sierra Club. You lose those when you wipe out legal personhood.
Can you elaborate? Corporations are legal entities that aren't an individual or merely a group of individuals. Are you suggesting that corporations will be legally treated as the group of the shareholders? Something else? I'm not following.
Not sure I understand what you're saying in the last paragraph, but critique of lobbying definitely hits closer to home recently, in the EU at least; e.g. just yesterday, the French magazine Le Point published a leaked Google memo detailing their strategy against new EU legislation [1]; paywalled, unfortunately, but [2] (in German) has some more details.
After all, politicians in democracies should be responsible to their electorate, shouldn't they?
> politicians in democracies should be responsible to their electorate, shouldn't they?
Unfortunately, I don't think we can say that they should.
I mean, in democratic ideology they should, but democratic state institutions, legal and organizational arrangements are not fashioned to achieve this outcome; and most democratic states were not, after all, created by the mass of people setting up those arrangements. Rather, elite groups and powerful interests set such states up, with democratic elections being a mechanism for legitimizing the new regime among the masses, or a concession to public pressure.
> Not sure I understand what you're saying in the last paragraph
If a big American corporation is transferring lots of money from Europe to America while engaging in some sketchy behaviour, Europe might be more enthusiastic about clamping down on that than America. For both financial reasons and due to different political priorities.
It's not like Europe could unilaterally reverse Google's purchase of Youtube or Facebook's purchase of WhatsApp.
Bingo. We need standard communication protocols where identity is separated from what is shared, so that the platform is only a UI skin and users can easily swap - i.e., an SMS for social media. Enforcement is an issue, but only because of the market power of these large firms.
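To make that concrete, here's a rough sketch (my own strawman, not any existing standard) of what an identity-separated message could look like: the author *is* a public key, the payload is signed by that key, and any platform is merely a client that verifies and renders such envelopes.

    /* Hypothetical identity-separated message envelope - a sketch only. */
    #include <stdio.h>
    #include <string.h>

    struct envelope {
        unsigned char author_pubkey[32]; /* identity: e.g. an Ed25519 public key */
        unsigned char signature[64];     /* author's signature over timestamp+body */
        long long     timestamp;         /* Unix time, part of the signed payload */
        char          body[280];         /* the content actually being shared */
    };

    int main(void) {
        struct envelope e = {0};
        e.timestamp = 1606780800;
        strcpy(e.body, "Hello from whichever UI skin you prefer.");
        /* Real signing/verification (e.g. Ed25519 via a crypto library) elided. */
        printf("envelope: %zu bytes, body: %s\n", sizeof e, e.body);
        return 0;
    }

The point being that the identity and the content live in the format, not in any one company's database.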
The new challenge is that both parties, thanks to social media/news media reward mechanisms, can't agree to anything without losing the like counts that condition their behaviour.
Until underlying reward mechanisms change political will won't have a chance to build.
As an example of this, I do wonder if the PC industry realises the bind they're now in with Apple and the M1.
We don’t know the unit cost of those parts, but we know from past statements they are a very small proportion of what Intel is charging OEMs for their processors.
Given that the perception has always been that MacBooks are the best built laptops available, but priced at a substantial premium, it’s been fine for a Dell or HP to coast by on lower prices and thinner margins (as long as Apple wanted 30% or so in profit on each unit sold it was easy to compete on price and still make a living).
With the M1, Apple now has the option of competing directly on price while preserving their margin, with a substantially faster machine, that still has the other premium elements (e.g. the aluminium body, the massive trackpad).
I don’t know how any one of their competitors can compete with a similarly-priced device that is faster, better built, and commands much higher residual value down the road.
Maybe I’m being naive but I’d be very worried right now if I were Dell/Lenovo/HP, especially given that Intel doesn’t have a working 5nm process.
It’s even worse for companies like Hynix, given the performance gains from the on-package memory architecture in the M1. AMD are going to have to go down that route now to compete, and that doesn’t leave much room for direct-to-consumer DRAM vendors (although given the questionable histories in that market it may not be the worst thing for the consumer).
My impression of Apple is that they have no interest in competing on price. They are a premium brand. In the same way, Louis Vuitton could sell cheaper handbags and compete on price, because they have bought up & optimized the costs of making leather - but they're not going to.
Apple lowered the entry price of an iOS phone to $400 this year. There’s more profit to be made via monthly payments for services, so getting more iOS and macOS products into people’s hands is in their interest.
I’m not saying they will sell third rate quality products like $400 laptops or $200 phones and tablets, but they are interested in lower prices and making up for it by getting people to purchase more from the whole ecosystem.
Apple has lowered the prices on all of their major product lines: iPhone SE ($400), iPad ($329), Apple Watch ($279).
I can't see anything in the way of Apple creating a lower priced MacBook SE at around, say, $700. With that move, Apple could essentially own the technology stack (iPhone, iPad, Mac, accessories, etc...) of millions of consumers previously priced out of the Apple ecosystem, consequently driving services revenue.
It would be interesting to know how companies like Apple account for the market of older models and how that feeds into their services revenue. For example if you can go on Woot and purchase an iPhone 8 for $399, while Apple didn't make money off of that sale, they will certainly make money from app purchases and iCloud subscriptions. And at $399 there's bound to be people willing to purchase an iPhone even if it's older. With Apple's iOS support reaching back several generations, it's not always a bad deal.
So in a way, even though Apple isn't making money directly off the sale of that old iPhone, they're still able to feed this lower-end market and make money from apps and services, without directly lowering their prices and maintaining their perceived high-end status.
Edit: I gave an iPhone example but it certainly extends to the Mac lineup as well.
I made up the numbers as an example. It doesn't really matter, my point is that Apple's old devices, be it iPhone or Mac, still have value years later. So even when people are fishing the old stuff from the bargain bin sites, Apple is still making money off of them. As a result, Apple doesn't need to compete with the $400 Dell because people will still be buying the 5-year-old MacBook Air from the bargain site and with it, buying app/iCloud subscriptions.
Yeah, this. Apple already has fairly successful and well-established price levels; they are not going to drop them anytime soon. If these chips really are significantly cheaper to make, that just means they will make more money per chip, at least until setup costs are amortized. Cook is a supply-chain guy; he knows the score: the final price is what the market will bear, not some fixed markup over costs. You win by squeezing costs down and pushing prices up at the same time, any time you can.
I'd say 17% of the computer market is sizable but still small and segmented enough to be premium. Macs don't factor into the purchasing decisions of the vast majority of ordinary people.
I'm trying to communicate that they'd rather make more per unit sold, and lowering costs while maintaining quality is good for making more money. Maybe "upscale" would be better than "premium".
List price for the kinds of Intel chips Apple used/uses is somewhere along the lines of $200, though I can't imagine Apple ever paid that much. Some quick googling says that Intel maintains a gross margin of 50% or more, so manufacturing cost is $100 or less, and that doesn't include R&D work. This is all just napkin math, but I don't see how the M1 would allow Apple to shave more than $50 or so off their prices, which still gets them nowhere near Dell/Lenovo/HP territory. Apple laptops have substantially better screens than your average "I just want to use the internet" machine, better sound, mic and camera, better hard drive and an aluminium body, all of which has to be paid for somehow.
The current M1 Macs are already better value than the Dell etc. options. The $1000 Air has a much better CPU, a better GPU, and much better battery life than the corresponding XPS at the same price. The only demerits are the shitty webcam and largish bezels for 2020.
Unless you think Apple is loss leading on the M1 devices, which I just can't see them doing.
After looking at Apple's bandwidth (68 GB/s or so) and latency (random-access latency 32 ns or so, if TLB-friendly), I think it's time to consider ditching DIMMs. Half the latency, higher bandwidth, and lower costs sound good to me. I just hope that future motherboards with LPDDR4X-4266 offer 32 GB options instead of today's rather low 16 GB limit on the M1.
Most of DRAM latency is actually not in the DRAM itself, but in the CPU and its memory controller.
DRAM has always been at around 10-20 ns of latency. PCB traces themselves add a maximum of 6 ns round trip.
DDR PHYs on both sides do huge amounts of analog and digital signal conditioning. The refresh system adds random latency if you hit a refresh cycle. The prefetch-burst mechanism also limits your effective random-access times.
This is why HBM may well beat regular DDR on real-world latencies.
Interesting, thanks for the detailed info. I've done a fair amount of benchmarking across different CPU architectures and never seen a latency lower than 60 ns or so. My jaw dropped when I saw the M1 chasing pointers through a large random array (much bigger than cache) at 30 ns or so. That's a pretty amazing latency for an L1 miss, an L2 miss, and retrieving a cache line from main memory.
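For anyone who wants to reproduce that kind of measurement, here's a minimal sketch of the classic pointer-chasing approach (throwaway code; numbers will vary by machine and OS): build one big random cycle much larger than any cache, then time a serial chase through it. Every load depends on the previous one, so ns/step approximates main-memory latency.

    /* Compile with: cc -O2 chase.c -o chase (POSIX, for clock_gettime) */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N     (1UL << 25)   /* 32M entries * 8 B = 256 MB, far past any cache */
    #define STEPS 50000000L

    int main(void) {
        size_t *next = malloc(N * sizeof *next);
        size_t *idx  = malloc(N * sizeof *idx);
        if (!next || !idx) return 1;

        /* Fisher-Yates shuffle, then link the shuffled order into a single
           cycle so the hardware prefetcher can't guess the next address. */
        for (size_t i = 0; i < N; i++) idx[i] = i;
        srand(1);
        for (size_t i = N - 1; i > 0; i--) {
            size_t j = (size_t)rand() % (i + 1);
            size_t t = idx[i]; idx[i] = idx[j]; idx[j] = t;
        }
        for (size_t i = 0; i < N; i++)
            next[idx[i]] = idx[(i + 1) % N];
        free(idx);

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        size_t p = 0;
        for (long s = 0; s < STEPS; s++)
            p = next[p];                 /* serially dependent loads */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
        printf("%.1f ns per load (sink: %zu)\n", ns / STEPS, p);
        return 0;
    }

Printing p at the end keeps the compiler from optimizing the chase away.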
I don't know that there's an architectural limit that makes more than 16 GB infeasible. I suspect we'll see 32 GB in the new iMacs, and presumably a lot more than that in the next Mac Pro (which I'm really excited about, in terms of how they scale the architecture).
>As an example of this, I do wonder if the PC industry realises the bind they're now in with Apple and the M1.
Apple has had their own chips before and the PC industry is fine. These sorts of things are always cyclical. It's easier for Apple to do the M1 when it's a shiny new thing than to keep innovating for a decade. Eventually someone will come up with some innovation they can't match, and it will swing back towards integrating commodity parts.
All you have to do is look at the success of the A-series chips for the iPhone/iPad and how they've progressed over the past 10 years. While year-over-year they don't necessarily add something big and shiny (a bump in specs, occasional co-processor), if you compare an A14 to an A9 from 5 years ago the differences are huge.
It doesn't take a "shiny new thing" every year to keep the M series chips looking better than Intel. Perhaps you're right and in another 10-15 years it'll come back to commodity parts, but it would take something Very Shiny to push that at this point. And afaict Intel is in no position to deliver anything shiny enough to do that.
DRAM has already been commoditized for decades: highly competitive, pushing prices and margins low, with occasional crises that allow some manufacturers to collect a windfall.
Apple has no interest in being in that market - unlike the CPU market, which is a high-margin, high-differentiation business. It's really hard to sell your design as outstanding when the guy next door sells a laptop with the same CPU at half the price.
I'm reminded of how retailers, once they cross a certain threshold of popularity and distribution penetration, begin rolling out their own branded products. Amazon Basics is an example of that, but it's true of almost any medium/large retailer. I often wondered when that would happen with Apple/Google/Amazon. It does seem inevitable, in hindsight.
Beyond a certain scale, middlemen's costs become non-trivial and hard to ignore. In the case of Apple, though, it is about more than costs alone, because they want end-to-end control of the UX. By the looks of it, the processor had become a limiting factor in achieving that control.
Makes me wonder: the processor is the lowest layer in the stack - what about the upper layers? Apple has already rolled out their own credit card. Maybe their own cellular network? A new WiFi standard, perhaps?
> I'm reminded of how retailers, once they cross a certain threshold of popularity and distribution penetration, begin rolling out their own branded products. Amazon Basics is an example of that, but it's true of almost any medium/large retailer.
Conversely, if brands become popular they open their own retail stores. The point is to not give your profits away to others if your business is the one bringing in the customers and margins, not them.
Ruminating about the possibility that mainstream general computing is now coming to an end. I guess as long as we have Unix, we will still have it.
But it looks like for the average consumer, they wouldn't really care as long as their needs are met, even if those machines are becoming more and more like appliances.
I can tell you, as someone who sells SaaS, that I prefer rent because it allows me to serve my customers more effectively, meaning they get more value and I get more money.
This isn't glib doublespeak. There's probably a better way to articulate this for people who were formally educated in business (not me). Basically, customer lifetime value grows with a monthly fee, meaning you can always add upgrades and provide support. But if you just sell it one-and-done, then depending on the price it can be difficult to find the motivation to provide any meaningful long-term support.
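One standard way to put numbers on that - the textbook approximation, nothing specific to my business: with monthly revenue m per customer and a monthly churn rate c, the expected lifetime value is

$$\mathrm{LTV} \approx \frac{m}{c}$$

so a $10/month subscriber at 2% monthly churn is worth roughly $500 over their expected lifetime, and every upgrade that lowers churn raises that number - whereas a one-and-done sale caps it at the sticker price.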
As a customer, I don't have the choice to not pay for such upgrades but continue using the old one.
And yet, the SaaS subscription cost is charged continuously whether the customer likes the upgrades or not.
So no, I don't agree with the SaaS business model. It's more extractive. The point of buying a piece of software is the same as buying capital equipment - purchase once, and have it work "forever" (and since software doesn't rot like real equipment, this should be even more true).
What I would pay a subscription for is live/in-person support.
As a customer, I don't have the choice to not pay for such upgrades but continue using the old one.
It's the same with hardware. At one company we used to operate a 3-year hardware refresh cycle and this used to make a lot of sense, 3 years was a long time in hardware, and this assumption was true for a long time. But the pace slowed as manufacturers stopped making big improvements and just started eking out marginal gains. We looked at our kit and our workloads and realised that we had plenty of capacity and instead of refreshing proactively we should stretch it out to a 5 year or longer cycle and replace failed equipment with new rather than disposing of things that still worked perfectly well after 3 years. Saved a ton of money doing this, both in buying hardware and in the effort taken to move applications around. Of course manufacturers got wise to this and looked for ways to get you back on that treadmill.
Yes, but I think that misses the original point of OP's argument, which is that the variable cost nature of SaaS is beneficial to customers that don't want to or can't think about 3 vs 5 year lifecycles.
And while you may still have a perfectly valid use case for hardware, I think the growth of IaaS speaks for itself in showing that variable pricing is popular.
>and since software doesn't rot like real equipment, this should be even more true
I disagree with that to a degree. Sure, the bits that make up your software don't "rot", but the compute environment in which it operates is always changing, and that will eventually lead to it no longer working.
Try taking any PC, whether it's Windows 10 running IE or Ubuntu running Firefox, and just stop updating it. How many years will you feel comfortable using it? _Eventually_ you're going to need security updates, to interface with new protocols, to support/function on replacement hardware when the old hardware dies, etc.
I'll acknowledge that in reality, there are internet routers, industrial machines, and the like that can probably run for decades on the same software revision given enough babysitting and like-replacement of the hardware. But even the security software around those, like firewalls, is going to need to be refreshed and upgraded, which costs money. In the grand scheme of things, somewhere in the stack something is constantly being upgraded and refreshed, and SaaS fits perfectly into that model.
> I'll acknowledge that in reality, there are internet routers, industrial machines, and the like that can probably run for decades on the same software revision given enough babysitting and like-replacement of the hardware.
Don't forget every medical lab machine ever made.... some of that stuff is still running windows 95
...but you can find the motivation to offer a product that just works (mostly) perfectly and doesn't need support! Rarer updates, because you just need to test your things thoroughly and ensure that 99.9% of users will never need any "support". And ship things that are "done" - they don't need upgrades.
If I find I actually need any kind of support with any service/product and I can't "just use it" then I start looking for an alternative product/service.
There's something rotten at the core of everything-as-a-service and continuous improvement and growth hacking and everything that comes with it... Most people actually like things that are done and stay done; most stuff doesn't need to be fixed, and it doesn't need to evolve or improve beyond the glacial pace needed to keep working on the infrastructure it uses...
You need to make all your money up front with one time sales, whereas you can spread it out with a subscription. The former is much harder to make a living off of, which is why you see the proliferation of SaaS these days.
SaaS is good because it gives you a constant income stream, which matches the constant development of your product.
For the user however, it means lock-in and their data being held ransom.
A good SaaS manages to remedy the downsides to enable the upsides to shine, i.e. data export, open formats, open core, open source + hosting business, ... and hopefully more diverse and better ideas to cover more markets, use and business cases.
It definitely is beneficial to be a service provider. Rent, in general, is beneficial to the rent seekers for the reasons you outline - steady, passive income[0]. It's really awesome - that's why every business in tech seems to be adopting this model.
It also seems to work for business customers, who have many tools available to handle logistics of subscription, a more equal position in the relationship, and more ways to deal with cashflow problems.
It's much less beneficial for individuals, though. Some problems:
- Even if the monthly subscription is 10% of its would-be box price, I'm likely going to pay more than I would if I could buy it (see the quick arithmetic after this list).
- By subscribing, I gain a new relationship to manage. Managing relationships is cognitively taxing; there are only so many an individual can handle[1]. I don't want that relationship with you, I want to use a software product, no strings attached. Much like when I buy bread, I don't enter into relationships with the bakery, the mall, the flour plant or the wheat farmer[2].
- As part of being in a relationship with you, I'm bound by your ToS, which limits what I can do with the product. That's not a noticeable problem when dealing with web applications, as the limits are often enforced technically - but as more physical, household products become wrapped in services, this is going to become a bigger problem. Right now, the manufacturer doesn't care if I use my dishwasher for steam-cooking[3]; in a service model, this would be disallowed by the ToS.
- The modern means of enforcing the core of our relationship - me paying you for access - usually revolve around the software being hosted on your servers. This limits the long-term utility of the product over a version I could keep on my computer, and poke inside if needed.
- If our relationship ends for any reason, I lose the goods. It may be because of me violating the ToS, but it also may be because your company decides to shut down the service, or gets acquired, or goes bust. It's particularly annoying with physical products, which then get taken away or become expensive paperweights. It's a risk I have to keep in mind as part of managing our relationship (and which realistically I won't, and then wake up one day with few days or weeks for figuring out how to replace your service with something else).
- It may seem that incentives of SaaS are better aligned with their customers - I pay, you deliver improvements. But another way of looking at it is, you release an incomplete product, while I pay in hope it'll get improved. Much like "one-and-done" sales have their perverse incentives, so do services (worst case that sometimes happen: making switching costs high on purpose, to ensure I keep paying).
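To spell out the arithmetic behind the first point above (same assumption: the fee is 10% of the would-be box price P per month), the subscription overtakes the box price after

$$n = \frac{P}{0.1P} = 10 \text{ months},$$

and over a five-year useful life it adds up to $60 \times 0.1P = 6P$ - six times what buying outright would have cost.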
Personally, I'm particularly tired of the relationship angle. I want to exchange stuff for artifacts that give me powers - not bind myself to an ever-growing number of ephemeral business entities.
--
[0] - Not 100% passive because you have to keep the infra running and offer some kind of support, but passive enough that quite a lot of companies seem focused primarily on growing their userbase, not catering for the existing users.
[1] - Rich people and businesses handle this by paying someone to deal with it.
[2] - Actually, I do enter into a relationship with the bakery, but it's entirely handled by consumer protection laws, so the only thing I need to manage about it is knowing what government branch to e-mail with the receipt. And that management cost is shared across all my purchases.
[3] - Worst that can happen is I lose warranty over some more peculiar use cases or tweaks.
> They're even worse than old appliances. You'd usually own those appliances, be able to repair them, patch them up and use them for a long, long time.
Modern appliances aren't really like this. The control panel will start registering false presses or no longer accept keypresses after a few years (especially if it's a heating appliance and the control panel is not engineered to be in a hot environment), and replacement control panels will be discontinued and unavailable.
Isn't that exactly what the average person has wanted out of a computer for a long time though?
Most people never change anything in their system, and know little more than exactly what they need to do the specific tasks they want to do (writing documents, photo storage, etc.). An appliance-like system where "it just works" and you never need to understand the structure or fiddle with the computer in any way is probably exactly what a lot of people want.
I think that's a bit of an illusion. Yes, the average person wants tech to "just work", but what that usually means is "just work the way I need it to work for my specific purposes in my specific circumstances".
It's true that the average person doesn't want to spend time tinkering with tech and take the risk of breaking things. But that's not the same thing as being happy with how things work, and it's not the same thing as using tech as productively as possible.
When I write software or talk to people about how they use software, they always want it to work somewhat differently. They do want customisation for entirely sensible and very specific reasons. They just don't want to do it themselves. That's why consulting and support is a huge global industry.
The problem with "just works" is that it only has meaning where the requirements are specified in great detail with very little variability. That's always trivially the case for single purpose household appliances. It's also the case for things like making phone calls or for device drivers. No one wants to "customise" a device driver to stop it from draining the battery.
But computers are not single purpose devices. They are used for a million different things in a million different situations. "Just works" is largely undefined outside of core OS functionality. And that's why the appliance metaphor just doesn't work.
But that's not the same thing as being happy with how things work, and it's not the same thing as using tech as productively as possible.
If productivity of end-users was the goal then we wouldn't see random changes to interfaces that people had learnt, for no other reason than some graphic designer thinks it doesn't look "fresh".
The truth is, for most of what most people do, computing was solved around 2010-ish, and all anyone wants now is for things to be more reliable, faster and cheaper. But that's much harder work for companies than a new flat interface style or some animated emojis.
That, and their goal isn't actually to make a better product. A better product can be a means to an end, but the real goal is to increase the rate of growth, and more often than not that flat interface style convinces some manager or consumer that they need the latest one.
Some of the reason for interface changes is that otherwise people think that nothing changed, and it's easier to make a cosmetic change to UI than make users read release notes.
What you want isn't always what you need. Things just working is nice, but I'm not sure how aware people are of the value of actually owning things and being able to move to a competitor without losing everything. As these monopolists tighten their grip on these devices, you see them limiting functionality and use cases AFTER you've purchased them, just because they can.
I think this is exactly what the average computer user wants. Also, even though I like to tweak Linux to my liking, I really appreciate how macOS just works out of the box, where I don't have to configure anything (because nothing is configurable).
Windows, on the other hand, is the worst of both worlds - nothing works from the start, and it's not easily configurable. (Before you bring out your pitchforks: I use Windows on a daily basis.)
Can you detail exactly what "nothing works" on Windows for you?
I know it's still a fashionable meme on HN to hate Windows and Microsoft for telemetry and Candy Crush, but saying nothing works is a bit of an exaggeration at this point.
Using both Windows 10 and Linux at home and at work, I can say that while Linux is the de facto OS for the server/cloud/embedded space, Windows 10 is the rock-solid go-to for the laptop/gaming station - especially on laptops with Nvidia cards, where most Linux distros and desktop environments shit the bed, especially with Thunderbolt docks, multiple displays and various scaling. Authentication via fingerprint readers or IR cameras on Linux? Good luck with that.
Granted, it's not Linux's fault that driver support from hardware vendors is absolute dog shit, but as a user/employee/consumer I'm gonna use whatever just-works™ to get my work/entertainment needs fulfilled with minimal friction. At this point, outside the Apple ecosystem, Windows 10 has the best driver support from hardware vendors by a long shot, with Linux sadly still relying on drivers reverse-engineered by the community, or on binary blobs which are ignored by the FLOSS community. That turns Linux into a dumpster fire on the more "fancy" laptops that use anything beyond an Intel chip with integrated graphics - machines that end up as nothing more than ssh terminals.
The Mail app is unintuitive and has advertisements in it. Paint is barely usable - even after 20 years you still cannot resize an image properly. The screenshot app is much less easy to use than the Mac's. The default browser is crap. I could go on and on about what's wrong with Windows.
>The screenshot app is much less easy to use than the Mac's.
It's just clicking the snip button in the action center by the clock, or pressing WinKey + Shift + S and then drawing a rectangle over the area you want captured. How much simpler is screenshotting on the Mac?
You not liking some of the default apps that come with the OS (which is understandable) does not make an OS bad.
I also hate some of the apps that come with most linux distros but that doesn't make linux terrible.
I don't care if you use Windows on a daily basis - you're just flat out wrong.
Windows isn't easily configurable? By what metric? There's a dozen ways to enter the same settings and most of the time you can just search for what it is about the operating system you want to change - in the operating system itself, with the OS's own search bar.
And "nothing works"? What the fuck are you even doing on Windows 10, where you and apparently only you, manage to break all functionality after the initial install?
I can't even remember the last time I've had a problem with Windows 10... a blue screen, hell, even an application crash.
The IBM PC lineage of putting unrelated components together by yourself is an anomaly in the history of home computers, or computers in general. Selling computers as a whole, with set hardware and software, is what used to be the norm, and it worked well for everyone involved. We are to some extent returning to what things used to be.
It didn't work that well for everyone (except the established vendors), otherwise it wouldn't have taken the market by storm. Modularity is a good thing to have for innovation. If we're returning to "fixed sets", that probably signals the exploratory period is dying.
Anyway, "general computing" doesn't necessarily mean mixing and mashing hardware. But it definitely means executing arbitrary software with full control over installed hardware. The trend line here is definitely from "more" to "less" of "general computing". We've started with raw metal and hardware diagrams being included in the instruction manuals; then diagrams disappeared, CPUs gained abstraction layers - but it was still fine, computing-wise. But now we're in era of OSes and even CPUs limiting what users can do with them, and multiple parties owning various pieces of a computer, with nothing left to own for the user who actually buys the machine.
> Modularity is a good thing to have for innovation
Is it though? Apple has innovated a whole lot more with the non-modular Macs and iPads than any generic x86 vendor has managed in years and years.
Modularity means software has to be designed for the lowest common denominator. And that in turn means that there is nothing to gain by innovating in hardware, because the software won't support it anyway.
Somebody who controls every part of both hardware and software can innovate much more freely than you can in a modular market. x86 PCs have been stagnant for decades now.
The Apple II had slots and was thus highly hardware-customizable. The ability to add specialized hardware to a general-purpose machine gives it that much more flexibility, whether it's additional storage, data acquisition, additional network interfaces, machine controls, etc.
Sure we have things like Thunderbolt today, but there's a lot to be said for putting the specialized stuff inside the case where it isn't cluttering up the desk and nobody can just unplug it and pocket it.
It was extendable, but not modular. You couldn't just buy different parts and put them together into an Apple II. It was sold as a whole, that you could add a few bits to.
For some reason the article's explanation about "software ate everything" and jumping between each company isn't resonating intuitively with me, so I offer my own take on the situation:
For a long time, the hardware/data center offerings of Intel, AWS, etc. were close enough to what other companies needed to not make it worth their while to invest in inventing their own solutions to low % compute problems.
However, as:
-- compute loads and costs grew
-- types of compute became more specialized
-- design and build of one's own silicon became more accessible/differentiable
-- companies' needs and tolerances for paying a premium diverged enough from what Intel/AWS was offering,
it then became worthwhile for large companies (who can sustain such hardware development efforts) to design and build their own chips, either for cost reduction or functionality-enhancing reasons. Maybe they just saw the margins being achieved by sit-on-your-hands incumbents and decided, "we could do this too, and better".
This is a great summary. Sorry if it didn't make sense. I think what's also interesting is that chip design has on one hand become more accessible, but on the other hand less so. Even with ARM cores, the cost to get to 5nm is staggering and can really only be borne by a few companies. It's very obvious who has the scale.
Heck even semiconductor companies themselves are worried if they can make it to the 5nm node alone - the XLNX/AMD and especially the IPHI/MRVL merger can be viewed from this light.
What is most interesting is the timing. Software did eat everything, and hardware used to keep up. Moore's law died a quiet death, and the solution is that everyone makes custom chips for everything. Well, then the people who have the most compute are going to make the most chips - but at a time when the cost of entry at the leading edge is the highest it has ever been. It's just a very cool time for them to have done it. I'm excited and will be following. Thanks for the feedback; the writing is hard sometimes.
No monopoly loves to talk about how it is a monopoly, though; most claim to have fierce competitors. Monopolies usually pretend to be part of much larger markets, where they are just "a small fish in a big pond".
That example is certainly extreme; but these fake "competitors/adversaries" are normal in business.
It is not unusual that one competitor becomes so large that it can basically buy up all of its rivals. But it never renames them or shuts them down. They just chug along - looking like competition.
As a counter-point, all these companies might eventually lose the goodwill of the programmers by exercising too tight a control -- and having very specific tooling where knowledge can't be transferred to other jobs -- and they could find themselves starved for quality hires.
Imagine if the average programmer needs 5 highly specialized laptops to improve their job prospects -- not many will do that even when they have the money.
I mean, until it gets to that point the big companies might not care anymore, sure.
But general-purpose computing is very far from dead or even dying. What we seem to be heading towards is more like a huge divide and a segmentation that might last decades.
"Imagine if the average programmer needs 5 highly specialized laptops to improve their job prospects -- not many will do that even when they have the money."
This may also mean that the average programmer will be stuck with a concrete platform, because the cost of migration would be too high.
There is a similar situation in a different highly qualified field - airline pilots. If you are a pilot and want to change to another airline, you lose all your seniority and begin at the bottom of the ladder, even if you are an experienced captain with a lot of miles under your belt. [0]
What we both said is not mutually exclusive. Both are correct IMO.
Sadly the users don't have much choice nowadays, and I am not saying this to gloat (being a programmer) but more like a lament -- in some areas there are far too few choices (either very expensive or just not that good) and in many others the choice is so big that it might as well not exist because you can't make an informed decision in a reasonable time frame.
Yep, it seems that things are going in that direction. But I do wonder if it will stay that way, because way too many programmers nowadays take job hopping as their birthright.
I am personally semi-okay with strong specialization and several big areas - IMO it's about time for that, because way too many colleagues bite off more than they can chew (me included) on a regular basis.
If these trends pave the way for programmer formal certification I'd support that as well.
I think AWS Inferentia [1], Amazon’s AI silicon, is much further along than the article suggests. The Alexa backend is mostly run on these chips now [2]. Other than this nitpick, the article is a great summary of how Google, Amazon, and Apple are vertically integrating silicon into their platforms.
On the other end of the spectrum, I can definitely see a trend in DIY: 3D printing, knowledge sharing through social media, MOOCs and e-learning, RISC-V, Arduino, Raspberry Pi, etc.
One could also see a future where computers become home-built, both software and hardware.
In the same vein, I must admit that I was impressed by the work these monopolies poured into the Open Compute Project. AFAIK, OCP datacenters are rare outside of these companies, but they are one of the net goods that have come out of tech monopoly vertical integration.
I think that we (non-FAANG, open source, universities) have to make sure we don't lose the ability to innovate due to not being able to buy the hardware we need. I.e., if we can't buy an M1-class processor, we can't invent new products that would be enabled by it.
My prediction on how this will play out is that it will become impossible to beat the FAANGs on compute price, as the article also points out - but I predict that it won't matter as much. If Amazon has a 50% cost advantage on AI by producing their own chips, then an algorithm twice as effective but inaccessible to Amazon would be enough to catch up.
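The arithmetic behind that claim: if the cost of serving a result is (unit compute price) x (compute per result), then Amazon's in-house chips give it

$$0.5p \cdot c,$$

while a rival paying the full price p but running an algorithm that needs only half the compute pays

$$p \cdot \frac{c}{2} = 0.5pc$$

- exactly the same. A 2x algorithmic win neutralizes a 50% hardware cost advantage.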
So my prediction is that as chip designs get more exclusive, algorithmic research will become more important again.
Everyone is using interpreted languages because CPU time is still considered cheap. So if you want to prepare for this future as a coder, learn C++ or Rust.
> This is a barrier to entry that few companies can really climb over anymore, with 500 million in R&D only possible by a few companies
I wonder what could be done to reduce that barrier. It sounds as though there is a lot of value that could be unlocked by making it easier (cheaper) to get over it.
I imagine a company that could drop the verification or software costs by an order of magnitude would suddenly find itself with a long lineup of customers. After all, if only 270 companies can currently get over that barrier, reducing it by even $50m or $100m would open up the market to a large number of companies, all with fairly large pockets and a strong incentive to be customers.
And of course, I say this having never worked in the hardware design industry, so all are welcome to explain in excruciating detail how naïve I am.
Good read and I didn't know about the rumors of Facebook's ARM chips (for what though?), Microsoft's Project Catapult, nor the imminent custom chips for Waymo.
I think the next frontier will likely be network bandwidth - in particular, user-facing production bandwidth beyond the internal backbones. Currently it's a finite resource that all network providers have to negotiate for, and it involves more than just the big tech companies. But as workloads continue shifting to the cloud (leveraging cost efficiencies such as AWS Graviton chips), I expect companies to vie for greater ownership of this external network bandwidth.
I'd be curious about any behind-the-scenes negotiations with Starlink to leverage its growing satellite constellation to address this.
Sometimes I hear that "hardware providers are holding dotcoms in a chokehold."
That's utterly wrong. Silicon makers themselves are under intense pressure from the dotcoms' massive negotiating weight. Chasing the Facebook, Amazon, Google trio is not a pleasant experience, as many OEMs have found out.
Unlike with hardware makers themselves and OEMs, there is no way any component manufacturer can simply "turn the switch off" on a dotcom client. Such clients can simply wait out a few hardware generations and buy older products just fine.
Now imagine yourself as an OEM, and that a top chipmaker has declined to ship you its latest and greatest chip/part. This becomes instantly less funny. Imagine a graphics card vendor having to ship cards with 3-year-old GPUs, or a laptop vendor having to do the same.
It's interesting how this goes in cycles. Car companies used to be integrated: VW had a factory where they poured melts in one end and pooped out completed cars at the other. The trouble is that sort of vast and sprawling enterprise breeds inefficiencies. The response to that was to start outsourcing more and more work and lean on their suppliers for innovation. Then they suddenly find they can't differentiate themselves in the market and start bringing things back in-house again. VW is bringing software development back in-house now because it is a key part of the product.
I guess the key is figuring out what's best done in-house and what works best farming out to suppliers. SpaceX found it cheaper to do many things themselves than go out to the established aerospace suppliers. Whether that remains the case ten years from now is open to question.
I'm not buying the argument.
Firstly I don't think that designing competitive AI accelerators requires the resources that would restrict the market to a few high-end competitors.
From what I understand, they are a sea of low precision DSPs specialized for convolution, connected by a high-speed interconnect.
While I would hesitate to claim that they are simple to design, they don't necessarily have the staggering amount of design complexity of a modern CPU or GPU.
Second, for CPUs I don't think that a complete clean-sheet design makes sense for a lot of companies. Yes, I get it, Apple has taken the world by storm - but let's not forget that Apple gets a lot of mileage from very few designs; probably the same CPU core ends up in the iPhone, iPad, Apple TV and now the MacBook, with slight modifications.
It is the same for Samsung, ARM, Qualcomm, Intel, etc. - most of their CPU designs can be aimed at the markets for servers, game consoles, smart devices, PCs, notebooks, etc.
If AWS designs a server core, their entire market is probably AWS instances - not a small one by any means, but smaller than a dedicated CPU manufacturer's - so the sensible amount of investment they can make in CPU design is lower than a dedicated CPU manufacturer's. I'm not saying it doesn't make sense for them - it probably does - but it's far too early to predict the demise of CPU companies.
>Google - In-House Opportunity - TPU chip, Custom ARM chip for phone
Google will put their custom chip in their Chromebooks for sure. There were already a couple of Chromebooks with some weird SoC that was supposedly purpose-built for Chromebooks; I can't remember the name, but I think it was from Spreadtrum.
That makes me wonder which widely adopted software will be the first to ship optimizations for new-generation non-standard hardware, for example Optane storage. Postgres? MongoDB? Redis? Any of the competing time-series DBs?
The difference between 28, 16 and 5nm is much less than the sales people would like you to think.
The costs are exploding; not only are revenues going to drop because there is no reason to buy anything new, but combine that with more "subscription" plans where you don't own your wares, and with the shorter lifespans of planned obsolescence, and you end up in the eye of the perfect storm.
Combine that with global peak energy + debt, and covid is just another margin call.
Get out now or suffer the consequences for the rest of eternity!
Linux on the Raspberry Pi 4 and Jetson Nano is the only exit, and it will close sooner than you think.
Edit: PCIe NVMe and Linux driver/OS bugs on these are the last bottleneck - hopefully solved soon!