Nvidia's stake in Intel could have terrible consequences. First, it is in Nvidia's interest to kill Intel's Arc graphics, and that would be very bad because it is the only thing bringing GPU prices down for consumers. Second, the death of Intel graphics / Arc would be extremely bad for Linux, because Intel's approach to GPU drivers is the best for compatibility, whereas Nvidia is actively hostile to drivers on Linux. Third, Intel is the only company marketing consumer-grade graphics virtualization (SR-IOV), and the loss of that would make Nvidia's enterprise chips the only game in town, meaning the average consumer gets less performance, less flexibility, and less security on their computers.
Conclusion: Buy AMD. Excellent Linux support with in-tree drivers, and for 15 years now! And a bug is something that will actually get fixed.
Nvidia's GPUs are theoretically fast in initial benchmarks, but that's mostly because everyone else optimizes their software for Nvidia. That's it.
Everything Nvidia has done is a pain. Closed-source drivers (old pain), out-of-tree drivers (new pain), ignoring (or actively harming) Wayland (everyone handles implicit sync well, except Nvidia, which required explicit sync[1]), and awkward driver bugs declared as “it is not a bug, it is a feature”. The infamous bug:
This extension provides a way for applications to discover when video
memory content has been lost, so that the application can re-populate
the video memory content as necessary.
This extension will soon be ten years old. At least they intend to fix it? They just didn't in the past 9 years! Basically, video memory could be gone after suspend/resume, a VT switch and so on. The good news is that, after years, someone figured that out and implemented a workaround for X11 with GNOME.
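The actual GNOME-side patch isn't quoted here, but the detection mechanism the extension provides is easy to sketch. A minimal, hypothetical illustration in C, assuming a GL context created with reset notification enabled and a loader that exposes glGetGraphicsResetStatus (core in GL 4.5, GetGraphicsResetStatusARB in GL_ARB_robustness); the enum fallback value is the one listed in the extension registry:

    /* Hypothetical sketch (not the GNOME patch): polling for purged video
     * memory via GL_NV_robustness_video_memory_purge. Error handling and
     * GL function loading are omitted for brevity. */
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>

    #ifndef GL_PURGED_CONTEXT_RESET_NV
    #define GL_PURGED_CONTEXT_RESET_NV 0x92BB  /* value per the extension registry */
    #endif

    /* Application-provided (illustrative name): re-uploads textures, buffers,
     * FBO contents, etc. from CPU-side copies. */
    extern void repopulate_video_memory(void);

    void check_for_purged_vram(void)
    {
        /* With the NV purge extension, the robustness reset-status query can
         * additionally report that VRAM contents were lost, e.g. after
         * suspend/resume or a VT switch. */
        GLenum status = glGetGraphicsResetStatus();

        if (status == GL_PURGED_CONTEXT_RESET_NV) {
            /* The context itself is still valid, but resource contents are gone. */
            repopulate_video_memory();
        }
    }

In other words, the driver won't preserve the data for you; it just gives applications (or the compositor) a way to notice the loss and rebuild.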
AMD is not competing enough with NVIDIA, so they are not a solution.
What I mean is that whenever NVIDIA removed features from their "consumer" GPUs in order to reduce production costs and increase profits, AMD immediately followed them, instead of attempting to offer GPUs that have something that NVIDIA does not have.
Intel at least tries to be a real competitor, e.g. by offering much, much better FP64 performance or by offering more memory.
If Intel's discrete GPUs disappear, there will be no competition in consumer GPUs, as AMD tries to compete only in "datacenter" GPUs. I have ancient AMD GPUs that I cannot upgrade to newer AMD GPUs, because the newer GPUs are worse, not better (for computational applications; I do not care about games), while Intel offers acceptable substitutes, due to excellent performance per $.
Moreover, NVIDIA also had excellent Linux driver support for more than 2 decades, not only for games, but also for professional graphics applications (i.e. much better OpenGL support than AMD) and for GPU computing applications (i.e. CUDA). AMD gets bonus points for open-source drivers and much more complete documentation, but the quality of their drivers has been typically significantly worse.
NVIDIA has always had good support even for FreeBSD, where I had to buy discrete NVIDIA GPU cards for computers with AMD APUs, because those APUs were not supported on any OS other than Windows and Linux.
AMD "consumer" GPUs are a great choice for those who are interested only in games, but not for those interested in any other GPU applications. AMD "datacenter" GPUs are good, but they are far too expensive to be worthwhile for small businesses or for individuals.
I've found the amdgpu Linux driver to be fairly buggy running dual monitors with my Radeon VII, and found things like the fTPM to be highly buggy on Threadripper 2k/x399 to the point that I had to add a dTPM. They never got things truly working properly with those more-niche products before they just.. kind of... stopped working on them. And of course ROCm is widely regarded to be a mess.
On the other hand, my Steam Deck has been exceedingly stable.
So I guess I would say: Buy AMD but understand that they don't have the resources to truly support all of their hardware on any platform, so they have to prioritize.
Contrast that with the earlier R9 285 that I used for nearly 10 years, until I was finally able to get a 9070 XT that I'm very happy with. They are still refining support for that aged GCN 1.2 part even today, even if fixes are a lower priority to backport.
Overall, the ONLY things I'm unhappy about with this GPU generation:
* Too damned expensive
* Not enough VRAM (and no ECC off of workstation cards?)
* Too hard for average consumers to just buy direct and cut out the scalpers
The only way I could get my hands on a card was to buy through a friend that lives within range of a Microcenter. The only true saints of computer hardware in the whole USA.
Both of which NVidia does a lot better in practice! I'm all for open-source in-tree drivers, but in practice, 15 years on, AMD is still buggy on Linux, whereas NVidia works well (not just on Linux but on FreeBSD too).
> I don’t judge whether implicit sync or explicit are better.
> Both of which NVidia does a lot better in practice!
Correction - if they care. And they don't care to do it on Linux, so you get them dragging their feet for decades on something like Wayland support, PRIME, you name it.
Basically, the result is that in practice they offer abysmally bad support, otherwise they'd have upstream kernel drivers and no userspace blobs. Linux users should never buy Nvidia.
I don't understand what you're saying here. I've used NVidia on Linux and FreeBSD a lot. They work great.
If your argument is they don't implement some particular feature that matters to you, fair enough. But that's not an argument that they don't offer stability or Linux support. They do.
Taking very long to implement stuff is a perfect argument of bad support for the platform. Timely support isn't any less important than support in general.
Are you a product manager? Or do you just not see the irony in your comment?
Long term support means my thing that has been working great continues to work great. New feature implementation has nothing to do with that and is arguably directly against long term support.
And Nvidia seems justified in this, since effectively no distro dropped X11 until Nvidia had Wayland support.
If you think taking decades is an acceptable rate while others do it in a timely manner, that's your own problem. For any normal user it's completely unacceptable and is the opposite of great (add to that the fact that even after decades of dragging their feet they only offer half-cooked support and still can't even sort out upstreaming their mess). Garbage support is what it is.
AMD is notorious for not having ROCm support on currently sold, in-production GPUs, and for horrendous bugs that actually make the devices unusable.
I use AMD GPUs on Linux, and I generally regret not just buying an Nvidia GPU, purely because of AMD's lacklustre support for compute use cases in general.
Intel is still too new in the dGPU market to trust and on top of that there is so much uncertainty about whether that entire product line will disappear.
So at this point the CUDA moat makes the choice a non-issue; on top of that, with Nvidia what works works and keeps working, whereas with AMD I constantly wonder whether something will randomly stop working after an update.
A timeline of decades for “features” your biggest consumers don’t care about is a reasonable tradeoff, even more so if actually pushing those features would reduce stability.
That's exactly the point. Nvidia might care about industrial use cases, but they don't care about desktop Linux usage, and as a result their support there is garbage.
I've been using Nvidia gpus exclusively on debian linux for the past 20 years, using the binary Nvidia drivers. Rock solid stability and excellent performance. I don't care for Wayland as I plan to stay on Xorg + Openbox for as long as I can.
Wayland support hasn't been an issue since GLX was deprecated for EGLStreams. I think the Nvidia backend has been "functional" for ~3 years and nearly flawless for the past year or so.
Both Mutter and KWin have really good Nvidia Wayland sessions nowadays.
It got better, but my point is how long it took to get better. That's the indicator of how much they care about Linux use cases in general. Which is way below acceptable level - it's simply not their priority (which is also exacerbated by their hostile approach to upstreaming).
I.e. if anything new will need something implemented tomorrow, Nvidia will make their users wait another decade again. Which I consider an unacceptable level of support and something that flies in the face of those who claim that Nvidia supports Linux well.
Buying AMD (for graphics) has been the only ethical choice for a long time. We must support the underdogs. Since regulation has flown the coop, we must take responsibility ourselves to fight monopolies. The short-term costs may be a bit higher, but the long-term payoff is the only option for our self-interest!
> Conclusion: Buy AMD. Excellent Linux support with in-tree drivers.
Funnily, AMD's in-tree drivers are kind of a pain in the ass. For up to a year after a new GPU is released, you have to deal with using Mesa and kernel packages from outside your distro. While if you buy a brand-new nVidia card, you just install the latest release of the proprietary drivers and it'll work.
Linux's driver model really is not kind to new hardware releases.
Of course, I still buy AMD because Nvidia's drivers really aren't very good. But that first half a year was not pleasant last time I got a relatively recently released (as in, released half a year earlier) AMD card.
A lot of people want to use Ubuntu or Ubuntu-based distros.
I have since switched from Ubuntu to Fedora, maybe Fedora ships mesa and kernel updates within a week or two from release, I don't know. But being unable to use the preferred distro is a serious downside for many people.
ATI/AMD open-source Linux support has been blowing hot and cold for over 25 years now.
They were one of the first to actually support open-source drivers, with the r128 and original Radeon (r100) drivers. Then they went radio silent for the next few years, though the community used that as a baseline to support the next few generations (r100 to r500).
Then they reemerged, actually providing documentation for their Radeon HD series (r600 and r700) and some development resources, albeit limited, and often at odds with the community-run equivalents at the time (lots of parallel development with things like the "radeonhd" driver, and disagreements on how much they should rely on their "atombios" card firmware).
That "moderate" level of involvement continued for years, releasing documentation and some initial code for the GCN cards, but it felt like beyond the initial code drops most of the continuing work was more community-run.
Then only relatively recently (the last ~10 years) have they started putting actual engineering effort into things again, with AMDGPU and the majority of mesa changes now being paid for by AMD (or Valve, which is "AMD by proxy" really as you can guarantee every $ they spend on an engineer is $ less they pay to AMD).
So hopefully that's a trend you can actually rely on now, but I've been watching too long to think that can't change on a dime.
It is possible that at some point, maybe 15 years ago, AMD provided sufficient documentation to write drivers, but even 10 years ago a lot of documentation was missing (without that fact even being mentioned), which made trying to contribute rather frustrating. It was not too bad, though, because, as you said, they had a (smallish) number of employees working on the open drivers by then.
Those who want to run Linux seriously will buy AMD. Intel will be slowly phased out, and this will reduce maintenance and increase the quality of anything that previously had to support both Intel and AMD.
However, if Microsoft or Apple scoop up AMD, all hell will break loose. I don’t think either would have interest in Linux support.
Oh boy, that strikes a nerve with the "video memory could be gone after Suspend/Resume". Countless hours lost trying to fix a combination of drivers and systemd hooks so my laptop could suspend/hibernate and wake back up without issues... Which gets even more complicated when using Wayland.
I have been looking at high-end laptops with dedicated AMD Graphics chip, but can't find many... So I will probably go with AMD+NVidia with MUX switch, let's see how it goes... Unless someone else has other suggestions?
> Basically, video memory could be gone after Suspend/Resume, VT-Switch and so on.
This actually makes sense: for example, a new task has swapped out a previous task's data, or a host and guest are sharing the GPU and pushing each other's data out. I don't understand why this is not a part of GPU-related standards.
As for a solution, wouldn't discarding all the GPU data after resume help? Or keeping a copy of the data in system RAM.
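The "copy in system RAM" option is essentially what the extension text expects applications to do for themselves: keep the source data CPU-side and re-upload it when a purge is reported. A rough sketch of that pattern in C (names and structure are hypothetical, not from any real driver or toolkit):

    /* Illustrative sketch of the shadow-copy approach: every texture keeps its
     * pixels in system RAM so it can be re-uploaded if the driver reports that
     * VRAM contents were purged. */
    #include <GL/gl.h>
    #include <stdlib.h>
    #include <string.h>

    typedef struct {
        GLuint  id;            /* GL texture object */
        GLsizei width, height;
        void   *pixels;        /* shadow copy kept in system RAM */
    } ShadowTexture;

    static void upload(ShadowTexture *t)
    {
        glBindTexture(GL_TEXTURE_2D, t->id);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, t->width, t->height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, t->pixels);
    }

    ShadowTexture *shadow_texture_create(GLsizei w, GLsizei h, const void *rgba)
    {
        ShadowTexture *t = malloc(sizeof *t);
        t->width  = w;
        t->height = h;
        t->pixels = malloc((size_t)w * h * 4);
        memcpy(t->pixels, rgba, (size_t)w * h * 4);  /* the system-RAM copy */
        glGenTextures(1, &t->id);
        upload(t);
        return t;
    }

    /* Called after a purge has been detected (e.g. via the reset-status query
     * sketched earlier in the thread): the contents certainly need restoring. */
    void shadow_texture_restore(ShadowTexture *t)
    {
        glDeleteTextures(1, &t->id);
        glGenTextures(1, &t->id);
        upload(t);
    }

Discarding and regenerating everything after resume amounts to the same thing; the catch is that something has to notice the purge in the first place, which is what the reset-status query is for.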
Last time I tried to file a bug for a crash in an AMD Windows driver, I had to go through an anonymous developer I found on Discord, and despite weeks of effort writing and sharing a test case, they chose to ignore the bug report in the end. The developer even asked not to be named, as he might face repercussions for trying to help out.
I once had a mini PC with Nvidia. I got it for CUDA dev. One day support for it was dropped, so I was unable to update my system without it messing things up. So regardless of CUDA, I decided Nvidia is not for me.
However, doing research when buying a new PC, I've found that AMD kind of sucks too. ROCm isn't even supported on many of the systems I was looking into. Also, I've heard their Linux graphics drivers are poor.
So basically I just rock a potato with Intel integrated graphics now. GPUs cost too much to deal with that nonsense.
In your case maybe, but not according to some of the comments here in this very thread and also in some forums and YouTube videos back when I'd last checked.
FWIW, my experience gaming/web browsing/coding on a 3070 with modern drivers has been fine. Mutter and KWin both have very good Wayland sessions if you're running the new (>550-series) drivers.
Apparently it is 5% ownership. Does that give them enough leverage to tank Intel’s iGPUs?
That would seem weird to me. Intel’s iGPUs are an incredibly good solution for their (non-glamorous) niche.
Intel’s dGPUs might be in a risky spot, though. (So… what’s new?)
Messing up Intel’s iGPUs would be a huge practical loss for, like, everyday desktop Linux folks. Tossing out their dGPUs, I don’t know if it is such a huge loss.
> Tossing out their dGPUs, I don’t know if it is such a huge loss
It would be an enormous loss to the consumer/enthusiast GPU buyer, as a third major competitor is improving the market from what feels like years and years of dreadful price/perf ratio.
amd is slow and steady. they were behind many times, and many times they surprised with amazing innovations, overtaking intel. they will do it again, for both CPU and GPU.
Intel's iGPUs don't seem very at risk because the market for low-power GPUs isn't very profitable to begin with. As long as Nvidia is able to sell basically any chip they want, why waste engineering hours and fab time on low-margin chips? The GT 1030 (Pascal) never got a successor, so that line is as good as dead.
Even before the Pascal GTs, most of the GT 7xx cards, which you would assume were Maxwell or Kepler from the numbering, were rebadged Fermi cards (4xx and 5xx)! That generation was just a dumping ground for all the old chips they had laying about, and given the prominence of halfway decent iGPUs by that point, I can't say I blame them for investing so little in the lineup.
That said, the dGPUs are definitely somewhat at risk, but I think the risk is only slightly elevated by this investment, given that it isn't exactly a cash cow and Intel has been doing all sorts of cost-cutting lately.
Aren't a lot of those cards sold for the audience that needs more display heads rather than necessarily performance?
This has been somewhat improved-- some mainboards will have HDMI and DisplayPort plumbed to the iGPU, but the classic "trader desk" with 4-6 screens hardly needs a 5090.
They could theoretically sell the same 7xx and 1030 chips indefinitely. I figure it's a static market like those strange 8/16Mb VGA chipsets that you sometimes see on server mainboards, just enough hardware to run diagnostics on a normally headless box.
Agree. Not only would there be no money in it to try to replace Iris graphics or whatever they call them now -- it would be ultra pointless because the only people buying integrated graphics are those where gaming, on-device AI, and cryptocurrency aren't even part of the equation. Now, that is like 80%+ of the PC market, but it's perfectly well served already.
I saw this move more as setting up a worthy competitor to Snapdragon X Elite, and it could also probably crush AMD APUs if these RTX things are powerful.
Intel sells discrete cards, and their next card was set up to do AI and games competently. They were poised to compete with the low- to mid-range Nvidia cards at HALF the cost.
It was definitely going to upset the market. Now I understand the radio silence on a card that was supposed to have been coming by Xmas.
Oh for sure. Arc is in jeopardy. Though tbh it was already, wasn't it? Can't you see an alternate universe where this story never happened, but Intel announced today "Sorry, because our business is dying in general and since Arc hasn't made us a ton of money yet anyway, we need to cut Arc to focus on our core blah blah blah".
I just meant their integrated GPUs are what's completely safe here.
It wasn't in jeopardy for being no good, it was in jeopardy because Intel is so troubled. Like the Bombardier C-Series jet: Everyone agreed it was a great design and very promising, but in the end they had no choice but to sell it to Airbus (who calls it the A220), I think because they didn't really have the money to scale up production. In like manner, Intel lacks the resources to make Arc the success it technically deserves to be, and without enough scale, they'll lose money on Arc, which Intel can hardly afford at this point.
Calling BS on "gaming not part of the equation". Several of my friends and I game exclusively on integrated graphics. Sure, we don't play the most abusively unoptimized AAA games like RDR2. But we're here and we're gaming.
RDR2 is quite optimized. We spent a lot of time profiling before release, and while input latency can be a tad high, the rendering pipeline is absolutely highly optimized, as exhibited by the large number of benchmarks on the web.
Sorry, I'm happy for you, and I do play Minecraft on an iGPU. I just meant that about 80% of the PCs sold seem to be for "business use" or Chromebooks, and the people writing those POs aren't making their selections with gaming in mind.
(And also, I'm pretending Macs don't exist for this statement. They aren't even PCs anymore anyway, just giant iPhones, from a silicon perspective.)
That performance is not surprising, Arc seems pretty dope in general.
I hadn't realized that "Arc" and "Integrated" overlapped, I thought that brand and that level of power was only being used on discrete cards.
I do think that integrated Arc will probably be killed by this deal, though: not for being bad (it's obviously great), but for being a way for Intel to cut costs with no downside for Intel. If they can make RTX iGPUs now, with the Nvidia and RTX brands being the strongest in the gaming space... Intel isn't going to invest the money in continuing to develop Arc, even if Nvidia made it clear that they don't care; it just doesn't make any business sense now.
That is a loss for the cause of gaming competition. Although having Nvidia prop up Intel may prove to be a win for competition in terms of silicon in general versus them being sold off in parts, which could be a real possibility it seems.
"Gaming" = "real-time-graphics-intensive application". You could be playing chess online, or emulated SNES games, but that's not what "gaming" refers to in a hardware context.
> Sure we don't play the most abusively unoptimized AAA games like RDR2.
Wait, RDR2 is badly optimized? When I played it on my Intel Arc B580 and Ryzen 7 5800X, it seemed to work pretty well! Way better than almost any UE5 title, like The Forever Winter (really cool concept, but couldn't get past 20-30 FPS, even dropping down to 10% render scale on a 1080p monitor). Or with the Borderlands 4 controversy, I thought there'd be way bigger fish to fry.
It would be amusing to see nVidia cores integrated into the chipset instead of the Intel GPU cores. I doubt that is in the cards unless Intel is looking to slash the workforce by firing all of their graphics guys.
I would guess Nvidia doesn't care at all about the iGPUs, so I agree they are probably not at risk. dGPUs, though, I absolutely agree are in a risky spot. Perhaps Intel was planning to kill their more ambitious GPU goals anyway, but that seems extremely unhealthy for pretty much everyone except Nvidia.
We'd have to see an approximation of their cap table, but I've seen functional control over a company with just a hair over 10% ownership, given the voting patterns of the other stockholders.
5% by about any accounting makes you a very, very influential stockholder in a publicly traded company with a widely distributed set of owners.
Intel was already dead; even money from the government didn't help them. It is an old, legacy, bad corp. I think NV just wants to help them and use them however it wants. Intel management will do anything they say.
Intel's GPUs are a better solution for almost all computing outside high-end gaming, AI, and a few other tasks. For most things a better GPU is overkill and wastes energy.
- The datacenter GPU market is 10x larger than the consumer GPU market for Nvidia (and it's still growing). Winning an extra few percentage points in consumer is not a priority anymore.
- Nvidia doesn't have a CPU offering for the datacenter market and they were blocked from acquiring ARM. It's in their interest to have a friend on the CPU side.
- Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC. Intel is one of the only other leading fabs onshoring, which significantly improves Nvidia's supplier negotiation position and hedges geopolitical risk.
> Nvidia doesn't have a CPU offering for the datacenter market and they were blocked from acquiring ARM. It's in their interest to have a friend on the CPU side.
Someone should tell nvidia that. They sure seem to think they have a datacenter CPU.
I wonder if this signals a lack of confidence in their CPU offerings going forward?
But there's always TSMC being a pretty hard bottleneck - maybe they just can't get enough (and can't charge close to their GPU offerings per wafer), and pairing with Intel themselves is preferable to just using Intel's Foundry services?
>Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC.
The East India Company conducted continental wars on its own. A modern company with a $4T valuation, a country-GDP-sized revenue, and possession of the key military technology of today's and tomorrow's wars (AI software and hardware, including robotics) can successfully wage such a continental war through a suitable proxy, say an oversized private military contractor (especially if it is massively armed with drones and robots), and in particular is capable of defending an island like Taiwan. (Or, thinking backwards: an attack on Taiwan would cause a trillion or two drop in NVDA valuation. What options get on the table when there is a threat of a trillion-dollar loss? To compare, 20 years of Iraq cost 3 trillion, i.e. $150B/year buys you a lot of military hardware and action, and an efficient defense of Taiwan would cost much less than that.)
Not necessarily. Territorial war requires people. Defense from kinetic strikes on key objects concentrated on smallish territory requires mostly high-tech - radars and missiles - and that would be much easier for a very rich high-tech US corporation.
An example: the Starlink antenna, sub-$500, is a phased array which is actually like a half or a third of such an array on a modern fighter jet, where it costs several million. Musk naturally couldn't go the way of a million per antenna, so he had to develop and source it on his own. The same with anti-missile defense: if/when NVDA gets to the point of defending the TSMC fabs, NVDA would produce such defense systems orders of magnitude cheaper, and that defense would work much better than modern military systems.
> Taiwanese gov prevents them from doing it. Leading node has to be on Taiwanese soil
This is a bold claim. Do you have any public evidence to share? I have never once seen this mentioned in any newspaper article that I have read about TSMC and their expansion in the US.
Nvidia's options are: fund your competition to keep the market dynamic, or let the government do it by breaking you apart.
So yes. That's how American competition works.
It isn't a zero sum game. We try to create a market environment that is competitive and dynamic.
Monopolies are a threat to both the company and a free, open, dynamic market. If Nvidia feels it could face an antitrust suit, which is reasonable, it is in its best interest to fund the future of Intel.
Will Nvidia continue to exist beyond the current administration? If yes, then would it be prudent to consider the future beyond the current administration?
Microsoft wasn't funding a bankrupt Apple; Microsoft was settling a lawsuit with Jobs just on the cusp of the DOJ monopoly lawsuit. Microsoft had been stealing and shipping Apple's QuickTime source code.
> handwritten note by Fred Anderson, Apple's CFO, in which Anderson wrote that "the [QuickTime] patent dispute was resolved with cross-licence and significant payment to Apple." The payment was $150 million
One interesting parallel is Intel and AMD back in 1991 over x86, which is the reason AMD is today allowed to produce x86 at all without massive patent royalties to Intel. [Asianometry](https://youtu.be/5oOk_KXbw6c) has a nice summary of it.
Nvidia is leaning more into data centres but lacks a CPU architecture or expertise. Intel is struggling financially but has knowledge of iGPUs and a vast number of patents.
They could have a lot to give one another, and it's a massive win if it keeps Intel afloat.
Yeah, I think Nvidia were hostile to Linux when they saw no value in it. Now it's where the machine learning is; it's the OS powering the whole AI hype train. Then there is also the Steam Deck making Linux gaming not a complete write-off anymore.
The article hints at it, but my guess would be that this investment is intended towards Intel foundry and getting it to a place where NVIDIA can eventually rely on it over TSMC, with the ownership stake largely there to give them upside if/when Intel stock goes up on news of an NVIDIA contract etc. It isn't that uncommon an arrangement for enterprise deals of such potential magnitude. Long-term, however, and even without NVIDIA making the call, that could definitely have the effect of leading Intel to divest from directly competing in as many markets, i.e. Arc.
For context, I highly recommend the old Stratechery articles on the history of Intel foundry.
My first thought was also that this relates to Intel's foundry business. Even if only to be able to use it in price negotiations with TSMC (it's hard to threaten to go elsewhere when there is no elsewhere left).
Do we want Intel to fall and go bankrupt? Or do we want Intel to survive? I don't think most people are clear on what is happening here. This is it. The Margin Call moment.
Intel could either transform into a fabless company and compete on design, manufacturing with TSMC, or continue to be a foundry player, crucial to US strategic interests. You can only pick one, and competing on just one of them is already a monumental task.
The GPU business is burning money, with no short-term success in sight that could make it cash-flow positive in a 3-4 year time frame, especially given the strong roadmap Nvidia has. I have been saying this since 2016, and we are now coming up on 2026; recent numbers suggest Intel is at less than 1% discrete market share.
This gives a perfect excuse for Intel to quit GPUs. Nvidia provides the cash flow to, hopefully, continue to develop 18A and 14A. Manufacture Nvidia GPUs for them, and slowly transition to an x86 + foundry model only, or even solely manufacture for Nvidia. The US administration could further push Apple, Qualcomm and Broadcom to use Intel in some capacity. That assumes Intel can keep up with TSMC, which is probably a comparatively easier task than tackling the GPU market.
I am assuming the Intel board is happy with that direction, though, because so far they have shown they completely lack any strategic vision.
This seems like it could be a long term existential threat for AMD. AMD CPU + GPU combos are finally coming out strong, both with MI300+ series in the supercomputing space, Strix Halo in laptops, etc. They have the advantage of being able to run code already optimized for x86 (important for gamers and HPC code), which NVIDIA doesn't have. Imagine if Grace Blackwell had x86 chips instead of Arm. If NVIDIA can get Intel CPUs with its chip offerings, it could be poised to completely take over so many new portions of the market/consolidate its current position by using its already existing mindshare and market dominance.
This seems more like the deal where Microsoft invested in Apple. It’s basically charity and they will flip it in a few years when Intel gets back on their feet.
Using fortunes that fall into its lap to kill competition is a common practice of economics-oriented (vs. technology-oriented) organizations. That brings benefits only for the organization; for everyone else it brings damage and disappointment.
Yeah, Nvidia has trillions at stake, Intel a mere 100B. It's more in the interests of Nvidia to interfere with Intel's GPU business than to help it, and the only things they want from Intel are the fabs.
At this point Nvidia is just shooting themselves in the foot with hostility towards Linux: they actively use Linux for their DGX systems, and their internal dependency on Linux is only going to grow.
Something about this reminds me of other industry-gobbling purchases. None of them ever turned out better for the product, the price, or the general well-being of society.
As an Apple user (and even an Apple investor), I'd rather that Apple went out of business back then. If we could re-roll the invention of the (mainstream) smartphone, maybe we'd get something other than two monopolistic companies controlling everything.
For instance, maybe if there were 4 strong vendors making the devices with diverse operating systems, native apps wouldn't have ever become important, and the Web platform would have gotten better sooner to fill that gap.
Or maybe it'd have ended up the same or worse. But I just don't think Apple being this dominant has been good for the world.
Or... we could still be using blackberry-like devices without much in the way of active/touch interface development at all. Or worse, the Windows CE or Palm with the pen things.
Why? Was Steve Jobs literally the only human who was capable of seeing the massive unserved demand that existed back then?
Sidekick was amazing for its time, but only on one also-ran carrier. BlackBerry had great features like BBM (essentially iMessage) but underpowered for multimedia and more difficult to learn. If Apple was out of business, one or more companies would have made the billions on MP3 players that iPod made, and any of them could have branched into phones and made a splash the same way. Perhaps Sony, perhaps Microsoft. Microsoft eventually figured it out -- the only reason they failed was that they waited for both Apple and Android to become entrenched so in this timeline they could have been the second-mover, but unlike with Apple and Android, maybe neither MS nor Google would have automatically owned the US marketshare the way Apple does[1]. If that were the case, we may have competition, instead of the unhealthy thing we have where Apple just does whatever they want.
With all due respect, there's a simple answer to why Apple was destined to win the smartphone race: they had a five-year lead over everyone else because they had the OS and touch interface tightly integrated. On top of that, they managed to scale up production of the glass necessary for the touch to work, and partnered with a network provider to overcome the control network providers had over handset producers.
They had such a lead that nobody was going to catch up and eat into their economic profits. Sure, Samsung et al. have captured market share, but they have not eaten into Apple's economic profits.
Whether you like it or not, this hard work, effort and creativity deserves to be rewarded - in the form of monopoly/oligopoly profits.
Apple has shown itself to be very disciplined with its cash. That cannot be said for Google, which, instead of taking on an endless stream of vanity projects, should return that cash to shareholders.
BB10 was the shit. Fantastic OS and (some models) a great hardware keyboard. But it was already a response to the iPhone, wouldn't have happened without...
There's nothing supernatural about Apple that meant only they could do something better than that shitty generation of devices. Remember, the portable consumer electronics market would certainly have other huge players if Apple hadn't existed to make the iPod. BlackBerry, Microsoft, and Sony come to mind. iPhone, based mainly on Apple's popularity from the iPod era, got a huge jump from that, and then the rush for native apps, which encourages consolidation, smothered every other company's competing devices (such as WebOS, BlackBerry 10, Windows Mobile) before they had a chance to compete.
To be honest, Android may have met a similar fate if Apple had been able to negotiate a non-exclusive contract with Cingular/AT&T. My understanding though was that they had to give exclusivity as a bargaining chip to get all the then-unthinkable concessions, as yeah, every phone was full of garbage bloatware and festooned with logos.
Both. Also things like sound cards, network cards, peripherals in general.
My happiness and stability while using Linux have been well correlated with the number of devices with Intel in the name. Every single device without Intel invariably becomes a major pain in my ass every single time.
It's gotten to the point I assume it will just work if it's Intel.
> And does AMD not have excellent Linux support for their own CPUs and GPUs?
They're making a lot of progress but Intel is still years ahead of them.
Earlier this year I was researching PC parts for a build and discovered AMD was still working on merging functionality like on-die temperature sensors into the kernel. It makes me think I won't have the full feature set on Linux if I buy one of their processors.
Well, AMD isn't going away yet, and they do seem to have finally realized the advantage of open-source drivers. But that's still very bad for competition and prices.