As TFA rightly points out, unless something drastically changes in the next ~6mo, Intel is going to launch into the most favorable market situation we've seen in our lifetimes. Previously, the expectation was that they needed to introduce something competitive with the top-end cards from Nvidia and AMD. With basically all GPUs out of stock currently, they really just need to introduce something competitive with almost anything on the market to be able to sell as much as they can ship.
1) I hate to say "year of desktop Linux" like every year, but with the Steam Deck release later this year, and Valve's commitment to continue investing in and collaborating on Proton to ensure wide-ranging game support, Linux gaming is going to grow substantially throughout 2022, if only due to the new devices added by Steam Decks.
Intel has always had fantastic Linux video driver support. If Arc is competitive with the lowest end current-gen Nvidia/AMD cards (3060?), Linux gamers will love it. And, when thinking about Steam Deck 2 in 2022-2023, Intel becomes an option.
2) The current-gen Nvidia/AMD cards are insane. They're unbelievably powerful. But here's the kicker: the Steam Deck is 720p. Go out and buy a brand new Razer/Alienware/whatever gaming laptop, and the most common resolution even on the high-end models is 1080p (w/ high refresh rate). The Steam Hardware Survey puts 1080p as the most common resolution, and IT'S NOT EVEN REMOTELY CLOSE to #2 [1] (720p 8%, 1080p 67%, 1440p 8%, 4K 2%) (did you know more people use Steam on macOS than on a 4K monitor? lol)
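To put those survey numbers in perspective, here's a quick back-of-envelope tally; a minimal sketch using only the rounded figures quoted above, not the full survey breakdown:

```python
# Rounded primary-display resolution shares from the Steam Hardware Survey,
# as quoted above (the remaining share is spread across other resolutions).
shares = {"720p": 8, "1080p": 67, "1440p": 8, "4K": 2}

at_or_below_1080p = shares["720p"] + shares["1080p"]
print(f"1080p or lower: {at_or_below_1080p}%")                       # 75%
print(f"1080p vs 1440p: {shares['1080p'] / shares['1440p']:.0f}x")   # ~8x
print(f"1080p vs 4K:    {shares['1080p'] / shares['4K']:.0f}x")      # ~34x
```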
These Nvidia/AMD cards are unprecedented overkill for most gamers. People are begging for cards that can run games at 1080p; Nvidia went straight to 4K, even showing off 8K gaming on the 3090, and now they can't even deliver any cards that run 720p/1080p. Today, we've got AMD releasing the 6600 XT, advertising it as a beast for 1080p gaming [2]. This is what people actually want: affordable and accessible cards to play games on (whether they can keep the 6600 XT in stock remains to be seen, of course). Nvidia went straight Icarus with Ampere; they shot for the sun, and couldn't deliver.
3) More broadly, geopolitical pressure in East Asia, and specifically Taiwan, should concern investors in any company that relies heavily on TSMC (AMD & Apple being the two big ones). Intel may start by fabbing Arc there, but they uniquely have the capacity to bring that production to the West.
Steam will hardly change the 1% status of GNU/Linux desktop.
Many forget that most studios don't bother to port their Android games to GNU/Linux, even though those games are mostly written using the NDK, so plain ISO C and C++, GL, Vulkan, OpenSL, etc. Yet there's still no GNU/Linux version, because the market just isn't there.
The first wave of Steam Decks sold out in minutes. They're now pushing back delivery to Q2 2022. The demand for the device is pretty significant; not new-console large, but it's definitely big enough to be visible in the Steam Hardware Survey upon release later this year, despite the vast size of Steam's overall playerbase.
Two weeks ago, the Hardware Survey reported Linux breaching 1% for the first time ever [1], for reasons not related to the Deck (in fact, it's not obvious WHY Linux has been growing; disappointment in the Win11 announcement may have caused it, but in short, it's healthy, natural, long-term growth). I would put real money up that Linux will hit 2% by the January 2022 survey, and 5% by January 2023.
Proton short-circuits the porting argument. It works fantastically for most games, with zero effort from the devs.
We're not talking about Linux being the majority. But it's definitely looking like it will see growth over the next decade.
It's way better now than it was back then. There was a long period of good ports which, combined with the Steam for Linux client, already made Linux gaming a real thing. But instead of fizzling out like the last time there were ports, Linux has now transitioned to running every game without needing a port. There are some exceptions, but they're working on it, and compatibility is huge.
This will grow slowly but steadily now, and is ready to explode if Microsoft makes one bad move (like the crazy Windows 11 hardware requirements, but we'll see).
The biggest danger to that development is GPU prices, and the Intel GPUs can only help there. A competent 200-buck model is desperately needed to keep the PC alive as a gaming platform. Right now it has to run on fumes, on old hardware.
Vista brought a lot of migration to Linux indeed. And the Mac being incapable of playing games is a factor for quite a few people. But until recently, gaming on Linux was limited.
What's the rate of increase though? Has it been linear, 0.05% per year? No: it has increased about 0.25 percentage points since 2019. It's not just that Linux is increasing, it's that its rate of increase is also increasing.
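For anyone who wants to sanity-check that, here's a rough sketch using only the figures mentioned above; the two-year span is an assumption (roughly 2019 to mid-2021):

```python
# Percentage-point changes in Steam's Linux share (not relative percentages).
linear_pace = 0.05      # the hypothetical "0.05% per year" linear pace
observed_gain = 0.25    # roughly the gain since 2019 mentioned above
years = 2               # assumed span, ~2019 to mid-2021

observed_pace = observed_gain / years
print(f"observed pace: {observed_pace:.3f} points/year")                 # 0.125
print(f"vs. the linear hypothesis: {observed_pace / linear_pace:.1f}x")  # 2.5x
```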
How is it wishful thinking? It's a proportional increase. Meaning that it has increased in both absolute and relative terms, and done so at a faster rate in the last two years. I'm stating an observation of the data. I don't see room for wishful thinking in that.
Other platforms can increase while Linux also increases, the two aren't mutually exclusive.
I'm not saying it's going to take over the market. It's a huge long shot that it will ever really rival Windows. Even if it makes significant headway on Steam, that also doesn't necessarily translate to a corresponding change in OS market share. But the pending arrival of the Steam Deck combined with an enormous increase in game compatibility has nonetheless set the stage for significant gains for Linux with Steam.
Which part of the above observations are wishful thinking?
I think talking about Android games is moving the goalposts a bit. There's no market for Android games on Windows either.
On my end (and my Year of the Linux Desktop started almost a decade ago), the Linux experience has not only never been better (of course), but has improved faster in recent years than ever before, and gaming is one of the fastest-improving areas.
Android only seems to count as Linux when bragging about how Linux has won, yet pointing out that games on the Android/Linux distribution don't get made available on GNU/Linux distributions is somehow moving the goalposts.
Not disagreeing with your overall point, but it's pretty rare for people to port their mobile game to PC even when using Unity and all you have to do is figure out the controls, which you've probably already got in a beta version just from developing the game.
>Steam will hardly change the 1% status of GNU/Linux desktop.
I agree. But it will change the status in the listings. Steam Deck and SteamOS appliances should be broken out into their own category, and I could easily see them overtaking the Linux desktop.
… Nvidia did release lower-end cards that target the same market and price point as the 6600 XT a lot earlier than AMD though. As far as MSRP goes, the 3060 and 3060 Ti bracket the 6600 XT's $380 at $329 and $399 (not that MSRP means a thing right now) and similarly bracket its performance, and even the MSRP was not received well in conjunction with the 1080p marketing. Both manufacturers have basically told the mid- and low-range market to buy a console, even if you are lucky enough to get an AMD reference or Nvidia FE card.
I've happily taken their advice and have moved to an Xbox Series S for a good 80% of my gaming needs. What gaming I still do on my PC consists mainly of older games, emulators and strategy games. Although I've been messing with Retroarch/Duckstation on my Xbox, and it's been quite novel and fun to be playing PS1 games on a Microsoft console.
You'd think so, but somehow it hasn't happened. I'm just waiting for PCSX2 to mature a bit more on Xbox, then I'll be playing PS2 games on my Xbox too.
Same here, I was looking to buy a GPU, and for half the cost of a decent one (at current market prices) I instead bought an Xbox Series X and a Game Pass subscription. My gaming needs have been covered since February, and Game Pass keeps delivering while GPU prices are still too high for a not-very-serious gamer like myself.
The only game I've had to buy was Forza Motorsport 7, which is deeply discounted because it's leaving the service. Oh, and I couldn't resist Splinter Cell (the original Xbox version) and Beyond Good and Evil, which are also discounted to like 4 Euros. I might just end up buying the whole Splinter Cell series.
Otherwise, I have a list of 100 games installed that Game Pass has given me access to, which I'm slowly playing through. Today we're getting Humankind (on PC), and in two days, Twelve Minutes. It's a ridiculously good deal.
Intel sells expensive CPUs which are becoming useless thanks to ARM - as much in consumer devices as they are in datacenters, with big players designing their own ARM chips. GPUs are their lifeboat. Three GPU players is better than two, but I don't see much of a reason to be long Intel.
You’re overestimating the role of the architecture. X86 is just fine, and is perfectly competitive at comparable node generations. Don’t believe everything Apple tells you. ;)
I don't believe much of anything Apple tells me. x86 is fine, but any reasons to prefer it to other architectures are disappearing fast. As someone who has suffered (literally suffered) due to Intel's abysmal and execrable graphics performance in the past, I don't expect that they'll exactly blow out this market.
One of the biggest reasons I want a real next-gen ARM-based Surface Pro is that I want to put Intel in the rearview mirror forever. I didn't hate Intel until I started buying Surfaces; then I realized that everything that sucks about the Surface family is 100% Intel's fault, from beyond-buggy, faulty power management (cutting advertised battery life by more than half) to buggy and broken graphics ("Intel droppings", i.e. redraw artifacts on the screen) to "integrated" graphics performance that simply sucks so badly it's unusable for simple 3D CAD, much less gaming.
IMO, the long position on Intel is basically a bet on China invading Taiwan, or the US gov't subsidizing Intel. Both are certainly extreme events, but given Chinese military expansion and US increase in gov't spending, it doesn't seem impossible.
China doesn't even have to invade: just an ever incrementally increasing amount of soft pressure, punctuated by slightly larger actions calibrated to fall just below a provocation that requires a response. See their naval exercises for just one example.
Didn't they already seed the government with Chinese officials last year? There were tons of riots, etc., but there are so many 'bad' things in the news that we don't focus on it anymore.
There are also more reasons to buy Nvidia GPUs than just their speed: they have great driver support, features like RTX Voice (which cleans up audio), and other proprietary features like DLSS. Though I would argue that desktop gaming is becoming a thing of the past, because mobile gaming is that much more powerful and easier to use and set up.
> These Nvidia/AMD cards are unprecedented overkill for most gamers. People are begging for cards that can run games at 1080p; Nvidia went straight to 4K, even showing off 8K gaming on the 3090, and now they can't even deliver any cards that run 720p/1080p. Today, we've got AMD releasing the 6600 XT, advertising it as a beast for 1080p gaming [2]. This is what people actually want: affordable and accessible cards to play games on (whether they can keep the 6600 XT in stock remains to be seen, of course). Nvidia went straight Icarus with Ampere; they shot for the sun, and couldn't deliver.
Do they need dedicated GPUs at all then? My impression has been that if you just want to go at 1080p and modest settings modern integrated GPUs can do fine for most games.
It depends on the game; for something like CS:GO or Overwatch, an integrated GPU @ 1080p is fine. For something like Cyberpunk, Apex Legends, or Call of Duty, it's really not enough for an enjoyable experience.
I don't own any INTC, just index funds right now, but I tip my hat to you on Intel. Buying Intel today is essentially like gobbling up American apple pie. If you love China, buy paper design firms AMD or Apple. If you like the USA, buy Intel or maybe a paperweight in Nvidia, as they're diversified at Samsung and in talks to produce at Intel.
I'm expecting China-Taiwan-US tensions to increase, and all these outsourcing profit seeking traitors will finally eat crow. As if "just getting rich" wasn't enough for them.
My stock bets and opinions don't have to necessarily be pro-American (certainly wouldn't buy anything outright anti-American). But I love it when they align like today.
That depends on how much of TSMC's capacity they've reserved, and which node they are manufacturing on. I'm guessing that a low end GPU doesn't need a 5nm node. I think these are 7nm, which is still going to be booked tightly but probably not as heavily bottlenecked as smaller nodes.
The '+' in this case is a common process node trope where improvements to a node over time that involve rules changes become Node+, Node++, Node+++, etc. So this is a node that started as Samsung 10nm, but they made enough changes to it that they started marketing it as 8nm. When they started talking about it, it wasn't clear if it was a more manufacturable 7nm or instead a 10nm with lots of improvements, so I drop the 10nm++++ to help give some context.
How is it 'much better'? 7nm is not better than 8nm because it has a smaller number - the number doesn't correlate strongly with transistor density these days.
Did you bother trying to do any research or comparison between TSMC's 7nm & Samsung's 8nm, or did you just want to make the claim that numbers are just marketing? Especially since numbers alone were not what was being discussed, but two specific fab processes, so the "it's just a number!" mistake wasn't obviously being made in the first place.
But Nvidia has Ampere on both TSMC 7nm (GA100) and Samsung's 8nm (GA102). The TSMC variant has a significantly higher density at 65.6M / mm² vs. 45.1M / mm². Comparing across architectures is murky, but we also know that the TSMC 7nm 6900 XT clocks a lot higher than the Samsung 8nm RTX 3080/3090 while also drawing less power. There's of course a lot more to clock speeds & power draw in an actual product than the raw fab transistor performance, but it's still a data point.
So there's both density & performance evidence to suggest TSMC's 7nm is meaningfully better than Samsung's 8nm.
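For concreteness, the density gap alone works out like this; a minimal sketch using just the two numbers quoted above (real-world scaling obviously depends on far more than density):

```python
# Ampere transistor densities quoted above, in Mtransistors / mm^2.
tsmc_7nm_ga100 = 65.6
samsung_8nm_ga102 = 45.1

ratio = tsmc_7nm_ga100 / samsung_8nm_ga102
print(f"TSMC 7nm GA100 is ~{(ratio - 1) * 100:.0f}% denser than Samsung 8nm GA102")  # ~45%
```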
Even going off of marketing names, Samsung has a 7nm as well and they don't pretend their 8nm is just one-worse than the 7nm. The 8nm is an evolution of the 10nm node while the 7nm is itself a new node. According to Samsung's marketing flowcharts, anyway. And analysis suggests Samsung's 7nm is competitive with TSMC's 7nm.
Nvidia doesn't set the price on any cards other than the Founders Editions, which you'll notice they both drastically cut down availability on and didn't offer at all for the "price sensitive" mid-range tier.
Nvidia's pricing as a result is completely fake. The claimed "$330 3060" in fact starts at $400 and rapidly goes up from there, with MSRPs on 3060s as high as $560.
I didn't say Nvidia directly prices cards. Doesn't sound like you are doing a very good job of following the HN rule: always give the most gracious possible reading of a comment. Nothing I said directly implied that they did; you just wanted to pick a bone. It's really quite rude to put words in people's mouths, and that's why we have this rule.
But a 6900XT is available for $3100 at my local store... and the 3090 is $2100. Between the two it's not hard to see why the NVIDIA cards are selling and the AMD cards are sitting on the shelves, the AMD cards are 50% more expensive for the same performance.
As for why that is - which is the point I think you wanted to address, and decided to try and impute into my comment - who knows. Prices are "sticky" (retailers don't want to mark down prices and take a loss), and AMD moves fewer cards in general. Maybe that means that prices are "stickier for longer" with AMD. Or maybe it's another thing like Vega, where AMD set the MSRP so low that partners can't actually build and sell a card for a profit at competitive prices. But in general, regardless of why, the prices for AMD cards are generally higher, and when they go down the AMD cards sell out too. The inventory that is available is available because it's overpriced.
(and for both brands, the pre-tariff MSRPs are essentially a fiction at this point apart from the reference cards and will probably never be met again.)
> But a 6900XT is available for $3100 at my local store... and the 3090 is $2100.
That's just your store being dumb, then. The 6900 XT is averaging about $1,500 brand new on eBay[0] while the 3090 is going for about $2,500[1]. Even on Newegg, the cheapest in-stock 6900 XT card is $1,700[2] while the cheapest 3090 is $3,000[3]. Everything I've read suggests that the AMD cards, while generally a little slower than their Nvidia counterparts (especially when you factor in ray-tracing), give you way more bang for your buck.
> the prices for AMD cards are generally higher
This is just not true. There may be several reasons for the Nvidia cards being out of stock more often than AMD: better performance; stronger brand; lower production counts; poor perception of AMD drivers; specific games being optimized for Nvidia; or pretty much anything else. But at this point, pricing is set by supply and demand, not by arbitrary MSRPs set by Nvidia/AMD, so claiming that AMD cards are priced too high is absolutely incorrect.
This is a problem for AMD especially, but also Nvidia. Not so much for Intel. They're just budging in line with their superior firepower. Intel even bought out first dibs on TSMC 3nm out from under Apple. I'll be interested to see the market's reaction to this once everyone realizes that Intel is hitting AMD where it hurts and sees the inevitable outcome.
This is one of the smartest moves by Intel, make their own stuff and consume production from all their competitors, which do nothing but paper designs. Nvidia and especially AMD took a risk not being in the fabrication business, and now we'll see the full repercussions. It's a good play (outsourcing) in good times, not so much when things get tight like today.
Do you have sources for any of your claims? Other than going fabless being a fantastic way to cut costs and management challenges while increasing long-term supply-line risk, none of that is anything I've heard. Here are sources for my claims.
I see zero upside with these developments for AMD, and to a lesser degree, Nvidia, who are better diversified with Samsung and also rumored to be in talks with fabricating at Intel as well.
This doesn't really work. If there is more demand, they'll build more fabs. It doesn't happen overnight -- that's why we're in a crunch right now -- but we're talking about years of lead time here.
TSMC is also not stupid. It's better for them for their customers to compete with each other instead of having to negotiate with a monopolist, so their incentive is to make sure none of them can crush the others.
> I see zero upside with these developments for AMD, and to a lesser degree, Nvidia
If Intel uses its own fabs, Intel makes money and uses the money to improve Intel's process which AMD can't use. If Intel uses TSMC's fabs, TSMC makes money and uses the money to improve TSMC's process which AMD does use.
>TSMC is also not stupid. It's better for them for their customers to compete
It depends how much money is involved. I don't think TSMC, or any business, is the master chess player you're envisioning, losing billions in order to worry about AMD's woes. Rather, they'll consider AMD's troubles as something AMD will have to sort out on their own in due time. They're on their own.
Now, AMD and TSMC do have a good relationship. But large infusions of wealth corrupt even the most stalwart companions. This is one of those things that people don't need to debate, we'll see the results on 3nm. At a minimum, it looks like Intel is going to push AMD out of TSMC's leading nodes. There's no way to size this up as good news for AMD.
>If Intel uses TSMC's fabs, TSMC makes money and uses the money to improve TSMC's process which AMD does use.
TSMC is going to make money without Intel anyway. Choking AMD may not be Intel's intention, but it's certainly a guaranteed side effect. Intel makes so much product that they are capable of using up their own fab space, and others'. And now the GPUs are coming. Intel makes 10 times the profit AMD does per year. If AMD didn't have an x86 license, no one would utter these two companies' names in the same sentence.
I expect AMD to start using N3 after Apple and Intel have moved on to N2 (or maybe 20A in Intel's case) in 2024 so there's less competition for wafers.
Indeed. Something that's affordable and hits even RX 580 performance would grab the attention of many. Good enough really is when supply is low and prices are high.
What about design capabilities? If they had it in them, what were they doing all these years? I mean, since 2000 I can't remember a single GPU from Intel that wasn't already behind the market.
Raja Koduri is Intel’s lead architect for their new product line; prior to this he was the lead of the Radeon Technologies Group at AMD, successfully delivering Polaris, Vega and Navi. Navi is AMD’s current GPU product architecture.
You're looking at the wrong numbers. The wafer capacity of memory fabs and logic fabs that are only equipped for older nodes aren't relevant to the GPU market. So Micron, SK hynix, Kioxia/WD and a good chunk of Samsung and TSMC capacity are irrelevant here.
I refuse to engage with the current GPU pricing insanity, so my 5900X is currently paired with a GTX 960. When Intel enters the market it will be another factor driving pricing back down, so I might play Cyberpunk in 2022...
If you really want to play Cyberpunk on PC, and don't want to buy a new GPU.. playing it on Stadia is an option (especially if you have a GPU that can support VP9 decoding). I played it at 4K/1080p, and it looked pretty good. However, I think if you want the best graphics fidelity (i.e. 4K RayTracing), then you probably do want to just get a high end video card.
I wanna get a good Alyx setup to finally try VR, but with the GPU market the way it is, it looks like my RX 480 4GB will be sticking around for another 5 years. It's more expensive now (used) than it was 4 years ago, and even then it was already 2 years old. Batshit crazy; no other way to describe it :(
That's not quite true. The Arc was originally known as the DG2 and is the successor to the DG1. So to say it isn't "remotely related" is a bit misleading, especially since we have very little information on the architecture.
For some comparison, that's a 30W, 80 EU part using 70GB/s memory. DG2 is supposed to be a 512 EU part with over 400GB/s memory. GPUs generally scale pretty well with EU count and memory bandwidth. Plus it has a different architecture, which may be even more capable per EU.
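A naive scaling estimate from just those headline numbers; purely illustrative, since the DG2 specs are still rumors and real performance won't scale linearly:

```python
# DG1 vs. rumored DG2 headline specs from the comment above.
dg1_eus, dg2_eus = 80, 512
dg1_bw_gbps, dg2_bw_gbps = 70, 400    # memory bandwidth in GB/s

print(f"EU count:  {dg2_eus / dg1_eus:.1f}x")          # 6.4x
print(f"Bandwidth: {dg2_bw_gbps / dg1_bw_gbps:.1f}x")  # ~5.7x
# Even if real performance scales well below these ratios, DG2 would land
# far above DG1's roughly-integrated-graphics level.
```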
Just to add on that, DG1 was comparable to integrated graphics, but just in a discrete form factor. It was a tiny bit better because of higher frequency, I think. But even then it wasn't better in all cases, if I recall correctly.
Given that timeline and their years of existing production history with Thunderbolt, Intel could also feasibly beat both of them to shipping USB4 on a graphics card.
Yep, basically anything capable of playing new games at lower settings at 720p, and >3-year-old games at better settings, should be highly competitive in the low-end gaming market. Especially laptops, where they might be a secondary machine for gamers with a high-end desktop.
Anecdotally, I've noticed prices falling on the lower end. My aging RX 580 was worth over $400 used at the beginning of the year; it now goes for ~$300. The 5700 XT was going for close to $1k used, and is more recently selling for $800-900.
With that said, I don't know if it's a sign of the shortage coming to an end; I think the release of the Ryzen 5700G with integrated graphics likely helped bridge the gap for people who wanted low-end graphics without paying the crazy markups.
I remember RX 580s going for 170 Euros before the pandemic. I can only hope that prices reach sane levels sooner than later, but I suspect we're going to see at least another year of outlandish prices.
If they come out swinging here they could have the most deserved smugness in the industry for a good while. People have been rightly criticising them but wrongly writing them off.
It's not "just" a shortage of GPU's but all kinds of components.
And it's also not "just" caused by miners.
But that means, if they are really unlucky, they could launch into a situation where there is a surplus of good second-hand graphics cards and still shortages/price hikes on the GPU components they use...
Though as far as I can tell they are mostly targeting OEMs (any OEM instead of a select few) and other large customers, so it might not matter too much for them for this release (but it probably would from the next one onward).
Probably until at least 2022, because the shortage of GPUs isn't solely because of crypto. Until we generally get back on track at tricking sand into thinking, we're not going to be able to saturate demand.
> Will Intel have a CPU+GPU combo product for laptops?
What? Obviously the answer is yes, how could it possibly be no? CPU+GPU combo is the only GPU related segment where Intel currently has a product.
To be more specific: Intel currently allocates a fair chunk of their dies to iGPUs. Will that no longer be the case when Intel manufactures its own dedicated GPUs? It seems like a waste of silicon.