Hacker News | brenelson's comments

I definitely agree with this one; that's why I use Google search even when I'm looking for specific Reddit content.


All I can hope right now is that Intel will develop GPUs to encourage competition and bring GPU prices back down to competitive levels.


Intel Arc discrete GPUs are aimed at high-end gaming PCs, and should start arriving Q1 2022:

https://en.wikipedia.org/wiki/Intel_Arc

Likely they'll be subject to the same price gouging and limited supply as AMD and NVidia GPUs...


And these seem to be delayed, based on a few early-2022 reports; citing one [1]:

[1] https://www.techtimes.com/articles/270196/20220106/intel-arc...


Can anyone in the space explain why it would be hard for Intel to be competitive in the GPU space? Is GPU development so far away from CPU development that they couldn't succeed?


I'm not "in the space" but there are many reasons a detractor might give for being doubtful.

- Intel is not already experienced in designing high-end graphics cards, so there's less institutional knowledge in the company

- Intel iGPUs have a mediocre reputation

- Intel has a history of failed attempts to enter the GPU/accelerator space (e.g. i740 and Larrabee)

- DG2 has been delayed from the timeline the tech press expected when we first heard about it

- The person leading Intel Xe is Raja Koduri who led AMD graphics during AMD Vega (though it's unclear how responsible he is for its shortcomings) and arguably has a history of overpromising.

That said I'm pretty bullish on Intel GPUs because supply still hasn't caught up with demand, so as long as they are reasonably functional, I think they will succeed.


> - The person leading Intel Xe is Raja Koduri who led AMD graphics during AMD Vega (though it's unclear how responsible he is for its shortcomings) and arguably has a history of overpromising.

Raja was the technical leader of ATI from 2001 onward. He had a hand in every single GPU from that time until Vega.


He went to Apple and then back to AMD.


Does Intel have access to the necessary patents?


Intel has been pretty bad with bringing new products to market and sticking with them.

Intel Optane, Itanium, Xeon Phi, Cell-phone Atom (yes, this was a thing), Intel Edison, etc. etc.

Intel pretty much makes good Ethernet adapters, Wifi adapters, and CPUs these days. Everything else they've tried seems to die off a few years later.

----------

It's like seeing Google try to launch a new free web service. We all know Google has the expertise to make it work, but for some reason, it doesn't.

Similarly, Intel has a wide variety of experts in a number of fields related to chip-making, along with historically successful products and a few chips that are highly important in the field. But something is clearly "off" with Intel's current culture.

---------

Intel has some GPU engineers working for them. The question is whether the organization as a whole can actually release a product (step 1), and then support it for more than 5 years.

I think it's a bit of the "curse of the successful company". No matter what product Intel makes, there's no way it's going to make as much money as Intel's CPUs. So the business minds / bean counters kill the product.


One reason I can think of is that GPUs are a much newer invention than CPUs, and as such there is less history behind how they work and there are far more active patents on their technology.


Try getting another set of Bluetooth earphones, then compare the quality.


It's quite surprising how they took this one.


I suggest diving right into w3schools, you can start building projects while learning at the same time.


It's okay as a reference, but Free Code Camp does the same more effectively.


It would be better if we could take a look at the application before deciding whether to buy. There's too little information on the website to convince us to buy this app.


That's great feedback! I'll work on putting together a before/after; that should better communicate how it works. EDIT: Just added a side-by-side comparison of the page. It's not the cleanest, but hopefully it shows you what it does!


Still, Google owns everything in terms of search engines. You probably can't deny that, but big props to Brave for creating their own unique search engine.


It doesn't make any sense. Why do you need to have 600W to power a graphics card? Why do they need an independent power connector?


I'm not sure I understand your question, but I will try.

>>Why do you need to have 600W to power a graphics card?

You don't. There is no GPU currently that needs that much. But as cards are comfortably approaching 400W and more, a new connector is necessary so you don't end up with GPUs taking 3 or 4 existing PCIe 8-pin power connectors. This single 55-amp compatible connector allows for significantly easier routing of paths on circuit boards.

>>Why do they need an independent power connector?

Because the PCIe slot alone can only provide 75W of power. And if it could provide more, you'd have to deliver that power through the motherboard, so you'd just be moving the connector from one place to another.
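The arithmetic behind the comments above can be sketched out. This is a back-of-envelope illustration, not a spec: it assumes the commonly cited figures of 75W from the PCIe slot, 150W per 8-pin PCIe power connector, and a single connector carrying 55A at 12V.

```python
import math

# Assumed figures (not from the thread itself):
PCIE_SLOT_W = 75    # max power the PCIe slot alone can deliver
EIGHT_PIN_W = 150   # rated power of one 8-pin PCIe power connector

def eight_pin_connectors_needed(gpu_watts: int) -> int:
    """How many 8-pin connectors a card of this power draw would need,
    after subtracting what the slot itself provides."""
    from_connectors = max(0, gpu_watts - PCIE_SLOT_W)
    return math.ceil(from_connectors / EIGHT_PIN_W)

# A single 55-amp connector at 12V can deliver 12 * 55 = 660W,
# comfortably above a 600W budget.
single_connector_w = 12 * 55

print(eight_pin_connectors_needed(450))  # a ~450W card needs 3 x 8-pin
print(single_connector_w)                # 660
```

This shows why a 400W+ card pushes toward three or more legacy connectors, while one high-amperage connector covers the whole budget.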


This looks interesting; I'm interested in the upcoming updates. I do hope it will be user-friendly for people who don't come from technical backgrounds.


Care to elaborate? I thought it was user-friendly enough.

