I've always heard these "conspiracy" rumors since high school. They never made sense to me because of competition... I can't believe that they have the engineers formerly known as ATI that far in the rear-view mirror, especially since the manufacturing processes everyone in silicon works with are the same. So where does this "fairly well understood" assumption come from?


Here's the deal: now that both Intel and AMD are operating on 14nm/16nm, things are going to rebalance in both the CPU and GPU markets. After much deliberation, I have decided to wait for Zen for all my future builds, because I have this strong feeling AMD is about to be back and stronger than ever before.

Basically, Intel was ahead of AMD (ATI), but not anymore...

I welcome more competition in the market either way.


I love AMD and I'm rooting for them, but I'm not sure we are going to have another Athlon run here.

AMD is still having trouble beating out nVidia and Intel. The Zen architecture could in theory have HBM for the system CPU/APU, but I'm guessing they won't.

Finally, a lot of Intel's performance comes from compiler optimization and design, and last time, with Athlon, it took years for that to roll out. Unless Zen is years ahead of Intel, I don't know if they will be able to edge that far ahead. Intel has been doing CPU-GPU pairs for more than 5 years now, and they are getting better.

I think for AMD to win this round, they would need a performance-per-watt ratio that let them dominate the laptop space with APUs and eliminate the discrete GPU as a component. Right now AMD and Intel are both on 14nm, so I think it would be hard to win there, but we will see.

nVidia is on the 16nm process, so if AMD's chips are equal in performance, they might draw less power. However, for VR, nVidia has been working very closely with the various companies on the software side, and I think they will have a strong software advantage for a while.


For Zen, they don't have to be the best, just good enough. Frankly, for consumers, Intel has been good enough since the 2600 days (except on the mobile front). So Zen should be fine in that market.


Good enough, or "we're #2!", might be all that's needed to chip off some server share. Which is good for AMD. But for my machine, I'm not sure I'd give up any single-threaded performance for the promise of 8 physical cores with Sandy Bridge-level IPC.


Because AMD cut Linux support for their mid-range GPUs 2 or 3 years after their release, they can royally go fuck themselves.

They would really need to do something groundbreaking for me to even consider them.

Meanwhile, I'm on an Nvidia GTX 460 playing Witcher 3 on high settings (yes, sometimes cutscenes, of all things, stutter at the beginning, but otherwise for a 2010 GPU it runs at ~60fps), still getting updates, and Linux "Just Werks(TM)". In a year I'll prolly upgrade...


I'm dealing with this right now - I (foolishly) bought a Radeon 390 instead of an Nvidia card of the same price, and the Linux drivers for it are complete clown shoes. Hell, Ubuntu 16 doesn't even have fglrx. The open-source drivers kinda-sorta work, but they're definitely not good for gaming, and I get a whole bunch of tearing.


I'm sorry to hear that you're having issues with gaming on the open-source driver. May I ask where the issues are, specifically? We're still working out the rough edges of OpenGL 4.3 support, so some of the very latest games are going to have issues because of that. Apart from that, I do think we're pretty good at fixing bugs that are actually reported to us, and the (few) places where we're behind in performance are slowly getting better as well.


Tearing is the big one. As soon as a game starts playing, I get a whole bunch of horizontal lines that will continuously shift and flicker as the screen refreshes. It doesn't happen with regular video, but it happens with Kerbal Space Program (not that graphically intensive) and PCSX2 (very graphically intensive).

I don't know nearly enough to say exactly where the problem is, but one of the more frustrating aspects is that there is next to no documentation on troubleshooting such issues. It might be a problem on my end for all I know, but all that I can tell is that before updating to Ubuntu 16, fglrx worked okay, and after updating, playing games is extremely aggravating.

I would like to say that the drivers are a hell of a lot better than they were in 2011, though - last time I had to deal with the open-source drivers, it was "Display minimum functionality so that you have the ability to install the proprietary drivers," and HDMI support was horrifyingly bad. These days, it's "Works for everything except gaming," which is completely serviceable for me.


I agree that the configuration story needs work and documentation.

What you could try is setting the vblank_mode=3 environment variable to force V-Sync. If that helps, it means there's either a stale/bad config file or a bug somewhere in the client API implementation...

Make sure you have no ~/.drirc, and you really shouldn't need an xorg.conf either.
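
For what it's worth, here's a minimal sketch of what I mean (Python, purely illustrative; the game binary name below is just a placeholder, not anything specific), so you can force the variable for a single launch instead of putting it in your shell config:

    import os
    import subprocess

    # Copy the current environment and force V-Sync via Mesa's
    # vblank_mode setting (3 = always synchronize to vblank).
    env = os.environ.copy()
    env["vblank_mode"] = "3"

    # "./KSP.x86_64" is only a placeholder for whatever binary you launch.
    subprocess.run(["./KSP.x86_64"], env=env)

Scoping it per-process like this also makes it easy to compare runs with and without the setting.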


I added vblank_mode=3 to my .bashrc, and then removed my xorg.conf (which was configured for fglrx, I assume from when it had the Catalyst drivers). ~/.drirc didn't exist on my system. No change. :(


Yes, the performance gains the free driver has seen over the last 1-2 years have been incredible.


My next graphics card for my Linux box will definitely be AMD. I want to support a company that has been as supportive of open source as they have. The strides made by the AMD open source driver lately have been very impressive. nVidia comes nowhere close. Remember Linus famously giving nVidia the finger a year or two back?


I'm all for competition, but if I had a dollar for every time someone on the internet said to wait for AMD/ATI's next revolutionary product, only for them not to deliver... Yeah.


> because I have this strong feeling AMD is about to be back and stronger than ever before.

Sorry, but AMD is not coming back in the foreseeable future. It's all about economies of scale, and Intel has them; AMD does not.


It helps if you think of it as cash flow. I remember the first Nvidia card I was given (you needed to test your game on a variety of graphics cards).

This was back in the days of 3dfx and PowerVR, you know, when there was actual competition in the market. The nVidia card blew them all away; they destroyed the competition. So how do you continue to grow revenue after you've destroyed the competition? Innovation, but not too much (which is what Apple does now, and very well).


Seems like you made some enemies with the truth. Nothing has come out for several years, and the prices on the high-end cards haven't moved an inch because AMD had nothing to compete with. Nvidia absolutely sits on tech and milks everything, and always has - markets with zero competition always work this way; see local cable company monopolies for great examples.



