Crashes, Hangs and Crazy Images by Adding Zero: Fuzzing OpenGL Shader Compilers (medium.com/afd_icl)
68 points by ingve on Dec 1, 2016 | 23 comments



Exploits in shader compilers will likely be a big avenue for exploiting platforms in the future, because basically all desktop systems use Nvidia, and an exploit of the shader compiler can usually get you direct root access.


On the Mac it's worse: the drivers are not NVIDIA's (which are relatively high quality) but Apple's own, and they have become horrendously bad in the past few years. I've lost count of the pieces of just plain broken functionality and the hard system crashes I've had to deal with while doing graphics programming with Apple drivers, including a random out-of-bounds memory read (security sensitive? who knows!) that was particularly fun.

As far as I can tell, Apple has set a new record for how quickly you can go from the best OpenGL drivers in the industry to the worst.


Well, this is likely due to the refocus on "Metal". Which is just DirectX _de novo_.

It's not only the bugs, it's the lack of features. The implementation is out of date, to the point that companies are no longer releasing games for the platform. For instance, Frontier has an Elite: Dangerous version for the Mac, but it doesn't have the expansions, because they need compute shaders, which Apple's crappy OpenGL implementation doesn't support.

Not that Macs are good gaming machines. But this only compounds the problem.


> Which is just DirectX _de novo_.

Except without the market share in high end gaming/graphics that made D3D viable.

I wish they would realize that the people they're hurting with this decision are not their competitors (who are actually probably happy to see them underinvest in good GL drivers) but rather us developers, who do have to develop cross-platform apps for market share reasons and so end up shouldering the costs of needless fragmentation.


Sorry, but this has been bothering me for a long time and I would like to address it here.

The myth that Apple writes their own drivers for non-Apple GPUs is absolutely untrue, and I am amazed that it is so widespread when there is very little justification for it.

Apple's involvement with the graphics stack is about the same as Microsoft's: they provide the common APIs and the OS support, but the hardware is the vendor's responsibility.

There was precisely one moment in time when this was even partially true: when Intel had subpar GPUs and offensively bad drivers. Since then, the work has transitioned back to Intel.

Nvidia writes the drivers for the Nvidia GPUs. AMD writes the drivers for the AMD GPUs.

Why are their drivers horrendously bad? Well, for Nvidia, Apple is such an increasingly small market that there is very little financial incentive for Nvidia to care. For AMD, there is slightly more of a reason to care because any market is a good market for them, and I speculate that this is a large part of the reason we've seen fewer and fewer Nvidia cards in Apple products: Apple can get AMD to care, but not Nvidia. When you only use non-Intel or non-Apple GPUs in a very tiny percentage of your products, it's hard to justify paying a ton of money out to incentivize a vendor to care.

There is very little reason why AMD or Nvidia or even Intel would share enough details about their recent hardware with a company that is becoming increasingly self-sufficient.

There are plenty of feedback loops at play here, and plenty enough reasons to see why non-Apple graphics drivers on Apple platforms would be poor. I can't for the life of me see why the assumption that Apple is writing these drivers would make sense.


It's actually some combination of Apple and NVIDIA drivers, but on OS X, unlike pretty much every other OS, Apple insists on owning the OpenGL entry points and doing a bunch of work in its own driver before calling down into the vendor driver's OpenGL entry points. The vendor driver then has to undo some of the "helpful" things that Apple did, and work around whatever bugs are in the Apple code. For NVIDIA, in particular, the OpenGL driver is pretty much the same between Windows and Linux, and so one shouldn't be much more broken than the other, at least when it comes to core OpenGL functionality. But the OS X driver can't take as much advantage of the common code because of the Apple code that it has to deal with.


Measured in install base on desktop/laptop, Intel is much more popular than NVIDIA.

Shader compiler bugs are pretty serious, but they should be covered by exploit mitigations. The compiler takes GLSL or SPIR-V as input and emits GPU machine code blobs; it doesn't need any special privileges. It's just a big attack surface.
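
To make that surface concrete, this is roughly the whole path from untrusted input to the driver's compiler. A sketch only: it assumes a current OpenGL context and an initialized loader (GLEW here, but any GL 2.0+ loader works), and try_compile is an illustrative name, not a real API.

    // Sketch: untrusted GLSL text goes straight into the driver's compiler.
    // Assumes a current OpenGL context and an initialized function loader.
    #include <cstdio>
    #include <GL/glew.h>

    bool try_compile(const char* untrusted_glsl) {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &untrusted_glsl, nullptr);
        glCompileShader(shader);                  // the vendor's compiler runs here

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[4096];
            glGetShaderInfoLog(shader, sizeof log, nullptr, log);
            std::fprintf(stderr, "compile failed:\n%s\n", log);
        }
        glDeleteShader(shader);
        return ok == GL_TRUE;
    }

Everything after glCompileShader executes inside the vendor's code, with whatever privileges the GL stack happens to have on that platform.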


That still gives the attacker the ability to emit arbitrary GPU machine code.

I am not familiar with the details of GPUs, but it would not surprise me if arbitrary code execution on a GPU can lead to root access on the main CPU. Anyone know if there are any protections against this (or if it is architecturally impossible)?


GPU shaders are, by design, arbitrary GPU code execution. You don't have to do anything special or create an attack per se. Shaders can't lead to execution of CPU code on their own; they are blackboxed and only have access to (limited) GPU memory.

I can imagine the potential for a combined CPU-GPU exploit, but it's not likely to hinge on a shader. I wouldn't say absolutely "impossible" but if I was looking for GPU exploits, I'd poke at the compiler and API first.


I should mention that I once spoke to the head engineer for Nvidia graphics cards, and somebody asked him what happens if you try to render a texture filled with NaNs (a common result of divide by zero). He said it renders with the color "Nvidia green".


They look black to me...


Are you rendering natively or through ANGLE? OpenGL or DirectX? Which kind of NaNs?


Native OGL 4.3, on a 970M. The NaNs are qNaNs, part of a quad texture.
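
For anyone who wants to run the same experiment, a shader along these lines should do it. This is a sketch (GLSL in a C++ raw string); dividing by a uniform that the host sets to 0.0 keeps the compiler from constant-folding the NaN away, and whatever colour you then see is the driver's choice.

    // Sketch of a fragment shader that fills the output with quiet NaNs at
    // runtime. How (or whether) the NaNs show up on screen is driver-dependent.
    const char* nan_frag = R"(#version 430
    uniform float zero;          // host sets this to 0.0
    out vec4 color;
    void main() {
        float nan = zero / zero; // 0.0 / 0.0 -> quiet NaN on IEEE-following hardware
        color = vec4(nan, nan, nan, 1.0);
    }
    )";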


Content-free article, full of "read the next post"


I imagine a lot of the specific detail has been omitted to avoid this exploit being used for malicious purposes.

EDIT: Followup post is here: https://medium.com/@afd_icl/first-stop-amd-bluescreen-via-we...


Much better, thanks for posting


I wouldn't say content-free, but content-lite. It still covers the basic ideas and tools used, but doesn't go in depth yet.

Disclaimer: The article author was both a lecturer and supervisor to me during my undergraduate degree, but I was in no way involved in the work discussed in the article.



In a perfect world the hardware manufacturers would fuzz their own drivers before release, rather than waiting for someone else to do it and then ignoring their bug reports.
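
The article's title hints at the basic mutation: wrap an expression as "expr + 0.0", which shouldn't change anything, then recompile and re-render to see whether the driver disagrees with itself. A toy string-level sketch of that idea follows; real tooling would do this on the AST so the shader stays valid, and add_zero_to_random_literal is just an illustrative name.

    // Toy sketch of the "add zero" idea: wrap one floating-point literal in a
    // known-good shader as "(literal + 0.0)", which should not change what the
    // shader computes. The regex version here only illustrates the shape of
    // the transformation.
    #include <random>
    #include <regex>
    #include <string>
    #include <vector>

    std::string add_zero_to_random_literal(const std::string& src, std::mt19937& rng) {
        static const std::regex float_lit(R"(\d+\.\d+)");
        std::vector<std::smatch> hits;
        for (auto it = std::sregex_iterator(src.begin(), src.end(), float_lit);
             it != std::sregex_iterator(); ++it)
            hits.push_back(*it);
        if (hits.empty()) return src;                    // nothing to mutate

        const std::smatch& m = hits[rng() % hits.size()];
        std::string out = src;
        out.replace(m.position(0), m.length(0),
                    "(" + m.str(0) + " + 0.0)");         // semantically a no-op
        return out;
    }

Feed each mutant to the compiler, render it, and compare the image against the original; any crash, hang, or visibly different picture is a driver bug by construction.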


Wasn't this type of thing a concern back in 2011?

Microsoft talked about the dangers of this back then. https://blogs.technet.microsoft.com/srd/2011/06/16/webgl-con...


Now I'm wondering if it's possible to create pathological images that should render vastly differently in the face of small changes. Specifically, anything which responds so nonlinearly to alteration that the small floating point variance mentioned could produce drastically new results.
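
Something like this should work (a sketch, not from the article): iterate a map that is chaotic, so tiny per-driver rounding differences get amplified into completely different pixels. The uv input and the C++ raw-string wrapper are assumptions.

    // Sketch of a deliberately ill-conditioned fragment shader: it iterates
    // the logistic map in its chaotic regime, so rounding differences between
    // two "equivalent" compilations compound into very different colours.
    const char* chaotic_frag = R"(#version 330
    in vec2 uv;                      // assumed 0..1 quad coordinates from the vertex shader
    out vec4 color;
    void main() {
        float x = uv.x * 0.5 + 0.25;     // seed in (0.25, 0.75)
        float r = 3.9 + uv.y * 0.09;     // chaotic regime of the logistic map
        for (int i = 0; i < 200; ++i)
            x = r * x * (1.0 - x);       // rounding differences compound here
        color = vec4(x, fract(x * 7.0), fract(x * 13.0), 1.0);
    }
    )";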


That sounds similar in concept to the CSS Acid tests

http://origin.arstechnica.com/staff.media/acidtrip.gif


That reminds me of the time I was developing shaders for OpenGL ES on Android: since there were barely any shader debugging tools on that platform, virtually the only option was debugging 'by eye'.
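
For anyone who hasn't had the pleasure, 'by eye' debugging usually means routing the value you suspect straight into the fragment colour. A sketch in GLSL ES 1.00 (roughly that era of Android), inside a C++ raw string; the varying and uniform names are made up.

    // Sketch of "debugging by eye": instead of shading with the suspect value,
    // visualize it. Here a diffuse lighting term is mapped so that red means
    // negative and green means positive, making sign errors visible on screen.
    const char* debug_frag = R"(precision mediump float;
    varying vec3 v_normal;                 // assumed varying from the vertex shader
    uniform vec3 u_lightDir;               // assumed normalized light direction
    void main() {
        float ndotl = dot(normalize(v_normal), u_lightDir);  // value under suspicion
        gl_FragColor = vec4(max(-ndotl, 0.0), max(ndotl, 0.0), 0.0, 1.0);
    }
    )";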



