
WebGL/OpenGL doesn't use sRGB in the shaders. If you sample an sRGB texture, or render to an sRGB surface, the API automatically applies the sRGB decode curve (or, on write, the encode curve), so the shader only ever sees linear values.
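A minimal WebGL2 sketch of the texture side of that, assuming an already-created context `gl` and a decoded `image` (both hypothetical names): tagging the texture's internal format as SRGB8_ALPHA8 is what tells the API to decode on sampling, so texture() in the shader returns linear values; with plain RGBA8 the raw sRGB bytes would come through unconverted.

    declare const gl: WebGL2RenderingContext;   // assumed existing context
    declare const image: HTMLImageElement;      // assumed decoded sRGB image

    const tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texImage2D(
      gl.TEXTURE_2D,
      0,                 // mip level
      gl.SRGB8_ALPHA8,   // internal format: storage is sRGB-encoded,
                         // so sampling decodes to linear automatically
      gl.RGBA,           // source pixel format
      gl.UNSIGNED_BYTE,  // source pixel type
      image
    );
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);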



Correct. The shader just works with numbers, with no colorspace information attached; sRGB decoding and encoding happen during the (texture) read and write stages.
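For reference, a sketch of the piecewise sRGB transfer functions that the spec has the hardware apply at those read (decode) and write (encode) stages; the constants are from the sRGB standard, and the shader only ever sees the linear side.

    // Read stage: sRGB-encoded value in [0, 1] -> linear.
    function srgbToLinear(c: number): number {
      return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Write stage: linear value in [0, 1] -> sRGB-encoded.
    function linearToSrgb(c: number): number {
      return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
    }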



