Framebuffer is interpreted as BGRA instead of RGBA on nVidia, GL_RGB10_A2

Number:rdar://36824694 Date Originated:01/24/2018
Status:Open Resolved:NO
Product:Mac Product Version:10.13
Classification:OpenGL/CA Reproducible:YES
glClearColor() produces different channel ordering across GPUs when clearing a GL_FRAMEBUFFER that is backed by a GL_RGB10_A2 texture created with CGLTexImageIOSurface2D().
glClearColor() behaves differently on Mac/AMD GPUs than on nVidia: the nVidia driver swaps the B and R channels, i.e. it interprets the framebuffer as BGRA instead of RGBA. The nVidia behaviour is the incorrect one.

Steps to Reproduce:

Please see the code in

Expected Results:
All architectures should show the same colors, i.e. interpret the framebuffer with the same channel endianness.

Actual Results:
When the Red channel is set in glClearColor(), nVidia presents a blue output, whereas Intel/AMD correctly presents red.

See attached system reports and photo.


A few more details on hardware and versions, since there are no attachments on this radar. The "Intel" machine, which works fine, is a MacBook Pro (Retina, 13-inch, Early 2015) running 10.13.2, with an Intel Iris Graphics 6100 (1536 MB). My favourite program, Chrome, reports GL_VERSION 4.1 INTEL-10.30.12.

The "nVidia" machine, which swaps the channels, is a MacBook Pro (Retina, 15-inch, Late 2013) running 10.13.1, with an Intel Iris Pro (1536 MB) and an nVidia GeForce GT 750M (2 GB). With "automatic graphics switching" disabled in System Preferences, Chrome reports GL_VERSION 4.1 NVIDIA-10.26.6 355.

The attached photo is (
