Core Image rendering to a Metal texture does not expose compositing options, as OpenGL does.

Originator:raphael
Number:rdar://26038856 Date Originated:02-May-2016 02:50 PM
Status:Open Resolved:
Product:OS X SDK Product Version:10.11.x
Classification:Feature (New) Reproducible:Not Applicable
 
When rendering to a Metal texture using the Core Image API,

	-[CIContext render:toMTLTexture:commandBuffer:bounds:colorSpace:]

the CIImage contents replace the texture contents. There is no way to specify basic blending options, as there is when using Core Image with OpenGL:

	// Source-over blending on top of the existing framebuffer
	// contents; no depth writes or depth testing.
	glDepthMask(GL_FALSE);
	glDisable(GL_DEPTH_TEST);
	glViewport(0, 0, (GLsizei)bounds.size.width, (GLsizei)bounds.size.height);
	glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
	glEnable(GL_BLEND);
	
	// Orthographic projection matching the view bounds.
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	glOrtho(0, bounds.size.width, 0, bounds.size.height, -1, 1);
	
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	
	// Core Image draws into the current GL context and honors the blend state.
	CGRect r = CGRectIntersection(image.extent, drawable.extent);
	[context drawImage:image inRect:r fromRect:r];


This blending capability is available within the Metal render command encoder used internally by Core Image.
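In the meantime, the blend can be performed manually: render the CIImage into a scratch texture with the existing API, then composite that texture over the destination in a blend-enabled render pass. A minimal sketch of that workaround follows; the shader names `vertexPassthrough`/`fragmentTexture` (a full-screen quad pass) and the surrounding `device`, `library`, `commandBuffer`, `ciContext`, `image`, `target`, and `colorSpace` variables are assumed, not part of the report:

	// 1. Render the CIImage into a scratch texture (its contents are replaced).
	MTLTextureDescriptor *desc =
	    [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:target.pixelFormat
	                                                       width:target.width
	                                                      height:target.height
	                                                   mipmapped:NO];
	desc.usage = MTLTextureUsageShaderRead | MTLTextureUsageRenderTarget;
	id<MTLTexture> scratch = [device newTextureWithDescriptor:desc];
	
	[ciContext render:image
	     toMTLTexture:scratch
	    commandBuffer:commandBuffer
	           bounds:image.extent
	       colorSpace:colorSpace];
	
	// 2. Composite scratch over the target with source-over blending,
	//    matching glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA).
	MTLRenderPipelineDescriptor *pd = [MTLRenderPipelineDescriptor new];
	pd.vertexFunction   = [library newFunctionWithName:@"vertexPassthrough"];   // assumed shader
	pd.fragmentFunction = [library newFunctionWithName:@"fragmentTexture"];     // assumed shader
	pd.colorAttachments[0].pixelFormat = target.pixelFormat;
	pd.colorAttachments[0].blendingEnabled = YES;
	pd.colorAttachments[0].sourceRGBBlendFactor        = MTLBlendFactorOne;
	pd.colorAttachments[0].destinationRGBBlendFactor   = MTLBlendFactorOneMinusSourceAlpha;
	pd.colorAttachments[0].sourceAlphaBlendFactor      = MTLBlendFactorOne;
	pd.colorAttachments[0].destinationAlphaBlendFactor = MTLBlendFactorOneMinusSourceAlpha;
	id<MTLRenderPipelineState> pipeline =
	    [device newRenderPipelineStateWithDescriptor:pd error:NULL];
	
	MTLRenderPassDescriptor *pass = [MTLRenderPassDescriptor renderPassDescriptor];
	pass.colorAttachments[0].texture     = target;
	pass.colorAttachments[0].loadAction  = MTLLoadActionLoad;   // keep existing contents
	pass.colorAttachments[0].storeAction = MTLStoreActionStore;
	
	id<MTLRenderCommandEncoder> enc =
	    [commandBuffer renderCommandEncoderWithDescriptor:pass];
	[enc setRenderPipelineState:pipeline];
	[enc setFragmentTexture:scratch atIndex:0];
	[enc drawPrimitives:MTLPrimitiveTypeTriangleStrip vertexStart:0 vertexCount:4];
	[enc endEncoding];

This costs an extra texture allocation and render pass per frame, which is exactly the overhead a built-in compositing option would avoid.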

I would like to suggest exposing an API that makes it possible to composite Core Image rendering onto existing texture contents, bringing it on par with the OpenGL implementation.
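For illustration only, here is one hypothetical shape such an API could take; the `options:` parameter and the blend-mode key are invented for this sketch and do not exist in the SDK:

	// Hypothetical options-based variant of the existing render method.
	// kCIContextRenderBlendMode is an invented key, for illustration only.
	[context render:image
	   toMTLTexture:texture
	  commandBuffer:commandBuffer
	         bounds:bounds
	     colorSpace:colorSpace
	        options:@{ kCIContextRenderBlendMode : @"sourceOver" }];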
