How to use IOSurface or GL_EXT_framebuffer_object instead of NSOpenGLPixelBuffer on the Mac?

I have an older project that uses Quartz Composer and OpenGL, but Xcode 13 has deprecated both of these components, so I can no longer capture off-screen images during video production.

The previous code that creates the NSOpenGLContext is as follows:

    - (id)initOffScreenOpenGLPixelsWide:(unsigned)width pixelsHigh:(unsigned)height
    {
        //Check parameters - rendering at sizes smaller than 16x16 will likely produce garbage
        if ((width < 16) || (height < 16)) {
            [self release];
            return nil;
        }

        self = [super init];
        if (self != nil) {
            NSOpenGLPixelFormatAttribute pixattributes[] = {
                NSOpenGLPFADoubleBuffer,
                NSOpenGLPFANoRecovery,
                NSOpenGLPFAAccelerated,
                NSOpenGLPFADepthSize, 24,
                (NSOpenGLPixelFormatAttribute)0
            };
            _pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:pixattributes];

            //Create the OpenGL context to render with (with color and depth buffers)
            _openGLContext = [[NSOpenGLContext alloc] initWithFormat:_pixelFormat shareContext:nil];
            if (_openGLContext == nil) {
                DDLogInfo(@"Cannot create OpenGL context");
                [self release];
                return nil;
            }

            //Create the OpenGL pixel buffer to render into
            NSOpenGLPixelBuffer *glPixelBuffer = [[NSOpenGLPixelBuffer alloc] initWithTextureTarget:GL_TEXTURE_RECTANGLE_EXT textureInternalFormat:GL_RGBA textureMaxMipMapLevel:0 pixelsWide:width pixelsHigh:height];
            if (glPixelBuffer == nil) {
                DDLogInfo(@"Cannot create OpenGL pixel buffer");
                [self release];
                return nil;
            }
            [_openGLContext setPixelBuffer:glPixelBuffer cubeMapFace:0 mipMapLevel:0 currentVirtualScreen:[_openGLContext currentVirtualScreen]];

            //Release the OpenGL pixel buffer (the context retains it)
            [glPixelBuffer release];

            //Create a buffer pool to hold our frames
            NSMutableDictionary *attributes = [NSMutableDictionary dictionary];
            [attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
            [attributes setObject:[NSNumber numberWithUnsignedInt:width] forKey:(NSString *)kCVPixelBufferWidthKey];
            [attributes setObject:[NSNumber numberWithUnsignedInt:height] forKey:(NSString *)kCVPixelBufferHeightKey];
            CVReturn theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (CFDictionaryRef)attributes, &_bufferPool);
            if (theError != kCVReturnSuccess) {
                DDLogInfo(@"CVPixelBufferPoolCreate() failed with error %d", theError);
                [self release];
                return nil;
            }
        }

        //A context is current on a per-thread basis. Multiple threads must serialize calls into the same context object.
        [_openGLContext makeCurrentContext];
        return self;
    }

This worked by creating an NSOpenGLPixelBuffer object and setting it as the pixel buffer of the NSOpenGLContext. In Xcode 13, however, the NSOpenGLPixelBuffer can no longer be created successfully. The documentation recommends using GL_EXT_framebuffer_object instead, so I tried the following code:

        //RGBA8 color renderbuffer, 24-bit depth renderbuffer, e.g. 256x256
        glGenFramebuffersEXT(1, &fb);
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);

        //Create and attach a color buffer
        glGenRenderbuffersEXT(1, &color_rb);
        //We must bind color_rb before we call glRenderbufferStorageEXT
        glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, color_rb);
        //The storage format is RGBA8
        glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA8, width, height);
        //Attach the color buffer to the FBO
        glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, color_rb);

        //Create and attach a depth buffer
        glGenRenderbuffersEXT(1, &depth_rb);
        glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depth_rb);
        glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24, width, height);
        glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depth_rb);

        //Does the GPU support the current FBO configuration?
        GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
        switch (status) {
            case GL_FRAMEBUFFER_COMPLETE_EXT:
                DDLogInfo(@"gl no problem");
                break;
            default:
                DDLogInfo(@"error");
                break;
        }

        //And now we can render into the FBO
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);

When the program runs, it logs 'gl no problem'.

However, when I read back the off-screen image data, I only get a black image, even though glGetError returns no error code. With the old NSOpenGLPixelBuffer approach, the image rendered by QCRenderer could be read back successfully.

Reading off-screen images is implemented as follows:

    - (CVPixelBufferRef)readPixelBuffer
    {
        //Create a pixel buffer from the pixel buffer pool
        CVPixelBufferRef bufferRef;
        CVReturn theError = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, _bufferPool, &bufferRef);
        if (theError) {
            DDLogInfo(@"CVPixelBufferPoolCreatePixelBuffer() failed with error %d", theError);
            return nil;
        }

        theError = CVPixelBufferLockBaseAddress(bufferRef, 0);
        if (theError) {
            DDLogInfo(@"CVPixelBufferLockBaseAddress() failed with error %d", theError);
            CVPixelBufferRelease(bufferRef);
            return nil;
        }

        void *bufferPtr = CVPixelBufferGetBaseAddress(bufferRef);
        size_t width = CVPixelBufferGetWidth(bufferRef);
        size_t height = CVPixelBufferGetHeight(bufferRef);
        size_t bufferRowBytes = CVPixelBufferGetBytesPerRow(bufferRef);

        CGLContextObj cgl_ctx = [_openGLContext CGLContextObj];
        CGLLockContext(cgl_ctx);

        //Read the pixels back in BGRA 32-bit format - for extra safety, we save/restore the OpenGL state we change
        GLint save;
        glGetIntegerv(GL_PACK_ROW_LENGTH, &save);
        glPixelStorei(GL_PACK_ROW_LENGTH, (GLint)(bufferRowBytes / 4));
        glReadPixels(0, 0, (GLsizei)width, (GLsizei)height, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, bufferPtr);
        flipImage(bufferPtr, width, height, bufferRowBytes);
        glPixelStorei(GL_PACK_ROW_LENGTH, save);

        CGLUnlockContext(cgl_ctx);

        GLenum code = glGetError();
        CVPixelBufferUnlockBaseAddress(bufferRef, 0);
        if (code) {
            CVPixelBufferRelease(bufferRef);
            return nil;
        }

        return bufferRef;
    }
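
flipImage here is just a small helper of mine that mirrors the rows vertically, because glReadPixels returns the image with OpenGL's bottom-left origin while the CVPixelBuffer expects top-down rows. A minimal sketch of such a helper (the names and row-swap approach are illustrative, not my exact code):

    static void flipImage(void *buffer, size_t width, size_t height, size_t rowBytes)
    {
        (void)width; //rowBytes already covers the full row stride
        //Swap row i with row (height - 1 - i), working from both ends toward the middle
        uint8_t *top = (uint8_t *)buffer;
        uint8_t *bottom = top + (height - 1) * rowBytes;
        uint8_t *temp = malloc(rowBytes);
        if (temp == NULL)
            return;
        while (top < bottom) {
            memcpy(temp, top, rowBytes);
            memcpy(top, bottom, rowBytes);
            memcpy(bottom, temp, rowBytes);
            top += rowBytes;
            bottom -= rowBytes;
        }
        free(temp);
    }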

Could an expert advise on how to solve this?
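
For the IOSurface half of the question, my understanding from CGLIOSurface.h is that an IOSurface can be wrapped in a rectangle texture with CGLTexImageIOSurface2D and attached to the FBO as the color attachment, so the rendering lands in memory that CoreVideo can also see. Here is an untested sketch of what I think that looks like (surface and tex are placeholder names; cgl_ctx, fb, width, and height are the same as above) - is this the right direction?

    #import <IOSurface/IOSurface.h>
    #import <OpenGL/CGLIOSurface.h>

    //Create a BGRA IOSurface the size of the render target
    NSDictionary *surfaceProps = @{
        (id)kIOSurfaceWidth           : @(width),
        (id)kIOSurfaceHeight          : @(height),
        (id)kIOSurfaceBytesPerElement : @4,
        (id)kIOSurfacePixelFormat     : @(kCVPixelFormatType_32BGRA),
    };
    IOSurfaceRef surface = IOSurfaceCreate((CFDictionaryRef)surfaceProps);

    //Bind the IOSurface to a rectangle texture
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, tex);
    CGLTexImageIOSurface2D(cgl_ctx, GL_TEXTURE_RECTANGLE_EXT, GL_RGBA,
                           (GLsizei)width, (GLsizei)height,
                           GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, surface, 0);
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 0);

    //Attach the IOSurface-backed texture to the FBO in place of the color renderbuffer
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_RECTANGLE_EXT, tex, 0);

    //After rendering and glFlush(), the same pixels should be visible to CoreVideo,
    //e.g. by wrapping the surface in a pixel buffer:
    //CVPixelBufferRef pixelBuffer = NULL;
    //CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, surface, NULL, &pixelBuffer);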
