1 Reply
      Latest reply on Feb 11, 2020 4:51 PM by baby-rabbit
      SlashAndBurn Level 1 (0 points)

        I'm trying to figure out how to display a web browser inside my iOS VR app (Obj-C, SceneKit and raw OpenGL). The part I'm not fully understanding is how to get the WKWebView to draw its content into a pixel buffer of some sort, so I can use the speed of CVOpenGLESTextureCacheCreateTextureFromImage to convert the pixel data into an OpenGL texture quickly/efficiently and display it on a floating surface.
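        To frame what I mean, this is roughly the conversion step I already have for video frames and would like to feed with web content instead (sketch only; error handling trimmed, and `_textureCache` is assumed to have been created earlier with CVOpenGLESTextureCacheCreate against my EAGLContext):

        ```
        // Sketch: turn a BGRA CVPixelBuffer into a GL texture via the texture cache.
        CVOpenGLESTextureRef texture = NULL;
        CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            _textureCache,
            pixelBuffer,              // CVPixelBufferRef, kCVPixelFormatType_32BGRA
            NULL,
            GL_TEXTURE_2D,
            GL_RGBA,                  // internal format
            (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
            (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
            GL_BGRA,                  // source format
            GL_UNSIGNED_BYTE,
            0,                        // plane index (0 for non-planar BGRA)
            &texture);
        if (err == kCVReturnSuccess) {
            glBindTexture(CVOpenGLESTextureGetTarget(texture),
                          CVOpenGLESTextureGetName(texture));
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        }
        ```

        So the missing piece is purely how to get the web view's content into that pixelBuffer in the first place.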


        I'm already doing something similar with the video portion of my app, but there an AVPlayerItemVideoOutput produces the pixel buffer for me. I can't figure out how to massage the WKWebView's CALayer into a buffer so I can convert it into a texture and then draw it in OpenGL.
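        For reference, the working video path looks roughly like this (sketch; `_videoOutput` is my AVPlayerItemVideoOutput):

        ```
        // Pull the current video frame from the player item output each display tick.
        CMTime itemTime = [_videoOutput itemTimeForHostTime:CACurrentMediaTime()];
        if ([_videoOutput hasNewPixelBufferForItemTime:itemTime]) {
            CVPixelBufferRef pixelBuffer =
                [_videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
            if (pixelBuffer) {
                // ...hand pixelBuffer to the texture-cache conversion above...
                CVBufferRelease(pixelBuffer);
            }
        }
        ```

        I'd love an equivalent "give me the current frame as a CVPixelBuffer" hook for WKWebView.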


        I know it has something to do with drawLayer:inContext: (or maybe renderInContext:), but searching online hasn't been very fruitful.
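        My best guess so far is something like the following: lock a CVPixelBuffer, wrap its base address in a CGBitmapContext, and render the layer into that. This is an untested sketch, and I'm not even sure renderInContext: captures WKWebView's out-of-process content:

        ```
        // Untested sketch: render a CALayer into a BGRA CVPixelBuffer via CGBitmapContext.
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(
            CVPixelBufferGetBaseAddress(pixelBuffer),
            CVPixelBufferGetWidth(pixelBuffer),
            CVPixelBufferGetHeight(pixelBuffer),
            8,                                         // bits per component
            CVPixelBufferGetBytesPerRow(pixelBuffer),
            colorSpace,
            kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); // BGRA
        [webView.layer renderInContext:ctx];  // may come back blank for WKWebView?
        CGContextRelease(ctx);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        ```

        If renderInContext: doesn't work for WKWebView, is snapshotting (e.g. takeSnapshotWithConfiguration:completionHandler:) the only alternative, and is it fast enough to do per-frame?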


        And I'm not using SceneKit the way you'd assume: the app was built before GVR for SceneKit was a thing, so every part of the VR pipeline is handled manually (SceneKit to textures, textures to OpenGL for the left/right-eye distortion mesh).