I'm trying to figure out how to display a web browser inside my iOS VR app (Obj-C, SceneKit, and raw OpenGL). The part I'm not fully understanding is how to get the WKWebView to draw its content into a pixel buffer of some sort, so I can use the speed of CVOpenGLESTextureCacheCreateTextureFromImage to convert the pixel data into an OpenGL texture quickly/efficiently and display it on a floating surface.

I'm already doing something similar with the video portion of my app, but there an AVPlayerItemVideoOutput produces the pixel buffer, and I can't figure out how to massage the CALayer into a buffer so I can convert it into a texture and then draw it in OpenGL. I know it has something to do with drawLayer:inContext:, but searching online hasn't been very fruitful.

And I'm not using SceneKit like you would assume; the app was built before GVR for SceneKit was a thing, so every part of VR is handled manually (SceneKit to textures, textures to OpenGL for the left/right eye distortion mesh).
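For reference, here is a minimal sketch of the approach I'm considering: back a CGBitmapContext with a CVPixelBuffer's memory and ask the layer to rasterize itself into it via renderInContext:, which is the same drawLayer/context machinery. This is untested; the pixel format, color-space choice, and the idea of calling it per frame are my assumptions, not confirmed behavior.

```objc
#import <QuartzCore/QuartzCore.h>
#import <CoreVideo/CoreVideo.h>

// Sketch: rasterize a CALayer into a new CVPixelBuffer that can then be fed to
// CVOpenGLESTextureCacheCreateTextureFromImage. Caller releases the buffer.
static CVPixelBufferRef PixelBufferFromLayer(CALayer *layer, CGSize size) {
    NSDictionary *attrs = @{ (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES,
                             (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
    CVPixelBufferRef buffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, (size_t)size.width, (size_t)size.height,
                        kCVPixelFormatType_32BGRA,
                        (__bridge CFDictionaryRef)attrs, &buffer);
    if (buffer == NULL) { return NULL; }

    CVPixelBufferLockBaseAddress(buffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Wrap the pixel buffer's memory in a bitmap context so layer drawing
    // lands directly in the buffer, with no intermediate copy.
    CGContextRef ctx = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                             (size_t)size.width, (size_t)size.height, 8,
                                             CVPixelBufferGetBytesPerRow(buffer),
                                             colorSpace,
                                             kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    // Core Graphics is flipped relative to UIKit layer coordinates.
    CGContextTranslateCTM(ctx, 0, size.height);
    CGContextScaleCTM(ctx, 1.0, -1.0);

    [layer renderInContext:ctx];

    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    return buffer;
}
```

One caveat I'm aware of: renderInContext: reportedly does not always capture WKWebView's content (it renders out of process), in which case the UIView-level snapshot APIs like drawViewHierarchyInRect:afterScreenUpdates: may be needed instead, at a performance cost.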
I have an old iOS app that doesn't use GoogleVR; instead it rolls its own VR framework. But once I upgraded to iOS 14, the data coming from CMMotionManager is very wobbly, or shaky, even if I place the phone flat on a table. Am I doing something wrong, or do I need to use a different way to access the data? I've seen the official Google VR team state that they had to fix a similar issue, but I'm unable to find their changes, since they are buried inside compiled code for their Unity plugin.
The code in question is at: https://github.com/mgatelabs/SCN-VR/blob/master/scn-vr/CoreMotionTracker.m
The main parts are as follows. I'm starting the process with:
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame: CMAttitudeReferenceFrameXArbitraryZVertical];
And then I'm reading and transforming the data like this:
CMQuaternion currentAttitude_noQ = self.motionManager.deviceMotion.attitude.quaternion;
GLKQuaternion baseRotation = GLKQuaternionMake(currentAttitude_noQ.x, currentAttitude_noQ.y, currentAttitude_noQ.z, currentAttitude_noQ.w);
GLKQuaternion rotationFix = GLKQuaternionMakeWithAngleAndAxis(-M_PI_2, 0, 0, 1);
if (self.landscape) {
    self.orientation = GLKQuaternionMultiply(baseRotation, rotationFix);
} else {
    self.orientation = GLKQuaternionMultiply(rotationFix, baseRotation);
}
And it is all wobbly, which isn't that great for VR.
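In case it helps frame the question: the only workaround I've come up with so far is smoothing the jitter myself with a low-pass filter on the quaternion, sketched below. kSmoothing is a made-up tuning constant and self.smoothedAttitude is an assumed GLKQuaternion property I'd add to the tracker; this damps the noise but adds latency, so I'd rather fix the source data.

```objc
// Sketch: complementary low-pass filter on the attitude quaternion.
// 0 = frozen, 1 = raw (wobbly) sensor data passes straight through.
static const float kSmoothing = 0.15f;

CMQuaternion q = self.motionManager.deviceMotion.attitude.quaternion;
GLKQuaternion raw = GLKQuaternionMake(q.x, q.y, q.z, q.w);

// Slerp from the previous smoothed value toward the new sample so
// high-frequency noise is averaged out while slow head motion gets through.
self.smoothedAttitude = GLKQuaternionSlerp(self.smoothedAttitude, raw, kSmoothing);

GLKQuaternion rotationFix = GLKQuaternionMakeWithAngleAndAxis(-M_PI_2, 0, 0, 1);
self.orientation = self.landscape
    ? GLKQuaternionMultiply(self.smoothedAttitude, rotationFix)
    : GLKQuaternionMultiply(rotationFix, self.smoothedAttitude);
```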
So I have an app that has been around for years now, and with the recent iOS 15 update its support for playing videos hosted on Dropbox has stopped working.
What I do is query Dropbox for the temporary link. Then I create an AVPlayerItem and give it to the player, which is inside an OpenGL view, since it is a VR player. Then it should just work, but right now I'm getting the error: Error Domain=AVFoundationErrorDomain Code=-11828 "Cannot Open" UserInfo={NSLocalizedFailureReason=This media format is not supported., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x280597990 {Error Domain=NSOSStatusErrorDomain Code=-12847
Now I've tried to troubleshoot this, and I can:
Download the file from inside the app (not with the player) and play the file locally.
Place the same file on a UPnP server and it plays (it does not transcode).
Run the app on macOS as an iPad app, and it works. But I suspect something about running on a Mac makes it play differently.
So I'm kind of lost; it feels like iOS 15 is acting strange on the iPhone, but on the Mac it's fine.
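For completeness, this is roughly what my setup looks like, plus the diagnostic step I've tried: loading the asset's "playable" key before handing it to the player, so AVFoundation reports its objection up front. This is a sketch; the URL string is a placeholder, not the real temporary link.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: check whether AVFoundation considers the remote asset playable
// before creating the AVPlayerItem, to surface the -11828 failure earlier.
NSURL *url = [NSURL URLWithString:@"https://dl.dropboxusercontent.com/placeholder.mp4"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"playable", @"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"playable" error:&error];
    if (status == AVKeyValueStatusLoaded && asset.isPlayable) {
        dispatch_async(dispatch_get_main_queue(), ^{
            // Only create the item once the asset is known to be playable,
            // then hand it to the existing player in the OpenGL view.
            AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
            (void)item;
        });
    } else {
        NSLog(@"Asset not playable: %@", error);
    }
}];
```

On the device the "playable" check fails with the same underlying -12847 error, which is why I suspect something about how iOS 15 negotiates the Dropbox stream rather than the file's actual format.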