In ARSessionDelegate's session(_:didUpdate:) callback, two 3rd-generation 12.9-inch iPad Pro devices differ in the orientation of the captured ARFrame image.

func session(_ session: ARSession, didUpdate frame: ARFrame)

I get the pixel buffer and create a UIImage like this:

let pixelBuffer = frame.capturedImage
let image = UIImage(pixelBuffer: pixelBuffer)

We have two iPads, both running iOS 13.2.3. The models are:

iPad (NTEL2B)
iPad (MTEL2LL)

The image from the MTEL2LL device is rotated 180 degrees. What could be the cause?

I use this function to create a UIImage from a CVPixelBuffer:

public convenience init?(pixelBuffer: CVPixelBuffer) {
    if let cgImage = CGImage.create(pixelBuffer: pixelBuffer) {
        self.init(cgImage: cgImage)
    } else {
        return nil
    }
}
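One likely factor worth ruling out: ARFrame.capturedImage is delivered in the camera sensor's native landscape orientation, so a UIImage built without an explicit orientation can appear rotated depending on how each device is held when the session starts. The sketch below (untested assumption, not a confirmed fix; `orientedImage(from:interfaceOrientation:)` is a hypothetical helper name) applies an orientation derived from the current interface orientation:

```swift
import ARKit
import UIKit

// Sketch: ARFrame.capturedImage arrives in the sensor's native landscape
// orientation. Building the UIImage with an explicit orientation makes the
// result independent of how the device was held at session start.
func orientedImage(from frame: ARFrame,
                   interfaceOrientation: UIInterfaceOrientation) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }

    // Map the interface orientation to the rotation the sensor image needs.
    let imageOrientation: UIImage.Orientation
    switch interfaceOrientation {
    case .portrait:           imageOrientation = .right
    case .portraitUpsideDown: imageOrientation = .left
    case .landscapeLeft:      imageOrientation = .down
    case .landscapeRight:     imageOrientation = .up
    default:                  imageOrientation = .up
    }
    return UIImage(cgImage: cgImage, scale: 1.0, orientation: imageOrientation)
}
```

Comparing the interface orientation reported on each iPad at the moment the frame is captured would show whether the 180-degree difference comes from the devices starting in opposite landscape orientations.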
Hello, running an activity classifier model on an Apple Watch Series 2 with watchOS 5.3.5 gives me an error. The model was trained with Create ML. Has anyone seen this error before?

WatchKit Extension[587:818956] [espresso] [Espresso::handle_ex_] exception=Unsupported engine type
I'm running a basic oscillator with a sine wave (created with AVAudioEngine), but the sound is thin and harsh on an Apple Watch Series 2 (watchOS 6.2.6), while it sounds clear in the Simulator. What could be the reason?
The oscillator runs in the 100–900 Hz frequency range with amplitude between 0 and 1.
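For reference, a sine oscillator of this kind can be sketched with AVAudioSourceNode (available from watchOS 6). This is a minimal illustration under assumptions, not the poster's actual code; querying the output node's hardware format instead of assuming a sample rate is one common way to avoid the kind of distortion that appears on device but not in the Simulator:

```swift
import AVFoundation

// Minimal sine oscillator sketch using AVAudioSourceNode.
// The render block fills each output buffer sample-by-sample; using the
// hardware's actual sample rate keeps the pitch and waveform correct on
// devices whose rate differs from the Simulator's.
final class Oscillator {
    private let engine = AVAudioEngine()
    var frequency: Double = 440   // 100–900 Hz in the poster's range
    var amplitude: Double = 0.5   // 0–1
    private var phase: Double = 0

    func start() throws {
        let format = engine.outputNode.inputFormat(forBus: 0)
        let sampleRate = format.sampleRate
        let source = AVAudioSourceNode { [self] _, _, frameCount, audioBufferList -> OSStatus in
            let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
            let increment = 2.0 * Double.pi * frequency / sampleRate
            for frame in 0..<Int(frameCount) {
                let sample = Float(sin(phase) * amplitude)
                phase += increment
                if phase > 2.0 * Double.pi { phase -= 2.0 * Double.pi }
                for buffer in buffers {
                    buffer.mData?.assumingMemoryBound(to: Float.self)[frame] = sample
                }
            }
            return noErr
        }
        engine.attach(source)
        engine.connect(source, to: engine.mainMixerNode, format: format)
        try engine.start()
    }
}
```

If the render block is fed at a sample rate the Series 2 hardware cannot sustain, buffer underruns can also produce harsh output, so checking the format reported on device is a reasonable first diagnostic step.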
We are using ARWorldTrackingConfiguration in our app to detect horizontal and vertical planes. After the iOS 16.0.2 update, our iPhone 11 device lost the ability to detect planes; it worked on iOS 15 versions. Does anyone know what the problem could be? I have also submitted feedback to Apple via Feedback Assistant.
We also tried running Apple's sample project “Tracking and Visualizing Planes” (https://developer.apple.com/documentation/arkit/content_anchors/tracking_and_visualizing_planes), but it also fails to detect planes on iPhone 11, while other devices work.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configuration)
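To separate a detection failure from a rendering failure, one diagnostic sketch (an assumption about how to narrow this down, not a fix) is to attach an ARSessionDelegate and log every plane anchor the session adds. If this callback never fires on the affected iPhone 11, plane detection itself is failing rather than the visualization:

```swift
import ARKit

// Sketch: log every ARPlaneAnchor the session adds, independent of any
// SceneKit rendering. Silence here on iPhone 11 (while other devices log
// anchors) would confirm the regression is in detection itself.
class PlaneLogger: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let plane = anchor as? ARPlaneAnchor {
                let alignment = plane.alignment == .horizontal ? "horizontal" : "vertical"
                print("Detected \(alignment) plane, center: \(plane.center)")
            }
        }
    }
}
```

Set an instance of this class as `sceneView.session.delegate` before running the configuration shown above.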