Augmented Reality app unable to load the image from the camera

I have had an app on the App Store for many years that lets users post text into clouds in augmented reality. Last week, abruptly, after installing the app on the iPhone, the screen started going completely black and a series of barely comprehensible logs appeared, of the kind:

ARSCNCompositor <0x300ad0e00>: ARSCNCompositor (0, 0) initialization failed. Matting is not set up properly.

many times, then

ARWorldTrackingTechnique <0x106235180>: Unable to update pose [PredictorFailure] for timestamp 870.392108

ARWorldTrackingTechnique <0x106235180>: Unable to predict pose [1] for timestamp 870.392108

again several times and then:

ARWorldTrackingTechnique <0x106235180>: SLAM error callback: Error Domain=Slam Error Code=7 "Non fatal error occurred due to significant drop in a IMU data" UserInfo={NSDescription=Non fatal error occurred due to significant drop in a IMU data, NSLocalizedFailureReason=SlamEngineNodeGroup Failure: IMU issue: gyro data stream verification failed [Significant data drop]. Failed on timestamp: 870.413247, Last known timestamp: 865.350198, Delta: 5.063049, System timestamp: 870.415781, Delta between system and frame: 0.002534. }

and then the pose errors again, several times.

I hoped the new beta version would solve the issue, but it did not. Unfortunately I cannot tell whether this depends on the beta or on something else, since the app cannot be installed on the Mac simulator.

I would test it with iOS versions it is known to work with, to be sure it isn't broken. It looks like you are using at least one third-party framework; I would see if there is an updated version available.

Thanks for your answer. I use absolutely no third-party library; as a matter of fact, the app worked smoothly before installing the iOS 18 beta, and perhaps even for some time after. Also, the version on the App Store works with no problems with essentially the same code.

These are the rest of the errors:

FigCaptureSourceRemote Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:275) - (err=-12784)

ARSCNCompositor <0x301b84fc0>: ARSCNCompositor (0, 0) initialization failed. Matting is not set up properly.

ARDepthSensor <0x301dc1950>: (AVCaptureDeviceTypeBuiltInTimeOfFlightCamera - Back): capture session dropped jasper frame: 334023.383424, LateData

ARDepthSensor <0x301dc1950>: (AVCaptureDeviceTypeBuiltInTimeOfFlightCamera - Back): capture session dropped jasper frame: 334023.466850, LateData

ARImageSensor <0x3019d8b60>: AVCaptureDeviceTypeBuiltInWideAngleCamera: No video frame received. Dropping frame! Reason: 1

Most of these are repeated several times.

I'm having the same errors. I'm developing my app with RoomPlan, and all of a sudden it started having glitches and these kinds of errors. I went back to the previously working codebase and I get the same errors without having changed anything.

Same here. I'm doing very basic stuff with a world-tracking configuration on an iPhone 16 Pro, and frame drops are everywhere. I even tested with Apple's sample code using a world-tracking configuration and the result is the same. I hope Apple engineers are aware of it.
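
For reference, the "basic stuff" amounts to the standard world-tracking setup below (a minimal sketch; the class and property names are just illustrative), and the frame drops show up even with this:

    import UIKit
    import ARKit

    // Minimal ARKit world-tracking setup, essentially what Apple's sample code does.
    class ARViewController: UIViewController, ARSCNViewDelegate {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            sceneView.delegate = self
            view.addSubview(sceneView)
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            // Plain world tracking, no people occlusion or other matting options.
            let configuration = ARWorldTrackingConfiguration()
            sceneView.session.run(configuration)
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            sceneView.session.pause()
        }
    }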

I was using the Xcode beta; on Xcode 16 it is fine.

FigCaptureSourceRemote Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:275) - (err=-12784)

I am having similar messages and errors. The error logs presented by Xcode do not have enough information to work out a workaround: they don't identify a specific delegate queue, nor do they provide enough detail to see what is blocking the queues. I have submitted a few feedback reports describing the errors I am seeing, with an extensive trail of logs that is pretty much useless because it does not identify which delegate queue is being blocked. I am also getting a message that I am attempting to restart a session that has not been stopped (see the restart sketch after the logs below), which could be at the root of the problem, but it does not identify WHICH session is being restarted incorrectly.
ARSCNCompositor <0x301b84fc0>: ARSCNCompositor (0, 0) initialization failed. Matting is not set up properly.

ARDepthSensor <0x301dc1950>: (AVCaptureDeviceTypeBuiltInTimeOfFlightCamera - Back): capture session dropped jasper frame: 334023.383424, LateData

ARDepthSensor <0x301dc1950>: (AVCaptureDeviceTypeBuiltInTimeOfFlightCamera - Back): capture session dropped jasper frame: 334023.466850, LateData
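
For context, the documented way to restart an ARSession is to pause it first and then call run(_:options:) with the reset options, roughly as in this minimal sketch (the view name is a placeholder for whatever holds the session):

    import ARKit

    // Restart pattern: pause the session, then re-run it with reset options.
    func restartSession(on sceneView: ARSCNView) {
        sceneView.session.pause()

        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration,
                              options: [.resetTracking, .removeExistingAnchors])
    }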

The error popped up in July, as can be seen in this thread. The last time I tried, the version on the App Store worked seamlessly. Every now and then the camera screen resurfaces, but it is totally unresponsive, and as soon as I recompile to check what is happening, the screen returns to black. I have been peppering Apple Feedback Assistant since I found the issue, with no answer from them. I hoped the public iOS 18 release would fix it, to no avail, so now I wait for each beta hoping something changes by chance. It is certainly frustrating.

We should probably collaborate on this, since Apple is not responding to either one of us. I am seeing symptoms that resemble a general reset of the interface due to a fault. I wish I could get an idea of the structure of the bus that the cameras are on. I am also seeing bizarre behavior from the IMU that can only mean there is some kind of data overrun on the bus due to a slow response. From my reading of the specs, it appears that a ring buffer or MSI is used for status delivery; if the ring buffer wraps or the status is not promptly acknowledged, there may be a hardware-generated fault on the order of a Machine Check. In my former job I developed PCIe device drivers, which sometimes behaved this way.

For some strange reason, when I run the app after an iOS update the camera image actually shows, although the app's functionality does not work. Once I recompile it with Xcode Version 15.2 (15C500b), the usual errors return and the screen goes black. I hope it does not depend on the Xcode version, as my Mac does not support a newer Xcode and I have no money to buy a new Mac!

There is maybe nothing more uniquely "Apple" than encountering an issue, searching it up, and finding a thread that guarantees you have no recourse but to wait and hope that the Apple developers one day attempt to fix their code.

Has anyone found any sort of workaround for this in the meantime?
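
The only generic thing I can think of is implementing the ARSessionObserver callbacks to at least see what ARKit itself reports (a minimal sketch below, with the delegate wiring assumed); it is diagnosis rather than a fix, but it may show whether the camera or IMU feed itself is failing.

    import ARKit

    // Not a fix: this only surfaces ARKit's own error and tracking-state changes
    // instead of the raw console spam.
    class SessionObserver: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didFailWithError error: Error) {
            // ARError codes such as .sensorFailed or .worldTrackingFailed help
            // narrow down whether the capture pipeline itself is the problem.
            print("ARSession failed:", error.localizedDescription)
        }

        func sessionWasInterrupted(_ session: ARSession) {
            print("ARSession interrupted")
        }

        func sessionInterruptionEnded(_ session: ARSession) {
            print("ARSession interruption ended")
        }

        func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
            print("Tracking state:", camera.trackingState)
        }
    }

    // Usage: keep a strong reference to the observer and assign it before
    // running the session, e.g. sceneView.session.delegate = observer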
