The images that I want to track are downloaded from a web service and stored locally. I'm creating the ARReferenceImage objects from them using the following code:
Code Block
guard let image = UIImage(contentsOfFile: imageLocalPath),
      let cgImage = image.cgImage else {
    return nil
}
return ARReferenceImage(cgImage, orientation: .up, physicalWidth: 0.12)
The session is configured depending on the current iOS version, since ARImageTrackingConfiguration is not available on iOS 11:
Code Block
private lazy var configuration: ARConfiguration = {
    if #available(iOS 12.0, *), ARImageTrackingConfiguration.isSupported {
        return ARImageTrackingConfiguration()
    }
    return ARWorldTrackingConfiguration()
}()

if #available(iOS 12.0, *),
   let imagesTrackingConfig = configuration as? ARImageTrackingConfiguration {
    imagesTrackingConfig.trackingImages = referenceImages
} else if let worldTrackingConfig = configuration as? ARWorldTrackingConfiguration {
    worldTrackingConfig.detectionImages = referenceImages
}
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
The code above works fine on iOS 12 and 13, even if I use ARWorldTrackingConfiguration, and the images are correctly detected by ARKit. But when I try to run it on iOS 11.3.1, the app immediately crashes with the following error:
Assert: /BuildRoot/Library/Caches/com.apple.xbs/Sources/AppleCV3D/AppleCV3D-1.13.11/library/VIO/OdometryEngine/src/FrameDownsampleNode/FrameDownsampler.cpp, 62: std::abs(static_cast(aspect1) - static_cast(srcframe.image.width * outputframeheight)) < maxslack
(lldb)

Is it possible that creating markers (reference images) programmatically at runtime is not supported on iOS versions below 12.0, or am I doing something wrong? Unfortunately, I wasn't able to find any information regarding the specific versions. Thank you.
Thanks for pointing me in the right direction.
I managed to solve the issue a few days ago by redrawing the images that come from the API.
By checking the CGImage size of the original images, I discovered that it was the same as the UIImage size in points.
But the CGImage must be scaled by the device's screen scale factor: for example, on an iPhone 8 Plus (3x), a 180 x 240 image must produce a 540 x 720 CGImage.
By using a correctly scaled CGImage to create the ARReferenceImage, I got rid of the crash, and detection now works fine on iOS 11.3.1 as well.
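For anyone hitting the same crash, here is a minimal sketch of the redraw step described above. The helper name makeReferenceImage and the use of UIScreen.main.scale as the target scale are my assumptions; the post only says the CGImage had to end up at the device's pixel dimensions (e.g. 3x on an iPhone 8 Plus).

Code Block
import ARKit
import UIKit

// Sketch only: redraws a locally stored image at the screen scale before
// creating the ARReferenceImage, as described above. Names are illustrative.
func makeReferenceImage(fromFileAt imageLocalPath: String,
                        physicalWidth: CGFloat = 0.12) -> ARReferenceImage? {
    guard let image = UIImage(contentsOfFile: imageLocalPath) else { return nil }

    // Render into a context that uses the screen scale, so e.g. a 180 x 240 pt
    // image becomes a 540 x 720 px CGImage on a 3x device.
    let format = UIGraphicsImageRendererFormat()
    format.scale = UIScreen.main.scale // assumption: device scale is the right target
    let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
    let scaledImage = renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }

    guard let cgImage = scaledImage.cgImage else { return nil }
    return ARReferenceImage(cgImage, orientation: .up, physicalWidth: physicalWidth)
}

The resulting ARReferenceImage objects can then be collected into the referenceImages set that is passed to the configuration code shown earlier.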