Reply to Combining ARKit Face Tracking with High-Resolution AVCapture and Perspective Rendering on Front Camera
Hi @Vision Pro Engineer, thank you for the links. I did indeed try captureHighResolutionFrame(completion:) on my iPhone 14 Pro with iOS 18.1.1 and was able to get a 1512 × 2016 frame instead of the standard 1080 × 1440. I tested it with the Tracking and Visualizing Faces sample app (the call I used is sketched at the end of this post). Unfortunately, our use case requires at least the 7 MP (2316 × 3088) resolution of the front camera. Is 1512 × 2016 actually the highest resolution frame I can get with my setup and ARKit, or do I need to pay attention to other configuration settings?

In the video everything is shown with ARWorldTrackingConfiguration. Does it also apply to ARFaceTrackingConfiguration? I am asking because I was not able to get a higher-resolution stream; the recommended hi-res video format came back as nil in the following code:

```swift
if let hiResCaptureVideoFormat = ARFaceTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
    // Assign the video format that supports hi-res capturing.
    config.videoFormat = hiResCaptureVideoFormat
}
// Run the session.
session.run(config)
```

Since features like triggering focus events and other device settings could also be beneficial, I tried to access the capture device as described in the video:

```swift
do {
    try device.lockForConfiguration()
    // configure AVCaptureDevice settings …
    device.unlockForConfiguration()
} catch {
    // error handling …
}
```

However, I was not able to access the device. Should this be possible?

I also investigated fast session switching, but was not able to get it faster than 1.6 seconds, which heavily breaks the user experience. Below you can find the code that I used to switch sessions and capture an image.

ViewController File Content

Since we only need the face orientation and face landmarks from ARKit, we looked into other ways to get this information and found the Vision framework (a rough sketch of what we are considering is at the end of this post). Are there other options? How would you detect a head pose with the back camera?

Thank you in advance, this is really a hot topic on our side.
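For reference, this is roughly how I request the high-resolution still from the running face-tracking session (a minimal sketch; the session setup follows the Tracking and Visualizing Faces sample and is omitted here, and the logging is only illustrative):

```swift
// `session` is the ARSession already running the face-tracking configuration.
session.captureHighResolutionFrame { frame, error in
    guard let frame = frame else {
        print("High-resolution capture failed: \(String(describing: error))")
        return
    }
    let pixelBuffer = frame.capturedImage
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    print("Captured high-resolution frame: \(width) x \(height)") // 1512 x 2016 on my device
}
```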
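And this is the kind of Vision-based head-pose estimation we are prototyping for the back camera (a sketch, not our actual implementation; the pixel buffers would come from an AVCaptureSession video output, and the fixed .right orientation assumes portrait use of the back camera):

```swift
import Vision

// Estimate head pose (roll / yaw / pitch, in radians) for the most prominent face in a frame.
func detectHeadPose(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let face = (request.results as? [VNFaceObservation])?.first else { return }
        // pitch requires VNDetectFaceRectanglesRequestRevision3 (iOS 15+).
        print("roll:", face.roll ?? 0, "yaw:", face.yaw ?? 0, "pitch:", face.pitch ?? 0)
    }
    request.revision = VNDetectFaceRectanglesRequestRevision3

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right, options: [:])
    do {
        try handler.perform([request])
    } catch {
        print("Vision request failed: \(error)")
    }
}
```

Face landmarks could presumably be obtained the same way with VNDetectFaceLandmarksRequest; the head pose is the part we are most unsure about.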