I have tried the following 2 options, but neither yields a high-quality photo in return:
The `capturedImage` approach gives better quality than the other one, but still not at the level I expect.
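For reference, my `capturedImage` attempt looks roughly like this (a minimal sketch; `photoFromCurrentFrame` is just my own helper name, and orientation/color-space handling is omitted for brevity):

```swift
import ARKit
import UIKit

// Sketch: convert the current ARFrame's camera buffer into a UIImage.
// Note: capturedImage is the raw pixel buffer backing the AR frame, so its
// resolution is capped by the session's selected video format — typically
// well below what the Camera app produces.
func photoFromCurrentFrame(session: ARSession) -> UIImage? {
    guard let pixelBuffer = session.currentFrame?.capturedImage else {
        return nil
    }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```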
Is there any workaround for taking high-quality photos during an AR session? By high quality I mean as close as possible to the quality of photos taken via the Camera app.
My goal is to leverage ARKit to assist users in capturing "good" photos (based on distance, angle, etc.), but not at the cost of quality.
I figure I could pause the session and then launch the camera, but that kind of defeats the purpose of using AR in the first place. Could someone give me a pointer on how to address this?