I apologize for the late reply! I forgot to turn on notifications for this thread.
What I’m suggesting is to capture the screenshot with UIGraphicsGetImageFromCurrentImageContext() and to capture a frame from the live camera feed using the system camera APIs (specifically AVFoundation), not the UIKit APIs. In essence, your app would act as if it were taking a real photo the way a camera app does, but instead of saving that photo to the user’s photo library, it would composite it with the rest of the screenshot from UIGraphicsGetImageFromCurrentImageContext(). See this Apple Developer article for details on capturing frames from a live camera feed: Capturing Still and Live Photos.
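Here’s a minimal sketch of what I mean. It assumes you already have a configured, running AVCaptureSession with the photo output added; the type and function names (FrameCapturer, snapshot, composite) are just placeholders for illustration, and you’d adapt the drawing order and sizing to your own layout:

```swift
import AVFoundation
import UIKit

// Grabs a still frame from the live camera feed via AVCapturePhotoOutput.
// Assumes `photoOutput` has already been added to a running AVCaptureSession.
final class FrameCapturer: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()
    private var completion: ((UIImage?) -> Void)?

    func captureFrame(completion: @escaping (UIImage?) -> Void) {
        self.completion = completion
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else {
            completion?(nil)
            return
        }
        completion?(UIImage(data: data))
    }
}

// Renders the app's UI hierarchy into a UIImage using
// UIGraphicsGetImageFromCurrentImageContext().
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}

// Composites the camera frame underneath the UI snapshot. Drawing the
// frame first and the snapshot second is an assumption about your layout.
func composite(cameraFrame: UIImage, uiSnapshot: UIImage) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(uiSnapshot.size, true, 0)
    defer { UIGraphicsEndImageContext() }
    let bounds = CGRect(origin: .zero, size: uiSnapshot.size)
    cameraFrame.draw(in: bounds) // camera frame as the background
    uiSnapshot.draw(in: bounds)  // UI drawn on top
    return UIGraphicsGetImageFromCurrentImageContext()
}
```

In use you’d call captureFrame, then in its completion handler take the snapshot and composite the two, so the camera frame and the UI are captured as close together in time as possible.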