How to initiate a real screenshot programmatically, just like the hardware (2 button) screenshot.

In my app, I need to have an onscreen button initiate a screenshot (which then gets saved to Photos, etc). Currently, every method I've tried from the forums (every version of UIGraphicsGetImageFromCurrentImageContext...) will capture everything EXCEPT the live feed from the camera, which is running as a sublayer on my view.

Everything from labels to buttons and even the background color shows up in the pictures using those methods, but no matter what, the image from the camera is just a blank space in the pictures.

However, while my app is running, I can press the two hardware buttons and THAT picture always has the image from the camera included! I need this to happen with my onscreen button, and that's what I just can't get to work the same way.

Does anyone know the actual function/extension/code that runs whenever someone presses the hardware buttons to initiate a screenshot? Or how to accomplish getting everything that's onscreen at a given time to be included in a screenshot, regardless of what layer(s) it's in?
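For reference, here is a minimal sketch of the kind of snapshot code described above (it assumes a view whose camera preview runs in an AVCaptureVideoPreviewLayer sublayer); both layer.render(in:) and drawHierarchy(in:afterScreenUpdates:) leave the preview as a blank area, which matches the behavior described:

```swift
import UIKit

extension UIView {
    /// Snapshots the view hierarchy into a UIImage.
    /// Labels, buttons, and background colors are captured, but the contents
    /// of an AVCaptureVideoPreviewLayer sublayer come out as a blank area.
    func snapshotImage() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { context in
            layer.render(in: context.cgContext)
        }
    }
}
```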

Replies

One way you could accomplish this is to use something like UIGraphicsGetImageFromCurrentImageContext() to capture the UI, separately capture a frame from the live camera feed, and then composite the two images with, say, Core Graphics into one image that reconstructs what a “real” screenshot would show (a rough sketch of the compositing step follows below). I’m not sure if there’s a more elegant way to do it, but this method should work; it just might require some careful calculations to put the camera frame in the right spot, with any appropriate masking, depending on the complexity of your UI layout.

I suspect that whatever code the system runs when the user presses the hardware buttons is not accessible to third-party developers.
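A rough sketch of that compositing step, assuming the camera frame has already been captured as a UIImage (see the AVFoundation sketch further down) and that `previewRect` is the preview layer's frame in the snapshot's coordinate space; the names here are placeholders for illustration:

```swift
import UIKit

/// Draws the UI snapshot, then draws the separately captured camera frame
/// into the area the preview layer occupied, producing one combined image.
/// Controls that overlap the preview would need extra masking or re-drawing.
func composite(uiSnapshot: UIImage, cameraFrame: UIImage, previewRect: CGRect) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: uiSnapshot.size)
    return renderer.image { _ in
        uiSnapshot.draw(in: CGRect(origin: .zero, size: uiSnapshot.size))
        cameraFrame.draw(in: previewRect)
    }
}
```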

  • I actually have tried the UIGraphicsGetImageFromCurrentImageContext() approach without any luck. Of course, I might not have implemented it exactly the way you describe here. Using this method still gives me a screenshot of everything except what the camera is showing. Would you be able to post some code showing what you're describing? (I've tried multiple ways and I can't seem to get it to work...)

  • I apologize for the late reply! I forgot to turn on notifications for this thread.

    What I’m suggesting is to use UIGraphicsGetImageFromCurrentImageContext() to capture the screenshot and then to capture a frame from the live camera feed using the system camera APIs (specifically AVFoundation), not the UIKit APIs. In essence, your app would act as if it were taking a real photo the way a camera app does, but instead of saving that photo to the user’s photo library, it would composite it with the rest of the screenshot from UIGraphicsGetImageFromCurrentImageContext(). See this Apple Developer article for details on capturing frames from a live camera feed: Capturing Still and Live Photos.
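    A minimal sketch of grabbing that still frame with AVFoundation, assuming a running AVCaptureSession that already has an AVCapturePhotoOutput attached; `FrameGrabber` and its completion-based API are placeholder names for illustration, not something from the linked article:

    ```swift
    import UIKit
    import AVFoundation

    /// Captures one still frame from the running session so it can be
    /// composited with the UIKit snapshot instead of being saved to Photos.
    final class FrameGrabber: NSObject, AVCapturePhotoCaptureDelegate {
        private let photoOutput: AVCapturePhotoOutput
        private var completion: ((UIImage?) -> Void)?

        init(photoOutput: AVCapturePhotoOutput) {
            self.photoOutput = photoOutput
            super.init()
        }

        func grabFrame(completion: @escaping (UIImage?) -> Void) {
            self.completion = completion
            photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
        }

        func photoOutput(_ output: AVCapturePhotoOutput,
                         didFinishProcessingPhoto photo: AVCapturePhoto,
                         error: Error?) {
            // Convert the captured photo to a UIImage and hand it back.
            guard error == nil, let data = photo.fileDataRepresentation() else {
                completion?(nil)
                return
            }
            completion?(UIImage(data: data))
        }
    }
    ```

    Once the closure delivers the frame, you could pass it to a compositing routine like the one sketched in the earlier reply, along with the preview layer's frame.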


Did you see this long thread? They claim to capture everything, but it looks like that's what you tried already:

https://stackoverflow.com/questions/25448879/how-do-i-take-a-full-screen-screenshot-in-swift

So this one is hopefully more relevant:

https://stackoverflow.com/questions/19573345/how-can-i-programmatically-capture-a-screenshot-of-a-playing-video