In my app, I need an onscreen button to initiate a screenshot, which then gets saved to Photos. So far, every method I've tried from the forums (every variation of UIGraphicsGetImageFromCurrentImageContext) captures everything EXCEPT the live camera feed, which runs as a sublayer on my view. Labels, buttons, and even the background color all show up in the resulting image, but no matter what, the camera feed is just a blank space.

However, when the app is running and I press the two hardware buttons, THAT screenshot always includes the camera image. I need the same result from my onscreen button, and that's what I can't get to happen.

Does anyone know the actual function/extension/code that runs when someone presses the hardware buttons to initiate a screenshot? Or how to capture everything that's onscreen at a given moment, regardless of which layer(s) it's in?
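For context, here is a minimal sketch of the kind of code I've been trying (assuming the camera preview is an AVCaptureVideoPreviewLayer added via view.layer.addSublayer; the method and variable names are just illustrative). Everything renders except the preview layer, which comes out blank:

```swift
import UIKit

// Wired to the onscreen screenshot button. Renders the view
// hierarchy into an image context, but the camera preview
// sublayer appears as an empty region in the result.
@objc func screenshotButtonTapped(_ sender: UIButton) {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }

    guard let context = UIGraphicsGetCurrentContext() else { return }
    view.layer.render(in: context)

    if let image = UIGraphicsGetImageFromCurrentImageContext() {
        // Save to Photos (needs the photo-library usage
        // description key in Info.plist).
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}
```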