Photos and Imaging


Integrate still images and other forms of photography into your apps.

Posts under Photos and Imaging tag

72 Posts

Code=-11803 "Cannot Record" error while capturing a photo from AVCaptureSession?
Hi everyone, I need your help. I am working on an application where I capture a photo from the back camera using AVCaptureSession. It works fine on devices running iOS 17+, but I am facing an error on an iPhone X running iOS 16.7.4:

```
error: Optional(Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record" UserInfo={NSUnderlyingError=0x283f0b780 {Error Domain=NSOSStatusErrorDomain Code=-16409 "(null)"}, NSLocalizedRecoverySuggestion=Try recording again., AVErrorRecordingFailureDomainKey=3, NSLocalizedDescription=Cannot Record})
```

Here is my code:

```swift
final class CedulaScanningVC: UIViewController {
    var captureSession: AVCaptureSession!
    var stillImageOutput: AVCapturePhotoOutput!
    var videoPreviewLayer: AVCaptureVideoPreviewLayer!
    var delegate: ScanCedulaDelegate?

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        self.captureSession.stopRunning()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setupCamera()
    }

    // MARK: - Configure Camera

    func setupCamera() {
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = .medium
        guard let backCamera = AVCaptureDevice.default(for: AVMediaType.video) else {
            print("Unable to access back camera!")
            return
        }
        let input: AVCaptureDeviceInput
        do {
            input = try AVCaptureDeviceInput(device: backCamera)
            // Step 9
            stillImageOutput = AVCapturePhotoOutput()
            if captureSession.canAddInput(input) && captureSession.canAddOutput(stillImageOutput) {
                captureSession.addInput(input)
                captureSession.addOutput(stillImageOutput)
                setupLivePreview()
            }
        } catch let error {
            print("Error: unable to initialize back camera: \(error.localizedDescription)")
        }
    }

    func setupLivePreview() {
        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        videoPreviewLayer.videoGravity = .resizeAspectFill
        videoPreviewLayer.connection?.videoOrientation = .portrait
        self.view.layer.addSublayer(videoPreviewLayer)
        // Step 12
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            self?.captureSession.startRunning()
            // Step 13
            DispatchQueue.main.async {
                self?.videoPreviewLayer.frame = self?.view.bounds ?? .zero
            }
        }
    }

    func failed() {
        let ac = UIAlertController(title: "Scanning not supported",
                                   message: "Your device does not support scanning a code from an item. Please use a device with a camera.",
                                   preferredStyle: .alert)
        ac.addAction(UIAlertAction(title: "OK", style: .default))
        present(ac, animated: true)
        captureSession = nil
    }

    // MARK: - Actions

    func cameraButtonPressed() {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        stillImageOutput.capturePhoto(with: settings, delegate: self)
    }
}

extension CedulaScanningVC: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        print("error: \(error)")
        captureSession.stopRunning()
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in
            guard let self = self else { return }
            guard let imageData = photo.fileDataRepresentation() else {
                print("No image captured")
                return
            }
            let image = UIImage(data: imageData)
            self.delegate?.capturedImage(image: image)
        }
    }
}
```

I don't know what I am doing wrong.
0 replies · 0 boosts · 818 views · Jan ’24
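One thing worth ruling out on older devices is calling capturePhoto(with:delegate:) before the session is actually running or without camera authorization. A minimal sketch of such a guard, assuming it slots into the CedulaScanningVC class above; the checks are standard AVFoundation API, but this is not a confirmed fix for this particular -11803 report:

```swift
import AVFoundation

func cameraButtonPressed() {
    // Verify camera permission before attempting a capture.
    guard AVCaptureDevice.authorizationStatus(for: .video) == .authorized else {
        print("Camera access not authorized")
        return
    }
    // -11803 "Cannot Record" can surface when the session isn't running
    // or the output's video connection is inactive at capture time.
    guard captureSession.isRunning,
          let connection = stillImageOutput.connection(with: .video),
          connection.isActive, connection.isEnabled else {
        print("Capture session not ready")
        return
    }
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    stillImageOutput.capturePhoto(with: settings, delegate: self)
}
```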
Sharing a JPEG via Action or Share Extension fails in Photos on macOS
We have a Share Extension that fails in Photos on macOS when trying to share a JPEG image, for the following reason: from the NSItemProvider we get from NSExtensionItem.attachments, we try to load the image using loadFileRepresentation(forTypeIdentifier: "public.image", completionHandler: ...). This fails for .jpeg images in the library; there seems to be a mismatch between the expected and actual file extension internally. Here is the log:

```
Error copying file type public.image. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.jpeg" UserInfo={NSLocalizedDescription=Cannot load representation of type public.jpeg, NSUnderlyingError=0x1527c1a80 {Error Domain=NSItemProviderErrorDomain Code=-1 "Cannot copy file at URL file:///Users/frank/Library/Containers/com.apple.Photos/Data/tmp/TemporaryItems/ShareKit-Exports/7CCFA760-AAC9-42B0-812D-68F051ED1543/F912E593-2BE5-4E70-86AB-7657A40657E5/IMG_3517.jpg." UserInfo={NSLocalizedDescription=Cannot copy file at URL file:///Users/frank/Library/Containers/com.apple.Photos/Data/tmp/TemporaryItems/ShareKit-Exports/7CCFA760-AAC9-42B0-812D-68F051ED1543/F912E593-2BE5-4E70-86AB-7657A40657E5/IMG_3517.jpg., NSUnderlyingError=0x152789670 {Error Domain=NSItemProviderErrorDomain Code=-1 "Cannot create a temporary file. Error: Undefined error: 0" UserInfo={NSLocalizedDescription=Cannot create a temporary file. Error: Undefined error: 0}}}}}
```

In the specified folder there is an image, however, it's named IMG_3517.jpeg, not IMG_3517.jpg. This seems to be a bug in Photos' item provider implementation. If we use loadObject(ofClass: URL.self, completionHandler: ...) instead, we get the correct .jpeg URL in the completion handler.
1 reply · 0 boosts · 816 views · Jan ’24
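The workaround mentioned at the end of the post can look roughly like this. A hedged sketch: loadObject(ofClass: URL.self) is standard NSItemProvider API, but the surrounding extension wiring is assumed:

```swift
import Foundation

// Hedged sketch: load the attachment as a URL instead of a file
// representation; per the post, this returns the correct ".jpeg" URL
// where loadFileRepresentation(forTypeIdentifier:) fails.
func loadImageURL(from item: NSExtensionItem) {
    guard let provider = item.attachments?.first else { return }
    _ = provider.loadObject(ofClass: URL.self) { url, error in
        if let url {
            print("Image file is at \(url)")
        } else {
            print("Failed to load URL: \(String(describing: error))")
        }
    }
}
```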
Should Photo Extensions work in iOS simulators?
When I:

- open an existing project
- create a new PhotoExtensions target
- run the new target in an iOS simulator (e.g. iPhone 15, iOS 17.0)
- select Photos as the app to run
- open a photo
- tap the ... button at the top right

I see Copy, Duplicate, Hide, etc., but I do not see my new extension. Is there something else I need to be doing in order to see my new extension in action?
1 reply · 0 boosts · 643 views · Jan ’24
Photos sample app can't access full-resolution photos on iOS
I'm running this SwiftUI sample app for photos without any modifications, except for adding my developer profile, which is necessary to build it. When I tap on the thumbnail to see the photo library (after granting access to my photo library), some of the thumbnails are stuck in a loading state, and when I tap a thumbnail I only see a low-resolution image (the thumbnail), not the full-resolution image that should load. When I tap a thumbnail to see the full-resolution image, this error appears in the console:

```
CachedImageManager requestImage error: The operation couldn’t be completed. (PHPhotosErrorDomain error 3164.)
```

When I make the few modifications necessary to run the app as a native macOS app, all the thumbnails load immediately, and clicking on them reveals the full-resolution images.
2 replies · 0 boosts · 1.3k views · Dec ’23
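For reference, PHPhotosErrorDomain error 3164 appears to correspond to PHPhotosError.Code.networkAccessRequired, raised when the full-resolution asset lives in iCloud and the request does not permit downloading it. A minimal sketch of allowing the download; the API is standard PhotoKit, but whether this matches the sample app's intent is an assumption:

```swift
import Photos
import UIKit

// Hedged sketch: when assets are iCloud-only, full-resolution requests fail
// unless network access is allowed. The asset and target size are placeholders.
func requestFullImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true      // permit the iCloud download
    options.deliveryMode = .highQualityFormat
    PHImageManager.default().requestImage(
        for: asset,
        targetSize: PHImageManagerMaximumSize,
        contentMode: .aspectFit,
        options: options
    ) { image, _ in
        completion(image)
    }
}
```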
[iOS 17] Bug when saving a video from a URL and reading it with PHPickerViewController
Since iOS 17, I have an issue when saving a video and then reading it through PHPickerViewController: the video turns into a JPEG file. Below is the code that saves the video. I found that the reason seems to be that I changed creationDate; on iOS 16 this bug does not occur.

```swift
func doVertifyAccessAblum() {
    DispatchQueue.global(qos: .background).async {
        if let url = URL(string: videoURL), let urlData = NSData(contentsOf: url) {
            let galleryPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
            let filePath = "\(galleryPath)/\(url.lastPathComponent).mp4"
            DispatchQueue.main.async {
                urlData.write(toFile: filePath, atomically: true)
                PHPhotoLibrary.shared().performChanges({
                    let changeRequest = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: URL(fileURLWithPath: filePath))
                    changeRequest?.creationDate = Date()
                }) { success, error in
                    LOGGING.debug("Save video with status: success=\(success) error=\(String(describing: error))")
                }
            }
        }
    }
}
```
5 replies · 1 boost · 712 views · Dec ’23
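One alternative worth trying, sketched below under the assumption that the .mp4 has already been written to disk: create the asset with PHAssetCreationRequest and an explicit .video resource type, so the media type cannot be misclassified even when creationDate is set. This is standard PhotoKit API, not a confirmed fix for the iOS 17 behavior:

```swift
import Photos

// Hedged sketch: declare the resource type explicitly so Photos cannot
// misclassify the asset. `fileURL` is assumed to point at the saved .mp4.
func saveVideoExplicitly(fileURL: URL) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .video, fileURL: fileURL, options: PHAssetResourceCreationOptions())
        request.creationDate = Date()
    }) { success, error in
        print("Saved: \(success), error: \(String(describing: error))")
    }
}
```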
Apple ProRAW image issue when editing
I am running Sonoma 14.1.1 and having an issue with DNG ProRAW images when choosing to edit. Whether I edit in the Photos app or in Photoshop, once Edit is chosen the image becomes less clear and slightly fuzzy, with loss of detail. The same thing happens when I export the photo from Photos as a DNG file and then edit it in Photoshop: there is a drastically noticeable difference in image quality. This appears to be the way the image is handled in the Photos app itself. Even if I save directly from the iPhone to Files, the same thing happens once the image is on the Mac. Attached are some screenshot examples, but the difference is still clear to see.
0 replies · 0 boosts · 428 views · Nov ’23
How to attach a point cloud (or depth data) to HEIC?
I'm developing a 3D scanner that works on iPad, using AVCapturePhoto and PhotogrammetrySession. My photoCaptureDelegate looks like this:

```swift
extension PhotoCaptureDelegate: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        let fileUrl = CameraViewModel.instance.imageDir!.appendingPathComponent("\(PhotoCaptureDelegate.name)\(id).heic")
        let img = CIImage(cvPixelBuffer: photo.pixelBuffer!, options: [
            .auxiliaryDepth: true,
            .properties: photo.metadata
        ])
        let depthData = photo.depthData!.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
        let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)
        let fileData = CIContext().heifRepresentation(of: img, format: .RGBA8, colorSpace: colorSpace!, options: [
            .avDepthData: depthData
        ])
        try? fileData!.write(to: fileUrl, options: .atomic)
    }
}
```

But the PhotogrammetrySession spits out these warning messages:

```
Sample 0 missing LiDAR point cloud!
Sample 1 missing LiDAR point cloud!
Sample 2 missing LiDAR point cloud!
Sample 3 missing LiDAR point cloud!
Sample 4 missing LiDAR point cloud!
Sample 5 missing LiDAR point cloud!
Sample 6 missing LiDAR point cloud!
Sample 7 missing LiDAR point cloud!
Sample 8 missing LiDAR point cloud!
Sample 9 missing LiDAR point cloud!
Sample 10 missing LiDAR point cloud!
```

The session creates a USDZ 3D model, but the scale is not correct. I think the point cloud could help the PhotogrammetrySession find the right scale, but I don't know how to attach the point cloud.
3 replies · 2 boosts · 805 views · Oct ’24
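On the scale question: the per-sample LiDAR point cloud appears to be produced only by ObjectCaptureSession captures, so depth data may be the closest substitute when capturing with plain AVFoundation. A hedged sketch of feeding depth to the session through PhotogrammetrySample rather than through the HEIC file; PhotogrammetrySample is real RealityKit API, but the capture plumbing and sampleID here are assumptions:

```swift
import RealityKit
import AVFoundation

// Hedged sketch: build a PhotogrammetrySample carrying the depth map so the
// session can recover metric scale. `photo` is an AVCapturePhoto captured
// with depth delivery enabled; `sampleID` is a caller-supplied index.
func makeSample(from photo: AVCapturePhoto, sampleID: Int) -> PhotogrammetrySample? {
    guard let pixelBuffer = photo.pixelBuffer,
          let depth = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) else {
        return nil
    }
    var sample = PhotogrammetrySample(id: sampleID, image: pixelBuffer)
    sample.depthDataMap = depth.depthDataMap   // gravity can be set similarly from CoreMotion
    sample.metadata = photo.metadata
    return sample
}
```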
Photogrammetry failed with crash (Assert: in line 417)
I'm developing a 3D scanner that runs on an iPad (6th gen, 12-inch). Photogrammetry with ObjectCaptureSession was successful, but my other attempts are not.

I've tried Photogrammetry with URL inputs; these are pictures from AVCapturePhoto. It is strange: if the metadata is not replaced, photogrammetry finishes, but it seems no depth data or gravity info was used (depth and gravity are separate files); if the metadata is injected, the attempt fails. This time I also tried Photogrammetry with a PhotogrammetrySamples sequence, and it failed as well.

The settings are:

- camera: back LiDAR camera
- image format: kCVPixelFormatType_32BGRA (failed with a crash) or HEVC (just failed)
- image depth format: kCVPixelFormatType_DisparityFloat32 or kCVPixelFormatType_DepthFloat32
- photo settings: isDepthDataDeliveryEnabled = true, isDepthDataFiltered = false, embedded = true

I wonder whether iPad supports Photogrammetry with PhotogrammetrySamples. I've already tested some sample code provided by Apple:

https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app
https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_depth_using_the_lidar_camera
https://developer.apple.com/documentation/realitykit/taking_pictures_for_3d_object_capture

What should I do to make Photogrammetry successful?
2 replies · 1 boost · 641 views · Apr ’24
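For reference, the depth-related photo settings listed in the post are usually wired up as below. This is a sketch of standard AVFoundation configuration, not a fix for the assert; `photoOutput` is assumed to be attached to a LiDAR-capable capture session:

```swift
import AVFoundation

// Hedged sketch: enable depth delivery on the output first, then request it
// per capture, mirroring the settings described in the post.
func makeDepthSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

    let settings: AVCapturePhotoSettings
    if photoOutput.availablePhotoCodecTypes.contains(.hevc) {
        settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
    } else {
        settings = AVCapturePhotoSettings()
    }
    settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
    settings.isDepthDataFiltered = false        // raw depth, as in the post
    settings.embedsDepthDataInPhoto = true      // "embedded = true" above
    return settings
}
```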
Photos app doesn't properly show OpenEXR images on HDR display
I'm working on a game that uses HDR display output for a much brighter range. One feature of the game is the ability to export in-game photos, and the only appropriate format I found for this is OpenEXR. The built-in Photos app is capable of showing HDR photos on an HDR display. However, if I drop an EXR file with a large range into Photos, it is not properly displayed in HDR mode with the full range. At the same time, pressing Edit on the file makes it HDR-displayable, and it remains displayable if I save the edit with any change, even a tiny one. Moreover, if the EXR file is placed next to a 'true' HDR one (or an EXR 'fixed' as above), then during scrolling between the files, the broken EXR magically fixes itself at the exact moment the other HDR image moves onto the screen. I tested different files with various internal formats; it seems to be a common problem for all of them. Tested on the latest iOS 17.0.3. Thank you in advance.
1 reply · 0 boosts · 743 views · Apr ’24
Display HDR images for PhotoKit assets
In my app I get a UIImage for a PHAsset via PHImageManager.requestImage(for:targetSize:contentMode:options:resultHandler:). I directly display that image in a UIImageView that has preferredImageDynamicRange set to .high. The problem is I do not see the high dynamic range. I see the HDRDemo23 sample code uses PhotosPicker to get a UIImage from Data through UIImageReader whose config enables prefersHighDynamicRange. Is there a way to support HDR when using the Photos APIs to request display images? And is there support for PHLivePhoto displayed in PHLivePhotoView retrieved via PHImageManager.requestLivePhoto?
4 replies · 1 boost · 1.4k views · Apr ’24
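One route that lines up with the HDRDemo23 approach mentioned in the post, as a hedged sketch: request the original image data from PhotoKit, then decode it with UIImageReader (iOS 17+) with prefersHighDynamicRange enabled. Wiring UIImageReader to PHImageManager like this is an assumption, not a documented recipe, and it does not address the PHLivePhoto part of the question:

```swift
import Photos
import UIKit

// Hedged sketch: fetch raw image data via PhotoKit, then decode with an
// HDR-preferring UIImageReader so the high dynamic range is preserved.
func requestHDRImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data else { completion(nil); return }
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        completion(UIImageReader(configuration: config).image(data: data))
    }
}
```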