Discuss using the camera on Apple devices.

Posts under Camera tag

182 Posts
Sort by: Post | Replies | Boosts | Views | Activity
My iPhone 14 Pro Max urgently needs a 24-megapixel mode
A 24-megapixel mode is the most useful for daily photography; the 48-megapixel mode is only needed for landscapes or the occasional photo, since it takes up too much storage. The biggest problem with the 14 Pro Max now is that its photography is lame: the 12-megapixel camera has long lagged behind Android, and by a wide margin. Adding a 24-megapixel mode would be far more valuable than another iOS update, and would immediately double the experience.
0
0
439
Feb ’24
EXIF field FocalLengthIn35mmFilm has meaningless value
Using an iPhone 8 with iOS 16.7.5, the EXIF data I get when taking a picture does not seem to make sense. I am getting: FocalLength: 399/100, FocalLengthIn35mmFilm: 177. The FocalLength EXIF field is correct, since the iPhone 8's back lens does have a focal length of 3.99mm. The FocalLengthIn35mmFilm value, however, is wrong: the actual value is (obviously) much less, probably between 23 and 27mm (ish). Could this be a bug, or maybe FocalLengthIn35mmFilm is expressed in a unit I am not aware of? Thanks for your help.
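For anyone reproducing this, a minimal sketch of reading both fields with ImageIO ("photo.jpg" is a placeholder path):

```swift
import Foundation
import ImageIO

// Read the two EXIF focal-length fields from a photo file.
let url = URL(fileURLWithPath: "photo.jpg") as CFURL  // placeholder path
if let source = CGImageSourceCreateWithURL(url, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
   let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any] {
    print("FocalLength:", exif[kCGImagePropertyExifFocalLength] ?? "n/a")                  // e.g. 3.99
    print("FocalLengthIn35mmFilm:", exif[kCGImagePropertyExifFocalLenIn35mmFilm] ?? "n/a") // e.g. 177
}
```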
4
0
813
Feb ’24
Unintended grayscale picture on iPhone 14 Pro Max with iOS 17.0
In a Xamarin iOS app, there is a screen (Screen A) designed for capturing ID photos. We've written code to set the default camera zoom to 2x when opening Screen A, enabling users to take photos by pressing a button. The subsequent screen (Screen B) serves as a preview screen for the photos taken on Screen A. The issue is that photos captured on Screen A are unintentionally displayed in grayscale on Screen B; the correct behavior would be to display them in color. This problem occurs only on iPhone 14 Pro Max with iOS 17.0; it does not occur on iPhone 15 Pro with iOS 17.1. Moreover, when the 2x-zoom code is absent from the capture settings, photos are displayed in color on Screen B on iPhone 14 Pro Max with iOS 17.0. If the 2x-zoom code is present and the AVCaptureSession's SessionPreset is set to Preset640x480, the photos are also displayed in color on that device. Are there cases where the AVCaptureSession's SessionPreset setting on iPhone 14 Pro Max with iOS 17.0 causes an unintentional grayscale conversion when processing a photo taken at 2x zoom?
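The native AVFoundation equivalent of the configuration described above might look like this sketch (the Xamarin bindings mirror these APIs; the function and parameter names are illustrative):

```swift
import AVFoundation

// Sketch: apply the 2x zoom and an explicit preset before capturing.
func configure(session: AVCaptureSession, device: AVCaptureDevice) throws {
    session.beginConfiguration()
    session.sessionPreset = .vga640x480  // the preset the post says avoids the grayscale issue
    try device.lockForConfiguration()
    device.videoZoomFactor = 2.0         // the 2x zoom that appears to trigger it
    device.unlockForConfiguration()
    session.commitConfiguration()
}
```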
0
0
395
Jan ’24
Getting the orientation of the camera relative to the user interface
Apple, in all their wisdom, has deprecated almost every API that can be used to get the interface orientation, because they want developers to treat an orientation change as a simple size change. However, there are uses for the interface orientation other than adjusting the UI. Since the camera is fixed to the device and does not rotate with the interface, images from the camera need to be adjusted for orientation when displaying and/or processing them for computer vision tasks. Using traits is not a reliable way of determining the orientation, especially when running on an iPad. What is the recommended way to determine the relative angle of the camera in relation to the interface, now that all the interfaceOrientation APIs are deprecated? And specifically: how do you get notified of an interface orientation change?
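One non-deprecated source that still exists is the window scene (a sketch, not an official recommendation; the angle mapping below is illustrative and depends on the camera position):

```swift
import UIKit

// Read the interface orientation from the view's window scene;
// UIWindowScene.interfaceOrientation is not deprecated as of iOS 17.
func cameraRotation(for view: UIView) -> CGFloat? {
    guard let orientation = view.window?.windowScene?.interfaceOrientation else { return nil }
    switch orientation {
    case .portrait:           return 90   // illustrative values for a back camera
    case .landscapeRight:     return 0
    case .portraitUpsideDown: return 270
    case .landscapeLeft:      return 180
    default:                  return nil
    }
}
```

For the change notification, one common pattern is to override viewWillTransition(to:with:) and re-query the scene once the transition completes.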
2
0
552
Jan ’24
Camera usage inside Virtualization?
I'm trying to use the host camera from inside a virtual machine created with the Virtualization framework in Swift. I can't figure out how to achieve this, though; unlike audio devices, keyboards, displays, etc., there doesn't seem to be a corresponding configuration class and docs page for cameras or generic USB devices. Is there any way to connect a built-in Apple camera to a Mac virtual machine created with the Virtualization framework?
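For contrast, this is the kind of per-device configuration class the framework does provide for audio (a sketch; no camera analogue appears to exist as of macOS 14):

```swift
import Virtualization

// Audio input has a host-passthrough configuration class in Virtualization.
let audioDevice = VZVirtioSoundDeviceConfiguration()
let inputStream = VZVirtioSoundDeviceInputStreamConfiguration()
inputStream.source = VZHostAudioInputStreamSource()  // pass the host microphone through
audioDevice.streams = [inputStream]
// vmConfiguration.audioDevices = [audioDevice]      // vmConfiguration: VZVirtualMachineConfiguration
```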
1
1
633
Jan ’24
Use CoreImage filters on Vision Pro (visionOS) view
I have an iOS app that uses the (camera) video feed and applies CoreImage filters to simulate a specific real-world effect (for educational purposes). Now I want to make a similar app for visionOS and apply the same CoreImage filters to the content (live view) users see while wearing the Apple Vision Pro headset. Is there a way to do this with the current APIs, and what would you recommend? I saw that we cannot get the video feed from the camera(s); is there a way to do it with ARKit and apply the filters using that somehow? I know visionOS is a young/fresh platform, but any help would be great! Thank you!
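A minimal sketch of the kind of per-frame CoreImage filtering involved (colorControls stands in for the actual educational effect; on visionOS there is currently no public camera feed to drive it):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

let ciContext = CIContext()

// Apply a stand-in filter to one frame of a video feed.
func applyEffect(to frame: CIImage) -> CIImage {
    let filter = CIFilter.colorControls()
    filter.inputImage = frame
    filter.saturation = 0.0  // e.g. simulate monochrome vision
    return filter.outputImage ?? frame
}
```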
1
0
1.1k
Jan ’24
Unable to change Photo permission in the Settings app
My app uses the camera and the photo library. I found that if a user follows certain steps, they will no longer be able to change the photo permissions for my app in the Settings app. The steps are as follows:
1. Press the camera button in the app to launch the camera.
2. Take a picture with camera permission granted.
3. Grant ".addOnly" permission to the photo library.
4. Press the photo library button in the app to read the photo library.
5. Deny ".readWrite" permission to the photo library.
After step 5, the Settings app only shows a switch for the ".addOnly" permission, not the ".readWrite" permission. I am aware that in iOS 14 or later, the permission required after a photo is taken with the camera should be ".addOnly". Therefore, I suspect this problem occurs in other apps as well. So far I have worked around the problem in my app, but is this the expected behavior of the Settings app? If so, how can I avoid the problem?
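The two permission requests involved, as a sketch (the access levels are PHAccessLevel values from the Photos framework):

```swift
import Photos

// Step 3: saving a camera shot needs only add-only access.
PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
    print("addOnly status:", status.rawValue)
}

// Step 5: reading the library prompts for read/write access; denying this
// is what leaves the Settings app in the state described above.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    print("readWrite status:", status.rawValue)
}
```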
0
1
407
Jan ’24
ITMS-90683: Missing purpose string in Info.plist
I have an app that is getting rejected from TestFlight because of this error: ITMS-90683: Missing purpose string in Info.plist - Your app’s code references one or more APIs that access sensitive user data, or the app has one or more entitlements that permit such access. The Info.plist file for the “TurtleTuner.app” bundle should contain a NSCameraUsageDescription key with a user-facing purpose string explaining clearly and completely why your app needs the data. If you’re using external libraries or SDKs, they may reference APIs that require a purpose string. While your app might not use these APIs, a purpose string is still required. For details, visit: https://developer.apple.com/documentation/uikit/protecting_the_user_s_privacy/requesting_access_to_protected_resources. The app does not use the camera, only the microphone. I cannot find references to the camera in any of the third party libraries I'm using. What are some ways to troubleshoot this beyond looking for "camera" in the few dependencies? For context, this commit allows the app to get through successfully to TestFlight: https://github.com/tsargent/turtle-tuner/commit/67d4a52e62839ad6c2a49848bea9c408d983f17a While this following commit, which reverts the commit, fails on TestFlight with the mentioned camera permission error: https://github.com/tsargent/turtle-tuner/commit/c95b0b16c4e85d77e625d36b816ed53faa826cf5
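If the triggering reference can't be found and removed, one pragmatic workaround is simply declaring the key. A sketch of the Info.plist entry (the string value is a placeholder):

```xml
<key>NSCameraUsageDescription</key>
<string>Required by a third-party library; this app does not capture photos or video.</string>
```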
0
0
782
Jan ’24
iPhone 15 Pro Front Camera quality issues and poor face photos
This isn't just my observation: lots of people around me say the same, and you can find tonnes of feedback on the interwebs. The processing of images taken with the front-facing camera on the 15 (and I think the 14 before it) is so heavy that I know of people jumping to other phones. And they're right; the 15 exacerbates it even more. You can turn off HDR (a viewing thing), you can prioritise speed over processing, but you really cannot turn this off. You can take a Live Photo and then choose a different frame, and the processing is less. As a developer I look at that and think it's bonkers. It's just software, so why hasn't anyone produced a camera app that makes faces look good (not AI processing) from the front camera? I could be all enthusiastic and say I will develop one, but it seems like a simple, obvious fix for Apple. To have the settings so bad that I have friends returning their phones seems pretty bad, and as a photographer I would agree. There's a lot to love with Apple on the 15, and the log and ProRes, but a simple selfie produces such ugly results. That's an actual problem. So throwing it out there: what does everyone think? cheers Paul
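The one processing knob third-party capture apps do get is photo quality prioritization; a sketch (whether .speed actually reduces the face processing in question is untested):

```swift
import AVFoundation

// Ask the photo output for the least processing it will do.
let output = AVCapturePhotoOutput()
output.maxPhotoQualityPrioritization = .speed

let settings = AVCapturePhotoSettings()
settings.photoQualityPrioritization = .speed  // .speed < .balanced < .quality
// output.capturePhoto(with: settings, delegate: someDelegate)
```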
0
1
1.7k
Jan ’24
Capturing photo error Code=-11803 "Cannot Record"
Hi iOS community, I need your help. I am working on an application where I am capturing a photo from the back camera using AVCaptureSession. It works fine on devices running iOS 17+, but I am facing an error on an iPhone X running iOS 16.7.4.

ERROR:

```
error: Optional(Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record" UserInfo={NSUnderlyingError=0x283f0b780 {Error Domain=NSOSStatusErrorDomain Code=-16409 "(null)"}, NSLocalizedRecoverySuggestion=Try recording again., AVErrorRecordingFailureDomainKey=3, NSLocalizedDescription=Cannot Record})
```

My code:

```swift
import UIKit
import AVFoundation

final class CedulaScanningVC: UIViewController {
    var captureSession: AVCaptureSession!
    var stillImageOutput: AVCapturePhotoOutput!
    var videoPreviewLayer: AVCaptureVideoPreviewLayer!
    var delegate: ScanCedulaDelegate?

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        self.captureSession.stopRunning()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setupCamera()
    }

    // MARK: - Configure camera

    func setupCamera() {
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = .medium
        guard let backCamera = AVCaptureDevice.default(for: .video) else {
            print("Unable to access back camera!")
            return
        }
        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            stillImageOutput = AVCapturePhotoOutput()
            if captureSession.canAddInput(input) && captureSession.canAddOutput(stillImageOutput) {
                captureSession.addInput(input)
                captureSession.addOutput(stillImageOutput)
                setupLivePreview()
            }
        } catch {
            print("Error: unable to initialize back camera: \(error.localizedDescription)")
        }
    }

    func setupLivePreview() {
        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        videoPreviewLayer.videoGravity = .resizeAspectFill
        videoPreviewLayer.connection?.videoOrientation = .portrait
        view.layer.addSublayer(videoPreviewLayer)
        // Start the session off the main thread, then size the preview layer.
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            self?.captureSession.startRunning()
            DispatchQueue.main.async {
                self?.videoPreviewLayer.frame = self?.view.bounds ?? .zero
            }
        }
    }

    func failed() {
        let ac = UIAlertController(title: "Scanning not supported",
                                   message: "Your device does not support scanning a code from an item. Please use a device with a camera.",
                                   preferredStyle: .alert)
        ac.addAction(UIAlertAction(title: "OK", style: .default))
        present(ac, animated: true)
        captureSession = nil
    }

    // MARK: - Actions

    func cameraButtonPressed() {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        stillImageOutput.capturePhoto(with: settings, delegate: self)
    }
}

extension CedulaScanningVC: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        print("error: \(String(describing: error))")
        captureSession.stopRunning()
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in
            guard let self = self else { return }
            guard let imageData = photo.fileDataRepresentation() else {
                print("No image captured")
                return
            }
            let image = UIImage(data: imageData)
            self.delegate?.capturedImage(image: image)
        }
    }
}
```

I don't know what I am doing wrong.
0
0
659
Jan ’24
Code=-11803 "Cannot Record" Error while capturing photo from AVCaptureSession ?
Hi Everyone need your help . I am working on an application where I am capturing photo from the back camera using AVCaptureSession. It is working fine with the devices running iOS17+ but I am facing an error on device iPhone X running iOS 16.7.4 ERROR: error: Optional(Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record" UserInfo={NSUnderlyingError=0x283f0b780 {Error Domain=NSOSStatusErrorDomain Code=-16409 "(null)"}, NSLocalizedRecoverySuggestion=Try recording again., AVErrorRecordingFailureDomainKey=3, NSLocalizedDescription=Cannot Record}) Here is my Code: `final class CedulaScanningVC: UIViewController { var captureSession: AVCaptureSession! var stillImageOutput: AVCapturePhotoOutput! var videoPreviewLayer: AVCaptureVideoPreviewLayer! var delegate: ScanCedulaDelegate? override func viewDidLoad() { super.viewDidLoad() } override func viewWillAppear(_ animated: Bool) { super.viewWillAppear(animated) } override func viewWillDisappear(_ animated: Bool) { super.viewWillDisappear(animated) self.captureSession.stopRunning() } override func viewDidAppear(_ animated: Bool) { super.viewDidAppear(animated) setupCamera() } // MARK: - Configure Camera func setupCamera() { captureSession = AVCaptureSession() captureSession.sessionPreset = .medium guard let backCamera = AVCaptureDevice.default(for: AVMediaType.video) else { print("Unable to access back camera!") return } let input: AVCaptureDeviceInput do { input = try AVCaptureDeviceInput(device: backCamera) //Step 9 stillImageOutput = AVCapturePhotoOutput() if captureSession.canAddInput(input) && captureSession.canAddOutput(stillImageOutput) { captureSession.addInput(input) captureSession.addOutput(stillImageOutput) setupLivePreview() } } catch let error { print("Error Unable to initialize back camera: \(error.localizedDescription)") } } func setupLivePreview() { videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession) videoPreviewLayer.videoGravity = .resizeAspectFill videoPreviewLayer.connection?.videoOrientation = .portrait self.view.layer.addSublayer(videoPreviewLayer) //Step12 DispatchQueue.global(qos: .userInitiated).async { [weak self] in self?.captureSession.startRunning() //Step 13 DispatchQueue.main.async { self?.videoPreviewLayer.frame = self?.view.bounds ?? .zero } } } func failed() { let ac = UIAlertController(title: "Scanning not supported", message: "Your device does not support scanning a code from an item. Please use a device with a camera.", preferredStyle: .alert) ac.addAction(UIAlertAction(title: "OK", style: .default)) present(ac, animated: true) captureSession = nil } // MARK: - actions func cameraButtonPressed() { let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg]) stillImageOutput.capturePhoto(with: settings, delegate: self) } } extension CedulaScanningVC: AVCapturePhotoCaptureDelegate { func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) { print("error: \(error)") captureSession.stopRunning() DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in guard let self = self else {return} guard let imageData = photo.fileDataRepresentation() else { print("NO image captured") return } let image = UIImage(data: imageData) self.delegate?.capturedImage(image: image) } } }` I don't know what am doing wrong ?
0
0
824
Jan ’24
CMIO CameraExtension installation is not ATOMIC
If I perform these tasks in random order, my CMIO CameraExtension gets into an unstable state:
1. Copy MyApp.app under /Applications or /Applications/MyAppGroup/
2. Install from MyApp by sending OSSystemExtensionRequest.activationRequest
3. Check the install state with the command "systemextensionsctl list"
4. Uninstall from MyApp by sending OSSystemExtensionRequest.deactivationRequest
5. Remove /Applications/MyAppGroup/ via the command line and the Finder
6. Remove /Applications/MyApp.app via the command line and the Finder
7. Kill MyApp.app during an activationRequest
Once my CMIO CameraExtension is in this unstable state, it is impossible to remove it the normal way: "systemextensionsctl list" still shows my extension as activated, removal via the API fails with code=4, and removing the MyApp.app file does not remove the CameraExtension. The only way to remove the CameraExtension is to boot macOS into recovery mode, disable SIP, and run "systemextensionsctl uninstall". The Audio HAL extension, by contrast, is file-based and atomic: I can check whether it exists with the "ls" command and remove it with "rm -rf", and I have never seen it get into an unstable state.
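For context, the activation call in step 2 above is roughly this sketch ("com.example.myapp.cameraex" is a placeholder bundle identifier):

```swift
import SystemExtensions

// Ask the system to activate the camera extension embedded in the app bundle.
let request = OSSystemExtensionRequest.activationRequest(
    forExtensionWithIdentifier: "com.example.myapp.cameraex",  // placeholder
    queue: .main
)
// request.delegate = self  // an OSSystemExtensionRequestDelegate receives the result
OSSystemExtensionManager.shared.submitRequest(request)
```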
1
0
594
Jan ’24
CameraExtension and Uninstaller
I need to write a macOS app, a CameraExtension (CMIO), and an uninstaller app. The bundle IDs are like this: App: com.my.app; CameraExtension: com.my.app.cameraex; Uninstaller app: com.my.app.unisntaller. My app can activate the CameraExtension via OSSystemExtensionRequest.activationRequest, but the uninstaller cannot deactivate the CameraExtension. I get this error: Error Domain=OSSystemExtensionErrorDomain Code=4 "Extension not found in App bundle: perhaps App is not validly structured" I set up the App Group and added the System Extension capability and provisioning for the uninstaller. I guess "com.my.app.unisntaller" cannot deactivate "com.my.app.cameraex". What kind of bundle ID should I use for my uninstaller? Is a separate app and uninstaller the correct approach for a CameraExtension? My manager has asked me to provide an easy method for removing all modules.
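A sketch of the failing call, for reference (the error text suggests the system looks for the extension inside the requesting app's own bundle, which the separate uninstaller does not contain; that reading is an assumption):

```swift
import SystemExtensions

// The uninstaller's deactivation request; this fails with code=4, presumably
// because com.my.app.cameraex is embedded in MyApp.app, not in the uninstaller.
let request = OSSystemExtensionRequest.deactivationRequest(
    forExtensionWithIdentifier: "com.my.app.cameraex",
    queue: .main
)
OSSystemExtensionManager.shared.submitRequest(request)
```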
4
0
964
Jan ’24
WKWebView doesn't upload photo from camera
I have a WKWebView from which the camera is triggered: a photo is taken and should be uploaded to the web page. The problem is that when the photo is taken, nothing happens on the page; the camera just closes, and the button in the web view that opens the camera stops responding until you relaunch the application. Permission to use the camera has been added.
If you open the page through Safari, everything works, and Android also behaves as expected, but in the WKWebView the photo does not upload. Maybe something new has appeared in WKWebView that needs to be enabled or granted permission? I tried allowing inline media playback (allowsInlineMediaPlayback) and arbitrary loads (NSAllowsArbitraryLoads).
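For reference, a sketch of the configuration flags mentioned (these affect media playback rather than file-input capture, so this reflects what was tried, not a confirmed fix):

```swift
import WebKit

// Build the web view with inline media playback allowed; by default iOS may
// force fullscreen playback, which some in-page capture flows dislike.
let config = WKWebViewConfiguration()
config.allowsInlineMediaPlayback = true
config.mediaTypesRequiringUserActionForPlayback = []
let webView = WKWebView(frame: .zero, configuration: config)
```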
1
0
739
Dec ’23
AVCaptureDevice.FocusMode cannot be set in iPhone15
We are developing an app which uses AVCaptureSession. Here is a part of my code:

```swift
if context.config.basic_settings.auto_focus == 1 {
    // autofocus
    device.focusMode = AVCaptureDevice.FocusMode.continuousAutoFocus
    completion()
} else {
    device.focusMode = AVCaptureDevice.FocusMode.locked
    var _lenspos = Float(context.config.basic_settings.lens_position) ?? 0.5
    if _lenspos > 1.0 { _lenspos = 1.0 }
    if _lenspos < 0.0 { _lenspos = 0.0 }
    device.setFocusModeLocked(lensPosition: _lenspos, completionHandler: { (time) in
        completion()
    })
}
```

This code successfully sets the focus mode to autofocus or manual focus, and also sets the lens position perfectly, if we use an iPhone 13 or iPhone 14. But if we use an iPhone 15, this code can set neither the focus mode nor the lens position. We tried several different iPhone 15 devices, and the result was always the same (it fails to set focus), while different iPhone 13 and iPhone 14 devices all set focus correctly every time. Is this a bug in the iPhone 15, or is there anything I can do about the issue?
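One thing worth ruling out (a hedged sketch, since the snippet above doesn't show it): focus changes must happen between lockForConfiguration() and unlockForConfiguration(), and support should be checked per device:

```swift
import AVFoundation

// Guarded focus configuration; failures throw or are logged instead of
// silently not applying.
func setAutoFocus(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        if device.isFocusModeSupported(.continuousAutoFocus) {
            device.focusMode = .continuousAutoFocus
        } else {
            print("continuousAutoFocus not supported on \(device.localizedName)")
        }
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}
```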
2
0
798
Jan ’24