Discuss using the camera on Apple devices.

Posts under Camera tag

182 Posts
AVCaptureMovieFileOutput DOES NOT support spatial video capture on iPhone 15 Pro with iOS 18.1?
Hi everyone, I am having trouble implementing spatial video recording to files by following the WWDC24 video Build compelling spatial photo and video experiences. Specifically, the isSpatialVideoCaptureSupported flag of AVCaptureMovieFileOutput is FALSE when the code is tested on both my physical iPhone 15 Pro (iOS 18.1) and the simulator (iOS 18.0). This is the code I am running:

```swift
let movieFileOutput = AVCaptureMovieFileOutput()
print("movieCapture output isSpatialVideoCaptureSupported: \(movieFileOutput.isSpatialVideoCaptureSupported)")
```

However, one of the formats of AVCaptureDevice reports TRUE for its isSpatialVideoCaptureSupported flag:

```swift
for format in currentDevice.formats {
    if format.isSpatialVideoCaptureSupported {
        print("isSpatialVideoCaptureSupported is true")
        break
    }
}
```

I am totally confused now: why DOES the camera device support spatial mode while the movie file output DOES NOT? Can someone please help? Really appreciate it!!

Here is my testing environment:

- iPhone 15 Pro, iOS 18.1 (US version)
- Xcode 16.0 beta (16A5171c)
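A hedged configuration sketch, not a confirmed answer: the output-level flag may only be meaningful once the output is attached to a session whose input device has a spatial-capable active format. The ordering below, the dual-wide device choice, and the error handling are assumptions, not from the post:

```swift
import AVFoundation

// Sketch: query the output only after it joins a fully configured session.
func configureSpatialCapture() throws -> AVCaptureMovieFileOutput? {
    let session = AVCaptureSession()
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back) else {
        return nil
    }
    // Make a spatial-capable format the active one, if the device offers any.
    if let spatialFormat = device.formats.first(where: { $0.isSpatialVideoCaptureSupported }) {
        try device.lockForConfiguration()
        device.activeFormat = spatialFormat
        device.unlockForConfiguration()
    }
    let input = try AVCaptureDeviceInput(device: device)
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let output = AVCaptureMovieFileOutput()
    guard session.canAddOutput(output) else { return nil }
    session.addOutput(output)

    // Check support only now, with the session configured (assumption).
    if output.isSpatialVideoCaptureSupported {
        output.isSpatialVideoCaptureEnabled = true
    }
    return output
}
```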
Replies: 1 · Boosts: 0 · Views: 337 · Activity: Aug ’24
Camera Extension: Video Freezes When Previewing With QuickTime Player
I am developing a camera extension as described in Creating a camera extension with Core Media I/O. To ensure this wasn't some issue with my code, I reverted to the sample code added when you choose File > New > Target > Camera Extension in Xcode; in other words, I am using the example code provided by Apple. I am able to install the camera extension and see it in QuickTime Player, where I choose it as the video input. In QuickTime Player I see the white line generated by the sample code moving up and down. For some period of time it works, but eventually the video in QuickTime Player freezes. The really weird thing is that if I add some NSLog() statements at the point in the code where it sends the newly created sample:

```objc
[self->_streamSource.stream sendSampleBuffer:sbuf
                               discontinuity:CMIOExtensionStreamDiscontinuityFlagNone
                      hostTimeInNanoseconds:(uint64_t)(CMTimeGetSeconds(timingInfo.presentationTimeStamp) * NSEC_PER_SEC)];
```

the samples are still being generated and sent to the stream, but apparently QuickTime Player has decided to stop consuming them. I thought that setting the discontinuity parameter to CMIOExtensionStreamDiscontinuityFlagTime, or to CMIOExtensionStreamDiscontinuityFlagSampleDropped when the delta time since the last sample was off by a tiny bit, might help, but this did not improve the situation. Finally, could this have something to do with frequently installing and uninstalling my camera extension as part of the debugging and testing process? Thanks in advance for any advice you might have!
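For reference, a hedged Swift sketch of the delta-based discontinuity idea the poster describes. The types come from CoreMediaIO, but the class, the 30 fps nominal rate, and the 10% threshold are illustrative assumptions:

```swift
import CoreMedia
import CoreMediaIO

// Sketch: flag a time discontinuity when the inter-frame gap drifts
// past ~10% of the nominal frame duration.
final class FrameSender {
    private var lastPTS: CMTime?
    private let frameDuration = CMTime(value: 1, timescale: 30) // nominal 30 fps

    func send(_ buffer: CMSampleBuffer, on stream: CMIOExtensionStream) {
        let pts = CMSampleBufferGetPresentationTimeStamp(buffer)
        var flags: CMIOExtensionStream.DiscontinuityFlags = []
        if let last = lastPTS {
            let delta = CMTimeGetSeconds(CMTimeSubtract(pts, last))
            let nominal = CMTimeGetSeconds(frameDuration)
            if abs(delta - nominal) > nominal * 0.1 {
                flags = .time
            }
        }
        lastPTS = pts
        stream.send(buffer,
                    discontinuity: flags,
                    hostTimeInNanoseconds: UInt64(CMTimeGetSeconds(pts) * Double(NSEC_PER_SEC)))
    }
}
```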
Replies: 0 · Boosts: 0 · Views: 433 · Activity: Aug ’24
Center Stage control mode not working on iPad front camera with the .photo session preset
I am working on an iPad app that uses the front camera. The camera logic is implemented with the AVFoundation framework. During a session I use Center Stage mode to keep the face centered in the front camera. Center Stage works fine in all cases except when I set the session preset to .photo. The high, medium, low, cif352x288, vga640x480, hd1280x720, hd1920x1080, iFrame960x540, iFrame1280x720, and inputPriority presets all work; only the photo preset does not. Can you explain why this happens, or is it perhaps an unobvious bug? Code snippet:

```swift
final class CameraManager {

    // MARK: - Properties
    private let captureSession: AVCaptureSession
    private let photoOutput: AVCapturePhotoOutput
    private let previewLayer: AVCaptureVideoPreviewLayer

    // MARK: - Init
    init() {
        captureSession = AVCaptureSession()
        photoOutput = AVCapturePhotoOutput()
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    }

    // MARK: - Camera and preview layer setup
    func setupPreviewLayerFrame(view: UIView) {
        previewLayer.frame = view.frame
        view.layer.insertSublayer(previewLayer, at: 0)
        setupCamera()
    }

    private func setupCamera() {
        guard let videoCaptureDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                               for: .video,
                                                               position: .front) else { return }
        AVCaptureDevice.centerStageControlMode = .app
        AVCaptureDevice.isCenterStageEnabled = true
        do {
            let input = try AVCaptureDeviceInput(device: videoCaptureDevice)
            captureSession.addInput(input)
            captureSession.addOutput(photoOutput)
            // Every listed preset works except .photo.
            captureSession.sessionPreset = .photo
            DispatchQueue.global(qos: .userInteractive).async {
                self.captureSession.startRunning()
            }
        } catch {
            print("Error setting up camera: \(error.localizedDescription)")
        }
    }
}
```
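A hedged diagnostic sketch, not a confirmed explanation: session presets select the device's active format, and Center Stage only operates when that format reports support, so the .photo preset may be picking a format without it. Checking the active format after setting the preset would reveal this:

```swift
import AVFoundation

// Sketch: log whether the format chosen by the .photo preset supports Center Stage.
func logCenterStageSupport(for device: AVCaptureDevice, in session: AVCaptureSession) {
    session.sessionPreset = .photo
    // isCenterStageSupported is a property of AVCaptureDevice.Format.
    print("activeFormat supports Center Stage:",
          device.activeFormat.isCenterStageSupported)
}
```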
Replies: 0 · Boosts: 0 · Views: 432 · Activity: Aug ’24
UIImagePickerController inside a SwiftUI view dismisses the view rather than firing the callback
Hello, when using a UIImagePickerController with the .camera configuration, I'm facing an issue where the delegate function

```swift
imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any])
```

does not fire; instead, UIKit internals dismiss the parent view of the UIImagePickerController. I'm showing the picker controller through a UIViewControllerRepresentable. It does not always occur, however, and the behavior is very flaky: sometimes the delegate fires when pressing the button, sometimes it does not. When I set a breakpoint on the dismiss function and press the "Use Photo" button, it is UIKit internals dismissing the view, not my own code.
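For context, a minimal sketch of the wrapper pattern the post describes; the CameraPicker name and callback are illustrative, not from the post. The coordinator must be retained as the delegate for the callback to fire:

```swift
import SwiftUI
import UIKit

struct CameraPicker: UIViewControllerRepresentable {
    var onImage: (UIImage) -> Void

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = context.coordinator // coordinator is kept alive by SwiftUI
        return picker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

    final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
        let parent: CameraPicker
        init(_ parent: CameraPicker) { self.parent = parent }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            if let image = info[.originalImage] as? UIImage {
                parent.onImage(image)
            }
            picker.dismiss(animated: true)
        }
    }
}
```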
Replies: 1 · Boosts: 0 · Views: 404 · Activity: Aug ’24
Detect when main app launched by LockedCameraCapture for permission
I have a LockedCameraCapture extension working well; however, there is one situation I cannot find a solution to. If the user has not yet granted camera access permission, then the main app is launched rather than the LockedCameraCapture extension. I cannot find a mechanism by which my main app can detect that this was the reason for the launch and thereby request permission. When the button is pressed from Control Center without permission, the app runs and the CameraCaptureIntent is called, so I can prompt the user from there. However, as best I can tell, the CameraCaptureIntent is not called when launching from a locked Lock Screen; the app is simply opened. My app has a variety of functions, most of which do not involve the camera, so I cannot just always prompt the user for camera access on launch. Is there any mechanism by which my main app can detect that it was launched for this reason, so it could ask for permission? Thank you!
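One hedged possibility, worth verifying against the current LockedCameraCapture documentation: the system may hand the main app an NSUserActivity of type NSUserActivityTypeLockedCameraCapture when it launches in place of the extension. A sketch under that assumption (ContentView is a placeholder):

```swift
import SwiftUI
import AVFoundation
import LockedCameraCapture

@main
struct MainApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView() // placeholder root view
                .onContinueUserActivity(NSUserActivityTypeLockedCameraCapture) { _ in
                    // Assumed launch path: the capture extension couldn't run
                    // (e.g., no camera permission), so prompt for access here.
                    AVCaptureDevice.requestAccess(for: .video) { _ in }
                }
        }
    }
}
```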
Replies: 4 · Boosts: 1 · Views: 517 · Activity: Sep ’24
GameKit Access Point causes camera background image in ARKit to be black in iOS 18 beta only
I have an AR game using ARKit with SceneKit that works just fine on iOS 17. In the iOS 18 betas, the AR background image shows black instead of showing the real world. As a result there's no tracking, and obviously the whole game is useless. I narrowed down the issue to showing the Game Center access point. My app has ViewController 1 (VC1) showing the main menu, and that's where I want to show the GC access point. From there you open VC2, which shows a list of levels. Selecting any level opens VC3, which has the ARScene. The following is the code I use to start Game Center in VC1:

```swift
GKLocalPlayer.local.authenticateHandler = { gcAuthVC, error in
    let isGameCenterReady = (gcAuthVC == nil) && (error == nil)
    if let viewController = gcAuthVC {
        self.present(viewController, animated: true, completion: nil)
    }
    if error != nil {
        print(error?.localizedDescription ?? "")
    }
    if isGameCenterReady {
        GKAccessPoint.shared.location = .topLeading
        GKAccessPoint.shared.showHighlights = true
        GKAccessPoint.shared.isActive = true
    }
}
```

When switching to VC2 I run GKAccessPoint.shared.isActive = false so that the access point no longer shows in any of the following VCs. I tried running it in VC1, VC2, and again in VC3; it doesn't change anything. Once I reach VC3, the background is black. If in VC1 I don't run GKAccessPoint.shared.isActive = true, so I never activate the access point, the behavior is as follows:

- If I wait until the Game Center login animation completes and closes on its own and then proceed to VC2 and VC3, the camera image shows correctly.
- If I quickly move to VC2 before the Game Center login animation has completed, so my code closes it by setting active = false, and then I continue to VC3, I see the black background problem.

So it does look like activating the access point and then deactivating it causes the issue. By the way, if I activate the access point and leave it on in all VCs, the same black background issue persists. Other than that, when I'm in VC3 with the black background and I switch to another app (so my game moves to the background), when it returns to the foreground the camera suddenly shows the real world correctly! I tried to manually reset the AR session by pausing and restarting it, but that didn't change anything. Also, when I check with the debugger, it looks like the app doesn't run the session start code when it comes back to the foreground. But something does seem to reset itself, so I wonder what that is. Maybe I could trigger the same thing manually in my code?

I repeat that everything works just fine on iOS 17 and below. This problem only started with the iOS 18 beta (currently beta 5, but it appeared in some of the previous betas as well). So could this be a bug in iOS 18? As a workaround I could check the iOS version and, if it's iOS 18, not activate the access point, hoping that the user won't jump to VC2 too quickly, and show my own button to open Game Center. But I'd rather give users the full experience with their own avatar and the highlights showing up. Plus, certainly some users will move quickly to VC2, and that will be an awful experience. Any help would be greatly appreciated. Thanks!
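For reference, since the post mentions pausing and restarting the session, this is a minimal sketch of a full session reset, assuming an ARSCNView-based setup (the configuration type is an assumption). The poster reports this did not help, so it is included only to make the attempted workaround concrete:

```swift
import ARKit

// Sketch: mimic the background/foreground cycle by pausing and re-running
// the session with full reset options.
func resetARSession(for sceneView: ARSCNView) {
    sceneView.session.pause()
    let config = ARWorldTrackingConfiguration()
    sceneView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```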
Replies: 2 · Boosts: 0 · Views: 491 · Activity: Aug ’24
There’s a pink film from the front camera on my iPhone 15 Pro Max
Hello. I have two iPhone 15 Pro Max devices, and both have the same problem (I've seen others here in the community, and some friends have it too): the front camera has a pink shade. It's a shame, because I bought this phone for 1,500 euros and almost a month has passed without Apple fixing it. On another post someone blamed a cheap protector, but guess what: I don't have any protection on it, unless they mean the screen from Apple is a cheap or defective product (I checked my serial number and it's 100% new). So I need an answer, because this problem is 100% software, and it's a shame that I can't take selfies anymore (it's not only at night). I posted to the Apple Support Community before and it got deleted because I said one phone is on the iOS 18 beta, but the other phone has the latest official update and has the same problem.
Replies: 1 · Boosts: 0 · Views: 527 · Activity: Aug ’24
Can we use the ARKit CameraFrameProvider API for prototyping?
It's my understanding that to use CameraFrameProvider, which provides access to the Apple Vision Pro front-facing camera feed, the enterprise main camera access entitlement "com.apple.developer.arkit.main-camera-access.allow" is required. Is there a method to prototype apps that use the CameraFrameProvider on an Apple Vision Pro with developer mode enabled, without having the "com.apple.developer.arkit.main-camera-access.allow" entitlement?
Replies: 1 · Boosts: 0 · Views: 362 · Activity: Jul ’24
Camera app and gallery app are laggy!!!! iPhone 15
It's so laggy that an iPhone 11 is more responsive than this iPhone 15. That's a nightmare! After I take a picture, I have to wait 20 seconds to view it or swipe to it. The iPhone has almost 70 GB of free space, and the gallery app and camera app are so laggy that I am shocked and ****** at the same time. On iOS 17, the same lag. On the iOS 18 beta, I just checked if things changed: nope, the same laggy sh...
Replies: 0 · Boosts: 0 · Views: 313 · Activity: Jul ’24
Camera issues on iPhone 14 Pro Max after iOS 18 beta 3
Four days after installing iOS 18 beta 3, my iPhone no longer has macro control, 0.5x zoom, and other Pro-model camera features. I've tried restarting my phone, but nothing changed. Settings keeps saying that the phone has an unknown part in the camera or that it's not genuine. I bought it brand new from T-Mobile last year. I need help determining whether this is just a beta issue or actual physical damage.
Replies: 3 · Boosts: 2 · Views: 1.2k · Activity: Jul ’24
Minimum required device capabilities for iPhones that have a TrueDepth camera?
Hello, I am trying to submit my app to the App Store, and I want to make sure that my app can only be installed on iPhones with a TrueDepth camera. I have tried including "iPhone / iPad Minimum Performance A12" in the required device capabilities in Info.plist, but it seems not to work: I can still open my app on a phone that does not have a TrueDepth camera. Is there a way of setting the TrueDepth camera as a minimum requirement through Info.plist, or can I also hard-code the check in my app? Your assistance is greatly appreciated.
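A hedged runtime fallback for the hard-coded check mentioned above (as far as I know there is no UIRequiredDeviceCapabilities key that targets the TrueDepth camera specifically, so treat that as an assumption):

```swift
import AVFoundation

// Sketch: detect the TrueDepth camera at runtime via a discovery session.
func hasTrueDepthCamera() -> Bool {
    !AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInTrueDepthCamera],
        mediaType: .video,
        position: .front
    ).devices.isEmpty
}
```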
Replies: 0 · Boosts: 0 · Views: 308 · Activity: Jul ’24
Camera is not detected when using the ImageCaptureCore framework
Hi, currently my app uses the ImageCaptureCore framework to work with DSLR cameras. But when I tested it on iOS 18, it turns out my camera can no longer connect to the iPhone over a wired connection. It seems some developers have run into the same problem:

- https://forums.developer.apple.com/forums/thread/756960
- https://stackoverflow.com/questions/78618886/icdevicebrowser-fails-to-find-any-devices-after-ios-18-update

It's reproducible in several apps that are expected to use the ImageCaptureCore framework. I'd like to clarify: Is this an iOS 18 bug? Does Apple have any plan to remove wired connection support from the ImageCaptureCore framework? Thank you.
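For context, a minimal sketch of the kind of device-discovery code affected by this report; the class name and logging are illustrative, not from the post:

```swift
import ImageCaptureCore

// Sketch: browse for attached cameras with ICDeviceBrowser.
final class CameraBrowser: NSObject, ICDeviceBrowserDelegate {
    private let browser = ICDeviceBrowser()

    func start() {
        browser.delegate = self
        browser.start() // On iOS 18 this reportedly no longer finds wired cameras.
    }

    func deviceBrowser(_ browser: ICDeviceBrowser, didAdd device: ICDevice, moreComing: Bool) {
        print("Found device: \(device.name ?? "unknown")")
    }

    func deviceBrowser(_ browser: ICDeviceBrowser, didRemove device: ICDevice, moreGoing: Bool) {
        print("Removed device: \(device.name ?? "unknown")")
    }
}
```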
Replies: 0 · Boosts: 0 · Views: 353 · Activity: Jul ’24
I referred to the Enhanced Sensor Access code from WWDC24 to display the main camera of Vision Pro in the application interface, but it is not displaying
This is my code:

```swift
import Foundation
import ARKit
import SwiftUI

class CameraViewModel: ObservableObject {
    private var arKitSession = ARKitSession()
    @Published var capturedImage: UIImage?
    private var pixelBuffer: CVPixelBuffer?
    private var cameraAccessAuthorizationStatus = ARKitSession.AuthorizationStatus.notDetermined

    func startSession() {
        guard CameraFrameProvider.isSupported else {
            print("Device does not support main camera")
            return
        }
        Task {
            await requestCameraAccess()
            guard cameraAccessAuthorizationStatus == .allowed else {
                print("User did not authorize camera access")
                return
            }
            let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions: [.left])
            let cameraFrameProvider = CameraFrameProvider()
            print("Requesting camera authorization...")
            let authorizationResult = await arKitSession.requestAuthorization(for: [.cameraAccess])
            cameraAccessAuthorizationStatus = authorizationResult[.cameraAccess] ?? .notDetermined
            guard cameraAccessAuthorizationStatus == .allowed else {
                print("Camera data access authorization failed")
                return
            }
            print("Camera authorization successful, starting ARKit session...")
            do {
                try await arKitSession.run([cameraFrameProvider])
                print("ARKit session is running")
                guard let cameraFrameUpdates = cameraFrameProvider.cameraFrameUpdates(for: formats[0]) else {
                    print("Unable to get camera frame updates")
                    return
                }
                print("Successfully got camera frame updates")
                for await cameraFrame in cameraFrameUpdates {
                    guard let mainCameraSample = cameraFrame.sample(for: .left) else {
                        print("Unable to get main camera sample")
                        continue
                    }
                    print("Successfully got main camera sample")
                    self.pixelBuffer = mainCameraSample.pixelBuffer
                }
                DispatchQueue.main.async {
                    self.capturedImage = self.convertToUIImage(pixelBuffer: self.pixelBuffer)
                    if self.capturedImage != nil {
                        print("Successfully captured and converted image")
                    } else {
                        print("Image conversion failed")
                    }
                }
            } catch {
                print("ARKit session failed to run: \(error)")
            }
        }
    }

    private func requestCameraAccess() async {
        let authorizationResult = await arKitSession.requestAuthorization(for: [.cameraAccess])
        cameraAccessAuthorizationStatus = authorizationResult[.cameraAccess] ?? .notDetermined
        if cameraAccessAuthorizationStatus == .allowed {
            print("User granted camera access")
        } else {
            print("User denied camera access")
        }
    }

    private func convertToUIImage(pixelBuffer: CVPixelBuffer?) -> UIImage? {
        guard let pixelBuffer = pixelBuffer else {
            print("Pixel buffer is nil")
            return nil
        }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
            return UIImage(cgImage: cgImage)
        }
        print("Unable to create CGImage")
        return nil
    }
}
```

This is my log:

```
User granted camera access
Requesting camera authorization...
Camera authorization successful, starting ARKit session...
ARKit session is running
Successfully got camera frame updates
void * _Nullable NSMapGet(NSMapTable * _Nonnull, const void * _Nullable): map table argument is NULL
```
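An editorial observation, offered as a hedged sketch rather than a confirmed fix: in the code above, the DispatchQueue.main.async block sits after the for await loop, and that loop never ends, so the conversion code is unreachable. A replacement loop body for startSession(), reusing the post's own convertToUIImage and capturedImage, might look roughly like this:

```swift
// Sketch: publish each frame as it arrives, instead of waiting for the
// never-ending sequence to finish.
for await cameraFrame in cameraFrameUpdates {
    guard let mainCameraSample = cameraFrame.sample(for: .left) else { continue }
    let buffer = mainCameraSample.pixelBuffer
    if let image = self.convertToUIImage(pixelBuffer: buffer) {
        await MainActor.run { self.capturedImage = image }
    }
}
```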
Replies: 0 · Boosts: 0 · Views: 352 · Activity: Jul ’24
Performant alternative to scaling a CIImage / PixelBuffer
Hey, I'm building a camera app where I am applying real-time effects to the viewfinder. One of those effects is a variable blur, so to improve performance I am scaling down the input image using CIFilter.lanczosScaleTransform(). This works fine and runs at 30 FPS, but when running the Metal profiler I can see that the scaling transforms use a lot of GPU time, almost as much as the variable blur. Is there a more efficient way to do this? The simplified chain is like this:

1. Scale down the viewFinder CVPixelBuffer (CIFilter.lanczosScaleTransform)
2. Scale up the depthMap CVPixelBuffer to match the viewFinder size (CIFilter.lanczosScaleTransform)
3. Create CIImages from both CVPixelBuffers
4. Apply the variable depth blur (CIFilter.maskedVariableBlur)
5. Scale up the final image to the Metal view size (CIFilter.lanczosScaleTransform)
6. Render the CIImage to an MTKView using CIRenderDestination

From some research, I wonder if scaling the CVPixelBuffer using the Accelerate framework would be faster? Also, instead of scaling the final image, perhaps I could offload this to the Metal view? Any pointers greatly appreciated!
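A hedged sketch of the Accelerate idea raised above: scaling a BGRA CVPixelBuffer with vImage on the CPU before handing it to Core Image. The 32BGRA format assumption and minimal error handling are mine, and whether this actually beats the GPU path would need profiling:

```swift
import Accelerate
import CoreVideo

// Sketch: CPU high-quality resample between two pre-allocated pixel buffers.
// vImageScale_ARGB8888 is channel-order agnostic, so it also works for BGRA.
func downscale(_ src: CVPixelBuffer, to dst: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(src, .readOnly)
    CVPixelBufferLockBaseAddress(dst, [])
    defer {
        CVPixelBufferUnlockBaseAddress(src, .readOnly)
        CVPixelBufferUnlockBaseAddress(dst, [])
    }

    var srcBuffer = vImage_Buffer(
        data: CVPixelBufferGetBaseAddress(src),
        height: vImagePixelCount(CVPixelBufferGetHeight(src)),
        width: vImagePixelCount(CVPixelBufferGetWidth(src)),
        rowBytes: CVPixelBufferGetBytesPerRow(src))

    var dstBuffer = vImage_Buffer(
        data: CVPixelBufferGetBaseAddress(dst),
        height: vImagePixelCount(CVPixelBufferGetHeight(dst)),
        width: vImagePixelCount(CVPixelBufferGetWidth(dst)),
        rowBytes: CVPixelBufferGetBytesPerRow(dst))

    vImageScale_ARGB8888(&srcBuffer, &dstBuffer, nil, vImage_Flags(kvImageHighQualityResampling))
}
```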
Replies: 2 · Boosts: 0 · Views: 576 · Activity: Jul ’24
How does Final Cut Camera synchronize videos?
I have an application that enables recording video from multiple iPhones through an iPad. It uses Multipeer Connectivity for all the device communication. When the user presses record on the iPad, it sends a command to each device in parallel and they start capturing video. But since network latency varies, I cannot guarantee that the recording start and stop times are consistent among all the iPhones, and I need the frames to be exactly in sync. I tried using the system clock on each device for synchronizing the videos: if all the device system clocks were in sync within 3 ms (at 30 frames per second), it should be okay. But I tested, and the clocks vary quite a bit, by multiple seconds, so that won't work.

I ultimately solved the problem by having a countdown timer on the iPad. The user puts the iPad in view of each phone during the countdown, and later I use a Python script to cut all the videos where the countdown timer reaches 0. But that's more work for the end user and requires manual work on our end. With a little ML text recognition, this could get better.

Some people have suggested using a time server and syncing the clocks that way. I still haven't tried this out, and I'm not sure whether it's even possible to run an NTP server on an iPad, or whether the NTP resolution will be below 3 ms.

I tried out Final Cut Camera, and it has solved the synchronization problem: each frame is in sync. The phones don't start and stop at exactly the same time, and it accounts for this by adding black frames to the front and/or back of the videos to make up the differences. I've searched online and other people have the same problem. I'd love to know how Apple was able to solve the synchronization issue when recording video from multiple iPhones from an iPad over what I assume is Multipeer Connectivity.
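A hedged sketch of the NTP-style idea suggested above, run over the app's existing Multipeer link rather than a real time server; the message type and field names are illustrative, and this is not claimed to be Final Cut Camera's method:

```swift
import Foundation

// Sketch: exchange timestamps with each iPhone and estimate its clock offset.
struct ClockProbe: Codable {
    let clientSend: TimeInterval        // t0, stamped by the iPad when sending
    var serverReceive: TimeInterval = 0 // t1, stamped by the iPhone on receipt
    var serverSend: TimeInterval = 0    // t2, stamped by the iPhone on reply
}

// On receipt of the reply at time t3, the standard NTP offset estimate is:
//   offset = ((t1 - t0) + (t2 - t3)) / 2
func estimateOffset(probe: ClockProbe, clientReceive t3: TimeInterval) -> TimeInterval {
    ((probe.serverReceive - probe.clientSend) + (probe.serverSend - t3)) / 2
}
```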
Replies: 1 · Boosts: 0 · Views: 408 · Activity: Jul ’24
iPhone 14 & 15 Pro model camera focusing error / Unity app
Hi, I would like to ask your advice about our iOS app, which has a problem with the iPhone 14 Pro and iPhone 15 Pro model cameras focusing on close objects. The game is made with Unity, and the idea is that kids can scan random QR codes and barcodes to catch monsters. On the non-Pro iPhone models the camera focuses and reads the barcodes well, but on the new models with the triple back camera system our app does not focus up close, which makes barcode reading hard. Somehow it seems that the iPhone Pro models do not switch the camera lens to the close-focusing one in a third-party app the way they do when you use the Camera app itself. Do you have any advice on how to solve the focusing problem? I appreciate your help!
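A hedged suggestion sketch, with the caveat that a Unity app may not expose this directly: selecting a virtual device such as the triple camera lets the system switch to a lens that can focus up close, which the raw wide camera won't do on its own. The fallback chain below is an assumption:

```swift
import AVFoundation

// Sketch: prefer a virtual multi-lens device so the system can auto-switch
// to a close-focusing lens, falling back to simpler devices if unavailable.
func closeFocusCapableDevice() -> AVCaptureDevice? {
    AVCaptureDevice.default(.builtInTripleCamera, for: .video, position: .back)
        ?? AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back)
        ?? AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
}
```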
Replies: 0 · Boosts: 0 · Views: 478 · Activity: Jul ’24