
Unable to Get Result from DetectHorizonRequest - Result is nil
I am using Apple’s Vision framework with DetectHorizonRequest to detect the horizon in an image. Here is my code:

func processHorizonImage(_ ciImage: CIImage) async {
    let request = DetectHorizonRequest()
    do {
        let result = try await request.perform(on: ciImage)
        print(result)
    } catch {
        print(error)
    }
}

After calling the perform method, the result is nil. To rule out the obvious causes, I have verified the following:

- The input CIImage is valid and contains a visible horizon.
- No errors are being thrown.
- The relevant frameworks are properly imported.

Given that my image contains a clear horizon, why am I still not getting any results? I would appreciate any help or suggestions to resolve this issue. Thank you for your support!

This is the image: [image attachment]
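For what it's worth, here is a minimal sketch that distinguishes the optional result from a failure. It assumes the Swift Vision API where perform(on:) returns an optional HorizonObservation rather than throwing when nothing is found:

import Vision
import CoreImage

// A minimal sketch, assuming perform(on:) returns HorizonObservation?:
// nil is not an error, it means the request completed without finding
// a horizon it could recognize.
func processHorizonImage(_ ciImage: CIImage) async {
    let request = DetectHorizonRequest()
    do {
        if let observation = try await request.perform(on: ciImage) {
            print("Horizon found:", observation)
        } else {
            print("Request succeeded, but no horizon was detected in this image.")
        }
    } catch {
        print("Request failed:", error)
    }
}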
Replies: 0 · Boosts: 0 · Views: 215 · Activity: Oct ’24
Seeking API for Advanced Auto Image Enhancement Similar to Photos App's Auto Feature
Hi everyone,

I've been working with the autoAdjustmentFilters API provided by Core Image, which returns filters such as CIHighlightShadowAdjust, CIVibrance, and CIToneCurve. However, I've noticed that the results differ significantly from the "Auto" enhancement feature in the Photos app. In the Photos app, the Auto function seems to adjust multiple parameters, such as contrast, exposure, white balance, highlights, and shadows, in a more advanced manner.

Is there an API or framework available that can replicate the more sophisticated "Auto" adjustments seen in the Photos app? Or would I need to manually combine filters (like CIExposureAdjust, CIWhitePointAdjust, etc.) to approximate this functionality?

Any insights or recommendations on how to achieve this would be greatly appreciated. Thank you!
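For reference, this is roughly the pattern I'm using to chain the returned filters (a minimal sketch; the wrapper function is mine, not from any specific API):

import CoreImage

// Apply Core Image's auto-adjustment filters to an image.
// autoAdjustmentFilters() inspects the image and returns preconfigured
// filters (e.g. CIHighlightShadowAdjust, CIVibrance, CIToneCurve).
func autoEnhanced(_ input: CIImage) -> CIImage {
    var image = input
    for filter in image.autoAdjustmentFilters() {
        filter.setValue(image, forKey: kCIInputImageKey)
        if let output = filter.outputImage {
            image = output
        }
    }
    return image
}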
Replies: 0 · Boosts: 0 · Views: 188 · Activity: Oct ’24
iPhone RAW photos appear very dark when imported into third-party software
I use the official API to output the RAW file. When I transfer it to my Mac, it looks very dark. But if I shoot with the built-in iOS Camera app, it is brighter.

This is how I save the ProRAW file to the photo library:

let creationRequest = PHAssetCreationRequest.forAsset()
creationRequest.addResource(with: .photo, data: photo.compressedData, options: nil)

// Save the RAW (DNG) file as an alternate resource for the Photos asset.
let options = PHAssetResourceCreationOptions()
// options.shouldMoveFile = true
creationRequest.addResource(with: .alternatePhoto, fileURL: photo.rawFileURL, options: options)
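For context, here is a minimal sketch of how the capture side of this flow might be configured (the photoOutput variable and the HEVC companion format are assumptions, not taken from the post). A DNG holds unrendered sensor data, so each importing app applies its own default rendering, which may be why the same file looks darker outside the Camera/Photos pipeline:

import AVFoundation

// A hedged sketch: request a ProRAW capture paired with a processed (HEVC)
// companion image. `photoOutput` is assumed to be an AVCapturePhotoOutput
// that already has isAppleProRAWEnabled set to true.
func makeProRAWSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // Pick the first pixel format that is an Apple ProRAW format.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isAppleProRAWPixelFormat($0)
    }) else { return nil }
    // The processed companion is the fully rendered image that the Camera app
    // shows; the DNG keeps the unrendered sensor data.
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat,
                                  processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc])
}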
Replies: 0 · Boosts: 0 · Views: 370 · Activity: Nov ’23
How to get ProRAW image output with a 1:1 or 16:9 aspect ratio
Now I use the AVFoundation framework to get the photo output, but the image aspect ratio is 4:3. However, the Camera app on the iPhone 13 Pro offers several aspect ratios (4:3, 16:9, and 1:1) when taking a ProRAW image. So how can I get a 1:1 or 16:9 aspect ratio ProRAW image?

After doing some research, I found that no matter which camera you use on the iPhone 11, 12, 13, 14, or 15 (or the Pro models), the resulting image is always 4:3; the 1:1 and 16:9 outputs come from cropping the 4:3 frame. If that is true, how can I crop the ProRAW file without any data loss?

My development environment:

- iPhone 13 Pro
- iOS 16.7
- Xcode 14.3.1

This is the session configuration code for the camera device:

session.beginConfiguration()

/*
 Do not create an AVCaptureMovieFileOutput when setting up the session because
 Live Photo is not supported when AVCaptureMovieFileOutput is added to the session.
*/
session.sessionPreset = .photo

// Add video input.
do {
    var defaultVideoDevice: AVCaptureDevice?
    if let backCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
        // If a rear dual camera is not available, default to the rear wide angle camera.
        defaultVideoDevice = backCameraDevice
    } else if let frontCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) {
        // If the rear wide angle camera isn't available, default to the front wide angle camera.
        defaultVideoDevice = frontCameraDevice
    }
    guard let videoDevice = defaultVideoDevice else {
        print("Default video device is unavailable.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
    let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
    if session.canAddInput(videoDeviceInput) {
        session.addInput(videoDeviceInput)
        self.videoDeviceInput = videoDeviceInput
    } else {
        print("Couldn't add video device input to the session.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
} catch {
    print("Couldn't create video device input: \(error)")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}

// Check the lens list.
let camerasOptions = videoDeviceDiscoverySession.devices
var availableCameras: [AVCaptureDevice.DeviceType] = []
if camerasOptions.isEmpty {
    print("no camera devices")
} else {
    for camera in camerasOptions {
        if camera.deviceType == .builtInUltraWideCamera ||
           camera.deviceType == .builtInWideAngleCamera ||
           camera.deviceType == .builtInTelephotoCamera {
            if !availableCameras.contains(camera.deviceType) {
                availableCameras.append(camera.deviceType)
            }
        }
    }
}
DispatchQueue.main.async {
    self.lensList = availableCameras
}

// Add the photo output.
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
    photoOutput.isHighResolutionCaptureEnabled = true
    photoOutput.maxPhotoQualityPrioritization = .quality
    print(photoOutput.isAppleProRAWSupported)
    // Use the Apple ProRAW format when the environment supports it.
    photoOutput.isAppleProRAWEnabled = photoOutput.isAppleProRAWSupported
    DispatchQueue.main.async {
        self.isSupportAppleProRaw = self.photoOutput.isAppleProRAWSupported
    }
} else {
    print("Could not add photo output to the session")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}

session.commitConfiguration()
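Since the 1:1 and 16:9 options appear to be crops of the 4:3 frame, here is a minimal sketch of a center crop on the developed image. This is a hypothetical helper of mine: it operates on a rendered CIImage, not on the DNG's raw sensor data, so it does not answer the lossless-DNG part of the question:

import CoreImage

// Center-crop a developed CIImage to a target aspect ratio (width / height),
// e.g. 1.0 for 1:1 or 16.0 / 9.0 for 16:9.
func centerCrop(_ image: CIImage, toAspectRatio ratio: CGFloat) -> CIImage {
    let extent = image.extent
    let currentRatio = extent.width / extent.height
    var cropRect = extent
    if currentRatio > ratio {
        // Image is too wide: trim the sides.
        cropRect.size.width = extent.height * ratio
        cropRect.origin.x += (extent.width - cropRect.width) / 2
    } else {
        // Image is too tall: trim the top and bottom.
        cropRect.size.height = extent.width / ratio
        cropRect.origin.y += (extent.height - cropRect.height) / 2
    }
    return image.cropped(to: cropRect)
}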
Replies: 1 · Boosts: 1 · Views: 688 · Activity: Oct ’23