Hello there!
I am trying to use PHPickerViewController to load videos, but I ran into a problem: I can load some videos, but not all.
I referred to the existing thread - https://developer.apple.com/forums/thread/652695 - but it doesn't work.
This is the code I use to present the PHPickerViewController:
var config = PHPickerConfiguration()
config.selectionLimit = 1
config.filter = .videos
config.preferredAssetRepresentationMode = .current
let picker = PHPickerViewController(configuration: config)
picker.delegate = self
present(picker, animated: true, completion: nil)
Below is the relevant implementation of the method: func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]):
picker.dismiss(animated: true, completion: nil)
for result in results {
    result.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { (url, error) in
        if let error = error {
            print(error)
            return
        }
        guard let url = url else { return }
        let fileName = "\(Date().timeIntervalSince1970).\(url.pathExtension)"
        let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
        try? FileManager.default.copyItem(at: url, to: newUrl)
        DispatchQueue.main.async {
            self.playVideo(newUrl)
        }
    }
}
Before my code prints the error (at the print(error) line above), Xcode printed three lines of errors:
[AXRuntimeCommon] Unknown client: TestPHPicker
[default] [ERROR] Could not create a bookmark: NSError: Cocoa 257 "The file couldn’t be opened because you don’t have permission to view it." }
Error copying file type public.movie. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.movie" UserInfo={NSLocalizedDescription=Cannot load representation of type public.movie, NSUnderlyingError=0x283a4a610 {Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x283a48b10 {Error Domain=PHAssetExportRequestErrorDomain Code=2 "(null)" UserInfo={NSUnderlyingError=0x283a4a550 {Error Domain=CloudPhotoLibraryErrorDomain Code=82 "Failed to download CPLResourceTypeOriginal" UserInfo=0x28219b300 (not displayed)}}}}}}
And this is the error printed by the print(error) line:
Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.movie" UserInfo={NSLocalizedDescription=Cannot load representation of type public.movie, NSUnderlyingError=0x283a4a610 {Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x283a48b10 {Error Domain=PHAssetExportRequestErrorDomain Code=2 "(null)" UserInfo={NSUnderlyingError=0x283a4a550 {Error Domain=CloudPhotoLibraryErrorDomain Code=82 "Failed to download CPLResourceTypeOriginal" UserInfo=0x28219b300 (not displayed)}}}}}}
Some videos load successfully, but for others I get this error, and I don't know why.
I am testing this on an iPhone X running iOS 14.0 (18A373), with Xcode 12.0 (12A7209).
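In case it is an iCloud download failure (the underlying error is CloudPhotoLibraryErrorDomain Code=82, "Failed to download CPLResourceTypeOriginal"), here is a fallback I am considering. It is only a sketch: it assumes the picker is created with PHPickerConfiguration(photoLibrary: .shared()) so that results carry an assetIdentifier, and that photo library access has been granted.

```swift
import Photos
import PhotosUI
import AVFoundation

// Sketch: load through PHImageManager with network access allowed,
// so Photos can download the iCloud original itself.
func loadVideoAsset(from result: PHPickerResult) {
    guard let identifier = result.assetIdentifier,
          let asset = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil).firstObject
    else { return }

    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true // permit the iCloud download

    PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, _ in
        // For locally materialized videos this is typically an AVURLAsset.
        if let urlAsset = avAsset as? AVURLAsset {
            print("video available at \(urlAsset.url)")
        }
    }
}
```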
Thanks for any help!
For PHPickerViewController, we know we can perform simple filtering with:
var config = PHPickerConfiguration()
config.filter = PHPickerFilter.images
But what if we only want to show JPG and PNG images, excluding GIF? This is because our app doesn't support GIF.
Is it possible to do so?
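As far as I know, PHPickerFilter has no per-format option, so one possible workaround (a sketch, not a confirmed approach) is to validate the selection afterwards and reject GIFs by their type identifier:

```swift
import PhotosUI
import UniformTypeIdentifiers

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true)
    for result in results {
        // Reject GIFs after selection, since the picker cannot exclude them up front.
        if result.itemProvider.hasItemConformingToTypeIdentifier(UTType.gif.identifier) {
            // Inform the user this format is unsupported.
            continue
        }
        // Load the JPG/PNG image as usual...
    }
}
```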
The sample code in the Apple documentation for PHCloudIdentifier does not compile in Xcode 13.2.1.
Can the interface for identifier conversion be clarified so that the answer values are more accessible/readable? The values are 'hidden' inside a Result enum.
It was difficult (for me) to rewrite the sample code because I made the mistake of interpreting the Result type as a tuple. Result is really an enum.
Using the Result type as the return from library.cloudIdentifierMappings(forLocalIdentifiers:) and .localIdentifierMappings(for:) puts the actual mapped identifiers inside the enum, where they need additional access via a .stringValue message or an evaluation of an element of the Result enum.
For others hitting the same compile issue, here is a working version of the sample code. This compiles in Xcode 13.2.1.
func localId2CloudId(localIdentifiers: [String]) -> [String] {
    var mappedIdentifiers = [String]()
    let library = PHPhotoLibrary.shared()
    let iCloudIDs = library.cloudIdentifierMappings(forLocalIdentifiers: localIdentifiers)
    for aCloudID in iCloudIDs {
        // Result is an enum .. not a tuple
        let cloudResult: Result<PHCloudIdentifier, Error> = aCloudID.value
        switch cloudResult {
        case .success(let success):
            let newValue = success.stringValue
            mappedIdentifiers.append(newValue)
        case .failure(let failure):
            // notify the user of the error
            print("No cloud identifier for \(aCloudID.key): \(failure)")
        }
    }
    return mappedIdentifiers
}
func cloudId2LocalId(assetCloudIdentifiers: [PHCloudIdentifier]) -> [String] {
    // patterned error handling per documentation
    var localIDs = [String]()
    let localIdentifiers: [PHCloudIdentifier: Result<String, Error>] =
        PHPhotoLibrary.shared().localIdentifierMappings(for: assetCloudIdentifiers)
    for cloudIdentifier in assetCloudIdentifiers {
        guard let identifierMapping = localIdentifiers[cloudIdentifier] else {
            print("Failed to find a mapping for \(cloudIdentifier).")
            continue
        }
        switch identifierMapping {
        case .success(let success):
            localIDs.append(success)
        case .failure(let failure):
            let thisError = failure as? PHPhotosError
            switch thisError?.code {
            case .identifierNotFound:
                // Skip the missing or deleted assets.
                print("Failed to find the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
            case .multipleIdentifiersFound:
                // Prompt the user to resolve the cloud identifier that matched multiple assets.
                print("Found multiple local identifiers for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
                // if let selectedLocalIdentifier = promptUserForPotentialReplacement(with: thisError.userInfo[PHLocalIdentifiersErrorKey]) {
                //     localIDs.append(selectedLocalIdentifier)
                // }
            default:
                print("Encountered an unexpected error looking up the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
            }
        }
    }
    return localIDs
}
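For completeness, a hypothetical call site for the two functions above; someLocalIdentifiers is a placeholder, and (as I understand it) both mapping calls require photo library authorization:

```swift
// Round trip: local identifiers -> cloud identifier strings -> local identifiers.
let cloudStrings = localId2CloudId(localIdentifiers: someLocalIdentifiers)
let cloudIDs = cloudStrings.map { PHCloudIdentifier(stringValue: $0) }
let localIDs = cloudId2LocalId(assetCloudIdentifiers: cloudIDs)
```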
I have received a lot of crash logs, but only on iOS 16.
The crash occurred when I called:
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:resultHandler]
Here is the crash log:
Exception Type: NSInternalInconsistencyException
ExtraInfo:
Code Type: arm64
OS Version: iPhone OS 16.0 (20A5328h)
Hardware Model: iPhone14,3
Launch Time: 2022-07-30 18:43:25
Date/Time: 2022-07-30 18:49:17
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason:Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"
Last Exception Backtrace:
0 CoreFoundation 0x00000001cf985dc4 0x1cf97c000 + 40388
1 libobjc.A.dylib 0x00000001c8ddfa68 0x1c8dc8000 + 96872
2 CoreData 0x00000001d56d2358 0x1d56cc000 + 25432
3 CoreData 0x00000001d56fa19c 0x1d56cc000 + 188828
4 CoreData 0x00000001d5755be4 0x1d56cc000 + 564196
5 CoreData 0x00000001d57b0508 0x1d56cc000 + 935176
6 PhotoLibraryServices 0x00000001df1783e0 0x1df0ed000 + 570336
7 Photos 0x00000001df8aa88c 0x1df85d000 + 317580
8 PhotoLibraryServices 0x00000001df291de0 0x1df0ed000 + 1723872
9 CoreData 0x00000001d574e518 0x1d56cc000 + 533784
10 libdispatch.dylib 0x00000001d51fc0fc 0x1d51f8000 + 16636
11 libdispatch.dylib 0x00000001d520b634 0x1d51f8000 + 79412
12 CoreData 0x00000001d574e0a0 0x1d56cc000 + 532640
13 PhotoLibraryServices 0x00000001df291d94 0x1df0ed000 + 1723796
14 PhotoLibraryServices 0x00000001df291434 0x1df0ed000 + 1721396
15 Photos 0x00000001df8a8380 0x1df85d000 + 308096
16 Photos 0x00000001df89d050 0x1df85d000 + 262224
17 Photos 0x00000001df87f62c 0x1df85d000 + 140844
18 Photos 0x00000001df87ee94 0x1df85d000 + 138900
19 Photos 0x00000001df87e594 0x1df85d000 + 136596
20 Photos 0x00000001df86b5c8 0x1df85d000 + 58824
21 Photos 0x00000001df86d938 0x1df85d000 + 67896
22 Photos 0x00000001dfa37a64 0x1df85d000 + 1944164
23 Photos 0x00000001dfa37d18 0x1df85d000 + 1944856
24 youavideo -[YouaImageManager requestImageDataForAsset:options:resultHandler:] (in youavideo) (YouaImageManager.m:0) 27
25 youavideo -[YouaAlbumTransDataController requstTransImageHandler:] (in youavideo) (YouaAlbumTransDataController.m:0) 27
26 youavideo -[YouaAlbumTransDataController requstTransWithHandler:] (in youavideo) (YouaAlbumTransDataController.m:77) 11
27 youavideo -[YouaUploadTransDataOperation startTrans] (in youavideo) (YouaUploadTransDataOperation.m:102) 19
28 Foundation 0x00000001c9e78038 0x1c9e3c000 + 245816
29 Foundation 0x00000001c9e7d704 0x1c9e3c000 + 268036
30 libdispatch.dylib 0x00000001d51fa5d4 0x1d51f8000 + 9684
31 libdispatch.dylib 0x00000001d51fc0fc 0x1d51f8000 + 16636
32 libdispatch.dylib 0x00000001d51ff58c 0x1d51f8000 + 30092
33 libdispatch.dylib 0x00000001d51febf4 0x1d51f8000 + 27636
34 libdispatch.dylib 0x00000001d520db2c 0x1d51f8000 + 88876
35 libdispatch.dylib 0x00000001d520e338 0x1d51f8000 + 90936
36 libsystem_pthread.dylib 0x00000002544b9dbc 0x2544b9000 + 3516
37 libsystem_pthread.dylib 0x00000002544b9b98 0x2544b9000 + 2968
I can't find the definition of error code 134093.
I don't know what is going wrong in iOS 16.
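For what it's worth, requestImageDataForAsset:options:resultHandler: has been deprecated since iOS 13, so one thing I may try (a sketch only, not a confirmed fix) is the replacement API, requestImageDataAndOrientation, to rule it out as a factor:

```swift
import Photos

func fetchImageData(for asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, dataUTI, orientation, info in
        if let error = info?[PHImageErrorKey] as? NSError {
            print("image data request failed: \(error)")
        }
    }
}
```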
Would anyone have a hint of why this could happen and how to resolve it?
Thanks very much!
The WWDC23 Platform State of the Union (at 0:30:15) mentioned that using the volume buttons to trigger the camera shutter is coming later this year.
Would anyone know when this will be available?
My app uses PHLivePhoto.request to generate live photos, but memory leaks occur if I use a custom targetSize.
PHLivePhoto.request(withResourceFileURLs: [imageUrl, videoUrl], placeholderImage: nil, targetSize: targetSize, contentMode: .aspectFit) {[weak self] (livePhoto, info) in
Changing targetSize to CGSizeZero resolved the problem:
PHLivePhoto.request(withResourceFileURLs: [imageUrl, videoUrl], placeholderImage: nil, targetSize: CGSizeZero, contentMode: .aspectFit) {[weak self] (livePhoto, info) in
Hi there,
I am building a camera application to be able to capture an image with the wide and ultra wide cameras simultaneously (or as close as possible) with the intrinsics and extrinsics for each camera also delivered.
We are able to achieve this with an AVCaptureMultiCamSession and AVCaptureVideoDataOutput, setting up the .builtInWideAngleCamera and .builtInUltraWideCamera manually. Doing this, we are able to enable the delivery of the intrinsics via the AVCaptureConnection of the cameras. Also, geometric distortion correction is enabled for the ultra camera (by default).
However, we are exploring whether it is possible to move the application over to the .builtInDualWideCamera with AVCapturePhotoOutput and AVCaptureSession, to simplify our application and get access to depth data. We are using the isVirtualDeviceConstituentPhotoDeliveryEnabled=true property to allow simultaneous capture. Functionally, everything works fine, except that when isGeometricDistortionCorrectionEnabled is not set to false, photoOutput.isCameraCalibrationDataDeliverySupported returns false.
From this thread and the docs, it appears that we cannot get the intrinsics when isGeometricDistortionCorrectionEnabled=true (only applicable to the ultra wide), unless we use a AVCaptureVideoDataOutput.
Is there any way to get access to the intrinsics for the wide and ultra while enabling geometric distortion correction for the ultra?
guard let captureDevice = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back) else {
    throw InitError.error("Could not find builtInDualWideCamera")
}
self.captureDevice = captureDevice
self.videoDeviceInput = try AVCaptureDeviceInput(device: captureDevice)
self.photoOutput = AVCapturePhotoOutput()
self.captureSession = AVCaptureSession()
self.captureSession.beginConfiguration()
captureSession.sessionPreset = AVCaptureSession.Preset.hd1920x1080
captureSession.addInput(self.videoDeviceInput)
captureSession.addOutput(self.photoOutput)

try captureDevice.lockForConfiguration()
captureDevice.isGeometricDistortionCorrectionEnabled = false // <- NB line
captureDevice.unlockForConfiguration()

// configure photoOutput
guard self.photoOutput.isVirtualDeviceConstituentPhotoDeliverySupported else {
    throw InitError.error("Dual photo delivery is not supported")
}
self.photoOutput.isVirtualDeviceConstituentPhotoDeliveryEnabled = true
print("isCameraCalibrationDataDeliverySupported", self.photoOutput.isCameraCalibrationDataDeliverySupported) // false when distortion correction is enabled

let videoOutput = AVCaptureVideoDataOutput()
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "sample buffer delegate", attributes: []))
if captureSession.canAddOutput(videoOutput) {
    captureSession.addOutput(videoOutput)
}

self.videoPreviewLayer.setSessionWithNoConnection(self.captureSession)
self.videoPreviewLayer.videoGravity = AVLayerVideoGravity.resizeAspect
let cameraVideoPreviewLayerConnection = AVCaptureConnection(inputPort: self.videoDeviceInput.ports.first!, videoPreviewLayer: self.videoPreviewLayer)
self.captureSession.addConnection(cameraVideoPreviewLayerConnection)

self.captureSession.commitConfiguration()
self.captureSession.startRunning()
I'm using the AVFoundation Swift APIs to record Video (CMSampleBuffers) and Audio (CMSampleBuffers) to a file using AVAssetWriter.
Initializing the AVAssetWriter happens quite quickly, but calling assetWriter.startWriting() fully blocks the entire application AND ALL THREADS for 3 seconds. This only happens in Debug builds, not in Release.
Since it blocks all threads and only happens in Debug, I'm led to believe that this is an Xcode/debugger/LLDB hang issue that I'm seeing.
Does anyone experience something similar?
Here’s how I set all of that up: startRecording(...)
And here’s the line that makes it hang for 3+ seconds: assetWriter.startWriting(...)
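To quantify the stall (and compare Debug vs. Release), I put a simple timing probe around the call; a minimal sketch, assuming an existing assetWriter:

```swift
import CoreFoundation

let start = CFAbsoluteTimeGetCurrent()
let started = assetWriter.startWriting() // returns false on failure
let elapsed = CFAbsoluteTimeGetCurrent() - start
print("startWriting() returned \(started) after \(elapsed) seconds")
```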
Dear Sir/Madam,
I am reaching out as a developer working on an academic project. I am currently working on my bachelor's thesis, where I am developing a mobile application for iOS devices. For the success of this project, it is essential to have precise information about the hardware components of specific iPhone models, especially the iPhone SE with the model number MMXN3ZD/A and iOS version 17.1.1.
My primary interest lies in the exact technical specifications of the LEDs and the CCD or CMOS image sensor (depending on which type is used) installed in the iPhone SE. For my project, it is crucial to understand the spectral properties of these components:
LED Specifications: I require information about the spectra of the LEDs, especially which wavelengths of light they emit. This is relevant for the functionality of my app, which relies on photometric analyses.
CCD/CMOS Sensor Specifications: Furthermore, it is important for me to know the wavelengths for which the sensor built into the device is sensitive. This information is critical to accurately interpret the interaction between the sensor and the illuminated environment.
The results of my research and development will not only be significant for my academic work but could also provide valuable insights for the further development of iOS applications in my field of study.
I would be very grateful if you could provide me with this information or direct me to the appropriate department or resource where I can obtain these specific technical details.
Thank you in advance for your support and cooperation.
Best regards,
I am running Sonoma 14.1.1 and having an issue with DNG ProRAW images when choosing to edit. Whether I edit in the Photos app or in Photoshop, once Edit is chosen the image becomes less clear and slightly fuzzy, with loss of detail. The same happens when I export the photo from Photos as a DNG file and then edit it in Photoshop: there is a drastically noticeable difference in image quality. This appears to be how the image is handled in the Photos app itself; even if I save directly from the iPhone to Files, it does the same thing once on the Mac. Attached are some screenshots, where the difference is still clear to see.
Hello,
I am faced with a really perplexing issue. The primary problem is that sometimes I get depth and video data as expected, but at other times I don't. And sometimes I'll get both data outputs for 4-5 frames and then it'll just stop. The source code I implemented is a modified version of the sample code provided by Apple, and interestingly enough I can't re-create this issue with the Apple sample app. So I am wondering what I could be doing wrong?
Here's the code for setting up the capture input; preferredDepthResolution is 1280 in my case. I'm running this on an iPad Pro (6th gen) with iOS 17.0.3 (21A360), and I encounter this issue on an iPhone 13 Pro as well, running iOS 17.0 (21A329).
private func setupLiDARCaptureInput() throws {
    // Look up the LiDAR camera.
    guard let device = AVCaptureDevice.default(.builtInLiDARDepthCamera, for: .video, position: .back) else {
        throw ConfigurationError.lidarDeviceUnavailable
    }
    guard let format = (device.formats.last { format in
        format.formatDescription.dimensions.width == preferredWidthResolution &&
        format.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange &&
        format.videoSupportedFrameRateRanges.first(where: { $0.maxFrameRate >= 60 }) != nil &&
        !format.isVideoBinned &&
        !format.supportedDepthDataFormats.isEmpty
    }) else {
        throw ConfigurationError.requiredFormatUnavailable
    }
    guard let depthFormat = (format.supportedDepthDataFormats.last { depthFormat in
        depthFormat.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_DepthFloat16
    }) else {
        throw ConfigurationError.requiredFormatUnavailable
    }
    // Begin the device configuration.
    try device.lockForConfiguration()
    // Configure the device and depth formats.
    device.activeFormat = format
    device.activeDepthDataFormat = depthFormat
    let desc = format.formatDescription
    dimensions = CMVideoFormatDescriptionGetDimensions(desc)
    let duration = CMTime(value: 1, timescale: CMTimeScale(60))
    device.activeVideoMinFrameDuration = duration
    device.activeVideoMaxFrameDuration = duration
    // Finish the device configuration.
    device.unlockForConfiguration()
    self.device = device
    print("Selected video format: \(device.activeFormat)")
    print("Selected depth format: \(String(describing: device.activeDepthDataFormat))")
    // Add a device input to the capture session.
    let deviceInput = try AVCaptureDeviceInput(device: device)
    captureSession.addInput(deviceInput)
    guard let audioDevice = AVCaptureDevice.default(for: .audio) else {
        return
    }
    // Configure audio input - always configure audio even if isAudioEnabled is false
    audioDeviceInput = try! AVCaptureDeviceInput(device: audioDevice)
    captureSession.addInput(audioDeviceInput)
    deviceSystemPressureStateObservation = device.observe(
        \.systemPressureState,
        options: .new
    ) { _, change in
        guard let systemPressureState = change.newValue else { return }
        print("system pressure \(systemPressureState.levelAsString()) due to \(systemPressureState.factors)")
    }
}
Here's how I'm setting up the output:
private func setupLiDARCaptureOutputs() {
    // Create an object to output video sample buffers.
    videoDataOutput = AVCaptureVideoDataOutput()
    captureSession.addOutput(videoDataOutput)
    // Create an object to output depth data.
    depthDataOutput = AVCaptureDepthDataOutput()
    depthDataOutput.isFilteringEnabled = false
    captureSession.addOutput(depthDataOutput)
    audioDeviceOutput = AVCaptureAudioDataOutput()
    audioDeviceOutput.setSampleBufferDelegate(self, queue: videoQueue)
    captureSession.addOutput(audioDeviceOutput)
    // Create an object to synchronize the delivery of depth and video data.
    outputVideoSync = AVCaptureDataOutputSynchronizer(dataOutputs: [depthDataOutput, videoDataOutput])
    outputVideoSync.setDelegate(self, queue: videoQueue)
    // Enable camera intrinsics matrix delivery.
    guard let outputConnection = videoDataOutput.connection(with: .video) else { return }
    if outputConnection.isCameraIntrinsicMatrixDeliverySupported {
        outputConnection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}
The top part of my delegate implementation is as follows:
func dataOutputSynchronizer(
    _ synchronizer: AVCaptureDataOutputSynchronizer,
    didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection
) {
    // Retrieve the synchronized depth and sample buffer container objects.
    guard let syncedDepthData = synchronizedDataCollection.synchronizedData(for: depthDataOutput) as? AVCaptureSynchronizedDepthData,
          let syncedVideoData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
        if synchronizedDataCollection.synchronizedData(for: depthDataOutput) == nil {
            print("no depth data at time \(mach_absolute_time())")
        }
        if synchronizedDataCollection.synchronizedData(for: videoDataOutput) == nil {
            print("no video data at time \(mach_absolute_time())")
        }
        return
    }
    print("received depth data \(mach_absolute_time())")
}
As you can see, I'm console logging whenever depth data is not received. Note that because I'm driving the video frames at 60 fps, it's expected that I'll only receive depth data for every alternate video frame.
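To tell "never produced" apart from "produced but dropped", I am also considering logging the drop flags that AVFoundation exposes on the synchronized data objects; a sketch that would sit in the same delegate:

```swift
// Check whether data arrived but was marked as dropped (late, out of buffers, etc.).
if let syncedDepth = synchronizedDataCollection.synchronizedData(for: depthDataOutput) as? AVCaptureSynchronizedDepthData,
   syncedDepth.depthDataWasDropped {
    print("depth dropped, reason: \(syncedDepth.droppedReason.rawValue)")
}
if let syncedVideo = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData,
   syncedVideo.sampleBufferWasDropped {
    print("video dropped, reason: \(syncedVideo.droppedReason.rawValue)")
}
```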
Console output is posted as a follow-up comment (because of the character limit). I edited some lines out for brevity. You'll see it started streaming correctly, but after a while it stopped receiving both video and depth outputs (in some runs it works perfectly, and in other runs I receive no depth data whatsoever). One thing to note: I sometimes run QuickTime mirroring to see the device screen (so I am not sure if that's causing any interference - that said, I don't see any system pressure changes either).
Any help is most appreciated! Thanks.
I generated CMIO CameraExtension code from the Xcode target template, and it is running with FaceTime. I guess this kind of extension has lots of security limitations.
I'd like to run a command like "netstat" in the extension. Is it possible to call Process.run()? I keep getting an error like "The file zsh doesn't exist". The same code with Process.run() worked in a macOS app.
I'd like to use DistributedNotificationCenter to send text from the app to the CameraExtension. Is that possible? I do not receive any messages in the CameraExtension.
If there is any other IPC method between a macOS app and a CameraExtension, please let me know.
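One IPC channel I am experimenting with is Darwin notifications, which (as I understand it) can cross the app/extension boundary even where DistributedNotificationCenter cannot, though they carry only a name and no payload. A sketch, with a hypothetical notification name:

```swift
import Foundation

let noteName = "com.example.camext.ping" as CFString // hypothetical name
let center = CFNotificationCenterGetDarwinNotifyCenter()

// Post from the app:
CFNotificationCenterPostNotification(center, CFNotificationName(noteName), nil, nil, true)

// Observe in the extension (the callback must not capture context):
CFNotificationCenterAddObserver(center, nil, { _, _, name, _, _ in
    print("received \(String(describing: name))")
}, noteName, nil, .deliverImmediately)
```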
var config = PHPickerConfiguration()
config.filter = PHPickerFilter.images
I want only 'png' files to be displayed when the PHPickerViewController photo list is opened.
I've read this post : https://developer.apple.com/forums/thread/687415
In this post (from two years ago), it is mentioned that filtering image formats with PHPickerConfiguration is not possible.
Is it still not possible? Has issue 71832162 not been resolved?
When developing a custom camera for iOS, if the sessionPreset of the AVCaptureSession is set to AVCaptureSessionPresetPhoto, photos cannot be taken on the iPhone 15 Pro Max; other devices are normal, and other sessionPreset values shoot normally. Please help Apple developers determine the cause.
In addition, I initially thought there was a problem with our code, but when I looked at some demos written by others, the same problem occurred when using the AVCaptureSessionPresetPhoto enumeration and running on an iPhone 15 Pro Max.
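For anyone reproducing this, a minimal defensive sketch (the session variable is assumed to exist) that at least logs whether the preset is accepted on the device:

```swift
session.beginConfiguration()
if session.canSetSessionPreset(.photo) {
    session.sessionPreset = .photo
} else {
    print(".photo preset reported as unsupported on this device")
}
session.commitConfiguration()
```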
I tried running this demo app in "Designed for iPad" mode on my M3 MacBook Pro, and it crashes with the following errors:
LSPrefs: could not find untranslocated node for <FSNode 0x6000022578c0> { isDir = ?, path = '/private/var/folders/yk/2vw8ntf53r79cyldlxx4t4t80000gn/X/6527F067-B4CF-5E9F-8412-6ADCB21853EE/d/Wrapper/Capturing Photos.app' }, proceeding on the assumption it is not translocated: Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"
LSPrefs: could not find untranslocated node for <FSNode 0x6000022578c0> { isDir = ?, path = '/private/var/folders/yk/2vw8ntf53r79cyldlxx4t4t80000gn/X/6527F067-B4CF-5E9F-8412-6ADCB21853EE/d/Wrapper/Capturing Photos.app' }, proceeding on the assumption it is not translocated: Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"
LSPrefs: could not find untranslocated node for <FSNode 0x6000022578c0> { isDir = ?, path = '/private/var/folders/yk/2vw8ntf53r79cyldlxx4t4t80000gn/X/6527F067-B4CF-5E9F-8412-6ADCB21853EE/d/Wrapper/Capturing Photos.app' }, proceeding on the assumption it is not translocated: Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"
CMIO_DAL_PlugInManagement.cpp:917:CreatePlugIn Could not find plugin with kCMIOHardwarePlugInTypeID
CMIO_DAL_CMIOExtension_Device.mm:355:Device legacy uuid isn't present, using new style uuid instead
CMIO_DAL_CMIOExtension_Device.mm:355:Device legacy uuid isn't present, using new style uuid instead
[C:1-3] Error received: Invalidated by remote connection.
CMIO_DAL_CMIOExtension_Stream.mm:1429:GetPropertyData wrong data size for kCMIOStreamPropertyCenterStageFramingMode
CMIOHardware.cpp:331:CMIOObjectGetPropertyData Error: 561211770, failed
Fig assert: "err == 0 " at bail (CMIOUtilities.h:133) - (err=561211770)
CMIO_DAL_CMIOExtension_Stream.mm:1429:GetPropertyData wrong data size for kCMIOStreamPropertyCenterStageFramingMode
CMIOHardware.cpp:331:CMIOObjectGetPropertyData Error: 561211770, failed
Fig assert: "err == 0 " at bail (CMIOUtilities.h:133) - (err=561211770)
Using capture device: FaceTime HD Camera
Camera access not determined.
Unknown client: Capturing Photos
LSPrefs: could not find untranslocated node for <FSNode 0x6000022578c0> { isDir = ?, path = '/private/var/folders/yk/2vw8ntf53r79cyldlxx4t4t80000gn/X/6527F067-B4CF-5E9F-8412-6ADCB21853EE/d/Wrapper/Capturing Photos.app' }, proceeding on the assumption it is not translocated: Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted"
Error loading /System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/com.apple.Photos (84): dlopen(/System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/com.apple.Photos, 0x0109): Symbol not found: _OBJC_CLASS_$_PXSearchResultsViewModel
Referenced from: <128FED4B-1EFC-38CC-BFB9-F6980FB96165> /System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/Versions/A/com.apple.Photos
Expected in: <0E82B4EE-CAFC-36CC-8E72-5DF1BAD3BBD2> /System/iOSSupport/System/Library/PrivateFrameworks/PhotosUICore.framework/Versions/A/PhotosUICore
Error loading /System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/com.apple.Photos (84): dlopen(/System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/com.apple.Photos, 0x0109): Symbol not found: _OBJC_CLASS_$_PXSearchResultsViewModel
Referenced from: <128FED4B-1EFC-38CC-BFB9-F6980FB96165> /System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/Versions/A/com.apple.Photos
Expected in: <0E82B4EE-CAFC-36CC-8E72-5DF1BAD3BBD2> /System/iOSSupport/System/Library/PrivateFrameworks/PhotosUICore.framework/Versions/A/PhotosUICore
Error loading /System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/com.apple.Photos (84): dlopen(/System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/com.apple.Photos, 0x0109): Symbol not found: _OBJC_CLASS_$_PXSearchResultsViewModel
Referenced from: <128FED4B-1EFC-38CC-BFB9-F6980FB96165> /System/Library/Accessibility/BundlesBase/com.apple.Photos.axbundle/Versions/A/com.apple.Photos
Expected in: <0E82B4EE-CAFC-36CC-8E72-5DF1BAD3BBD2> /System/iOSSupport/System/Library/PrivateFrameworks/PhotosUICore.framework/Versions/A/PhotosUICore
AX Safe category class 'SFUnifiedBarRegistrationAccessibility' was not found!
Photo library access not determined.
<<<< FigCaptureCameraParameters >>>> Fig assert: "success" at bail (FigCaptureCameraParameters.m:252) - (err=0)
<<<< FigCaptureCameraParameters >>>> Fig assert: "success" at bail (FigCaptureCameraParameters.m:252) - (err=0)
<<<< FigCaptureCameraParameters >>>> Fig assert: "success" at bail (FigCaptureCameraParameters.m:252) - (err=0)
fopen failed for data file: errno = 2 (No such file or directory)
Errors found! Invalidating cache...
fopen failed for data file: errno = 2 (No such file or directory)
Errors found! Invalidating cache...
CMIOHardware.cpp:1388:CMIOStreamRegisterAsyncStillCaptureCallback stream doesn't support async still capture
CMIOHardware.cpp:1412:CMIOStreamRegisterAsyncStillCaptureCallback Error: 1970171760, failed
<<<< CMIOFigCaptureStream >>>> Fig assert: "! stream->streaming" at bail (CMIOFigCaptureStream.m:1173) - (err=0)
-[MTLDebugDevice newTextureWithDescriptor:iosurface:plane:]:2641: failed assertion `Texture Descriptor Validation
IOSurface textures must use MTLStorageModeShared
libsystem_kernel.dylib`:
0x188f2e0d4 <+0>: mov x16, #0x148
0x188f2e0d8 <+4>: svc #0x80
-> 0x188f2e0dc <+8>: b.lo 0x188f2e0fc ; <+40> Thread 19: signal SIGABRT
0x188f2e0e0 <+12>: pacibsp
0x188f2e0e4 <+16>: stp x29, x30, [sp, #-0x10]!
0x188f2e0e8 <+20>: mov x29, sp
0x188f2e0ec <+24>: bl 0x188f26230 ; cerror_nocancel
0x188f2e0f0 <+28>: mov sp, x29
0x188f2e0f4 <+32>: ldp x29, x30, [sp], #0x10
0x188f2e0f8 <+36>: retab
0x188f2e0fc <+40>: ret
Greetings everyone,
My app crashes when I open and close the camera screen. I have added a subview in the camera that shows the main screen, but the app does not crash every time; it works well 5-6 times, and then it crashes.
I'm using the Quickpose.ai library, and the crash happens inside the lib, so I don't know where the problem is. I have shown some code and my crash log below.
*** Terminating app due to uncaught exception 'NSGenericException', reason: '*** -[AVCaptureSession startRunning] startRunning may not be called between calls to beginConfiguration and commitConfiguration'
*** First throw call stack:
(0x1889f4870 0x180d13c00 0x1a4e30b44 0x10505cff0 0x1047ed7cc 0x1047ed84c 0x105824f50 0x105826b34 0x10582e98c 0x10582f728 0x10583c5f8 0x10583bc2c 0x1f2365964 0x1f2365a04)
libc++abi: terminating due to uncaught exception of type NSException
![]("https://developer.apple.com/forums/content/attachment/a1eeece3-6529-4c79-8931-963f58818a93" "title=Screenshot 2023-12-12 at 9.35.27 AM.png;width=1920;height=1080")
![]("https://developer.apple.com/forums/content/attachment/2184c975-e299-40e4-b466-cafa5165ae03" "title=Screenshot 2023-12-12 at 9.35.32 AM.png;width=1920;height=1080")
![]("https://developer.apple.com/forums/content/attachment/d78ac3ac-313a-4df9-960d-0c58c3087bec" "title=Screenshot 2023-12-15 at 12.11.38 PM.png;width=1920;height=1080")
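Based on the exception text, startRunning() is being called while a beginConfiguration()/commitConfiguration() pair is still open (possibly by the library on another thread). A sketch of the ordering I am trying to enforce, with a hypothetical serial queue:

```swift
let sessionQueue = DispatchQueue(label: "camera.session.queue") // hypothetical queue

sessionQueue.async {
    session.beginConfiguration()
    // ... add/remove inputs and outputs here ...
    session.commitConfiguration()
    session.startRunning() // only ever after commitConfiguration()
}
```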
Starting with Sonoma 14.2, it is no longer possible to connect Canon cameras to an app via USB using Canon's EDSDK framework. This worked fine up to Sonoma 14.1.
The app using the EDSDK is not crashing, but the SDK no longer reports any connected cameras. The camera is connected and can be seen in the System Report as well as in e.g. gphoto2 and even in the EOS Utility software.
It seems that 14.2 introduced some breaking change to camera access from within apps.
I've tried upgrading to the newest EDSDK version and checked with and without App Sandbox. There is no way to find the camera any longer on 14.2.
Even with sample code from Apple, I get the same warning every time:
"Error returned from daemon: Error Domain=com.apple.accounts Code=7 "(null)""
Failed to log access with error: access=<PATCCAccess 0x28316b070> accessor:<<PAApplication 0x283166df0 identifierType:auditToken identifier:{pid:1456, version:3962}>> identifier:2EE1A54C-344A-40AB-9328-3F8E8B5E8A85 kind:intervalEvent timestampAdjustment:0 visibilityState:0 assetIdentifierCount:0 tccService:kTCCServicePhotos, error=Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 235 named com.apple.privacyaccountingd" UserInfo={NSDebugDescription=connection to service with pid 235 named com.apple.privacyaccountingd}
Entitlements are reviewed.
I'm running this SwiftUI sample app for photos without any modifications except for adding my developer profile, which is necessary to build it. When I tap on the thumbnail to see the photo library (after granting access to my photo library), I see that some of the thumbnails are stuck in a loading state, and when I tap on thumbnails, I only see a low-resolution image (the thumbnail), not the full-resolution image that should load.
In the console I can see this error that occurs when tapping on a thumbnail to see the full-resolution image:
CachedImageManager requestImage error: The operation couldn’t be completed. (PHPhotosErrorDomain error 3164.)
When I make a few modifications necessary to run the app as a native macOS app, all the thumbnails load immediately, and clicking on them reveals the full-resolution images.
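For reference, PHPhotosErrorDomain error 3164 appears to correspond to PHPhotosError.Code.networkAccessRequired, so my current guess is that the full-resolution originals are in iCloud and the request needs network access enabled. The option below is real API, but where exactly it belongs in the sample app's image requests is an assumption:

```swift
let options = PHImageRequestOptions()
options.isNetworkAccessAllowed = true // allow downloading iCloud originals
```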
So I've spent the last five years optimizing my video AI system so that it runs with less than 5% CPU while processing a 30fps video feed on a MacBook Pro M2, and everything is great, until Sonoma comes out, and I find myself consuming 40% CPU for the exact same workload.
So I fire up Instruments, and the "heaviest stack trace" (see screenshot) turns out to be Espresso doing some completely unasked-for and absolutely useless processing on my video frames. I turn off Reactions, but nothing helps - the CPU consumption stays at 40%.
"Reactions" is nothing but a useless toy to please some WWDC keynote fanboys, I don't want it anywhere near my app or my users, and I especially do not want to take the blame for it pissing away the user's CPU cycles and battery.
Now, how do I make it go away, for ever?
Best regards
Jacob