
Debugging communications between iPhone & Mac
I am developing an app that runs on both a Mac Studio and an iPhone 14. The instance on the Mac communicates with the instance on the iPhone 14. To debug the Bonjour communication I need to run this source code under the debugger on the Mac Studio and on the iPhone at the same time. Ideally I would be able to step through the source code and set breakpoints in each app instance simultaneously. Is that possible? If so, how?
Replies: 1 · Boosts: 0 · Views: 838 · Jul ’23
photoOutput() called with its error parameter set
I followed the directions for capturing and storing uncompressed (RAW) image files at https://developer.apple.com/documentation/avfoundation/photo_capture/capturing_photos_in_raw_and_apple_proraw_formats, omitting the thumbnails because they are not needed.

When a picture is taken, the completion callback

RAWCaptureDelegate.photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)

is called as expected, but with its error parameter set to:

Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record" UserInfo={AVErrorRecordingFailureDomainKey=3, NSLocalizedDescription=Cannot Record, NSLocalizedRecoverySuggestion=Try recording again.}

I do not know what to do about this error. The message recommends trying again, but the error repeats every time. Any suggestions on possible causes, or on how to find out what is going wrong, would be much appreciated. The device this is run on is an iPhone 14 Pro.
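For reference, a minimal hedged sketch of logging the full failure inside the delegate callback before deciding what to do with the capture; the RAWCaptureDelegate class name follows Apple's sample, and the print statements are only illustrative:

import AVFoundation

// Inside the RAWCaptureDelegate class from the sample:
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    if let avError = error as? AVError {
        // -11803 corresponds to "Cannot Record"; the code and userInfo can narrow it down.
        print("Capture failed, AVError \(avError.code.rawValue): \(avError.localizedDescription)")
        print("userInfo: \((avError as NSError).userInfo)")
        return
    }
    // No error: photo.fileDataRepresentation() / photo.pixelBuffer are valid here.
}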
Replies: 1 · Boosts: 0 · Views: 733 · Jul ’23
How to store uncompressed image files from the iPhone's camera?
The need is to capture an image in an uncompressed format and save it in a lossless file format. I am not particular about which uncompressed format. AVCapturePhotoOutput.availablePhotoPixelFormatTypes shows that on my iPhone 14 Pro the BGRA format is available, and its type value is 1111970369. So I use this code to build a settings object and trigger a photo capture with it:

let settings = AVCapturePhotoSettings(format: [String(kCVPixelBufferPixelFormatTypeKey): 1111970369])
self.photoOutput?.capturePhoto(with: settings, delegate: self)

I am not sure the AVCapturePhotoSettings() dictionary argument is right. It is the best I could conclude from the available documentation, and it does not produce a compile-time error. I found no example of how to produce uncompressed photo files; I found only one that produces JPEGs, and I have successfully used that example to capture images in JPEG format. This is its URL, which I am now attempting to modify to produce uncompressed image files: https://www.appcoda.com/avfoundation-swift-guide/ . The example is somewhat out of date, and some method names had to be changed accordingly.

Next, when the system calls the delegate callback

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)

I need to convert the passed photo.pixelBuffer image object into one I can pass to

self.photoCaptureCompletionBlock?(image, nil)

UIImage(data:) cannot do this alone; I get a compile-time error when I pass photo.pixelBuffer to it. I tried to convert with the AVCapturePhotoOutput() class, but could not find a version of its initializers that the compiler would accept. What do I need to do to get the data in photo.pixelBuffer stored in an uncompressed file?
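For reference, a minimal sketch of one way to turn photo.pixelBuffer into a UIImage (and, if needed, lossless PNG data) using Core Image; the photoCaptureCompletionBlock property is the one from the AppCoda-style camera controller, and this method is assumed to live in that controller:

import AVFoundation
import CoreImage
import UIKit

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil, let pixelBuffer = photo.pixelBuffer else {
        self.photoCaptureCompletionBlock?(nil, error)
        return
    }
    // Wrap the BGRA pixel buffer in a CIImage, render it to a CGImage,
    // then hand a UIImage to the existing completion block.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        self.photoCaptureCompletionBlock?(nil, error)
        return
    }
    let image = UIImage(cgImage: cgImage)
    // image.pngData() would give lossless PNG bytes if a file is needed.
    self.photoCaptureCompletionBlock?(image, nil)
}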
Replies: 0 · Boosts: 0 · Views: 600 · Jul ’23
How to Instantiate AVCapturePhotoSettings(format: [String: String])
I tried various ways to instantiate the class AVCapturePhotoSettings, as listed below:

let settings = AVCapturePhotoSettings(format: [kCVPixelBufferPixelFormatTypeKey: "BGRA"])
let settings = AVCapturePhotoSettings(format: [kCVPixelBufferPixelFormatTypeKey as String: "BGRA"])
let settings = AVCapturePhotoSettings(format: [String(kCVPixelBufferPixelFormatTypeKey): "BGRA"])
let settings = AVCapturePhotoSettings(format: ["kCVPixelBufferPixelFormatTypeKey": "BGRA"])

The first attempt gets the compile-time error:

Cannot convert value of type 'CFString' to expected dictionary key type 'String'

so the three other attempts were made, but each of them gets this run-time error:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[AVCapturePhotoSettings photoSettingsWithFormat:] Either kCVPixelBufferPixelFormatTypeKey or AVVideoCodecKey must be specified'

What is the correct way to use this initializer of the class AVCapturePhotoSettings?
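For reference, a minimal sketch of the form that is generally used for a BGRA capture: the key is bridged to String, and the value must be the numeric CoreVideo pixel format constant rather than the string "BGRA". The photoOutput variable is assumed to be the session's AVCapturePhotoOutput:

import AVFoundation

let settings = AVCapturePhotoSettings(format: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
photoOutput.capturePhoto(with: settings, delegate: self)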
Replies: 0 · Boosts: 0 · Views: 579 · Jul ’23
How to look up Photo Pixel Format Types?
I listed the AVCapturePhotoSettings.availablePhotoPixelFormatTypes array on my iPhone 14 during a running photo session, and I got these type numbers:

875704422
875704438
1111970369

I have no idea what these numbers mean. How can I use them to look up a human-readable string that tells me what these types are in terms I am familiar with (jpeg, tiff, png, bmp, dng, etc.), so I know which number to choose when I instantiate the class AVCaptureSession?
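These values are CoreVideo pixel format codes (four ASCII characters packed into a UInt32), not file formats. A small sketch that unpacks them into their four-character form:

import Foundation
import CoreVideo

// Turn a CoreVideo pixel format code (OSType / FourCC) back into its
// four-character string, e.g. 1111970369 -> "BGRA".
func fourCharString(_ type: OSType) -> String {
    let bytes: [UInt8] = [
        UInt8((type >> 24) & 0xFF),
        UInt8((type >> 16) & 0xFF),
        UInt8((type >> 8) & 0xFF),
        UInt8(type & 0xFF)
    ]
    return String(bytes: bytes, encoding: .ascii) ?? "\(type)"
}

// 875704422 -> "420f", 875704438 -> "420v", 1111970369 -> "BGRA"
// ("420f"/"420v" are the bi-planar YCbCr 4:2:0 formats; "BGRA" is kCVPixelFormatType_32BGRA).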
Replies: 1 · Boosts: 0 · Views: 1.3k · Jul ’23
Can the permission dialog to access the photo library be bypassed?
This handler in my photo app asks the user for permission to store the captured photo data in the library:

cameraController.captureImage() { (image: UIImage?, error: Error?) in
    // This closure is the completion handler; it is called when picture taking is completed.
    if let error = error {
        print("camera completion handler called with error: " + error.localizedDescription)
    } else {
        print("camera completion handler called.")
    }
    // If no image was captured, return here:
    guard let image = image else {
        print(error ?? "Image capture error")
        return
    }
    let phpPhotoLib = PHPhotoLibrary.shared()
    if phpPhotoLib.unavailabilityReason != nil {
        print("phpPhotoLib.unavailabilityReason = " + phpPhotoLib.unavailabilityReason.debugDescription)
    } else {
        // This crashes unless the key "Privacy - Photo Library Usage Description" exists in the app's Info.plist.
        try? phpPhotoLib.performChangesAndWait {
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }
    }
}

It asks once; it does not ask again on subsequent photo captures. Is there a way for the app to bypass this permission dialog, or at least present it to the user when the app first runs, in advance of triggering a photo capture? In this situation the camera is being used to capture science data in a laboratory setting. The scientist using this app already expects, and needs, the photo to be stored. The redundant permission dialog makes data collection cumbersome and inconvenient because of the extra effort required to reach the camera where it is located.
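As far as I know the system prompt itself cannot be suppressed, but it can be triggered earlier. A minimal sketch of requesting add-only photo library access when the app starts, so the dialog is out of the way before any capture; the function name and where it is called from are assumptions:

import Photos

// Ask for add-only photo library access up front, e.g. from the first view
// controller's viewDidAppear, so the dialog appears at launch rather than
// at the first capture. Requires the "Privacy - Photo Library Additions
// Usage Description" (NSPhotoLibraryAddUsageDescription) Info.plist key.
func requestPhotoAddAccessUpFront() {
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        DispatchQueue.main.async {
            switch status {
            case .authorized, .limited:
                print("Photo library add access granted.")
            default:
                print("Photo library add access not granted: \(status.rawValue)")
            }
        }
    }
}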
Replies: 1 · Boosts: 0 · Views: 639 · Jul ’23
PHAssetChangeRequest.creationRequestForAsset() Crashes
There is a crash on the closing brace of the photo capture completion handler, where commented below.

// Initiate image capture:
cameraController.captureImage() { (image: UIImage?, error: Error?) in
    // This closure is the completion handler; it is called when
    // the picture taking is completed.
    print("camera completion handler called.")
    guard let image = image else {
        print(error ?? "Image capture error")
        return
    }
    try? PHPhotoLibrary.shared().performChangesAndWait {
        PHAssetChangeRequest.creationRequestForAsset(from: image)
    }
}   // crash here

When stepping pauses on that closing brace and the next step is taken, there is a crash. Once crashed, what I see next is the assembly shown in this disassembly:

libsystem_kernel.dylib`:
    0x1e7d4a39c <+0>:  mov   x16, #0x209
    0x1e7d4a3a0 <+4>:  svc   #0x80
->  0x1e7d4a3a4 <+8>:  b.lo  0x1e7d4a3c4   ; <+40>   Thread 11: signal SIGABRT
    0x1e7d4a3a8 <+12>: pacibsp
    0x1e7d4a3ac <+16>: stp   x29, x30, [sp, #-0x10]!
    0x1e7d4a3b0 <+20>: mov   x29, sp
    0x1e7d4a3b4 <+24>: bl    0x1e7d3d984   ; cerror_nocancel
    0x1e7d4a3b8 <+28>: mov   sp, x29
    0x1e7d4a3bc <+32>: ldp   x29, x30, [sp], #0x10
    0x1e7d4a3c0 <+36>: retab
    0x1e7d4a3c4 <+40>: ret

Obviously I have made a mistake in the picture-taking code somewhere. I would much appreciate suggestions on what diagnostics will lead to the resolution of this problem. I can make the entire picture-taking code available on request as an attachment; it is too lengthy to post here.
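A hedged diagnostic sketch: a SIGABRT landing in libsystem_kernel is usually just where an uncaught exception or assertion ends up, and with photo library saves the common causes are a missing usage-description key or missing authorization. Logging the authorization status and the thrown error before and around the save may surface the real message; the save(_:) function below is only an illustrative wrapper:

import Photos
import UIKit

func save(_ image: UIImage) {
    // A missing NSPhotoLibraryAddUsageDescription entry, or denied/undetermined
    // authorization, both tend to abort inside the change block.
    let status = PHPhotoLibrary.authorizationStatus(for: .addOnly)
    print("Photo library add-only authorization: \(status.rawValue)")

    do {
        try PHPhotoLibrary.shared().performChangesAndWait {
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }
    } catch {
        // `try?` in the original hides this; printing it shows why the change failed.
        print("performChangesAndWait failed: \(error)")
    }
}

An Exception Breakpoint (Breakpoint navigator > + > Exception Breakpoint...) will also stop on the line that actually throws, instead of the assembly shown above.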
Replies: 1 · Boosts: 0 · Views: 740 · Jul ’23
Taking Lossless Pictures
I am following these directions to code picture taking: https://www.appcoda.com/avfoundation-swift-guide/ Those directions are for taking JPEG shots, but what is needed is output in a lossless format such as DNG, PNG, or BMP. How would those instructions be modified to output pictures in a lossless format? Is there a tutorial, similar to the one linked above, that explains how to handle lossless formats?
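For reference, a minimal sketch of one lossless route (RAW/DNG), loosely following Apple's RAW capture documentation; the output URL is illustrative, and it assumes the device and session actually offer a RAW pixel format:

import AVFoundation

// Build settings for a RAW (DNG) capture, or return nil if no RAW format is offered.
func makeRAWSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard let rawFormat = output.availableRawPhotoPixelFormatTypes.first else {
        return nil
    }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}

// In the capture delegate, the DNG bytes come from fileDataRepresentation():
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil, photo.isRawPhoto, let dngData = photo.fileDataRepresentation() else { return }
    let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.dng")
    try? dngData.write(to: url)
}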
Replies: 1 · Boosts: 0 · Views: 706 · Jul ’23
How to make the return key appear on numeric keyboards?
I have a UITextField object in a storyboard scene where the user makes numeric entries. Because this field is for numbers only, I set the keyboard type to "Number Pad" in its attributes. But this keyboard has no return key, which leaves the user unable to signal that the entry is complete, that the keyboard should be put away, and that the entered value should be processed. I tried all the other numeric keyboards as well, and none of them has a return key. The default keyboard does, but it is a full alphanumeric keyboard, which is undesirable for this field. How can I get the return key, done key, or whatever serves that function, to appear on a numbers-only keyboard?
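The number pads simply have no return key, so the usual workaround is an accessory toolbar above the keyboard. A hedged sketch; the controller and outlet names (EntryViewController, numberField) are placeholders for whatever the storyboard actually uses:

import UIKit

class EntryViewController: UIViewController {

    @IBOutlet weak var numberField: UITextField!   // the numeric field from the storyboard

    override func viewDidLoad() {
        super.viewDidLoad()
        // Give the numeric field a toolbar with a Done button, shown above the number pad.
        let toolbar = UIToolbar(frame: CGRect(x: 0, y: 0, width: view.bounds.width, height: 44))
        toolbar.items = [
            UIBarButtonItem(barButtonSystemItem: .flexibleSpace, target: nil, action: nil),
            UIBarButtonItem(barButtonSystemItem: .done, target: self, action: #selector(doneTapped))
        ]
        numberField.inputAccessoryView = toolbar
    }

    @objc private func doneTapped() {
        numberField.resignFirstResponder()   // dismisses the keyboard and fires "Editing Did End"
    }
}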
Replies: 2 · Boosts: 0 · Views: 2.2k · Jul ’23
"Editing Did End" event is not calling its event handler
I have a UITextField object in a storyboard scene. Its "Editing Did End" event is connected to the view controller's actIPhoneName() event handler method, as shown in this screenshot. I configured the keyboard to show its "Done" key. I expected that when the Done key is touched, the keyboard would disappear and the actIPhoneName() method would be called. But neither of these happens, nor does anything else: the UITextField object remains in edit mode, and the breakpoint is never reached. What must I do to make the Done key work, so that the UITextField object gives up its first responder status and calls its "Editing Did End" event handler method?
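The Done key only reports a "return" press; nothing resigns first responder unless code does it, and "Editing Did End" only fires once the field actually ends editing. A hedged sketch of the usual pattern, assuming the view controller becomes the text field's delegate (the class and outlet names are placeholders):

import UIKit

class SettingsViewController: UIViewController, UITextFieldDelegate {

    @IBOutlet weak var nameField: UITextField!

    override func viewDidLoad() {
        super.viewDidLoad()
        nameField.delegate = self   // can also be wired in the storyboard
    }

    // Called when the Done / Return key is pressed.
    func textFieldShouldReturn(_ textField: UITextField) -> Bool {
        textField.resignFirstResponder()   // dismisses the keyboard and fires "Editing Did End"
        return true
    }
}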
Replies: 1 · Boosts: 0 · Views: 704 · Jul ’23
How to add a scene, and its class, to the Storyboard
I added a scene to an existing storyboard that already has other scenes. I put focus on the new scene, and in the Identity inspector's "Class" field I gave it a class name. The "Storyboard ID" field did not show in this inspector. Why would it not? Also, no file named after the class I entered appeared in the Project navigator. Is that supposed to happen automatically, or must I create the Swift file that contains the class definition for the scene manually?
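If the answer turns out to be "create it manually", the file is just a UIViewController subclass whose name matches what was typed into the Class field; a minimal sketch, with MyNewSceneViewController as a placeholder name:

import UIKit

// MyNewSceneViewController.swift -- created by hand (File > New > File... > Cocoa Touch Class),
// then entered in the scene's Identity inspector "Class" field.
class MyNewSceneViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Scene-specific setup goes here.
    }
}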
Replies: 2 · Boosts: 0 · Views: 1.4k · Jul ’23
Using DispatchQueue
I am attempting to follow this example of how to create a camera app: https://www.appcoda.com/avfoundation-swift-guide/ It describes a sequence of four steps, embodied in four functions:

func prepare(completionHandler: @escaping (Error?) -> Void) {
    func createCaptureSession() { }
    func configureCaptureDevices() throws { }
    func configureDeviceInputs() throws { }
    func configurePhotoOutput() throws { }

    DispatchQueue(label: "prepare").async {
        do {
            createCaptureSession()
            try configureCaptureDevices()
            try configureDeviceInputs()
            try configurePhotoOutput()
        } catch {
            DispatchQueue.main.async {
                completionHandler(error)
            }
            return
        }
        DispatchQueue.main.async {
            completionHandler(nil)
        }
    }
}

What confuses me is that these four steps appear to need to execute sequentially, but in the dispatch queue they seem to execute simultaneously because .async is used. For example, farther down that web page the functions createCaptureSession() and configureCaptureDevices(), among others, are defined. In createCaptureSession() the member variables self.frontCamera and self.rearCamera are given values which are then used in configureCaptureDevices(). So configureCaptureDevices() depends on createCaptureSession() having already executed, something that apparently cannot be relied upon if both functions run simultaneously on separate threads. What, then, is the benefit of using DispatchQueue()? How is it assured that the dependencies in the example above are met? And what is the label parameter of DispatchQueue()'s initializer used for?
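A small sketch of what .async actually does here: it submits the whole closure to the queue once, and the statements inside that closure still run one after another on the queue's thread; only the caller is not blocked while they run. The label is just a name that shows up in the debugger and crash logs. The print statements are illustrative only:

import Dispatch

let setupQueue = DispatchQueue(label: "prepare")   // serial queue; the label is for debugging

setupQueue.async {
    // These run strictly in order, on one background thread.
    print("step 1: create capture session")
    print("step 2: configure capture devices")   // can rely on step 1 having finished
    print("step 3: configure device inputs")
    print("step 4: configure photo output")
}

print("prepare() returned immediately; setup continues in the background")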
Replies: 1 · Boosts: 0 · Views: 725 · Jul ’23
Can't Create Storyboard Outlet
I am not able to drag a reference from an object on a storyboard scene to the code in the Assistant editor. The expected blue line appears, as the screenshot shows, but no code is created in the Assistant. The object is not in the Launch Screen, and when the scene containing it is highlighted, the correct file opens in the Assistant. Dragging the object from the "Content View" folder in the outline also does not work, even though the blue line appears there too. I was recently able to do this, so something changed. What might have changed?
Replies: 1 · Boosts: 0 · Views: 838 · Jul ’23
UIButton: Changing Foreground and Background Colors
The goal is to have the background and foreground colors show the current state of a UIButton on a storyboard. The attempt to do that is this code:

var toggleLight: Bool = false

@IBAction func toggleLight(_ sender: UIButton) {
    if toggleLight {
        toggleLight = false
        sender.baseBackgroundColor = UIColor.black
        sender.baseForegroundColor = UIColor.white
    } else {
        toggleLight = true
        sender.baseBackgroundColor = UIColor.white
        sender.baseForegroundColor = UIColor.black
    }
    tableView.reloadData()
}

I get these errors:

Value of type 'UIButton' has no member 'baseBackgroundColor'
Value of type 'UIButton' has no member 'baseForegroundColor'

I do not understand this, because in the generated interface for UIKit.UIButton I see:

extension UIButton {
    ...
    public var baseForegroundColor: UIColor?
    public var baseBackgroundColor: UIColor?
    ...
}

What has gone wrong there?
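Most likely the declarations seen in the generated interface belong to UIButton.Configuration rather than to UIButton itself. A hedged sketch of the toggle written against the configuration (iOS 15+); toggleLight and tableView are the existing properties from the view controller above, and .filled() is only a fallback if the button has no configuration yet:

import UIKit

@IBAction func toggleLight(_ sender: UIButton) {
    toggleLight.toggle()

    // baseForegroundColor / baseBackgroundColor live on UIButton.Configuration.
    var config = sender.configuration ?? .filled()
    config.baseBackgroundColor = toggleLight ? UIColor.white : UIColor.black
    config.baseForegroundColor = toggleLight ? UIColor.black : UIColor.white
    sender.configuration = config

    tableView.reloadData()
}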
Replies: 0 · Boosts: 0 · Views: 582 · Jun ’23