Posts

Post marked as solved
2 Replies
1.1k Views
The need is to persist the state of storyboard objects such as UISwitch and UITextField between launches. Can this be done using @AppStorage? If so, how can @AppStorage be set to watch these? I tried getting @AppStorage to watch an outlet class member variable that is connected to the storyboard object:

@IBOutlet weak var iPhoneName: UITextField!
@AppStorage("iPhoneName") var iPhoneName: String = ""

This produced an error because the variable to be watched is already declared. I then made the watched variable different from the one connected to the storyboard's UITextField object:

@AppStorage("iPhoneName") var striPhoneName: String = ""

and got the error: Unknown attribute 'AppStorage'. In what import library is @AppStorage defined? If @AppStorage cannot be used for this, what is the easiest way to code storyboard object persistence? I am looking for an easy and quick way; I am not concerned with memory usage right now.
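For context, @AppStorage is defined in SwiftUI (import SwiftUI) and its property wrapper is designed to drive SwiftUI view updates, so it does not pair naturally with UIKit outlets. A minimal sketch of the usual UIKit alternative, persisting through UserDefaults (the class and key names here are illustrative, not from the original project):

```swift
import UIKit

class SettingsViewController: UIViewController, UITextFieldDelegate {
    @IBOutlet weak var iPhoneName: UITextField!
    private let defaults = UserDefaults.standard

    override func viewDidLoad() {
        super.viewDidLoad()
        // Restore the persisted value, if any, when the scene loads.
        iPhoneName.text = defaults.string(forKey: "iPhoneName") ?? ""
        iPhoneName.delegate = self
    }

    // Persist the value whenever editing ends.
    func textFieldDidEndEditing(_ textField: UITextField) {
        defaults.set(textField.text, forKey: "iPhoneName")
    }
}
```

The same pattern works for a UISwitch by saving `isOn` with `defaults.set(_:forKey:)` from its Value Changed action.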
Posted
by spflanze.
Last updated
.
Post not yet marked as solved
1 Reply
723 Views
I am developing an app that must run on both a Mac Studio and an iPhone 14. The app instance on the Mac communicates with the app's other instance on the iPhone 14. To debug the Bonjour communications I need to run this source code under the debugger on the Mac and on the iPhone simultaneously. Ideally I would be able to step through source code and set breakpoints in each app instance at the same time. Is that possible? If so, how?
Posted
by spflanze.
Last updated
.
Post not yet marked as solved
1 Reply
554 Views
I followed the directions for capturing and storing uncompressed (RAW) image files at: https://developer.apple.com/documentation/avfoundation/photo_capture/capturing_photos_in_raw_and_apple_proraw_formats I excluded the thumbnails because they are not needed. When picture taking is attempted, the completion-handling callback method:

RAWCaptureDelegate.photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)

is called as expected, but with its "error" parameter set to the value:

Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record" UserInfo={AVErrorRecordingFailureDomainKey=3, NSLocalizedDescription=Cannot Record, NSLocalizedRecoverySuggestion=Try recording again.}

I do not know what to do about this error. The message recommends trying again, but the error repeats every time. Any suggestions on possible causes, or on how to find out what is going wrong, would be much appreciated. The device this runs on is an iPhone 14 Pro.
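One hedged diagnostic direction (an assumption, not a confirmed diagnosis): "Cannot Record" (-11803) often means the requested photo settings do not match what the current session configuration supports. A sketch that queries the output for a supported Bayer RAW format instead of hard-coding one:

```swift
import AVFoundation

// A sketch, function name illustrative: pick a supported Bayer RAW format
// from the photo output at capture time. Requesting a format that is not in
// availableRawPhotoPixelFormatTypes for the current session configuration
// is one common cause of AVFoundationErrorDomain -11803.
func makeRAWSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isBayerRAWPixelFormat($0)
    }) else {
        return nil // RAW is not available on this device/session configuration
    }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}
```

If `availableRawPhotoPixelFormatTypes` comes back empty, the session preset or selected camera does not support RAW, which would explain the error repeating on every attempt.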
Posted
by spflanze.
Last updated
.
Post not yet marked as solved
0 Replies
488 Views
The need is to capture an image in an uncompressed format and save it in a lossless file format. I am not particular about which uncompressed format. However, AVCapturePhotoOutput.availablePhotoPixelFormatTypes shows that on my iPhone 14 Pro the BGRA format is available, and its type value is 1111970369. So I use this code to get a settings object and trigger a photo capture with it:

let settings = AVCapturePhotoSettings(format: [String(kCVPixelBufferPixelFormatTypeKey): 1111970369])
self.photoOutput?.capturePhoto(with: settings, delegate: self)

I am not sure I have the AVCapturePhotoSettings() dictionary argument right. It is the best I could conclude from the available documentation, and it does not produce a compile-time error. I found no example of how to produce uncompressed photo files; I found only one that produces JPEGs, and I have successfully used that example to capture images in JPEG format. This is the URL for that example, which I now attempt to modify to produce uncompressed image files: https://www.appcoda.com/avfoundation-swift-guide/ . The example is somewhat out of date, and some method names had to be changed accordingly. Next, when the system calls the delegated callback:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)

I need to convert the passed photo.pixelBuffer image object into one I can pass to:

self.photoCaptureCompletionBlock?(image, nil)

The class UIImage(data:) cannot do this alone; I get a compile-time error when I pass photo.pixelBuffer to it. I tried to convert with the AVCapturePhotoOutput class, but could not find a version of its initializers the compiler would accept for this. What do I need to do to get the data in photo.pixelBuffer stored in an uncompressed file?
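One way to bridge the gap (a sketch, not the only approach): wrap the captured CVPixelBuffer in a CIImage, render it to a CGImage through a CIContext, and build the UIImage from that. The function name here is illustrative:

```swift
import AVFoundation
import UIKit

// Convert the BGRA pixel buffer delivered to the capture delegate into a
// UIImage, which can then be encoded losslessly (e.g. with pngData()).
func image(from photo: AVCapturePhoto) -> UIImage? {
    guard let pixelBuffer = photo.pixelBuffer else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}

// Usage, inside the delegate callback:
// if let uiImage = image(from: photo), let png = uiImage.pngData() {
//     try? png.write(to: outputURL) // PNG encoding is lossless
// }
```

UIImage(data:) fails here because it expects encoded image file data (JPEG, PNG, etc.), not a raw pixel buffer, which is why the compiler rejects photo.pixelBuffer as its argument.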
Posted
by spflanze.
Last updated
.
Post not yet marked as solved
0 Replies
475 Views
I tried various ways to instantiate the class AVCapturePhotoSettings, as listed below:

let settings = AVCapturePhotoSettings(format: [kCVPixelBufferPixelFormatTypeKey: "BGRA"])
let settings = AVCapturePhotoSettings(format: [kCVPixelBufferPixelFormatTypeKey as String: "BGRA"])
let settings = AVCapturePhotoSettings(format: [String(kCVPixelBufferPixelFormatTypeKey): "BGRA"])
let settings = AVCapturePhotoSettings(format: ["kCVPixelBufferPixelFormatTypeKey": "BGRA"])

The first attempt gets the compile-time error:

Cannot convert value of type 'CFString' to expected dictionary key type 'String'

and so the three other attempts below it were made, but each of those gets the run-time error:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[AVCapturePhotoSettings photoSettingsWithFormat:] Either kCVPixelBufferPixelFormatTypeKey or AVVideoCodecKey must be specified'

What is the correct way to use this initializer for the class AVCapturePhotoSettings?
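A hedged sketch of what appears to be going wrong in the attempts above: the value side of the dictionary must be the numeric pixel-format constant (an OSType/UInt32), not the string "BGRA", and the key must be the real CFString constant bridged to String, not a quoted key name:

```swift
import AVFoundation
import CoreVideo

// kCVPixelFormatType_32BGRA is the four-character code 'BGRA' packed into
// a UInt32 (decimal 1111970369), which is why passing the *string* "BGRA"
// as the value fails the framework's format check at run time.
let settings = AVCapturePhotoSettings(format: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
```

The last quoted-string attempt fails for a different reason: "kCVPixelBufferPixelFormatTypeKey" as a literal is not the same string as the constant's actual contents, so the framework sees neither required key.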
Posted
by spflanze.
Last updated
.
Post not yet marked as solved
1 Reply
569 Views
I am following these directions to code picture taking: https://www.appcoda.com/avfoundation-swift-guide/ These directions are for taking JPEG shots, but what is needed is output in a lossless format such as DNG, PNG, or BMP. How would those instructions be modified to output pictures in a lossless format? Is there a tutorial similar to the one linked above that explains how to handle lossless formats?
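A sketch of one way to adapt a JPEG-oriented delegate to write a lossless file (the output URL is illustrative): decode whatever the output delivered, then re-encode as PNG. Note the hedge in the comments: PNG encoding itself is lossless, but if the capture was configured for JPEG, the JPEG loss has already happened; capturing into an uncompressed pixel format such as BGRA avoids that entirely, and true sensor-level DNG requires a RAW capture via AVCapturePhotoSettings(rawPixelFormatType:).

```swift
import AVFoundation
import UIKit

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil,
          let data = photo.fileDataRepresentation(), // encoded capture data
          let uiImage = UIImage(data: data),
          let png = uiImage.pngData()                // lossless re-encode
    else { return }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("capture.png")       // illustrative destination
    try? png.write(to: url)
}
```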
Posted
by spflanze.
Last updated
.
Post marked as solved
1 Reply
580 Views
There is a crash on the closing brace of the photo capture completion handler, where commented below.

// Initiate image capture:
cameraController.captureImage() { (image: UIImage?, error: Error?) in
    // This closure is the completion handler; it is called when
    // the picture taking is completed.
    print("camera completion handler called.")
    guard let image = image else {
        print(error ?? "Image capture error")
        return
    }
    try? PHPhotoLibrary.shared().performChangesAndWait {
        PHAssetChangeRequest.creationRequestForAsset(from: image)
    } // crash here
}

When stepping through pauses on that closing brace and the next step is taken, there is a crash. Once crashed, what I see next is the assembly code shown in the disassembly below:

libsystem_kernel.dylib`:
    0x1e7d4a39c <+0>:  mov x16, #0x209
    0x1e7d4a3a0 <+4>:  svc #0x80
->  0x1e7d4a3a4 <+8>:  b.lo 0x1e7d4a3c4 ; <+40>    Thread 11: signal SIGABRT
    0x1e7d4a3a8 <+12>: pacibsp
    0x1e7d4a3ac <+16>: stp x29, x30, [sp, #-0x10]!
    0x1e7d4a3b0 <+20>: mov x29, sp
    0x1e7d4a3b4 <+24>: bl 0x1e7d3d984 ; cerror_nocancel
    0x1e7d4a3b8 <+28>: mov sp, x29
    0x1e7d4a3bc <+32>: ldp x29, x30, [sp], #0x10
    0x1e7d4a3c0 <+36>: retab
    0x1e7d4a3c4 <+40>: ret

Obviously I have screwed up picture taking somewhere. I would much appreciate suggestions on what diagnostics will lead to the resolution of this problem. I can make the entire picture-taking code available as an attachment on request; it is too lengthy to post here.
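One hedged diagnostic (an assumption, not a confirmed diagnosis): a SIGABRT raised from inside performChangesAndWait is commonly a photos-privacy issue, such as a missing NSPhotoLibraryAddUsageDescription key in Info.plist or no user authorization. A sketch that checks authorization first, turning the crash into a handled case (the function name is illustrative):

```swift
import Photos
import UIKit

func save(_ image: UIImage) {
    // Ask for add-only access before touching the library.
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        guard status == .authorized || status == .limited else {
            print("Photo library access not granted: \(status)")
            return
        }
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }) { success, error in
            print(success ? "saved" : "save failed: \(String(describing: error))")
        }
    }
}
```

Running with the Xcode console attached (rather than only the disassembly view) usually surfaces the human-readable exception reason before the SIGABRT.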
Posted
by spflanze.
Last updated
.
Post marked as solved
1 Reply
842 Views
I listed the AVCapturePhotoSettings.availablePhotoPixelFormatTypes array on my iPhone 14 during a running photo session, and I got these type numbers:

875704422
875704438
1111970369

I have no idea what these numbers mean. How can I use these numbers to look up a human-readable string that tells me what these types are in terms I am familiar with, such as jpeg, tiff, png, bmp, or dng, so I know which of these numbers to choose when I instantiate the class AVCaptureSession?
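These values are FourCC codes (Core Video OSType values): four ASCII characters packed big-endian into a 32-bit integer. They name pixel layouts, not file formats like jpeg or png. A small sketch (helper name illustrative) that decodes them:

```swift
import Foundation

// Unpack a FourCC code into its four ASCII characters, most significant
// byte first; fall back to the decimal string if a byte is not ASCII.
func fourCCString(_ code: UInt32) -> String {
    let bytes = [24, 16, 8, 0].map { UInt8((code >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? String(code)
}

// 875704422  -> "420f" (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
// 875704438  -> "420v" (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
// 1111970369 -> "BGRA" (kCVPixelFormatType_32BGRA)
```

The decoded strings can then be matched against the kCVPixelFormatType_* constants in CoreVideo's headers; of the three, BGRA is the uncompressed RGB-style layout.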
Posted
by spflanze.
Last updated
.
Post not yet marked as solved
1 Reply
517 Views
This handler in my photo app:

cameraController.captureImage() { (image: UIImage?, error: Error?) in
    // This closure is the completion handler; it is called when the picture taking is completed.
    if let error = error {
        print("camera completion handler called with error: " + error.localizedDescription)
    } else {
        print("camera completion handler called.")
    }
    // If no image was captured, return here:
    guard let image = image else {
        print(error ?? "Image capture error")
        return
    }
    let phpPhotoLib = PHPhotoLibrary.shared()
    if let reason = phpPhotoLib.unavailabilityReason {
        print("phpPhotoLib.unavailabilityReason = \(reason)")
    } else {
        // There will be a crash right here unless the key "Privacy - Photo Library
        // Usage Description" exists in this app's Info.plist file:
        try? phpPhotoLib.performChangesAndWait {
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }
    }
}

asks the user for permission to store the captured photo data in the library. It does this once; it does not ask on subsequent photo captures. Is there a way for the app to bypass this user permission dialog, or at least present it to the user in advance of triggering a photo capture when the app first runs? In this situation the camera is being used to capture science data in a laboratory setting. The scientist using this app already expects, and needs, the photo to be stored. The redundant permission dialog makes data collection cumbersome and inconvenient because of the extra effort required to reach the camera where it is located.
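To the best of my understanding the dialog itself cannot be bypassed, since it is enforced by the system, but it can be triggered up front so it never interrupts a capture. A sketch (function name illustrative) to call at app launch or in viewDidLoad:

```swift
import Photos

// Present the photo-library permission dialog before any capture happens.
// Once the user has answered, later captures proceed without interruption.
func requestPhotoPermissionUpFront() {
    let status = PHPhotoLibrary.authorizationStatus(for: .addOnly)
    if status == .notDetermined {
        PHPhotoLibrary.requestAuthorization(for: .addOnly) { newStatus in
            print("Photo library add-only access: \(newStatus.rawValue)")
        }
    }
}
```

Requesting only `.addOnly` access also keeps the prompt as narrow as the lab workflow needs, since the app stores photos but does not read the library.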
Posted
by spflanze.
Last updated
.
Post marked as solved
2 Replies
1.1k Views
I added a scene to an existing storyboard that already has other scenes. I put focus on the new scene, and in the Identity Inspector's "Class" field I gave it a class name. The "Storyboard ID" field did not show in this inspector. Why would it not? No file named after the class I entered appeared in the Project navigator. Is this supposed to happen automatically, or must I manually create the Swift file that contains the class definition for the scene?
Posted
by spflanze.
Last updated
.
Post marked as solved
2 Replies
1.4k Views
I have a UITextField object in a storyboard's scene where the user makes numeric entries. Because this field is for numbers only, I set the keyboard type to "Number Pad" in its attributes. But this keyboard has no Return key, which leaves the user unable to signal that the entry is complete, the keyboard should be put away, and the entered value processed. I tried all the other numeric keyboards as well and found that none of them has a Return key. The default keyboard does, but it is a full alphanumeric keyboard, which is undesirable for this field. How can I make the Return key, Done key, or whichever key performs that function, appear on a numbers-only keyboard?
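A common workaround (a sketch; the extension and method names are illustrative): the number pad genuinely has no Return key, so attach an inputAccessoryView toolbar carrying a Done button above it:

```swift
import UIKit

extension UITextField {
    // Show a toolbar with a Done button above the number pad. Tapping Done
    // resigns first responder, which dismisses the keyboard and fires the
    // field's "Editing Did End" event.
    func addDoneToolbar() {
        let toolbar = UIToolbar()
        let flex = UIBarButtonItem(barButtonSystemItem: .flexibleSpace,
                                   target: nil, action: nil)
        let done = UIBarButtonItem(barButtonSystemItem: .done,
                                   target: self,
                                   action: #selector(UIResponder.resignFirstResponder))
        toolbar.items = [flex, done]
        toolbar.sizeToFit()
        inputAccessoryView = toolbar
    }
}

// Usage, e.g. in viewDidLoad:
// numericField.addDoneToolbar()
```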
Posted
by spflanze.
Last updated
.
Post marked as solved
1 Reply
563 Views
I have a UITextField object on a storyboard's scene. Its "Editing Did End" event is connected to the view controller's actIPhoneName() event handler method, as shown in this screenshot. I configured the keyboard to show its "Done" key. I expected that when the Done key is touched, the keyboard would disappear and the actIPhoneName() method would be called. But neither of these happens, nor does anything else: the UITextField object remains in edit mode, and the breakpoint is never reached. What must I do to make the Done keyboard key cause the UITextField object to lose its first responder status and call its "Editing Did End" event handler method?
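One hedged explanation: "Editing Did End" only fires after the field resigns first responder, and the Done key does not do that by itself; something has to resign it, typically the delegate's textFieldShouldReturn(_:) (or the "Did End On Exit" action). A sketch, with an illustrative controller and outlet name:

```swift
import UIKit

class NameViewController: UIViewController, UITextFieldDelegate {
    @IBOutlet weak var iPhoneName: UITextField!

    override func viewDidLoad() {
        super.viewDidLoad()
        iPhoneName.delegate = self // without this, delegate calls never arrive
    }

    // Called when the Return/Done key is tapped.
    func textFieldShouldReturn(_ textField: UITextField) -> Bool {
        textField.resignFirstResponder() // dismisses the keyboard and
        return true                      // triggers "Editing Did End"
    }
}
```

The delegate can also be wired in the storyboard by control-dragging from the text field to the view controller and choosing "delegate".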
Posted
by spflanze.
Last updated
.
Post marked as solved
1 Reply
659 Views
I do not know what I did to cause it, but somehow I accidentally got Xcode's Interface Builder to display a storyboard file in an undesired mode that shows the UI buttons and text fields as rectangles, as in the screenshot below. How do I reverse what I did and get it out of this mode?
Posted
by spflanze.
Last updated
.
Post marked as solved
1 Reply
556 Views
I am attempting to follow this example of how to create a camera app: https://www.appcoda.com/avfoundation-swift-guide/ It describes a sequence of four steps, and these steps are embodied in four functions:

func prepare(completionHandler: @escaping (Error?) -> Void) {
    func createCaptureSession() { }
    func configureCaptureDevices() throws { }
    func configureDeviceInputs() throws { }
    func configurePhotoOutput() throws { }

    DispatchQueue(label: "prepare").async {
        do {
            createCaptureSession()
            try configureCaptureDevices()
            try configureDeviceInputs()
            try configurePhotoOutput()
        } catch {
            DispatchQueue.main.async {
                completionHandler(error)
            }
            return
        }
        DispatchQueue.main.async {
            completionHandler(nil)
        }
    }
}

What confuses me is that these four steps appear to need to execute sequentially, but in the dispatch queue they seem to execute simultaneously because .async is used. For example, farther down that web page the functions createCaptureSession() and configureCaptureDevices(), among others, are defined. In createCaptureSession() the member variables self.frontCamera and self.rearCamera are given values that are used in configureCaptureDevices(). So configureCaptureDevices() depends on createCaptureSession() having already executed, something that apparently cannot be depended upon if both functions execute simultaneously in separate threads. What then is the benefit of using DispatchQueue()? How is it assured that the above dependencies are met? And what is the label parameter of DispatchQueue()'s initializer used for?
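The confusion above has a clean answer: a DispatchQueue created with only a label is serial, and .async only means "do not block the caller". In the quoted example the four calls are inside one work item, so they run in order on one background thread anyway; and even separately submitted items on a serial queue run one after another. The label is purely a debugging aid shown in Instruments and crash logs. A small runnable sketch (function name illustrative) demonstrating the ordering guarantee:

```swift
import Dispatch
import Foundation

// Submit four work items asynchronously to a serial queue and show that
// they still execute strictly in submission order.
func demoSerialOrdering() -> [Int] {
    var order: [Int] = []
    let queue = DispatchQueue(label: "prepare") // serial by default
    let group = DispatchGroup()
    for step in 1...4 {
        queue.async(group: group) {
            order.append(step) // appended in submission order, one at a time
        }
    }
    group.wait() // block the caller until all four steps have run
    return order
}
```

So createCaptureSession() is guaranteed to finish before configureCaptureDevices() starts; the benefit of the queue is simply that the (slow) session setup happens off the main thread, with completionHandler hopped back to the main queue at the end.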
Posted
by spflanze.
Last updated
.