In this app I scan QR codes.
Reading works perfectly from the camera.
Now I am struggling to get the image that was processed by the built-in QR code reader.
I have found many hints on SO, but cannot make it work.
Here is the code I have now.
It is a bit long, so I had to split it in 2 parts.
I looked at:
// https://stackoverflow.com/questions/56088575/how-to-get-image-of-qr-code-after-scanning-in-swift-ios
// https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
import UIKit
import AVFoundation
class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
		fileprivate var captureSession: AVCaptureSession! // use for QRCode reading
		fileprivate var previewLayer: AVCaptureVideoPreviewLayer!
		
		// To get the image of the QRCode
		private var photoOutputQR: AVCapturePhotoOutput!
		private var isCapturing = false
		
		override func viewDidLoad() {
				super.viewDidLoad()
				var accessGranted = false
				// switch AVCaptureDevice.authorizationStatus(for: .video) { ... }
				// HERE TEST FOR ACCESS RIGHT AND SET accessGranted. WORKS OK.
				// But is .video enough ?

				if !accessGranted { return }
				captureSession = AVCaptureSession()
				
				photoOutputQR = AVCapturePhotoOutput() // IS IT THE RIGHT PLACE AND THE RIGHT THING TO DO ?
				captureSession.addOutput(photoOutputQR)	 // Goal is to capture an image of QRCode once acquisition is done
				guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
				let videoInput: AVCaptureDeviceInput
				
				do {
						videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
				} catch {	return }
				
				if (captureSession.canAddInput(videoInput)) {
						captureSession.addInput(videoInput)
				} else {
						failed()
						return
				}
				
				let metadataOutput = AVCaptureMetadataOutput()
				
				if (captureSession.canAddOutput(metadataOutput)) {
						captureSession.addOutput(metadataOutput) // SO I have 2 outputs in captureSession. IS IT RIGHT ?
						
						metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
						metadataOutput.metadataObjectTypes = [.qr]	// For QRCode video acquisition
						
				} else {
						failed()
						return
				}
				
				previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
				previewLayer.frame = view.layer.bounds
				previewLayer.frame.origin.y += 40
				previewLayer.frame.size.height -= 40
				previewLayer.videoGravity = .resizeAspectFill
				view.layer.addSublayer(previewLayer)
				captureSession.startRunning()
		}
		
		override func viewWillAppear(_ animated: Bool) {
				
				super.viewWillAppear(animated)
				if (captureSession?.isRunning == false) {
						captureSession.startRunning()
				}
		}
		
		override func viewWillDisappear(_ animated: Bool) {
				
				super.viewWillDisappear(animated)
				if (captureSession?.isRunning == true) {
						captureSession.stopRunning()
				}
		}
		
		// MARK: - scan Results
		
		func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
				
				captureSession.stopRunning()
				
				if let metadataObject = metadataObjects.first {
						guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
						guard let stringValue = readableObject.stringValue else { return }
						AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
						found(code: stringValue)
				}
				// Get image - IS IT THE RIGHT PLACE TO DO IT ?
				// https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
				print("Do I get here ?", isCapturing)
				let photoSettings = AVCapturePhotoSettings()
				let previewPixelType = photoSettings.availablePreviewPhotoPixelFormatTypes.first!
				print("previewPixelType", previewPixelType)
				let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
														 kCVPixelBufferWidthKey as String: 160,
														 kCVPixelBufferHeightKey as String: 160]
				photoSettings.previewPhotoFormat = previewFormat
				if !isCapturing {
						isCapturing = true
						photoOutputQR.capturePhoto(with: photoSettings, delegate: self)
				}
				dismiss(animated: true)
		}
		
}
extension ScannerViewController: AVCapturePhotoCaptureDelegate {
	
		func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
			
				isCapturing = false
				print("photo", photo, photo.fileDataRepresentation())
				guard let imageData = photo.fileDataRepresentation() else {
						print("Error while generating image from photo capture data.");
						return
				}
		 }
}
I get the following output in the console. Clearly the photo is not loaded properly:
Do I get here ? false
previewPixelType 875704422
photo <AVCapturePhoto: 0x281973a20 pts:nan 1/1 settings:uid:3 photo:{0x0} time:nan-nan> nil
Error while generating image from photo capture data.
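Here is the direction I am thinking of trying (just a sketch, I am not sure this is the actual cause of the nil data): keep the session running until the photo delegate has fired, and only stop / dismiss from there. pendingCode is a hypothetical property I would add to hold the scanned string.
		// Sketch only: same properties as above (captureSession, photoOutputQR, isCapturing),
		// plus a hypothetical `private var pendingCode: String?` to hold the scanned value.
		func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
				guard let readableObject = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
							let stringValue = readableObject.stringValue else { return }
				AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
				pendingCode = stringValue

				// Assumption: the session needs to still be running when capturePhoto(with:delegate:) is called.
				if !isCapturing {
						isCapturing = true
						photoOutputQR.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
				}
		}

		func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
				isCapturing = false
				let qrImage = photo.fileDataRepresentation().flatMap { UIImage(data: $0) }
				captureSession.stopRunning()
				if let code = pendingCode { found(code: code) }
				print("QR image:", qrImage as Any)   // use qrImage here as needed
				dismiss(animated: true)
		}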
One of my production Macs is still on Mojave. I want to update to Catalina and not Big Sur (waiting for the dust to settle).
Automatic update from System Preferences only proposes Big Sur, as well as some patch updates for macOS 10.14.6.
What is the safe and sure way to update from Mojave to the latest Catalina ?
On a MBP with Catalina.
When I use Xcode 12 (any release) and open a SwiftUI app (an absolutely basic one, just a say-hello), I get the usual list of simulators.
If I open the same app with any Xcode below 12 (e.g. 11.6, 11.5), I do not see any simulator in the list.
Any other app (SwiftUI or pure Swift) has no problem.
I tried to add simulators, but could not get it done.
What could be corrupted in the SwiftUI app ?
I have an image to fill the entire screen, even behind the notch.
So I set the Top margin to safe area to -44.
That works OK on all iPhones… except on the iPhone 12 Pro Max simulator.
Here I get a few pixels uncovered at the top.
I had to change the value to -48 to get everything OK.
So the question is: has the "notch area" which defines the safe area increased by 4 pixels on the iPhone 12 Pro Max ? If so, is it a hardware change ?
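As a possible workaround, I am considering reading the actual inset at run time instead of hard-coding -44 / -48. Just a sketch; the class name and the outlet are made up:
import UIKit

class FullScreenImageViewController: UIViewController {          // made-up name
		@IBOutlet weak var imageTopConstraint: NSLayoutConstraint!   // made-up outlet to the image view's top constraint

		override func viewSafeAreaInsetsDidChange() {
				super.viewSafeAreaInsetsDidChange()
				// Use whatever inset the device actually reports instead of a hard-coded -44 / -48.
				imageTopConstraint.constant = -view.safeAreaInsets.top
		}
}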
When I open the storyboard of a project created in an older Xcode with Xcode 12.1, I get the following error message:
The document “Main.storyboard” had an issue that was found and repaired.
This may be due to an SCM operation such as merging. Please save the document to fix the issue. With the following details:
Multiple resources have the same name: darkTextColor. I searched for darkTextColor in the project but could not find any.
But opening the storyboard as source shows that an entry is duplicated:
		<resources>
				<image name="BoutonBackToStart.png" width="53" height="53"/>
				<image name="BoutonCompter.png" width="79" height="79"/>
				<image name="BoutonHelp.png" width="35" height="35"/>
				<image name="BoutonPrefs.png" width="53" height="52"/>
				<systemColor name="darkTextColor">
						<color white="0.0" alpha="1" colorSpace="custom" customColorSpace="genericGamma22GrayColorSpace"/>
				</systemColor>
				<systemColor name="darkTextColor">
						<color white="0.0" alpha="1" colorSpace="custom" customColorSpace="genericGamma22GrayColorSpace"/>
				</systemColor>
However, the project compiles and runs without issue in Xcode 12.1.
I thus accepted the repair. But the next time I opened it, the same error appeared again.
So I manually deleted the redundant entry and the error disappeared for good.
Then I looked at the project in Xcode 11.3. No error when opening the storyboard.
Looking at the storyboard as source, I do find darkTextColor, but not in the resources section; it appears in various objects (as label subviews), under the key
<color key="textColor" cocoaTouchSystemColor="darkTextColor"/>
So it seems that Xcode 12 converts the storyboard incorrectly when moving the color key to the resources section, here duplicating an entry.
Did anyone notice a similar issue ?
I discovered the term "Release Candidate" in the downloads and wondered what it was.
I finally found a note at https://developer.apple.com/news/releases/
Note: The term "Release Candidate" (RC) replaces "GM seed" and indicates this version is near final.
I am archiving a new version of an app and want to upload it to the App Store.
I did this on a first Mac (MBP / Xcode 12 GM). [Is GM relevant here ? Can I archive with a GM version ?]
At some point after asking to upload, I was notified that I was missing a Distribution certificate (I may have revoked it some time ago for another reason).
I thus created a new distribution certificate and went on to upload.
I asked for automatic signing and it went on processing.
I got the iPad content info, indicating notably the keychain-access-group and team identifier.
I hit upload, and after about 1 minute on the "Creating App Store Connect API analysis file…" message I got an App Store Connect Operation Error: "an error occurred when uploading to the App Store".
I could not see the last message for long, but in a blink it seemed to be an authentication error.
So I tried to archive and upload from the other Mac (Xcode 11.3).
Now I get the message that "XXXX has 2 distribution certificates, but their private keys are not installed. Contact the creator to get a copy of the private key."
I'm lost here.
How do I get such a copy of the private key ?
Do I need to completely recreate the private key ?
How do I install it ?
Should I remove the certificates and recreate them ? If so, is there a risk for other published apps ? Do I have to do it on each Mac ?
Or should I do it in Xcode, creating a new distribution certificate on each Mac (or only on one) ?
I looked at this SO: https://stackoverflow.com/questions/16563364/how-can-i-add-private-key-to-the-distribution-certificate
They say:
Yes, the error you are getting means that there is not a private key on your Mac associated with the distribution certificate you are trying to use to sign the app.
There are two possible solutions, depending on whether the computer that requested the distribution certificate is available or not.
If the computer that requested the distribution certificate is available (or there is a backup of the distribution assets somewhere):
1. From the computer where the distribution asset was generated, open Xcode.
2. Click on Window, Organizer.
3. Expand the Teams section.
4. Select your team, select the certificate of "iOS Distribution" type, click Export and follow the instructions.
5. Save the exported file and go to your computer.
6. Repeat steps 1-3.
7. Click Import and select the file you exported before.
But at step 3, I do not see a Teams section in the Organizer (maybe it changed location since this SO thread ?)
I opened an existing playground file (iOS) on Xcode 11.3.
The panel with the ">" to execute the playground does not show anymore ; the panel that opens at the bottom is for debug, with a button "=>" for breakpoints.
I cannot find a way to make this "execute playground" reappear.
Note: I have created a new iOS playground, pasted the complete playground content into it, and everything works normally.
In both cases (old and new playground), the panel at the bottom is named "Debug area" (when you hover the mouse over the leftmost button "v"). But in the old playground, there are buttons to show variables and debug.
In addition, the old playground fails to run, whereas the new one, with the exact same code, succeeds.
What am I missing here ?
In this app, I send user notifications and ask for them to repeat.
I have an Apple Watch paired to the iPhone, but the app is purely iPhone.
User notifications are delivered on the watch automatically when the iPhone is locked.
Good, it works.
But I keep receiving the repeated notification even after swiping on the watch to tap the OK acknowledgement.
The delegate is set everywhere I invoke the current notification center.
I want it to repeat only 3 times, so I have a counter that is decremented each time a notification is triggered, and the notification is removed if the user acknowledged it.
This is done in
userNotificationCenter(_ center: UNUserNotificationCenter, didReceive response: UNNotificationResponse, withCompletionHandler completionHandler: @escaping () -> Void)
This works on the iPhone, but not when the phone is locked and notifications show on the watch: the decrement is apparently not done, and the initial notification keeps showing on the watch even after the user has acknowledged it.
I probably miss something here.
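For reference, the acknowledgement path looks roughly like this (simplified sketch; repeatCount and the delegate class are placeholders for my real code):
import UserNotifications

// Simplified sketch of the acknowledgement path (repeatCount and the class are placeholders).
class NotificationHandler: NSObject, UNUserNotificationCenterDelegate {

		var repeatCount = 3   // remaining repeats

		func userNotificationCenter(_ center: UNUserNotificationCenter,
																didReceive response: UNNotificationResponse,
																withCompletionHandler completionHandler: @escaping () -> Void) {
				// User tapped / acknowledged the notification: cancel the remaining repeats.
				repeatCount = 0
				center.removePendingNotificationRequests(withIdentifiers: [response.notification.request.identifier])
				completionHandler()
		}
}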
I looked at this example of protocol declaration and composition
https://medium.com/@09mejohn/protocol-composition-in-swift-bd7ed6018ac5
protocol CanBark {
		func bark()
}
protocol CanWaggTail {
		func waggTail()
}
struct Dog {
		
		// Protocol Composition
		typealias Dependency = CanBark & CanWaggTail
		
		let dogManager: Dependency
		func ballFound() {
				dogManager.waggTail()
				dogManager.bark()
		}
		
}
It compiles even though bark() is not defined.
More important, how do I declare a Dog ? I am requested to provide the dogManager parameter (autocompletion tells me):
let mario = Dog(dogManager: <#T##Dog.Dependency#>)
So I tried to create:
let dependency : CanBark & CanWaggTail
But how should I initialise dependency for use with mario ?
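The direction I am trying is to define a concrete type adopting both protocols and pass an instance of it as the dependency (DogManager is just a name I made up), but I am not sure it is the intended pattern:
// A concrete type adopting both protocols (DogManager is an arbitrary name).
struct DogManager: CanBark, CanWaggTail {
		func bark()     { print("Woof!") }
		func waggTail() { print("*wags tail*") }
}

let mario = Dog(dogManager: DogManager())   // the memberwise init accepts any CanBark & CanWaggTail
mario.ballFound()                           // calls waggTail() then bark()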
The new forum has been there for a week now.
I did wait to get accustomed to it so that my comments are not just due to the change factor.
I think it is now possible to make some assessment and I would appreciate others’ opinion to see if it is just me…
There are some good points:
- the editor is much less whimsical
- the possibility to tag topics is a good thing
- search has improved (a little).
But I find some points really frustrating:
- The editor is better and has a live Preview. What a strange concept in the world of the inventor of WYSIWYG ! Did we return to the 90's editors with their control characters ?
- Tags are good, but IMHO it would be useful to have major and minor tags. That would help focus when we list threads by tag, by making it possible to select only the major one.
- All posts are now reviewed to avoid spam. Fair enough. But why can it take hours (in one case, already 5 hours and still in review) for a very simple answer post (no URL, no bizarre word) to be reviewed ? A bit frustrating. And that does not allow for any rapidly moving discussion.
- Many features seem to be inspired by SO, but without some critical capabilities, such as the inclusion of images.
- The concept of upvote and downvote has been introduced as well. Was it really needed ? Won't it lead to some of the regrettable side effects we see on SO, where some do not dare propose an answer or even ask a question for fear of a negative vote ? That's particularly the case for anyone not very fluent in English who may not express his/her opinion properly. I did find the previous developer forum a more welcoming place than SO. I hope that will not change.
There are also capabilities that are missing from the previous forum (unless they are so well hidden I could not find them):
- Because of increased spacing, there are now just 15 instead of 25 posts per page, and you have to scroll a lot more to get to the end of the shorter list. That does slow down screening.
- How can one see the threads he contributed to ? It seems we can only filter the ones we authored.
- It now seems quasi impossible to edit a post, and of course to delete it.
So in the end, instead of the wow effect I was expecting, I have just mixed feelings.
I get a behavior I never experienced before when updating Xcode from 11.4.1 to 11.5 from the App Store. I'm using a MBP, macOS 10.15.4.
- When the update starts, the system indicates (in Launchpad) a download of 2.03 GB (!?! Not 8 GB).
- After the download, installation starts (or seems to start).
- After a minute or so, a new download starts, this time announced as 8.09 GB.
Note: I also tried to download from downloads/more. I get a full download (of 8.12 GB), even though the file is labelled (on the web page) as 7.5 GB for the xip file.
Do you use third party frameworks, like Alamofire ?
I submitted an iOS app with a watchOS companion app. The app has been 'Metadata Rejected'. Here is the full message:
Guideline 2.1 - Information Needed
We have started the review of your app, but we are not able to continue because we need access to a video that demonstrates the current version of your app in use on a physical watchOS device.
Please only include footage in your demo video of your app running on a physical watchOS device, and not on a simulator. It is acceptable to use a screen recorder to capture footage of your app in use.
Next Steps
To help us proceed with the review of your app, please provide us with a link to a demo video in the App Review Information section of App Store Connect and reply to this message in Resolution Center.
To provide a link to a demo video:
- Log in to App Store Connect
- Click on "My Apps"
- Select your app
- Click on the app version on the left side of the screen
- Scroll down to "App Review Information"
- Provide demo video access details in the "Notes" section
- Once you've completed all changes, click the "Save" button at the top of the Version Information page.
Please note that in the event that your app may only be reviewed by means of a demo video, you will be required to provide an updated demo video with every resubmission.
Since your App Store Connect status is Metadata Rejected, we do NOT require a new binary. To revise the metadata, visit App Store Connect to select your app and revise the desired metadata values. Once you’ve completed all changes, reply to this message in Resolution Center and we will continue the review.
I have 3 questions:
- Is it a systematic requirement for Watch apps ? I did not see it in the guidelines ; or is it for some reason specific to my app or to the reviewer ?
- How can I record a video on the Apple Watch ? Should I film the watch while in operation and post this video ? Or is there a direct way to record the video from the watch to the iPhone (using system tools, not third party) ?
- I understand it is not a video for publication on the App Store, but a video for the tester. So should I include the video in the screen captures section or put it on some web site and give a link to it to the tester ?
Here is the setup, with the view hierarchy:
- A popover window
- a custom view
- an NSBox
- an NSView
- 4 radio buttons, aligned on the left vertically
The radio buttons share the same IBAction (to get the radio-group effect).
Here is the problem:
- the top-most radio button draws with an outer blue circle
- if I move this button down, or if I hide it, then another one gets the circle
- if I select the button, the outer circle remains.
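For context, the radio group is wired roughly like this (simplified sketch, names are made up). My guess is that the blue circle is the focus ring of the button acting as the window's initial first responder; hiding the focus ring is just an idea I want to test, not a confirmed fix:
import Cocoa

// Simplified sketch of the radio group wiring (class and outlet names are made up).
class PrefsPopoverViewController: NSViewController {

		@IBOutlet weak var optionA: NSButton!   // the 4 radio buttons, all in the same superview
		@IBOutlet weak var optionB: NSButton!
		@IBOutlet weak var optionC: NSButton!
		@IBOutlet weak var optionD: NSButton!

		// All four buttons point to the same action, which gives the mutually exclusive radio behaviour.
		@IBAction func radioChanged(_ sender: NSButton) {
				print("Selected:", sender.title)
		}

		override func viewDidLoad() {
				super.viewDidLoad()
				// If the blue circle is really the focus ring, this should hide it (just a guess):
				[optionA, optionB, optionC, optionD].forEach { $0?.focusRingType = .none }
		}
}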