I read in Xcode 14 release notes:
Fixed: WatchKit storyboards are deprecated in watchOS 7.0 and later. Please migrate to SwiftUI and the SwiftUI Lifecycle. (94058186)
Indeed, when adding a watchOS target to an iOS app as a companion, it is now created with SwiftUI. In Xcode 13 we could still create it with WatchKit (storyboard).
That raises a few questions:
What will happen to existing apps with companions? Will we soon have to redesign the Watch part completely, moving from storyboards to SwiftUI?
What is the reason for such a deprecation? It leads to apps that are part storyboard, part SwiftUI, which is not an ideal situation.
Is WatchKit deprecated completely ?
Personal opinion: with some present limitations of SwiftUI (such as in Lists), it will be pretty hard to get the same design flexibility we had with WatchKit. I find it a regression not to keep both options available.
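For apps that do migrate, the SwiftUI lifecycle replaces the storyboard's initial interface controller with an App entry point; a minimal sketch (MyWatchApp and ContentView are placeholder names, not from any real project):

```swift
import SwiftUI

@main
struct MyWatchApp: App {        // hypothetical app name
    var body: some Scene {
        WindowGroup {
            ContentView()       // root view, replacing the storyboard's initial interface controller
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, watchOS")
    }
}
```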
I use a ViewBuilder to generate the destination used in a NavigationSplitView (to select the detail View).
This ViewBuilder depends on a parameter (in fact two, but I'll simplify here).
The ViewBuilder is simple; it just calls a View:
@ViewBuilder func destination(object: SomeObject, name: String) -> some View {
    MyView(objToUse: object, nameToUse: name)
}
But this does not work. When I change the selection in the master of the split view, the view is not updated (even though I've checked that the content is updated).
This is so simple that I started using directly
MyView(objToUse: object, nameToUse: name)
in the detail View.
It did not work either.
Now, here is the surprise: if I use a switch statement in the ViewBuilder, it works:
Let's say we have:
struct SomeContent: Hashable, Identifiable, Codable {
    var id = UUID()
    var name: String
}
struct SomeObject: Hashable, Identifiable, Equatable, Codable {
    var id = UUID()
    var content: [SomeContent] = []
}
So I define a func to get all the names:
func allNames(of object: SomeObject) -> [String] {
    var names: [String] = []
    for o in object.content {
        names.append(o.name)
    }
    return names
}
And modify the ViewBuilder as follows (it works):
@ViewBuilder func destination(object: SomeObject, name: String) -> some View {
    let names: [String] = allNames(of: object)
    switch name {
    case names[0]: MyView(objToUse: object, nameToUse: name)
    case names[1]: MyView(objToUse: object, nameToUse: name)
    case names[2]: MyView(objToUse: object, nameToUse: name)
    default: EmptyView()
    }
}
It also works with nested if else if instead of a switch.
What is it I am missing ?
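One plausible explanation is view identity: without the switch, SwiftUI sees the same MyView type at the same structural position for every selection, so it keeps the existing view instead of rebuilding it; the switch gives each case a distinct structural identity. A sketch of the usual workaround, tying identity to the parameter explicitly with .id() (MyView and SomeObject are the types from the question):

```swift
@ViewBuilder func destination(object: SomeObject, name: String) -> some View {
    MyView(objToUse: object, nameToUse: name)
        .id(name)  // changing `name` changes the view's identity, forcing a rebuild
}
```

If this is the cause, the switch only works because each branch is a different position in the view tree; .id() expresses the same intent without enumerating cases.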
That's an old question (this thread https://developer.apple.com/forums/thread/16375 dates back to iOS 9; see also this other thread: https://stackoverflow.com/questions/67249956/uisearchbar-warning-uitexteffectswindow-should-not-become-key-please-file-a-bu), but I've not found a comprehensive answer yet.
I need to add a Return key to the numeric keyboard (by the way, why doesn't it exist as standard?).
I used to do it with
let keyBoardWindow = UIApplication.shared.windows.last
to get the keyboard window, and then add the subview (self.returnButton) to it:
keyBoardWindow?.addSubview(self.returnButton)
Worked great and still works OK with iOS 15.
But we are told to use connectedScenes instead…
So I adapted the code:
var keyBoardWindow: UIWindow? = nil
if #available(iOS 13.0, *) { // use connectedScenes
    let scenes = UIApplication.shared.connectedScenes
    let windowScene = scenes.first as? UIWindowScene
    if let kbWindow = windowScene?.windows.last {
        keyBoardWindow = kbWindow
    }
} else {
    keyBoardWindow = UIApplication.shared.windows.last
}
It runs, but the subview does not show.
The problem is that the last window used to be a UIRemoteKeyboardWindow (the true keyboard window) and is now a UITextEffectsWindow, which is not the real keyboard window. So adding the subview to it places it improperly in the hierarchy.
Note: someone detailed the view hierarchy here: https://developer.apple.com/forums/thread/664547
UIWindow
UITextEffectsWindow
UIInputWindowController
UIInputSetContainerView
UIInputSetHostView
UIEditingOverlayViewController
UIEditingOverlayGestureView
I guess I have to add the subview to something other than the UITextEffectsWindow (keyBoardWindow), but to what?
I tried
keyBoardWindow = kbWindow.superview as? UIWindow
to no avail.
I also tried to debug the view hierarchy, but the keyboard does not show in the debug view.
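Since UIRemoteKeyboardWindow is a private class, there is no supported API to reach it. A fragile sketch of one possible approach, matching the window's class name across all connected scenes; this relies on private implementation details (the keyboard window may not even be listed in a scene's windows on every iOS version) and could break with any release:

```swift
import UIKit

// Unsupported sketch: walk every window of every connected scene and
// match the private class name. May return nil or break in any iOS update.
func remoteKeyboardWindow() -> UIWindow? {
    for case let scene as UIWindowScene in UIApplication.shared.connectedScenes {
        for window in scene.windows {
            if String(describing: type(of: window)) == "UIRemoteKeyboardWindow" {
                return window
            }
        }
    }
    return nil
}
```

A more robust route for a Return key is probably the supported one: an inputAccessoryView on the text field, which needs no window spelunking at all.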
That's a follow-up to a previous thread:
https://developer.apple.com/forums/thread/707130
I did some tests on the iOS 16 simulator with Xcode 14ß.
Recognition is very poor, and the recognition rate for some single letters (an L or an I, for instance) is literally zero. The same code worked better (about 50% success on the same single letters) with iOS 15.2.
Did something change on iOS 16 ? I filed a bug report: Jun 7, 2022 at 3:28 PM – FB10066541
With this code, the button is at the center of the view and the message is logged only when tapping the button:
1. struct ContentView: View {
2. var body: some View {
3.
4. ZStack {
5. Button(action: {
6. print("Touched")
7. }) {
8. Image(systemName: "square.split.diagonal.2x2")
9. .font(.system(size: 24))
10. .foregroundColor(.black)
11. // .position(x: 12, y: 12)
12. }
13.
14. }
15. .frame(width: 300, height: 300)
16. .background(Color.yellow)
17. }
18. }
But if I uncomment line 11, the message is logged when tapping anywhere in the view.
How is it that position() changes the behaviour?
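A likely explanation: position() places the view at fixed coordinates inside a new frame that accepts all the space its parent proposes. Applied inside the Button's label, that full-size frame becomes part of the label, so the whole 300x300 area is tappable. A sketch of an alternative that keeps the small hit area by moving the Button itself instead of its label (same views as the code above):

```swift
ZStack {
    Button(action: {
        print("Touched")
    }) {
        Image(systemName: "square.split.diagonal.2x2")
            .font(.system(size: 24))
            .foregroundColor(.black)
    }
    .position(x: 12, y: 12)  // positions the button within the ZStack;
                             // the label, and thus the hit area, stays small
}
.frame(width: 300, height: 300)
.background(Color.yellow)
```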
In a SwiftUI project, I get the following runtime error:
2022-12-15 11:31:37.453318+0100 MyApp[7039:3585656] [SceneConfiguration] Info.plist contained no UIScene configuration dictionary (looking for configuration named "(no name)")
There are no scenes in the project.
I have tried to change the settings, to no avail.
Post not yet marked as solved
I need to implement a remote control for a SwiftUI app running on iPad.
I would need to handle 2 different events, like Start / Stop.
On the software side, I plan to use .onReceive to listen to the BLE device.
So my questions:
I need real-time reaction (a few hundredths of a second at most).
Can any BLE remote control do the job?
If anyone has tested a low-cost remote, I am interested to know.
Or should I look for Homekit solutions ?
Or use an iPhone as remote controller ?
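For the software side, a minimal CoreBluetooth sketch of what the listener feeding .onReceive could look like. The service and characteristic UUIDs and the Start/Stop payload are pure assumptions (they depend on the remote you pick), and actual latency is bounded by the peripheral's connection interval, typically on the order of 15 to 30 ms at best:

```swift
import CoreBluetooth
import Combine

final class RemoteListener: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    // Hypothetical UUIDs: replace with those advertised by the actual remote.
    static let serviceUUID = CBUUID(string: "FFE0")
    static let characteristicUUID = CBUUID(string: "FFE1")

    let events = PassthroughSubject<Data, Never>()  // a SwiftUI view can .onReceive this
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: [Self.serviceUUID])
        }
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        self.peripheral = peripheral
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices([Self.serviceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        guard let service = peripheral.services?.first else { return }
        peripheral.discoverCharacteristics([Self.characteristicUUID], for: service)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        guard let characteristic = service.characteristics?.first else { return }
        peripheral.setNotifyValue(true, for: characteristic)  // push notifications, no polling
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        if let data = characteristic.value {
            events.send(data)  // e.g. interpret the first byte as Start / Stop
        }
    }
}
```

Notifications (setNotifyValue) rather than reads keep the path event-driven, which is the best BLE can offer for latency.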
I installed Xcode 14.2 (in parallel with other versions of Xcode with different names) on MBP MacOS 12.6.2.
It works OK except when trying to use the watchOS simulator.
When I select a watchOS target and then look for a simulator, none is installed; I get a menu item proposing to GET watchOS 9.1.
I downloaded it.
But at the end of the download, installation failed with a message saying that installation of the watchOS 9.1 simulator runtime failed in CoreSimulator.
I tried the solution proposed here, to no avail:
https://stackoverflow.com/questions/74096242/unable-to-select-device-for-watchos-app-in-xcode
Note: installation on an iMac running 12.6.2 and Xcode 14.2 shows a long list of simulators.
Is there a difference in the distribution of results between these 2 forms?
let x = (1...9).randomElement()!
let y = Int.random(in: 1...9)
In the docs, I never see the first form used.
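Both forms should be uniform over 1...9: randomElement() picks a uniformly random index into the collection, and Int.random(in:) samples the range uniformly, both drawing from SystemRandomNumberGenerator by default. A quick empirical check:

```swift
// Tally 90_000 draws of each form; every value in 1...9 should appear
// roughly 10_000 times for both, within normal sampling noise.
var countsA = [Int: Int]()
var countsB = [Int: Int]()
for _ in 0..<90_000 {
    countsA[(1...9).randomElement()!, default: 0] += 1
    countsB[Int.random(in: 1...9), default: 0] += 1
}
print(countsA.sorted { $0.key < $1.key })
print(countsB.sorted { $0.key < $1.key })
```

The practical difference is ergonomic: randomElement() returns an Optional (nil for an empty collection), hence the force unwrap, while Int.random(in:) traps on an empty range.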
I get this error in Xcode 14 / iOS 16 on device, which I did not get with previous versions.
[general] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSNumber' (0x205da88f8) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.objects', even though it was not explicitly included in the client allowed classes set: '{(
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]",
"'NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]"
)}'. This will be disallowed in the future.
The only places where I reference NSDictionary.self or NSString.self or NSNumber.self as allowed classes are:
@objc class MyClass: NSObject, NSSecureCoding {
    required init(coder decoder: NSCoder) {
        let myObject = decoder.decodeObject(of: [MyClass.self, NSNumber.self, NSArray.self, NSDictionary.self, NSString.self], forKey: myKey) as? [SomeClass] ?? []
    }
}
and in another class:
class Util {
    // in a class func:
    let data = try Data(contentsOf: fileURL)
    guard let unarchived = try NSKeyedUnarchiver.unarchivedObject(ofClass: NSDictionary.self, from: data) else { return nil }
}
I do not understand what NS.objects refers to.
I tried to add NSObject.self in
let myObject = decoder.decodeObject(of: [MyClass.self, NSNumber.self, NSArray.self, NSDictionary.self, NSString.self, NSObject.self], forKey: myKey) as? [SomeClass] ?? []
but that gave even more warnings:
[Foundation] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:]: NSSecureCoding allowed classes list contains [NSObject class], which bypasses security by allowing any Objective-C class to be implicitly decoded. Consider reducing the scope of allowed classes during decoding by listing only the classes you expect to decode, or a more specific base class than NSObject. This will become an error in the future. Allowed class list: {(
"'NSNumber' (0x205da88f8) [/System/Library/Frameworks/Foundation.framework]",
"'NSArray' (0x205da1240) [/System/Library/Frameworks/CoreFoundation.framework]",
"'NSObject' (0x205d8cb98) [/usr/lib]",
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]",
"'MyClass.self' (0x1002f11a0) [/private/var/containers/Bundle/Application/5517240E-FB23-468D-80FA-B7E37D30936A/MyApp.app]",
"'NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]"
Another warning refers to NS.keys:
2022-09-16 16:19:10.911977+0200 MyApp[4439:1970094] [general] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.keys', even though it was not explicitly included in the client allowed classes set: '{(
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]"
)}'. This will be disallowed in the future.
I do not understand what NS.keys refers to.
What additional class types should I add ?
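For context (my understanding of the archive format): NS.keys and NS.objects are the internal keys NSKeyedArchiver uses to store a dictionary's keys and values (NS.objects alone for an array's elements). So the first warning says a dictionary in the archive contains NSNumber values, and the NS.keys one says its keys are NSStrings; the fix is listing those concrete classes, not NSObject. A self-contained sketch with a hypothetical dictionary:

```swift
import Foundation

// Hypothetical payload: string keys, string and number values.
let dict: NSDictionary = ["answer": 42, "label": "hi"]
let archived = try NSKeyedArchiver.archivedData(withRootObject: dict,
                                                requiringSecureCoding: true)

// NS.keys holds the dictionary's keys, NS.objects its values; every concrete
// class that can appear there must be listed, without an NSObject fallback.
let allowed: [AnyClass] = [NSDictionary.self, NSString.self, NSNumber.self]
let unarchived = try NSKeyedUnarchiver.unarchivedObject(ofClasses: allowed,
                                                        from: archived) as? [String: Any]
```

Applied to the code above, that would mean keeping the existing list but making sure NSNumber.self is present wherever a dictionary with number values is decoded, e.g. passing the full ofClasses list to the Util decode as well.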
In this app I get access to QR codes.
Reading works perfectly from the camera.
Now I am struggling to get the image that was processed by the built-in QR code reader.
I have found many hints on SO, but cannot make it work.
Here is the code I have now.
It is a bit long; I had to split it in 2 parts.
I looked at:
// https://stackoverflow.com/questions/56088575/how-to-get-image-of-qr-code-after-scanning-in-swift-ios
// https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
import UIKit
import AVFoundation
class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
		fileprivate var captureSession: AVCaptureSession! // use for QRCode reading
		fileprivate var previewLayer: AVCaptureVideoPreviewLayer!
		
		// To get the image of the QRCode
		private var photoOutputQR: AVCapturePhotoOutput!
		private var isCapturing = false
		
		override func viewDidLoad() {
				super.viewDidLoad()
				var accessGranted = false
			 //	switch AVCaptureDevice.authorizationStatus(for: .video) {
// HERE TEST FOR ACCESS RIGHT. WORKS OK ;
// But is .video enough ?
				}
				
				if !accessGranted {	return }
				captureSession = AVCaptureSession()
				
				photoOutputQR = AVCapturePhotoOutput() // IS IT THE RIGHT PLACE AND THE RIGHT THING TO DO ?
				captureSession.addOutput(photoOutputQR)	 // Goal is to capture an image of QRCode once acquisition is done
				guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
				let videoInput: AVCaptureDeviceInput
				
				do {
						videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
				} catch {	return }
				
				if (captureSession.canAddInput(videoInput)) {
						captureSession.addInput(videoInput)
				} else {
						failed()
						return
				}
				
				let metadataOutput = AVCaptureMetadataOutput()
				
				if (captureSession.canAddOutput(metadataOutput)) {
						captureSession.addOutput(metadataOutput) // SO I have 2 output in captureSession. IS IT RIGHT ?
						
						metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
						metadataOutput.metadataObjectTypes = [.qr]	// For QRCode video acquisition
						
				} else {
						failed()
						return
				}
				
				previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
				previewLayer.frame = view.layer.bounds
				previewLayer.frame.origin.y += 40
				previewLayer.frame.size.height -= 40
				previewLayer.videoGravity = .resizeAspectFill
				view.layer.addSublayer(previewLayer)
				captureSession.startRunning()
		}
		
		override func viewWillAppear(_ animated: Bool) {
				
				super.viewWillAppear(animated)
				if (captureSession?.isRunning == false) {
						captureSession.startRunning()
				}
		}
		
		override func viewWillDisappear(_ animated: Bool) {
				
				super.viewWillDisappear(animated)
				if (captureSession?.isRunning == true) {
						captureSession.stopRunning()
				}
		}
		
		// MARK: - scan Results
		
		func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
				
				captureSession.stopRunning()
				
				if let metadataObject = metadataObjects.first {
						guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
						guard let stringValue = readableObject.stringValue else { return }
						AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
						found(code: stringValue)
				}
				// Get image - IS IT THE RIGHT PLACE TO DO IT ?
				// https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
				print("Do I get here ?", isCapturing)
				let photoSettings = AVCapturePhotoSettings()
				let previewPixelType = photoSettings.availablePreviewPhotoPixelFormatTypes.first!
				print("previewPixelType", previewPixelType)
				let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
														 kCVPixelBufferWidthKey as String: 160,
														 kCVPixelBufferHeightKey as String: 160]
				photoSettings.previewPhotoFormat = previewFormat
				if !isCapturing {
						isCapturing = true
						photoOutputQR.capturePhoto(with: photoSettings, delegate: self)
				}
				dismiss(animated: true)
		}
		
}
extension ScannerViewController: AVCapturePhotoCaptureDelegate {
	
		func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
			
				isCapturing = false
				print("photo", photo, photo.fileDataRepresentation())
				guard let imageData = photo.fileDataRepresentation() else {
						print("Error while generating image from photo capture data.");
						return
				}
		 }
}
I get the following print on the console.
Clearly, the photo is not loaded properly:
Do I get here ? false
previewPixelType 875704422
photo <AVCapturePhoto: 0x281973a20 pts:nan 1/1 settings:uid:3 photo:{0x0} time:nan-nan> nil
Error while generating image from photo capture data.
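One likely cause, judging from the pts:nan and {0x0} dimensions in the log: capturePhoto(with:delegate:) is issued right after captureSession.stopRunning(), so the session is no longer delivering frames when the still is requested. A sketch of one possible reordering, using the same properties as the code above, where the session stops only once the photo has been delivered:

```swift
func metadataOutput(_ output: AVCaptureMetadataOutput,
                    didOutput metadataObjects: [AVMetadataObject],
                    from connection: AVCaptureConnection) {
    guard let readableObject = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
          let stringValue = readableObject.stringValue else { return }
    AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
    found(code: stringValue)

    // Keep the session running until the still has been captured.
    if !isCapturing {
        isCapturing = true
        photoOutputQR.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }
}

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    isCapturing = false
    captureSession.stopRunning()   // stop only once the photo is in hand
    dismiss(animated: true)        // likewise deferred until capture completes
    guard let imageData = photo.fileDataRepresentation() else {
        print("Error while generating image from photo capture data.")
        return
    }
    // use imageData…
}
```

This is a sketch, not a verified fix; the two outputs on one session (metadata plus photo) are fine per se, but the capture must complete before the session is torn down or the view dismissed.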
I am trying to exclude some activities from a UIActivityViewController.
It works as expected when the exclusion uses a predefined activity type, as with:
UIActivity.ActivityType.message,
UIActivity.ActivityType.airDrop
but not when the activity type is declared with an initializer, as with:
UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension"),
UIActivity.ActivityType(rawValue: "com.ifttt.ifttt.share"),
So, with the following code:
let excludedActivityTypes = [
    UIActivity.ActivityType.message,
    UIActivity.ActivityType.airDrop,
    UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension"),
    UIActivity.ActivityType(rawValue: "com.ifttt.ifttt.share")
]
let activityVC = UIActivityViewController(activityItems: [modifiedPdfURL], applicationActivities: nil)
activityVC.excludedActivityTypes = excludedActivityTypes
Message and AirDrop do not show, but WhatsApp and IFTTT still show.
I have verified with
activityVC.completionWithItemsHandler = { (activity, success, modifiedItems, error) in
    print("activity: \(activity), success: \(success), items: \(modifiedItems), error: \(error)")
}
that the WhatsApp and IFTTT services are effectively the ones listed here.
When selecting WhatsApp, print above gives:
activity: Optional(__C.UIActivityType(_rawValue: net.whatsapp.WhatsApp.ShareExtension)), success: false, items: nil, error: nil
Hi Quinn, if I may add a point 10 to your (very meaningful) list:
when pasting images, think of reducing their size (you can simply edit the image reference and divide a dimension by 2) to avoid huge images that clutter the screen.
When you get an answer, please indicate whether it solved the issue; if not, explain what is not working. And don't forget to close the thread by marking the correct answer once you have it.
Do you use third-party frameworks, like Alamofire?
I have a tableView inside a UIView in a VC.
This VC is embedded in a navigation controller. It is a UIViewController, not a UITableViewController.
The VC setup is this one:
The issue:
When I run the app, there is a bar (full width, about 25 points high) appearing over the tableView at the bottom, partially hiding a cell (the (i) button is nearly completely hidden). When scrolling the tableView, this bar does not move and hides other cells.
I looked at the view hierarchy in the debugger (I edited the content of the cells)
and found that the bar doing the hiding comes from the navigation controller.
So, 2 questions:
What is this view?
How can I avoid it coming over the table view?
I have a workaround (replacing upperView with a header in the tableView), but I would like to understand what is going on here.