I'm adding pointer interactions to my app.
if #available(iOS 13.4, *) {
    let interaction = UIPointerInteraction(delegate: self)
    button.addInteraction(interaction)
}
@available(iOS 13.4, *)
func pointerInteraction(_ interaction: UIPointerInteraction, styleFor region: UIPointerRegion) -> UIPointerStyle? {
    UIPointerStyle(effect: .lift(UITargetedPreview(view: button)), shape: .roundedRect(button.frame, radius: ViewAssistant.buttonCornerRadius))
}
I sometimes get this exception, and I'm not sure why:
Thread 1: Exception: "UIPreviewTarget requires that the container view is in a window, but it is not. (container: <UIView: 0x10873d800> => <UIScrollView: 0x108873c00> => <_UIVisualEffectContentView: 0x108745070> => <Pixel_Nodes.PanelCreatorView: 0x108727ad0>)"
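The only mitigation I've found so far is a hedged guard in the delegate method: return nil when the button isn't attached to a window yet, so the targeted preview is never built in that state (sketch, not verified):
@available(iOS 13.4, *)
func pointerInteraction(_ interaction: UIPointerInteraction, styleFor region: UIPointerRegion) -> UIPointerStyle? {
    // Only build the targeted preview when the view is in a window,
    // since UIPreviewTarget requires that.
    guard button.window != nil else { return nil }
    return UIPointerStyle(effect: .lift(UITargetedPreview(view: button)),
                          shape: .roundedRect(button.frame, radius: ViewAssistant.buttonCornerRadius))
}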
Hi, it seems AVCaptureDevice is not available in an iOS project via Mac Catalyst.
The official AVCam demo project also crashes while running via Mac Catalyst.
Does anyone know a solution to get a live camera feed?
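For what it's worth, here's the hedged setup sketch I'm testing, checking authorization and device availability at runtime instead of assuming a camera exists (and double-checking that the Catalyst target has the camera entitlement and NSCameraUsageDescription):
import AVFoundation

func startCameraFeed(into session: AVCaptureSession) {
    AVCaptureDevice.requestAccess(for: .video) { granted in
        // Bail out instead of crashing when there is no usable camera or no permission.
        guard granted,
              let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .unspecified),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.beginConfiguration()
        session.addInput(input)
        session.commitConfiguration()
        session.startRunning()
    }
}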
Is there a way to tag views so debugging an auto layout crash becomes easier?
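The closest thing I've found is naming constraints and views so they show up in the "Unable to simultaneously satisfy constraints" dump; a minimal sketch (button is a placeholder view):
let width = button.widthAnchor.constraint(equalToConstant: 100)
width.identifier = "button.width"        // appears on the constraint in the log
width.isActive = true
button.accessibilityIdentifier = "primaryButton" // appears next to the view's address in the log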
The tags “Auto Layout” and “Automator” seem to have the wrong icons in the list view on this page:
https://developer.apple.com/support/forums-tags/
Hi, I've got a Swift framework with a bunch of Metal files. Currently, users of the Swift package have to manually include a separately provided Metal library in their app bundle.
First question: is there a way to make a Metal library target in a Swift package and just include the .metal files (without a binary asset)?
Second question: if not, Swift 5.3 has resource support, so how would you recommend bundling a Metal library in a Swift package?
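Here's the rough Package.swift layout I have in mind for the second option, assuming Swift 5.3 resource support and a prebuilt default.metallib checked into the package (names and paths are placeholders):
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyMetalPackage",
    products: [
        .library(name: "MyMetalPackage", targets: ["MyMetalPackage"])
    ],
    targets: [
        .target(
            name: "MyMetalPackage",
            // Bundle the precompiled metallib so clients don't have to add it themselves.
            resources: [
                .copy("Resources/default.metallib")
            ]
        )
    ]
)
At runtime I'd then expect to load it with Bundle.module.url(forResource: "default", withExtension: "metallib") and device.makeLibrary(URL:), but I haven't confirmed this end to end.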
I've updated my Swift Package, PixelKit, to Swift 5.3.
I tagged my commit with 1.1.1 and pushed to origin.
However, when I try to add it to another package, dependency resolution fails:
because PixelKit >=1.0.13 contains incompatible tools version and root depends on PixelKit 1.1.1..<2.0.0, version solving failed.
Here's how I include it:
.package(url: "https://github.com/hexagons/PixelKit.git", from: "1.1.1")
I'm not sure what the old tag 1.0.13 has to do with this.
I'm using the same swift tools version in both packages:
// swift-tools-version:5.3
I'm making a Mac Catalyst app and have some 16-bit Metal textures I would like to convert to an array of the new Float16.
However, it does not seem to be available:
'Float16' is unavailable in Mac Catalyst
Is Float16 (https://developer.apple.com/documentation/swift/float16) Apple Silicon only?
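The fallback I'm sketching for platforms where Float16 is unavailable is a vImage half-to-float conversion; halfData and count are assumptions about how the texture bytes arrive, and this is untested:
import Accelerate

func floats(fromHalfData halfData: Data, count: Int) -> [Float] {
    // Copy the packed 16-bit values into a buffer we can hand to vImage.
    var halves = [UInt16](repeating: 0, count: count)
    _ = halves.withUnsafeMutableBytes { halfData.copyBytes(to: $0) }
    var result = [Float](repeating: 0, count: count)
    halves.withUnsafeMutableBytes { srcBytes in
        result.withUnsafeMutableBytes { dstBytes in
            var src = vImage_Buffer(data: srcBytes.baseAddress, height: 1,
                                    width: vImagePixelCount(count),
                                    rowBytes: count * MemoryLayout<UInt16>.stride)
            var dst = vImage_Buffer(data: dstBytes.baseAddress, height: 1,
                                    width: vImagePixelCount(count),
                                    rowBytes: count * MemoryLayout<Float>.stride)
            // Convert IEEE half-precision values to single-precision floats.
            _ = vImageConvert_Planar16FtoPlanarF(&src, &dst, vImage_Flags(kvImageNoFlags))
        }
    }
    return result
}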
I've got a Multiplatform app with a SwiftUI view and this function:
.onDrop(of: [.image], delegate: myController)
In an extension of my controller I conform to the DropDelegate and implement performDrop(info:)
I've also typealiased the image types:
#if os(macOS)
public typealias MPImage = NSImage
#else
public typealias MPImage = UIImage
#endif
Here's my implementation:
guard info.hasItemsConforming(to: [.image]) else { return false }
let items: [NSItemProvider] = info.itemProviders(for: [.image])
guard let item: NSItemProvider = items.first else { return false }
guard item.canLoadObject(ofClass: MPImage.self) else { return false }
item.loadObject(ofClass: MPImage.self) { (reading, error) in
		guard error == nil else { return }
		guard let image: MPImage = reading as? MPImage else { return }
		self.didLoad(image: image)
}
This compiles fine on iOS, though on macOS I get the following errors:
Instance method 'canLoadObject(ofClass:)' requires that 'MPImage' (aka 'NSImage') conform to 'ObjectiveCBridgeable'
Instance method 'loadObject(ofClass:completionHandler:)' requires that 'MPImage' (aka 'NSImage') conform to 'ObjectiveCBridgeable'
Is this not the way to do drag and drop on macOS?
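A hedged workaround I'm considering for the macOS side, in the same spot in performDrop(info:): skip the object-based loader and pull the raw image data, then build the platform image from it (requires import UniformTypeIdentifiers for UTType; untested):
item.loadDataRepresentation(forTypeIdentifier: UTType.image.identifier) { data, error in
    guard let data = data, error == nil else { return }
    // NSImage(data:) and UIImage(data:) both exist, so MPImage works on either platform.
    guard let image = MPImage(data: data) else { return }
    DispatchQueue.main.async {
        self.didLoad(image: image)
    }
}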
I've got a SwiftUI app with the onDrop method on my View like this:
.onDrop(of: [.image], delegate: myController)
In an extension of my controller I conform to the DropDelegate and implement performDrop(info:)
Here's my implementation:
guard info.hasItemsConforming(to: [.image]) else { return false }
let items: [NSItemProvider] = info.itemProviders(for: [.image])
guard let item: NSItemProvider = items.first else { return false }
guard item.canLoadObject(ofClass: MPImage.self) else { return false }
item.loadObject(ofClass: MPImage.self) { (reading, error) in
guard error == nil else { return }
guard let image: MPImage = reading as? MPImage else { return }
self.didLoad(image: image)
}
The code compiles fine on iOS (though not on macOS; see the linked thread):
https://developer.apple.com/forums/thread/653934
When I run the app on iOS and drag an image from Photos onto my app, I get the following error:
MyApp perform drop failed: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.jpeg" UserInfo={NSLocalizedDescription=Cannot load representation of type public.jpeg, NSUnderlyingError=0x283fa15f0 {Error Domain=NSCocoaErrorDomain Code=260 "The file “DAE533E7-2918-465E-9F3C-502B8DEC78BA.jpeg” couldn’t be opened because there is no such file." UserInfo={NSURL=file:///var/tmp/com.apple.DragUI.druid/.com.apple.DragUI.BDfAeP/DAE533E7-2918-465E-9F3C-502B8DEC78BA.jpeg, NSFilePath=/var/tmp/com.apple.DragUI.druid/.com.apple.DragUI.BDfAeP/DAE533E7-2918-465E-9F3C-502B8DEC78BA.jpeg, NSUnderlyingError=0x283f4b8a0 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}}}}
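One hedged thing I want to try, assuming the temp file is removed before the async load resolves: take the file representation and copy it out immediately, since the URL handed to the completion handler is only valid for the duration of the closure (requires import UniformTypeIdentifiers; untested):
item.loadFileRepresentation(forTypeIdentifier: UTType.jpeg.identifier) { url, error in
    guard let url = url, error == nil else { return }
    // Copy out of the drag session's temporary directory before it gets cleaned up.
    let copyURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension(url.pathExtension)
    do {
        try FileManager.default.copyItem(at: url, to: copyURL)
    } catch {
        return
    }
    guard let image = MPImage(contentsOfFile: copyURL.path) else { return }
    DispatchQueue.main.async {
        self.didLoad(image: image)
    }
}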
I've started editing an image in Photos on macOS with my extension, and when I try to save, I get an alert saying something went wrong while saving.
func finishContentEditing(completionHandler:)
In the finish-content-editing method I save my photo asynchronously like this:
let output = PHContentEditingOutput(contentEditingInput: input)
let unitCrop: UnitCrop = UnitCrop(frame: self.cropper.frame)
do {
    let data: Data = try JSONEncoder().encode(unitCrop)
    output.adjustmentData = PHAdjustmentData(formatIdentifier: UnitCrop.formatIdentifier,
                                             formatVersion: UnitCrop.formatVersion,
                                             data: data)
} catch {
    completionHandler(nil)
    return
}
self.cropper.save { result in
    switch result {
    case .success(let image):
        guard let imageData: Data = image.pngData() else {
            completionHandler(nil)
            return
        }
        do {
            try imageData.write(to: output.renderedContentURL, options: .atomic)
            print("success!")
            completionHandler(output)
        } catch {
            completionHandler(nil)
            return
        }
    case .failure:
        completionHandler(nil)
    }
}
I see the success print, though I still get a popup alert in Photos saying the save failed, and no edits get saved.
Does anyone know how to debug this?
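The next thing I plan to try, based on my reading of the PHContentEditingOutput docs that edited photos should be written as JPEG to renderedContentURL, is swapping the PNG write for JPEG in the .success branch (sketch; the compression quality is a placeholder):
guard let imageData: Data = image.jpegData(compressionQuality: 0.9) else {
    completionHandler(nil)
    return
}
do {
    try imageData.write(to: output.renderedContentURL, options: .atomic)
    completionHandler(output)
} catch {
    completionHandler(nil)
}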
I've got a Multiplatform SwiftUI app and I'm trying to create a hierarchical structure with OutlineGroup. My code works fine on iOS, though on macOS I get a crash.
List {
    Section(header: Text("A")) {
        OutlineGroup(layers.aLayers, children: \.children) { layer in
            LayerView(layer: layer, selectedID: $layers.id)
        }
    }
    Section(header: Text("B")) {
        OutlineGroup(layers.bLayers, children: \.children) { layer in
            LayerView(layer: layer, selectedID: $layers.id)
        }
    }
}
.listStyle(SidebarListStyle())
Does anyone know what this means?
Thread 1: Fatal error: Could not find child #1 for Optional(SwiftUI.OutlineItem(id: SwiftUI.ViewListID.Canonical(_index: 0, implicitID: 0, explicitID: nil), isGroupItem: true))
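A hedged workaround I'm considering, at the cost of losing the two section headers: drive the hierarchy with the List(_:children:) initializer instead of OutlineGroup inside Section (untested on macOS):
List(layers.aLayers + layers.bLayers, children: \.children) { layer in
    LayerView(layer: layer, selectedID: $layers.id)
}
.listStyle(SidebarListStyle())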
Hey,
We found an issue, or rather it seems to be a known one; see the Stack Overflow link below.
Our Core ML model's auto generated Swift code does not compile.
On macOS 11.0 Beta (20A5395g)
with Xcode Version 12.0 (12A7209)
https://stackoverflow.com/questions/63917164/ml-build-error-for-catalyst-xcode-12-gm
Hi, I'm using this package in an app. The package builds fine on its own, though I get build errors in the package when I build the app.
Here's the error I get:
Main actor-isolated class 'My Class' has different actor isolation from nonisolated superclass 'My Super Class'
I'm not using actor or @MainActor, so I'm not sure where the error is coming from.
Here's a line where the error shows up in the package.
Related Stack Overflow post.
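Here's a minimal reconstruction of what I think is happening; the type names are stand-ins, and in my package the @MainActor isolation must be getting inferred (for example from a conformance to a protocol annotated @MainActor) rather than written out. Giving the superclass the same isolation makes the reconstruction compile:
@MainActor
class MySuperClass {}          // give the superclass the same isolation as the subclass

@MainActor
class MyClass: MySuperClass {} // both main-actor-isolated, so the isolations match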
I've got a multi-platform document-based app with package files (the "file" is a folder, but it looks like a single file to the user).
I can create files on all platforms, though I can only open them on Mac and iPhone. When I try to open files stored in iCloud on iPad, the file does not open and nothing is logged. The files do open when they are stored locally on the iPad.
I followed the documentation here:
https://developer.apple.com/library/archive/documentation/CoreFoundation/Conceptual/CFBundles/DocumentPackages/DocumentPackages.html
I've specified LSTypeIsPackage to true and added com.apple.package to UTTypeConformsTo.
I've tried incrementing the build number.
I'm targeting iOS 14 / macOS 11, and build with Xcode 13 beta 3.
Here's my info.plist:
<plist version="1.0">
<dict>
    <key>CFBundleDocumentTypes</key>
    <array>
        <dict>
            <key>CFBundleTypeName</key>
            <string>File Name</string>
            <key>LSHandlerRank</key>
            <string>Owner</string>
            <key>LSItemContentTypes</key>
            <array>
                <string>com.mysite.file</string>
            </array>
            <key>LSTypeIsPackage</key>
            <true/>
        </dict>
    </array>
    <key>LSRequiresIPhoneOS</key>
    <true/>
    <key>LSSupportsOpeningDocumentsInPlace</key>
    <true/>
    <key>UISupportsDocumentBrowser</key>
    <true/>
    <key>UIFileSharingEnabled</key>
    <true/>
    <key>UTExportedTypeDeclarations</key>
    <array>
        <dict>
            <key>UTTypeConformsTo</key>
            <array>
                <string>com.apple.package</string>
                <string>public.composite-content</string>
                <string>public.data</string>
            </array>
            <key>UTTypeDescription</key>
            <string>File Name</string>
            <key>UTTypeIdentifier</key>
            <string>com.mysite.file</string>
            <key>UTTypeTagSpecification</key>
            <dict>
                <key>public.filename-extension</key>
                <array>
                    <string>myext</string>
                </array>
            </dict>
        </dict>
    </array>
</dict>
</plist>
Does anyone know how to debug this?
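For reference, here's the Swift-side declaration I pair with the exported type above (sketch; the static name is just what I use in my DocumentGroup setup):
import UniformTypeIdentifiers

extension UTType {
    // Must match the UTTypeIdentifier declared in Info.plist.
    static let myFile = UTType(exportedAs: "com.mysite.file")
}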
I'm trying to use the new layerEffect(_:maxSampleOffset:isEnabled:).
https://developer.apple.com/documentation/swiftui/view/layereffect(_:maxsampleoffset:isenabled:)
However, I'm not sure how to define the Metal shader function signature. The docs indicate that we should use SwiftUI::Layer, though I'm not sure what to import to get access to this layer structure.
[[ stitchable ]] half4 name(float2 position, SwiftUI::Layer layer, args...)
My goal is to create a custom blur effect.
Does anyone have any pointers on how to get started with layer effects?
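Here's roughly where I've gotten, assuming the Layer type comes from the SwiftUI_Metal header and that layer.sample(position) reads the source layer (untested sketch of a crude box blur):
#include <metal_stdlib>
#include <SwiftUI/SwiftUI_Metal.h>
using namespace metal;

// Average a small grid of samples around the current position.
[[ stitchable ]] half4 myBlur(float2 position, SwiftUI::Layer layer, float radius) {
    half4 sum = half4(0.0h);
    int taps = 0;
    for (int x = -2; x <= 2; x++) {
        for (int y = -2; y <= 2; y++) {
            sum += layer.sample(position + float2(x, y) * radius);
            taps += 1;
        }
    }
    return sum / half(taps);
}
On the SwiftUI side I assume it would be applied with something like .layerEffect(ShaderLibrary.myBlur(.float(4)), maxSampleOffset: CGSize(width: 8, height: 8)), but I haven't confirmed the argument passing.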