Like ddhuntsman, I am also seeing this when trying to initialize an NSAttributedString from HTML. It just hangs.
let html = "Some HTML ..."
if let data = html.data(using: .utf8) {
    let attributedString = try? NSAttributedString(data: data, options: [.documentType: NSAttributedString.DocumentType.html], documentAttributes: nil)
}
Does anyone have a solution for this?
I was able to get the menu by using NSMenuToolbarItem for my menu item and attaching a menu to it.
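Something along these lines is what I mean; this is just a sketch, and the identifier, label, and menu item titles are placeholders, not anything required:

import AppKit

// Sketch of a toolbar delegate that vends an NSMenuToolbarItem with a menu attached.
// The identifier, label, and menu item titles below are placeholders.
final class ToolbarController: NSObject, NSToolbarDelegate {
    static let actionsItem = NSToolbarItem.Identifier("com.example.actions")

    func toolbarAllowedItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        [Self.actionsItem]
    }

    func toolbarDefaultItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        [Self.actionsItem]
    }

    func toolbar(_ toolbar: NSToolbar,
                 itemForItemIdentifier itemIdentifier: NSToolbarItem.Identifier,
                 willBeInsertedIntoToolbar flag: Bool) -> NSToolbarItem? {
        guard itemIdentifier == Self.actionsItem else { return nil }

        // NSMenuToolbarItem draws a pull-down indicator and shows its menu when clicked.
        let item = NSMenuToolbarItem(itemIdentifier: itemIdentifier)
        item.label = "Actions"
        item.showsIndicator = true

        let menu = NSMenu()
        menu.addItem(NSMenuItem(title: "First Action", action: nil, keyEquivalent: ""))
        menu.addItem(NSMenuItem(title: "Second Action", action: nil, keyEquivalent: ""))
        item.menu = menu
        return item
    }
}

The menu items have nil actions just to keep the sketch short; you would wire each one up to its own target and action.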
On iOS 14 you can tell that the menu was not triggered by a right click: when menuAppearance is UIContextMenuInteractionAppearanceRich, return nil. When you do this, your long press gesture recognizer will still work.
- (nullable UIContextMenuConfiguration *)contextMenuInteraction:(UIContextMenuInteraction *)interaction configurationForMenuAtLocation:(CGPoint)location API_AVAILABLE(ios(14.0)) {
    if (interaction.menuAppearance == UIContextMenuInteractionAppearanceRich) {
        return nil;
    }

    // Do what you want on right click
    ...
}
Any luck getting this fixed? I am seeing the same thing, only I am not using any of the betas.
Did you ever figure this out? We were able to get our app to show up for the action button and then it just disappeared from the action button list a few weeks ago. We can't find anything that has changed and we are also able to get https://github.com/KhaosT/WatchActionButtonExample to work just fine.
Did you ever figure this out? I am seeing the same thing and I can't figure out what I am doing wrong.
I have found a little more information on this, but I have not yet been able to pinpoint it. I thought I would share where I am at in case it helps someone else find this issue. I found that if I build the app with Xcode 14.0 and remove CryptoSwift as a package dependency most of the time I can get it to work. I say most of the time because sometimes when I do a clean build on a new simulator it still does not work. But about 80% of the time it is working for me when I do this. I then tried Xcode 14.1 with CryptoSwift removed from my project and I got it to work once, but I was never able to get it to work again when I would do a fresh build.
With each of these tests I was using a watchOS 9.0 simulator. I then tried building with Xcode 14.0 without CryptoSwift as a dependency and I was able to select my app for the action button on my Apple Watch Ultra running watchOS 9.3.
This does seem to point to a bug in Xcode 14.2 with the build settings and dependencies. I am going to keep experimenting to see if I can pinpoint exactly what the combination is that makes it work.
I am able to get https://github.com/KhaosT/WatchActionButtonExample to work just fine with Xcode 14.2.
If anyone else is seeing this and has information on this issue, I would greatly appreciate any insights you might have.
I found exactly what I have to do with my app to get the action button to work.
I have to use Xcode 14.0.1.
I had to remove CryptoSwift as a package dependency and add it as an xcframework instead. What is odd is that I am linking against the exact same version of CryptoSwift; I just had to change how I brought it into my project.
I found that the reason I did not have consistent results was that I needed to delete DerivedData every time I switched versions of Xcode.
I could not get the action button to work with my app using Xcode 14.1 or Xcode 14.2. Most likely there is something else, like removing the CryptoSwift package dependency, that I need to do to get it to work. I tried removing the other Swift package dependencies that I have and that did not make any difference.
I figured out a way to do this, though it does seem like more of a hack than a real solution. I encoded my values in the red channel of a color, then created an image with those colors, and then assigned texture coordinates for each vertex. That allowed me to get the correct interpolated value out in my fragment shader.
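For anyone curious, here is roughly what that looks like. This is only a sketch, assuming a RealityKit mesh and values already normalized to 0...1; the helper names like makeValueTexture are mine, not a RealityKit API.

import RealityKit
import CoreGraphics

// Sketch: pack one value per vertex (assumed 0...1) into the red channel of a
// width-by-1 RGBA image, and give each vertex a texture coordinate that lands
// on "its" pixel.
func makeValueTexture(for values: [Float]) throws -> TextureResource {
    let width = values.count
    var pixels = [UInt8](repeating: 0, count: width * 4)
    for (i, value) in values.enumerated() {
        pixels[i * 4] = UInt8(max(0, min(1, value)) * 255) // red channel carries the value
        pixels[i * 4 + 3] = 255                            // opaque alpha
    }

    let image = pixels.withUnsafeMutableBytes { buffer -> CGImage in
        let context = CGContext(data: buffer.baseAddress,
                                width: width,
                                height: 1,
                                bitsPerComponent: 8,
                                bytesPerRow: width * 4,
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
        return context.makeImage()!
    }

    // .raw keeps the data from being treated as color and having gamma applied.
    return try TextureResource.generate(from: image, options: .init(semantic: .raw))
}

// One texture coordinate per vertex, centered on that vertex's pixel.
func valueTextureCoordinates(vertexCount: Int) -> [SIMD2<Float>] {
    (0..<vertexCount).map { SIMD2<Float>((Float($0) + 0.5) / Float(vertexCount), 0.5) }
}

I then set those coordinates on the textureCoordinates buffer of my MeshDescriptor and sampled the texture's red channel in the fragment shader.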
I see that the documentation has been updated to show that PointLight, DirectionalLight, and SpotLight are not available on visionOS. Does anyone know how to add a light source to a RealityKit scene on visionOS? I am working on a fully immersive experience and I need to add light from the sun to my scene, but I can't figure out how to do it.
I create a sphere that I use for the sky and display an image on the inside of it by changing the scale property on the Entity.
Here is how I create a very large sphere and have the material rendered on the inside of it.
let entity = Entity()
entity.components.set(ModelComponent(
    mesh: .generateSphere(radius: 100000),
    materials: [material]
))
// Ensure the texture image points inward at the viewer.
entity.scale *= .init(x: -1, y: 1, z: 1)
I have the same question, but so far I have not found any way to do this with public APIs. https://developer.apple.com/forums/thread/732953
Here is what I did to send an image to ShaderGraphMaterial.
let img = getSomeUIImage()
var mat = getShaderGraphMaterial()
if let cgImg = img.cgImage {
    do {
        let texture = try TextureResource.generate(from: cgImg, options: .init(semantic: nil))
        try mat.setParameter(name: "cover", value: .textureResource(texture))
    } catch {
        // Handle texture generation or parameter errors here.
    }
}
I recommend looking at https://microconf.com/ for advice on how to hire, when to hire, what product to work on first, etc. The associated podcast (https://www.startupsfortherestofus.com) has a lot of episodes about hiring an engineer when you are a non-technical founder.
Depending on the type of game you are building, you will likely want to use SceneKit, SpriteKit, or RealityKit. SwiftUI is the future, so my recommendation is to start there, though there are things it does not support, and in those cases you have to bridge over to UIKit.