Posts

Post not yet marked as solved
0 Replies
712 Views
Hi! I'm wondering if there's a way to add subject lifting to images in my SwiftUI app without relying on UIViewControllerRepresentable and Coordinators to adopt the ImageAnalysisInteraction API. Thank you!
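There is no pure-SwiftUI wrapper for ImageAnalysisInteraction, but one possible workaround is sketched below, assuming iOS 17's Vision framework is available: lift the subject programmatically with VNGenerateForegroundInstanceMaskRequest and display the cutout in an ordinary SwiftUI Image.

```swift
import Vision
import VideoToolbox

// A hedged sketch (assumes iOS 17+): Vision's foreground-instance mask
// request lifts the subject without any UIKit interaction object.
// Error handling is kept minimal for brevity.
func liftSubject(from cgImage: CGImage) throws -> CGImage? {
    let request = VNGenerateForegroundInstanceMaskRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage)
    try handler.perform([request])
    guard let observation = request.results?.first else { return nil }

    // Render a pixel buffer containing only the lifted subject(s),
    // cropped to their bounding box.
    let buffer = try observation.generateMaskedImage(
        ofInstances: observation.allInstances,
        from: handler,
        croppedToInstancesExtent: true)

    var lifted: CGImage?
    VTCreateCGImageFromCVPixelBuffer(buffer, options: nil, imageOut: &lifted)
    return lifted
}
```

Note this produces the cutout itself rather than the built-in long-press lift interaction; wiring it to a gesture is left to the SwiftUI layer.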
Posted by SAIK1065.
Post not yet marked as solved
0 Replies
713 Views
Hello! I want to build an app that lets devices with the LiDAR Scanner scan their environment and share their scans with one another. As of now, I can create the mesh using the LiDAR Scanner and export it as an OBJ file. However, I would like the ability to map textures and colors onto this model. How would one go about capturing the real-world texture and applying it to the OBJ model? Thank you!
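One common approach, sketched below (not an official pipeline): sample per-vertex colors by projecting each mesh vertex into the camera image with ARCamera's projectPoint. A true texture atlas would additionally require UV unwrapping, which ARKit does not provide.

```swift
import ARKit

// A hedged sketch: convert a mesh vertex (already transformed into world
// space via its ARMeshAnchor's transform) into a normalized coordinate in
// the current camera image, so its color can be sampled or stored as a UV.
func normalizedImageCoordinate(for worldPosition: simd_float3,
                               in frame: ARFrame,
                               imageSize: CGSize) -> CGPoint {
    // projectPoint maps a 3D world-space point to 2D viewport coordinates.
    let projected = frame.camera.projectPoint(worldPosition,
                                              orientation: .portrait,
                                              viewportSize: imageSize)
    // Normalize to [0, 1] so the value can be written out alongside the
    // vertex when exporting the OBJ together with its camera image.
    return CGPoint(x: projected.x / imageSize.width,
                   y: projected.y / imageSize.height)
}
```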
Posted by SAIK1065.
Post not yet marked as solved
1 Reply
758 Views
Hello! I really want to make immersive AR experiences with engaging graphics like the new AR Spaces feature in the Clips app. Is RealityKit the way to go for this, or does it require a lower-level understanding and use of APIs like ARKit with Metal? Also, with RealityKit's scene understanding, we can get a mesh of the scene a LiDAR device can see, but I was wondering how I would swap that out for a custom mesh design like in the Clips app. Thank you!
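As a starting point, RealityKit can reconstruct the scene mesh on LiDAR devices; restyling it the way Clips does generally means reading the raw ARMeshAnchor geometry from the session delegate and rendering it with your own materials (RealityKit for convenience, Metal for full control). A minimal configuration sketch:

```swift
import ARKit
import RealityKit

// A minimal sketch: enable LiDAR scene reconstruction and scene
// understanding in a RealityKit ARView. Custom visuals for the mesh would
// come from observing ARMeshAnchor updates via ARSessionDelegate.
func configureSceneReconstruction(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    // Scene reconstruction requires a LiDAR-equipped device.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    arView.environment.sceneUnderstanding.options.formUnion([.occlusion, .physics])
    arView.session.run(configuration)
}
```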
Posted by SAIK1065.
Post marked as solved
1 Reply
1.1k Views
Hello! I've been trying to replicate the tabular linear regressor application shown at the end of the WWDC 2021 session Build dynamic iOS apps with the Create ML framework. However, I keep getting the errors Cannot find 'DataFrame' in scope and Cannot find 'Column' in scope. I was wondering whether this is a bug in the current beta or whether my code is wrong. I am trying to run my application on an iPhone 12 Pro running iOS 15 beta 1, using Xcode 13 on Monterey. Thank you!
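For reference, DataFrame and Column live in the TabularData framework, not in CreateML, so the usual fix is an extra import. A minimal sketch with hypothetical column names:

```swift
import CreateML      // MLLinearRegressor (on-device training, iOS 15+)
import TabularData   // DataFrame and Column are defined here

// Hypothetical columns, just to show the shape of the API.
var data = DataFrame()
data.append(column: Column(name: "size", contents: [1.0, 2.0, 3.0]).eraseToAnyColumn())
data.append(column: Column(name: "price", contents: [10.0, 20.0, 30.0]).eraseToAnyColumn())

let regressor = try MLLinearRegressor(trainingData: data, targetColumn: "price")
```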
Posted by SAIK1065.
Post not yet marked as solved
1 Reply
1k Views
I have a Reality Composer project that has two scenes. In the first scene, I post a notification from my code to trigger an animation and then change to the second scene, using a Change Scene action in the Action Sequence. In the second scene, when the user taps an object, it should send a notification back to my code using the Notify action. How do I set the .onAction handler in Xcode for the second scene if I switch to it via an Action Sequence in the first scene? Thanks!
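One commonly suggested workaround, sketched below with heavy caveats: Reality Composer's Notify actions post through NotificationCenter, so you can observe the notification directly instead of holding a handle to whichever scene is currently active. The notification name and userInfo key below are community-documented rather than formally published, and the action name is hypothetical.

```swift
import Foundation
import RealityKit

// A hedged sketch: listen for any Reality Composer Notify action, then
// branch on its identifier. This fires regardless of how the scene was
// reached (e.g., via a Change Scene action in an Action Sequence).
NotificationCenter.default.addObserver(
    forName: Notification.Name("RealityKit.NotifyAction"),
    object: nil,
    queue: .main
) { note in
    let identifier = note.userInfo?["RealityKit.NotifyAction.Identifier"] as? String
    if identifier == "objectTapped" {   // hypothetical Notify action name
        // Respond to the tap from the second scene here.
    }
}
```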
Posted by SAIK1065.
Post not yet marked as solved
1 Reply
901 Views
Hi! I am attempting to create a style transfer model for images. My content image folder has about 80,000 images. However, when I click Train, the Create ML app displays the “Processing” message for about a minute before I get the error: Could not create buffer with format ‘BGRA’ (-6662). I hope someone can help me figure out what this issue is! Is my training data too large, or are there pictures in there that don’t conform to the right image format? Thank you!
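One way to narrow this down, sketched below: run a preflight pass over the content folder and flag any files ImageIO cannot decode, since a single unreadable or unusually formatted image can surface as a buffer error. The folder path is hypothetical.

```swift
import Foundation
import ImageIO

// A hedged preflight sketch: try to decode every file in the content folder
// and report the ones ImageIO rejects, so they can be removed before training.
let folder = URL(fileURLWithPath: "/path/to/content-images")   // hypothetical
let files = try FileManager.default.contentsOfDirectory(
    at: folder, includingPropertiesForKeys: nil)

for url in files {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          CGImageSourceCreateImageAtIndex(source, 0, nil) != nil else {
        print("Unreadable image:", url.lastPathComponent)
        continue
    }
}
```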
Posted by SAIK1065.
Post not yet marked as solved
1 Reply
1.1k Views
I have a custom Core ML model built with Keras that has a couple of updatable layers. I was wondering: if I switched this model to be deployed from the cloud rather than packaged with the app, would the on-device personalization carry over whenever I deploy a new model from the cloud, or would a user have to start with a fresh model every time? Similarly, does on-device personalization work with model encryption? Thanks!
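For background, a model fetched via Core ML Model Deployment arrives as a fresh copy; on-device updates are not merged into it automatically. A sketch of one way to preserve personalization, assuming the usual MLUpdateTask flow (the URLs are hypothetical):

```swift
import CoreML

// A hedged sketch: run an on-device update, then persist the personalized
// model yourself so a later cloud deployment doesn't silently replace it.
func personalize(modelURL: URL,             // compiled .mlmodelc on disk
                 trainingData: MLBatchProvider,
                 saveTo savedURL: URL) throws {
    let task = try MLUpdateTask(
        forModelAt: modelURL,
        trainingData: trainingData,
        configuration: nil,
        completionHandler: { context in
            // Write the updated model to app storage; load this copy in
            // preference to newly deployed models, or re-run the update.
            try? context.model.write(to: savedURL)
        })
    task.resume()
}
```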
Posted by SAIK1065.
Post marked as Apple Recommended
753 Views
When I tap on a widget, it goes directly to my app. However, I am wondering how to deep-link to various screens in my app from a widget. Is there some sort of NSUserActivity I can read?
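For reference, widgets deep-link through URLs rather than NSUserActivity: widgetURL(_:) on the widget side (or Link for tappable sub-regions), and onOpenURL(_:) in the app. A minimal sketch with a hypothetical URL scheme:

```swift
import SwiftUI
import WidgetKit

// Widget side: the whole widget opens this URL when tapped.
struct MyWidgetEntryView: View {
    var body: some View {
        Text("Open Detail")
            .widgetURL(URL(string: "myapp://detail/42"))   // hypothetical scheme
    }
}

// App side: route incoming deep links to the right screen.
struct ContentView: View {
    var body: some View {
        Text("Home")
            .onOpenURL { url in
                // Parse url and navigate accordingly.
                print("Deep link:", url)
            }
    }
}
```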
Posted by SAIK1065.