Hi,
I have a very weird situation. My iMessage app icon is showing up fine on all iPhones, but not on iPad. All icons were generated automatically by Sketch in the different sizes. I even replaced all icons with versions I created using Photoshop, and I verified all sizes. The icon is in the final binary, but it is not showing in the list of iMessage apps; instead I see the generic icon with our app name.
https://itunes.apple.com/us/app/wit-what-is-this/id1152446245
Any idea where this could be coming from? Any resources I could check?
All the best
Christoph
Hi guys,
I am creating MTLTextures from CIImages. Everything is sRGB, but when it shows up in SceneKit it looks washed out and too pale, as if the sRGB data were being used as linear RGB. There are no lights in the scene; it is purely a scene with SCNPlanes that have the textures as their diffuse material, plus a camera. That is it.
Any ideas what I can do to get a normal representation?
(Below: a picture of the CIImage in the small window and in the SCNView.)
All the best
Christoph
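One thing that may be worth trying here, as a minimal sketch: render the CIImage into a texture whose pixel format is explicitly sRGB, so the GPU converts on read instead of sampling gamma-encoded bytes as linear. This assumes the pale look comes from exactly that mismatch; `device`, `context`, and `image` are placeholders for your own objects.

import CoreImage
import Metal

func makeSRGBTexture(from image: CIImage,
                     device: MTLDevice,
                     context: CIContext) -> MTLTexture? {
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba8Unorm_srgb,      // mark the data as sRGB-encoded
        width: Int(image.extent.width),
        height: Int(image.extent.height),
        mipmapped: false)
    descriptor.usage = [.shaderRead, .shaderWrite, .renderTarget]
    guard let texture = device.makeTexture(descriptor: descriptor) else { return nil }

    // Render with an explicit sRGB color space so Core Image does not
    // double-convert the pixel values on the way in.
    context.render(image,
                   to: texture,
                   commandBuffer: nil,
                   bounds: image.extent,
                   colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!)
    return texture
}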
Hi,
I tried to bring an app to macOS using Catalyst, but am having trouble with CloudKit. Basically, what the app does is write records to the public database and allow retrieving that data via the recordID.
Simple and straightforward, and of course the code has been working on iOS for years now. On macOS I can upload data, but when I try to retrieve data I get nothing, as if the request was never started. I do not get data back, nor any error, not even a timeout.
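For reference, a minimal sketch of the fetch path described above, useful for checking whether the completion handler fires at all on macOS; the record name is a placeholder.

import CloudKit

let publicDB = CKContainer.default().publicCloudDatabase
let recordID = CKRecord.ID(recordName: "example-record")
publicDB.fetch(withRecordID: recordID) { record, error in
    if let error = error {
        print("Fetch failed: \(error)")   // an error at least proves the request ran
    } else if let record = record {
        print("Fetched: \(record)")
    } else {
        print("No record and no error")
    }
}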
Could anybody give me a hint where to look for a solution?
All the best
Christoph
Hi,
I am already confused about getting the regions of interest on the Mac (I am getting them through the project info's sections, which makes no sense to me: why would a region of interest depend on the project? It also seems bug-prone, as the sections are only slowly updated after assets are added).
How do I get the region of interest of a PHAsset on iOS? And is that approach portable to macOS?
All the best
Christoph
Hi,
I have been rejected because my app links against MapKit but does not use it.
I verified, and otool -L finds this:
@rpath/libswiftMapKit.dylib (compatibility version 1.0.0, current version 2.0.0, weak)
But I have no idea where this comes from:
• it is not in my list of linked libraries
• MapKit does not appear anywhere in my code (no "import MapKit" or anything)
• I don't use anything that starts with MK
• I do not include or link against non-Apple libraries
What I do have is:
• Swift 5.3 code, except for
• a Zip library, and for it the linker flags -lz and -liconv
What could trigger the linking against MapKit?
How can I isolate this problem?
I am truly stuck and could use some pointers.
All the best
Christoph
Hi,
I have multiple ideas for GeoTracking AR apps using ARGeoTrackingConfiguration, but noticed that the cities on the list are all in the US. My personal hope was that all cities with 3D data would support this sooner or later, but that has not happened yet.
Is there any way of testing/faking this to know what the user will experience?
How fast is it expanding? Would it be worth waiting until it's available in my area (Düsseldorf, Germany), too?
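On the testing question, a minimal sketch of one check that may help: ARKit can report programmatically whether geotracking is supported at a specific coordinate, which at least lets you probe coverage without traveling there. The Düsseldorf coordinates below are just an example.

import ARKit
import CoreLocation

let duesseldorf = CLLocationCoordinate2D(latitude: 51.2277, longitude: 6.7735)
ARGeoTrackingConfiguration.checkAvailability(at: duesseldorf) { available, error in
    if let error = error {
        print("Availability check failed: \(error)")
    } else {
        print("Geotracking available at coordinate: \(available)")
    }
}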
All the best
Christoph
Hi,
I have embedded a view into a scroll view in case the content is too large, but mostly it is not. Is there a way to have the ScrollView shrink to fit its content?
To my surprise, it looks like this simple task is not possible in SwiftUI. Am I missing something?
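A minimal sketch of one approach, assuming iOS 16/macOS 13 or later: ViewThatFits shows the plain content when it fits and only falls back to the scrolling variant when the content is too large.

import SwiftUI

struct MaybeScrollable<Content: View>: View {
    private let content: Content

    init(@ViewBuilder content: () -> Content) {
        self.content = content()
    }

    var body: some View {
        ViewThatFits(in: .vertical) {
            content                    // used when the content fits
            ScrollView { content }     // fallback for oversized content
        }
    }
}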
All the best
Christoph
Hi,
I am using SCNConstraints to zoom in on a geometry the user clicked, but would like to reset to my original position when the user clicks outside.
How do I reset my Camera to its original position?
I read that constraints change the transform, but the position, the transform, and everything else are unchanged. Yet when I set
constraints = nil
nothing happens: the camera stays where it last was and does not move, while every property it exposes claims a different position. Even the world transform shows it is unchanged.
How do I reset a camera node to its original position after I have removed all constraints?
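A minimal sketch of one workaround, assuming the mismatch comes from the constraints acting on the presentation node while the model node keeps its old values: adopt the rendered transform first, then drop the constraints and animate back. `originalTransform` is assumed to be a stored copy of the camera's transform from before the constraints were added.

import SceneKit

func resetCamera(_ cameraNode: SCNNode, to originalTransform: SCNMatrix4) {
    // Start from where the camera is actually rendered right now.
    cameraNode.transform = cameraNode.presentation.transform
    cameraNode.constraints = nil

    // Animate back to the stored original transform.
    SCNTransaction.begin()
    SCNTransaction.animationDuration = 0.5
    cameraNode.transform = originalTransform
    SCNTransaction.commit()
}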
Hi,
I wanted to offer the UI for editing and deleting a share, so I am using
UICloudSharingController(share: , container: )
and I implemented the delegate to call "persistUpdatedShare" in "cloudSharingControllerDidSaveShare":
func cloudSharingControllerDidSaveShare(_ csc: UICloudSharingController) {
    if let share = csc.share,
       let privatePersistentStore = PersistenceController.shared.privatePersistentStore {
        // Persist the updated share metadata back into the local store.
        PersistenceController.shared.container.persistUpdatedShare(share, in: privatePersistentStore) { share, error in
            Thread.performOnMain {
                if let error = error {
                    PersistenceController.shared.storeWarning = error
                }
            }
        }
    }
}
and to call "purgeObjectsAndRecordsInZone" in "cloudSharingControllerDidStopSharing":
func cloudSharingControllerDidStopSharing(_ csc: UICloudSharingController) {
    let puzzleToShare = puzzleToShare   // capture for the closure below
    if let share = csc.share,
       let privatePersistentStore = PersistenceController.shared.privatePersistentStore {
        // Purge the shared zone, then recreate the data as private objects.
        PersistenceController.shared.container.purgeObjectsAndRecordsInZone(with: share.recordID.zoneID,
                                                                            in: privatePersistentStore) { zoneID, error in
            Thread.performOnMain {
                if let error = error {
                    PersistenceController.shared.storeWarning = error
                } else {
                    puzzleToShare.makePrivateDuplicate()
                    try? puzzleToShare.puzzle.managedObjectContext?.save()
                }
            }
        }
    }
}
I got this implementation from a TSI, but it makes no sense to me to delete and duplicate my objects rather than just keeping them and only deleting the share.
Question:
How can I cleanly discontinue a share but keep my data? Duplicating my objects seems like such a weird approach, but without this implementation it does not really work (I get strange behaviors and crashes).
All the best
Christoph
Hi,
when working with SwiftUI, @Published variables need to be set from the main thread, but at the same time cannot be defined as @MainActor.
The easy solution would be a simple forced sync or async execution on the main thread, but it seems like this does not exist in the world of async/await and Tasks; at least I did not find it.
Instead, what I found in all of Apple's demos is a quite sad solution: the full method is marked as @MainActor, even though this is often only relevant for the last line.
What I am looking for is either a way to execute just the setter on main, OR a way to mark my SwiftUI code as being on main so that it can accept an @MainActor @Published property.
Is there really no solution? If so, where do I have to propose/request one?
I feel like every developer has written their own version of Thread.performOnMain, but here I want something that uses Task and that the compiler can verify.
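For what it's worth, a minimal sketch of the closest Task-based equivalent I know of: MainActor.run hops to the main actor just for the assignment, so only the setter runs on main and the compiler can check it. The Model type here is a placeholder.

import SwiftUI

final class Model: ObservableObject {
    @Published var status = ""
}

func finishWork(_ model: Model) async {
    let result = "done"                // placeholder for the actual work
    await MainActor.run {
        model.status = result          // main-thread write, without marking the whole method @MainActor
    }
}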
All the best
Christoph
Hi,
I have a loop to wait for all non-purchase transactions:
for await result in Transaction.updates {
    …
}
This block works when I trigger a restore of purchases, but it is never called when I approve a pending purchase.
Is there any reason why this could be, or is it a bug? (As an App Store user and father, I had a similar problem where my daughter had to tap purchase again and then it worked.)
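A minimal sketch of the listener pattern, in case it helps: start observing Transaction.updates as early as possible (e.g. at app launch) and keep the task alive for the whole app lifetime, so approvals of pending purchases that arrive later are not missed.

import StoreKit

let updatesTask = Task.detached {
    for await result in Transaction.updates {
        if case .verified(let transaction) = result {
            // Deliver the content here, then finish the transaction.
            await transaction.finish()
        }
    }
}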
All the best
Christoph
Hi,
I always thought @FetchedResults should automatically update, but it seems this is not always true. Or do I have to trigger updates myself? I am using Core Data with CloudKit sync.
When I start my app, nothing is visually updating while it's doing the first sync. Only after the view is reloaded does it show the synced data.
Any idea what I have to do to automatically get updates while the data is syncing in the background?
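A minimal sketch of one setting that background CloudKit sync usually needs in order to reach SwiftUI fetch requests, assuming an NSPersistentCloudKitContainer setup; "Model" is a placeholder for the actual model name.

import CoreData

let container = NSPersistentCloudKitContainer(name: "Model")
container.loadPersistentStores { _, error in
    if let error = error { fatalError("Store failed: \(error)") }
}
// Merge changes imported by the sync into the view context, so fetch
// requests driven by that context update without a manual reload.
container.viewContext.automaticallyMergesChangesFromParent = true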
All the best
Christoph
Hi,
I get nicely creased USD models from my 3D designer; they are super small, as higher-resolution meshes can be generated from them and even the normals can be derived from these models.
Question is: Does RealityKit automatically …
• … generate the normals from the crease information?
• … subdivide for these models (do I have to/can I set a subdivision level)?
• … generate subdivision on the GPU as the model gets closer to the camera, or should I even force-create multiple detail levels?
All the best
Christoph
Hi,
I noticed that the face-painting sample creates CGImages from PencilKit and then generates textures from those again. I do something similar in my app, which currently uses SceneKit and which I would like to port to RealityKit.
In my code I draw into a CGImage (draw shape, masked image and shadows) and then convert that CGImage to a texture.
I would like to optimize this, as there is very noticeable latency in the process.
Would it help if I made small CGImages (for the local changes only), converted those to textures, and used DrawableQueues to lay them over the existing images?
Or what is the most efficient way to get my changes onto a texture in near real time?
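A minimal sketch of the "small CGImages" idea on the CoreGraphics side: render only the dirty rectangle into a correspondingly small bitmap context instead of regenerating a full texture-sized image on every stroke. `drawContent` is a placeholder for the existing shape/mask/shadow drawing.

import CoreGraphics

func renderDirtyRect(_ dirty: CGRect,
                     drawContent: (CGContext) -> Void) -> CGImage? {
    guard let ctx = CGContext(data: nil,
                              width: Int(dirty.width),
                              height: Int(dirty.height),
                              bitsPerComponent: 8,
                              bytesPerRow: 0,
                              space: CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }

    // Shift the origin so the caller can keep drawing in full-image
    // coordinates while only the dirty region is rasterized.
    ctx.translateBy(x: -dirty.origin.x, y: -dirty.origin.y)
    drawContent(ctx)
    return ctx.makeImage()
}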
All the best
Christoph
Hi,
since upgrading to Ventura I cannot debug my Photos extension any more.
But I urgently need to, as we have a new bug that makes our app crash only on Ventura.
(It's ONLY extensions that cannot be debugged; everything else works.)
Trying to debug shows: Could not attach to pid : “38699”
Details:
attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)
Console:
"macOSTaskPolicy: (com.apple.debugserver) may not get the task control port of (friedmann print ) (pid: 35508): (friedmann print ) is hardened, (friedmann print ) doesn't have get-task-allow, (com.apple.debugserver) is a declared debugger(com.apple.debugserver) is not a declared read-only debugger"
Is there ANYTHING I can do? Or do I have to wait for the next Ventura or Xcode update???
All the best
Christoph