Posts

Post marked as solved
3 Replies
857 Views
I am using NWListener and NWBrowser to establish an NWConnection between two local devices. I noticed that if I run the NWBrowser client on the iOS Simulator and create a (single) connection for a result endpoint found by the browser, the NWListener's newConnectionHandler on the destination device is called with three different connections. Is this a bug or just something unique to the simulator? Or should I be prepared to handle this in general? And if so, should I just accept the first connection and reject subsequent ones, pick the last one, or choose the best option amongst them (and if so, how)? Here is an example of the three connections received by the NWListener for the one NWConnection attempt from the simulator:
Client connected: [C1 fe80::8bb:e5ff:fe18:bab%anpi0.57383 tcp, traffic class: 700, local: fe80::8bb:e5ff:fe18:b54%anpi0.56395, definite, attribution: developer, server, path satisfied (Path is satisfied), viable, interface: anpi0, scoped]
Client connected: [C2 fe80::53:558d:8f79:9a0c%en2.57384 tcp, traffic class: 700, local: fe80::897:b297:76e0:a53%en2.56395, definite, attribution: developer, server, path satisfied (Path is satisfied), viable, interface: en2, scoped]
Client connected: [C3 fe80::3c:91d7:b7e:7c2b%en0.57385 tcp, traffic class: 700, local: fe80::1cf2:d1b6:220d:4d8f%en0.56395, definite, attribution: developer, server, path satisfied (Path is satisfied), viable, interface: en0, scoped, ipv4, ipv6, dns]
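One defensive sketch of the "accept the first, reject the rest" option, assuming a single client connects at a time (the `_example._tcp` service type is a placeholder, not from the post):

```swift
import Network

// Sketch: keep only the first incoming connection and cancel duplicates that
// arrive over other interfaces (anpi0, en2, en0 in the log above). Assumes a
// single-client scenario; a multi-client app would need to key on the remote
// endpoint instead of keeping one global connection.
final class SingleConnectionListener {
    private var activeConnection: NWConnection?
    let listener: NWListener

    init(serviceName: String) throws {
        listener = try NWListener(using: .tcp)
        listener.service = NWListener.Service(name: serviceName, type: "_example._tcp")
        listener.newConnectionHandler = { [weak self] connection in
            guard let self = self else { return }
            if self.activeConnection == nil {
                // First connection wins; start it.
                self.activeConnection = connection
                connection.start(queue: .main)
            } else {
                // Redundant dial over another interface; reject it.
                connection.cancel()
            }
        }
        listener.start(queue: .main)
    }
}
```

Note that "first to arrive" is not necessarily "best path"; an alternative is to start all three, run an application-level handshake, and keep whichever completes first.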
Posted by ppix.
Post not yet marked as solved
5 Replies
1.1k Views
For a local video streaming app I want to establish both a TCP and a UDP NWConnection. What is the best way to do this using Bonjour? Do I advertise separate Bonjour services for TCP and UDP? Or if I just advertise one (say TCP), what is the best way of creating a new UDP NWConnection to the same device once the TCP connection has been established?
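One possible sketch of the second option: advertise only the TCP service via Bonjour, then dial UDP to the resolved address of the live TCP path. The UDP port here (`negotiatedUDPPort`) is a hypothetical value assumed to be exchanged in-band over the TCP connection; it is not part of the original question.

```swift
import Network

// Sketch: derive a parallel UDP connection to the same device from an
// already-established TCP connection, so only one Bonjour service needs to
// be advertised. Assumes the remote UDP port was agreed upon over TCP.
func makeUDPConnection(from tcpConnection: NWConnection,
                       negotiatedUDPPort: NWEndpoint.Port) -> NWConnection? {
    // The resolved remote address of the live TCP path.
    guard case let .hostPort(host, _)? = tcpConnection.currentPath?.remoteEndpoint else {
        return nil
    }
    let udp = NWConnection(host: host, port: negotiatedUDPPort, using: .udp)
    udp.start(queue: .main)
    return udp
}
```

Alternatively, a second NWConnection can be created directly to the same Bonjour result endpoint with `.udp` parameters, letting the system resolve it again.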
Posted by ppix.
Post marked as solved
6 Replies
2.6k Views
This is both a heads-up to other developers and a request for workarounds: I noticed that the CoreML image segmentation model in my app crashes when compiling the app with Xcode 13 and running it on an iOS 14 device. When the same app and model are compiled with Xcode 12, the CoreML model works just fine on the same devices. The crash seems to be caused by CoreML trying to access the MLShapedArray type, which was only introduced in iOS 15 (see stack trace in screenshot). So if you have an app using CoreML, make sure you test on iOS 14 devices before submitting a new build using Xcode 13. For Apple folks, I filed a Feedback on this a little while ago (and of course heard nothing back). The ID is 9584636. I had hoped this issue would get fixed before the Xcode 13 release, but unfortunately it is still there in the GM build. Any workarounds (aside from 'keep using Xcode 12') would be appreciated.
Posted by ppix.
Post marked as Apple Recommended
1.7k Views
After a user complained that they could no longer load partially transparent PNG images in my photo compositing app, I eventually tracked this down to a bug in iOS 15.1 (tested on beta 2). When the user selects a PNG image in the PHPickerViewController, the returned NSItemProvider reports having a file for the "public.png" UTI (and only that one). However when requesting data for that UTI, the system actually returns JPEG image data instead. Just a heads up to other developers who might run into this. Hopefully it will get fixed before 15.1 ships. I reported it as FB9665280.
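A defensive loading sketch for anyone hit by this: request the data for `public.png`, but verify the actual bytes before treating them as PNG, since the affected beta can hand back JPEG data under the PNG type identifier. The function name and completion shape here are illustrative, not from the post.

```swift
import UIKit
import UniformTypeIdentifiers

// Sketch: load the picker item's "public.png" representation, then sniff the
// magic bytes to detect mislabeled JPEG data. UIImage decodes both formats,
// but any alpha channel is lost if the system re-encoded the image as JPEG.
func loadImage(from provider: NSItemProvider,
               completion: @escaping (UIImage?, _ isReallyPNG: Bool) -> Void) {
    provider.loadDataRepresentation(forTypeIdentifier: UTType.png.identifier) { data, _ in
        guard let data = data else {
            completion(nil, false)
            return
        }
        // Every PNG file starts with the fixed signature 89 50 4E 47.
        let pngSignature: [UInt8] = [0x89, 0x50, 0x4E, 0x47]
        let isReallyPNG = data.count >= 4 && [UInt8](data.prefix(4)) == pngSignature
        completion(UIImage(data: data), isReallyPNG)
    }
}
```

This at least lets an app detect the bug and warn the user that transparency was dropped, rather than silently compositing with a wrong background.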
Posted by ppix.
Post not yet marked as solved
0 Replies
717 Views
A heads up to other developers using CoreML: make sure to test your apps and CoreML models on an A15 device like the new iPhone 13. My app uses CoreML for a custom image segmentation model, which runs fine on all previous devices but hangs/crashes on my iPhone 13 Pro and (based on customer reports) on other devices with the A15. The error seems to happen when part of the model is executing on the Neural Engine. I worked around it for now by not using the Neural Engine when running on A15 devices:
modelConfig.computeUnits = UIDevice.current.hasA15Chip() ? .cpuAndGPU : .all
where hasA15Chip() is a custom helper method. For Apple engineers: I provided additional information in FB9665812.
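One possible sketch of a `hasA15Chip()` helper like the one mentioned above (the post does not show its implementation): map the hardware model identifier to known A15 devices. The identifier list is an assumption and must be extended as new A15 hardware ships (e.g. the 6th-generation iPad mini also uses the A15).

```swift
import UIKit

// Sketch: detect A15 devices by their hardware model identifier.
// The identifier set below is an assumption covering the iPhone 13 family.
extension UIDevice {
    func hasA15Chip() -> Bool {
        var systemInfo = utsname()
        uname(&systemInfo)
        // Decode the C char array (e.g. "iPhone14,5") up to the NUL terminator.
        let machine = withUnsafeBytes(of: &systemInfo.machine) { buffer in
            String(decoding: buffer.prefix(while: { $0 != 0 }), as: UTF8.self)
        }
        // iPhone 13 mini, 13, 13 Pro, 13 Pro Max; extend for other A15 devices.
        let a15Identifiers: Set<String> = [
            "iPhone14,2", "iPhone14,3", "iPhone14,4", "iPhone14,5"
        ]
        return a15Identifiers.contains(machine)
    }
}
```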
Posted by ppix.
Post marked as solved
11 Replies
5.6k Views
We have a reasonably complex mesh and need to update the vertex positions for every frame using custom code running on the CPU. It seems like SceneKit is not really set up to make this easy, as SCNGeometry is immutable. What is the easiest (yet performant) way to achieve this? So far I can see two possible approaches:
1) Create a new SCNGeometry for every frame. I suspect that this will be prohibitively expensive, but maybe not?
2) It seems that SCNProgram and its handleBinding... method would allow updating the vertex positions. But does using SCNProgram mean that we have to write all our own shaders from scratch? Or can we still use the default SceneKit vertex and fragment shaders even when using SCNProgram?
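A third approach worth considering (my suggestion, not from the post): back the vertex position source with a Metal buffer via `SCNGeometrySource(buffer:vertexFormat:semantic:vertexCount:dataOffset:dataStride:)`, then overwrite the buffer's contents each frame. SceneKit's default shaders still render the geometry, so no SCNProgram is needed. Apple's documentation suggests doing the per-frame updates from a scene renderer delegate so they stay in sync with the render loop.

```swift
import SceneKit
import Metal

// Sketch: build an SCNGeometry whose vertex positions live in a shared
// MTLBuffer that CPU code can rewrite every frame.
func makeMutableGeometry(device: MTLDevice,
                         initialPositions: [SIMD3<Float>],
                         indices: [UInt32]) -> (SCNGeometry, MTLBuffer)? {
    let stride = MemoryLayout<SIMD3<Float>>.stride
    guard let buffer = device.makeBuffer(bytes: initialPositions,
                                         length: initialPositions.count * stride,
                                         options: .storageModeShared) else { return nil }
    let source = SCNGeometrySource(buffer: buffer,
                                   vertexFormat: .float3,
                                   semantic: .vertex,
                                   vertexCount: initialPositions.count,
                                   dataOffset: 0,
                                   dataStride: stride)
    let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
    return (SCNGeometry(sources: [source], elements: [element]), buffer)
}

// Per frame (e.g. in renderer(_:updateAtTime:)), copy the new positions in:
//   newPositions.withUnsafeBytes { src in
//       buffer.contents().copyMemory(from: src.baseAddress!, byteCount: src.count)
//   }
```

This avoids rebuilding the SCNGeometry each frame while keeping the default shading pipeline intact; normals would need their own buffer-backed source if they also change per frame.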
Posted by ppix.