It's June 2024, two and a half years after this issue was reported, and it still exists in Xcode 15.4! I know it is the same issue because closing the tabs, as recommended here, solved it for me.
It is a disgrace for Apple that a bug in the most basic functionality of its editor, one that affects the productivity of every developer, does not get the priority it deserves. Xcode is already a can of bugs, but this is a new level of pissing on its users.
It looks like what I need was added in iOS 16 beta https://developer.apple.com/documentation/coregraphics/cgpath/3994964-intersection?changes=latest_minor&language=_5
So I guess I was right: it is not currently available.
However, for my particular case I did not need the intersection path itself, only to stroke it, so a colleague suggested clipping one path by the other and stroking the result, then switching roles and stroking the other part. Together, this approximately strokes the intersection path (with some visual artifacts at the intersection points).
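The clip-and-stroke trick above can be sketched roughly as follows. This is a minimal sketch, assuming two existing CGPaths and a current CGContext; the function and parameter names are my own, not from the original post:

```swift
import CoreGraphics

// Approximately stroke the intersection of two paths by stroking each path
// while clipped to the other. The names here are hypothetical.
func strokeApproximateIntersection(of pathA: CGPath, and pathB: CGPath, in ctx: CGContext) {
    // Pass 1: clip to pathA, stroke pathB. Only the portion of pathB's
    // outline that lies inside pathA is drawn.
    ctx.saveGState()
    ctx.addPath(pathA)
    ctx.clip()
    ctx.addPath(pathB)
    ctx.strokePath()
    ctx.restoreGState()

    // Pass 2: switch roles. Clip to pathB, stroke pathA.
    ctx.saveGState()
    ctx.addPath(pathB)
    ctx.clip()
    ctx.addPath(pathA)
    ctx.strokePath()
    ctx.restoreGState()
}
```

The two passes together cover the whole boundary of the intersection region, which is why artifacts show up only where the two boundaries cross.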
I experience the same error message with the iPhone 11 Pro and a select few other models. It happens with extremely short assets encoded as MPEG-4 Part 2, using ffmpeg's -c:v mpeg4 option.
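For anyone trying to reproduce, a command along these lines produces such an asset (the input file name and the half-second duration are placeholders, not from the original report):

```shell
# Encode a very short MPEG-4 Part 2 clip from an existing source.
# input.mp4 and -t 0.5 are placeholders; any very short duration applies.
ffmpeg -i input.mp4 -t 0.5 -c:v mpeg4 short-mpeg4.mp4
```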
I found a workaround; I am not sure if this is the correct way. The following returns a format that mainMixerNode will accept.
[self.engine.mainMixerNode inputFormatForBus:0]
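In Swift, the same idea looks like this. A minimal sketch, assuming the goal is to connect a player node to the mixer using the mixer's own input format rather than a hand-built AVAudioFormat:

```swift
import AVFoundation

// Ask the main mixer which format it accepts on its input bus,
// then use that format when making the connection.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

let mixerFormat = engine.mainMixerNode.inputFormat(forBus: 0)
engine.connect(player, to: engine.mainMixerNode, format: mixerFormat)
```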
In the first paragraph, I meant to write [AVSpeechSynthesizer writeUtterance:toBufferCallback:].
I found a workaround for this: I converted the SVG files to PDFs. It seems the PDF rendering code consumes less memory.
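The original post does not say which converter was used; one possible route is librsvg's rsvg-convert, which can emit PDF directly:

```shell
# Convert an SVG asset to PDF (rsvg-convert is one option among many;
# the file names are placeholders).
rsvg-convert -f pdf -o image.pdf image.svg
```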
I had a somewhat different but related issue when building on an M1 MacBook Pro with Xcode running under Rosetta. Some of Apple's APIs failed, and alongside the failures this error was printed to the console.
In my opinion this is some kind of simulator bug, and it is present as of Xcode 12.3. Maybe 12.4 solves it, but I did not try.
Anyway, what solved it for me was downloading the iOS 13.7 simulator runtime and creating a new simulator with that iOS version.
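After the 13.7 runtime is downloaded (via Xcode's Components preferences), the new simulator can also be created from the command line. A sketch; the device name is arbitrary and the device type is my example, not from the original post:

```shell
# Create a simulator named "iPhone 11 (13.7)" using the iPhone 11 device
# type and the iOS 13.7 runtime identifier.
xcrun simctl create "iPhone 11 (13.7)" "iPhone 11" \
    "com.apple.CoreSimulator.SimRuntime.iOS-13-7"
```

`xcrun simctl list runtimes` shows the exact runtime identifiers available on a given machine.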
While it does not use XPC, the code in the following repo, https://github.com/johnboiles/obs-mac-virtualcam, demonstrates how to communicate between a DAL plugin and a user-space application. It uses Foundation's distributed objects (https://developer.apple.com/documentation/foundation/object_runtime/distributed_objects_support?language=objc) and is very straightforward and easy to understand.
Edit: Oh, I missed the part which says that distributed objects are deprecated :(