For the ones that like to have this functionality, I've created RocketSim 6.0. It allows creating screenshots with bezels, as well as recordings.
You can now join the beta:
https://testflight.apple.com/join/ORz3QWRv
Is your start method called? What about your stop method? If so, what do you see passed to the reason parameter?
I managed to find the cause: I was using true for both:
let providerConfiguration = NEFilterProviderConfiguration()
providerConfiguration.filterSockets = true
providerConfiguration.filterPackets = true
After changing filterPackets to false my extension didn't stop anymore!
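For anyone hitting the same issue, here's a minimal sketch of the configuration that worked for me. The NEFilterManager setup around it is an assumption on my side, not something discussed in this thread:

```swift
import NetworkExtension

// Sketch: only filter at the socket level. Enabling filterPackets
// as well is what caused my extension to stop.
let providerConfiguration = NEFilterProviderConfiguration()
providerConfiguration.filterSockets = true
providerConfiguration.filterPackets = false // was `true`; changing this fixed it

// Assumed surrounding setup: attach the configuration to the shared manager.
let manager = NEFilterManager.shared()
manager.providerConfiguration = providerConfiguration
```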
I’m not sure I understand this question. However, the code snippet you posted is reasonable enough.
Perfect, that's all I need to know right now! After my extension started working, I was able to disallow all incoming requests and verified my concept.
Btw, I found a small improvement in your guideline: https://developer.apple.com/forums/thread/725805
The next issue you’ll find is that choosing Product > Run runs the app from the build products directory rather than the Applications directory. To fix that:
Edit your app’s scheme again.
On the left, select Debug.
Debug was a bit unclear to me at first. I would better understand this if it was named Run.
Btw, your instructions helped big time!
When I deactivate an extension I still see it in systemextensionsctl list but it’s flagged as deactivated
I see the same; the extension is deactivated. However, my app is still installed in /Applications with the deactivated extension. Re-running from Xcode 14.2 on macOS 13.2.1 (22D68), I couldn't get my new build to execute.
I did find another way to solve it without restarting:
Remove the application from /Applications
Press Continue to confirm deleting the app's extension
Empty the Trash
Re-run the app
This workaround is quick enough, but ideally I would just re-run from Xcode and have my latest Network Extension code working.
Thanks for those insights! It seems there's still a problematic scenario. We've been able to reproduce the user's behavior as follows:
Create a video in GoPro Quik
Export it to the Photos app
Import it into the WeTransfer app <- This is where we use PHPickerViewController in combination with .current
Notice a difference in file size as reported by the Photos app
Even though the video is locally stored (not in iCloud), a "Preparing..." progress popup appears. That indicates transcoding is happening.
I've attached a few screenshots to indicate our issue.
Could it be that there's still some kind of setting we're missing to indicate our app allows any kind of file to be added?
After some further investigation, it turns out that the HEVC video gets transcoded into MPEG-4, even though we've configured our PHPickerViewController as follows:
var configuration = PHPickerConfiguration()
configuration.selectionLimit = 0
configuration.preferredAssetRepresentationMode = .current
return PHPickerViewController(configuration: configuration)
We're loading the file using itemProvider.loadFileRepresentation(forTypeIdentifier: "public.movie"), which I believe should be correct.
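For context, here's roughly how we load the picked item. One thing worth noting (a general loadFileRepresentation caveat, not specific to this bug): the URL handed to the completion handler is temporary and only valid for the handler's duration, so we copy the file out first. The destination path below is just an example:

```swift
import PhotosUI
import UniformTypeIdentifiers

// PHPickerViewControllerDelegate callback (sketch).
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    for result in results {
        let provider = result.itemProvider
        guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { continue }

        provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
            guard let url else { return }
            // The URL is temporary: copy the file before the handler returns.
            let destination = FileManager.default.temporaryDirectory
                .appendingPathComponent(url.lastPathComponent)
            try? FileManager.default.copyItem(at: url, to: destination)
        }
    }
}
```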
A few interesting findings:
If I AirDrop the file to my Mac, it remains HEVC and keeps the correct size
If I load that file into the Xcode Simulator and add it to our app running in the Simulator, transcoding does not take place
The latter is especially interesting: what is the difference between the file on the device vs. the file in the Simulator after AirDropping? Why does transcoding take place on the device and not in the Simulator?
I'd almost say there's some kind of device setting or iCloud state influencing the result, but at this point I've got no clue. Looking forward to your thoughts!
I'm building RocketSim (https://www.rocketsim.app), a developer tool that already works for the Simulator.
RocketSim shows a side window next to the Simulator, and would potentially also show it next to an iPhone Mirror.
The idea is to add actions that allow developers to write automations like:
Login using a username/password (requiring an API to submit text)
Tap a button, fill in text, tap another button (app navigation automation; requires both text input and UI interaction)
Developers have to perform many repetitive interactions in their apps during debugging, so it would be awesome if I could automate that for real devices!
Additionally, an API to enable developer settings like:
Network conditioner
Accessibility settings
Thanks a lot, this helps a ton! I'll check out the NETransparentProxyProvider and loop back in case it doesn't work out.
Oh, hey Antoine!
Sounds like we know each other? 🙈
AFAIK there’s no API to do this sort of thing. There is the devicectl tool. Have you played around with it already?
I have checked it out, but didn't find evidence of it being my answer. I'll have to check out the new stuff, though! However, I did wonder whether this new framework has anything to do with communicating to connected iPhones: https://developer.apple.com/documentation/CoreHID
Only parasocially, in that I read your blog.
Ah, that's so cool! Makes me proud to see Apple developers read my content too 💪
Please do. Honestly, I don’t think there’s enough there to do what you want, but it’s good to confirm that before you go off to file your ERs.
Dang, you're completely right. This helps a ton already for what I want to build! Thanks a lot!
Nope. CoreHID is a new high-level API to interact with HID devices, and iOS devices aren’t those.
Perfect, that clarifies a lot and saves me some time investigating that library.