is there any kind of special remote service, like Apple's Remote Management Service, where I could dox or KYC myself to get that kind of programmatic functionality for the project? :x
but then there is a new problem: even with ReplayKit and the magic endpoint, we need a way for the AI to interact with the device. Apparently that's also tricky, hmm, as Apple doesn't provide any way to programmatically control the device or send touch/tap commands, or anything like virtual keyboard/mouse input akin to pyautogui or the Python keyboard module. hmmm, the AI would be there but heavily locked down, eh. :x
ok I think I found a way around this. You record inside the app and ask the USER for permission(s). Then use the ReplayKit framework to capture the screen, process it live, send it to an AI endpoint (needs to be a single text-image chat endpoint), parse the response, implement touch controls, and execute them. phew. I already have the bones of the app from another project, so I might as well figure out how to do this in iPhone lingo (Swift/iOS?). And I wouldn't technically be breaking any Apple rules.
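roughly what the in-app capture step could look like in Swift. Just a sketch: ScreenCaptureManager and the onFrame callback are names I made up, only the RPScreenRecorder calls are real ReplayKit API, and the user gets the recording permission prompt when startCapture runs:

```swift
import ReplayKit

/// Minimal in-app screen capture using ReplayKit.
/// iOS prompts the user for recording permission when startCapture is called.
final class ScreenCaptureManager {
    private let recorder = RPScreenRecorder.shared()

    /// Frames arrive as CMSampleBuffers; forward video frames to a handler
    /// that downscales/encodes them before they go to the AI endpoint.
    func start(onFrame: @escaping (CMSampleBuffer) -> Void) {
        guard recorder.isAvailable else {
            print("Screen recording not available on this device")
            return
        }
        recorder.startCapture(handler: { sampleBuffer, bufferType, error in
            if let error = error {
                print("Capture error: \(error)")
                return
            }
            // Only video frames matter for the vision model; skip audio buffers.
            guard bufferType == .video else { return }
            onFrame(sampleBuffer)
        }, completionHandler: { error in
            if let error = error {
                print("Failed to start capture: \(error)")
            }
        })
    }

    func stop() {
        recorder.stopCapture { error in
            if let error = error {
                print("Failed to stop capture: \(error)")
            }
        }
    }
}
```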
eh, we still need the single text-image chat AI endpoint of course.
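something like this for the client side. The endpoint URL and JSON payload are pure placeholders (no real API assumed), just a sketch of shipping a prompt plus a base64 screenshot over URLSession:

```swift
import Foundation
import UIKit

/// Hypothetical client for a single text-image chat endpoint.
/// The URL and JSON shape below are placeholders, not a real API.
struct VisionChatClient {
    let endpoint = URL(string: "https://example.com/v1/chat")! // placeholder

    func send(prompt: String, frame: UIImage,
              completion: @escaping (Result<String, Error>) -> Void) {
        guard let jpeg = frame.jpegData(compressionQuality: 0.5) else { return }

        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        // Assumed payload: prompt text plus a base64-encoded screenshot.
        let body: [String: Any] = [
            "prompt": prompt,
            "image_base64": jpeg.base64EncodedString()
        ]
        request.httpBody = try? JSONSerialization.data(withJSONObject: body)

        URLSession.shared.dataTask(with: request) { data, _, error in
            if let error = error { return completion(.failure(error)) }
            let text = data.flatMap { String(data: $0, encoding: .utf8) } ?? ""
            completion(.success(text))
        }.resume()
    }
}
```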
basically AI-controlled devices/apps that can better assist people. The parser acts as the AI's eyes and hands, sending it images and executing commands on its behalf. :3
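and the "hands" part, sketched. The AICommand JSON format is something I'm inventing here (you'd tell the model to reply in it), and since iOS won't let you inject touches into other apps, the execution is limited to controls inside this app's own view hierarchy:

```swift
import UIKit

/// Hypothetical command format the model is asked to reply with,
/// e.g. {"action": "tap", "x": 120, "y": 340}.
struct AICommand: Decodable {
    let action: String
    let x: CGFloat
    let y: CGFloat
}

/// Parses the model's reply and acts on it inside the app's own view hierarchy.
/// There is no API for sending touches to other apps, so the "hands" can only
/// hit-test this app's windows and fire the matching control's action.
enum CommandExecutor {
    static func execute(responseJSON: Data, in window: UIWindow) {
        guard let cmd = try? JSONDecoder().decode(AICommand.self, from: responseJSON),
              cmd.action == "tap" else { return }

        let point = CGPoint(x: cmd.x, y: cmd.y)
        // Find the view under the requested coordinates and, if it's a control,
        // fire its primary action as a stand-in for a real touch.
        if let control = window.hitTest(point, with: nil) as? UIControl {
            control.sendActions(for: .touchUpInside)
        }
    }
}
```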
hope I got the tags right on this post lol. hello? :x