Posts

Post not yet marked as solved
14 Replies
4.1k Views
I'm interested in exploring the idea of connecting an iPhone to a Mac/PC. The initial idea is to allow the computer to access/use the iPhone's camera and microphone, but I'd also like to understand how communication between an iPhone and a Mac/PC could work in a general sense. There are many posts about an iPhone integrating with external hardware devices, and MFi is mentioned as well as Redpark cables - but I'm not sure whether the same things apply when working with a Mac/PC. So, some questions:
- What protocols, technologies or frameworks would I use to send audio and video from an iPhone to a Mac over a USB cable? If the type of cable matters or makes a difference (i.e. USB-C vs USB-A), please call it out - but I suspect the type of cable makes no difference.
- What protocols, technologies or frameworks would I use to send commands from a Mac to the iPhone over USB so that they can be handled by the iOS app?
- It appears that developers are not able to send raw USB protocol/messages/packets unless they are part of the MFi program - is that correct? Is there another way to achieve this communication, e.g. with some type of "middle-man" (a Redpark cable? some other device?)?
- I've seen iOS applications that turn your phone into a webcam (they usually offer connectivity over USB and over the network (NDI)), and these applications require an additional download on the Mac/PC. My assumption is that the iOS application sends messages/packets to the Mac/PC application, and the Mac/PC application handles all the work of exposing that video/audio stream as a "camera" device to the system (see the sketch below). Would this be the only way to achieve this functionality? Could an iOS application be exposed as a camera to a Mac (ignoring PC for now) natively, so that it works the same way as plugging in a webcam? Is this possible with MFi alone, or is it possible with a third-party device (cable/hardware) between the iPhone and the Mac?
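To make that assumption concrete, here's a rough sketch of how I imagine those webcam-style apps might be structured, assuming the transport is the local network (Bonjour + TCP via Network.framework). The service type "_phonecam._tcp" and the frame payload are placeholders I made up, not any real product's protocol:

```swift
import Foundation
import Network

// --- iPhone side: advertise a Bonjour service and push encoded frames ---
let listener = try! NWListener(using: .tcp)
listener.service = NWListener.Service(name: "PhoneCam", type: "_phonecam._tcp")
listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
    // A real app would pull frames from an AVCaptureSession and compress them
    // (e.g. H.264 via VideoToolbox) before sending.
    let placeholderFrame = Data([0x00, 0x01, 0x02])
    connection.send(content: placeholderFrame, completion: .contentProcessed { _ in })
}
listener.start(queue: .main)

// --- Mac side: find the service, receive frames, hand them to a virtual camera ---
let endpoint = NWEndpoint.service(name: "PhoneCam", type: "_phonecam._tcp",
                                  domain: "local.", interface: nil)
let connection = NWConnection(to: endpoint, using: .tcp)
connection.start(queue: .main)
connection.receive(minimumIncompleteLength: 1, maximumLength: 65_536) { data, _, _, _ in
    // The Mac helper app would decode this and expose it to the system as a
    // camera device, which is presumably why these products need a separate
    // download on the Mac.
    print("received \(data?.count ?? 0) bytes")
}
```

My understanding is that over USB these apps rely on the system tunnelling the same kind of connection between the phone and the Mac, which doesn't appear to be public API - hence the questions above.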
Posted by jmurphyau.
Post not yet marked as solved
0 Replies
301 Views
I want to build something that interacts with an external application, but that application doesn't work well with Mac accessibility. I've used the inspector to navigate as far/deep as I could, and I'm left with a very large UI element that has various functions within it - and I want to interact with one of those functions. I was wondering whether it's possible with the accessibility APIs to see what that element looks like (like a screenshot?), which I could then use to work out where in that UI a particular button is and what its current state is, by analysing the image.
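To illustrate what I had in mind, here's a rough sketch, assuming I already have the AXUIElement for that large element from walking the hierarchy (the element lookup itself is omitted): read its position and size from the accessibility attributes, then capture just that region of the screen for image analysis.

```swift
import ApplicationServices
import CoreGraphics

// Given the big opaque AXUIElement found with the inspector, read its
// on-screen frame from the accessibility position/size attributes.
func frame(of element: AXUIElement) -> CGRect? {
    var positionValue: CFTypeRef?
    var sizeValue: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element, kAXPositionAttribute as CFString, &positionValue) == .success,
          AXUIElementCopyAttributeValue(element, kAXSizeAttribute as CFString, &sizeValue) == .success else {
        return nil
    }
    var origin = CGPoint.zero
    var size = CGSize.zero
    AXValueGetValue(positionValue as! AXValue, .cgPoint, &origin)
    AXValueGetValue(sizeValue as! AXValue, .cgSize, &size)
    return CGRect(origin: origin, size: size)
}

// Capture just that rectangle of the screen so the image can be analysed to
// locate the button and infer its state. (CGWindowListCreateImage is the
// older API; newer systems would use ScreenCaptureKit instead.)
func snapshot(of element: AXUIElement) -> CGImage? {
    guard let rect = frame(of: element) else { return nil }
    return CGWindowListCreateImage(rect, .optionOnScreenOnly, kCGNullWindowID, [])
}
```

I assume this would need both the Accessibility permission (for the AX calls) and the Screen Recording permission (for the capture), but I'd like to know if there's a better way.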
Posted by jmurphyau.