Generate automatic touch on screen using coordinates

Hey there,


I'm facing a problem generating an automatic click on my current screen. My requirement is: when I get an x,y coordinate from the server, I have to get my topmost view controller and generate a touch at that location.

On my presented screen there can be multiple control objects (UIButton, segmented control, text field, table view, action sheet, etc.).

So if my x,y coordinate falls inside the frame of any of those UIControl objects, that object's action should be triggered automatically.

For example, I get the coordinate (50, 120) from the server, and on my screen a UIButton is positioned with frame (20, 100, 280, 50); in that case the button's action should be called automatically. The same should work for any other UIControl-type object (UIAlertController, UIButton, UITableViewCell didSelectRow, action sheet, etc.).
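Something like this is what I'm after (a minimal sketch; `loginButton`, `remotePoint` and the coordinate handling are just placeholders):

import UIKit

// Sketch: check whether a server-supplied point falls inside a control's frame
// and, if so, fire its action. The point is assumed to be in window coordinates.
func handleRemoteTap(at remotePoint: CGPoint, in window: UIWindow, candidate loginButton: UIButton) {
    guard let superview = loginButton.superview else { return }
    // Convert the button's frame into window coordinates so it can be
    // compared with the incoming point.
    let frameInWindow = superview.convert(loginButton.frame, to: window)
    if frameInWindow.contains(remotePoint) {
        // Fire the targets/actions wired to the button, as a real tap would.
        loginButton.sendActions(for: .touchUpInside)
    }
}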


I'm using -

Xcode 11

iOS 12

A socket to receive coordinates from the server


Ravi Sendhav

I'm facing a problem generating an automatic click on my current screen.


What is the problem, exactly?


You may read this:

https://stackoverflow.com/questions/38075699/swift-programmatically-click-on-point-in-screen/38124175

Actually, I'm developing a remote control app. I have implemented broadcasting using the ReplayKit framework. For this I have used a WebSocket (Starscream) to send frames to the server.

Now I want additional functionality. For example, my app's login screen is shown on a web page. When the login button is clicked on the web page, I receive an x,y coordinate in my app over the socket. Using that coordinate, I have to trigger the login button's action automatically in my app. The automatic click via x,y coordinate should also apply to other UIControls (table view, text field, action sheet, UIAlertController, etc.).

In short, I want remote control of my app, like TeamViewer does (cobrowse.io also provides this feature).
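The receiving side currently looks roughly like this (a sketch; the "x,y" message format, the keyWindow lookup and the function name are my own placeholders, not Starscream API):

import UIKit

// Sketch of the receiving side, assuming the server sends the coordinate
// as a plain "x,y" string.
func handleSocketMessage(_ text: String) {
    let parts = text.split(separator: ",").compactMap { Double(String($0)) }
    guard parts.count == 2,
          let window = UIApplication.shared.keyWindow else { return }

    let point = CGPoint(x: parts[0], y: parts[1])

    // hitTest finds the deepest view under the point; turning that into an
    // actual "click" on the control is the part I'm stuck on.
    if let hitView = window.hitTest(point, with: nil) {
        print("Remote tap landed on: \(hitView)")
    }
}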

The question is: what is it you do not succeed in doing?

I'm not able to generate an automatic touch in my app using the x,y coordinate.

Nope.

Can you let me know which part of the link is useful for my requirement?

I can generate a touch on a button to call its action, but I have a problem with the other types of UIControl.

I thought of the part starting here:


So the objective was to simulate a touch screen event on a webView. The structure is ViewController → self.glassView → self.webView.

I have tried this code already, but it only returns the view object that contains the coordinate I received from the server.

I want to generate a click event on the object I get from this code.

@Claude31, do you have any other suggestion or solution for this?

First you need to get the frame of the UIButton. You do that with:

myButton.frame.origin.x

myButton.frame.origin.y

myButton.frame.size.width

myButton.frame.size.height


Then if there is an overlap you call [self buttonAction:nil]; where buttonAction is the -(IBAction) method that the UIButton is triggering. You can use the fact that the 'sender' is nil to differentiate a remote push from an actual push.
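In Swift, the same idea might look roughly like this (`myButton` and `buttonAction(_:)` stand in for your own outlet and action method):

import UIKit

// Rough Swift sketch of the overlap check described above.
class LoginViewController: UIViewController {
    @IBOutlet weak var myButton: UIButton!

    func handleRemoteTap(at remotePoint: CGPoint) {
        // If the incoming point overlaps the button's frame, call the action
        // directly with a nil sender (assuming remotePoint is in the same
        // coordinate space as the button's frame).
        if myButton.frame.contains(remotePoint) {
            buttonAction(nil)
        }
    }

    // In the real project this is the method wired to the button as its action;
    // a nil sender marks a remote push, a non-nil sender a real tap.
    func buttonAction(_ sender: UIButton?) {
        let isRemote = (sender == nil)
        print("Login tapped, remote: \(isRemote)")
    }
}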


But this is all explained above - so what do you not understand?

I'm not talking about UIButton only. In my case it can be any type of UIControl object, such as a UIAlertController's OK/Cancel buttons, a table view cell, a UISegmentedControl, or anything else. I have to automatically perform the action of that UIControl object using the x,y coordinate.

In simple words, I'm working on a remote controlling application.

Replace "UIButton" in my post with each UIControl object you wish to control. You will need to identify an action for each 'touch' on each different UIControl object. It's a difficult task.

Hi Ravi,

Were you able to find out how to trigger touch events for your Remote Control App using Swift or SwiftUI?

This is not working.
