I have controls in both SwiftUI views and in UIKit view controllers/views that are embedded via UIViewRepresentable/UIViewControllerRepresentable. In my (music) app it's common for the user to interact with multiple controls at the same time, e.g. tweaking knobs or moving sliders.
I've noticed that controls that are all inside UIKit recognise simultaneous touches/gestures without any problem, and the same is true for controls that are all inside SwiftUI. But if one control is inside a SwiftUI view and the other is inside a UIKit view, then only the one that's touched first registers touches.
Is this a known issue/limitation of mixing UIKit and SwiftUI views? I can't find a mechanism/API that would let me allow simultaneous touch/gesture recognition across the UIKit/SwiftUI boundary.
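For context, here's a minimal sketch of the per-framework APIs I'm aware of. The delegate method and `.simultaneousGesture` are real UIKit/SwiftUI APIs; the `KnobView` control and the delegate class name are just hypothetical examples. Each one works within its own framework, but neither seems to have any effect across the representable boundary:

```swift
import SwiftUI
import UIKit

// UIKit side: a delegate that opts in to simultaneous recognition.
// This works between recognizers that both live in UIKit, but (in my
// testing) has no effect when the other gesture is in a SwiftUI view.
final class SimultaneousDelegate: NSObject, UIGestureRecognizerDelegate {
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        true
    }
}

// SwiftUI side: .simultaneousGesture lets gestures coexist, but again
// only with other SwiftUI gestures. (KnobView is a made-up control.)
struct KnobView: View {
    @State private var value = 0.0
    var body: some View {
        Circle()
            .simultaneousGesture(
                DragGesture().onChanged { value = $0.translation.height }
            )
    }
}
```

So both frameworks expose an "allow simultaneous" opt-in internally; what I'm missing is the equivalent knob for the mixed case.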