I'm implementing a SwiftUI control layered over a UIView's drawing surface, and often both are in use at the same time. The problem is that starting with the SwiftUI gesture blocks the UIKit UITouches, yet it works the other way around: start with a UITouch first and then a SwiftUI gesture. I also need UITouch's force and majorRadius, which aren't available in a SwiftUI gesture (I think).
Here's an example:
import SwiftUI

struct ContentView: View {
    @GestureState private var touchXY: CGPoint = .zero
    var body: some View {
        ZStack {
            TouchRepresentable()
            Rectangle()
                .foregroundColor(.red)
                .frame(width: 128, height: 128)
                .gesture(DragGesture(minimumDistance: 0)
                    .updating($touchXY) { (value, touchXY, _) in touchXY = value.location })
                .onChange(of: touchXY) {
                    print(String(format: "SwiftUI(%.2g,%.2g)", $0.x, $0.y), terminator: " ") }
                //.allowsHitTesting(true) no difference
        }
    }
}
struct TouchRepresentable: UIViewRepresentable {
    typealias Context = UIViewRepresentableContext<TouchRepresentable>
    public func makeUIView(context: Context) -> TouchView { return TouchView() }
    public func updateUIView(_ uiView: TouchView, context: Context) {}
}
class TouchView: UIView, UIGestureRecognizerDelegate {

    func updateTouches(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let location = touch.location(in: nil)
            print(String(format: "UIKit(%g,%g) ", round(location.x), round(location.y)), terminator: " ")
        }
    }
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
}
Because this is multitouch, you need to test on a real device.
Sequence A:
- finger 1 starts dragging outside of red square (UIKit)
- finger 2 simultaneously drags inside of red square (SwiftUI)
⟹ Console shows both UIKit and SwiftUI touch positions
Sequence B:
- finger 1 starts dragging inside of red square (SwiftUI)
- finger 2 simultaneously drags outside of red square (UIKit)
⟹ Console shows only SwiftUI touch positions
I would like to get both positions in Sequence B.
Sorry for the delay. I wound up using a UIHostingController and doing a custom hitTest, within a SwiftUI view model. This is within an MVVM architecture. Going deeper is rather complex and may change for xrOS. Anyway, here's the starting-point clue:
override func viewDidAppear(_ animated: Bool) {

    view.addSubview(pipeline.mtkView)
    pipeline.makeShader(for: root˚)
    pipeline.setupPipeline()
    pipeline.settingUp = false

    let touchView = SkyTouchView(touchDraw)                  // UIKit
    let menuView = MenuView(SkyFlo.shared.root˚, touchView)  // SwiftUI
    let hostView = UIHostingController(rootView: menuView).view

    if let hostView {
        view.addSubview(hostView)
        hostView.translatesAutoresizingMaskIntoConstraints = false
        hostView.topAnchor.constraint(equalTo: view.topAnchor).isActive = true
        hostView.bottomAnchor.constraint(equalTo: view.bottomAnchor).isActive = true
        hostView.leftAnchor.constraint(equalTo: view.leftAnchor).isActive = true
        hostView.rightAnchor.constraint(equalTo: view.rightAnchor).isActive = true
        hostView.backgroundColor = .clear
    }
}
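The custom hitTest part is roughly this idea (a minimal sketch, not my actual project code): the SwiftUI side reports the frames of its interactive controls into a shared model, and a wrapper view around the hosting controller's view only claims touches that land inside those frames, letting everything else fall through to the UIKit drawing view. MenuFrameModel and PassthroughHostView are illustrative names, not from the project above.

import SwiftUI
import UIKit

final class MenuFrameModel {
    static let shared = MenuFrameModel()
    var controlFrames: [CGRect] = []   // window coordinates, updated from SwiftUI
}

final class PassthroughHostView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        // Convert to window coordinates before comparing against reported frames.
        let windowPoint = convert(point, to: nil)
        let hitsMenu = MenuFrameModel.shared.controlFrames.contains { $0.contains(windowPoint) }
        // Claim the touch for SwiftUI only when it lands on a menu control;
        // otherwise return nil so it falls through to the UIKit view below.
        return hitsMenu ? super.hitTest(point, with: event) : nil
    }
}

In that sketch, the SwiftUI menu views would report their frames (for example via a GeometryReader) into MenuFrameModel, and the hostView from viewDidAppear above would be embedded in a PassthroughHostView rather than added directly.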