Manage hit test mask in SwiftUI for an Image with transparency

Question says it all.

I want transparent pixels to let taps / clicks / gestures pass through, while opaque pixels catch them.

Obviously, being able to control the behaviour would be even better, so I could ignore slightly translucent pixels too.

Pre-processing is not possible since these are user-provided images, so it's not easy.

So far, the best idea I have come up with is a global gesture recognizer: try to figure out where in my complex hierarchy the tap falls and check whether the image is underneath. But that seems overly complicated for something so simple and basic, really.
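For context, here is a small sketch of the closest built-in tools (the asset name "userImage" is just a placeholder): SwiftUI's contentShape and allowsHitTesting work at the level of shapes or whole views, not individual pixels, which is why something custom is needed here.

```swift
import SwiftUI

// What SwiftUI offers out of the box is shape-based, not pixel-based:
// .contentShape restricts the hit area to a Shape, and
// .allowsHitTesting(false) disables the whole view. Neither looks at alpha.
struct ShapeHitTestExample: View {
    var body: some View {
        Image("userImage")                   // placeholder asset name
            .resizable()
            .contentShape(Circle())          // taps only land inside the circle
            .onTapGesture { print("tapped inside the circle") }
    }
}
```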

What you are trying is not so simple and basic even in UIKit. Can you share the complete code of your current (complicated) solution? Some readers might think of improvements if they could see the code.

When you say transparent, do you mean the color is .clear?

@Claude31 - I mean I have a 32-bit image with an alpha channel, so my pixels are (r,g,b,255) when fully opaque and (r,g,b,0) when fully transparent. Since the image has transparency in it, having it catch a tap in a place where you can see through to what's underneath is totally counter-intuitive.
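For reference, a minimal sketch of one possible direction, assuming a UIKit-backed image view is acceptable inside the SwiftUI hierarchy. The names AlphaHitTestImageView, MaskedHitTestImage and alphaThreshold are invented for illustration: a UIImageView subclass refuses touches wherever the pixel alpha falls below a configurable threshold, so they fall through to whatever sits underneath.

```swift
import SwiftUI
import UIKit

/// Sketch only: a UIImageView subclass that ignores touches landing on pixels
/// whose alpha is below `alphaThreshold`, letting them pass through.
final class AlphaHitTestImageView: UIImageView {

    /// Pixels whose alpha is below this value (0...1) do not catch touches.
    var alphaThreshold: CGFloat = 0.5

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        guard super.point(inside: point, with: event),
              let cgImage = image?.cgImage,
              bounds.width > 0, bounds.height > 0 else { return false }

        // Map the touch from view coordinates to image coordinates
        // (assumes the image is drawn edge to edge; adjust for contentMode).
        let imagePoint = CGPoint(
            x: point.x / bounds.width * CGFloat(cgImage.width),
            y: point.y / bounds.height * CGFloat(cgImage.height)
        )
        return alpha(at: imagePoint, in: cgImage) >= alphaThreshold
    }

    /// Draws only the pixel of interest into a 1x1 RGBA bitmap and reads back
    /// its alpha, so the source image's own pixel format never matters.
    private func alpha(at point: CGPoint, in cgImage: CGImage) -> CGFloat {
        let width = CGFloat(cgImage.width)
        let height = CGFloat(cgImage.height)
        guard point.x >= 0, point.y >= 0, point.x < width, point.y < height else { return 0 }

        var pixel: [UInt8] = [0, 0, 0, 0]
        return pixel.withUnsafeMutableBytes { buffer -> CGFloat in
            guard let context = CGContext(
                data: buffer.baseAddress, width: 1, height: 1,
                bitsPerComponent: 8, bytesPerRow: 4,
                space: CGColorSpaceCreateDeviceRGB(),
                bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
            ) else { return 0 }

            context.setBlendMode(.copy)
            // Core Graphics uses a bottom-left origin: flip y while shifting
            // the image so the wanted pixel lands inside the 1x1 window.
            context.translateBy(x: -point.x, y: point.y - height)
            context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
            return CGFloat(buffer[3]) / 255.0
        }
    }
}

/// Thin SwiftUI wrapper so the view can sit inside an existing SwiftUI hierarchy.
struct MaskedHitTestImage: UIViewRepresentable {
    let uiImage: UIImage
    var alphaThreshold: CGFloat = 0.5

    func makeUIView(context: Context) -> AlphaHitTestImageView {
        let view = AlphaHitTestImageView(image: uiImage)
        view.isUserInteractionEnabled = true
        view.alphaThreshold = alphaThreshold
        return view
    }

    func updateUIView(_ view: AlphaHitTestImageView, context: Context) {
        view.image = uiImage
        view.alphaThreshold = alphaThreshold
    }
}
```

It would be used in place of Image, e.g. MaskedHitTestImage(uiImage: someUserImage, alphaThreshold: 0.1); how reliably touches then fall through to sibling SwiftUI views behind it is something to verify case by case.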

@OOPer I only have the design in my head, not the actual code yet, but it mostly boils down to building a side channel next to the traditional tap handling: a tap delegate captures the tap, and its global coordinates are sent through the Environment. They are then processed by every object that wants taps; each object is responsible for knowing whether the tap falls on top of it (through geometry checks) and for checking the colour of the image's pixel at that point (it would probably be worthwhile to sample around five points so the test is not too precise). Lastly, the objects need some holder of their respective Z ordering (mine would live in the model, as I already know the innate Z ordering).
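A rough sketch of that side channel, with all names invented: a tap anywhere in the hierarchy is captured once in global coordinates and published, and each interested view then checks whether the point falls inside its own global frame before doing any per-pixel work.

```swift
import SwiftUI

/// Broadcasts the last tap location, in global coordinates, to interested views.
final class TapBroadcaster: ObservableObject {
    @Published var lastGlobalTap: CGPoint?
}

struct TappableSprite: View {
    @EnvironmentObject var taps: TapBroadcaster
    let name: String

    var body: some View {
        GeometryReader { proxy in
            Rectangle()              // stand-in for the real content
                .opacity(0.3)
                .onChange(of: taps.lastGlobalTap) { point in
                    guard let point, proxy.frame(in: .global).contains(point) else { return }
                    // This is where the per-pixel alpha check (and the Z-order
                    // check) would go before deciding to consume the tap.
                    print("\(name) was tapped at \(point)")
                }
        }
    }
}

struct ContentView: View {
    @StateObject private var taps = TapBroadcaster()

    var body: some View {
        ZStack {
            TappableSprite(name: "background")
            TappableSprite(name: "foreground")
                .frame(width: 200, height: 200)
        }
        .contentShape(Rectangle())
        // iOS 16+: tap location reported in a chosen coordinate space.
        .onTapGesture(coordinateSpace: .global) { location in
            taps.lastGlobalTap = location
        }
        .environmentObject(taps)
    }
}
```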

The problem is that it doesn't actually scale, as every tappable object would need to listen to this. So an optimization might be to keep track of the global geometries of all the objects in an ordered list, walk that list in the manager itself, foregoing the SwiftUI event system entirely, and then call back to the object to say "you got tapped there, do you want it?".
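A sketch of that central manager, names again invented: tappable objects register their global frame, a Z index and a callback; on each tap the manager walks the list front-to-back and stops at the first object that accepts.

```swift
import SwiftUI

/// Central registry that dispatches global taps front-to-back, outside of
/// SwiftUI's own gesture system.
final class HitTestManager: ObservableObject {
    struct Entry {
        let id: UUID
        var globalFrame: CGRect
        var zIndex: Double
        /// Return true to consume the tap (e.g. after checking the pixel alpha).
        var wantsTap: (CGPoint) -> Bool
    }

    private var entries: [Entry] = []

    func register(_ entry: Entry) {
        entries.removeAll { $0.id == entry.id }
        entries.append(entry)
        entries.sort { $0.zIndex > $1.zIndex }   // front-most first
    }

    func unregister(id: UUID) {
        entries.removeAll { $0.id == id }
    }

    /// Called once per tap with the global location; delivers it to the
    /// front-most object that both contains the point and accepts it.
    func dispatch(globalTap point: CGPoint) {
        for entry in entries where entry.globalFrame.contains(point) {
            if entry.wantsTap(point) { return }
        }
    }
}
```

Sorting once at registration keeps each dispatch to a single front-to-back pass over the registered frames.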

So ... mostly, recreating an entire artificial, makeshift, slow event-passing system alongside SwiftUI, just because...

There might be other solutions, but if none exists, I would propose that Apple add a gesture handler which, given local coordinates, lets a view say whether it will "accept", "drain" or "ignore" a Gesture, so the object can give its blessing. That would solve my issue, as well as allow very complex operations.
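Purely hypothetical, to make the proposal concrete: nothing like this exists in SwiftUI today, and both names below are made up.

```swift
import SwiftUI

/// What the proposed per-gesture decision might look like.
enum GestureVerdict {
    case accept   // handle the gesture here
    case drain    // consume it without handling it
    case ignore   // let it pass through to whatever is underneath
}

extension View {
    // Hypothetical signature, for illustration only; the body is a no-op.
    func gestureVerdict(_ decide: @escaping (CGPoint) -> GestureVerdict) -> some View {
        self // a real implementation would need to hook into hit testing
    }
}

// Rough intended usage (alphaAtPoint is also hypothetical):
// Image(uiImage: userImage)
//     .gestureVerdict { local in
//         alphaAtPoint(local) > 0.1 ? .accept : .ignore
//     }
```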
