use PencilKit programmatically with two stacked PKCanvasView

I'm trying to stack two PKCanvasViews, using the lower layer for highlighting and the upper layer for pen writing, so that the highlighting strokes sit below the handwritten text rather than over it, keeping the text perfectly readable even after it has been highlighted.
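To make the setup concrete, here is a minimal sketch of the stacking I have in mind, in plain UIKit (the class and property names are my own, and the marker color/width values are just placeholders):

```swift
import UIKit
import PencilKit

// Two transparent PKCanvasViews stacked in one container:
// the highlighter canvas is added first, so it sits below the pen canvas.
final class StackedCanvasController: UIViewController {
    let highlightCanvas = PKCanvasView()  // lower layer: highlighter strokes
    let penCanvas = PKCanvasView()        // upper layer: pen strokes

    override func viewDidLoad() {
        super.viewDidLoad()
        for canvas in [highlightCanvas, penCanvas] {
            canvas.frame = view.bounds
            canvas.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            canvas.backgroundColor = .clear  // let the lower layer show through
            canvas.isOpaque = false
            view.addSubview(canvas)
        }
        highlightCanvas.tool = PKInkingTool(.marker, color: .yellow, width: 20)
        penCanvas.tool = PKInkingTool(.pen, color: .black, width: 15)
    }
}
```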

For this I need to be able to redirect Apple Pencil gestures to the appropriate layer(s) depending on the chosen tool: the lower layer for the highlighter tool, the upper layer for the pen tool, both layers for the eraser tool, and so on.

I've easily managed to create my own tool buttons (Pen, Highlighter, Eraser, Selection, Undo, Redo) with SwiftUI and to select the tool for a PKCanvasView, as in the example code below:

Code Block
self.canvasView.tool = PKInkingTool(.pen, color: .black, width: 15)

But for the life of me, I still can't find a way to:
  • intercept and redirect the gestures produced by the Apple Pencil to the active tool of one or several PKCanvasViews, and/or capture the gestures directly and emulate the tools on the PKCanvasView or its stroke array;

  • manually set some of the tool parameters when/after creating the tools (the eraser tool width, for instance);

  • apply a custom semi-transparent opacity to a PKCanvasView layer.
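For the first and third points, the closest workaround I can think of (an assumption on my part, not an official PencilKit routing API) is to enable touch handling only on the canvas that owns the currently selected tool, and to dim the lower layer with a plain view alpha. A sketch, where the enum and function names are my own:

```swift
import PencilKit

// Hypothetical tool selection for the two-canvas setup.
enum ActiveTool {
    case pen, highlighter, eraser
}

// Route input by toggling which canvas accepts touches, instead of
// intercepting the Pencil gestures themselves.
func activate(_ tool: ActiveTool,
              penCanvas: PKCanvasView,
              highlightCanvas: PKCanvasView) {
    switch tool {
    case .pen:
        penCanvas.isUserInteractionEnabled = true
        highlightCanvas.isUserInteractionEnabled = false
        penCanvas.tool = PKInkingTool(.pen, color: .black, width: 15)
    case .highlighter:
        penCanvas.isUserInteractionEnabled = false
        highlightCanvas.isUserInteractionEnabled = true
        highlightCanvas.tool = PKInkingTool(.marker, color: .yellow, width: 20)
        // Semi-transparent lower layer, applied to the whole view:
        highlightCanvas.alpha = 0.5
    case .eraser:
        // Erasing across both layers at once would still need custom work,
        // e.g. hit-testing each canvas's drawing.strokes manually; here the
        // eraser is simply given to the pen layer.
        penCanvas.isUserInteractionEnabled = true
        highlightCanvas.isUserInteractionEnabled = false
        penCanvas.tool = PKEraserTool(.vector)
    }
}
```

This still leaves the second point open: as far as I can tell, the PKEraserTool initializer I'm using takes no width parameter.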

I'm becoming quite desperate at this point, so any help with this would be highly appreciated!

Hey, did you get any leads or a solution for this? I'm facing the same problem.
