Apple often represents haptics as audio for presentation purposes in its videos and learning resources.
I am curious if:
...Apple has released, or is willing to release, any tools that may have been used to synthesize audio representing haptic patterns (such as in their WWDC19 audio-haptic presentation)?
...there are any current tools available that take haptic instructions (like AHAP) as input and output an audio file? (See the sketch after this list for the kind of thing I mean.)
...there is some low-level access to the signal that drives the Taptic Engine, so that it can be repurposed as an audio stream?
...you have any other suggestions!
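To illustrate the second bullet: since AHAP is just JSON, a rough sketch of such a tool isn't much code. Loud assumptions in the sketch below: it only handles `HapticTransient` events, and the sharpness-to-frequency mapping (roughly 80-230 Hz) is my guess at the actuator's resonant band, not anything Apple has published.

```swift
import AVFoundation
import Foundation

// Minimal Codable model for the parts of an AHAP file used here.
struct AHAP: Codable {
    let Pattern: [PatternEntry]
}
struct PatternEntry: Codable {
    let Event: AHAPEvent?   // entries can also hold ParameterCurves, hence optional
}
struct AHAPEvent: Codable {
    let Time: Double
    let EventType: String
    let EventParameters: [EventParameter]?
}
struct EventParameter: Codable {
    let ParameterID: String
    let ParameterValue: Double
}

// Render each HapticTransient event as a short decaying sine burst and
// write the result to a Core Audio file. The sharpness-to-frequency
// mapping is a guess, not a documented property of the Taptic Engine.
func render(ahapURL: URL, to outputURL: URL) throws {
    let ahap = try JSONDecoder().decode(AHAP.self, from: Data(contentsOf: ahapURL))
    let transients = ahap.Pattern.compactMap { $0.Event }
        .filter { $0.EventType == "HapticTransient" }

    let sampleRate = 44_100.0
    let totalFrames = AVAudioFrameCount(((transients.map { $0.Time }.max() ?? 0) + 0.5) * sampleRate)
    let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)!
    let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: totalFrames)!
    buffer.frameLength = totalFrames
    let samples = buffer.floatChannelData![0]

    for event in transients {
        let params = Dictionary((event.EventParameters ?? []).map { ($0.ParameterID, $0.ParameterValue) },
                                uniquingKeysWith: { first, _ in first })
        let intensity = params["HapticIntensity"] ?? 1.0
        let frequency = 80.0 + (params["HapticSharpness"] ?? 0.5) * 150.0
        let start = Int(event.Time * sampleRate)
        for i in 0..<Int(0.08 * sampleRate) where start + i < Int(totalFrames) {
            let t = Double(i) / sampleRate
            // Exponentially decaying sine burst, ~80 ms long.
            samples[start + i] += Float(intensity * exp(-60.0 * t) * sin(2.0 * .pi * frequency * t))
        }
    }

    try AVAudioFile(forWriting: outputURL, settings: format.settings).write(from: buffer)
}

// Example: try render(ahapURL: URL(fileURLWithPath: "Boing.ahap"),
//                     to: URL(fileURLWithPath: "Boing.caf"))
```

That's nowhere near what Apple's examples sound like, of course, which is exactly why I'm asking how they did it.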
I can imagine some crude solutions: hacking together preexisting synthesizers, fudging a process to convert AHAP into MIDI instructions, and dialing in some synth settings to mimic the behaviour of an actuator. But I'm not too interested in going down that rabbit hole just yet; for illustration, the core mapping could be as small as the sketch below.
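A hypothetical version of that AHAP-to-MIDI mapping, where the pitch and velocity ranges are arbitrary guesses rather than anything principled:

```swift
// Hypothetical mapping of one haptic transient (intensity and sharpness
// both in 0...1, as in AHAP) to a MIDI note-on for a software synth.
func midiNoteOn(intensity: Double, sharpness: Double) -> (note: UInt8, velocity: UInt8) {
    let note = UInt8(36 + sharpness * 24)           // sharpness -> pitch, two octaves up from C2
    let velocity = UInt8(max(1.0, intensity * 127)) // intensity -> velocity, 1...127
    return (note, velocity)
}
```

The hard part wouldn't be the mapping, it would be dialing in a synth patch that actually sounds like an actuator, which is why I'd rather find out what tooling already exists.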
Any thoughts? I'm very curious what the process was for the WWDC videos and the audio examples in Apple's documentation...
Thank you!