Programmatic Haptics

I'm developing an iPhone app that uses on-device speech recognition. I'd like to trigger a 'buzz' to confirm a word was recognised without the user having to look at the screen. I can get haptics working from a button, but when I attach `sensoryFeedback` to any other view element it never fires, even though the trigger variable I toggle on successful voice commands is flipping correctly. Has anyone tried something similar with success, or have any ideas? Thanks very much.
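For reference, a minimal sketch of the setup described (view and property names are hypothetical). One thing worth checking: while an `AVAudioSession` is recording, the system silences haptics by default, and `setAllowHapticsAndSystemSoundsDuringRecording(_:)` (iOS 13+) is the documented opt-in, which may explain why the button works but feedback during recognition does not.

```swift
import SwiftUI
import AVFoundation

struct RecognizerView: View {
    // Hypothetical flag flipped when a voice command is recognised.
    @State private var commandRecognised = false

    var body: some View {
        Text("Listening…")
            // Fires a success haptic whenever the trigger value changes (iOS 17+).
            .sensoryFeedback(.success, trigger: commandRecognised)
            .onAppear {
                // Haptics are muted while the audio session is recording
                // unless the app opts in explicitly (iOS 13+).
                try? AVAudioSession.sharedInstance()
                    .setAllowHapticsAndSystemSoundsDuringRecording(true)
            }
    }
}
```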

I have found a workaround if anyone has the same issue. It seems to be something to do with the recording and the vibration competing for the same audio stream, and I cannot find any documentation on handling multiple streams correctly. Instead, I turn off voice recording, fire the buzz, and then reactivate voice recording (I had to introduce a short sleep before reactivating so the vibration could complete). A bit of a botch, but it works. If anyone has links to examples that do this elegantly, I'd be very grateful.
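A sketch of the stop/buzz/restart workaround described above, assuming a hypothetical `SpeechRecognizer` wrapper with `stop()`/`start()` methods; the 0.3 s delay is an illustrative value, not a documented requirement:

```swift
import UIKit

// Hypothetical wrapper around the app's speech-recognition session.
func confirmCommand(using recognizer: SpeechRecognizer) {
    recognizer.stop()  // release the audio session so the haptic can play

    let generator = UINotificationFeedbackGenerator()
    generator.prepare()
    generator.notificationOccurred(.success)

    // Give the vibration time to complete before recording resumes;
    // a dispatch delay avoids blocking the main thread with sleep().
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
        recognizer.start()
    }
}
```

Using `asyncAfter` rather than `Thread.sleep` keeps the UI responsive while waiting for the haptic to finish.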
