Sound and Haptics

Create meaningful and delightful experiences that engage a wider range of senses with sound and haptic design.

Posts under Sound and Haptics tag

19 Posts

Haptic feedback enabled but not working
I have an iPhone 14 running iOS 18 and noticed that vibration no longer works. The haptic feedback setting is set to "Always On," yet I get no vibration for notifications or incoming calls, and keyboard haptic feedback doesn't work either. I also noticed something else strange: in the ringtones menu, ringtones no longer play when I select them to preview them. I updated to iOS 18.1, but the problem persists with that version too. Could this be a hardware problem rather than a software one? Is anyone else having the same issue? Thanks
Replies: 1 · Boosts: 0 · Views: 214 · Activity: 4w
iOS 18 led to low volume
After installing iOS 18, my iPhone has consistently low volume. On top of that, while playing some media the volume suddenly drops even lower and then returns to its normal (already low) level on its own. I have tried several suggestions from the internet, such as a forced restart, restarting multiple times, and turning Vocal Shortcuts on and then off again, but nothing helped. Any suggestions? It's an iPhone 16 Pro Max, by the way.
Replies: 1 · Boosts: 0 · Views: 295 · Activity: Oct ’24
Creating CHHapticPatternPlayer with AudioCustom Event crashes
Hello everyone, I'm experiencing occasional crashes in my app related to the Core Haptics framework, specifically when creating haptic pattern players. The crashes are intermittent and not easily reproducible, so I'm hoping to get some guidance on what might be causing them. It seems to be connected to the audio resource I'm using within the AHAP file.

Setup: I use AVAudioSession and AVAudioEngine to record and play continuous audio. After activating the audio session and setting up the audio engine, I initialize the CHHapticEngine as follows:

```swift
let engine = try CHHapticEngine(audioSession: .sharedInstance())
...
try engine?.start()

// Recreate all haptic pattern players you had created.
let pattern = createPatternFromAHAP(Pattern.thinking.rawValue)!
thinkingHapticPlayer = try? engine?.makePlayer(with: pattern)
// Repeat for other players...
```

AHAP file:

```json
"Pattern": [
    ... haptic events
    {
        "Event": {
            "Time": 0.0,
            "EventType": "AudioCustom",
            "EventWaveformPath": "voice.chat.thinking.mp3",
            "EventParameters": [
                { "ParameterID": "AudioVolume", "ParameterValue": 0.7 }
            ]
        }
    }
]
```

I'm receiving the following crash report:

```
Crashed: com.apple.main-thread
EXC_BREAKPOINT 0x00000001ba525c68
0  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:].cold.1 + 104
1  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:].cold.1 + 104
2  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:] + 3784
3  CoreHaptics  -[CHHapticPattern resolveExternalResources:error:] + 388
4  CoreHaptics  -[PatternPlayer initWithPlayable:engine:privileged:error:] + 560
5  CoreHaptics  -[CHHapticEngine createPlayerWithPattern:error:] + 256
6  Mind         VoiceChatHapticEngine.swift - Line 170  VoiceChatHapticEngine.createThinkingHapticPlayer() + 170
```

Has anyone encountered similar crashes when working with CHHapticEngine and haptic patterns that contain an AudioCustom event? Thank you so much for your help. Ondrej
Replies: 2 · Boosts: 0 · Views: 300 · Activity: Sep ’24
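
Not an answer from Apple, but one way to surface the audio-resource failure in the post above as a thrown error, rather than a crash deep inside pattern resolution, is to register the waveform yourself with registerAudioResource(_:options:) and build the AudioCustom event in code instead of referencing EventWaveformPath from the AHAP file. A minimal sketch under that assumption; makeThinkingPlayer is a made-up helper name and the file name is simply the one from the post:

```swift
import Foundation
import CoreHaptics

/// Builds a player for a custom-audio haptic pattern, registering the
/// waveform explicitly so a missing or unreadable file throws up front.
func makeThinkingPlayer(on engine: CHHapticEngine) throws -> CHHapticPatternPlayer {
    // Resolve the audio file ourselves; a missing resource is caught here.
    guard let url = Bundle.main.url(forResource: "voice.chat.thinking", withExtension: "mp3") else {
        throw CocoaError(.fileNoSuchFile)
    }

    // Explicit registration returns an ID we can attach to an event.
    let resourceID = try engine.registerAudioResource(url, options: [:])

    let audioEvent = CHHapticEvent(
        audioResourceID: resourceID,
        parameters: [CHHapticEventParameter(parameterID: .audioVolume, value: 0.7)],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [audioEvent], parameters: [])
    return try engine.makePlayer(with: pattern)
}
```

It may also be worth recreating players (and re-registering audio resources) inside the engine's resetHandler, since a server reset invalidates previously registered resources.
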
Haptic to Audio (synthesize audio file from haptic patterns?)
Haptics are often represented as audio for presentation purposes by Apple in videos and learning resources. I am curious if:

...Apple has released, or is willing to release, any tools that may have been used to synthesize audio representing haptic patterns (such as in their WWDC19 audio-haptic presentation)?
...there are any current tools available that take haptic instructions as input (like AHAP) and output an audio file?
...there is some low-level access to the signal that drives the Taptic Engine, so that it can be repurposed as an audio stream?
...you have any other suggestions!

I can imagine some crude solutions that hack together preexisting synthesizers and fudge together a process to convert AHAP to MIDI instructions, dialing in some synth settings to mimic the behaviour of an actuator, but I'm not too interested in that rabbit hole just yet. Any thoughts? Very curious what the process was for the WWDC videos and the audio examples in their documentation... Thank you!
Replies: 2 · Boosts: 0 · Views: 293 · Activity: Sep ’24
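
No official haptic-to-audio tool seems to be documented, but since AHAP is plain JSON, one possible first step is to decode the event list and hand the times, durations, and intensities to whatever synthesizer you end up choosing. A rough sketch; the Codable types below are made up for this example, and only the documented AHAP keys (Pattern, Event, Time, EventType, EventDuration, EventParameters) are assumed:

```swift
import Foundation

// Minimal Codable model for the parts of an AHAP file relevant to synthesis.
struct AHAPFile: Decodable {
    let pattern: [AHAPEntry]
    enum CodingKeys: String, CodingKey { case pattern = "Pattern" }
}

struct AHAPEntry: Decodable {
    let event: AHAPEvent?
    enum CodingKeys: String, CodingKey { case event = "Event" }
}

struct AHAPEvent: Decodable {
    let time: Double
    let eventType: String
    let eventDuration: Double?
    let eventParameters: [AHAPParameter]?
    enum CodingKeys: String, CodingKey {
        case time = "Time", eventType = "EventType"
        case eventDuration = "EventDuration", eventParameters = "EventParameters"
    }
}

struct AHAPParameter: Decodable {
    let parameterID: String
    let parameterValue: Double
    enum CodingKeys: String, CodingKey {
        case parameterID = "ParameterID", parameterValue = "ParameterValue"
    }
}

// Print a rough "score" of the pattern: when each event starts, how long it
// lasts, and its haptic intensity, ready to feed into an audio synthesizer.
func describe(ahapURL: URL) throws {
    let file = try JSONDecoder().decode(AHAPFile.self, from: Data(contentsOf: ahapURL))
    for event in file.pattern.compactMap(\.event) {
        let intensity = event.eventParameters?
            .first(where: { $0.parameterID == "HapticIntensity" })?
            .parameterValue ?? 1.0
        print("\(event.eventType) at \(event.time)s, duration \(event.eventDuration ?? 0)s, intensity \(intensity)")
    }
}
```
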
Vision Pro system audio volume is very low after visionOS 2.0 beta 5 update
I updated my Vision Pro to the visionOS 2.0 beta yesterday, and now everything is very quiet even at max volume. I tested with the built-in speakers, Beats Pro, and AirPods Pro (2nd generation) as well, and the problem is the same with all of them. If I turn the volume down to 50%, you can't tell what audio is being played anymore. I tried restarting the headset and it makes no difference. Is there anything else I can try to resolve this issue?
Replies: 1 · Boosts: 1 · Views: 437 · Activity: Aug ’24
Reality Composer Pro - Audio not working for .USDZ
Based on information online, I'm under the impression that we can add spatial audio to USDZ files using Reality Composer Pro; however, I've been unable to hear this audio outside of the preview audio in the scene inspector. Attached is a screenshot showing how I've laid out the scene. I see the 3D object fine on mobile and Vision Pro, but I can't get the audio to loop. I have ensured the audio file is in the scene and linked as the resource for the spatial audio node. Am I setting this up wrong, is it broken, or is this simply not a feature that can be saved back to USDZ? In the following link they note their USDZ could "play an audio track while viewing the model", but the model isn't there anymore. Can someone confirm where I might be off, please?
Replies: 3 · Boosts: 0 · Views: 706 · Activity: Jul ’24
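
If the goal is simply a looping spatial sound on the model, a common fallback is to skip baking the audio into the USDZ and attach it in RealityKit code instead. A minimal sketch, assuming a model.usdz and a loop.mp3 in the app bundle (both file names are placeholders, and AudioFileResource.load(named:) may be flagged as deprecated in newer SDKs in favour of newer initializers):

```swift
import RealityKit

// Load a USDZ model and prepare a looping spatial audio track in code,
// rather than relying on the USDZ file to carry the audio behaviour.
func loadAudibleModel() throws -> (entity: Entity, audio: AudioFileResource) {
    let entity = try Entity.load(named: "model")      // model.usdz in the app bundle

    // Spatial, preloaded, looping audio resource.
    let audio = try AudioFileResource.load(
        named: "loop.mp3",
        inputMode: .spatial,
        loadingStrategy: .preload,
        shouldLoop: true
    )
    return (entity, audio)
}

// Usage (after the entity has been added to a RealityView or ARView scene):
//     let (entity, audio) = try loadAudibleModel()
//     content.add(entity)
//     entity.playAudio(audio)   // playback is positioned at the entity
```
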
AccessibilityUIServer has microphone locked
I just installed iOS 18 beta 3. I am seeing AccessibilityUIServer holding the microphone, and this is causing missing notification sounds, the inability to use Siri by voice, and a grayed-out volume control. If I start to play anything with sound, AccessibilityUIServer releases the microphone and I am able to use the app. Calls still work, since AccessibilityUIServer releases the microphone and the phone rings. The feedback ID is FB14241838.
Replies: 12 · Boosts: 9 · Views: 4.6k · Activity: Sep ’24
Programmatic Haptics
I'm developing an iPhone app that uses on-device speech recognition. I'd like to create a 'buzz' to confirm a word was recognised without having to look at the screen. I can get haptics working from a button, but when I try to attach sensoryFeedback to any other view element, it does not fire, despite the variable I toggle on successful voice commands flipping correctly. Has anyone tried a similar thing with success, or does anyone have any ideas? Thanks very much.
Replies: 1 · Boosts: 0 · Views: 394 · Activity: Jul ’24
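
Not a definitive diagnosis, but for comparison, here is a minimal sketch of attaching .sensoryFeedback(_:trigger:) (iOS 17+) to a non-Button view and driving it from state; recognizedWordCount is a hypothetical stand-in for whatever your recognizer updates. The modifier fires whenever the trigger value changes, so a counter is a convenient trigger:

```swift
import SwiftUI

struct RecognitionFeedbackView: View {
    // Incremented each time the recognizer matches a word (stubbed for this sketch).
    @State private var recognizedWordCount = 0

    var body: some View {
        VStack(spacing: 20) {
            Text("Words recognised: \(recognizedWordCount)")
                // Fires a haptic every time the trigger value changes.
                .sensoryFeedback(.success, trigger: recognizedWordCount)

            // Stand-in for the speech-recognition callback.
            Button("Simulate recognised word") {
                recognizedWordCount += 1
            }
        }
    }
}
```
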
Custom Apple Pencil Pro Haptics
I just got my hands on the Apple Pencil Pro and was looking forward to being able to add tactile feedback to my apps through the Pencil. Apple appears to have updated its docs to mention the possibility of haptic feedback via the Pencil, and makes vague references to a certain SwiftUI modifier, but doesn't give any Pencil-specific guidelines or code, or speak about its capabilities at all. I'd like to ask if it's possible to use Core Haptics to queue custom haptic feedback (as of now, the standard code which works on iPhone doesn't seem to work on an iPad with a paired Pencil Pro), or, if that's not possible, whether there are any updated resources or example code for triggering the predefined Pencil Pro haptics.
Replies: 2 · Boosts: 0 · Views: 880 · Activity: May ’24
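
As far as I can tell, custom Core Haptics patterns are not routed to the Pencil; the predefined Pencil Pro feedback goes through the view-associated feedback generators UIKit added alongside it (UICanvasFeedbackGenerator). A rough sketch under that assumption; treat the exact initializer and method names as something to verify against the current documentation:

```swift
import UIKit

final class SketchCanvasView: UIView {
    // Associating the generator with a view lets the system decide whether to
    // play the feedback on the device or on a paired Apple Pencil Pro.
    private lazy var feedback = UICanvasFeedbackGenerator(view: self)

    func userSnappedShapeToGuide(at location: CGPoint) {
        // Predefined "alignment" feedback, e.g. when a drawing snaps to a guide.
        feedback.alignmentOccurred(at: location)
    }

    func userCompletedPath(at location: CGPoint) {
        // Predefined "path completed" feedback, e.g. when a shape is closed.
        feedback.pathCompleted(at: location)
    }
}
```
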
UI Sound FX package for Vision Pro?
Hi, I noticed that several of the top Vision Pro apps, such as Disney+ and Max, have similar if not identical SFX for basic navigation actions like Select and Back. I was wondering if Apple has provided an audio SFX library as a resource, or if the similarity is coincidental. I'm not familiar with Apple providing anything like this in the past, but on a totally new platform where they might be trying to establish a paradigm, I figured it was worth looking into!
Replies: 0 · Boosts: 0 · Views: 791 · Activity: Mar ’24
Core Haptics Engine Player Destruction with Corrupted AHAP Files
Hello, I've encountered a concerning issue while working with Core Haptics which I believe warrants attention from the community and Apple's engineering team. When attempting to play a haptic pattern using the CHHapticEngine.playPattern(from:) method with a corrupted AHAP file path, I've observed unexpected behavior. Specifically, the player is immediately destroyed, triggering the notifyWhenPlayersFinished callback with a nil error before the catch block executes (the catch block is eventually executed). Furthermore, this behavior introduces ambiguity regarding the successful playback of haptic patterns, as corrupted files can produce false positives in error notifications. The current workaround is simple: AHAP files are JSON-formatted files at the end of the day, so just try to decode the data or create a Foundation object from the given JSON data. If this fails, the data must not be in the expected format (or the decoding strategy is off). That way you can guarantee the validity of an AHAP file before passing it to your player. Michaud Reyna, Senior iOS Engineer
Replies: 1 · Boosts: 0 · Views: 549 · Activity: Feb ’24
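
A minimal sketch of the workaround described above: confirm the file parses as JSON before handing its URL to the engine, so a corrupted AHAP is rejected up front. The function name and error handling are just for illustration:

```swift
import Foundation
import CoreHaptics

/// Plays an AHAP file only if it is at least well-formed JSON, so a corrupted
/// file fails here instead of surfacing as a misleading players-finished
/// notification from the engine.
func playValidatedPattern(at url: URL, on engine: CHHapticEngine) throws {
    let data = try Data(contentsOf: url)

    // AHAP is JSON; if this throws, the file is corrupt or not AHAP at all.
    _ = try JSONSerialization.jsonObject(with: data)

    // Safe to hand the path to Core Haptics now.
    try engine.playPattern(from: url)
}
```
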
Swift unable to find sound file
Hi everyone, I'm currently facing an issue with AVAudioPlayer in my SwiftUI project. Despite ensuring that the sound file "buttonsound.mp3" is properly added to the project's resources (I dragged and dropped it into Xcode), the application is still unable to locate the file when attempting to play it. Here's the simplified version of the code I'm using:

```swift
import SwiftUI
import AVFoundation

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play sound") {
                playSound(named: "buttonsound", ofType: "mp3")
            }
        }
    }
}

func playSound(named name: String, ofType type: String) {
    guard let soundURL = Bundle.main.url(forResource: name, withExtension: type) else {
        print("Sound file not found")
        return
    }
    do {
        let audioPlayer = try AVAudioPlayer(contentsOf: soundURL)
        audioPlayer.prepareToPlay()
        audioPlayer.play()
    } catch let error {
        print("Error playing sound: \(error.localizedDescription)")
    }
}
```
Replies: 7 · Boosts: 0 · Views: 1.5k · Activity: Feb ’24
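
Two separate problems tend to show up with code like the above: the file not being included in the target's bundle resources (which makes Bundle.main.url return nil), and the AVAudioPlayer being a local variable that can be deallocated before playback finishes. A minimal sketch of a retained-player variant; SoundPlayer is a name made up for this example:

```swift
import AVFoundation

// Keeps a strong reference to the player so it isn't deallocated mid-playback.
final class SoundPlayer {
    private var player: AVAudioPlayer?

    func play(named name: String, ofType type: String) {
        // Returns nil if the file is not part of the app target's bundle resources.
        guard let url = Bundle.main.url(forResource: name, withExtension: type) else {
            print("Sound file not found in bundle")
            return
        }
        do {
            let newPlayer = try AVAudioPlayer(contentsOf: url)
            newPlayer.prepareToPlay()
            newPlayer.play()
            player = newPlayer   // retain until the next sound replaces it
        } catch {
            print("Error playing sound: \(error.localizedDescription)")
        }
    }
}
```

In SwiftUI you would typically keep the SoundPlayer instance in a property that outlives the button tap rather than creating it inside the action.
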
Game not muted with ring/silent switch
Hi, I'm building a simple game for iOS that has background music. The ring/silent switch is not disabling sound when switched to silent. So far I'm testing on devices through TestFlight (still internal testing, not beta). Do I need to code this behaviour myself, or does iOS know it's a game and disable sound automatically? And/or would my game be rejected if the switch doesn't disable sound? (I have an internal setting to enable/disable sounds in the game.) Due to the way it's coded (it's a Capacitor app), I can't access the ring/silent switch to disable/enable sound. Thanks, this problem makes me feel like a preserved moose.
Replies: 0 · Boosts: 0 · Views: 599 · Activity: Feb ’24
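
Whether the ring/silent switch mutes your audio is governed by the AVAudioSession category rather than by the app being a game: the default .soloAmbient and the .ambient categories are silenced by the switch, while .playback keeps playing. A minimal native sketch; in a Capacitor app this would need to live in a small plugin or in the generated AppDelegate:

```swift
import AVFoundation

// Configure the shared audio session so game audio respects the ring/silent
// switch: .ambient (like the default .soloAmbient) is silenced by the switch,
// whereas .playback would keep playing regardless.
func configureGameAudioSession() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.ambient)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}
```
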
Lower Notification Sounds on iOS 17
I'm involved in the development of an iOS app for home security and alarm systems. Recently there has been a lot of negative feedback from customers about how low the notification sounds are since iOS 17. Much of the feedback centers on the inability to control the volume of the notification sounds. My question is: if our app uses custom notification sounds, are these impacted by the volume changes made in iOS 17? I know previous versions of iOS let you control the "Ringtone and Alert" volume in Settings (with a volume slider). Is this same control still available for custom notification sounds within our app?
Replies: 0 · Boosts: 0 · Views: 804 · Activity: Nov ’23
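
For context, custom notification sounds are attached per notification through UNNotificationSound, and as far as I know they play at the same ringer/alerts volume as system notification sounds, so the Settings slider behaviour applies to them as well. A minimal sketch with placeholder names for the sound file and identifiers:

```swift
import UserNotifications

// Schedule a local notification that uses a bundled custom sound.
// "alarm.caf" and the identifier are placeholders; the file must be in the
// app bundle and, per Apple's docs, 30 seconds or shorter.
func scheduleAlarmNotification() {
    let content = UNMutableNotificationContent()
    content.title = "Alarm triggered"
    content.body = "Motion detected at the front door."
    content.sound = UNNotificationSound(named: UNNotificationSoundName("alarm.caf"))

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
    let request = UNNotificationRequest(identifier: "alarm-demo", content: content, trigger: trigger)

    UNUserNotificationCenter.current().add(request) { error in
        if let error { print("Failed to schedule notification: \(error)") }
    }
}
```
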