Apple added support for WebKit speech recognition in Safari 14.1. We're trying to use it in our web app and are running into an issue: on iPhone and iPad the microphone never stops after the user stops speaking, and we never receive the recognized text.
Here is a simple web app to test with: https://oiyw7.csb.app/
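For reference, the setup looks roughly like the sketch below, using the prefixed webkitSpeechRecognition constructor that Safari exposes; the handler wiring and options are illustrative and may differ slightly from the linked test app:

```ts
// Minimal recognition setup (illustrative; details may differ from the test app).
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

const recognition = new SpeechRecognitionImpl();
recognition.lang = "en-US";
recognition.continuous = false;     // recognition should end after one utterance
recognition.interimResults = false;

recognition.onresult = (event: any) => {
  // We never receive this on iPhone/iPad.
  console.log("Recognized:", event.results[0][0].transcript);
};

recognition.onend = () => {
  // On iPhone/iPad the mic indicator stays on and this never fires
  // after the user stops speaking.
  console.log("Recognition ended");
};

recognition.start();
```

In terms of this sketch, the symptom on iPhone and iPad is that neither onresult nor onend ever fires after the user stops speaking.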
In our web app, we use an MQTT library that internally uses a WebSocket for communication. On iOS 14 and earlier this worked fine, but on iOS 15 and macOS Monterey the WebSocket breaks when we try to send the first message.
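Roughly, the connection looks like the sketch below (shown here with mqtt.js and a placeholder broker URL; the exact library and endpoint in our app differ):

```ts
import mqtt from "mqtt";

// "wss://broker.example.com:8084/mqtt" is a placeholder; the real endpoint differs.
const client = mqtt.connect("wss://broker.example.com:8084/mqtt");

client.on("connect", () => {
  // On iOS 15 / macOS Monterey the underlying WebSocket drops
  // as soon as this first publish goes out.
  client.publish("devices/status", "hello from the web app");
});

client.on("close", () => console.log("MQTT connection closed"));
client.on("error", (err) => console.error("MQTT error:", err));
```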
Below are some findings from our research:
There is an experimental feature named NSURLSession WebSocket in Safari. It is enabled by default on iOS 15 and macOS Monterey. If we disable it, WebSocket seems to work fine.
We have also noticed that when this experimental feature is enabled, it seems to impose a limit on the size of the message we can send over the socket connection. Messages of more than 84 bytes break the socket connection; messages of fewer than 84 bytes work fine.
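A plain-WebSocket sketch along these lines can be used to check the threshold independently of the MQTT library (the echo endpoint below is a placeholder):

```ts
// Plain WebSocket test for the payload-size threshold, independent of MQTT.
// "wss://echo.example.com" is a placeholder endpoint.
function testPayloadSize(size: number): void {
  const ws = new WebSocket("wss://echo.example.com");

  ws.onopen = () => {
    // "a" is one byte in UTF-8, so this payload is `size` bytes long.
    ws.send("a".repeat(size));
  };

  ws.onmessage = (event) => console.log(`echo for ${size} bytes:`, event.data);
  ws.onclose = (event) => console.log(`closed for ${size} bytes (code ${event.code})`);
  ws.onerror = () => console.error(`socket error for ${size} bytes`);
}

testPayloadSize(80);  // under 84 bytes: goes through
testPayloadSize(100); // over 84 bytes: connection breaks with NSURLSession WebSocket enabled
```

With the experimental feature disabled, both sizes should go through.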