In my application I use CallKit with supportsHolding = true. During a phone call, another call comes in (e.g. GSM). I accept the incoming call and put the current call on hold.
If I end the active call myself, everything is fine, and CallKit calls
provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession).
However, if the other party ends the call, the second call remains on hold. The user taps Unhold in the app, and I notify CallKit that the hold has ended.
But in this case didActivate is never called. If I try to activate the audio session myself after the unhold, I get the error:
Domain=NSOSStatusErrorDomain Code=561017449 "Session activation failed" UserInfo={NSLocalizedDescription=Session activation failed}
(Code 561017449 corresponds to AVAudioSessionErrorCodeInsufficientPriority.)
What needs to be done for CallKit to activate my audio?
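For reference, a minimal sketch of requesting the unhold through CXCallController so that CallKit, not the app, drives audio session activation (the function name and error handling are illustrative; CXSetHeldCallAction and CXTransaction are the real CallKit API):

    import CallKit

    // Ask the system to take the call off hold; CallKit should then
    // activate the audio session and call provider(_:didActivate:).
    func requestUnhold(callUUID: UUID, callController: CXCallController) {
        let unholdAction = CXSetHeldCallAction(call: callUUID, onHold: false)
        let transaction = CXTransaction(action: unholdAction)
        callController.request(transaction) { error in
            if let error = error {
                print("Unhold request failed: \(error)")
            }
        }
    }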
I'm using a UITextView and set its background color to .white.
Initially, the UITextView appears with a white background.
However, after I tap the UITextView and the cursor appears, the background color changes to .systemBackground.
I am unable to change this background color by any means.
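For context, the setup is essentially this (a minimal sketch; the view controller name is illustrative):

    import UIKit

    class NotesViewController: UIViewController {
        let textView = UITextView()

        override func viewDidLoad() {
            super.viewDidLoad()
            // Reverts to .systemBackground once editing begins.
            textView.backgroundColor = .white
            view.addSubview(textView)
        }
    }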
App Store submission rejected the build with these issues in the email:
ITMS-90738: Invalid purpose string value - The “NSPhotoLibraryUsageDescription” value for the NSPhotoLibraryUsageDescription key isn’t allowed in “.....”. Your app’s code references one or more APIs that access sensitive user data, or the app has one or more entitlements that permit such access. The Info.plist file for the “.....” bundle should contain a NSPhotoLibraryUsageDescription key with a user-facing purpose string explaining clearly and completely why your app needs the data.
ITMS-90738: Invalid purpose string value - The “NSCameraUsageDescription” value for the NSCameraUsageDescription key isn’t allowed in “....”. Your app’s code references one or more APIs that access sensitive user data, or the app has one or more entitlements that permit such access. The Info.plist file for the “.....” bundle should contain a NSCameraUsageDescription key with a user-facing purpose string explaining clearly and completely why your app needs the data.
The Info.plist file defines:
<key>NSCameraUsageDescription</key>
<string>NSCameraUsageDescription</string>
<key>NSFaceIDUsageDescription</key>
<string>NSFaceIDUsageDescription</string>
<key>NSMicrophoneUsageDescription</key>
<string>NSMicrophoneUsageDescription</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>NSPhotoLibraryAddUsageDescription</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>NSPhotoLibraryUsageDescription</string>
and InfoPlist.strings has these keys:
"NSCameraUsageDescription"="xxxxx app uses camera to take photos or videos which should be sent via secure message.";
"NSMicrophoneUsageDescription"="xxxxx app uses microphone to make secure voice communication.";
"NSPhotoLibraryUsageDescription"="xxxxx app needs access to Photo library to attach your images or videos.";
"NSPhotoLibraryAddUsageDescription" = "xxxxx app needs access to Photo library to save your images or videos.";
"NSFaceIDUsageDescription"="Enabling Face ID allows quick and secure access to your xxxx app account.";
Is some other setting needed to resolve these issues?
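For comparison, a sketch of what the validation message appears to expect in Info.plist itself: an actual user-facing sentence rather than the key name as a placeholder (the wording below is illustrative):

    <key>NSCameraUsageDescription</key>
    <string>This app uses the camera to take photos and videos for secure messages.</string>
    <key>NSPhotoLibraryUsageDescription</key>
    <string>This app needs access to your photo library to attach images and videos.</string>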
I want to implement screen recording similar to Teams, WhatsApp, etc.
I created an Upload Broadcast Extension.
If I start the extension, both audio and video start being recorded (without CallKit).
Issue:
During an ongoing call (CallKit), the extension starts but does not record.
broadcastStarted(withSetupInfo:) is called, but processSampleBuffer(_:with:) is not.
The extension stops after a short while.
How do I set up ReplayKit to work together with CallKit?
Is it possible that the CallKit audio setup prevents ReplayKit from recording?
Is it possible to turn off audio in ReplayKit and request only video?
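For reference, the extension's handler has the standard RPBroadcastSampleHandler shape; a minimal sketch is below. As far as I know ReplayKit has no switch to disable audio capture, so dropping the audio sample types in the handler is one way to approximate "video only":

    import ReplayKit
    import CoreMedia

    class SampleHandler: RPBroadcastSampleHandler {
        override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
            // Called in both scenarios; with an active CallKit call,
            // processSampleBuffer(_:with:) never follows.
        }

        override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                          with sampleBufferType: RPSampleBufferType) {
            switch sampleBufferType {
            case .video:
                // Forward video frames to the upload pipeline.
                break
            case .audioApp, .audioMic:
                // Drop audio rather than trying to disable capture.
                break
            @unknown default:
                break
            }
        }
    }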
When I make a call within our VoIP application (iPadOS app running on macOS, M1 MacBook Pro 16"), all is fine.
If I make a call with headphones plugged in, all is fine.
But if I unplug the headphones during the call, the audio stops working entirely.
If I hang up and make the call again, the audio is back with no problems.
On iPhone and iPad it works correctly.
Where could the problem be? The console shows:
HALC_ShellDevice::CreateIOContextDescription: failed to get a description from the server
HALC_ProxyIOContext::IOWorkLoop: the server failed to start, Error: 0x6E6F7065
HALC_ProxyIOContext::IOWorkLoop: the server failed to start, Error: 0x6E6F7065
HALC_ProxyIOContext::IOWorkLoop: the server failed to start, Error: 0x6E6F7065
AudioObjectGetPropertyDataSize: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 0
[auvp] AUVPAggregate.cpp:4413 Failed to get current tap stream physical format, err=2003332927
AudioObjectGetPropertyDataSize: no object with given ID 66
AudioObjectGetPropertyData: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
AudioObjectRemovePropertyListener: no object with given ID 66
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectGetPropertyDataSize: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
HALC_ProxySystem::GetObjectInfo: got an error from the server, Error: 560947818 (!obj)
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
HALC_ShellObject::HasProperty: there is no proxy object
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6912 error 2003332927 getting input sample rate
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6922 error 2003332927 getting input latency
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6932 error 2003332927 getting input safety offset
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6944 error 2003332927 getting tap stream input latency
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6954 error 2003332927 getting tap stream input safety offset
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6965 error 2003332927 getting output sample rate
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6975 error 2003332927 getting output latency
AudioObjectHasProperty: no object with given ID 66
AudioDeviceDuck: no device with given ID
[auvp] AUVPAggregate.cpp:6985 error 2003332927 getting output safety offset
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectsPublishedAndDied: no such owning object
AudioObjectsPublishedAndDied: no such owning object
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6912 error 2003332927 getting input sample rate
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6922 error 2003332927 getting input latency
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6932 error 2003332927 getting input safety offset
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6944 error 2003332927 getting tap stream input latency
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6954 error 2003332927 getting tap stream input safety offset
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPAggregate.cpp:6965 error 2003332927 getting output sample rate
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6975 error 2003332927 getting output latency
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:6985 error 2003332927 getting output safety offset
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectPropertiesChanged: no such object
[auvp] AUVPAggregate.cpp:2799 AggCompChanged wait failed
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectSetPropertyData: no object with given ID 73
[auvp] AUVPUtilities.cpp:472 SetDeviceMuteState(73) false: (err=560947818)
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
[auvp] AUVPUtilities.cpp:560 SetCFNumberValueForKeyInDescriptionDictionary(73); doesn't support 'cdes'
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectHasProperty: no object with given ID 73
AudioObjectGetPropertyDataSize: no object with given ID 73
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPUtilities.cpp:560 SetCFNumberValueForKeyInDescriptionDictionary(66); doesn't support 'cdes'
AudioObjectHasProperty: no object with given ID 66
AudioObjectHasProperty: no object with given ID 66
[auvp] AUVPAggregate.cpp:3523 VP block error num input channels is unexpected (err=-66784)
[vp] vpStrategyManager.mm:358 Error code 2003332927 reported at GetPropertyInfo
[vp] vpStrategyManager.mm:358 Error code 2003332927 reported at GetPropertyInfo
HALC_ProxySystem::GetObjectInfo: got an error from the server, Error: 560947818 (!obj)
HALC_ShellDevice::RebuildStreamLists: there is no device
[vp] vpStrategyManager.mm:358 Error code 2003332927 reported at GetPropertyInfo
[vp] vpStrategyManager.mm:358 Error code 2003332927 reported at GetPropertyInfo
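Not an answer, but for anyone debugging the same thing: one hedged workaround sketch is to listen for the audio route change and restart the audio I/O when the old device disappears. The notification and reason key below are the real AVFoundation API; the restart hook is a hypothetical callback into the app's own audio engine:

    import AVFoundation

    // Observe route changes; unplugging headphones reports the reason
    // .oldDeviceUnavailable, a natural point to restart the voice I/O.
    func observeRouteChanges(restartAudio: @escaping () -> Void) -> NSObjectProtocol {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil,
            queue: .main
        ) { note in
            guard
                let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
                let reason = AVAudioSession.RouteChangeReason(rawValue: raw),
                reason == .oldDeviceUnavailable
            else { return }
            restartAudio()  // hypothetical hook: tear down and restart audio I/O
        }
    }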
Sending a PDF file via AirDrop gives me a list of apps to open the file in.
But when sending an image (a JPG, for example), it goes to the photo library by default.
Is there a way to force the system to offer the JPG to my app?
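For context, a sketch of the Info.plist fragment an app would use to claim JPEG files as a document type (values are illustrative; AirDrop may still route public.image content straight to Photos regardless):

    <key>CFBundleDocumentTypes</key>
    <array>
        <dict>
            <key>CFBundleTypeName</key>
            <string>Image</string>
            <key>LSHandlerRank</key>
            <string>Alternate</string>
            <key>LSItemContentTypes</key>
            <array>
                <string>public.jpeg</string>
            </array>
        </dict>
    </array>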
What is the right way to create content for Apple TV with Dolby Vision support?
I have video content, and we are preparing a VOD server.
We also want to support Apple TV, but we don't know how to prepare the content properly for it.
Any suggestions?
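In case it helps the discussion, the delivery side is normally an HLS master playlist whose variants advertise the Dolby Vision codec string and video range. A sketch, assuming Dolby Vision profile 5 (codec string, bandwidths, and file names are illustrative):

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=12000000,RESOLUTION=3840x2160,CODECS="dvh1.05.06",VIDEO-RANGE=PQ
    dolby_vision_2160p.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028",VIDEO-RANGE=SDR
    sdr_1080p.m3u8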
Hi,
I'm creating a communication CarPlay app and use CPListTemplate for the contact list.
My contact list has more than 24 items, but CPListTemplate only allows 24 rows.
The system reports:
CPListTemplate.maximumItemCount: 24
CPListTemplate.maximumSectionCount: 50
How can I disable this limit, just like in Apple's own Phone app?
Thx
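As far as I know the limit itself cannot be raised by third-party apps; a sketch of working within it by paging the contacts, using the real CPListTemplate.maximumItemCount property (the paging scheme itself is illustrative, e.g. the last row of each page could push the next page):

    import CarPlay

    // Split contacts into pages that respect the system cap, leaving
    // one row free per page for a "More…" item.
    func makeContactPages(names: [String]) -> [[CPListItem]] {
        let pageSize = CPListTemplate.maximumItemCount - 1
        return stride(from: 0, to: names.count, by: pageSize).map { start in
            let slice = names[start..<min(start + pageSize, names.count)]
            return slice.map { CPListItem(text: $0, detailText: nil) }
        }
    }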
MyApp does not show up in CarPlay.
MyApp.entitlements:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.developer.carplay-calling</key>
<true/>
<key>com.apple.developer.carplay-communication</key>
<true/>
<key>com.apple.developer.siri</key>
<true/>
</dict>
</plist>
Info.plist:
<key>UIApplicationSceneManifest</key>
<dict>
<key>UIApplicationSupportsMultipleScenes</key>
<true/>
<key>UISceneConfigurations</key>
<dict>
<key>CPTemplateApplicationSceneSessionRoleApplication</key>
<array>
<dict>
<key>UISceneClassName</key>
<string>CPTemplateApplicationScene</string>
<key>UISceneConfigurationName</key>
<string>TemplateSceneConfiguration</string>
<key>UISceneDelegateClassName</key>
<string>$(PRODUCT_MODULE_NAME).CarPlaySceneDelegate</string>
</dict>
</array>
<key>UIWindowSceneSessionRoleApplication</key>
<array>
<dict>
<key>UISceneConfigurationName</key>
<string>Default Configuration</string>
<key>UISceneDelegateClassName</key>
<string>$(PRODUCT_MODULE_NAME).SceneDelegate</string>
<key>UISceneStoryboardFile</key>
<string>Main</string>
</dict>
</array>
</dict>
</dict>
This code is never called:
import CarPlay

class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
    var interfaceController: CPInterfaceController?

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        self.interfaceController = interfaceController
        let contact = CPContact(name: "TEST", image: UIImage(systemName: "person")!)
        let template = CPContactTemplate(contact: contact)
        interfaceController.setRootTemplate(template, animated: false, completion: nil)
    }

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didDisconnectInterfaceController interfaceController: CPInterfaceController) {
        self.interfaceController = nil
    }
}
I have a valid provisioning profile.
What am I doing wrong, please?
Thank you
I am working on a VoIP application and want to integrate CarPlay, which requires Siri integration.
Is it possible to integrate an in-app intent handler for making a call?
Is it possible to disable CarPlay for iOS 13 and older? If so, how?
Hello everyone,
In our application we are using CallKit for VoIP calls, and now we need to handle the action when the Play/Pause button on a wired headset is pressed. But whenever this button is pressed, CallKit ends the call (the default system action). Is there any way to disable this default action, or to bypass it somehow, to prevent this unwanted "call end" from CallKit?
Thank you
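For anyone investigating this, one sketch of an approach is to claim the remote play/pause command explicitly via MPRemoteCommandCenter (a real MediaPlayer API); whether CallKit still intercepts the headset button during an active call is exactly the open question here, and the handler body is illustrative:

    import MediaPlayer

    // Claim the headset play/pause button via the remote command center.
    func registerPlayPauseHandler(onToggle: @escaping () -> Void) {
        let center = MPRemoteCommandCenter.shared()
        center.togglePlayPauseCommand.isEnabled = true
        _ = center.togglePlayPauseCommand.addTarget { _ in
            onToggle()  // e.g. toggle mute instead of ending the call
            return .success
        }
    }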
Hello everyone,
UserDefaults are not deleted after uninstalling the application, but only on iOS 13; on older iOS versions they are deleted. Is some new setting needed for this?
Thank you!
Hi,
Is it allowed to combine "content-available": 1, "apns-push-type": "alert", and "apns-priority": "10" in one push notification? It works now (it shows an alert to the user and delivers content to the application), but I am not sure whether this configuration is correct according to Apple's rules.
THX
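For concreteness, a sketch of the combination being described, with the headers shown separately from the payload (values illustrative):

    headers:
      apns-push-type: alert
      apns-priority: 10

    payload:
    {
      "aps": {
        "alert": { "title": "New message" },
        "content-available": 1
      }
    }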
Hi everybody,
Can you advise me how to unregister from VoIP notifications?
THX
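A sketch of the usual client-side approach, using the real PKPushRegistry API: clearing desiredPushTypes stops VoIP push delivery for that registry instance (whether the token must also be invalidated server-side is part of the question):

    import PushKit

    // Stop receiving VoIP pushes by clearing the desired push types.
    func unregisterFromVoIPPushes(registry: PKPushRegistry) {
        registry.desiredPushTypes = []
    }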