Posts

Post not yet marked as solved
1 Reply
208 Views
Is there any API to check which microphone mode is active for my macOS application? There is an API to check the microphone mode for an AVCaptureDevice, but the status bar (Control Center) lets the user select a Microphone Mode for an application that reads microphone audio, not for the microphone itself.
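For anyone hitting the same question: on macOS 12 and later, AVCaptureDevice exposes class properties for the system-wide microphone mode selection. A minimal sketch (availability checks shown; the helper name is illustrative):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: query the microphone mode selection (macOS 12+).
// preferredMicrophoneMode is what the user picked in Control Center;
// activeMicrophoneMode is what is actually in effect for the capture
// device currently in use. Both are read-only class properties.
static void logMicrophoneMode(void) {
    if (@available(macOS 12.0, *)) {
        AVCaptureMicrophoneMode preferred = AVCaptureDevice.preferredMicrophoneMode;
        AVCaptureMicrophoneMode active    = AVCaptureDevice.activeMicrophoneMode;
        NSLog(@"preferred mode = %ld, active mode = %ld",
              (long)preferred, (long)active);
    }
}
```

Note that these reflect the user's per-app Control Center choice while that app is capturing; there is no setter, matching the behaviour described in the post.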
Posted by Deepa Pai.
Post not yet marked as solved
2 Replies
658 Views
My macOS application is built for 64-bit Intel and arm64. Does it run on a 32-bit machine? I am asking because macOS itself was 64-bit even while it still supported running 32-bit apps (macOS <= 10.14). So does macOS translate 64-bit instructions to 32-bit, allowing a user to run a 64-bit app on a 32-bit machine (even if slower)? Thank you. Deepa
Posted by Deepa Pai.
Post not yet marked as solved
0 Replies
848 Views
We have a macOS application which is menu-bar only. The metrics introduced in App Store Connect in 2021 do not seem to track sessions of menu-bar applications. I also found this article that mentions the same: https://fleetingpixels.com/blog/2021/6/12/app-analytics-usage-data-for-mac-menu-bar-applications My queries: 1) Some sessions are nevertheless recorded for the same menu-bar application, say 12, 1, etc. I am not sure how this could happen; could anyone please let me know what the reason might be? 2) Does App Store Connect have any plans to support session metrics for menu-bar applications as well?
Posted by Deepa Pai.
Post not yet marked as solved
12 Replies
5.2k Views
Hi, my Mac application update is under development. For testing purposes, I have code signed my application using "Apple Development", but the team is not able to launch the application. They get the error: "***" is damaged and can't be opened. Delete "***" and download it again from the App Store. I am archiving the application using Xcode 13 on Monterey 11.01. I remember it was working a few days back. The certificate is valid and expires on Dec 17th. Since I am using the Push Notification service, I have used a provisioning profile that includes this certificate, and the team's systems have also been added to it. Could someone help me out? Thank you.
Posted by Deepa Pai.
Post not yet marked as solved
0 Replies
510 Views
I am creating an AVCaptureDeviceInput using an audio driver (a user-land driver) which has 6 channels (5.1). The audio driver captures the system's audio. I am creating an AVCaptureAudioDataOutput with a stream description of 2 channels. Now I add the AVCaptureDeviceInput and the AVCaptureAudioDataOutput to an AVCaptureSession and write the sample buffers from the AVCaptureAudioDataOutput to a file. I play a 5.1 file on my system and my sample app records it. The recorded audio has 2 channels, as per the stream description, with all 5.1 channels mixed down into the stereo file (e.g. Left Front and Left Rear in Left; Right Front and Right Rear in Right). My query is: who handles the mixing here? Thank you.
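The setup described above can be sketched roughly as follows (device lookup, delegate wiring, and error handling elided; the sample rate and bit depth are illustrative assumptions, not taken from the post):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: capture from a 6-channel (5.1) input device but request
// 2-channel PCM from the audio data output. The downmix the post
// asks about happens between these two points, inside the capture
// pipeline, before buffers reach the delegate.
static AVCaptureSession *makeStereoCaptureSession(AVCaptureDevice *sixChannelDevice) {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:sixChannelDevice error:&error];
    if (input && [session canAddInput:input]) {
        [session addInput:input];
    }

    AVCaptureAudioDataOutput *output = [[AVCaptureAudioDataOutput alloc] init];
    // macOS-only: set a 2-channel stream description on the output.
    output.audioSettings = @{
        AVFormatIDKey:          @(kAudioFormatLinearPCM),
        AVNumberOfChannelsKey:  @2,
        AVSampleRateKey:        @44100.0,   // assumed
        AVLinearPCMBitDepthKey: @16,        // assumed
        AVLinearPCMIsFloatKey:  @NO,
    };
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }
    return session;
}
```

My understanding (hedged, since the documentation does not spell this out) is that when the output's channel count differs from the device format, the AVFoundation/Core Audio capture pipeline performs the channel mapping before delivering sample buffers, rather than the driver itself.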
Posted by Deepa Pai.
Post not yet marked as solved
6 Replies
3.6k Views
Hi, I have a macOS application, say Main.app, which uses a helper app with UI support, say Helper.app. Helper.app is placed inside Main.app/Contents/Library/LoginItems/Helper.app. Helper.app is launched using NSWorkspace, and when the user opts to launch Helper.app on login, SMLoginItemSetEnabled is turned ON for Helper.app. Main.app and Helper.app communicate with each other via NSConnection. The helper app supports a set of features based on some condition, and the same condition is used to validate a feature in Main.app; hence, Main.app talks to Helper.app to check whether a feature can be validated. My questions: 1) Can someone write their own version of Helper.app with the same bundle identifier as my Helper.app and expose a connection with the same name (while my Main.app and Helper.app are not running)? 2) When Main.app is then launched, would it get connected to the 3rd-party Helper.app? Consider two configurations: (a) Main.app and Helper.app are both sandboxed and belong to the same app group; (b) Main.app and Helper.app are both not sandboxed (but hardened runtime is enabled) and no groups are defined. Thank you. Regards, Deepa
Posted by Deepa Pai.
Post not yet marked as solved
1 Reply
960 Views
My previous release version's CFBundleVersion is 301.3.12010 and its CFBundleShortVersionString is 1.3.12. Now I need to provide an update where CFBundleVersion is 301.3.13061 and CFBundleShortVersionString is 1.3.13. If I upload the build using the Xcode Organizer, it uploads and validates successfully, but the build number shown is 1.3.13 (302); earlier it was showing 1.3.12 (301.3.12010). I checked my Info.plist, and nowhere have I defined 302. I don't know why App Store Connect still shows 302. Could someone help solve this issue? Thank you. Deepa
Posted by Deepa Pai.
Post not yet marked as solved
5 Replies
1.4k Views
Hello, I have a Mac application which accesses sound input. Hardened runtime has been enabled for this app with Audio Input turned ON; the sandbox is disabled. Info.plist has the entry "Privacy - Microphone Usage Description" with a proper description. But a few of our users have started reporting that they get the following crash on Catalina. My app was built using Xcode 12.5 on Big Sur. Crash:
Dispatch queue: com.apple.root.default-qos
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Exception Note: EXC_CORPSE_NOTIFY
Termination Reason: Namespace TCC, Code 0x0
I have asked them to try resetting the microphone permission using tccutil, but they are still facing this crash. How do I trace why it is crashing, and why it has been crashing only for a few users, say 5-6? Thank you. Regards, Deepa
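One thing worth ruling out (a hedged suggestion, not a confirmed fix for this crash): explicitly request microphone access and wait for the answer before the app touches the audio device, instead of relying on the implicit prompt. A Termination Reason in the TCC namespace generally means the process accessed a protected resource without valid consent at that moment. A sketch:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: request microphone permission up front and only start
// capturing once it has been granted. startCapture is a placeholder
// for whatever begins the app's audio work.
static void startCaptureWhenAuthorized(void (^startCapture)(void)) {
    switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio]) {
        case AVAuthorizationStatusAuthorized:
            startCapture();
            break;
        case AVAuthorizationStatusNotDetermined:
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio
                                     completionHandler:^(BOOL granted) {
                if (granted) {
                    dispatch_async(dispatch_get_main_queue(), startCapture);
                }
            }];
            break;
        default:
            // Denied or restricted: do not touch the device at all.
            break;
    }
}
```

If the affected users' crash logs still show the TCC abort after this, comparing their TCC state (e.g. after `tccutil reset Microphone` plus a reboot) against a working machine would be the next step.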
Posted by Deepa Pai.
Post not yet marked as solved
0 Replies
731 Views
Hello, we develop a music-based application and most of our users are US based. We are getting enquiries from these users about issues/support related to Hi-Res Lossless Apple Music in our application, but we are not able to verify them, as Hi-Res Lossless Apple Music is not available to Indian accounts at present. Is there any way developers can enable Hi-Res Lossless music so that we can verify and solve the issues? We may not be able to wait until Hi-Res Lossless Apple Music is released on the Indian store. Thank you. Regards, Deepa
Posted by Deepa Pai.
Post not yet marked as solved
8 Replies
1.5k Views
I have an AppleScript added to my project and copied to the bundle resources. I have set OSACOMPILE_EXECUTE_ONLY to YES under Build Settings. I compile my project, open the application bundle, and navigate to myAppleScript.scpt under the Resources folder. I am still able to open myAppleScript.scpt in Script Editor and view its source. As per the documentation, OSACOMPILE_EXECUTE_ONLY should make the script execute-only, so that the original source cannot be viewed in Script Editor.
Posted by Deepa Pai.
Post not yet marked as solved
0 Replies
600 Views
Hi, what is the gain range supported for the parameters of kAudioUnitSubType_GraphicEQ? I could not find documentation for it. A documentation link along with the gain range would be appreciated. Thank you. Regards, Deepa
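Since the range does not seem to be documented, one way to get an authoritative answer is to ask the audio unit itself via kAudioUnitProperty_ParameterInfo. A sketch (error handling elided; I am assuming the band-gain parameters are numbered from 0 and share one range, which should be confirmed against what the unit actually reports):

```objc
#import <AudioToolbox/AudioToolbox.h>

// Sketch: instantiate the GraphicEQ effect and read the min/max
// gain of its first band parameter from the unit itself.
static void printGraphicEQGainRange(void) {
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Effect,
        .componentSubType      = kAudioUnitSubType_GraphicEQ,
        .componentManufacturer = kAudioUnitManufacturer_Apple,
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit eq = NULL;
    if (comp && AudioComponentInstanceNew(comp, &eq) == noErr) {
        AudioUnitParameterInfo info = {0};
        UInt32 size = sizeof(info);
        // Element 0 is assumed to be the first EQ band's gain parameter.
        if (AudioUnitGetProperty(eq, kAudioUnitProperty_ParameterInfo,
                                 kAudioUnitScope_Global, 0,
                                 &info, &size) == noErr) {
            printf("band gain range: %f .. %f (unit %u)\n",
                   info.minValue, info.maxValue, (unsigned)info.unit);
        }
        AudioComponentInstanceDispose(eq);
    }
}
```

The same query, looped over the parameter IDs returned by kAudioUnitProperty_ParameterList, would enumerate every band's range rather than assuming they match.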
Posted by Deepa Pai.
Post not yet marked as solved
0 Replies
2.3k Views
As per the documentation, we can use AudioChannelDescription.mCoordinates to set the speaker position. Following is my code snippet:

    OSStatus status = noErr;
    AudioObjectPropertyAddress propertyAddress;
    propertyAddress.mSelector = kAudioDevicePropertyPreferredChannelLayout;
    propertyAddress.mScope    = kAudioDevicePropertyScopeOutput;
    propertyAddress.mElement  = kAudioObjectPropertyElementMaster;
    if (AudioObjectHasProperty(self.mID, &propertyAddress)) {
        UInt32 propSize = 0;
        AudioObjectGetPropertyDataSize(self.mID, &propertyAddress, 0, NULL, &propSize);
        AudioChannelLayout *layout = (AudioChannelLayout *)malloc(propSize);
        AudioChannelLabel labels[2] = {kAudioChannelLabel_Right, kAudioChannelLabel_Left};
        layout->mNumberChannelDescriptions = 2;
        layout->mChannelLayoutTag = kAudioChannelLayoutTag_UseChannelDescriptions;
        layout->mChannelBitmap = 0;
        for (UInt32 i = 0; i < layout->mNumberChannelDescriptions; i++) {
            layout->mChannelDescriptions[i].mChannelLabel = labels[i];
            layout->mChannelDescriptions[i].mChannelFlags =
                kAudioChannelFlags_SphericalCoordinates | kAudioChannelFlags_Meters;
            layout->mChannelDescriptions[i].mCoordinates[kAudioChannelCoordinates_Distance]  = sender.doubleValue;
            layout->mChannelDescriptions[i].mCoordinates[kAudioChannelCoordinates_Azimuth]   = (i == 0) ? 90 : -90;
            layout->mChannelDescriptions[i].mCoordinates[kAudioChannelCoordinates_Elevation] = 0;
        }
        status = AudioObjectSetPropertyData(self.mID, &propertyAddress, 0, NULL, propSize, layout);
        if (status != noErr) {
            NSLog(@"setPosition cannot set");
        }
        free(layout); // release the malloc'd layout
    }

The API doesn't return any error, but there is no change in the listening experience. Even if I set the right speaker at distance 'x' (far), it doesn't give the experience of listening from a far speaker. What might be wrong here? Did I understand the speaker positioning concept wrong? Thanks. Deepa
Posted by Deepa Pai.