Posts

Post not yet marked as solved
1 Reply
990 Views
Hello,

In my iPadOS app I need to apply AUv3 effects to individual input channels. However, AVAudioEngine's inputNode delivers a single output bus that carries all input channels. How can I separate these input channels? For example, an audio interface with 6 input channels yields an inputNode with 6 channels on both its input and output bus. How can I apply an effect to channel 1 only?

AVAudioEngine only lets me connect buses between audio units, and AVAudioConnectionPoint is merely an auxiliary class for establishing one-to-many routing. It seems I cannot achieve my goal with the V3 API. What other solutions are there? Use the V2 API and install a callback... but exactly where? Or is there some kind of audio format conversion available for this?
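For reference, a minimal sketch of one possible workaround: install a tap on the multichannel input bus and copy a single channel into a mono buffer that a separate effect chain can consume. The variable `engine` and the idea of handing the mono buffer to a player node are assumptions, not something confirmed by the post.

```swift
import AVFoundation

// Hypothetical sketch: extract "channel 1" from the multichannel input bus
// by tapping inputNode and copying that channel into a mono buffer.
let input = engine.inputNode                              // `engine` assumed to exist
let hwFormat = input.inputFormat(forBus: 0)               // e.g. 6 channels
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: hwFormat.sampleRate,
                               channels: 1)!

input.installTap(onBus: 0, bufferSize: 4096, format: hwFormat) { buffer, _ in
    guard let src = buffer.floatChannelData,
          let mono = AVAudioPCMBuffer(pcmFormat: monoFormat,
                                      frameCapacity: buffer.frameLength) else { return }
    mono.frameLength = buffer.frameLength
    // Copy only channel 0 (the interface's first input).
    memcpy(mono.floatChannelData![0], src[0],
           Int(buffer.frameLength) * MemoryLayout<Float>.size)
    // Hand `mono` to a separate chain here, e.g. schedule it on an
    // AVAudioPlayerNode that feeds the AUv3 effect.
}
```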
Posted by Biochili. Last updated.
Post not yet marked as solved
0 Replies
847 Views
Hello,

A bit late, but now that I have learned about on-demand resources I see that the total hosted content is limited to 20 GB. Does this limit also apply when we have an Enterprise Developer Program membership and can host the content on our own server? Is there any other way to host more content?

My customer wants to release special musical albums of roughly 800 MB each, with an ever-increasing number of albums over the years, so the 20 GB limit would be reached after about 25 albums. With growing device storage sizes I would expect this limit to grow as well.

I have considered releasing one app per album, but that results in a large number of apps, and fixing bugs across all of them is expensive. Alternatively, I could develop my own content-downloading mechanism that fetches resources from my own server, but that involves server-side coding because content should only be accessible once purchased. I would prefer to use on-demand resources where possible.
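For context, a minimal sketch of how one album could be fetched with on-demand resources, assuming the album's files carry a hypothetical ODR tag such as "album-01":

```swift
import Foundation

// "album-01" is a hypothetical on-demand resource tag; one tag per album.
let request = NSBundleResourceRequest(tags: ["album-01"])
request.beginAccessingResources { error in
    if let error = error {
        print("On-demand resource download failed: \(error)")
        return
    }
    // The tagged files are now accessible through Bundle.main.
    // Keep `request` strongly referenced while the album is in use, then
    // call request.endAccessingResources() so the system can reclaim it.
}
```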
Posted by Biochili. Last updated.
Post not yet marked as solved
2 Replies
1.4k Views
Hello,

For my customer I have been maintaining a universal iOS app for 4 years now, and in summer 2020 we added 2 non-renewing subscriptions. The problem with one of the two subscriptions is that it works in the sandbox environment, but its product (SKProduct) is NOT available in the production environment, i.e. when released on the App Store. We can tell because the app shows a disabled buy button when SKProductsRequest reports the product as invalid. In the sandbox environment (when started from Xcode, or via TestFlight at my customer and our beta testers) the button is enabled, i.e. the SKProduct is valid. So we know this is not a simple Release vs. Debug build issue. The corresponding IAP is approved in App Store Connect, and we have a total of 61 approved IAPs.

What could be wrong with this subscription? Its price is tier 80 = 449,99€, which is the only thing that sets it apart from the other IAPs priced at 49,99€.

We have contacted Apple Developer Support many times and usually receive the routine response that the issue has been re-escalated. It has been ongoing since August 2020, and we are losing user confidence. Has anyone ever seen such a problem? If anyone from Apple reads this, please get in touch if you think you can help. I don't know what else we can do to have someone at Apple look into this. I suspect Developer Support does not want to spend time on it because it is a non-standard problem that requires someone to dig deep into log files. It is not hard to reproduce: download our app for free and go to the purchases tab. Developer Support could simply do that, reproduce the problem, and then check the logs. All we can do is raise issues with Developer Support or App Store Support, but what if they just let them starve? We feel left alone on this.
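For reference, a minimal sketch of the kind of check described above, using a hypothetical product identifier. In sandbox the identifier comes back in products; in production the affected subscription instead appears in invalidProductIdentifiers, which is why the buy button is disabled.

```swift
import StoreKit

final class SubscriptionProductCheck: NSObject, SKProductsRequestDelegate {
    // Hypothetical identifier standing in for the tier-80 non-renewing subscription.
    private let request = SKProductsRequest(productIdentifiers: ["com.example.album.tier80"])

    func start() {
        request.delegate = self
        request.start()
    }

    func productsRequest(_ request: SKProductsRequest, didReceive response: SKProductsResponse) {
        // Sandbox: the identifier shows up in response.products (button enabled).
        // Production: it shows up in response.invalidProductIdentifiers (button disabled).
        print("valid:", response.products.map { $0.productIdentifier })
        print("invalid:", response.invalidProductIdentifiers)
    }
}
```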
Posted by Biochili. Last updated.
Post not yet marked as solved
0 Replies
586 Views
Hello,

In my app I am observing the AVAudioEngineConfigurationChange notification. The app uses AVAudioEngine, its audio input node, some samplers, some effects, some mixers, and the output node. When I plug an external audio interface into the iPad, my observer is called, and AVAudioSession.inputNumberOfChannels tells me there are now 6 input channels instead of 1 (the audio interface indeed has 6 inputs). However, my current AVAudioEngine's inputNode format still shows 1 channel.

How do I get the engine to update the input node's format? Should I allocate a new engine and discard the old one? The documentation explicitly says, "The engine must not be deallocated from within the client's notification handler." https://developer.apple.com/documentation/foundation/nsnotification/name/1389078-avaudioengineconfigurationchange

I am at a loss. Should I refrain from using AVAudioEngine.inputNode and allocate my own instance? I have also tried reconnecting the inputNode to the next node in the graph using the new hardware format, as the documentation suggests. As a result, the next node's render call fails in pullInput() with kAudioUnitErr_CannotDoInCurrentContext (it's my own custom AU effect, so I can tell). Please assist.
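For reference, a minimal sketch of the reconnection attempt described above (the one that currently ends in kAudioUnitErr_CannotDoInCurrentContext). `engine` and `firstEffect` are placeholders for the existing graph, not names from the original post.

```swift
import AVFoundation

// Sketch of reacting to a configuration change by reconnecting inputNode
// with the new hardware format. `engine` and `firstEffect` are placeholders.
NotificationCenter.default.addObserver(
    forName: .AVAudioEngineConfigurationChange,
    object: engine,
    queue: .main
) { _ in
    engine.stop()
    // Re-read the hardware format, which should now report 6 channels.
    let hwFormat = engine.inputNode.inputFormat(forBus: 0)
    engine.disconnectNodeOutput(engine.inputNode)
    engine.connect(engine.inputNode, to: firstEffect, format: hwFormat)
    try? engine.start()
}
```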
Posted by Biochili. Last updated.