Posts

Post not yet marked as solved
1 Reply
520 Views
I'm still seeing various confusing download states in the transaction when trying to download content, mostly on iOS 14.2.

A fairly common error is the download being reported as finished, with a referenced cache folder in Library/Caches/StoreKit, but that folder is empty and there is no "Contents" sub-folder as expected. What seems to happen is that the download transitions from waiting (progress zero) to finished (progress 1.0) immediately after the download has been started. This error condition persists until a power-cycle of the device, which clears it and allows the download to proceed successfully.

Also, can someone confirm whether the naming convention of these cache files changed significantly in iOS 14? In 13.7, for example, the download location and naming convention appear to be different.

I'm also still seeing occasional instances where the state is waiting, but the download progress goes from zero to 1.0 and the content is actually downloaded. In this latter scenario I have to use a workaround that essentially ignores this documentation: "Use the downloadState property to determine whether the download has completed. Don't use the progress or timeRemaining property of the download object to check its status; these properties are for updating your UI."
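For what it's worth, here's a minimal sketch of the defensive check I've ended up with, assuming the standard SKPaymentTransactionObserver callback. handleDownloadedContent is a hypothetical helper; the extra "Contents" check exists only because of the empty-cache-folder behaviour described above.

```swift
import StoreKit

final class StoreObserver: NSObject, SKPaymentTransactionObserver {

    func paymentQueue(_ queue: SKPaymentQueue,
                      updatedTransactions transactions: [SKPaymentTransaction]) {
        // Transaction handling elided; this sketch focuses on downloads.
    }

    func paymentQueue(_ queue: SKPaymentQueue, updatedDownloads downloads: [SKDownload]) {
        for download in downloads {
            switch download.state {
            case .finished:
                let contents = download.contentURL?.appendingPathComponent("Contents")
                if let contents = contents,
                   FileManager.default.fileExists(atPath: contents.path) {
                    handleDownloadedContent(at: contents, for: download)
                } else {
                    // "Finished" but the cache folder is empty / has no Contents
                    // sub-folder - the error state described above.
                    print("Download \(download.contentIdentifier) finished but content is missing")
                }
                queue.finishTransaction(download.transaction)
            case .failed, .cancelled:
                queue.finishTransaction(download.transaction)
            case .active, .waiting, .paused:
                break
            @unknown default:
                break
            }
        }
    }

    private func handleDownloadedContent(at url: URL, for download: SKDownload) {
        // Hypothetical helper: copy the content out of Library/Caches/StoreKit
        // before the transaction is finished and the cache is cleaned up.
    }
}
```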
Post marked as solved
7 Replies
747 Views
Hi, I've noticed that after cancelling an IAP content download and attempting to re-download, the transaction download seems to kick off OK and progress is reported as being > 0, but the download state is still set to cancelled. I would have assumed that when (re)downloading, the state would / should change to active ("Download is actively downloading", according to the state enum docs). Is this a legit bug or a "feature"? It entails a nasty workaround on the app side, as you now have to assume that an active download is one with progress > 0 instead of looking at the download state property. Yuk.
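Purely for illustration, this is roughly the workaround I mean, assuming you'd rather not trust the state property on its own after a cancel-then-redownload:

```swift
import StoreKit

// Infer "actively downloading" from progress as well as state, because a
// restarted download can keep reporting .cancelled even though bytes are flowing.
func isEffectivelyActive(_ download: SKDownload) -> Bool {
    if download.state == .active { return true }
    // Re-download after a cancel: state still says .cancelled, so fall back
    // on progress being somewhere between zero and complete.
    return download.state == .cancelled && download.progress > 0 && download.progress < 1.0
}
```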
Post marked as solved
1 Reply
346 Views
Looks to me like some controllers are (e.g. volume, pan...) but others aren't, despite them being configured (and working on macOS) in the .EXS file using Logic Pro's Sampler MOD MATRIX. I've tried using sendController(), sendMIDIEvent() and MusicDeviceMIDIEvent(). I'm fairly sure the latter worked at some point, but I could be mistaken; I vaguely recall the AK guys were using this to tweak envelopes in their ROM player. For example, it would be nice to control one of the filters' cutoff frequencies via a CC #74 message, but I can't seem to get this working at all on iOS. The question is, basically: am I expecting too much from AVAudioUnitSampler on iOS?
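For reference, here's a side-by-side sketch of the three approaches I mean, sending CC #74 (filter cutoff) to the sampler. It assumes the engine is running and an instrument has been loaded; whether the .EXS mod matrix actually responds to any of these on iOS is exactly the open question.

```swift
import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()
engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)
// try sampler.loadInstrument(at: exsURL)   // hypothetical .EXS URL
// try engine.start()

let channel: UInt8 = 0
let cutoffCC: UInt8 = 74
let value: UInt8 = 64

// 1) AVAudioUnitMIDIInstrument convenience method
sampler.sendController(cutoffCC, withValue: value, onChannel: channel)

// 2) Raw status byte via the same class (0xB0 = control change)
sampler.sendMIDIEvent(0xB0 | channel, data1: cutoffCC, data2: value)

// 3) Core Audio, talking to the underlying AudioUnit directly
_ = MusicDeviceMIDIEvent(sampler.audioUnit,
                         UInt32(0xB0) | UInt32(channel),
                         UInt32(cutoffCC),
                         UInt32(value),
                         0)
```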
Post marked as solved
3 Replies
487 Views
Given that many AVFoundation audio components are built on AudioUnits, it occurred to me that it might be "nice" to leverage some of these components as part of an AUv3 implementation and let them do some of the heavy lifting, i.e. populating buffers and so on. A good example is perhaps (my obsession) AVAudioUnitSampler, all the more tempting given that the internalRenderBlock for the underlying AUAudioUnit was exposed not so long ago (iOS 11 maybe?).

Two literal questions, really:

1) Is trying to use AVFoundation "stuff" as part of an AUv3 implementation like this a "bad idea"?

2) Alternatively, if this could work "in theory", the thing that's totally unclear to me is how to properly integrate AVAudioEngine, given that (please correct me if I'm wrong) AVAudioUnitSampler needs AVAudioEngine to be up and running to do its thing. During some hack tests I was able to get an AV sampler-based AU instantiated and making noise, but couldn't seem to route the outputs correctly to the host. There were also errors around setting maximumFramesToRender that made me suspect the answer to question 1 is "yes".

And finally, most likely a rhetorical question...

3) Is it just me, or do the audio side of AVFoundation and the current "push" towards AUv3 seem at odds?
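For context, this exploratory sketch is as far as I've got: grab the AUAudioUnit behind AVAudioUnitSampler and look at its internalRenderBlock. Whether it's legitimate to call that block from a wrapping AUv3's own internalRenderBlock (question 1 above) is exactly what I'm unsure about.

```swift
import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()
engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)

// The AUAudioUnit underlying the AVFoundation node, and its render block.
let underlyingAU: AUAudioUnit = sampler.auAudioUnit
let samplerRenderBlock: AUInternalRenderBlock = underlyingAU.internalRenderBlock

// In theory a wrapping AUv3 could forward its own render calls to
// samplerRenderBlock, but maximumFramesToRender, formats and bus layouts all
// have to line up - which is where my hack tests started throwing errors.
print(underlyingAU.maximumFramesToRender)
```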
Post not yet marked as solved
4 Replies
890 Views
Hi Folks, I hope this cross-post from this old Core Audio thread is OK. I think it deserves a new thread as it's more of an AVFoundation question.

I'm hoping someone will be able to point me in the right direction, as I'm really struggling to find any working examples of supporting IAA (Inter-App Audio) in a Swift app.

With a bit of digging and looking at Obj-C examples, I've managed to get my generator AU published OK, so it's being recognized by a host app (AUM), and while the host can't launch my app, it seems to connect to it if I launch it manually. However, the audio from my app isn't showing up in the host's mixer channel, although it's still audible via the shared audio session. I get the route change notification when the host app launches, and I guess I need to do *something* in there, but I'm not sure what... :-)

I create my audio graph using AVFoundation nodes and all of that seems to be working fine. However, I can't figure out whether I can apply the same approach I've seen in the Core Audio IAA graph examples.

I also set up my audio session using the shared instance, the playback category and mix-with-others. This also seems to be working fine.

If anyone can point me in the right direction, or even share a snippet of a working example, I would be very, very grateful. I've been on this for a couple of days now and am close to giving up / banging my head on the desk until I don't know what IAA stands for anymore :-)

Thanks, -Rob

PS. Here's the Core Audio thread, and so far, I'm the only responder :-/ https://forums.developer.apple.com/message/132839#132839
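In case it helps anyone answer, this is roughly the publish step I'm describing, assuming the generator graph is built with AVAudioEngine. The subtype / manufacturer codes and the name here are placeholders, and they'd need to match the AudioComponents entry in the app's Info.plist.

```swift
import AVFoundation
import AudioToolbox

// Tiny helper to build a FourCharCode from a 4-character ASCII string.
func fourCharCode(_ s: String) -> FourCharCode {
    return s.utf8.reduce(FourCharCode(0)) { ($0 << 8) | FourCharCode($1) }
}

let engine = AVAudioEngine()
// ... attach and connect the generator nodes here ...

// Publish the engine's underlying remote IO unit so IAA hosts can see it.
if let ioUnit = engine.outputNode.audioUnit {
    var description = AudioComponentDescription(
        componentType: kAudioUnitType_RemoteGenerator,
        componentSubType: fourCharCode("gen1"),        // placeholder
        componentManufacturer: fourCharCode("Demo"),   // placeholder
        componentFlags: 0,
        componentFlagsMask: 0)
    let status = AudioOutputUnitPublish(&description,
                                        "My Generator" as CFString,
                                        1,
                                        ioUnit)
    if status != noErr {
        print("AudioOutputUnitPublish failed: \(status)")
    }
}
```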