Post · Replies · Boosts · Views · Activity

[iOS 17.4, Xcode 15.3] Previously working NWConnection for peer-to-peer connection now permanently stuck on "Preparing"
After updating my devices to iOS/iPadOS 17.4 and Xcode to 15.3, Network framework peer-to-peer connections have stopped working entirely. The system worked fine before, and the code has not changed. On the client side (NWBrowser) the server (NWListener) can be seen, but upon attempting to establish a connection, the client-side NWConnection.State gets permanently stuck at .preparing. NWConnection.stateUpdateHandler never reports any other state. It doesn't seem to be taking a long time to prepare; it's simply stuck. This happens across multiple connection modes (wired, same Wi-Fi network, separate Wi-Fi networks).

Additional information:
- I didn't participate in the 17.4 beta or RC.
- The code in "Creating a custom peer-to-peer protocol" works; that sample forms the basis of my code.
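For reference, a minimal sketch of the client-side setup being described (the service name, type, and parameters here are placeholders, not the actual project code); the stateUpdateHandler would normally progress from .preparing to .ready:

import Network

// Hypothetical endpoint; the real app uses the result delivered by NWBrowser.
let endpoint = NWEndpoint.service(name: "Host", type: "_myapp._tcp",
                                  domain: "local.", interface: nil)

let parameters = NWParameters(tls: nil)   // TLS options omitted in this sketch
parameters.includePeerToPeer = true       // required for peer-to-peer (AWDL) links

let connection = NWConnection(to: endpoint, using: parameters)

connection.stateUpdateHandler = { state in
    switch state {
    case .setup:              print("setup")
    case .preparing:          print("preparing")   // stuck here after the 17.4 update
    case .ready:              print("ready")       // never reached
    case .waiting(let error): print("waiting: \(error)")
    case .failed(let error):  print("failed: \(error)")
    case .cancelled:          print("cancelled")
    @unknown default:         break
    }
}

connection.start(queue: .main)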
2 replies · 0 boosts · 1.3k views · Mar ’24
AVAudioEngine: Is there a way to play audio at full volume while having an active input tap?
My project uses an AVAudioEngine with a very simple setup: a speech recognizer running on a tap on the engine's input, with separate AVAudioPlayerNodes handling playback.

try session.setCategory(.playAndRecord, mode: .default, options: [])
try session.setActive(true, options: .notifyOthersOnDeactivation)
try session.setAllowHapticsAndSystemSoundsDuringRecording(true)

filePlayerNode   --> engine.mainMixerNode
bufferPlayerNode --> engine.mainMixerNode
engine.mainMixerNode --> engine.outputNode
// bufferPlayer.scheduleBuffer() is called on its own queue

The input works fine: the tapped buffers can be collected into a file that plays back correctly, and the recognizer works as well. But when I try to play the live audio by sending the buffers to the bufferPlayer on this or another device, the audio plays at a very low volume, sometimes with severe distortion. If I lower the sample rate via AVAudioConverter, the distortion gets worse. I've tried experimenting with the AVAudioSession category options, using separate AVAudioEngines, and much, much more, yet I still haven't figured this out. It's gotten to the point where I've fixed almost all the arcane and minor issues in my audio system, yet I still can't play back my voice properly. The ability to play and record simultaneously is a basic feature of phones--when on speaker mode, a phone doesn't need to behave like a walkie-talkie. It seems inconceivable to me that the relatively new AVAudioEngine doesn't have an implementation for this, since the main issue (feedback loops) can be dealt with by a simple circuit. Live video chat apps like FaceTime wouldn't be possible without it, yet to my surprise I found no answers online (what I did find were articles explaining how to write a file while playback is occurring). Is there truly no way to do this with AVAudioEngine? Am I missing something fundamental? Any pointers would be greatly appreciated.
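For context, a minimal sketch of the graph described above (the node names, formats, and tap-to-player plumbing are my own placeholders, not the actual project code):

import AVFoundation

func makeEngine() throws -> (AVAudioEngine, AVAudioPlayerNode) {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true, options: .notifyOthersOnDeactivation)

    let engine = AVAudioEngine()
    let bufferPlayerNode = AVAudioPlayerNode()
    engine.attach(bufferPlayerNode)

    let inputFormat = engine.inputNode.outputFormat(forBus: 0)
    engine.connect(bufferPlayerNode, to: engine.mainMixerNode, format: inputFormat)
    engine.connect(engine.mainMixerNode, to: engine.outputNode, format: nil)

    // The tap feeds the recognizer and, in this sketch, loops straight back
    // into the buffer player to stand in for "live audio" playback.
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: inputFormat) { buffer, _ in
        bufferPlayerNode.scheduleBuffer(buffer, completionHandler: nil)
    }

    try engine.start()
    bufferPlayerNode.play()
    return (engine, bufferPlayerNode)
}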
1 reply · 0 boosts · 877 views · Feb ’24
Getting a list of words recognized by Speech
Is there a way to extract the list of words recognized by the Speech framework? I'm trying to filter out words that won't appear in the transcription output, but to do that I need a list of the words that can appear. SFSpeechLanguageModel.Configuration can be initialized with a vocabulary, but there doesn't seem to be a way to read it, and while there are ways to create custom vocabularies, I have yet to find a way to retrieve the built-in one. I've added the Natural Language tag in case that framework might contribute to a solution.
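To make the goal concrete, here is a minimal sketch of what I'd like to be able to write; recognizerVocabulary(for:) is a hypothetical placeholder for the lookup I'm asking about, not a real API:

import Speech

// Hypothetical placeholder: the set of words the recognizer can ever emit.
// This is exactly what I haven't found a way to obtain from the framework.
func recognizerVocabulary(for locale: Locale) -> Set<String> {
    // ...would query the Speech framework here...
    return []
}

// My app's candidate words, pruned to those that could actually appear
// in SFSpeechRecognizer transcription output.
let candidates = ["apple", "zyzzyva", "banana"]
let vocabulary = recognizerVocabulary(for: Locale(identifier: "en_US"))
let usable = candidates.filter { vocabulary.contains($0.lowercased()) }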
0 replies · 0 boosts · 703 views · Feb ’24
Is the code in 'Building a custom peer-to-peer protocol' insecure?
I'm new to networking, so forgive me if this is a silly question. In the sample code, Building a custom peer-to-peer protocol, TLS is configured as follows:

// Create TLS options using a passcode to derive a pre-shared key.
private static func tlsOptions(passcode: String) -> NWProtocolTLS.Options {
    let tlsOptions = NWProtocolTLS.Options()
    let authenticationKey = SymmetricKey(data: passcode.data(using: .utf8)!)
    var authenticationCode = HMAC<SHA256>.authenticationCode(for: "TicTacToe".data(using: .utf8)!, using: authenticationKey)
    let authenticationDispatchData = withUnsafeBytes(of: &authenticationCode) { (ptr: UnsafeRawBufferPointer) in
        DispatchData(bytes: ptr)
    }
    sec_protocol_options_add_pre_shared_key(tlsOptions.securityProtocolOptions,
                                            authenticationDispatchData as __DispatchData,
                                            stringToDispatchData("TicTacToe")! as __DispatchData)
    sec_protocol_options_append_tls_ciphersuite(tlsOptions.securityProtocolOptions,
                                                tls_ciphersuite_t(rawValue: TLS_PSK_WITH_AES_128_GCM_SHA256)!)
    return tlsOptions
}

The sample code touts the connection as secure ("...uses Bonjour and TLS to establish secure connections between nearby devices"), but to my untrained eye it doesn't seem so. My reasoning is as follows: if I adapt this code as-is, so that connections between two instances of my app use SymmetricKeys derived from the four-digit passcode, wouldn't my encryption be easy to break by an adversary who tries 0000...9999 and records the corresponding changes in the encrypted traffic, exposing my app to all sorts of attacks? The sample uses the passcode to validate the connection (the host user shows the client user the passcode, which is entered manually), and that's a feature I'd like to keep in some form, which is why this is causing so many headaches. Generally speaking, is there a way to secure a local peer-to-peer connection over Network.framework that doesn't involve certificates? If certificates are the only way, are there good resources you can recommend?
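For discussion, here is the kind of alternative I've been sketching (my own code, not Apple's sample): derive the PSK from a high-entropy secret shared out-of-band, with HKDF in the middle, and keep the short passcode only as a human confirmation step. The function and identity names are placeholders.

import CryptoKit
import Network

// Sketch: use a 256-bit shared secret as input key material instead of a
// four-digit passcode. How that secret reaches the other device (QR code,
// prior pairing, etc.) is deliberately left open here.
func tlsOptions(sharedSecret: SymmetricKey, identity: String) -> NWProtocolTLS.Options {
    let tlsOptions = NWProtocolTLS.Options()

    // Derive the actual PSK from the secret rather than using it raw.
    let psk = HKDF<SHA256>.deriveKey(inputKeyMaterial: sharedSecret,
                                     info: Data(identity.utf8),
                                     outputByteCount: 32)

    let pskData = psk.withUnsafeBytes { DispatchData(bytes: $0) }
    let identityData = Data(identity.utf8).withUnsafeBytes { DispatchData(bytes: $0) }

    sec_protocol_options_add_pre_shared_key(tlsOptions.securityProtocolOptions,
                                            pskData as __DispatchData,
                                            identityData as __DispatchData)
    sec_protocol_options_append_tls_ciphersuite(tlsOptions.securityProtocolOptions,
                                                tls_ciphersuite_t(rawValue: TLS_PSK_WITH_AES_128_GCM_SHA256)!)
    return tlsOptions
}

// Example: a fresh secret, generated once and shared out-of-band.
let secret = SymmetricKey(size: .bits256)
let options = tlsOptions(sharedSecret: secret, identity: "MyApp")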
6 replies · 0 boosts · 912 views · Jan ’24
Wired data transfer between an app on two iOS/iPadOS devices--Possible or pipe dream?
If two iOS/iPadOS devices have your app open, is it possible to have the apps send data to each other over a wired connection? E.g. if two iPhone 15s are connected by USB-C, can I get my app on iPhone A to send data to iPhone B and vice versa? I've been looking around for quite a while now, and at this point I just want to know if it's technically feasible.
5 replies · 0 boosts · 1.1k views · Jan ’24
ScrollView behaves as though Sheet is always expanded when multiple detents are enabled
A follow-up to Scrolling sticker browser on a Messages App sheet causes sheet to move, re-formulated and posted here after distilling the issue.

The ScrollView behaves as though the Sheet is always expanded and transfers the drag gesture to the Sheet when scrolled to the top (i.e. when first displayed), so the user ends up moving the Sheet rather than the ScrollView when attempting to scroll up or down. If this should be filed as a bug, let me know.

Notes:
- The problem doesn't exist if the sheet has only one detent, but since Messages App Extensions must be adjustable in phone portrait, this does nothing for me.
- Adding a Rectangle with hit testing disabled doesn't solve the issue.
- Adding competing high-priority DragGestures doesn't fix it either.
- One partial solution is having ScrollViewReader scroll down a tiny bit upon appearing (see the sketch after the code below), but the issue re-emerges after the user has scrolled back to the top.

Code to reproduce:

struct Playground: View {
    @State private var detent = PresentationDetent.fraction(1/3)
    @State private var isSheetPresented = true

    var body: some View {
        Rectangle()
            .fill(Color(.systemGray5))
            .sheet(isPresented: $isSheetPresented) {
                VStack {
                    Text("ScrollView-in-Sheet Experiment")
                        .padding()
                    ScrollView {
                        ScrollViewReader { scrollProxy in
                            VStack(spacing: 0) {
                                ForEach(0...10, id: \.self) { i in
                                    Rectangle()
                                        .fill(.white)
                                        .frame(height: 50)
                                        .id(i)
                                        .overlay {
                                            Text(i.description)
                                        }
                                }
                            }
                        }
                    }
                    .frame(height: 200)
                    .padding()
                }
                .background {
                    Color(.systemGray6)
                }
                .presentationDetents([.large, .fraction(1/3)], selection: $detent)
            }
    }
}
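For completeness, this is roughly the partial workaround mentioned in the notes: nudging the scroll position on appear so the ScrollView, rather than the sheet, receives the first drag (my own sketch, dropped into the ScrollViewReader from the reproduction code above):

ScrollViewReader { scrollProxy in
    VStack(spacing: 0) {
        ForEach(0...10, id: \.self) { i in
            Rectangle()
                .fill(.white)
                .frame(height: 50)
                .id(i)
                .overlay { Text(i.description) }
        }
    }
    .onAppear {
        // Scroll just past row 0; the problem returns once the user
        // scrolls back to the very top.
        scrollProxy.scrollTo(1, anchor: .top)
    }
}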
1 reply · 0 boosts · 721 views · Jan ’24
Scrolling sticker browser on a Messages App sheet causes sheet to move
As someone who learned Swift via SwiftUI, UIKit is completely alien to me, so I apologize if this is actually a very simple issue. I have a Messages extension that includes a sticker browser. In this extension, the MSMessagesAppViewController hosts a SwiftUI View, which in turn hosts a UIViewRepresentable version of MSStickerBrowserView. The whole Messages App sheet moves with an upward drag, and can switch to its expanded mode, whenever the browser is scrolled to the top (first sticker at the top left); but it doesn't budge when the browser is scrolled to the other end, which is when it should allow the sheet to move upward with the drag. It seems something is reversed in the gesture priority management that normally lets a sheet move in the appropriate direction when a contained scroll view is at the corresponding end.

Things I've tried while working toward a diagnosis:
- Limiting the presentation style to compact (the modal still moves, but never succeeds in changing)
- Adding competing highPriorityGestures in the SwiftUI view, set at various locations
- Inserting a rectangle with allowsHitTesting(false) beneath the browser
- Changing firstResponder statuses for all relevant views
- Changing GestureResponder priorities (there are no gesture responders in any of the views examined)

Things I've considered but don't have the technical skills to implement:
- Having the view scroll a little downwards programmatically (like what ScrollViewReader can do in SwiftUI), but I have no idea how to do this via MSStickerBrowserView or UIKit in general.
- Maybe the MSStickerBrowserView thinks it's always in the expanded state (when the sheet is expanded, the end-drags work fine). If so, and there's a way to either fix this misconception (via the controller's didTransition) or do away with end drags in general, the problem should go away.

Any pointers would be greatly appreciated!
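For reference, the wrapper is essentially of this shape (a simplified sketch; the data source in my real project serves user-generated stickers rather than an array passed in like this):

import SwiftUI
import Messages

// Minimal sketch of the UIViewRepresentable wrapper described above.
struct StickerBrowser: UIViewRepresentable {
    let stickers: [MSSticker]

    func makeUIView(context: Context) -> MSStickerBrowserView {
        let browser = MSStickerBrowserView(frame: .zero)
        browser.dataSource = context.coordinator
        return browser
    }

    func updateUIView(_ uiView: MSStickerBrowserView, context: Context) {
        uiView.reloadData()
    }

    func makeCoordinator() -> Coordinator { Coordinator(stickers: stickers) }

    final class Coordinator: NSObject, MSStickerBrowserViewDataSource {
        let stickers: [MSSticker]
        init(stickers: [MSSticker]) { self.stickers = stickers }

        func numberOfStickers(in stickerBrowserView: MSStickerBrowserView) -> Int {
            stickers.count
        }

        func stickerBrowserView(_ stickerBrowserView: MSStickerBrowserView,
                                stickerAt index: Int) -> MSSticker {
            stickers[index]
        }
    }
}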
2 replies · 0 boosts · 822 views · Jan ’24
Navigating between sticker browsers and messages extensions and the main app
Happy new year! My app can generate stickers, so I'm trying to connect two separate Messages extensions: the sticker browser and the app extension. The former is a repository and the latter is where stickers are made. It would be easy to create these two extensions and have them stand separately, like Apple's Memoji, but I'm trying to find a way to streamline the user experience so that users can navigate from the browser to the extension and back seamlessly. As far as I can tell, there's no indication that this is possible, but also nothing to indicate that it isn't. Similarly, are there ways to navigate from the main app to a Messages extension, or to go the other way? From what I've read, there's no known way to do the former, and there was a way to do the latter that no longer works. tl;dr - Is it possible for users to press a button to go from an MSStickerBrowserViewController to an MSMessagesAppViewController (both belonging to the same app group) and back? Or to go from the main app to either, and back?
0 replies · 0 boosts · 479 views · Jan ’24
Is it possible to compile images into an APNG using Swift?
Hello, I'm wondering if there is a way to programmatically write a series of UIImages into an APNG, similar to what the code below does for GIFs (credit: https://github.com/AFathi/ARVideoKit/tree/swift_5). I've tried implementing a similar solution, but it doesn't seem to work; my code is included below. I've also done a lot of searching and have found lots of code for displaying APNGs, but have had no luck with code for writing them. Any hints or pointers would be appreciated.

func generate(gif images: [UIImage], with delay: Float, loop count: Int = 0, _ finished: ((_ status: Bool, _ path: URL?) -> Void)? = nil) {
    currentGIFPath = newGIFPath
    gifQueue.async {
        let gifSettings = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFLoopCount as String: count]]
        let imageSettings = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFDelayTime as String: delay]]

        guard let path = self.currentGIFPath else { return }
        guard let destination = CGImageDestinationCreateWithURL(path as CFURL, __UTTypeGIF as! CFString, images.count, nil)
        else { finished?(false, nil); return }
        //logAR.message("\(destination)")

        CGImageDestinationSetProperties(destination, gifSettings as CFDictionary)
        for image in images {
            if let imageRef = image.cgImage {
                CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
            }
        }

        if !CGImageDestinationFinalize(destination) {
            finished?(false, nil); return
        } else {
            finished?(true, path)
        }
    }
}

My adaptation of the above code for APNGs (doesn't work; outputs an empty file):

func generateAPNG(images: [UIImage], delay: Float, count: Int = 0) {
    let apngSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGLoopCount as String: count]]
    let imageSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGDelayTime as String: delay]]

    guard let destination = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.png.identifier as CFString, images.count, nil)
    else { fatalError("Failed") }

    CGImageDestinationSetProperties(destination, apngSettings as CFDictionary)
    for image in images {
        if let imageRef = image.cgImage {
            CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
        }
    }
}
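One thing that stands out when comparing the two (my reading, not verified): the APNG version never calls CGImageDestinationFinalize, which is the step that actually writes the frames to disk in the GIF version. Below is a self-contained sketch with that call added, where outputURL is a placeholder parameter rather than the property from my project:

import UIKit
import ImageIO
import UniformTypeIdentifiers

func writeAPNG(images: [UIImage], delay: Float, loopCount: Int = 0, to outputURL: URL) -> Bool {
    let apngSettings = [kCGImagePropertyPNGDictionary as String:
                            [kCGImagePropertyAPNGLoopCount as String: loopCount]]
    let frameSettings = [kCGImagePropertyPNGDictionary as String:
                            [kCGImagePropertyAPNGDelayTime as String: delay]]

    guard let destination = CGImageDestinationCreateWithURL(outputURL as CFURL,
                                                            UTType.png.identifier as CFString,
                                                            images.count, nil) else { return false }

    CGImageDestinationSetProperties(destination, apngSettings as CFDictionary)
    for image in images {
        if let cgImage = image.cgImage {
            CGImageDestinationAddImage(destination, cgImage, frameSettings as CFDictionary)
        }
    }
    // Finalize writes the accumulated frames; without it the file stays empty.
    return CGImageDestinationFinalize(destination)
}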
3 replies · 0 boosts · 1.3k views · Dec ’23
How do I detect whether an iPad's camera location is landscape or portrait?
The 10th-Gen iPad differs from its predecessors by having a camera that's located at the top of its landscape orientation. This is a headache for me since my app needs to know the rough camera location given the device's orientation for AR purposes. I can find out whether the device is a tablet or not, but I can't find out whether it's an iPad 10. Are there any direct or indirect ways for me to find out whether a camera is placed for portrait or landscape use?
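One indirect route that might work (a sketch under the assumption that a hand-maintained lookup table is acceptable; the identifier strings below are illustrative placeholders, not a verified list):

import UIKit

// Returns the hardware model identifier, e.g. a string like "iPad13,18".
func modelIdentifier() -> String {
    var systemInfo = utsname()
    uname(&systemInfo)
    return withUnsafeBytes(of: &systemInfo.machine) { buffer in
        String(decoding: buffer.prefix(while: { $0 != 0 }), as: UTF8.self)
    }
}

// Hypothetical, hand-maintained set of identifiers whose front camera sits on
// the landscape edge. The entries here are placeholders for illustration only.
let landscapeCameraModels: Set<String> = ["iPad13,18", "iPad13,19"]

let cameraIsOnLandscapeEdge = landscapeCameraModels.contains(modelIdentifier())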
0 replies · 0 boosts · 569 views · Aug ’23
Is the ability to make in-app currency purchases of creator content a violation of App Store rules?
Hello, I'm making an app where users can create a multimedia object, and other users can use in-app currency (purchased via in-app purchase) to gain access to that object. They can also choose to subscribe to a creator to encourage and support their content creation. Tokens can be converted to cash and sent to creators. It is unclear to me whether this violates App Store rules. After reading through the App Store Review Guidelines and searching the forums (this thread was close to what I was looking for), I have yet to arrive at a clear answer. The Guidelines state that "tipping" content creators is acceptable, but that isn't exactly what I'm looking for in a creator marketplace. The Business section doesn't contain anything else that seems relevant, which makes it seem like only voluntary tipping of content creators is accepted. The commerce engineer in the thread linked above discourages using in-app currency, but that doesn't work for my use case (the thread's creator wants to use the IAP mechanism). Furthermore, IAPs cannot be created programmatically (i.e. by users) and the process is error-prone. It must be stressed that I'm not trying to deny Apple its 15-30% share, since users must buy in-app currency using Apple's IAP (this is not a multi-platform app). Cash in the app's economy has only one entry point, and that is Apple's IAP. I asked a similar question recently but received no response, probably because I didn't phrase it well or attach the correct tags. Building the infrastructure for a creator marketplace app is a lot of grueling work, and I would very much like to know whether my app will be rejected for it before I embark on that quest. Any help would be greatly appreciated. tl;dr - Is the ability to make in-app currency purchases of creator content a violation of App Store rules?
1 reply · 0 boosts · 696 views · Jun ’23
CloudKit for storage of gated user content and account profiles?
I'm a new developer and I can't decide whether CloudKit or Firebase is better suited to my app's needs. The general vibe I'm getting from multiple sources is that CloudKit is purely for storage, but that seems off to me, so I'm hoping to find some hints or answers with regard to my specific use cases. My app allows users to create multimedia objects that are stored somewhere and displayed in the app's marketplace, with access granted to those who decide to spend tokens on them. These multimedia objects can run up to a few gigabytes. Ownership of these objects is tracked via a user account database. Am I correct in assuming that my app's CloudKit public database can be used to store these multimedia objects, check the validity of user requests, and distribute the objects? Can it also be used to manage user accounts and overall token counts? Additionally, is it possible for the app's administrators to pipe data out of the public database for analysis and to payment-processor APIs? Any help would be greatly appreciated.
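To make the storage part of the question concrete, this is the kind of pattern I have in mind (the record type and field names are my own placeholders, and this is only a sketch, not working app code):

import CloudKit

// Hypothetical record type for a user-created multimedia object, stored in the
// public database with the large payload attached as a CKAsset.
func uploadMultimediaObject(title: String, fileURL: URL,
                            completion: @escaping (Result<CKRecord, Error>) -> Void) {
    let record = CKRecord(recordType: "MultimediaObject")
    record["title"] = title as CKRecordValue
    record["payload"] = CKAsset(fileURL: fileURL)

    let publicDatabase = CKContainer.default().publicCloudDatabase
    publicDatabase.save(record) { savedRecord, error in
        if let error = error {
            completion(.failure(error))
        } else if let savedRecord = savedRecord {
            completion(.success(savedRecord))
        }
    }
}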
0 replies · 0 boosts · 491 views · Jun ’23
Is there a way to save SKTextureAtlas(dictionary) sprite sheets to disk to read later?
My app uses SpriteKit and requires the use of SKTextureAtlas for performance. However, it uses user-generated content, which means that atlases don’t initialize using bundled images, but instead have to recreate all sprite sheets, leading to long loading times. Is it possible to save the sprite sheets made from user-generated content to disk so that SKTextureAtlas can load them instead of recreating all sprite sheets upon every initialization? Is there any alternative solution to this problem? For example, is there a way to dump a bunch of images into memory to use as a texture pool and keep them there until deallocated?
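For context, the runtime atlas creation being described is roughly the following (the dictionary contents and preload step are placeholders); this is the work I'd like to cache on disk instead of repeating on every launch:

import SpriteKit
import UIKit

// Sketch of the current approach: rebuild the atlas from user-generated images
// on every launch, which is the slow step in question.
func makeAtlas(from userImages: [String: UIImage]) -> SKTextureAtlas {
    // SKTextureAtlas(dictionary:) accepts image objects keyed by texture name.
    let atlas = SKTextureAtlas(dictionary: userImages)
    atlas.preload {
        // Sprite sheets have been generated and loaded into memory at this point.
    }
    return atlas
}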
0 replies · 0 boosts · 675 views · May ’23
Are user-to-user token transactions, with the ability to cash out, allowed?
I’m trying to build a simple app that will allow users to create items in-app and buy/sell access to them. Users can buy tokens from the App Store that can be used to fund these purchases (e.g. 1 token = 100 coins), and the coins can also be used to support the development of these items (e.g. give a creator 1 coin a week). Creators can then cash out the coins once a certain amount is reached, via Stripe or some other external payment provider. Is this allowed under the current rules of the App Store? Is it possible to implement using CloudKit? If this isn’t allowed, is it possible to let users share content on the app’s “store” for other users to browse and download, and can that be done using just CloudKit? This is my first-ever app and I don’t know left from right when it comes to back ends. I will greatly appreciate any pointers.
0 replies · 0 boosts · 607 views · May ’23
Controlling a SwiftUI view's refresh rate without TimelineView
Hello, I have an @EnvironmentObject variable that publishes 30 times per second, but the view that depends on that data must be limited to 20 frames per second. The publishing rate cannot be controlled, so I have to control the rate at which SwiftUI refreshes that view in order to achieve the desired effect. I tried using TimelineView(.periodic(from: .now, by: 1/20)), but the restriction gets ignored because of the published data; I also considered Combine's timer, but it isn't based on real time. Are there any other methods I can use to force SwiftUI to refresh certain views at fixed real-time intervals, in essence controlling their FPS?
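One direction I've been exploring (my own sketch; the type and property names are placeholders, and it assumes the source exposes a @Published value): mirror the fast-publishing value into a second ObservableObject through a throttled Combine pipeline, and have the view observe the slower copy.

import Combine
import SwiftUI

// Hypothetical fast source: publishes roughly 30 times per second.
final class FastSource: ObservableObject {
    @Published var value: Double = 0
}

// Slow mirror: re-publishes the latest value at most 20 times per second,
// so views observing it redraw at the reduced rate.
final class ThrottledSource: ObservableObject {
    @Published private(set) var value: Double = 0
    private var cancellable: AnyCancellable?

    init(source: FastSource, interval: TimeInterval = 1.0 / 20.0) {
        cancellable = source.$value
            .throttle(for: .seconds(interval), scheduler: DispatchQueue.main, latest: true)
            .sink { [weak self] in self?.value = $0 }
    }
}

struct ThrottledView: View {
    @ObservedObject var throttled: ThrottledSource

    var body: some View {
        Text("\(throttled.value)")   // redraws at ~20 Hz rather than 30 Hz
    }
}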
0 replies · 1 boost · 1.1k views · Jun ’22