iOS is the operating system for iPhone.

Posts under iOS tag

200 Posts
Sort by: Post | Replies | Boosts | Views | Activity

Can a Live Caller ID server supply live data or must it be static?
With the Live Caller ID example server, the caller lookup dataset is defined in an input.txtpd file and processed by running a ConstructDatabase command, which creates a block.binpb and an identity.binpb file. In other words, a static input file is processed into static block and identity files. In the real world, however, the data for identified and blocked numbers is in a constant state of flux: new numbers become available, old ones go stale, numbers initially considered safe turn out to be malicious, and so on. Is the example server just that, merely an example using fixed datasets, while an actual production server can use live, ever-changing data to formulate its response to the iPhone OS query? Here's a concrete use case: suppose the requirement is to permit US NANP numbers but to block everything else. The set of non-US NANP numbers is so large and ever-changing that it would be unfeasible to capture them in an input.txtpd file, process it, and then re-capture and re-process it endlessly. Instead, what would be required is the ability for the Live Caller ID server to evaluate at query time, using a regular expression for example, whether a number is NANP or not. Is this possible?
Replies: 1 · Boosts: 1 · Views: 204 · Activity: 3w
iOS 18 Live Activity sometimes fails to show
In a music streaming app, when using Activity.request to activate the Dynamic Island, the system's Now Playing interface appears correctly. However, the app's Live Activities, Lock Screen presentation, and other related features fail to display properly. During debugging, the following code is used:

    do {
        activity = try Activity.request(attributes: attributes, contentState: contentState, pushType: .token)
        if !NMABTestManager.default().is(inTest: "FH-NewLiveActivityPush") {
            // Listen for push token updates
            if activity != nil {
                tokenUpdatesTask?.cancel()
                tokenUpdatesTask = Task.detached {
                    for await tokenData in self.activity!.pushTokenUpdates {
                        // pushTokenUpdates yields Data; convert it to a hex String before passing it to the server
                        let myToken = tokenData.map { String(format: "%02x", $0) }.joined().uppercased()
                        self.pushToken = myToken
                    }
                }
            }
        }
    } catch {
        print("Error starting Live Activity: \(error.localizedDescription)")
    }

In this scenario, the push token is returned correctly and no errors are thrown. The issue did not occur on iOS 17 but appears sporadically on iOS 18, and once it occurs, it cannot be resolved by restarting or other means. Feedback ID: FB14763873; I have uploaded my sysdiagnose.
Replies: 2 · Boosts: 0 · Views: 224 · Activity: 3w
AVCaptureMovieFileOutput DOES NOT support spatial video capture on iPhone 15 Pro running iOS 18.1?
Hi everyone, I am having trouble implementing spatial video recording to files by following the WWDC24 session "Build compelling spatial photo and video experiences". Specifically, the isSpatialVideoCaptureSupported flag of AVCaptureMovieFileOutput is FALSE when the code is tested on both my physical iPhone 15 Pro (iOS 18.1) and the simulator (iOS 18.0). This is the code I am running:

    let movieFileOutput = AVCaptureMovieFileOutput()
    print("movieCapture output isSpatialVideoCaptureSupported: \(movieFileOutput.isSpatialVideoCaptureSupported)")

However, one of the formats of AVCaptureDevice reports TRUE for its own isSpatialVideoCaptureSupported flag:

    for format in currentDevice.formats {
        if format.isSpatialVideoCaptureSupported {
            print("isSpatialVideoCaptureSupported is true")
            break
        }
    }

I am totally confused now: why DOES the camera device support spatial mode while the movie file output DOES NOT? Can someone please help? Really appreciate it!! Here is my testing environment: iPhone 15 Pro, iOS 18.1 (US version), Xcode 16.0 beta 16A5171c.
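One possible explanation (an assumption, not confirmed in the post): the flag on the output reflects the current device/session configuration, so the device's activeFormat may need to be a spatial-capable format before AVCaptureMovieFileOutput reports support. A minimal sketch under that assumption, where session, device, and output are assumed to be an already-connected AVCaptureSession, AVCaptureDevice, and AVCaptureMovieFileOutput:

```swift
import AVFoundation

// Hypothetical helper: activate a spatial-capable device format, then
// re-check the output's support flag and enable spatial capture.
func enableSpatialVideo(device: AVCaptureDevice,
                        output: AVCaptureMovieFileOutput) throws {
    // Find a device format that supports spatial video capture.
    guard let spatialFormat = device.formats.first(where: { $0.isSpatialVideoCaptureSupported }) else {
        print("No spatial-capable format on this device")
        return
    }
    try device.lockForConfiguration()
    device.activeFormat = spatialFormat   // make the spatial format the active one
    device.unlockForConfiguration()

    // The output may only report support once the device is configured accordingly.
    if output.isSpatialVideoCaptureSupported {
        output.isSpatialVideoCaptureEnabled = true
    }
}
```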
Replies: 1 · Boosts: 0 · Views: 178 · Activity: 2w
Xcode 16 thinks iPhone has Dev mode disabled, but it doesn't
I'm attempting to do something I've done many times: build to device. However, since updating my Mac to macOS 15, downloading the Xcode 16 beta, and updating my device to the iOS 18 beta, I am unable to do so. Xcode gives me an error saying that Developer Mode is disabled, with instructions to enable it. But when I check the device, Developer Mode is already enabled. I've tried disabling and re-enabling it (which requires a restart and confirmation), shutting down the device and restarting it, and quitting and re-opening Xcode. Anyone else running into this issue? Any thoughts?
Replies: 0 · Boosts: 1 · Views: 177 · Activity: 3w
RealityView world tracking without camera feed?
Is it possible with iOS 18 to use RealityView with world tracking but without the camera feed as the background? With content.camera = .worldTracking the background is always the camera feed, and with content.camera = .virtual the device's position and orientation don't affect the viewpoint. Is there a way to get a mixture of both? My use case: my app "Encyclopedia GalacticAR" shows astronomical objects and a skybox (a huge sphere), like a VR view of planets. Now that iOS 18 offers RealityView for iOS and iPadOS, I would like to make use of it, but I haven't found a way to display my skybox as the environment instead of the camera feed. I filed the suggestion FB14734105, but hope that somebody knows a workaround...
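For reference, the two camera modes contrasted above can be sketched like this (a minimal sketch; the view name and skybox entity are placeholders, not from the post):

```swift
import SwiftUI
import RealityKit

struct PlanetView: View {
    var body: some View {
        RealityView { content in
            // .worldTracking: device pose drives the viewpoint, but the
            //                 background is always the camera feed.
            // .virtual:       no camera feed, but device pose is ignored.
            content.camera = .worldTracking

            // Hypothetical skybox: a large sphere surrounding the viewer,
            // flipped so its surface faces inward.
            let skybox = ModelEntity(mesh: .generateSphere(radius: 1000))
            skybox.scale *= SIMD3<Float>(-1, 1, 1)
            content.add(skybox)
        }
    }
}
```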
Replies: 4 · Boosts: 1 · Views: 383 · Activity: 2w
Custom fonts sometimes stop showing when you switch applications in SwiftUI
I have loaded custom fonts into my SwiftUI project in Xcode, in a folder with target membership set to the correct build. I have included the correct names in the "Fonts provided by application" section of the Info.plist, and I use them in code with the correct names, as such:

    extension Font {
        static let mainFont: (FontWeights, CGFloat) -> Font = { fontType, size in
            switch fontType {
            case .hairline: Font.custom("TT Norms Pro Thin", size: size)
            case .thin: Font.custom("TT Norms Pro ExtraLight", size: size)
            case .light: Font.custom("TT Norms Pro Light", size: size)
            case .regular: Font.custom("TT Norms Pro Regular", size: size)
            case .medium: Font.custom("TT Norms Pro Medium", size: size)
            case .semiBold: Font.custom("TT Norms Pro DemiBold", size: size)
            case .bold: Font.custom("TT Norms Pro Bold", size: size)
            case .heavy: Font.custom("TT Norms Pro ExtraBold", size: size)
            case .black: Font.custom("TT Norms Pro Black", size: size)
            }
        }

        static let serifFont: (SerifFontWeights, CGFloat) -> Font = { fontType, size in
            switch fontType {
            case .semiBold: Font.custom("Butler Free Bd", size: size)
            }
        }
    }

    extension View {
        func mainFont(_ fontWeight: FontWeights? = .regular, _ size: CGFloat? = nil) -> some View {
            font(.mainFont(fontWeight ?? .regular, size ?? 16))
        }

        func serifFont(_ fontWeight: SerifFontWeights? = .semiBold, _ size: CGFloat? = nil) -> some View {
            font(.serifFont(fontWeight ?? .semiBold, size ?? 16))
        }
    }

Most of the time this works and the fonts show, but if I switch applications, some of the fonts often disappear and the standard iOS font shows instead. Rarely, they already fail to show correctly on launch. I have also tried simpler ways to include the fonts, and different font files, but the behavior is the same. What could be the reason for this erratic behavior?
Replies: 1 · Boosts: 0 · Views: 315 · Activity: 3w
Game Center breaks RealityView world tracking
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS/iPadOS 18 beta 5? I'm in the process of upgrading my app to make use of the much-appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:

    content.camera = .worldTracking

However, device position and orientation were ignored (the camera remained static) and there was no camera pass-through. Then I discovered that the issue disappears when I remove the line

    GKLocalPlayer.local.authenticateHandler = { viewController, error in
        // ... some more code ...
    }

So I filed FB14731139 and hope that it will be resolved before the release of iOS/iPadOS 18.
Replies: 2 · Boosts: 1 · Views: 229 · Activity: 1w
iOS 18.2 public beta update not showing on my phone
I have had the iOS 18.1 public beta for some time, and it has serious network/signal/mobile data issues: it keeps reconnecting and disconnecting and losing signal, which results in dropped calls. Online it says that iOS 18.2 public beta 3 is out, but I still haven't been offered the iOS 18.2 public beta. Does anyone know what's going on and how I can update to the iOS 18.2 public beta?
Replies: 0 · Boosts: 0 · Views: 301 · Activity: 4w
iOS 18 AudioPlaybackIntent responds too slowly when the app is not launched
Here's my case: I'm developing a Control widget so that the user can tap it to play music without opening the app. I bind an AudioPlaybackIntent to the widget. When the user taps the widget, the app receives the intent and starts playing music. This works perfectly when the app is running in the background or foreground, but when the app has not been launched, the response is too slow; it takes several seconds to start playing music. Is there any way to make the response faster when the app has not been launched? And if I want to do work while the app is not launched, what should I do?
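For context, binding a playback intent as described above can be sketched like this (a minimal sketch; PlayMusicIntent and MusicPlayer.shared are hypothetical names, not from the post):

```swift
import AppIntents

// Hypothetical intent: conforming to AudioPlaybackIntent tells the system
// the intent starts audio playback, so it can run without bringing up the UI.
struct PlayMusicIntent: AudioPlaybackIntent {
    static let title: LocalizedStringResource = "Play Music"

    func perform() async throws -> some IntentResult {
        // Placeholder for the app's own player. Keeping this path lightweight
        // matters, since on a cold start the whole process must initialize
        // before playback can begin.
        MusicPlayer.shared.play()
        return .result()
    }
}
```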
Replies: 2 · Boosts: 0 · Views: 334 · Activity: 3w
iOS 18 Control widget custom symbol fails to show in preview
I exported an Apple SF Symbol as a custom symbol for testing. The code is simple:

    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(
            kind: "ControlWidgetConfiguration"
        ) {
            ControlWidgetButton(action: DynamicWidgetIntent()) {
                Text("test")
                Image("custom_like")
            }
        }
        .displayName("test")
    }

As you can see, the image does not show in the preview, but it does show in the Control Center. Am I doing something wrong?
Replies: 0 · Boosts: 4 · Views: 234 · Activity: 4w
ChatGPT audio in background
Dear Apple Development Team,

I'm writing to express my concerns and request a feature enhancement regarding the ChatGPT app for iOS. Currently, the app's audio functionality does not work when the app is in the background. This limitation significantly affects the user experience, particularly for those of us who rely on the app for ongoing, interactive voice conversations.

Given that many apps, particularly media and streaming services, are allowed to continue audio playback when minimized, it's frustrating that the ChatGPT app cannot do the same. This restriction interrupts the flow of conversation, forcing users to stay within the app to maintain an audio connection. For users who multitask on their iPhones, being able to switch between apps while continuing to listen or interact with ChatGPT is essential. The ability to reference notes, browse the web, or even respond to messages while maintaining an ongoing conversation with ChatGPT would greatly enhance the app's usability and align it with other background-capable apps.

I understand that Apple prioritizes resource management and device performance, but I believe there's a strong case for allowing apps like ChatGPT to operate with background audio. Given its growing importance as a tool for productivity, learning, and communication, adding this capability would provide significant value to users.

I hope you will consider this feedback for future updates to iOS, or provide guidance on any existing APIs that could be leveraged to enable such functionality.

Thank you for your time and consideration.

Best regards,
Luke

(Yes, I used GPT to write this.)
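On the question of existing APIs: iOS already allows third-party apps to continue audio in the background when they declare the audio background mode and keep an active audio session; whether a given app adopts it is up to that app's developer, not Apple. A minimal Info.plist fragment for an app that wants this capability (a sketch of the standard mechanism, not something the user of this app can change themselves):

```xml
<!-- Info.plist: declare the audio background mode so that playback
     can continue while the app is in the background. -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```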
Replies: 1 · Boosts: 0 · Views: 224 · Activity: 4w