Hello,
I am studying a possible scenario.
Let's say I create an App Clip that features a WKWebView.
The WKWebView hosts a sort of web app that uses local storage and IndexedDB.
When the complete app is installed, is the data persisted so that the WKWebView in the complete app finds it just as it was before?
Is the data migrated by the operating system to a new location but still accessed the same way by the WKWebView (or is the location even the same, because it is WKWebView's dedicated storage)? Or is it wiped out?
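In case it helps frame the question, this is the kind of check I was planning to run from both the App Clip and the full app to compare what each one can see (a minimal sketch using the default WKWebsiteDataStore, which is what a plain WKWebView uses; the function name is mine):

import WebKit

// Logs every record (local storage, IndexedDB, cookies, ...) in the default data store.
func dumpWebsiteData() {
    let store = WKWebsiteDataStore.default()
    store.fetchDataRecords(ofTypes: WKWebsiteDataStore.allWebsiteDataTypes()) { records in
        for record in records {
            print(record.displayName, record.dataTypes)
        }
    }
}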
Thank you in advance
Best regards
When building in Xcode 15.4 (debug), only some of the initial Views for the List are initialized. As you scroll, new ones are initialized and old ones are destroyed.
When building the same code in Xcode 16.2, ALL Views are initialized first and then immediately destroyed; as you scroll, new ones are initialized and old ones are destroyed.
MRE:
struct ContentView: View {
    private let arr = Array(1...5555)

    var body: some View {
        List(arr, id: \.self) {
            ListCellView(number: $0)
        }
    }
}

struct ListCellView: View {
    private let number: Int
    private let arr: [Int]
    @StateObject private var vm: ListCellViewModel // Just to print init/deinit

    init(number: Int) {
        print(#function, Self.self, number)
        self.arr = Array(0...number)
        self.number = number
        let vm = ListCellViewModel(number: number) // To see duplicates of init
        self._vm = StateObject(wrappedValue: vm) // because here is @autoclosure
    }

    var body: some View {
        let _ = print(#function, Self.self, number)
        Text("\(number)")
    }
}

class ListCellViewModel: ObservableObject {
    private let number: Int

    init(number: Int) {
        print(#function, Self.self, number)
        self.number = number
    }

    deinit {
        print(#function, Self.self, number)
    }
}
An example from a memory report with this code:
Fortunately, the behavior in release mode is the same as in Xcode 15.4. However, the double initialization of Views is still present and has not been fixed.
...
init(number:) ListCellView 42
init(number:) ListCellViewModel 42
init(number:) ListCellView 42
init(number:) ListCellViewModel 42
deinit ListCellViewModel 42
body ListCellView 42
Yes, unnecessary View initializations and extra body calls are "normal" for SwiftUI, but why do they happen twice here, with one of them discarded immediately?
One of my clients is interested in developing a system similar to BrowserStack for internal team usage. Could you please guide me on how to approach the development of this system?
Specifically, the project requires:
Full iPhone screen recording.
Capturing and executing click events on the iPhone.
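For the screen-recording part, a possible starting point (assuming recording from inside the app under test is acceptable) is ReplayKit's in-process capture, sketched below; as far as I know there is no public API for injecting touch events into arbitrary apps, so that part is normally driven through XCTest/XCUITest automation instead.

import ReplayKit

// Starts in-app screen capture; each sample buffer could be streamed to a server.
func startScreenCapture() {
    let recorder = RPScreenRecorder.shared()
    guard recorder.isAvailable else { return }
    recorder.startCapture(handler: { sampleBuffer, bufferType, error in
        if bufferType == .video, error == nil {
            // Encode/forward sampleBuffer here (e.g. over a socket) -- left out of this sketch.
        }
    }, completionHandler: { error in
        if let error { print("Capture failed to start:", error) }
    })
}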
Do I need to obtain permission from Apple for these functionalities?
Hello Apple Team,
I'm trying to import the Autodesk FBX SDK into my Objective-C iOS project.
The SDK is written in C++, but has support for iOS and the iOS simulator architectures.
I've added the path to the include folder in the Header Search Paths.
I've also added the paths to libfbxsdk.a in the Library Search Paths.
Finally, I've added the libfbxsdk.a file to the Link Binary With Libraries build phase.
However, when I build the project, I get the following error:
building for 'iOS', but linking in object file (/Users/Lond/Documents/v2/Autodesk/iOS/2020.3.7/lib/ios/debug/libfbxsdk.a[28](fbxalloc.cxx.o)) built for 'macOS'
In the terminal, if I type the command:
lipo -info libfbxsdk.a
I get the message
Non-fat file: libfbxsdk.a is architecture: arm64
confirming that I'm using the library for the correct architecture.
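That said, as far as I understand, lipo only reports the CPU architecture, not the platform, and an arm64 slice can just as well be a macOS (Apple Silicon) build. A way to check which platform the archive was actually built for (assuming the Xcode command-line tools are available) is to inspect its load commands:

otool -l libfbxsdk.a | grep -A4 LC_BUILD_VERSION

If that prints a macOS platform (or an LC_VERSION_MIN_MACOSX load command on older toolchains), the library is a macOS build even though it is arm64, which would match the linker error above.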
Do I need to add any other configuration option (like Other Linker Flags or something else)?
I'm quite new to C++, and integrating a C++ SDK into iOS is not easy.
I'm using macOS Sonoma 14.6.1
Tested on Xcode 15.4 and 16.2
Target Device: iPhone 13 Pro (iOS 17.6.1)
iOS FBX SDK version: 2020.3.7
Link to the SDK if needed:
https://aps.autodesk.com/developer/overview/fbx-sdk
Any help would be greatly appreciated
Thank you
I have two eSIMs from different providers. One shows data usage in Settings > Mobile Service > Mobile Data For …….
It shows 'Current Period' and 'Roaming' usage for this eSIM but not for the other one. I have contacted the provider's help desk and they say it's not enabled. I assume they mean they have not set it on the SIMs provided, or in the carrier bundle.
Is there any way of enabling this through the device using any software tool, etc.?
thanks in advance
I'm trying to persist a bookmark to an external device (mass storage controller connected via camera adapter) across disconnection / reconnection, but it is failing at startAccessingSecurityScopedResource.
The URL is initially retrieved using
UIDocumentPickerViewController *documentProvider;
documentProvider = [[UIDocumentPickerViewController alloc] initForOpeningContentTypes:[NSArray arrayWithObjects:UTTypeFolder, nil]];
documentProvider.delegate = self;
documentProvider.modalPresentationStyle = UIModalPresentationOverFullScreen;
[self presentViewController:documentProvider animated:YES completion:nil];
and then persisted to a bookmark using
DeviceBookmark = [url bookmarkDataWithOptions:NSURLBookmarkCreationMinimalBookmark includingResourceValuesForKeys:nil relativeToURL:nil error:nil]
When accessing the resource I use
NSURL *url = [NSURL URLByResolvingBookmarkData:DeviceBookmark options:NSURLBookmarkResolutionWithoutUI relativeToURL:nil bookmarkDataIsStale:&isStale error:&error]
to retrieve the new URL.
If I don't remove the MSC then the retrieved URL remains the same and functions as expected. If I remove and reconnect the MSC then the URL changes, I get true for isStale and nil for error but startAccessingSecurityScopedResource fails.
I've been banging my head against this for about a day now, but can't see what the issue can be. I've tried adding some related permissions to the entitlements, but this seems to be macOS related as far as I can tell.
What am I missing?!
When we launch the application and change the language from German/French to English or any other language, the app language changes accordingly, but the Bluetooth connection screen's Pair/Cancel alert still appears in the previously selected language. Since that alert is a system alert, is there any way to debug/resolve this issue?
I'm experiencing an inconsistent behavior with the App Tracking Transparency (ATT) prompt in my Cordova iOS app using the admob-plus-cordova and cordova-plugin-consent plugins.
Environment:
Cordova iOS app
Plugins: admob-plus-cordova, cordova-plugin-consent
iOS Simulator: 16.0
Physical device: iPhone 12, iOS 17.5.1
Xcode version: 16.2
Issue:
The ATT permission prompt appears correctly in the iOS Simulator but fails to show on physical devices. I've verified that:
Info.plist includes NSUserTrackingUsageDescription
The ATT request is triggered before initializing AdMob
The device is running iOS 14.5 or later
Expected behavior:
ATT prompt should appear on first launch on physical devices (as it does in the simulator)
Actual behavior:
ATT prompt appears correctly in the simulator
ATT prompt never appears on physical device
Troubleshooting steps tried:
Verified app hasn't previously requested ATT permission
Confirmed tracking is enabled in device Settings -> Privacy -> Tracking
Verified implementation order (ATT request before AdMob initialization)
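For reference, this is the native timing I understand the plugins need to reproduce (a Swift sketch with a hypothetical call site; the prompt reportedly only shows on a real device once the app is active in the foreground, so requesting too early during launch can prevent it from appearing):

import AppTrackingTransparency

// Request ATT only once the app is active; initialize AdMob after the user responds.
func requestTrackingIfNeeded() {
    guard ATTrackingManager.trackingAuthorizationStatus == .notDetermined else { return }
    ATTrackingManager.requestTrackingAuthorization { status in
        print("ATT status:", status.rawValue)
        // Initialize AdMob here.
    }
}

// Hypothetical call site: sceneDidBecomeActive(_:) or an observer of
// UIApplication.didBecomeActiveNotification, not application(_:didFinishLaunchingWithOptions:).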
Any insights on why this might be happening or additional debugging steps would be greatly appreciated.
I am trying to run TinyLlama directly using Swift Playgrounds for iOS. I have tried multiple solutions, like libraries (LLM.swift, swift-transformers, ...), which never worked due to import issues, and I also tried importing an exported mlmodel.
For the latter, I followed the article about Llama 3.1 on Core ML. It was hard to understand how to do the inference with it, but I was able to export an mlpackage, which I then placed in an Xcode project to generate the mlmodelc (compiled model) and the model class. I had to go with the first version described in the article, without optimizations, as I got errors during model loading with the flexible input shapes. I was able to run the model for one-token generation.
But my biggest problem is that, although the mlmodelc is only 550 MiB, the model uses 24+ GiB of memory, largely exceeding what I can have on an iOS device.
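For reference, the loading path is essentially the following sketch (the resource name is a placeholder; the computeUnits choice is just something to experiment with and does not by itself change the memory behavior):

import CoreML

// Load a compiled Core ML model (.mlmodelc) with an explicit configuration.
// "TinyLlama.mlmodelc" is a hypothetical name for the compiled model bundled as a resource.
func loadModel() throws -> MLModel {
    guard let url = Bundle.main.url(forResource: "TinyLlama", withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let config = MLModelConfiguration()
    config.computeUnits = .cpuAndNeuralEngine   // try .all / .cpuOnly to compare memory use
    return try MLModel(contentsOf: url, configuration: config)
}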
Is there a way to do LLM inference in Swift Playgrounds at a reasonable speed (even 1 token/s would be sufficient)?
We have developed a custom iOS framework called PaySDK. Earlier we distributed the framework as PaySDK.xcframework.zip through GitHub (Private repo) with two dependent xcframeworks.
Now, one of the clients asking to distribute the framework through Swift Package Manager.
I have created a new private repo in GitHub and created a new release (iOSSDK_SPM_Test) with tag 1.0.0. I uploaded the frameworks below as assets, updated the download paths in Package.swift, and pushed to the GitHub main branch.
PaySDK.xcframework.zip
PaySDKDependentOne.xcframework.zip
PaySDKDependentTwo.xcframework.zip
When I try to integrate (testing) the package (https://github.com/YuvaRepo/iOSSDK_SPM_Test) in Xcode, I am not able to download the frameworks; the download path is pointing to an old path (maybe a cache: https://github.com/YuvaRepo/iOSSDK_SPM/releases/download/1.2.0/PaySDK.xcframework.zip).
Package.swift:
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "iOSSDK_SPM_Test",
    platforms: [
        .iOS(.v13)
    ],
    products: [
        // Products define the executables and libraries a package produces, making them visible to other packages.
        .library(
            name: "iOSSDK_SPM_Test",
            targets: ["PaySDK", "PaySDKDependentOne", "PaySDKDependentTwo"]
        )
    ],
    targets: [
        // Targets are the basic building blocks of a package, defining a module or a test suite.
        // Targets can depend on other targets in this package and products from dependencies.
        .binaryTarget(
            name: "PaySDK",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDK.xcframework.zip",
            checksum: " checksum "
        ),
        .binaryTarget(
            name: "PaySDKDependentOne",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDKDependentOne.xcframework.zip",
            checksum: " checksum "
        ),
        .binaryTarget(
            name: "PaySDKDependentTwo",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDKDependentTwo.xcframework.zip",
            checksum: " checksum "
        ),
        .testTarget(
            name: "iOSSDK_SPM_TestTests",
            dependencies: ["PaySDK", "PaySDKDependentOne", "PaySDKDependentTwo"]
        )
    ]
)
Steps I followed:
I have tried the steps below:
Removed the local repo and cloned it again
rm -rf ~/Library/Caches/org.swift.swiftpm/
rm -rf ~/Library/Developer/Xcode/DerivedData/*
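In case it is relevant, I believe the checksum has to be recomputed for every newly uploaded zip, and Xcode caches resolved binary artifacts aggressively; something along these lines (the temporary path is arbitrary) is what I would also try in addition to the cache removal above:

# Recompute the checksum that goes into Package.swift for each uploaded asset
swift package compute-checksum PaySDK.xcframework.zip

# Force a fresh resolution instead of using cached artifacts
xcodebuild -resolvePackageDependencies -clonedSourcePackagesDirPath /tmp/spm-clean

Resetting via Xcode's File > Packages > Reset Package Caches should have a similar effect.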
Can anyone help identify and resolve the issue? Thanks in advance.
Hi all!
I have been experiencing some issues when using the AVAudioEngine to play audio and record input while doing a voice chat (through the PTT Interface).
I noticed that if I connect any players to the audio graph OR call start, the audio session becomes active (this is on iOS).
I don't see anything in the docs or the header files in AVFoundation, but is it possible that calling the stop method on an engine deactivates the audio session too?
In a normal app this behavior seems logical, but when using PTT all activation and deactivation of the audio session must go through the framework and its delegate methods.
The issue I am debugging: when the engine with the input node tapped gets stopped, and there is a gap between the input and when the server replies with inbound audio to be played, something seems to get the hardware/audio session into a jammed state.
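For context, the engine setup is roughly the following trimmed sketch (not the full PTT integration); the question is whether engine.stop() is also deactivating the audio session behind the scenes:

import AVFoundation

let engine = AVAudioEngine()

func startEngine() throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, time in
        // Forward captured audio to the voice chat / PTT pipeline.
    }
    engine.prepare()
    try engine.start()          // <- the audio session appears to go active here
}

func stopEngine() {
    engine.inputNode.removeTap(onBus: 0)
    engine.stop()               // <- does this implicitly deactivate the session?
}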
Thanks for any feedback and/or confirmation on this behavior!
I am developing a custom Picture in Picture (PiP) app that plays videos. The video continues to play even when the app goes to the background, but I would like to be able to get a Boolean value when the PiP window is stashed at the edge of the screen, as shown in the attached image. The reason we need this is that we don't want the user to consume extra network bandwidth, so we want to pause the video when the PiP window is hidden at the edge of the screen. Please let me know if this is possible, whether in the foreground or in the background.
Background:
We are using Apple WeatherKit in an app to fetch future weather data, specifically cloudCover, condition, and precipitationChance, from the overnight forecast data in the weatherResponse.dailyForecast.forecast array.
Code Sample:
In Xcode, using Swift and SwiftUI with Apple WeatherKit:
task {
    do {
        let weatherResponse = try await self.weatherService.forecast(for: latLong)
        let cloudCover = weatherResponse.dailyForecast.forecast[0].overnightforecast.cloudCover
        // ...
    } catch {
        // handle the error
    }
}
Issue:
While in debug mode, we can literally see that the overnightForecast.cloudCover data (as well as condition and precipitationChance) is available from index 0 to index 10 of the weatherResponse.dailyForecast.forecast array, but when we try to assign this property, it throws an error:
"overnightforecast object not present under weatherResponse.dailyForecast.forecast array object"
Attachment:
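For comparison, a minimal version of the daily fetch as we understand it (using WeatherService.shared and a CLLocation; whether the day-part overnightForecast property is exposed may depend on the SDK/OS version, which might be related to the error above):

import WeatherKit
import CoreLocation

// Minimal daily-forecast fetch; latLong is assumed to be a CLLocation.
func fetchDaily(for latLong: CLLocation) async throws {
    let daily = try await WeatherService.shared.weather(for: latLong, including: .daily)
    if let firstDay = daily.forecast.first {
        print(firstDay.condition, firstDay.precipitationChance)
        // firstDay.overnightForecast.cloudCover would go here if the SDK/OS exposes it.
    }
}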
Hello everyone,
I'm experiencing an issue with my app, and I would greatly appreciate any guidance. Here's the situation:
My app works flawlessly on the simulator for all iPhone models (16, 16 Pro, 16 Pro Max, SE, etc.).
It runs without issues on physical iPhone 11 devices (both mine and a friend's).
However, when my friend installs the app from the App Store on their iPhone 15 Pro or iPhone 16, it crashes immediately upon opening.
Steps I’ve taken so far:
Verified that the app works on the simulator for the same models where it crashes in real life.
Updated the app's minimum deployment target to iOS 18, but this did not resolve the issue. The app still crashes on the physical iPhone 15 Pro and 16 models while continuing to run fine on previous physical devices and all simulated ones.
Additionally, I had another friend beta test the app through TestFlight on their iPhone 15 Pro, and they received an alert message indicating that the app had crashed when they tried to open it. This confirms the issue still exists after my attempted fixes.
I'm unsure what could be causing this discrepancy between the simulated and physical devices, or what the alert message in TestFlight might signify. Has anyone encountered a similar issue, or does anyone have suggestions on what I should do to fix this?
Thanks in advance for your help!
Hello,
I've been using @Environment(\.dismiss) var dismiss in a SwiftUI app for the last 2 years, which means it was working as expected in iOS 16, iOS 17, and for the most part iOS 18, up until iOS 18.2.1, where it causes an endless loop and eventually a crash.
It seems to be something about using @Environment(\.dismiss) together with a NavigationLink that causes this issue.
When I add a log in my SwiftUI views with let _ = Self._printChanges(), I see the following printed out in a loop:
CurrentProjectView: _dismiss changed.
SurveyView: @self changed.
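For context, the pattern is roughly the following (heavily simplified; the real views are much larger): SurveyView pushes CurrentProjectView via a NavigationLink, and CurrentProjectView reads @Environment(\.dismiss):

import SwiftUI

struct SurveyView: View {
    var body: some View {
        NavigationStack {
            NavigationLink("Open project") {
                CurrentProjectView()
            }
        }
    }
}

struct CurrentProjectView: View {
    @Environment(\.dismiss) private var dismiss   // reading this seems to participate in the loop

    var body: some View {
        Button("Done") { dismiss() }
    }
}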
Similar issues have been reported:
https://forums.developer.apple.com/forums/thread/720803
https://forums.developer.apple.com/forums/thread/739512
Any idea how to resolve this?
Hi,
For the purposes of iteration speed in development builds, on an iPhone in development mode, I am attempting to use hot-reloaded dylibs. The goal is that the app rarely needs to be fully restarted and small code changes can be applied quickly, drastically reducing iteration time.
For this purpose I have a socket server on my Mac that sends changed dylibs to my app on my iPhone. This works great on the Mac; however, on iOS I am running into code-signing problems.
I am using the following to codesign the dylib:
codesign -f -s *** --timestamp=none testlibrary-ios.dylib
I am placing the downloaded dylib in this folder:
const char* cachedirectoryPath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES)[0] UTF8String];
dlopen gives me the following error:
dlopen(/var/mobile/Containers/Data/Application/67A3D31B-6F72-4939-9E7F-665FC78CDC61/Library/Caches/testlibrary-ios.dylib, 0x000A): tried: '/usr/lib/system/introspection/testlibrary-ios.dylib' (no such file, not in dyld cache), '/var/mobile/Containers/Data/Application/67A3D31B-6F72-4939-9E7F-665FC78CDC61/Library/Caches/testlibrary-ios.dylib' (code signature invalid in <78A101AD-D756-3526-8754-8B7F4925DE90> '/private/var/mobile/Containers/Data/Application/67A3D31B-6F72-4939-9E7F-665FC78CDC61/Library/Caches/testlibrary-ios.dylib' (errno=1) sliceOffset=0x00000000, codeBlobOffset=0x0000C2E0, codeBlobSize=0x00004990),
....
Is loading a dylib like this on iPhones in development mode possible?
Any idea what is going wrong with codesigning or installing the dylib?
(Obviously this code is never deployed in an app that goes on the App Store.)
I am trying to diagnose a very strange application crash that occurs intermittently under the following conditions
App built in release and installed on iPhone
App is in the background (e.g. close the app and open a different app without fully force quitting it)
When these conditions are present and I re-open the application by tapping its icon on the Home Screen, the app shows briefly on the screen and then immediately quits. This happens maybe 50% of the time when these conditions are present, but it does not create a crash report, and there are no jetsam reports from the time of the crash either.
I was able to capture a sysdiagnose immediately after the crash (within 3 seconds), and I have reviewed the logs to help me determine a possible cause, but none of the logs seem to be causing it.
I am putting some potentially relevant log lines below, and I am attaching the logarchive.
Additionally, the entire application is open source here on GitHub, and the crashing :( version of the app is available here on the App Store. Note this crash does not happen in the macOS version.
Finally, I saw a previous thread that recommended adding atexit {abort()} to an app that doesn't produce a crash report, so I added that here before collecting these logs and I still don't get a crash report.
Here are some log lines that may be relevant, but none of them provide a reason for the app termination.
>>> MY APP IS OPENED
default 2025-01-25 13:16:11.060118 -0500 runningboardd com.apple.runningboard monitor Calculated state for app<com.msdrigg.roam(95D1E2E9-9609-44D9-A30A-0C4AEA990A0D)>: running-active (role: UserInteractiveFocal) (endowments: <private>)
default 2025-01-25 13:16:11.060132 -0500 runningboardd com.apple.runningboard process [app<com.msdrigg.roam(95D1E2E9-9609-44D9-A30A-0C4AEA990A0D)>:1758] Set jetsam priority to 100 [0] flag[1]
default 2025-01-25 13:16:11.060132 -0500 runningboardd com.apple.runningboard ttl [app<com.msdrigg.roam(95D1E2E9-9609-44D9-A30A-0C4AEA990A0D)>:1758] Resuming task.
default 2025-01-25 13:16:11.060185 -0500 runningboardd com.apple.runningboard ttl [app<com.msdrigg.roam(95D1E2E9-9609-44D9-A30A-0C4AEA990A0D)>:1758] Set darwin role to: UserInteractiveFocal
info 2025-01-25 13:16:11.062002 -0500 CommCenter com.apple.CommCenter ul BundleID: com.msdrigg.roam is a foreground app
>>> XPC says something about XPC_ERROR_CONNECTION_INTERRUPTED
com.apple.mDNSResponder Default [R9386->Q40264] Question assigned DNS service 125
default 2025-01-25 13:16:11.067097 -0500 Roam com.apple.xpc connection [0x300b94900] Re-initialization successful; calling out to event handler with XPC_ERROR_CONNECTION_INTERRUPTED
default 2025-01-25 13:16:11.067152 -0500 Roam com.apple.runningboard monitor Received state update for 1758 (app<com.msdrigg.roam(95D1E2E9-9609-44D9-A30A-0C4AEA990A0D)>, unknown-NotVisible
info 2025-01-25 13:16:11.068357 -0500 Roam com.apple.coreaudio
>>>MY APP RUNS AND STARTS LOGGING ON ITS OWN
default 2025-01-25 13:16:11.109376 -0500 Roam com.msdrigg.roam ECPWebsocketClient Clearing handlers
default 2025-01-25 13:16:11.109378 -0500 Roam com.msdrigg.roam ECPWebsocketClient No longer in error b/c restarting
default 2025-01-25 13:16:11.109419 -0500 Roam com.msdrigg.roam ECPWebsocketClient Ignoring state change because it is the same connecting at 2025-01-25 18:16:11 +0000
>>> XPC Connection invalidated
default 2025-01-25 13:16:11.146441 -0500 runningboardd com.apple.runningboard process XPC connection invalidated: [app<com.msdrigg.roam(95D1E2E9-9609-44D9-A30A-0C4AEA990A0D)>:1758]
>>> Launchd reports app exit
default 2025-01-25 13:16:11.150861 -0500 launchd user/501/UIKitApplication:com.msdrigg.roam[6159][rb-legacy] [1758] exited due to SIGPIPE | sent by Roam[1758], ran for 4930203ms
default 2025-01-25 13:16:11.150876 -0500 launchd user/501/UIKitApplication:com.msdrigg.roam[6159][rb-legacy] [1758] service state: exited
Logs split due to size being too big :(
roam-crash.1.log
roam-crash.2.log
roam-crash.3.log
roam-crash.4.log
roam-crash.5.log
roam-crash.6.log
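Since launchd reports the exit as SIGPIPE, one experiment I am considering (a sketch, not a confirmed fix) is explicitly ignoring SIGPIPE early in launch so that a write to a dead socket surfaces as an EPIPE error instead of silently terminating the process:

import Darwin

// Call early in app startup (e.g. in the App initializer or application(_:didFinishLaunchingWithOptions:)).
// With SIGPIPE ignored, writes to a closed connection fail with EPIPE rather than killing the app.
func ignoreSigpipe() {
    signal(SIGPIPE, SIG_IGN)
}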
I want to understand which component types are intended to have hint text, haptic feedback, or an earcon associated with them for VoiceOver screen reader users. Is there a list somewhere, or an HIG guideline for which transition types should have a sound?
Some transitions in Apple apps generally include different beep sounds, such as
opening a new screen
screen dimming
when a VoiceOver user swipes from the header / navbar to the body
a scraping sound when swiping up or down a page.
the beginning or end of the body section
in Calculator when swiping from one row to the next.
opening a pop up menu
I would also appreciate any direction on which APIs or strings are associated with these sounds, and on how custom components can adopt these sounds, haptics, or hints where they are expected. On the other hand, I don't want to take that info and then dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
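For custom components, the pieces I am aware of are accessibilityHint for spoken hints and the UIAccessibility notifications, which trigger the corresponding VoiceOver earcons; a minimal sketch (names are mine, and I am not certain which notifications also carry haptics):

import UIKit

func configureCustomControl(_ control: UIView) {
    control.isAccessibilityElement = true
    control.accessibilityLabel = "Upload"
    control.accessibilityHint = "Double tap to upload the selected file"   // spoken hint
}

func notifyScreenChanged(to newRoot: UIView) {
    // Plays the VoiceOver screen-change earcon and moves focus to newRoot.
    UIAccessibility.post(notification: .screenChanged, argument: newRoot)
}

func announce(_ text: String) {
    // Spoken announcement without moving focus.
    UIAccessibility.post(notification: .announcement, argument: text)
}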
How to share the 'back facing' iOS camera with an app at the same time the Eye Tracking feature needs the 'front facing' camera?
While using my xmas present of a new iPhone and iOS 18.2, I figured I'd try the Eye Tracker app. I've been working with clients successfully using Tobii and other existing eye trackers. In my limited tests, Apple has room for improvement.
My main issue is with the camera app, which cannot be used while the Eye Tracker app is in use. I get an error popup from Apple:
Camera in use by another app
The image below is from my app showing the popup message "Camera in use by another app", but the same error occurs on the installed camera app. This error is from Apple, not my app.
For terminology: the 'front' camera is the one pointing at the user (the selfie camera), while the 'back' camera is the main one with multiple lenses. Eye tracking needs the 'front' camera.
It seems that when an app uses the camera, it takes over both the front- and back-facing cameras (since you might swap between them). Thus another app, especially Eye Tracking, cannot use just the front-facing camera at the same time.
That limits the use of Eye Tracking; in particular, one cannot take pictures or click any buttons in an app that uses the camera.
Does anyone know of a way for an app to not take over both front and back cameras at the same time? If I can separate them, the Eye Tracker could use the front camera while the camera app uses the back camera.
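For what it's worth, within my own app I can build the capture session with only the back camera added, as in the sketch below; what I have not been able to confirm is whether the system Eye Tracking feature can then keep using the front camera concurrently:

import AVFoundation

func makeBackCameraSession() -> AVCaptureSession? {
    let session = AVCaptureSession()
    guard let back = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
          let input = try? AVCaptureDeviceInput(device: back),
          session.canAddInput(input) else {
        return nil
    }
    session.addInput(input)    // only the back camera is added; the front camera is never requested
    return session
}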
Use case: When users open any URL in iOS Safari and tap the share button, the share extension can automatically receive the PDF version of the webpage from Safari.
We've observed that the system Markup app and the Apple Books app can automatically receive a PDF file from Safari. However, our custom share extension only receives the webpage URL.
While iOS Safari provides an "Options" button to manually share a webpage as a PDF, this feature is not intuitive and requires user education.
Could someone help to identify the correct NSExtensionActivationRule (or any other solution) that would allow our share extension to directly receive the PDF snapshot from Safari without requiring additional user actions?
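In case it helps, this is a sketch of the loading side we would use if Safari did hand over a PDF attachment (standard UTType identifiers; the function name is ours), alongside whatever NSExtensionActivationRule turns out to be required:

import UIKit
import UniformTypeIdentifiers

// Inside the share extension's principal view controller:
func loadSharedPDF(from context: NSExtensionContext?) {
    let items = context?.inputItems.compactMap { $0 as? NSExtensionItem } ?? []
    for provider in items.flatMap({ $0.attachments ?? [] }) {
        if provider.hasItemConformingToTypeIdentifier(UTType.pdf.identifier) {
            provider.loadItem(forTypeIdentifier: UTType.pdf.identifier, options: nil) { item, error in
                // Depending on the source app, item may be a file URL or Data.
                print("Received PDF item:", String(describing: item), String(describing: error))
            }
        }
    }
}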