Post

Replies

Boosts

Views

Activity

slow motion video playback "ramp-up" stage
In iOS, when I use AVPlayerViewController to play back a slow motion video, it has a "ramp-up" stage at the start and a "ramp-down" stage at the end, and the video plays at normal speed (i.e. not slow motion) during these stages. My question is: are these non-slow-motion stages defined in the video file itself (e.g. in some kind of metadata), or is this just a playback approach used by AVPlayerViewController? Thanks!
0
0
247
Oct ’24
assumesHTTP3Capable not working only on some iPhones
Hi, we are using HTTP/3 only and hence set assumesHTTP3Capable on every request. It has worked so far, but we have now encountered one iPhone that never honors this flag and always tries to create a connection over TCP:

    [tcp] tcp_input [C1:3] flags=[R.] seq=0, ack=2023568485, win=0 state=SYN_SENT rcv_nxt=0, snd_una=2023568484

The request is created like this:

    let url = URL(string: urlString)!
    var request = URLRequest(url: url, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 60.0)
    request.assumesHTTP3Capable = true
    return try await urlSession.data(for: request)

iOS: 16
Xcode: 15.3

In what cases would iOS CFNetwork not honor assumesHTTP3Capable? (Or how can I find out why?)
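One way to confirm, independently of packet logs, which protocol a given task actually negotiated is to inspect URLSessionTaskMetrics from a task delegate. A minimal sketch (the ProtocolLogger class name is mine):

```swift
import Foundation

// Sketch: log the protocol negotiated for each transaction of a task.
// "h3" means HTTP/3 was used; "h2" or "http/1.1" means the
// assumesHTTP3Capable hint was not honored for that connection.
final class ProtocolLogger: NSObject, URLSessionTaskDelegate {
    func urlSession(_ session: URLSession,
                    task: URLSessionTask,
                    didFinishCollecting metrics: URLSessionTaskMetrics) {
        for transaction in metrics.transactionMetrics {
            print("protocol:", transaction.networkProtocolName ?? "unknown")
        }
    }
}
```

Passing an instance of this as the session (or per-task) delegate on both the working and the failing iPhone would at least show whether the fallback happens at the TLS/QUIC layer or earlier.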
4
0
578
Jun ’24
How to find out network connection error details
Hi, when running my iOS app in Xcode, I got the following message in the console multiple times:

    [connection] nw_read_request_report [C1] Receive failed with error "Operation timed out"

It seems not critical, as my app still works, but how can I find out more details about the connection that printed this message? For example, the network request that caused it, or the URL?

Xcode: 15.3
iOS: 17
SwiftUI app
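The console line itself does not identify the request, but a task delegate can correlate failures with URLs. A sketch (the class name is mine; note that nw_read_request_report messages can also be internal retry noise that never surfaces as a task error):

```swift
import Foundation

// Sketch: log the URL of any task that fails, including timeouts.
final class RequestErrorLogger: NSObject, URLSessionTaskDelegate {
    func urlSession(_ session: URLSession,
                    task: URLSessionTask,
                    didCompleteWithError error: Error?) {
        guard let urlError = error as? URLError else { return }
        // URLError.Code.timedOut corresponds to "Operation timed out".
        print("failed:", task.originalRequest?.url?.absoluteString ?? "?",
              "code:", urlError.code == .timedOut ? "timed out" : "\(urlError.code)")
    }
}
```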
5
0
738
Jun ’24
How to make content view extend beneath NavigationBar like in a ZStack?
I have a view hierarchy like this:

    NavigationView {
        VStack {
            ScrollView {
                // some content
            }
            .background(NavigationLink(destination: MyView(), isActive: $isActive) {
                EmptyView()
            })
            // snipped
        }
    }

When the user follows the navigation link to MyView, by default the NavigationBar and MyView are laid out as in a VStack. How can I make them behave as in a ZStack(alignment: .topLeading)? I.e. I'd like MyView to extend beneath the NavigationBar. Here is MyView:

    GeometryReader { geo in
        HStack {
            // some content
        }
        .onAppear { hideNavigationbar = false }
        .onTapGesture { hideNavigationbar.toggle() }
        .navigationBarHidden(hideNavigationbar)
        .navigationBarTitleDisplayMode(.inline)
        // snipped
    }

The problem is, when hideNavigationbar toggles, MyView shifts up and down because its height changes. I want the NavigationBar to sit on top of MyView as in a ZStack, so that when the NavigationBar hides, MyView does not shift (only its top portion is uncovered). What is the proper way to achieve that?

Xcode: 15.3, using SwiftUI
iOS: 15
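One workaround (untested here, and not from the thread) is to keep the system bar hidden permanently and overlay a custom bar in a ZStack, so toggling it never changes the content's height. A sketch:

```swift
import SwiftUI

// Sketch: the content always fills the full height; a stand-in
// custom bar is overlaid on top and toggled without reflowing.
struct MyView: View {
    @State private var showBar = true
    var body: some View {
        ZStack(alignment: .topLeading) {
            HStack { Text("some content") }
                .frame(maxWidth: .infinity, maxHeight: .infinity)
            if showBar {
                Text("Title")   // stand-in for a custom bar
                    .frame(maxWidth: .infinity)
                    .background(Color.black.opacity(0.4))
            }
        }
        .contentShape(Rectangle())
        .onTapGesture { showBar.toggle() }
        .navigationBarHidden(true)  // system bar stays out of layout entirely
        .ignoresSafeArea(edges: .top)
    }
}
```

The trade-off is losing the system back button, which would also have to be re-created in the custom bar.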
1
0
373
Apr ’24
TLS For Accessory Developers: does it work for non-local network?
I'm following the approach in https://developer.apple.com/forums/thread/703234, section "Doing Even Better: Proper Security". My question is: does it work if the accessory is not on the local network (i.e. out on the Internet with an IP address)? I tried, and SecTrustEvaluateWithError(trust, nil) returns true, but TLS still fails:

    ATS failed system trust
    Connection 1: system TLS Trust evaluation failed(-9802)
    <snip>
    Domain=NSURLErrorDomain Code=-1200 "An SSL error has occurred and a secure connection to the server cannot be made." UserInfo={NSLocalizedRecoverySuggestion=Would you like to connect to the server anyway?,

Here is my code:

    var err = SecTrustSetPolicies(trust, SecPolicyCreateBasicX509())
    os_log("SecTrustSetPolicies returns \(err)")
    err = SecTrustSetAnchorCertificates(trust, [self.myCA] as NSArray)
    os_log("SecTrustSetAnchorCertificates returns \(err)")
    err = SecTrustSetAnchorCertificatesOnly(trust, true)
    os_log("SecTrustSetAnchorCertificatesOnly returns \(err)")

    // check the trust object
    let evalResult = SecTrustEvaluateWithError(trust, nil)
    os_log("SecTrust eval result: \(evalResult)")

    // create a credential with the accepted server trust.
    let credential = URLCredential(trust: trust)
    completionHandler(.useCredential, credential)

The logs are:

    SecTrustSetPolicies returns 0
    SecTrustSetAnchorCertificates returns 0
    SecTrustSetAnchorCertificatesOnly returns 0
    SecTrust eval result: true

Did I do anything wrong, or is this not supported outside the local network? Thanks.
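For context, the snippet above presumably runs inside the server-trust branch of the session delegate's challenge handler; a minimal wrapper under that assumption looks like:

```swift
import Foundation

// Sketch: only intercept server-trust challenges; defer all others
// (e.g. client certificates) to the default handling.
func urlSession(_ session: URLSession,
                didReceive challenge: URLAuthenticationChallenge,
                completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
    guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
          let trust = challenge.protectionSpace.serverTrust else {
        completionHandler(.performDefaultHandling, nil)
        return
    }
    // ... the SecTrustSetPolicies / SecTrustSetAnchorCertificates code
    // from the post operates on `trust` here ...
    completionHandler(.useCredential, URLCredential(trust: trust))
}
```

If the delegate method is never reached for the Internet-hosted server, the "ATS failed system trust" line suggests App Transport Security may be rejecting the connection before the challenge is delivered, which would be worth ruling out separately.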
2
0
715
Apr ’24
SwiftUI: dismissing `ActionSheet` dismisses its parent view
I have a view FooView in a navigation stack, presented via a NavigationLink. FooView has a button that triggers an ActionSheet. The problem is: when the ActionSheet is dismissed (either cancelled or an option selected), its parent view, i.e. FooView, is also dismissed, i.e. it pops off the navigation stack. Is there a way to prevent this? More importantly, why is this happening? What dismissed FooView?

iOS version: iOS 15 on iPhone
Xcode: 14.2

Here is a minimal reproducible sample. I noticed that if I move NavigationView from outside the TabView into ContentView, the issue goes away. But then the tabs show up even in FooView, hence I kept NavigationView outside the TabView. (And why does that matter?)

    import SwiftUI

    @main
    struct ViewDismissApp: App {
        var body: some Scene {
            WindowGroup {
                NavigationView {
                    TabView {
                        ContentView(title: "Tab 1")
                            .tabItem { Label("Tab1", systemImage: "wifi") }
                        ContentView(title: "Tab 2")
                            .tabItem { Label("Tab2", systemImage: "wrench.adjustable") }
                    }
                }
                .navigationViewStyle(.stack)
            }
        }
    }

    struct ContentView: View {
        let title: String
        var body: some View {
            VStack {
                NavigationLink {
                    FooView()
                } label: {
                    Text(title)
                }
            }
        }
    }

    struct FooView: View {
        @State private var showActions = false
        var body: some View {
            VStack {
                Text("Foo View")
                actionButton()
            }
        }

        private func actionButton() -> some View {
            Button(action: { showActions = true }) {
                Image(systemName: "arrow.up.circle")
            }
            .actionSheet(isPresented: $showActions) {
                ActionSheet(title: Text("Actions"), message: nil, buttons: [
                    .default(Text("Option 1")) {
                        // TODO
                    },
                    .default(Text("Option 2")) {
                        // TODO
                    },
                    .cancel()
                ])
            }
        }
    }

Thanks!
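Since this targets iOS 15, one thing worth trying (a sketch, not a confirmed fix) is the newer confirmationDialog modifier in place of the deprecated actionSheet, in case the unwanted pop is specific to the ActionSheet presentation path:

```swift
import SwiftUI

// Sketch: actionButton() rewritten with confirmationDialog (iOS 15+).
// showActions is the same @State flag as in the sample above.
struct ActionButton: View {
    @Binding var showActions: Bool
    var body: some View {
        Button(action: { showActions = true }) {
            Image(systemName: "arrow.up.circle")
        }
        .confirmationDialog("Actions", isPresented: $showActions) {
            Button("Option 1") { /* TODO */ }
            Button("Option 2") { /* TODO */ }
            Button("Cancel", role: .cancel) { }
        }
    }
}
```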
1
0
464
Feb ’24
Background URLSession http/3 URLSessionUploadTask not started in real device
I'm trying to upload large files in a background URLSession to a server that accepts HTTP/3 requests. By design, the server uses a self-signed SSL certificate. In my testing, I can create a regular (foreground) URLSession with the following:

    var request = URLRequest(url: dst, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 6.0)
    request.assumesHTTP3Capable = true
    request.httpMethod = "POST"
    self.session.dataTask(with: request) { <snip> }

and handle the SSL certificate challenge by implementing the following on the delegate:

    func urlSession(
        _ session: URLSession,
        didReceive challenge: URLAuthenticationChallenge,
        completionHandler: @escaping @Sendable (URLSession.AuthChallengeDisposition, URLCredential?) -> Void
    )

This approach works and I can communicate with the server. But when I try to apply the same to a background session with an upload task, the task seems to be stuck, i.e. it is never started by the system. More specifically, the background session upload task works in the iOS simulator, but on a real device connected to macOS, the task never starts and none of the delegate methods are called. The HTTP/3 server doesn't receive any request either. The background session is created like this:

    let appBundleName = Bundle.main.bundleURL.lastPathComponent.lowercased().replacingOccurrences(of: " ", with: ".")
    let sessionIdentifier: String = "com.networking.\(appBundleName)"
    let config = URLSessionConfiguration.background(withIdentifier: sessionIdentifier)
    self.session = URLSession(configuration: config, delegate: self, delegateQueue: nil)

Later, an upload task is created:

    var request = URLRequest(url: dst, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 6.0)
    request.assumesHTTP3Capable = true
    request.httpMethod = "POST"
    let backgroundTask = self.session.uploadTask(with: request, fromFile: src)
    backgroundTask.resume()

My questions: why is the background session upload task stuck? Is there a way to find out where it is stuck? Does a background URLSession support the app's delegate handling the SSL certificate challenge?

The test environment:
Xcode 14.2
macOS 12.6.7
iOS 15.7.7

Thanks!
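One way to probe where the task is stuck is to ask the session itself; a diagnostic sketch (`session` is the background session created above):

```swift
import Foundation

// Diagnostic sketch: dump the state of every task the background
// session knows about. A state of .running with zero bytes sent
// suggests the system is deferring the transfer; .suspended means
// resume() never took effect.
session.getAllTasks { tasks in
    for task in tasks {
        print("task \(task.taskIdentifier):",
              "state=\(task.state.rawValue)",
              "sent=\(task.countOfBytesSent)/\(task.countOfBytesExpectedToSend)",
              "error=\(String(describing: task.error))")
    }
}
```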
4
0
1.2k
Aug ’23
Cannot build Archive for a SwiftUI app that uses a XCFramework
Hi, my SwiftUI app uses a third-party framework (written in Objective-C, say with the name Foo) that is built into an XCFramework. I can build and run this app in the iOS simulator and on an iPhone connected to my Mac. But when I do "Product" -> "Archive", it fails with the error:

    Cannot find 'Foo' in scope

The error is reported anywhere that uses things like Foo.bar(). Why would Archive fail while the framework works in both the simulator and on a real device? What extra thing does Archive do?

Xcode version: 14.2
Target settings -> General -> Frameworks, Libraries and Embedded Content: "Embed & Sign".

Thanks.
1
0
643
Jul ’23
AVPlayer video track freeze after 1 second while audio continues
Hi, in my SwiftUI app I'm using AVPlayer and a UIViewRepresentable view to play back a video from a network server. There are many videos on the server, so I have a LazyHStack for the video thumbnails. Only one video is shown on screen at a time. When the user scrolls between videos, I create a new AVPlayerItem and replace the current player item in the AVPlayer. The problem is: the first video plays back without any problems, but when the user scrolls over to the next video, the new video plays for 1 second and then freezes while the audio track continues. The video track resumes after a few seconds. One note: if I drag the video view a bit without fully scrolling to another video, the video track resumes after I release the drag. So it seems maybe the AVPlayerLayer is not rendering? Related code pieces:

    struct PlayerView: UIViewRepresentable {
        let player: Player
        let width: CGFloat
        let height: CGFloat
        @Binding var updateCount: Int

        func makeUIView(context: Context) -> UIView {
            print("\(#function)")
            let playerLayer = AVPlayerLayer(player: player)
            let view = UIView()
            view.frame = CGRectMake(0, 0, width, height)
            playerLayer.frame = view.bounds
            view.layer.addSublayer(playerLayer)
            return view
        }

        func updateUIView(_ uiView: UIView, context: Context) {
            print("\(#function)")
            guard let playerLayer = uiView.layer.sublayers?.first as? AVPlayerLayer else { return }
            uiView.frame = CGRectMake(0, 0, width, height)
            if updateCount > 0 {
                playerLayer.player = player
                playerLayer.frame = uiView.bounds
            }
        }
    }

updateCount was added to force updateUIView to be called when the parent view increases the count, but it doesn't seem to help much. My question: is there any known issue with replacing the current item in AVPlayer in a SwiftUI app? Here is my code for replacing the current item:

    let asset = AVURLAsset(url: url)
    asset.resourceLoader.setDelegate(urlDelegate, queue: urlDelegate.resourceLoaderQueue)
    let assetKeys = ["playable", "hasProtectedContent"]
    let playerItem = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: assetKeys)
    player.replaceCurrentItem(with: playerItem)

Note that I have a custom delegate to handle the resource loading for the AVURLAsset. This delegate is the same for all videos, and they always play well when selected the first time (i.e. without replacing another video). One more data point: I have an Objective-C version of the same design, and it works without problems.
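To narrow down whether the stall is in buffering or in rendering, a diagnostic sketch observing the replaced item (the `observations` array must be kept alive somewhere, e.g. in the representable's coordinator):

```swift
import AVFoundation

// Sketch: if these report .readyToPlay and keep-up == true while the
// picture is frozen, the AVPlayerLayer is the suspect; otherwise it
// is a loading stall in the resource loader path.
var observations: [NSKeyValueObservation] = []
observations.append(playerItem.observe(\.status, options: [.new]) { item, _ in
    print("item status:", item.status.rawValue)
})
observations.append(playerItem.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { item, _ in
    print("likely to keep up:", item.isPlaybackLikelyToKeepUp)
})
```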
2
0
1.4k
May ’23
How to import an objective-c framework into a SwiftUI app in Xcode 13?
Hi, I followed some online guides to import an Objective-C framework into a SwiftUI app, but it did not work correctly. I am using Xcode 13.4. Here are the problems and my questions:

The Objective-C framework (Cronet from Chromium) has multiple versions: Debug-iphonesimulator, Debug-iphoneos, Release-iphoneos. But in Xcode, the SwiftUI app only has one target, i.e. no separate targets for Debug vs. Release, or iPhone device vs. simulator. Which version of the Objective-C framework should I import for the SwiftUI app target?

I tried to import the "Debug-iphonesimulator" version into the SwiftUI app. The import seems to be OK, i.e. it shows under "Frameworks", but if I click the framework name, there is no header file listing (see attached screenshots). Is there supposed to be a list of header files from the framework after importing? How can I use the framework if no header files are available in the project?

Thanks.
1
0
1.2k
Oct ’22
embedded AVPlayerViewController reads video partially but did not play
I'm using AVPlayerViewController to play back a remote video asset. It works when I use presentViewController:animated:completion: to pop up the controller, but it stops working when I embed the AVPlayerViewController in a UIView as follows:

    [avPlayer play];
    [videoView addSubview:playerController.view];
    playerController.view.frame = videoView.bounds;

I also tried moving [avPlayer play] after the frame assignment; that did not work either. The view shows a black background with a play button in the middle (see attached screenshot). In more detail, I use a delegate to handle the video asset reading via AVAssetResourceLoader setDelegate:queue:. The logs from the delegate show that when embedding the AVPlayerViewController, the player did read the video URL a few times (4 requests) but then stopped for no apparent reason. By a request, I mean the following callback was called:

    resourceLoader:shouldWaitForLoadingOfRequestedResource:

In the failed case, the last request tries to read the last part of the video even though it has not read the previous parts yet:

    <AVAssetResourceLoadingDataRequest: 0x60000163be40, requested offset = 8842658, requested length = 4156, requests all data to end of resource = YES, current offset = 8842658>>

In contrast, when using presentViewController to play back successfully, the last request reads a much larger length of the video, through to the end:

    <AVAssetResourceLoadingDataRequest: 0x6000005b4620, requested offset = 15884, requested length = 8765940, requests all data to end of resource = NO, current offset = 15884>>

Is there any known issue with embedding AVPlayerViewController in a UIView, especially when using a custom delegate to handle the URL asset reads? Thanks!
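One thing worth checking (an assumption on my part, not confirmed as the cause here): AVPlayerViewController is a view controller, so embedding its view is normally expected to go through full view-controller containment rather than addSubview: alone. A sketch of the containment calls, translated to Swift (`parentViewController` stands in for whichever controller owns videoView):

```swift
import AVKit

// Sketch: proper child view controller containment when embedding
// AVPlayerViewController's view inside videoView.
parentViewController.addChild(playerController)
videoView.addSubview(playerController.view)
playerController.view.frame = videoView.bounds
playerController.didMove(toParent: parentViewController)
avPlayer.play()
```

Without addChild/didMove(toParent:), the controller may miss appearance callbacks, which could plausibly leave it in the "poster with play button" state described above.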
1
0
1.3k
Jul ’22
how to play back a slow motion video via URL in iOS?
I am trying to support video playback for remote video files in iOS. It works for a regular video, but there is a problem with slow motion videos: the slow motion effect is lost, i.e. a slow motion video plays back like a regular-speed video. Here is what I am doing:

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [asset.resourceLoader setDelegate:self.urlDelegate
                                queue:[self.urlDelegate getDispatchQueue]];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
    controller.player = player;
    [player play];
    [self presentViewController:controller animated:false completion:^{
        <snipped>
    }];

Note, the url points to a QuickTime (.MOV) video file on an HTTP(S) server. Again, the above code plays a slow motion video just like a regular video, without any slow motion. Is it because AVURLAsset does not support slow motion video? What am I missing to make AVPlayer play slow motion? I am targeting iOS 12 and above. Thanks!
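For context (not from the post): as far as I know, slow-motion clips recorded on iPhone store their rate ramps as Photos-library metadata (the Photos app plays them via a retimed AVComposition), while the exported .MOV itself just contains high-frame-rate video, so AVURLAsset plays the frames at normal speed. If the ramp ranges are known from elsewhere, the retiming can be reproduced manually with a composition. A sketch in Swift that slows an entire clip to 1/4 speed (a real slow-mo ramp would scale only a sub-range):

```swift
import AVFoundation

// Sketch: retime an asset by scaling its time range. `asset` is the
// AVURLAsset from the post; error handling is elided.
let composition = AVMutableComposition()
let fullRange = CMTimeRange(start: .zero, duration: asset.duration)
try composition.insertTimeRange(fullRange, of: asset, at: .zero)
// Stretch the whole clip to 4x its duration, i.e. 1/4 playback speed.
composition.scaleTimeRange(CMTimeRange(start: .zero, duration: composition.duration),
                           toDuration: CMTimeMultiply(composition.duration, multiplier: 4))
let playerItem = AVPlayerItem(asset: composition)
```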
1
0
1.7k
Jul ’22
Cannot set constraints for an image view inside a scroll view
Hi, I was not able to set constraints for an image view inside a scroll view in Xcode. Any ideas why? Here are the details:

The reason for doing this: Xcode complains that the scroll view is missing constraints for its "scrollable content"; please see the attached screenshot.
Xcode version: 12.5.1
macOS: Big Sur
View relationship: the scroll view is the superview of the image view.
I tried to use "ctrl-drag" between the image view and the scroll view; no constraint options are available.
The scroll view is in a Collection View cell. Please see the attached screenshot.
9
0
2.9k
Feb ’22
Embed a framework for different platforms
Is there a way to embed a framework for different platforms (device vs. iOS simulator) using one Xcode target? (Xcode version is 11.3.) I know that I can add different paths for device vs. simulator under Target -> Build Settings -> Search Paths -> Framework Search Paths. But under Target -> Build Phases -> Embed Frameworks, when I tried to add the framework twice, under different paths, the build fails. What should I do without creating a separate target? Here is a related question, but it did not get a detailed answer about what exactly should be done: https://forums.developer.apple.com/thread/66978
2
0
1.1k
Feb ’20
AVAssetResourceLoadingRequest dataRequest issue
Hi, I'm using AVPlayer and an AVAssetResourceLoaderDelegate to handle custom video playback (streaming) from a remote server. The delegate uses its own NSURLSession to create a data task that retrieves the video from the remote server and feeds the data to the AVAssetResourceLoader. The first loadingRequest I get is good: it's a contentInformationRequest, and I do the following (contentType will be "video/mp4"):

    request.contentInformationRequest.byteRangeAccessSupported = true;
    request.contentInformationRequest.contentType = response.MIMEType;
    request.contentInformationRequest.contentLength = totalLength;

After that I get the first data request. However, the data request always has requestsAllDataToEndOfResource set to true and requestedLength set to totalLength, as in this debug output:

    data request = requested offset = 0, requested length = 28875525, requests all data to end of resource = YES, current offset = 0>

I was expecting this data request to only ask for a range of data, especially as it's the first data request (after the contentInformationRequest). Why is that?

My code then continues to handle this: as it gets more data from the remote server asynchronously, it gives the data to the data request:

    - (void)URLSession:(NSURLSession *)session dataTask:(NSURLSessionDataTask *)dataTask didReceiveData:(NSData *)data {
        [self.currentRequest.dataRequest respondWithData:respondData];

However, after the above callback responds to the dataRequest a few times (3 or 4), we hit the second issue: the loadingRequest is cancelled:

    data request = requested offset = 0, requested length = 28875525, requests all data to end of resource = YES, current offset = 25284>

We can see that the request's currentOffset is moving forward but is far from finished. What could trigger the cancel? (The user did not cancel it via the AVPlayer UI.)

So, we have two questions:
1) Why does the first data request always ask for the whole length of data?
2) What could trigger cancelling the loadingRequest while we are feeding data to the data request?

Thanks.
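For what it's worth, in my experience the open-ended first request and mid-stream cancels are normal AVFoundation behavior: it often issues one "all data to end" request, then cancels it and reissues ranged requests once it knows the content length or after a seek, so a cancel is not necessarily an error as long as the delegate also implements resourceLoader:didCancelLoadingRequest: and stops feeding that request. A sketch of incremental feeding in Swift (the post's delegate is Objective-C):

```swift
import AVFoundation

// Sketch: respond to a data request chunk by chunk and finish it only
// when a bounded request has been fully satisfied. Open-ended requests
// (requestsAllDataToEndOfResource == true) are typically cancelled by
// AVFoundation itself rather than finished by the delegate.
func feed(_ loadingRequest: AVAssetResourceLoadingRequest, with data: Data) {
    guard let dataRequest = loadingRequest.dataRequest else { return }
    dataRequest.respond(with: data)
    let delivered = dataRequest.currentOffset - dataRequest.requestedOffset
    if !dataRequest.requestsAllDataToEndOfResource,
       delivered >= Int64(dataRequest.requestedLength) {
        loadingRequest.finishLoading()
    }
}
```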
5
0
2.7k
Feb ’20