Posts

Post marked as solved
1 Replies
240 Views
I'm following the approach in https://developer.apple.com/forums/thread/703234, section "Doing Even Better: Proper Security". My question is: does it still work if the accessory is not on the local network (i.e. it is out on the Internet with a public IP address)? I tried it: SecTrustEvaluateWithError(trust, nil) returns true, but TLS still fails:

ATS failed system trust
Connection 1: system TLS Trust evaluation failed(-9802)
<snip>
Domain=NSURLErrorDomain Code=-1200 "An SSL error has occurred and a secure connection to the server cannot be made." UserInfo={NSLocalizedRecoverySuggestion=Would you like to connect to the server anyway?,

Here is my code:

var err = SecTrustSetPolicies(trust, SecPolicyCreateBasicX509())
os_log("SecTrustSetPolicies returns \(err)")

err = SecTrustSetAnchorCertificates(trust, [self.myCA] as NSArray)
os_log("SecTrustSetAnchorCertificates returns \(err)")

err = SecTrustSetAnchorCertificatesOnly(trust, true)
os_log("SecTrustSetAnchorCertificatesOnly returns \(err)")

// check the trust object
let evalResult = SecTrustEvaluateWithError(trust, nil)
os_log("SecTrust eval result: \(evalResult)")

// create a credential with the accepted server trust.
let credential = URLCredential(trust: trust)
completionHandler(.useCredential, credential)

The logs are:

SecTrustSetPolicies returns 0
SecTrustSetAnchorCertificates returns 0
SecTrustSetAnchorCertificatesOnly returns 0
SecTrust eval result: true

Did I do anything wrong, or is this simply not supported outside the local network? Thanks.
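For completeness, here is roughly the shape of the URLSession delegate callback that the snippet above lives in. This is a paraphrased sketch, not my exact code: the class name AccessorySessionDelegate is made up, and the guard / completion-handler plumbing is the standard pattern rather than a copy from my project.

```swift
import Foundation
import Security

// Sketch of the delegate the snippet above lives in. myCA is the accessory
// vendor's root certificate, loaded elsewhere.
final class AccessorySessionDelegate: NSObject, URLSessionDelegate {
    let myCA: SecCertificate

    init(myCA: SecCertificate) {
        self.myCA = myCA
    }

    func urlSession(
        _ session: URLSession,
        didReceive challenge: URLAuthenticationChallenge,
        completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void
    ) {
        // Only handle server-trust challenges; defer everything else to the system.
        guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
              let trust = challenge.protectionSpace.serverTrust else {
            completionHandler(.performDefaultHandling, nil)
            return
        }

        // Anchor the evaluation to my own CA, as in the snippet above
        // (status codes ignored here for brevity; the original logs them).
        _ = SecTrustSetPolicies(trust, SecPolicyCreateBasicX509())
        _ = SecTrustSetAnchorCertificates(trust, [myCA] as NSArray)
        _ = SecTrustSetAnchorCertificatesOnly(trust, true)

        if SecTrustEvaluateWithError(trust, nil) {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}
```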
Post not yet marked as solved
1 Replies
245 Views
I have a view, FooView, in a navigation stack, presented via a NavigationLink. FooView has a button that triggers an ActionSheet. The problem: when the ActionSheet is dismissed (whether cancelled or an option is selected), its parent view, i.e. FooView, is also dismissed, i.e. it pops off the navigation stack. Is there a way to prevent that? More importantly, why is this happening? What dismissed FooView?

iOS version: iOS 15 on iPhone
Xcode: 14.2

Here is a minimal reproducible sample. I noticed that if I move the NavigationView from outside the TabView into ContentView, the issue goes away, but then the tabs show up even in FooView, which is why I kept the NavigationView outside the TabView. (And why does that matter?)

import SwiftUI

@main
struct ViewDismissApp: App {
    var body: some Scene {
        WindowGroup {
            NavigationView {
                TabView {
                    ContentView(title: "Tab 1")
                        .tabItem {
                            Label("Tab1", systemImage: "wifi")
                        }
                    ContentView(title: "Tab 2")
                        .tabItem {
                            Label("Tab2", systemImage: "wrench.adjustable")
                        }
                }
            }
            .navigationViewStyle(.stack)
        }
    }
}

struct ContentView: View {
    let title: String

    var body: some View {
        VStack {
            NavigationLink {
                FooView()
            } label: {
                Text(title)
            }
        }
    }
}

struct FooView: View {
    @State private var showActions = false

    var body: some View {
        VStack {
            Text("Foo View")
            actionButton()
        }
    }

    private func actionButton() -> some View {
        Button(action: { showActions = true }) {
            Image(systemName: "arrow.up.circle")
        }
        .actionSheet(isPresented: $showActions) {
            ActionSheet(title: Text("Actions"), message: nil, buttons: [
                .default(Text("Option 1")) {
                    // TODO
                },
                .default(Text("Option 2")) {
                    // TODO
                },
                .cancel()
            ])
        }
    }
}

Thanks!
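For reference, here is roughly what I mean by moving the NavigationView inside the TabView (the variant where the ActionSheet no longer pops FooView, but the tab bar stays visible inside FooView). This is a sketch from memory, reusing ContentView from the sample above, not my exact code:

```swift
import SwiftUI

// Variant sketch: one NavigationView per tab instead of a single NavigationView
// wrapping the whole TabView. With this layout the ActionSheet dismissal no
// longer pops FooView, but the tab bar shows up inside FooView as well.
// (@main omitted so it doesn't clash with the sample above.)
struct ViewDismissVariantApp: App {
    var body: some Scene {
        WindowGroup {
            TabView {
                NavigationView {
                    ContentView(title: "Tab 1")
                }
                .navigationViewStyle(.stack)
                .tabItem { Label("Tab1", systemImage: "wifi") }

                NavigationView {
                    ContentView(title: "Tab 2")
                }
                .navigationViewStyle(.stack)
                .tabItem { Label("Tab2", systemImage: "wrench.adjustable") }
            }
        }
    }
}
```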
Post not yet marked as solved
1 Replies
1.4k Views
I am trying to support playback of remote video files in iOS. It works for regular videos, but slow motion videos lose their slow motion effect: a slow motion video plays back like a regular-speed video. Here is what I am doing:

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset.resourceLoader setDelegate:self.urlDelegate
                            queue:[self.urlDelegate getDispatchQueue]];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
controller.player = player;
[player play];
[self presentViewController:controller animated:false completion:^{
    <snipped>
}];

Note that the url points to a QuickTime (.MOV) video file on an HTTP(S) server. Again, the above code plays a slow motion video just like a regular video, without any slow motion. Is it because AVURLAsset does not support slow motion video? What am I missing to get AVPlayer to play slow motion? I am targeting iOS 12 and above. Thanks!
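As a side note, to check what the remote .MOV actually contains, I dumped the video track's nominal frame rate. This is a Swift sketch (my project is Objective-C) and logFrameRate is just a hypothetical helper; as far as I understand, the slow-motion "ramp" of a Photos capture lives in composition/metadata rather than in the movie file itself, so this only confirms whether the file is a plain high-frame-rate recording:

```swift
import AVFoundation

// Hypothetical helper: log the nominal frame rate of the remote video.
// iOS 12 compatible (uses loadValuesAsynchronously rather than async/await).
func logFrameRate(of url: URL) {
    let asset = AVURLAsset(url: url)
    asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
        var error: NSError?
        guard asset.statusOfValue(forKey: "tracks", error: &error) == .loaded,
              let track = asset.tracks(withMediaType: .video).first else {
            print("could not load video tracks: \(String(describing: error))")
            return
        }
        print("nominal frame rate: \(track.nominalFrameRate) fps")
    }
}
```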
Post not yet marked as solved
4 Replies
834 Views
I'm trying to upload large files in a background URLSession to a server that accepts HTTP/3 requests. By design, the server uses a self-signed SSL certificate. In my testing, I can create a regular (foreground) URLSession with the following:

var request = URLRequest(url: dst, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 6.0)
request.assumesHTTP3Capable = true
request.httpMethod = "POST"
self.session.dataTask(with: request) { <snip> }

and handle the SSL certificate challenge by implementing the following delegate method:

func urlSession(
    _ session: URLSession,
    didReceive challenge: URLAuthenticationChallenge,
    completionHandler: @escaping @Sendable (URLSession.AuthChallengeDisposition, URLCredential?) -> Void
)

This approach works and I can communicate with the server. But when I apply the same to a background session with an upload task, the task seems to be stuck, i.e. it is never started by the system. More specifically, the background session upload task works in the iOS simulator, but on a real device connected to macOS the task never starts, none of the delegate methods are called, and the HTTP/3 server doesn't receive any request either.

The background session is created like this:

let appBundleName = Bundle.main.bundleURL.lastPathComponent.lowercased().replacingOccurrences(of: " ", with: ".")
let sessionIdentifier: String = "com.networking.\(appBundleName)"
let config = URLSessionConfiguration.background(withIdentifier: sessionIdentifier)
self.session = URLSession(configuration: config, delegate: self, delegateQueue: nil)

Later, an upload task is created:

var request = URLRequest(url: dst, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 6.0)
request.assumesHTTP3Capable = true
request.httpMethod = "POST"
let backgroundTask = self.session.uploadTask(with: request, fromFile: src)
backgroundTask.resume()

My questions: Why is the background session upload task stuck? Is there a way to find out where it is stuck? Does a background URLSession support the app's delegate handling the SSL certificate challenge?

The test environment:
Xcode 14.2
macOS 12.6.7
iOS 15.7.7

Thanks!
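For completeness, here is roughly how the delegate side is wired up (a simplified sketch; the class name UploadSessionDelegate is made up, and for this test it simply accepts the server's self-signed certificate rather than pinning it). On the real device, neither callback below ever fires for the background upload task:

```swift
import Foundation

final class UploadSessionDelegate: NSObject, URLSessionDataDelegate {
    // Session-level challenge: accept the self-signed certificate for this
    // test setup (a production build should pin or properly validate it).
    func urlSession(
        _ session: URLSession,
        didReceive challenge: URLAuthenticationChallenge,
        completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void
    ) {
        if challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
           let trust = challenge.protectionSpace.serverTrust {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            completionHandler(.performDefaultHandling, nil)
        }
    }

    // Task completion: never reached on the device in the failing case.
    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        print("task \(task.taskIdentifier) completed, error: \(String(describing: error))")
    }
}
```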
Post not yet marked as solved
1 Replies
433 Views
Hi,

My SwiftUI app uses a third-party framework (written in Objective-C, call it Foo) that is packaged as an XCFramework. I can build and run this app in the iOS simulator and on an iPhone connected to my Mac. But when I do "Product" -> "Archive", it fails with the error:

Cannot find 'Foo' in scope

The error is reported everywhere that uses things like Foo.bar(). Why would Archive fail while both the simulator and a real device work with the framework? What extra work does Archive do?

Xcode version: 14.2
Target setting -> General -> Frameworks, Libraries, and Embedded Content: "Embed & Sign".

Thanks.
Post not yet marked as solved
2 Replies
1k Views
Hi,

In my SwiftUI app, I'm using AVPlayer and a UIViewRepresentable view to play back videos from a network server. There are many videos on the server, so I have a LazyHStack for the video thumbnails. Only one video is shown on screen at a time. When the user scrolls between videos, I create a new AVPlayerItem and replace the current item in the AVPlayer.

The problem: the first video plays back without any problems, but when the user scrolls over to the next video, the new video plays for about a second and then the video freezes while the audio track continues. The video track resumes after a few seconds. One note: if I drag the video view a bit without fully scrolling to another video, the video track resumes after I release the drag. So it seems the AVPlayerLayer is somehow not rendering?

Related code:

struct PlayerView: UIViewRepresentable {
    let player: Player
    let width: CGFloat
    let height: CGFloat
    @Binding var updateCount: Int

    func makeUIView(context: Context) -> UIView {
        print("\(#function)")
        let playerLayer = AVPlayerLayer(player: player)
        let view = UIView()
        view.frame = CGRectMake(0, 0, width, height)
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        print("\(#function)")
        guard let playerLayer = uiView.layer.sublayers?.first as? AVPlayerLayer else {
            return
        }
        uiView.frame = CGRectMake(0, 0, width, height)
        if updateCount > 0 {
            playerLayer.player = player
            playerLayer.frame = uiView.bounds
        }
    }
}

updateCount was added to force updateUIView to be called when the parent view increments it, but that does not seem to help much.

My question: is there any known issue with replacing the current item of an AVPlayer in a SwiftUI app? Here is my code for replacing the current item:

let asset = AVURLAsset(url: url)
asset.resourceLoader.setDelegate(urlDelegate, queue: urlDelegate.resourceLoaderQueue)
let assetKeys = ["playable", "hasProtectedContent"]
let playerItem = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: assetKeys)
player.replaceCurrentItem(with: playerItem)

Note that I have a custom delegate that handles resource loading for the AVURLAsset. The same delegate is used for all videos, and every video plays well when it is the first one selected (i.e. not replacing another video).

One more data point: I have an Objective-C version of the same design, and it works without problems.
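For context, here is roughly how PlayerView is driven from the scrolling parent. This is a simplified sketch, not my exact code; the VideoPage name is made up, it reuses the PlayerView above, and the Player type is the same one PlayerView uses:

```swift
import SwiftUI

// Simplified sketch of the page that hosts PlayerView (defined above): one page
// per video in the LazyHStack. updateCount is bumped when the page appears so
// that updateUIView runs and re-attaches the player to the AVPlayerLayer.
struct VideoPage: View {
    let player: Player                 // same player type as PlayerView above
    @State private var updateCount = 0

    var body: some View {
        GeometryReader { geo in
            PlayerView(player: player,
                       width: geo.size.width,
                       height: geo.size.height,
                       updateCount: $updateCount)
        }
        .onAppear { updateCount += 1 }
    }
}
```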
Post not yet marked as solved
1 Replies
2.4k Views
Hi, I am writing my own HTTP web service server that lets iOS devices view video files. On iOS I am using AVPlayer, and on the server side the content is simply .mp4 files. But it seems that AVPlayer buffers for a long time (10+ seconds) before starting to play a video.

After a bit of debugging, it seems that AVPlayer sends odd Range values in its HTTP requests for the video. For example:

The 1st request asks for range 0-1 (this is OK):

X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=0-1
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive

The 2nd request asks for range 0-37724087, which is the full length of the video. This is odd: why doesn't AVPlayer take advantage of the Range header and ask for only a small range?

X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=0-37724087
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive

The 3rd request asks for range 37683200-37724087, which is the last chunk of the video. Again very odd: why does AVPlayer ask for a range that was already covered by the previous request?

X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=37683200-37724087
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive

Any ideas? Any comments or pointers are appreciated.
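For reference, this is roughly how my server interprets those Range headers (sketched in Swift here regardless of the server's actual implementation language; parseByteRange is a simplified stand-in for the real handler, which uses the parsed range to build a 206 Partial Content response with a matching Content-Range header):

```swift
import Foundation

// Simplified stand-in for my server's Range parsing: parse "bytes=start-end"
// and clamp it to the file size. Not the real handler, just the idea.
func parseByteRange(_ header: String, fileLength: Int) -> ClosedRange<Int>? {
    guard header.hasPrefix("bytes=") else { return nil }
    let parts = header.dropFirst("bytes=".count)
        .split(separator: "-", omittingEmptySubsequences: false)
    guard parts.count == 2, let start = Int(parts[0]) else { return nil }
    let end = Int(parts[1]) ?? (fileLength - 1)   // "bytes=100-" means "to the end"
    guard start <= end, start < fileLength else { return nil }
    return start...min(end, fileLength - 1)
}

// The three requests from the logs above (file length 37724088 bytes):
// parseByteRange("bytes=0-1", fileLength: 37724088)                -> 0...1
// parseByteRange("bytes=0-37724087", fileLength: 37724088)         -> 0...37724087
// parseByteRange("bytes=37683200-37724087", fileLength: 37724088)  -> 37683200...37724087
```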
Post not yet marked as solved
1 Replies
1k Views
Hi, I followed some online guides to import an Objective-C framework into a SwiftUI app, but it did not work correctly. I am using Xcode 13.4. Here are the problems and my questions:

The Objective-C framework (Cronet from Chromium) comes in multiple builds:
Debug-iphonesimulator
Debug-iphoneos
Release-iphoneos

But in Xcode, the SwiftUI app only has one target, i.e. there are no separate targets for Debug vs. Release, or iPhone device vs. simulator. Which build of the Objective-C framework should I import for the SwiftUI app target?

I tried to import the "Debug-iphonesimulator" build into the SwiftUI app. The import seems to be OK, i.e. it shows up under "Frameworks", but if I click the framework name, no header files are listed (see attached screenshots). Is there supposed to be a list of header files from the framework after importing? How can I use the framework if no header files are available in the project?

Thanks.
Post not yet marked as solved
1 Replies
1.1k Views
I'm using AVPlayerViewController to play back a remote video asset. It works when I use presentViewController:animated:completion: to pop up the controller, but it stops working when I embed the AVPlayerViewController's view in a UIView as follows:

[avPlayer play];
[videoView addSubview:playerController.view];
playerController.view.frame = videoView.bounds;

I also tried moving [avPlayer play] to after the view frame is set; that did not work either. The view shows a black background with a play button in the middle (see attached screenshot).

In more detail, I use a delegate to handle the video asset reading via AVAssetResourceLoader setDelegate:queue:. The delegate's logs show that when embedding the AVPlayerViewController, the player did read the video URL a few times (4 requests) but then stopped for no apparent reason. By a request, I mean the following callback being called:

resourceLoader:shouldWaitForLoadingOfRequestedResource:

In the failed case, the last request tries to read the last part of the video even though it has not read the previous parts yet:

<AVAssetResourceLoadingDataRequest: 0x60000163be40, requested offset = 8842658, requested length = 4156, requests all data to end of resource = YES, current offset = 8842658>>

In contrast, when using presentViewController to play back successfully, the last request reads a much larger length of the video, through to the end:

<AVAssetResourceLoadingDataRequest: 0x6000005b4620, requested offset = 15884, requested length = 8765940, requests all data to end of resource = NO, current offset = 15884>>

Is there any known issue with embedding AVPlayerViewController in a UIView, especially when using a custom delegate to handle the URL asset reads? Thanks!
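For comparison, a minimal containment-based embedding would look roughly like this (a Swift sketch with a hypothetical helper, not my original code, which only calls addSubview). I'm not sure whether the missing containment calls are related to the stalled resource loading, though:

```swift
import AVKit
import UIKit

extension UIViewController {
    // Hypothetical helper: embed an AVPlayerViewController using standard
    // view-controller containment (addChild / didMove), then start playback.
    func embedPlayer(_ playerController: AVPlayerViewController, in videoView: UIView) {
        addChild(playerController)
        videoView.addSubview(playerController.view)
        playerController.view.frame = videoView.bounds
        playerController.didMove(toParent: self)
        playerController.player?.play()
    }
}
```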
Post not yet marked as solved
8 Replies
2.3k Views
Hi, I was not able to set constraints for an image view inside a scroll view in Xcode. Any ideas why? Here are the details:

The reason for doing this: Xcode complains that the scroll view is missing constraints for its "scrollable content"; please see the attached screenshot.
Xcode version: 12.5.1
macOS: Big Sur
View relationship: the scroll view is the superview of the image view.
I tried to use ctrl-drag between the image view and the scroll view, but no constraint options are offered.
The scroll view is in a collection view cell. Please see the attached screenshot.
Post not yet marked as solved
4 Replies
1.2k Views
According to Wikipedia, Wi-Fi TDLS (Tunneled Direct Link Setup) has been around for a number of years. Does the iPhone (any model) support TDLS? If yes, how does it work? Does it need any involvement from the app, or is it fully automatic? Thanks in advance.
Post not yet marked as solved
2 Replies
1.2k Views
My developer membership has expired, but I cannot find the renewal link on my membership page after logging in successfully. Where is the link to renew the membership? Thanks.
Post marked as solved
2 Replies
893 Views
Is there a way to embed a framework for different platforms (device vs. iOS simulator) using one Xcode target? (Xcode version is 11.3.)

I know that I can add different paths for device vs. simulator under Target -> Build Settings -> Search Paths -> Framework Search Paths. But under Target -> Build Phases -> Embed Frameworks, when I try to add the framework twice under different paths, the build fails. What should I do, short of creating a separate target?

Here is a related question, but it did not get a detailed answer about what exactly should be done: https://forums.developer.apple.com/thread/66978
Post not yet marked as solved
5 Replies
2.2k Views
Hi, I'm using AVPlayer and AVAssetResourceLoaderDelegate to handle custom video playback (streaming) from a remote server. In the delegate code, I use my own NSURLSession to create a data task that retrieves the video from the remote server and feeds the data to the AVAssetResourceLoader.

The first loadingRequest I get is fine: it is a contentInformationRequest and I do the following (contentType will be "video/mp4"):

request.contentInformationRequest.byteRangeAccessSupported = true;
request.contentInformationRequest.contentType = response.MIMEType;
request.contentInformationRequest.contentLength = totalLength;

After that I get the first data request. However, the data request always has "requestsAllDataToEndOfResource" set to true and requestedLength set to totalLength, as in this debug output:

data request = requested offset = 0, requested length = 28875525, requests all data to end of resource = YES, current offset = 0>

I was expecting this data request to ask for only a range of data, especially since it is the first data request (after the contentInformationRequest). Why is that?

My code then continues to handle this: as it receives more data from the remote server asynchronously, it hands the data to the data request:

- (void)URLSession:(NSURLSession *)session dataTask:(NSURLSessionDataTask *)dataTask didReceiveData:(NSData *)data {
    [self.currentRequest.dataRequest respondWithData:respondData];

However, after the above callback has responded to the dataRequest a few times (3 or 4), I hit the second issue: the loadingRequest is cancelled:

data request = requested offset = 0, requested length = 28875525, requests all data to end of resource = YES, current offset = 25284>

You can see that the request's currentOffset is moving forward but is far from finished. What could trigger the cancellation? (The user did not cancel it via the AVPlayer UI.)

So, I have two questions:
1) Why does the first data request always ask for the whole length of data?
2) What could trigger cancellation of the loadingRequest while we are feeding data to the data request?

Thanks.
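For reference, here is a Swift sketch of what the Objective-C callback above does, including how I decide when the loading request is finished. The class name ResourceLoaderBridge is made up; currentRequest is the loading request currently being served, as in my code:

```swift
import AVFoundation
import Foundation

final class ResourceLoaderBridge: NSObject, URLSessionDataDelegate {
    // The AVAssetResourceLoadingRequest currently being served (set from the
    // resource loader callback, as in the Objective-C code above).
    var currentRequest: AVAssetResourceLoadingRequest?

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        guard let dataRequest = currentRequest?.dataRequest else { return }
        dataRequest.respond(with: data)

        // Finish the loading request once everything it asked for was delivered.
        let delivered = dataRequest.currentOffset - dataRequest.requestedOffset
        if delivered >= Int64(dataRequest.requestedLength) {
            currentRequest?.finishLoading()
            currentRequest = nil
        }
    }
}
```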
Post not yet marked as solved
1 Replies
1.7k Views
Hi, I'm trying to use AVPlayer with custom URL loading. We have a custom NSURLProtocol subclass, but it seems [NSURLProtocol registerClass:] does not work directly with AVPlayer on a real device (see this thread).

Now I'm trying to use AVAssetResourceLoaderDelegate to do the custom URL loading. However, it is a bit confusing to me how the delegate gets triggered. The URL looks like "https://<some_ip_address>:<port>/resource/", but the protocol is not standard HTTP/1.1 (it is QUIC-based instead). I did something like the following (the delegate is implemented in a different file), but it does not work:

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetResourceLoader *resourceLoader = asset.resourceLoader;
[resourceLoader setDelegate:delegate queue:dispatch_queue_create("MyURLDelegate loader", nil)];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
controller.player = player;
[player play];
[self presentViewController:controller animated:false completion:^{}];

With the above, I cannot see any methods being triggered on the delegate. What am I missing to allow the delegate to do custom URL loading for "https" URLs?

Thanks,
Han
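For what it's worth, the delegate implements the standard callback; roughly the following (shown as a Swift sketch with a made-up class name; my real delegate is Objective-C). I expected at least this method to fire for the https URL, but it never does:

```swift
import AVFoundation

// Swift sketch of the delegate implemented in the other file.
final class MyURLLoaderDelegate: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(
        _ resourceLoader: AVAssetResourceLoader,
        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest
    ) -> Bool {
        print("loading request: \(loadingRequest)")
        // The custom (QUIC-based) fetch would be kicked off here.
        return true
    }
}
```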