In iOS, when I use AVPlayerViewController to play back a slow motion video, it has a "ramp-up" stage at the start and a "ramp-down" stage at the end, and the video plays at the normal speed (i.e. not slow motion) during these stages.
My question is: are these non-slow-motion stages defined in the video file itself (e.g., some kind of metadata)? Or is it just a playback approach used by AVPlayerViewController?
Thanks!
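For context, a slow-motion clip obtained from the Photos library is typically delivered as an AVComposition whose video track segments encode the rate ramps via each segment's timeMapping, rather than as a flat file. A hedged sketch (assuming you already have an AVAsset from PHImageManager) to inspect this:

```swift
import AVFoundation

// Hypothetical helper: prints the effective rate of each video track
// segment. When a segment's source duration differs from its target
// (playback) duration, that segment plays at a non-1x rate -- this is
// how the ramp and slow-motion sections are represented.
func logRateSegments(of asset: AVAsset) {
    guard asset is AVComposition,
          let track = asset.tracks(withMediaType: .video).first else {
        print("Not a composition; no per-segment rate info found")
        return
    }
    for segment in track.segments {
        let mapping = segment.timeMapping
        let rate = mapping.source.duration.seconds / mapping.target.duration.seconds
        print("segment rate ≈ \(rate)")  // 1.0 = normal speed, < 1.0 = slow motion
    }
}
```

If the asset is a plain AVURLAsset rather than a composition, no such segments exist, which would suggest the ramps come from the playback side rather than the file.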
Hi,
We are using HTTP/3 only and hence set assumesHTTP3Capable on every request. This has worked so far, but now we have encountered one iPhone that never honors this flag and always tries to create a connection using TCP:
[tcp] tcp_input [C1:3] flags=[R.] seq=0, ack=2023568485, win=0 state=SYN_SENT rcv_nxt=0, snd_una=2023568484
The request is created like this:
let url = URL(string: urlString)!
var request = URLRequest(url: url, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 60.0)
request.assumesHTTP3Capable = true
return try await urlSession.data(for: request)
iOS: 16
Xcode: 15.3
In what cases would iOS CFNetwork not honor "assumesHTTP3Capable"? (Or how can I find out why?)
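To confirm which protocol a request actually negotiated, one option is to collect task metrics and check networkProtocolName. A hedged sketch (the delegate wiring and class name are assumptions):

```swift
import Foundation

// Attach this as the URLSession's delegate to log the negotiated
// protocol per transaction.
final class MetricsDelegate: NSObject, URLSessionTaskDelegate {
    func urlSession(_ session: URLSession,
                    task: URLSessionTask,
                    didFinishCollecting metrics: URLSessionTaskMetrics) {
        for transaction in metrics.transactionMetrics {
            // "h3" indicates HTTP/3; "h2" or "http/1.1" indicates a TCP fallback.
            print("protocol:", transaction.networkProtocolName ?? "unknown")
        }
    }
}
```

Comparing the logged protocol on the misbehaving iPhone against a working device may at least confirm whether the fallback happens at connection setup.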
Hi,
When running my iOS app in Xcode, I got the following message in the console multiple times:
[connection] nw_read_request_report [C1] Receive failed with error "Operation timed out"
It seems not critical, as my app still works, but how can I find out more details about the connection that printed this message? For example, the network request that caused it, or the URL?
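One way to map such console noise back to a concrete request is to log the URL and error when each task completes; enabling the CFNETWORK_DIAGNOSTICS environment variable in the scheme also produces much more detailed CFNetwork logs. A minimal sketch (the delegate class name is hypothetical):

```swift
import Foundation

// Attach as the URLSession's delegate to correlate console messages
// with the failing request's URL.
final class LoggingDelegate: NSObject, URLSessionTaskDelegate {
    func urlSession(_ session: URLSession,
                    task: URLSessionTask,
                    didCompleteWithError error: Error?) {
        if let error = error {
            print("task for \(task.originalRequest?.url?.absoluteString ?? "?") failed: \(error)")
        }
    }
}
```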
Xcode: 15.3
iOS 17
SwiftUI app
I have view hierarchy like this:
NavigationView {
    VStack {
        ScrollView {
            // some content
        }
        .background(NavigationLink(destination: MyView(),
                                   isActive: $isActive) {
            EmptyView()
        })
        // snipped
    }
}
When the user follows the navigation link to MyView, by default the NavigationBar and MyView are laid out as in a VStack. How can I make them behave as in a ZStack(alignment: .topLeading)? I.e., I'd like MyView to extend beneath the NavigationBar.
Here is the MyView:
GeometryReader { geo in
    HStack {
        // some content
    }
    .onAppear {
        hideNavigationbar = false
    }
    .onTapGesture(perform: {
        hideNavigationbar.toggle()
    })
    .navigationBarHidden(hideNavigationbar)
    .navigationBarTitleDisplayMode(.inline)
    // snipped
}
The problem is, when hideNavigationbar toggles, MyView shifts up and down because its height changes. I want the NavigationBar to sit on top of MyView as in a ZStack, so that when the NavigationBar hides, MyView does not shift (only its top portion is uncovered).
What is the proper way to achieve that?
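One approach worth trying (a sketch under assumptions, not a confirmed fix): have the content ignore the top safe area so it already extends beneath the bar, and then toggling the bar no longer changes the content's frame:

```swift
// Hypothetical variant of MyView's body: the content fills the full
// geometry and ignores the top container safe area, so it sits under
// the navigation bar; hiding the bar then only uncovers the top
// portion instead of relayouting the view.
GeometryReader { geo in
    HStack {
        // some content
    }
    .frame(width: geo.size.width, height: geo.size.height)
    .navigationBarTitleDisplayMode(.inline)
}
.ignoresSafeArea(.container, edges: .top)
```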
Xcode: 15.3, using SwiftUI
iOS: 15
I'm following the approach in https://developer.apple.com/forums/thread/703234 section "Doing Even Better: Proper Security".
My question is: does it work if the accessory is not in the local network (i.e. out there on the Internet with an IP address) ?
I tried it, and SecTrustEvaluateWithError(trust, nil) returns true, but TLS still fails:
ATS failed system trust
Connection 1: system TLS Trust evaluation failed(-9802)
<snip>
Domain=NSURLErrorDomain Code=-1200 "An SSL error has occurred and a secure connection to the server cannot be made." UserInfo={NSLocalizedRecoverySuggestion=Would you like to connect to the server anyway?,
Here is my code :
var err = SecTrustSetPolicies(trust, SecPolicyCreateBasicX509())
os_log("SecTrustSetPolicies returns \(err)")
err = SecTrustSetAnchorCertificates(trust, [self.myCA] as NSArray)
os_log("SecTrustSetAnchorCertificates returns \(err)")
err = SecTrustSetAnchorCertificatesOnly(trust, true)
os_log("SecTrustSetAnchorCertificatesOnly returns \(err)")
// check the trust object
let evalResult = SecTrustEvaluateWithError(trust, nil)
os_log("SecTrust eval result: \(evalResult)")
// create a credential with accepted server trust.
let credential = URLCredential(trust: trust)
completionHandler(.useCredential, credential)
the logs are:
SecTrustSetPolicies returns 0
SecTrustSetAnchorCertificates returns 0
SecTrustSetAnchorCertificatesOnly returns 0
SecTrust eval result: true
Did I do anything wrong? Or is this not supported outside the local network?
Thanks.
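One thing worth double-checking (an assumption on my part, not a confirmed diagnosis): SecPolicyCreateBasicX509 does not verify the server hostname, so SecTrustEvaluateWithError can succeed while ATS still rejects the connection. An SSL policy pinned to the hostname the client actually connects to is closer to what ATS evaluates:

```swift
import Security

// Hedged sketch: re-evaluate the server trust against an SSL policy
// for the expected hostname ("accessory.example" is a placeholder;
// `trust` and `myCA` come from the challenge handler as in the
// original code).
let policy = SecPolicyCreateSSL(true, "accessory.example" as CFString)
var err = SecTrustSetPolicies(trust, policy)
err = SecTrustSetAnchorCertificates(trust, [myCA] as NSArray)
err = SecTrustSetAnchorCertificatesOnly(trust, true)
let ok = SecTrustEvaluateWithError(trust, nil)
```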
I have a view FooView in a navigation stack presented via a NavigationLink.
And FooView has a button that triggers a ActionSheet.
The problem is: when the ActionSheet is dismissed (either cancelled or an option selected), its parent view, i.e. FooView, is also dismissed, i.e. it pops off the navigation stack.
Is there a way to prevent it doing that?
More importantly, why is this happening? What dismissed FooView?
iOS version: iOS 15 on iPhone
Xcode: 14.2
Here is a minimum reproducible sample code.
I noticed that if I move NavigationView from outside of TabView into ContentView, the issue goes away. But then the tabs show up even in FooView, hence I kept NavigationView outside TabView. (And why does that matter?)
import SwiftUI

@main
struct ViewDismissApp: App {
    var body: some Scene {
        WindowGroup {
            NavigationView {
                TabView {
                    ContentView(title: "Tab 1")
                        .tabItem {
                            Label("Tab1", systemImage: "wifi")
                        }
                    ContentView(title: "Tab 2")
                        .tabItem {
                            Label("Tab2", systemImage: "wrench.adjustable")
                        }
                }
            }
            .navigationViewStyle(.stack)
        }
    }
}

struct ContentView: View {
    let title: String

    var body: some View {
        VStack {
            NavigationLink {
                FooView()
            } label: {
                Text(title)
            }
        }
    }
}

struct FooView: View {
    @State private var showActions = false

    var body: some View {
        VStack {
            Text("Foo View")
            actionButton()
        }
    }

    private func actionButton() -> some View {
        Button(action: {
            showActions = true
        }) {
            Image(systemName: "arrow.up.circle")
        }
        .actionSheet(isPresented: $showActions) {
            ActionSheet(title: Text("Actions"),
                        message: nil,
                        buttons: [
                            .default(Text("Option 1")) {
                                // TODO
                            },
                            .default(Text("Option 2")) {
                                // TODO
                            },
                            .cancel()
                        ])
        }
    }
}
Thanks!
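On iOS 15, one workaround that may be worth trying (an untested sketch, not a confirmed fix) is presenting the actions with confirmationDialog attached to the container instead of actionSheet on the button:

```swift
// Hypothetical replacement for FooView's body using confirmationDialog
// (available from iOS 15). Attaching the presentation modifier to the
// enclosing VStack rather than to the Button changes where SwiftUI
// hosts the presentation, which sometimes avoids the unwanted pop.
VStack {
    Text("Foo View")
    Button {
        showActions = true
    } label: {
        Image(systemName: "arrow.up.circle")
    }
}
.confirmationDialog("Actions", isPresented: $showActions) {
    Button("Option 1") { /* TODO */ }
    Button("Option 2") { /* TODO */ }
    Button("Cancel", role: .cancel) { }
}
```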
I am trying to support video playback for remote video files in iOS. It works for a regular video, but has a problem with slow motion videos: the slow motion effect is lost, i.e. a slow motion video plays back like a regular-speed video.
Here is what I am doing:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset.resourceLoader setDelegate:self.urlDelegate
queue:[self.urlDelegate getDispatchQueue]];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
controller.player = player;
[player play];
[self presentViewController:controller animated:false completion:^{
<snipped>
}];
Note, the url points to a QuickTime (.MOV) video file on a HTTP(s) server.
Again, the above code plays a slow motion video just like a regular video without any slow motion. Is it because AVURLAsset does not support slow motion video? What am I missing to make AVPlayer play in slow motion?
I am targeting iOS 12 and above.
Thanks!
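For context (an assumption about the asset, not a confirmed cause): slow-motion clips from the Photos library are represented as an AVComposition with per-segment time mappings, while a plain .MOV served over HTTP carries only the high-frame-rate track, so AVURLAsset plays it at 1x. If the slow section's time range is known out-of-band, the effect could in principle be rebuilt client-side:

```swift
import AVFoundation

// Hedged sketch: wrap the remote asset in a mutable composition and
// stretch a known sub-range to half speed. The range values are
// placeholders; the real ones would have to come from your own metadata.
func makeSlowMotionItem(url: URL) throws -> AVPlayerItem {
    let asset = AVURLAsset(url: url)
    let composition = AVMutableComposition()
    try composition.insertTimeRange(
        CMTimeRange(start: .zero, duration: asset.duration),
        of: asset, at: .zero)
    let slowRange = CMTimeRange(start: CMTime(seconds: 2, preferredTimescale: 600),
                                duration: CMTime(seconds: 3, preferredTimescale: 600))
    // Doubling the duration of this range halves its playback rate.
    composition.scaleTimeRange(slowRange,
                               toDuration: CMTimeMultiply(slowRange.duration, multiplier: 2))
    return AVPlayerItem(asset: composition)
}
```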
I'm trying to upload large files in a background URLSession to a server that accepts HTTP/3 requests. Per its design, the server uses a self-signed SSL certificate.
In my testing, I can create a regular URLSession (foreground) with the following:
var request = URLRequest(url: dst, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 6.0)
request.assumesHTTP3Capable = true
request.httpMethod = "POST"
self.session.dataTask(with: request) {
<snip>
}
And I handle the SSL certificate challenge by implementing the following delegate method:
func urlSession(
_ session: URLSession,
didReceive challenge: URLAuthenticationChallenge,
completionHandler: @escaping @Sendable (URLSession.AuthChallengeDisposition, URLCredential?) -> Void
)
This approach works and I can communicate with the server. But when I try to apply the same to a background session for an upload task, the task seems to be stuck, i.e. it is never started by the system.
More specifically, the background session upload task works in the iOS simulator. But on a real device connected to macOS, the task never starts and none of the delegate methods are called. The HTTP/3 server doesn't receive any request either.
The background session was created:
let appBundleName = Bundle.main.bundleURL.lastPathComponent.lowercased().replacingOccurrences(of: " ", with: ".")
let sessionIdentifier: String = "com.networking.\(appBundleName)"
let config = URLSessionConfiguration.background(withIdentifier: sessionIdentifier)
self.session = URLSession(configuration: config, delegate: self, delegateQueue: nil)
Later, an upload task was created:
var request = URLRequest(url: dst, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 6.0)
request.assumesHTTP3Capable = true
request.httpMethod = "POST"
let backgroundTask = self.session.uploadTask(with: request, fromFile: src)
backgroundTask.resume()
My question(s): Why is the background session upload task stuck? Is there a way to find out where it is stuck?
Does a background URLSession support the app's delegate handling the SSL certificate challenge?
The test env:
Xcode 14.2
macOS 12.6.7
iOS 15.7.7
Thanks!
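For reference, the server-trust handler used in the foreground session looks roughly like this (a sketch with a placeholder trust check; whether nsurlsessiond delivers this challenge for a background session is exactly the open question):

```swift
import Foundation

func urlSession(
    _ session: URLSession,
    didReceive challenge: URLAuthenticationChallenge,
    completionHandler: @escaping @Sendable (URLSession.AuthChallengeDisposition, URLCredential?) -> Void
) {
    guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
          let trust = challenge.protectionSpace.serverTrust else {
        completionHandler(.performDefaultHandling, nil)
        return
    }
    // Placeholder: a real implementation would pin the self-signed
    // certificate or a private CA before accepting the trust object.
    completionHandler(.useCredential, URLCredential(trust: trust))
}
```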
Hi,
My SwiftUI app uses a third-party framework (written in Objective-C, say with the name Foo) that is built into an XCFramework. I can build and run this app in the iOS simulator and on an iPhone connected to my Mac. But when I do "Product" -> "Archive", it fails with the error:
Cannot find 'Foo' in scope
The error is reported at anywhere that uses things like Foo.bar().
Why would Archive fail while both the simulator and a real device work with the framework? What extra thing does Archive do?
Xcode version: 14.2
Target setting -> General -> Frameworks, Libraries and Embedded Content: "Embed & Sign".
Thanks.
Hi,
In my SwiftUI app, I'm using AVPlayer and a UIViewRepresentable view to play back a video from a network server.
There are many videos on the server, so I have a LazyHStack for the video thumbnails. Only one video is shown on screen at a time.
When the user scrolls between videos, I create a new AVPlayerItem and replace the current player item in the AVPlayer.
The problem is: the first video plays back without any problems. But when the user scrolls to the next video, the new video plays for about 1 second and then freezes while the audio track continues. The video track resumes after a few seconds.
One note: if I drag the video view a bit without fully scrolling to another video, the video track resumes after I release the drag. So it seems the AVPlayerLayer was somehow not rendering?
Related code pieces:
struct PlayerView: UIViewRepresentable {
    let player: Player
    let width: CGFloat
    let height: CGFloat
    @Binding var updateCount: Int

    func makeUIView(context: Context) -> UIView {
        print("\(#function)")
        let playerLayer = AVPlayerLayer(player: player)
        let view = UIView()
        view.frame = CGRectMake(0, 0, width, height)
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        print("\(#function)")
        guard let playerLayer = uiView.layer.sublayers?.first as? AVPlayerLayer else { return }
        uiView.frame = CGRectMake(0, 0, width, height)
        if updateCount > 0 {
            playerLayer.player = player
            playerLayer.frame = uiView.bounds
        }
    }
}
updateCount was added to force updateUIView to be called when the parent view increases updateCount. But it doesn't seem to help much.
My question: Is there any known issue about replacing the current item in AVPlayer in SwiftUI app?
Here is my code of replacing the current item:
let asset = AVURLAsset(url: url)
asset.resourceLoader.setDelegate(urlDelegate, queue: urlDelegate.resourceLoaderQueue)
let assetKeys = ["playable", "hasProtectedContent"]
let playerItem = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: assetKeys)
player.replaceCurrentItem(with: playerItem)
Note that I have a custom delegate to handle the resource loader for the URL asset. This delegate is the same for all videos. A video always plays well when selected the first time (i.e. without replacing another video).
One more data point: I have an ObjC version of the same design, and it worked without problems.
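One common alternative structure (a sketch, not a confirmed fix for the freeze) is to make AVPlayerLayer the backing layer of the view via layerClass, so the layer resizes with the view automatically instead of being a manually framed sublayer:

```swift
import AVFoundation
import UIKit

// Hypothetical player-backed view: the view's own layer *is* the
// AVPlayerLayer, so no manual frame syncing in updateUIView is needed.
final class PlayerUIView: UIView {
    override static var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
}
```

With this, makeUIView would return a PlayerUIView with playerLayer.player set, and updateUIView would only need to reassign the player when it changes.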
Hi,
I am writing my own HTTP web service server that lets iOS devices view video files. On iOS I am using AVPlayer, and on the server side they are simply .mp4 files. But it seems that AVPlayer buffers for a long time (10+ seconds) before starting to play a video.
After a bit of debugging, it seems that AVPlayer sends weird range values in its HTTP requests for the video. For example, the 1st request's headers send range 0-1 (this is OK):
X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=0-1
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive
The 2nd request's headers send range 0-37724087, which is the full length of the video. This is weird; why doesn't AVPlayer take advantage of the Range header and ask for only a small range?
X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=0-37724087
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive
The 3rd request's headers send range 37683200-37724087, which is the last chunk of the video. Again very weird; why does AVPlayer ask for a range that was already covered by the previous request?
X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=37683200-37724087
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive
Any ideas? Any comments or pointers are appreciated.
Hi,
I followed some online guides to import an Objective-C framework into a SwiftUI app, but it did not work correctly. I am using Xcode 13.4. Here are the problems and my questions:
The Objective-C framework (Cronet from Chromium) has multiple versions:
Debug-iphonesimulator
Debug-iphoneos
Release-iphoneos
But in Xcode, the SwiftUI app only has one target, i.e. no different targets for Debug or Release, or iPhone device vs. Simulator.
Which version of the objective-c framework should I import for the SwiftUI app target?
I tried to import the "Debug-iphonesimulator" version into the SwiftUI app. The import seems to be OK, i.e. it shows under "Frameworks", but if I click the framework name, there is no header file listing. (See attached screenshots.)
Is there supposed to be a list of header files from the framework after importing? How can I use the framework if no header files are available in the project?
Thanks.
I'm using AVPlayerViewController to play back a remote video asset. It works when I use presentViewController: animated: completion: to pop up the controller.
But it stopped working when I embed AVPlayerViewController in a UIView as following:
[avPlayer play];
[videoView addSubview:playerController.view];
playerController.view.frame = videoView.bounds;
I also tried moving [avPlayer play] to after the view frame setting; that did not work either. The view shows a black background with a play button in the middle. (See attached screenshot.)
In more detail: I use a delegate to handle the video asset reading via AVAssetResourceLoader setDelegate:queue:. The logs of the delegate show that when embedding the AVPlayerViewController, the player did read the video URL a few times (4 requests) but then stopped for no apparent reason.
By a request, I mean the following callback was called:
resourceLoader:shouldWaitForLoadingOfRequestedResource:
In the failed case, the last request tries to read the last part of the video even though it has not read the previous parts yet:
<AVAssetResourceLoadingDataRequest: 0x60000163be40, requested offset = 8842658, requested length = 4156, requests all data to end of resource = YES, current offset = 8842658>>
In contrast, when using presentViewController to play back successfully, the last request was reading a much larger length of the video till the end.
<AVAssetResourceLoadingDataRequest: 0x6000005b4620, requested offset = 15884, requested length = 8765940, requests all data to end of resource = NO, current offset = 15884>>
Is there any known issue with embedding AVPlayerViewController in a UIView, especially when using a custom delegate to handle URL asset reads?
Thanks!
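One thing the embedded form needs that presentViewController: does not (standard UIKit view controller containment; whether it explains the stalled loading is an assumption) is registering the player controller as a child view controller:

```swift
import AVKit
import UIKit

// Hedged sketch of proper containment when embedding the player's view
// (Swift equivalent of the Objective-C call sites in the question;
// `parentViewController`, `videoView`, `playerController`, and
// `avPlayer` are assumed to exist as in the original code).
parentViewController.addChild(playerController)
videoView.addSubview(playerController.view)
playerController.view.frame = videoView.bounds
playerController.didMove(toParent: parentViewController)
avPlayer.play()
```

Without addChild/didMove(toParent:), the player controller may never receive appearance callbacks, which can leave playback in an inconsistent state.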
Hi,
I was not able to set constraints for an image view inside a scroll view in Xcode. Any ideas why? Here are the details:
The reason to do this: Xcode complains that the scroll view is missing constraints for its "scrollable content"; please see the attached screenshot.
Xcode version: 12.5.1
macOS: Big Sur.
View relationship: The scroll view is the superview of the image view.
I tried to use "ctrl-drag" between the image view and the scroll view. No constraint options were available.
The scroll view is in a Collection View cell.
Please see attached screen shot.
According to Wikipedia, Wi-Fi TDLS has been around for a number of years. Does the iPhone (any model) support TDLS? If yes, how does it work? Does it need any help from the app, or is it fully automatic? Thanks in advance.