My developer membership has expired, but I cannot find the renewal link on my membership page after logging in successfully. Where is the link to renew the membership? Thanks.
According to Wikipedia, Wi-Fi TDLS has been around for a number of years. Does the iPhone (any model) support TDLS? If so, how does it work? Does it need any help from the app, or is it fully automatic? Thanks in advance.
Hi,

I am writing my own HTTP web service that serves video files to iOS devices. On iOS I am using `AVPlayer`, and on the server side the videos are simply `.mp4` files. But it seems that AVPlayer buffers for a long time (10+ seconds) before starting to play a video.

After a bit of debugging, it seems that AVPlayer sends odd range values in its HTTP requests for the video. For example:

The 1st request asks for range 0-1 (this is OK):

X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=0-1
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive

The 2nd request asks for range 0-37724087, which is the full length of the video. This is weird: why doesn't AVPlayer take advantage of the Range header and ask for only a small range?

X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=0-37724087
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive

The 3rd request asks for range 37683200-37724087, which is the last chunk of the video. Again, very weird: why does AVPlayer ask for a range that was already covered by the previous request?

X-Playback-Session-Id: B1F3B5AE-B49A-40B1-8F2E-A501E1BC24AF
Range: bytes=37683200-37724087
Accept: */*
User-Agent: AppleCoreMedia/1.0.0.17A860 (iPhone; U; CPU OS 13_1_2 like Mac OS X; en_us)
Accept-Language: en-us
Accept-Encoding: identity
Connection: keep-alive

Any ideas? Any comments or pointers are appreciated.
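For reference, on the server side I handle the Range headers shown above along these lines. This is a minimal sketch of my own bookkeeping (the function name and shape are mine, not any Apple or server-framework API): given a header value like "bytes=0-1" and the total resource size, it returns the byte bounds for a 206 Partial Content response.

```swift
import Foundation

// Parse an HTTP Range header value of the form "bytes=start-end" or
// "bytes=start-" (open-ended) into inclusive byte bounds.
// Returns nil for malformed or unsatisfiable ranges.
func parseRangeHeader(_ value: String, totalLength: Int) -> (start: Int, end: Int)? {
    guard value.hasPrefix("bytes=") else { return nil }
    let spec = value.dropFirst("bytes=".count)
    let parts = spec.split(separator: "-", omittingEmptySubsequences: false)
    guard parts.count == 2, let start = Int(parts[0]) else { return nil }
    // An open-ended range like "bytes=1000-" means "to the end of the resource".
    let end = Int(parts[1]) ?? (totalLength - 1)
    guard start <= end, end < totalLength else { return nil }
    return (start, end)
}
```

With the requests above, the first probe "bytes=0-1" resolves to just two bytes; as far as I can tell AVPlayer uses this small probe to confirm the server honors ranges and to read the total length from the Content-Range response before deciding what to fetch next.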
Hi,

I am using NSURLSessionUploadTask on iOS to upload files (photos and videos) to my server in the background via Wi-Fi. Small files all upload fine, but one large file of about 1 GB fails to upload. The error reported on the server side is: "connection reset by peer". Interestingly, after that error the client appears to upload the file again in a new HTTP call, only to hit the same error again later. My question is: does an iOS background upload task always upload via a single HTTP POST regardless of file size, or possibly via multiple HTTP POSTs depending on the file size? Thanks.
Hi,

I was posting a follow-up question to an old thread here, but I am not sure anyone would see it, hence I took the liberty of posting a separate thread. Thanks for understanding.

I am using a custom protocol (the Cronet network stack from Chromium) to play back video using AVPlayer. The code calls [NSURLProtocol registerClass] to register its custom protocol, and it works well in Xcode 11.3 iOS simulator testing. But the same code does not work on a real device (iPhone, iOS 13). The old thread I referred to above seems to suggest that AVPlayer would never "see" the registered custom protocol, but I am wondering why it worked in the simulator. And is it still true in recent iOS versions that AVPlayer uses a separate process (mediaserverd?) to play back the video, hence ignoring the custom protocol?

By the way, I also tried to use AVAssetResourceLoaderDelegate to hijack the URL loading for AVPlayer, but that did not work on a real device either.

Thanks.
Han
Hi,

I'm trying to use AVPlayer with custom URL loading. We have a custom NSURLProtocol subclass, but it seems [NSURLProtocol registerClass] does not work directly with AVPlayer on a real device (see this thread).

Now I'm trying to use AVAssetResourceLoaderDelegate to do the custom URL loading. However, it is a bit confusing to me how the delegate will be triggered. The URL looks like "https://<some_ip_address>:<port>/resource/", but the protocol is not standard HTTP/1.1 (it's QUIC-based instead). I did something like the following, but it does not work (the delegate is implemented in a different file):

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetResourceLoader *resourceLoader = asset.resourceLoader;
[resourceLoader setDelegate:delegate
                      queue:dispatch_queue_create("MyURLDelegate loader", nil)];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
controller.player = player;
[player play];
[self presentViewController:controller animated:false completion:^{}];

With the above, I cannot see any methods being triggered in the delegate. What am I missing to allow the delegate to do custom URL loading for "https" URLs? Thanks.
Han
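One workaround I have seen suggested (I cannot confirm it from Apple documentation) is that the resource loader only consults the delegate for URL schemes the system cannot handle itself; "https" is claimed by the system loader, so the delegate never fires. The usual trick is to swap in a made-up scheme before creating the asset and map it back inside the delegate. A sketch of the two mappings (the scheme name "cronet" is hypothetical, and I show Swift here for brevity):

```swift
import Foundation

// Rewrite an https URL to a custom scheme so the resource loader has to ask
// the delegate; the delegate restores "https" before fetching for real.
func withCustomScheme(_ url: URL, scheme: String = "cronet") -> URL? {
    guard var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else { return nil }
    components.scheme = scheme
    return components.url
}

// Inverse mapping, used inside the delegate before issuing the real request.
func withOriginalScheme(_ url: URL) -> URL? {
    guard var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else { return nil }
    components.scheme = "https"
    return components.url
}
```

The asset would then be created from the rewritten URL, so that resourceLoader:shouldWaitForLoadingOfRequestedResource: is invoked for the custom scheme.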
Hi,

I'm using AVPlayer and AVAssetResourceLoaderDelegate to handle custom video playback (streaming) from a remote server. In the delegate code, the delegate uses its own NSURLSession to create a data task retrieving the video from the remote server and feeds the data to AVAssetResourceLoader.

The first loadingRequest I get is good: it's a contentInformationRequest, and I do the following (contentType will be "video/mp4"):

request.contentInformationRequest.byteRangeAccessSupported = true;
request.contentInformationRequest.contentType = response.MIMEType;
request.contentInformationRequest.contentLength = totalLength;

After that I get the first data request. However, the data request always has "requestsAllDataToEndOfResource" set to true and requestedLength set to totalLength, as in this debug output:

data request = <requested offset = 0, requested length = 28875525, requests all data to end of resource = YES, current offset = 0>

I was expecting this data request to ask for only a range of data, especially since it's the first data request (after the contentInformationRequest). Why is that?

My code then continues to handle this: as it receives more data from the remote server asynchronously, it hands the data to the data request:

- (void)URLSession:(NSURLSession *)session
          dataTask:(NSURLSessionDataTask *)dataTask
    didReceiveData:(NSData *)data
{
    [self.currentRequest.dataRequest respondWithData:data];
}

However, after the above callback has responded to the dataRequest a few times (3 or 4 times), we hit the second issue: the loadingRequest is cancelled:

data request = <requested offset = 0, requested length = 28875525, requests all data to end of resource = YES, current offset = 25284>

We can see that the request's currentOffset is moving forward but is far from finished. What could trigger the cancel? (The user did not cancel it via the AVPlayer UI.)

So, we have two questions:
1) Why does the first data request always ask for the whole length of data?
2) What could trigger cancelling the loadingRequest while we are feeding data to the data request?

Thanks.
Is there a way to embed a framework for different platforms (device vs. iOS simulator) using one Xcode target? (Xcode version is 11.3.)

I know that I can add different paths for device vs. simulator under Target -> Build Settings -> Search Paths -> Framework Search Paths. But under Target -> Build Phases -> Embed Frameworks, when I tried to add the framework twice under different paths, the build failed. What should I do, short of creating a separate target? Here is a related question, but it did not get a detailed answer about what exactly should be done: https://forums.developer.apple.com/thread/66978
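One approach I have seen suggested (worth verifying on your Xcode version) is to keep a single framework reference and make the search path itself conditional on the SDK, since build settings accept an [sdk=...] qualifier. A sketch, with hypothetical paths standing in for your actual framework locations:

```
// Hypothetical directory layout; replace with your actual framework paths.
FRAMEWORK_SEARCH_PATHS[sdk=iphoneos*]        = $(PROJECT_DIR)/Frameworks/device
FRAMEWORK_SEARCH_PATHS[sdk=iphonesimulator*] = $(PROJECT_DIR)/Frameworks/simulator
```

This handles linking per SDK with one target; whether the single Embed Frameworks entry then resolves to the right copy per SDK is something to verify, since the embed phase copies a concrete file reference.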
Hi,
I was not able to set constraints for an image view inside a scroll view in Xcode. Any ideas why? Here are the details:
The reason for doing this: Xcode complains that the scroll view is missing constraints for its "scrollable content"; please see the attached screenshot.
Xcode version: 12.5.1
macOS: Big Sur
View relationship: the scroll view is the superview of the image view.
I tried to use ctrl-drag between the image view and the scroll view; no constraint options were available.
The scroll view is in a Collection View cell.
I am trying to support playback of remote video files on iOS. It works for a regular video, but there is a problem with slow-motion videos: the slow-motion effect is lost, i.e. a slow-motion video plays back like a regular-speed video.
Here is what I am doing:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset.resourceLoader setDelegate:self.urlDelegate
                            queue:[self.urlDelegate getDispatchQueue]];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
controller.player = player;
[player play];
[self presentViewController:controller animated:false completion:^{
    <snipped>
}];
Note: the url points to a QuickTime (.MOV) video file on an HTTP(S) server.
Again, the above code plays a slow-motion video just like a regular video, without any slow motion. Is it because AVURLAsset does not support slow-motion video? What am I missing to let AVPlayer play slow motion?
I am targeting iOS 12 and above.
Thanks!
I'm using AVPlayerViewController to play back a remote video asset. It works when I use presentViewController:animated:completion: to pop up the controller.
But it stopped working when I embedded the AVPlayerViewController in a UIView as follows:
[avPlayer play];
[videoView addSubview:playerController.view];
playerController.view.frame = videoView.bounds;
I also tried moving [avPlayer play] to after the view frame is set; that did not work either. The view shows a black background with a play button in the middle (see attached screenshot).
In more detail, I use a delegate to handle the video asset reading via AVAssetResourceLoader setDelegate:queue:. The delegate's logs show that when embedding the AVPlayerViewController, the player did read the video URL a few times (4 requests) but then stopped for no apparent reason.
By a request, I mean the following callback was called:
resourceLoader:shouldWaitForLoadingOfRequestedResource:
In the failed case, the last request tries to read the last part of the video even though it has not read the previous parts yet:
<AVAssetResourceLoadingDataRequest: 0x60000163be40, requested offset = 8842658, requested length = 4156, requests all data to end of resource = YES, current offset = 8842658>
In contrast, when using presentViewController to play back successfully, the last request reads a much larger length of the video, to the end:
<AVAssetResourceLoadingDataRequest: 0x6000005b4620, requested offset = 15884, requested length = 8765940, requests all data to end of resource = NO, current offset = 15884>
Is there any known issue with embedding AVPlayerViewController in a UIView, especially when using a custom delegate to handle URL asset reads?
Thanks!
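One thing possibly worth noting: when a view controller's view is placed inside another view, UIKit expects the view-controller containment calls, which presentViewController: otherwise performs implicitly. A sketch of what the embedding path would look like with containment (Swift here for brevity; the parameter names stand in for the objects from the post above):

```swift
import UIKit
import AVKit

// Sketch: embed an AVPlayerViewController with proper view-controller
// containment (addChild / didMove), which presenting modally does for you.
func embed(_ playerController: AVPlayerViewController,
           in videoView: UIView, of parent: UIViewController) {
    parent.addChild(playerController)          // containment step 1
    videoView.addSubview(playerController.view)
    playerController.view.frame = videoView.bounds
    playerController.didMove(toParent: parent) // containment step 2
    playerController.player?.play()
}
```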
Hi,
I followed some online guides to import an objective-c framework into a SwiftUI app but did not work correctly. I am using Xcode 13.4. Here are the problems and my questions:
The objective-c framework (Cronet from Chromium) has multiple versions:
Debug-iphonesimulator
Debug-iphoneos
Release-iphoneos
But in Xcode, the SwiftUI app only has one target, i.e. no different targets for Debug or Release, or iPhone device vs. Simulator.
Which version of the objective-c framework should I import for the SwiftUI app target?
I tried to import the version of "Debug-iphonesimulator" into SwiftUI app. The import seems to be OK, i.e. it shows under "Frameworks", but if I click the framework name, there are no header files listing. (see attached screenshots).
Is there supposed to be a list of header files from the framework after importing? How can I use the framework if no header files available in the project?
Thanks.
Hi,
In my SwiftUI app, I'm using AVPlayer and a UIViewRepresentable view to play back videos from a network server. There are many videos on the server, so I have a LazyHStack for the video thumbnails; only one video is shown on screen at a time. When the user scrolls between videos, I create a new AVPlayerItem and replace the current player item in the AVPlayer.
The problem: the first video plays back without any problems, but when the user scrolls to the next video, the new video plays for 1 second and then freezes while the audio track continues. The video track resumes after a few seconds.
One note: if I drag the video view a bit without fully scrolling to another video, the video track resumes after I release the drag. So it seems that somehow the AVPlayerLayer is not rendering?
Related code pieces:
struct PlayerView: UIViewRepresentable {
    let player: Player
    let width: CGFloat
    let height: CGFloat
    @Binding var updateCount: Int

    func makeUIView(context: Context) -> UIView {
        print("\(#function)")
        let playerLayer = AVPlayerLayer(player: player)
        let view = UIView()
        view.frame = CGRectMake(0, 0, width, height)
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        print("\(#function)")
        guard let playerLayer = uiView.layer.sublayers?.first as? AVPlayerLayer else { return }
        uiView.frame = CGRectMake(0, 0, width, height)
        if updateCount > 0 {
            playerLayer.player = player
            playerLayer.frame = uiView.bounds
        }
    }
}
updateCount was added to force updateUIView to be called when the parent view increments it, but it does not seem to help much.
My question: is there any known issue with replacing the current item in an AVPlayer in a SwiftUI app?
Here is my code for replacing the current item:
let asset = AVURLAsset(url: url)
asset.resourceLoader.setDelegate(urlDelegate, queue: urlDelegate.resourceLoaderQueue)
let assetKeys = ["playable", "hasProtectedContent"]
let playerItem = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: assetKeys)
player.replaceCurrentItem(with: playerItem)
Note that I have a custom delegate to handle the resource loader for the URL asset. This delegate is the same for all videos, and each video always plays well when selected the first time (i.e. when not replacing another video).
One more data point: I have an ObjC version of the same design, and it works without problems.
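Not an answer, but a related design note: instead of adding an AVPlayerLayer as a sublayer whose frame must be resynced by hand (which could be what stalls after replaceCurrentItem(with:)), a common pattern is a UIView whose backing layer is an AVPlayerLayer, so UIKit resizes the layer automatically. A sketch, assuming the same player object as above (the class name is mine):

```swift
import UIKit
import AVFoundation

// Sketch: a UIView backed directly by AVPlayerLayer, so the player layer
// always tracks the view's bounds without manual frame updates.
final class PlayerUIView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }

    var player: AVPlayer? {
        get { (layer as! AVPlayerLayer).player }
        set { (layer as! AVPlayerLayer).player = newValue }
    }
}
```

With this, makeUIView would return a PlayerUIView and updateUIView would only need to set uiView.player.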
Hi,
My SwiftUI app uses a third-party framework (written in ObjC, say with the name Foo) that is built into an XCFramework. I can build and run this app in the iOS simulator and on an iPhone connected to my Mac. But when I do "Product" -> "Archive", it fails with the error:
Cannot find 'Foo' in scope
The error is reported everywhere that uses things like Foo.bar().
Why would Archive fail while both the simulator and a real device work with the framework? What extra thing does Archive do?
Xcode version: 14.2
Target setting -> General -> Frameworks, Libraries and Embedded Content: "Embed & Sign".
Thanks.
I'm trying to upload large files in a background URLSession to a server that accepts HTTP/3 requests. Per its design, the server uses a self-signed SSL certificate.
In my testing, I can create a regular (foreground) URLSession with the following:
var request = URLRequest(url: dst, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 6.0)
request.assumesHTTP3Capable = true
request.httpMethod = "POST"
self.session.dataTask(with: request) {
    <snip>
}
and handle the SSL certificate challenge by implementing the following delegate method:
func urlSession(
    _ session: URLSession,
    didReceive challenge: URLAuthenticationChallenge,
    completionHandler: @escaping @Sendable (URLSession.AuthChallengeDisposition, URLCredential?) -> Void
)
This approach works and I can communicate with the server. But when I apply the same approach to a background session with an upload task, the task seems to be stuck, i.e. it is never started by the system.
More specifically, the background session upload task works in the iOS simulator. But on a real device connected to macOS, the task never starts and none of the delegate methods are called. The HTTP/3 server does not receive any request either.
The background session was created:
let appBundleName = Bundle.main.bundleURL.lastPathComponent.lowercased().replacingOccurrences(of: " ", with: ".")
let sessionIdentifier: String = "com.networking.\(appBundleName)"
let config = URLSessionConfiguration.background(withIdentifier: sessionIdentifier)
self.session = URLSession(configuration: config, delegate: self, delegateQueue: nil)
Later, an upload task is created:
var request = URLRequest(url: dst, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 6.0)
request.assumesHTTP3Capable = true
request.httpMethod = "POST"
let backgroundTask = self.session.uploadTask(with: request, fromFile: src)
backgroundTask.resume()
My questions: why is the background session upload task stuck, and is there a way to find out where it is stuck? Does a background URLSession support the app's delegate handling the SSL certificate challenge?
The test env:
Xcode 14.2
macOS 12.6.7
iOS 15.7.7
Thanks!
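For completeness, the foreground trust override elided above looks roughly like this in my code. It is a sketch that blindly accepts the self-signed certificate (fine for testing, not for production), and the class name is mine; whether a background session ever invokes it on a real device is exactly the question:

```swift
import Foundation

// Sketch: a URLSession delegate that accepts the server's self-signed
// certificate by wrapping its trust object in a credential, and falls back
// to default handling for any other challenge type.
final class InsecureTrustDelegate: NSObject, URLSessionDelegate {
    func urlSession(
        _ session: URLSession,
        didReceive challenge: URLAuthenticationChallenge,
        completionHandler: @escaping @Sendable (URLSession.AuthChallengeDisposition, URLCredential?) -> Void
    ) {
        if challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
           let trust = challenge.protectionSpace.serverTrust {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            completionHandler(.performDefaultHandling, nil)
        }
    }
}
```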