I have a Swift app with a list of videos, similar to the reels in social-media apps. I use AVPlayer with HLS videos in a UICollectionView.

I have two issues:

1. Buffering is always slow, even after implementing preloading for the next 4 cells.
2. After scrolling through around 20 videos, the buffer freezes and the player stalls for all videos for about 20 seconds, then playback resumes; after roughly another 20 videos the issue happens again. Every time the stalling happens I get this error in the console:
[27944:5390492] [connection] nw_connection_copy_connected_local_endpoint_block_invoke [C60] Client called nw_connection_copy_connected_local_endpoint on unconnected nw_connection
[27944:5390492] [connection] nw_connection_copy_connected_remote_endpoint_block_invoke [C60] Client called nw_connection_copy_connected_remote_endpoint on unconnected nw_connection
[27944:5390492] [connection] nw_connection_copy_protocol_metadata_internal_block_invoke [C60] Client called nw_connection_copy_protocol_metadata_internal on unconnected nw_connection
[27944:5390492] [connection] nw_connection_copy_connected_local_endpoint_block_invoke [C61] Client called nw_connection_copy_connected_local_endpoint on unconnected nw_connection
[27944:5390492] [connection] nw_connection_copy_connected_remote_endpoint_block_invoke [C61] Client called nw_connection_copy_connected_remote_endpoint on unconnected nw_connection
[27944:5390492] [connection] nw_connection_copy_protocol_metadata_internal_block_invoke [C61] Client called nw_connection_copy_protocol_metadata_internal on unconnected nw_connection
This is the preload logic in collectionView(_:willDisplay:forItemAt:):
for section in 0..<collectionView.numberOfSections {
    for i in indexPath.item ... indexPath.item + 4 {
        let preloadIndexPath = IndexPath(item: i, section: section)
        if preloadIndexPath.item >= 0,
           preloadIndexPath.item < reelsArray.count,
           !SharedManager.shared.players.contains(where: { $0.id == reelsArray[preloadIndexPath.item].id ?? "" }),
           let urlString = reelsArray[preloadIndexPath.item].media,
           let videoURL = URL(string: urlString) {
            let asset = AVAsset(url: videoURL)
            let playerItem = AVPlayerItem(asset: asset)
            playerItem.preferredMaximumResolution = CGSize(width: 426, height: 240)
            playerItem.preferredPeakBitRate = 200_000
            playerItem.preferredForwardBufferDuration = 3
            let player = NRPlayer(playerItem: playerItem)
            player.automaticallyWaitsToMinimizeStalling = false
            let playerPreload = PlayerPreloadModel(index: preloadIndexPath.item,
                                                   timeCreated: Date(),
                                                   id: reelsArray[preloadIndexPath.item].id ?? "",
                                                   player: player)
            SharedManager.shared.players.append(playerPreload)
        }
    }
}
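Note that the loop above only ever appends to SharedManager.shared.players, so players accumulate as I scroll. A pure-Swift sketch of the kind of window/eviction bookkeeping that would cap how many players stay alive (the preloadPlan helper and its keepBehind parameter are hypothetical, made up for illustration; only windowAhead = 4 matches my code):

```swift
// Hypothetical helper, not in my app: given the current item index, the
// total reel count, and the indices that already have preloaded players,
// compute which indices to preload next and which to evict so that at most
// keepBehind + windowAhead + 1 players are alive at once.
func preloadPlan(current: Int,
                 count: Int,
                 existing: Set<Int>,
                 windowAhead: Int = 4,
                 keepBehind: Int = 1) -> (toPreload: [Int], toEvict: [Int]) {
    let lower = max(0, current - keepBehind)
    let upper = min(count - 1, current + windowAhead)
    guard lower <= upper else { return ([], []) }
    let window = Set(lower...upper)
    return (window.subtracting(existing).sorted(),
            existing.subtracting(window).sorted())
}
```

Evicted entries would get replaceCurrentItem(with: nil) on their player before being removed from the array, so the underlying HLS connections are actually torn down.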
Here is the play function in the cell:
setImage()
let asset = AVAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)
playerItem.preferredMaximumResolution = CGSize(width: 426, height: 240)
playerItem.preferredPeakBitRate = 200_000
playerItem.preferredForwardBufferDuration = 3
let player = NRPlayer(playerItem: playerItem)
player.automaticallyWaitsToMinimizeStalling = false
playerLayer = AVPlayerLayer(player: player)
playerLayer.player?.currentItem?.addObserver(self,
                                             forKeyPath: #keyPath(AVPlayerItem.status),
                                             options: .new,
                                             context: nil)
playerContainer.layer.addSublayer(playerLayer)
playerLayer.frame = playerContainer.bounds
playerContainer.backgroundColor = .clear
playerLayer.videoGravity = .resize
playerContainer.layer.masksToBounds = true
playerLayer.masksToBounds = true
NotificationCenter.default.addObserver(self,
                                       selector: #selector(videoDidEnded),
                                       name: .AVPlayerItemDidPlayToEndTime,
                                       object: playerLayer.player?.currentItem)
playerLayer.player?.addObserver(self,
                                forKeyPath: #keyPath(AVPlayer.timeControlStatus),
                                options: .new,
                                context: nil)
playerLayer.player?.play()
imgThumbnailView.layoutIfNeeded()
if SharedManager.shared.isAudioEnableReels == false {
    playerLayer.player?.volume = 0
    imgSound.image = UIImage(named: "newMuteIC")
} else {
    playerLayer.player?.volume = 1
    imgSound.image = UIImage(named: "newUnmuteIC")
}
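Since this function adds a KVO observer and a notification observer every time it runs, a matching teardown is needed when the cell is reused. This is only a sketch of what that teardown could look like (I'm assuming it would run from prepareForReuse or the collection view's didEndDisplaying delegate; the method name is made up):

```swift
// Sketch of a matching teardown, assumed to run from prepareForReuse or
// collectionView(_:didEndDisplaying:forItemAt:). Without something like
// this, each reuse of the cell stacks another observer on the last one.
func tearDownPlayer() {
    playerLayer.player?.pause()
    playerLayer.player?.currentItem?.removeObserver(self, forKeyPath: #keyPath(AVPlayerItem.status))
    playerLayer.player?.removeObserver(self, forKeyPath: #keyPath(AVPlayer.timeControlStatus))
    NotificationCenter.default.removeObserver(self,
                                              name: .AVPlayerItemDidPlayToEndTime,
                                              object: playerLayer.player?.currentItem)
    playerLayer.player?.replaceCurrentItem(with: nil) // releases the HLS connection
    playerLayer.removeFromSuperlayer()
}
```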