I'm trying the sample app from here:
https://developer.apple.com/documentation/vision/detecting_moving_objects_in_a_video
I made a tweak to read the video from library instead of document picker:
var recordedVideoURL: AVAsset?

@IBAction func uploadVideoForAnalysis(_ sender: Any) {
    var configuration = PHPickerConfiguration()
    configuration.filter = .videos
    configuration.selectionLimit = 1
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = self
    present(picker, animated: true, completion: nil)
}
The delegation method:
extension HomeViewController: PHPickerViewControllerDelegate {
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        guard let selectedResult = results.first else {
            print("assetIdentifier: nil")
            dismiss(animated: true, completion: nil)
            return
        }
        selectedResult.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { [weak self] url, error in
            guard error == nil, let url = url else {
                print(error?.localizedDescription ?? "Failed to load video")
                return
            }
            let asset = AVAsset(url: url)
            self?.recordedVideoURL = asset
            DispatchQueue.main.async {
                self?.dismiss(animated: true) { // dismiss the picker
                    self?.performSegue(withIdentifier: ContentAnalysisViewController.segueDestinationId,
                                       sender: self)
                    self?.recordedVideoURL = nil
                }
            }
        }
    }
}
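For reference, the file could also be copied out of the provided location while still inside the completion handler, before anything else uses it. This is only a sketch using FileManager (the destination path, file extension, and error handling are my own choices, not from the sample), and whether this is the right way to keep the file available is part of what I'm unsure about:

```swift
// Sketch: copy the provided file into the app's own temp directory
// inside the completion handler, then build the asset from the copy.
let destination = FileManager.default.temporaryDirectory
    .appendingPathComponent(UUID().uuidString)
    .appendingPathExtension("mov")
do {
    try FileManager.default.copyItem(at: url, to: destination)
    let asset = AVAsset(url: destination)
    self?.recordedVideoURL = asset
} catch {
    print("Copy failed: \(error)")
}
```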
Everything else is pretty much the same. Then, in the camera controller, it raised an error: "The requested URL was not found on this server." I set a breakpoint, and it showed the error came from the line that creates the asset reader: let reader = try AVAssetReader(asset: asset)
func startReadingAsset(_ asset: AVAsset, reader: AVAssetReader? = nil) {
    videoRenderView = VideoRenderView(frame: view.bounds)
    setupVideoOutputView(videoRenderView)
    videoFileReadingQueue.async { [weak self] in
        do {
            guard let track = asset.tracks(withMediaType: .video).first else {
                throw AppError.videoReadingError(reason: "No video tracks found in the asset.")
            }
            let reader = try AVAssetReader(asset: asset)
            let settings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
            let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
            if reader.canAdd(output) {
                reader.add(output)
            } else {
                throw AppError.videoReadingError(reason: "Couldn't add a track output to the asset reader.")
            }
            if !reader.startReading() {
                throw AppError.videoReadingError(reason: "Couldn't start the asset reader.")
            }
...
I tried creating the reader directly in the asset-creation block, and it worked:
selectedResult.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { [weak self] url, error in
    guard error == nil, let url = url else {
        print(error?.localizedDescription ?? "Failed to load video")
        return
    }
    let asset = AVAsset(url: url)
    do {
        let reader = try AVAssetReader(asset: asset)
        self?.assetReader = reader
        print("reader: \(reader)")
    } catch {
        print("No reader: \(error)")
    }
...
But if I move it just a little, into the DispatchQueue.main.async block, it prints "No reader: The requested URL was not found on this server."
Therefore, I have to keep an instance of the reader and pass it to the camera view controller.
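Concretely, the hand-off looks like this (a sketch only: the `assetReader` property is one I added to HomeViewController, and the `recordedVideoAsset`/`preparedReader` property names on the destination controller are my own, not from the sample):

```swift
// Workaround sketch: keep the reader created inside the load handler
// alive as a property, and hand it to the analysis controller
// during the segue instead of creating it there.
var assetReader: AVAssetReader?

override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    if let analysisVC = segue.destination as? ContentAnalysisViewController,
       let reader = assetReader,
       let asset = recordedVideoURL {
        // Pass both the asset and the already-created reader,
        // since creating the reader later fails.
        analysisVC.recordedVideoAsset = asset
        analysisVC.preparedReader = reader
    }
}
```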
Can someone please explain why this is happening? What's the logic behind it?
I posted my question here: https://stackoverflow.com/questions/63986215/why-scnview-not-working-on-iphone-xs-max
TL;DR:
I added an SCNView as a subview and, after some time (an API request call), display my nodes. In the question I actually just add a sphere, but the SCNView turns purple (its actual color should be clear). And this only happens on the iPhone XS Max.
I have no idea why this happens, or how to face the problem if the problem is my face.