Posts

Post not yet marked as solved
0 Replies
232 Views
Problem: I need to import a video, process it, and then export it with alpha. I noticed the exported video gets a lot grayer / loses quality compared to the original. I don't need any compression. Side note: I need to export videos with transparency enabled, which is why I use AVAssetExportPresetHEVCHighestQualityWithAlpha. That preset seems to be causing the problem, since AVAssetExportPresetHighestQuality looks good. These are side-by-side frames of the original and a processed video; the left is the original frame, the right is the processed video: https://i.stack.imgur.com/ORqfz.png This is another example, where the bottom frame is exported and the top one is the original. You can see at the bar where "YouTube NL" is displayed that the top one is almost fully black, while the bottom (exported) one is really gray: https://i.stack.imgur.com/s8lCn.png As far as I know, I don't do anything special: I just load the video and directly export it, and it still loses quality. How can I prevent this?

Reproduction path: you can either clone the repository, or see the code below. The repository is available here: https://github.com/Jasperav/VideoCompression/tree/main/VideoCompressionTests. After you have cloned it, run the only unit test and check the logging for where the output of the video is stored. You can then observe that temp.mov is a lot grayer than the original video. The code for importing and exporting the video is below. As far as I can see, I just import and directly export the movie without modifying it. What's the problem?

```swift
import AppKit
import AVFoundation
import Foundation
import Photos
import QuartzCore
import OSLog

let logger = Logger()

class VideoEditor {
    func export(url: URL, outputDir: URL) async {
        let asset = AVURLAsset(url: url)
        let extract = try! await extractData(videoAsset: asset)

        try! await exportVideo(outputPath: outputDir, asset: asset, videoComposition: extract)
    }

    private func exportVideo(outputPath: URL, asset: AVAsset, videoComposition: AVMutableVideoComposition) async throws {
        let fileExists = FileManager.default.fileExists(atPath: outputPath.path())

        logger.debug("Output dir: \(outputPath), exists: \(fileExists), render size: \(String(describing: videoComposition.renderSize))")

        if fileExists {
            do {
                try FileManager.default.removeItem(atPath: outputPath.path())
            } catch {
                logger.error("remove file failed")
            }
        }

        let dir = outputPath.deletingLastPathComponent().path()

        logger.debug("Will try to create dir: \(dir)")

        try? FileManager.default.createDirectory(atPath: dir, withIntermediateDirectories: true)

        var isDirectory = ObjCBool(false)

        guard FileManager.default.fileExists(atPath: dir, isDirectory: &isDirectory), isDirectory.boolValue else {
            logger.error("Could not create dir, or dir is a file")
            fatalError()
        }

        guard let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHEVCHighestQualityWithAlpha) else {
            logger.error("generate export failed")
            fatalError()
        }

        exporter.outputURL = outputPath
        exporter.outputFileType = .mov
        exporter.shouldOptimizeForNetworkUse = false
        exporter.videoComposition = videoComposition

        await exporter.export()

        logger.debug("Status: \(String(describing: exporter.status)), error: \(String(describing: exporter.error))")

        if exporter.status != .completed {
            fatalError()
        }
    }

    private func extractData(videoAsset: AVURLAsset) async throws -> AVMutableVideoComposition {
        guard let videoTrack = try await videoAsset.loadTracks(withMediaType: .video).first else {
            fatalError()
        }

        let composition = AVMutableComposition(urlAssetInitializationOptions: nil)

        guard let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: videoTrack.trackID) else {
            fatalError()
        }

        let duration = try await videoAsset.load(.duration)

        try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: duration), of: videoTrack, at: CMTime.zero)

        let naturalSize = try await videoTrack.load(.naturalSize)
        let preferredTransform = try await videoTrack.load(.preferredTransform)
        let mainInstruction = AVMutableVideoCompositionInstruction()

        mainInstruction.timeRange = CMTimeRange(start: CMTime.zero, end: duration)

        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
        let videoComposition = AVMutableVideoComposition()
        let frameRate = try await videoTrack.load(.nominalFrameRate)

        videoComposition.frameDuration = CMTimeMake(value: 1, timescale: Int32(frameRate))
        mainInstruction.layerInstructions = [layerInstruction]
        videoComposition.instructions = [mainInstruction]
        videoComposition.renderSize = naturalSize

        return videoComposition
    }
}
```
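Editor's note, not part of the original post: a gray, washed-out export is the classic symptom of a color-space or transfer-function mismatch, so one direction worth investigating (an assumption, not a confirmed fix for this preset) is explicitly tagging the composition's color properties before exporting. The property names below are real AVFoundation API; whether they cure this specific HEVC-with-alpha case is untested:

```swift
import AVFoundation

// Hypothetical tweak (untested for this preset): explicitly tag the
// composition's color space so the exporter does not have to guess it.
// For a standard-dynamic-range source, BT.709 tags are the usual choice.
func tagColorSpace(of videoComposition: AVMutableVideoComposition) {
    videoComposition.colorPrimaries = AVVideoColorPrimaries_ITU_R_709_2
    videoComposition.colorYCbCrMatrix = AVVideoYCbCrMatrix_ITU_R_709_2
    videoComposition.colorTransferFunction = AVVideoTransferFunction_ITU_R_709_2
}
```

Calling this on the composition returned by `extractData` before handing it to the exporter would at least rule the color tagging in or out as the cause.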
Posted by Jasperav. Last updated.
Post not yet marked as solved
0 Replies
857 Views
I am running my UI tests on GitHub CI and the tests are flaky. I don't understand how I can fix it. Animations are disabled and I am running the tests on an iPhone 13 Pro simulator. A lot of tests run green, but some are not working. Locally, I got everything working. These are some logs:

```
2022-06-21T13:42:23.2627250Z t = 63.34s Tap Cell
2022-06-21T13:42:23.2707530Z t = 63.34s Wait for com.project.project to idle
2022-06-21T13:42:23.2733620Z t = 63.41s Unable to monitor event loop
2022-06-21T13:42:23.2734250Z t = 63.41s Unable to monitor animations
2022-06-21T13:42:23.2734800Z t = 63.42s Find the Cell
2022-06-21T13:42:24.1158670Z t = 64.45s Find the Cell (retry 1)
2022-06-21T13:42:24.1287900Z t = 64.45s Collecting extra data to assist test failure triage
2022-06-21T13:42:24.2022460Z /Users/runner/work/project/UITestCase.swift:665: error: -[project.UserTagTest testTapInTextView] : Failed to get matching snapshot: Lost connection to the application (pid 12676). (Underlying Error: Couldn’t communicate with a helper application. Try your operation again. If that fails, quit and relaunch the application and try again. The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003.)
```

I believe it cannot find the cell because of these log lines:

```
Unable to monitor event loop
Unable to monitor animations
```

I know this because I sometimes get a different error than the one above (saying the connection to the application is lost) directly below the "Unable to monitor..." log lines. Is there anything I can try? I don't have a reproduction project. This is the command that is executed:

```
xcodebuild test -project project.xcodeproj -scheme project-iosUITests -destination 'platform=iOS Simulator,name=iPhone 13 Pro,OS=15.5'
```

The CI runs 35 tests and 5 fail randomly with the "Unable to" errors. Is there any suggestion to fix this problem?
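Editor's note, not from the original post: when the simulator is under CI load, taps can race ahead of the UI. A common mitigation (a sketch, not a guaranteed fix for the "Unable to monitor" errors) is to wait explicitly for each element before interacting with it, using the real `XCUIElement.waitForExistence(timeout:)` API:

```swift
import XCTest

// Hypothetical helper (name is mine): instead of tapping immediately and
// relying on XCTest's implicit retries, wait for the element first so a
// slow CI simulator gets time to settle before the tap.
extension XCUIElement {
    func tapWhenReady(timeout: TimeInterval = 10,
                      file: StaticString = #filePath,
                      line: UInt = #line) {
        XCTAssertTrue(waitForExistence(timeout: timeout),
                      "Element never appeared within \(timeout)s",
                      file: file, line: line)
        tap()
    }
}
```

Separately, newer Xcode versions let `xcodebuild` retry flaky tests with the `-retry-tests-on-failure` and `-test-iterations` options, which can paper over this class of failure on CI without code changes.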
Posted by Jasperav. Last updated.
Post not yet marked as solved
2 Replies
2.4k Views
I want to have a TabView inside a NavigationView. The reason is that I am showing a List inside the TabView. When a user taps on an item in the list, the list uses a NavigationLink to show a detail screen. The problem is that the navigationBarTitle is now broken: it (sometimes) does not animate when the user scrolls through the items. I don't want to wrap my NavigationView inside the TabView, since that would always show the TabView, and I don't want that. This is the reproduction code. If you run it in the simulator, switch a few times between the tabs, and scroll through the list, the animation will break: the navigation bar will remain where it is, without the collapse animation.

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        NavigationView {
            TabView {
                TestView()
                    .tabItem {
                        Image(systemName: "person.3")
                    }

                TestView()
                    .tabItem {
                        Image(systemName: "person.2")
                    }
            }
            .navigationTitle("Test")
            .navigationViewStyle(.stack)
        }
    }
}

struct TestView: View {
    var body: some View {
        List {
            Text("test")
        }
        .listStyle(.plain)
    }
}
```
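Editor's note, not from the original post: a workaround sometimes used for this class of bug (untested against this exact repro, so treat it as an assumption) is to attach the title to each tab's content rather than to the TabView itself, so the large-title collapse is driven by the List that is actually visible:

```swift
import SwiftUI

// Hypothetical variant of the repro: the title moves onto each tab's
// content view. The idea is that the navigation bar then tracks the
// scroll position of the visible List instead of the TabView container.
struct WorkaroundContentView: View {
    var body: some View {
        NavigationView {
            TabView {
                TestView()                      // TestView from the repro above
                    .navigationTitle("Test")
                    .tabItem { Image(systemName: "person.3") }

                TestView()
                    .navigationTitle("Test")
                    .tabItem { Image(systemName: "person.2") }
            }
        }
        .navigationViewStyle(.stack)
    }
}
```

Whether this restores the collapse animation after tab switches would need to be verified in the simulator; it only changes where the title modifier sits, not the view hierarchy the post asks about.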
Posted by Jasperav. Last updated.
Post not yet marked as solved
3 Replies
2.3k Views
Also on https://stackoverflow.com/questions/57630483/annotations-are-hidden-when-zooming-out. When the property showsUserLocation is set to true and the annotations are centered around the user's location, clusters will disappear when zooming out. I don't want this; they should always be visible. The displayPriority's default value is set to required, so I am not sure how to fix this. **Every annotation and cluster should always be visible**. It works when I set showsUserLocation to false, or make sure the annotations/clusters are not near the current location.

There are 2 ways to reproduce my problem:
1. Clone and run https://github.com/Jasperav/AnnotationsGlitch
2. Copy, paste and run the code below

**Code**

```swift
import UIKit
import MapKit

class ViewController: UIViewController, MKMapViewDelegate, CLLocationManagerDelegate {
    let mapView = MKMapView(frame: .zero)
    private let locationManager = CLLocationManager()

    override func viewDidLoad() {
        view.addSubview(mapView)

        mapView.translatesAutoresizingMaskIntoConstraints = false
        mapView.widthAnchor.constraint(equalTo: view.widthAnchor).isActive = true
        mapView.heightAnchor.constraint(equalTo: view.heightAnchor).isActive = true
        mapView.centerYAnchor.constraint(equalTo: view.centerYAnchor).isActive = true
        mapView.centerXAnchor.constraint(equalTo: view.centerXAnchor).isActive = true
        mapView.delegate = self
        mapView.showsUserLocation = true
    }

    override func viewDidAppear(_ animated: Bool) {
        let authorizationStatus = CLLocationManager.authorizationStatus()

        switch authorizationStatus {
        case .authorizedWhenInUse, .authorizedAlways:
            break
        case .notDetermined, .restricted, .denied:
            locationManager.requestWhenInUseAuthorization()
        @unknown default:
            fatalError()
        }

        DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(1)) {
            for _ in 0...20 {
                let currentLocation = self.mapView.userLocation.coordinate
                let newLocation = currentLocation.shiftRandomPosition(meters: 30000, randomMeters: true)

                self.mapView.addAnnotation(Annotation(coordinate: newLocation))
            }

            self.mapView.setRegion(MKCoordinateRegion(center: self.mapView.userLocation.coordinate, span: MKCoordinateSpan(latitudeDelta: 0, longitudeDelta: 0)), animated: false)
        }
    }

    func mapView(_ mapView: MKMapView, clusterAnnotationForMemberAnnotations memberAnnotations: [MKAnnotation]) -> MKClusterAnnotation {
        MKClusterAnnotation(memberAnnotations: memberAnnotations)
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
    }

    func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
        guard let annotation = annotation as? Annotation else { return nil }

        let identifier = "marker"
        let view: MKMarkerAnnotationView

        if let dequeuedView = mapView.dequeueReusableAnnotationView(withIdentifier: identifier) as? MKMarkerAnnotationView {
            dequeuedView.annotation = annotation
            view = dequeuedView
        } else {
            view = MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: identifier)
        }

        view.clusteringIdentifier = "identifier"

        return view
    }
}

class Annotation: NSObject, MKAnnotation {
    let coordinate: CLLocationCoordinate2D

    init(coordinate: CLLocationCoordinate2D) {
        self.coordinate = coordinate
        super.init()
    }
}

// https://stackoverflow.com/a/55459722/7715250
extension CLLocationCoordinate2D {
    /// Get the coordinate moved `distanceMeters` meters from the current one, with azimuth `azimuth` in [0, Double.pi)
    ///
    /// - Parameters:
    ///   - distanceMeters: the distance in meters
    ///   - azimuth: the azimuth (bearing)
    /// - Returns: the new coordinate
    private func shift(byDistance distanceMeters: Double, azimuth: Double) -> CLLocationCoordinate2D {
        let bearing = azimuth
        let origin = self
        let distRadians = distanceMeters / 6372797.6 // earth radius in meters
        let lat1 = origin.latitude * Double.pi / 180
        let lon1 = origin.longitude * Double.pi / 180
        let lat2 = asin(sin(lat1) * cos(distRadians) + cos(lat1) * sin(distRadians) * cos(bearing))
        let lon2 = lon1 + atan2(sin(bearing) * sin(distRadians) * cos(lat1), cos(distRadians) - sin(lat1) * sin(lat2))

        return CLLocationCoordinate2D(latitude: lat2 * 180 / Double.pi, longitude: lon2 * 180 / Double.pi)
    }

    func shiftRandomPosition(meters: Double, randomMeters: Bool) -> CLLocationCoordinate2D {
        let finalMeters: Double

        if randomMeters {
            finalMeters = Double.random(in: -meters..<meters)
        } else {
            finalMeters = meters
        }

        let randomAzimuth = Double.random(in: -Double.pi..<Double.pi)

        return shift(byDistance: finalMeters, azimuth: randomAzimuth)
    }
}
```

A GIF of the problem: https://i.stack.imgur.com/zbKem.gif
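Editor's note, not from the original post: `.required` is the documented default for MKMarkerAnnotationView members, but cluster views created by MapKit may not inherit it. One thing worth trying (an assumption, untested against this repro) is to explicitly force the display priority on both the member views and the cluster views in `mapView(_:viewFor:)`:

```swift
import MapKit

// Hypothetical delegate method (would replace the repro's viewFor):
// handle MKClusterAnnotation explicitly and pin displayPriority to
// .required on every view, so MapKit is never allowed to hide an
// annotation or cluster to resolve overlap with the user-location view.
func annotationView(for annotation: MKAnnotation, on mapView: MKMapView) -> MKAnnotationView? {
    if annotation is MKUserLocation { return nil }

    let identifier = annotation is MKClusterAnnotation ? "cluster" : "marker"
    let view = (mapView.dequeueReusableAnnotationView(withIdentifier: identifier) as? MKMarkerAnnotationView)
        ?? MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: identifier)

    view.annotation = annotation
    view.displayPriority = .required   // never hide this view
    if !(annotation is MKClusterAnnotation) {
        view.clusteringIdentifier = "identifier"
    }
    return view
}
```

`displayPriority` and the `MKClusterAnnotation` check are real MapKit API; whether forcing them here actually keeps clusters visible near the user location would need to be verified.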
Posted by Jasperav. Last updated.
Post not yet marked as solved
0 Replies
3.3k Views
Hi all, I have been having this problem for a few months now, and I already asked about it on Stack Overflow: https://stackoverflow.com/questions/54281617/bounce-occurs-when-changing-rows. If you clone this project, you can run it directly and see the problem described below for yourself (if you scroll a little bit after you see the rows): https://github.com/Jasperav/FetchResultControllerGlitch. I am using a UITableView with performBatchUpdates to process new rows. When I insert a new row at the top, it sometimes glitches (as seen in the link above; it does not glitch when starting the app up and waiting). I don't want it to glitch. I want the same behaviour as WhatsApp (assuming they use a UITableView/UICollectionView): when a new message comes in and the user has scrolled up a little to read older messages, the messages the user is looking at do not move (no bounce, no glitches). I just cannot get it to work. I already tried CATransaction, UIView.performWithoutAnimation, layoutSubviews and layoutIfNeeded; different calculations do not help. Of course I checked other answers in this community and on Stack Overflow, but nothing is working for me. This is the most important part, I think. It tries to calculate the new content offset:

```swift
public func controllerDidChangeContent(_ controller: NSFetchedResultsController<NSFetchRequestResult>) {
    let currentSize = tableView.contentSize.height

    UIView.performWithoutAnimation {
        tableView.performBatchUpdates({
            tableView.insertRows(at: inserts, with: .automatic)
        })

        inserts.removeAll()

        self.view.layoutIfNeeded() // The table view scrolls weirdly after the correction; this call will omit that.

        let newSize = tableView.contentSize.height
        let correctedY = tableView.contentOffset.y + newSize - currentSize

        print("Will apply a corrected Y value of: \(correctedY)")
        tableView.setContentOffset(CGPoint(x: 0, y: correctedY), animated: false)
        print("Corrected to: \(correctedY)")
    }
}
```
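Editor's note, not the poster's code: chat apps commonly sidestep the content-offset math entirely by flipping the table view upside down, so that row 0 is the newest message at the visual bottom. New rows are then inserted at index 0 beyond the top edge of the flipped viewport, and the rows the user is reading never move. A minimal sketch of that technique (names are mine, untested against this repro):

```swift
import UIKit

// Inverted-table-view technique: flip the table, flip each cell back.
// Inserting at row 0 then appends "below" visually, without touching
// the content offset of the rows the user is currently reading.
final class InvertedChatTableViewController: UITableViewController {
    private var messages: [String] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.register(UITableViewCell.self, forCellReuseIdentifier: "cell")
        // Flip the whole table upside down...
        tableView.transform = CGAffineTransform(scaleX: 1, y: -1)
    }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        messages.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "cell", for: indexPath)
        // ...and flip every cell back so its content reads upright.
        cell.contentView.transform = CGAffineTransform(scaleX: 1, y: -1)
        cell.textLabel?.text = messages[indexPath.row]
        return cell
    }

    func insertNewest(_ message: String) {
        messages.insert(message, at: 0)
        tableView.insertRows(at: [IndexPath(row: 0, section: 0)], with: .none)
    }
}
```

The trade-offs are known: scroll indicators, section headers and keyboard avoidance all need flipping too, but it is the approach most often attributed to messaging UIs.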
Posted by Jasperav. Last updated.