Hello! I use PHImageManager.default() to fetch a UIImage from the iOS gallery (the Photos app). I noticed that screenshot images from the gallery have a bitsPerComponent value of 16, which Quartz supports as a bitmap-context format only on macOS. The only way I could fix this was by redrawing the image with UIGraphicsBeginImageContextWithOptions (the method below lives in a UIImage extension, so size and draw(in:) refer to the image itself):

```swift
private func redrawImageTo8Bit() -> UIImage? {
    // UIGraphicsImageRenderer cannot change the bitsPerComponent value
    UIGraphicsBeginImageContextWithOptions(size, false, UIScreen.main.scale)
    draw(in: CGRect(origin: .zero, size: size))
    let redrawnImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return redrawnImage
}
```

https://developer.apple.com/library/archive/documentation/GraphicsImaging/Conceptual/drawingwithquartz2d/dq_context/dq_context.html

Example of using PHImageManager.default():

```swift
PHImageManager.default().requestImage(
    for: phAsset,
    targetSize: size,
    contentMode: contentMode,
    options: options
) { [weak self] image, info in
    if let image = image {
        // Screenshot taken on the device (iPhone 12 Pro Max)
        let bitsPerComponent = image.cgImage?.bitsPerComponent // 16
    }
}
```
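As an alternative to the UIGraphicsBeginImageContextWithOptions workaround, the redraw can also be done directly with a CGContext whose bitsPerComponent is explicitly set to 8. This is a sketch, not code from the original post; the extension name redrawnTo8BitSRGB() is hypothetical:

```swift
import UIKit

extension UIImage {
    /// Hypothetical helper: redraws the image into an 8-bit-per-component
    /// sRGB bitmap context and returns the result.
    func redrawnTo8BitSRGB() -> UIImage? {
        guard let cgImage = cgImage,
              let colorSpace = CGColorSpace(name: CGColorSpace.sRGB),
              let context = CGContext(
                  data: nil,
                  width: cgImage.width,
                  height: cgImage.height,
                  bitsPerComponent: 8, // force 8 bits per component
                  bytesPerRow: 0,      // let Quartz pick the row stride
                  space: colorSpace,
                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
              )
        else { return nil }

        // Draw the original (possibly 16-bit) image into the 8-bit context.
        context.draw(cgImage,
                     in: CGRect(x: 0, y: 0,
                                width: cgImage.width, height: cgImage.height))
        guard let output = context.makeImage() else { return nil }
        return UIImage(cgImage: output, scale: scale, orientation: imageOrientation)
    }
}
```

Unlike the UIGraphicsBeginImageContextWithOptions approach, this gives explicit control over the pixel format and color space, at the cost of choosing a bitmapInfo yourself.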
Hello, I fetch a Live Photo's paired video using PHImageManager and PHAssetResourceManager to get its Data. I then write the data to a fileURL and wrap it in an AVAsset, and everything works fine. But I also want to trim this AVAsset using AVMutableComposition. I use AVMutableCompositionTrack's insertTimeRange(_:of:at:) method, and I don't know why, but the naturalSize of originalVideoTrack and the new videoTrack are different, and this happens only with Live Photos; regular videos work fine. It seems like an AVMutableCompositionTrack bug inside AVFoundation. Please give me some info. Thanks!

```swift
PHImageManager.default().requestLivePhoto(
    for: phAsset,
    targetSize: size,
    contentMode: .aspectFill,
    options: livePhotoOptions
) { [weak self] livePhoto, info in
    guard let livePhoto else { return }
    self?.writeAVAsset(livePhoto: livePhoto, fileURL: fileURL)
}

private func writeAVAsset(livePhoto: PHLivePhoto, fileURL: URL) {
    let resources = PHAssetResource.assetResources(for: livePhoto)
    guard let videoResource = resources.first(where: { $0.type == .pairedVideo }) else { return }

    let requestOptions = PHAssetResourceRequestOptions()
    var data = Data()
    dataRequestID = PHAssetResourceManager.default().requestData(
        for: videoResource,
        options: requestOptions,
        dataReceivedHandler: { chunk in
            data.append(chunk)
        },
        completionHandler: { [weak self] error in
            try? data.write(to: fileURL)
            let avAsset = AVAsset(url: fileURL)
            let composition = AVMutableComposition()
            if let originalVideoTrack = avAsset.tracks(withMediaType: .video).first,
               let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                            preferredTrackID: 0) {
                // originalVideoTrack has naturalSize (width: 1744, height: 1308)
                // timeRange is defined elsewhere in the class
                try? videoTrack.insertTimeRange(timeRange, of: originalVideoTrack, at: .zero)
                videoTrack.preferredTransform = originalVideoTrack.preferredTransform
                // videoTrack has naturalSize (width: 1920.0, height: 1440.0)
            }
        }
    )
}
```
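One possible workaround, not from the original post: if the goal is only to trim the asset, AVAssetExportSession can export a time range directly without rebuilding the track through AVMutableComposition, which sidesteps the naturalSize change. The function name exportTrimmed and its parameters are hypothetical; a sketch, assuming the asset has already been written to disk as above:

```swift
import AVFoundation

/// Hypothetical helper: trims `asset` to `timeRange` by exporting only that
/// range, instead of inserting the track into an AVMutableComposition.
func exportTrimmed(asset: AVAsset,
                   to outputURL: URL,
                   timeRange: CMTimeRange,
                   completion: @escaping (Error?) -> Void) {
    // Passthrough avoids re-encoding, so the track's dimensions are untouched.
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetPassthrough) else {
        completion(nil)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.timeRange = timeRange // only this range is exported
    session.exportAsynchronously {
        completion(session.error)
    }
}
```

Whether this preserves the Live Photo's original 1744x1308 naturalSize would need to be verified on-device, but it avoids insertTimeRange entirely.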