Posts

I'm making a widget app that shows today's date and a calendar. I want the widget to refresh every midnight, so I tried several approaches, but none of them worked reliably.

First try: provide timeline entries dated at each upcoming midnight. The widget does not actually update exactly at midnight.

struct Provider: TimelineProvider {
    func getTimeline(in context: Context, completion: @escaping (Timeline<Entry>) -> Void) {
        var entries: [SomeEntry] = []
        let currentDate = Date()
        let currentEntry = SomeEntry(date: currentDate, content: content)
        entries.append(currentEntry)
        for offset in 1 ..< 5 {
            let day = Calendar.autoupdatingCurrent.date(byAdding: .day, value: offset, to: currentDate)!
            let midnight = Calendar.autoupdatingCurrent.startOfDay(for: day)
            let entry = SomeEntry(date: midnight, content: content)
            entries.append(entry)
        }
        completion(Timeline(entries: entries, policy: .atEnd))
    }
}

Second try: dynamic dates. I tried the dynamic date styles described here (https://developer.apple.com/documentation/widgetkit/displaying-dynamic-dates), but they aren't customizable, so I don't think they can be used for a calendar widget.

Third try: local notifications. I tried using a local notification to reload the widget every midnight, but iOS doesn't allow silent local notifications.

Fourth try: background tasks. I tried background tasks, but they won't refresh the widget once the app is terminated.

I know other popular widget apps update exactly at midnight; they even keep working when I manually change the device time. I think there must be a way to notify the widget extension when the date changes. Any idea how to do it?
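One partial workaround, sketched below under the assumption that the containing app may be running at the time: the system posts UIApplication.significantTimeChangeNotification when the calendar day changes (and when the clock or time zone is changed manually), and the app can ask WidgetKit to rebuild the timeline from there. "MyCalendarWidget" is a placeholder kind string, not anything from the post.

import UIKit
import WidgetKit

// Sketch: observe the day-change notification in the containing app and
// ask WidgetKit to reload the widget's timeline. This only helps while
// the app process is alive; it does not cover the terminated-app case.
final class MidnightReloader {
    private var observer: NSObjectProtocol?

    func start() {
        observer = NotificationCenter.default.addObserver(
            forName: UIApplication.significantTimeChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            // Fired at midnight, on time-zone changes, and on manual clock changes.
            WidgetCenter.shared.reloadTimelines(ofKind: "MyCalendarWidget")
        }
    }

    deinit {
        if let observer = observer {
            NotificationCenter.default.removeObserver(observer)
        }
    }
}

Note this does not solve the case the post describes where the app has been terminated; it only covers date changes that happen while the app is running.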
Posted by hyoungbin.
I want to apply a depth blur effect to a real-time preview, like the system camera app's portrait mode. I tried it with this code:

CameraViewController:

extension CameraViewController: AVCaptureDataOutputSynchronizerDelegate {
    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
        guard let syncedDepthData = synchronizedDataCollection.synchronizedData(for: depthDataOutput) as? AVCaptureSynchronizedDepthData,
              let syncedVideoData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
            print("Could not get data from synchronizedDataCollection")
            return
        }
        let sampleBuffer = syncedVideoData.sampleBuffer
        let depthData = syncedDepthData.depthData
        guard let videoPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        var finalImage = CIImage(cvPixelBuffer: videoPixelBuffer)

        // Scale & origin
        // previewView is a subclass of MTKView
        let drawableSize = previewView.drawableSize
        let scaleX = drawableSize.width / finalImage.extent.width
        let scaleY = drawableSize.height / finalImage.extent.height
        let scale = min(scaleX, scaleY)
        finalImage = finalImage.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
        let originX = (drawableSize.width - finalImage.extent.size.width) / 2
        let originY = (drawableSize.height - finalImage.extent.size.height) / 2
        finalImage = finalImage.transformed(by: CGAffineTransform(translationX: originX, y: originY))

        var depthImage = CIImage(cvPixelBuffer: depthData.depthDataMap).applyingFilter("CIColorInvert")
        let scaleFactor = Float(finalImage.extent.width) / Float(depthImage.extent.width)
        depthImage = depthImage.applyingFilter("CIBicubicScaleTransform", parameters: ["inputScale": scaleFactor])
        finalImage = finalImage.applyingFilter("CIMaskedVariableBlur", parameters: ["inputMask": depthImage, "inputRadius": 8.0])

        previewView.image = finalImage
    }
}

Preview:

class Preview: MTKView {
    ...
    override func draw(_ rect: CGRect) {
        guard let currentDrawable = currentDrawable,
              let commandBuffer = self.commandQueue!.makeCommandBuffer(),
              let previewImage = image else {
            return
        }
        let destination = CIRenderDestination(width: Int(drawableSize.width),
                                              height: Int(drawableSize.height),
                                              pixelFormat: colorPixelFormat,
                                              commandBuffer: commandBuffer) { () -> MTLTexture in
            return currentDrawable.texture
        }
        do {
            try self.context.startTask(toRender: previewImage, to: destination)
        } catch {
            print("render task error", error)
        }
        commandBuffer.present(currentDrawable)
        commandBuffer.commit()
    }
}

I chose the CIMaskedVariableBlur filter instead of CIDepthBlurEffect because it needs less computation. However, applying CIMaskedVariableBlur in the MTKView pipeline (which runs at 30 FPS) is still too slow and consumes a huge amount of CPU (almost 50-60%). I used CIRenderDestination and CIContext.startTask because they are supposed to coordinate CPU and GPU work efficiently, and my CIContext is backed by the same MTLDevice that the MTKView uses. I can't find any further way to improve performance enough to apply the depth blur in real time. Do I have to write a custom filter, or find some way with Metal Performance Shaders?
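One common way to cut per-frame Core Image cost, sketched below and not tested against this exact pipeline: run the expensive blur at a reduced working resolution and scale back up afterward. The 0.5 working scale is an arbitrary assumption to illustrate the idea, not a recommended value.

import CoreImage

// Sketch: blur at half resolution, then scale back up. CIMaskedVariableBlur
// cost grows with pixel count, so a 0.5 scale roughly quarters the work;
// the radius is shrunk along with the image so the look stays comparable.
func depthBlurred(image: CIImage, mask: CIImage, radius: Double = 8.0) -> CIImage {
    let workingScale: CGFloat = 0.5

    // Downscale both the image and the mask before the expensive filter.
    let downTransform = CGAffineTransform(scaleX: workingScale, y: workingScale)
    let smallImage = image.transformed(by: downTransform)
    let smallMask = mask.transformed(by: downTransform)

    let blurred = smallImage
        .applyingFilter("CIMaskedVariableBlur",
                        parameters: ["inputMask": smallMask,
                                     "inputRadius": radius * Double(workingScale)])
        .cropped(to: smallImage.extent) // the blur can expand the extent

    // Scale back to the original size for display.
    return blurred.transformed(by: CGAffineTransform(scaleX: 1 / workingScale,
                                                     y: 1 / workingScale))
}

Whether the savings are enough for a steady 30 FPS here is untested; a custom Metal kernel or an MPS-based path remains the heavier-weight option if it isn't.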
Posted by hyoungbin.
Hi, I'm trying to replace portraitEffectsMatte with a custom pixel buffer I created.

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    var finalPortraitMatte: AVPortraitEffectsMatte?
    if let portraitMatte = photo.portraitEffectsMatte {
        let mattingImage = portraitMatte.mattingImage
        let mattingCIImage = CIImage(cvPixelBuffer: mattingImage)
        // This should be kCVPixelFormatType_OneComponent8
        let matteImageType = CVPixelBufferGetPixelFormatType(mattingImage)
        if let destMatteBuffer = processMatteImage(ciImage: mattingCIImage, formatType: matteImageType) {
            do {
                finalPortraitMatte = try portraitMatte.replacingPortraitEffectsMatte(with: destMatteBuffer)
            } catch {
                // I get an error here
                print("replacing matte error", error)
            }
        }
    }
}

func processMatteImage(ciImage: CIImage, formatType: OSType) -> CVPixelBuffer? {
    var ciMatteImage = ciImage
    var destPixelBuffer: CVPixelBuffer?
    let width = Int(ciMatteImage.extent.width)
    let height = Int(ciMatteImage.extent.height)
    let attrs = [kCVPixelBufferColorPrimariesKey: kCVImageBufferColorPrimaries_ITU_R_709_2,
                 kCVPixelBufferTransferFunctionKey: kCVImageBufferTransferFunction_Linear] as [CFString: Any]
    CVPixelBufferCreate(kCFAllocatorDefault, width, height, formatType, attrs as CFDictionary, &destPixelBuffer)
    if self.cameraDevicePosition == .front {
        ciMatteImage = ciMatteImage.transformed(by: CGAffineTransform(a: -1, b: 0, c: 0, d: 1, tx: CGFloat(width), ty: 0))
    }
    if let pixelBuffer = destPixelBuffer {
        CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
        CIContext().render(ciMatteImage, to: pixelBuffer)
        CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
        return pixelBuffer
    }
    return nil
}

When I try to replace the matte pixel buffer, I get this error:

Error Domain=AVFoundationErrorDomain Code=-11864 "Format Unsupported" UserInfo={NSLocalizedDescription=Format Unsupported, NSLocalizedFailureReason=The format of this content is unsupported.}

But according to https://developer.apple.com/documentation/avfoundation/avportraiteffectsmatte/2976124-replacingportraiteffectsmatte, the portrait matte buffer's pixel format is kCVPixelFormatType_OneComponent8, which is the same format the code above uses. I also added some attributes according to that link. My device is an iPhone XS and the OS version is 12.1.4. I can't figure out why the "Format Unsupported" error occurs. Is there anything I missed? Thanks in advance.
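One thing that may be worth trying, sketched below as an assumption rather than a confirmed cause of the -11864 error: some AVFoundation interchange paths expect the pixel buffer to be IOSurface-backed, which CVPixelBufferCreate only does when asked.

import CoreVideo

// Sketch: create a OneComponent8 buffer that is IOSurface-backed.
// An empty IOSurface properties dictionary asks CoreVideo to allocate
// the buffer on an IOSurface; whether this resolves the "Format
// Unsupported" error in this case is untested.
func makeMatteBuffer(width: Int, height: Int) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferIOSurfacePropertiesKey: [CFString: Any]()
    ]
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     width,
                                     height,
                                     kCVPixelFormatType_OneComponent8,
                                     attrs as CFDictionary,
                                     &buffer)
    return status == kCVReturnSuccess ? buffer : nil
}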
Posted by hyoungbin.