
What are dual camera limitations?
In some cases, when I use the "builtInDualCamera" capture device, I get a blank white depth photo; in other cases it works fine, so I suspect there is a limitation. I use this code to save the depth image:

private func saveDepth(name: String, photo: AVCapturePhoto) throws -> URL {
    guard var depthData = photo.depthData else {
        throw CameraError.photo("No depth data")
    }
    print("DEBUG: depth data quality \(depthData.depthDataQuality)")
    if let orientationValue = photo.metadata[kCGImagePropertyOrientation as String] as? UInt32,
       let orientation = CGImagePropertyOrientation(rawValue: orientationValue) {
        depthData = depthData.applyingExifOrientation(orientation)
    }
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let image = CIImage(cvPixelBuffer: converted.depthDataMap)
    guard let colorSpace = image.colorSpace else {
        throw CameraError.photo("Can not get image color space")
    }
    guard let data = context.tiffRepresentation(
        of: image,
        format: .Lf,
        colorSpace: colorSpace,
        options: [:]
    ) else {
        throw CameraError.photo("Can not create TIFF")
    }
    let url = Path.cachePath.appendingPathComponent(name + "_depth.tiff")
    try data.write(to: url)
    return url
}

Is there any information about the maximum distance or other limitations of the dual camera?
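For what it's worth, dual-camera depth is derived from stereo disparity between the two lenses, so low light and subjects outside the working range tend to degrade it. Before blaming the hardware, it may help to confirm depth delivery is supported and enabled, and to log what the delivered depth claims about itself. A minimal sketch, assuming an AVCapturePhotoOutput named photoOutput already attached to a configured session (the helper names are mine):

import AVFoundation

// Enable depth delivery if supported, and opt in per capture.
func depthPhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
    }
    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
    return settings
}

// Log what the captured depth reports; .low quality or .relative accuracy
// often correlates with degraded maps (poor light, subject too close or far).
func inspectDepth(of photo: AVCapturePhoto) {
    guard let depth = photo.depthData else { return }
    print("quality:", depth.depthDataQuality == .high ? "high" : "low")
    print("accuracy:", depth.depthDataAccuracy == .absolute ? "absolute" : "relative")
    print("filtered:", depth.isDepthDataFiltered)
}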
Replies: 0 · Boosts: 0 · Views: 501 · Activity: Mar ’22
ASTC encoding on iOS
I found the following code for ASTC encoding:

NSMutableData *data = [NSMutableData data];
CGImageDestinationRef destination = CGImageDestinationCreateWithData((CFMutableDataRef)data, (CFStringRef)@"org.khronos.ktx", 1, nil);
NSDictionary *properties = @{@"kCGImagePropertyASTCBlockSize": @(0x88)};
CGImageDestinationAddImage(destination, source, (CFDictionaryRef)properties);
CGImageDestinationFinalize(destination);
CFRelease(destination); // Create-rule CF object; not managed by ARC

This code produces data in the KTX format. I cannot find any documentation for the kCGImagePropertyASTCBlockSize key, so I assume this code path is undocumented. I have two questions:
1. Is it safe to use? Might Apple change it in the future?
2. Is the encoding done in hardware or on the CPU?
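In case it helps for comparison, here is the same approach sketched in Swift. The block-size key remains a raw string, since no public constant exists for it, and the value 0x88 presumably selects an 8x8 block footprint; both are assumptions carried over from the snippet above, not documented API:

import Foundation
import ImageIO

// Sketch of the same undocumented approach in Swift.
func encodeToKTX(_ image: CGImage) -> Data? {
    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(
        output as CFMutableData,
        "org.khronos.ktx" as CFString, // KTX container type identifier
        1,                             // one image
        nil
    ) else { return nil }
    // Undocumented key; 0x88 presumably means an 8x8 ASTC block footprint.
    let properties = ["kCGImagePropertyASTCBlockSize": 0x88] as CFDictionary
    CGImageDestinationAddImage(destination, image, properties)
    return CGImageDestinationFinalize(destination) ? (output as Data) : nil
}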
Replies: 0 · Boosts: 0 · Views: 695 · Activity: Aug ’20
Blending for the MTLPixelFormat.r8Unorm texture
Is it possible to get correct blending with an MTLPixelFormat.r8Unorm texture? Here is what I use now, but it produces wrong results:

private func createBrushDescriptor(library: MTLLibrary) -> MTLRenderPipelineDescriptor {
    let vertexDescriptor = MTLVertexDescriptor()
    vertexDescriptor.attributes[0].format = .float2
    vertexDescriptor.attributes[0].offset = 0
    vertexDescriptor.attributes[0].bufferIndex = 0
    vertexDescriptor.layouts[0].stride = 2 * MemoryLayout<simd_float1>.size
    vertexDescriptor.layouts[0].stepFunction = .perVertex
    vertexDescriptor.layouts[0].stepRate = 1

    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = library.makeFunction(name: "brush_vertex_function")
    descriptor.vertexDescriptor = vertexDescriptor
    descriptor.fragmentFunction = library.makeFunction(name: "brush_fragment_function")
    descriptor.colorAttachments[0].pixelFormat = .r8Unorm

    descriptor.colorAttachments[0].isBlendingEnabled = true
    descriptor.colorAttachments[0].rgbBlendOperation = .add
    descriptor.colorAttachments[0].sourceRGBBlendFactor = .one
    descriptor.colorAttachments[0].destinationRGBBlendFactor = .one

    descriptor.depthAttachmentPixelFormat = .invalid

    return descriptor
}
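One possibility, an assumption on my part since the shader isn't shown: the .one/.one factors give pure additive blending, so overlapping brush strokes accumulate until the channel clamps at 1.0. If the intent is source-over compositing instead, note that blending still reads the alpha the fragment function outputs even though an r8Unorm target only stores the red channel, so the fragment function would need to return a float4 whose .a carries coverage. A sketch of that setup against the same descriptor:

// Source-over blending sketch: the target stores only .r,
// but the blend factors below still use the fragment's output .a.
descriptor.colorAttachments[0].isBlendingEnabled = true
descriptor.colorAttachments[0].rgbBlendOperation = .add
descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha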
Replies: 1 · Boosts: 0 · Views: 451 · Activity: Jul ’20
How to show open and save dialog?
I tried the following code:

struct FilePickerView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> UIDocumentPickerViewController {
        let documentPicker = UIDocumentPickerViewController(documentTypes: [(kUTTypeImage as String)], in: .open)
        return documentPicker
    }

    func updateUIViewController(_ uiViewController: UIDocumentPickerViewController, context: Context) {
        // TODO
    }
}

and presented it like this:

.sheet(isPresented: $showFilePicker) {
    FilePickerView()
}

But on macOS I see only a blank screen centered in my app window. I need a solution that works on Catalyst.
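For comparison, a minimal sketch of the usual representable shape, with a hypothetical onPick callback added so the chosen URL reaches SwiftUI via a Coordinator acting as the picker's delegate. This doesn't specifically address the Catalyst blank-sheet behavior, but it is the structure the representable generally needs:

import SwiftUI
import MobileCoreServices

// Sketch: onPick is a hypothetical callback; the Coordinator is the
// picker's delegate so the selection makes it back into SwiftUI.
struct FilePickerView: UIViewControllerRepresentable {
    var onPick: (URL) -> Void

    func makeCoordinator() -> Coordinator { Coordinator(onPick: onPick) }

    func makeUIViewController(context: Context) -> UIDocumentPickerViewController {
        let picker = UIDocumentPickerViewController(documentTypes: [kUTTypeImage as String], in: .open)
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: UIDocumentPickerViewController, context: Context) {}

    final class Coordinator: NSObject, UIDocumentPickerDelegate {
        let onPick: (URL) -> Void
        init(onPick: @escaping (URL) -> Void) { self.onPick = onPick }

        func documentPicker(_ controller: UIDocumentPickerViewController,
                            didPickDocumentsAt urls: [URL]) {
            urls.first.map(onPick)
        }
    }
}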
Replies: 4 · Boosts: 0 · Views: 3.9k · Activity: Feb ’20