Posts

Post not yet marked as solved
0 Replies
499 Views
The MPS API allows running kernels in an MTLCommandBuffer, but is it possible to create an MTLComputeCommandEncoder and run several kernels in it, without a separate encoder being created for each kernel under the hood? Something like:

// Create command buffer
// Create encoder
kernel1.encode(encoder: encoder, sourceTexture: source, destinationTexture: k1Destination)
kernel2.encode(encoder: encoder, sourceTexture: k1Destination, destinationTexture: destination)
encoder.endEncoding()
commandBuffer.commit()
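For custom (non-MPS) compute kernels this pattern is possible today: a single encoder can dispatch any number of pipelines back to back. A sketch, assuming the pipeline states are already built and the device supports non-uniform threadgroup sizes (MPS kernels themselves only expose `encode(commandBuffer:...)`, which manages its own encoder internally):

```swift
import Metal

// Sketch: dispatch two custom compute pipelines in one encoder.
// All parameters are assumed to be created elsewhere.
func runTwoKernels(queue: MTLCommandQueue,
                   pipeline1: MTLComputePipelineState,
                   pipeline2: MTLComputePipelineState,
                   source: MTLTexture,
                   intermediate: MTLTexture,
                   destination: MTLTexture) {
    guard let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeComputeCommandEncoder() else { return }

    let threads = MTLSize(width: source.width, height: source.height, depth: 1)
    let group = MTLSize(width: 8, height: 8, depth: 1)

    // First kernel: source -> intermediate.
    encoder.setComputePipelineState(pipeline1)
    encoder.setTexture(source, index: 0)
    encoder.setTexture(intermediate, index: 1)
    encoder.dispatchThreads(threads, threadsPerThreadgroup: group)

    // Second kernel reuses the same encoder: intermediate -> destination.
    encoder.setComputePipelineState(pipeline2)
    encoder.setTexture(intermediate, index: 0)
    encoder.setTexture(destination, index: 1)
    encoder.dispatchThreads(threads, threadsPerThreadgroup: group)

    encoder.endEncoding()
    commandBuffer.commit()
}
```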
Post marked as solved
1 Reply
843 Views
I'm working on a 2D drawing application. I receive CGPoints from UITouches and transform them to Metal coordinate space. In most cases I have to create several vertices from one CGPoint, apply a transformation to them, and convert them to Metal coordinate space. I use simd and vector-matrix multiplication, so I have four options:

1. Create an affine 3D matrix combining the linear transform (scale/rotation in my case) with translation (matrix_float3x3) and perform the vector-matrix multiplication on the CPU using simd.
2. Create the same affine transform and perform the multiplication on the GPU in the vertex function.
3. Create a uniform with a separate matrix_float2x2 linear transformation and a simd_float2 translation, and perform an fma operation with the 2D vector, linear 2D matrix, and 2D translation vector on the CPU using Accelerate.
4. The same as the third option, but perform the fma on the GPU in the vertex function.

Which is more efficient, and what are the best practices in GPU programming? If I understand correctly, fma and vector-matrix multiplication each map to a single processor instruction. Am I right? I have no more than 10 CGPoints, which produce about 40-80 vertices on every draw call.
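The two CPU-side layouts can be compared directly. A minimal sketch (with hypothetical scale and translation values) showing that the homogeneous 3x3 path and the separate 2x2-plus-translation path produce the same result:

```swift
import simd

// Hypothetical transform: uniform scale by 2, translate by (1, 1).
let scale: Float = 2
let t = simd_float2(1, 1)
let p = simd_float2(3, 4)

// Option 1: homogeneous 3x3 matrix, point extended to (x, y, 1).
let m3 = simd_float3x3(columns: (
    simd_float3(scale, 0, 0),
    simd_float3(0, scale, 0),
    simd_float3(t.x, t.y, 1)
))
let h = m3 * simd_float3(p.x, p.y, 1)
let r1 = simd_float2(h.x, h.y)

// Option 3: separate 2x2 linear part plus translation (one multiply + add,
// the shape a fused multiply-add maps onto).
let m2 = simd_float2x2(columns: (simd_float2(scale, 0), simd_float2(0, scale)))
let r2 = m2 * p + t

// r1 and r2 both equal (7, 9).
```

With only 40-80 vertices per draw call, the difference between these layouts is unlikely to be measurable; choosing the clearer representation is a reasonable default.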
Post not yet marked as solved
0 Replies
678 Views
I have several CGImages with a transparent background and save them into one HEIC image with Image I/O:

func heic(from images: [CGImage]) -> Data? {
    guard
        let mutableData = CFDataCreateMutable(kCFAllocatorDefault, 0),
        let destination = CGImageDestinationCreateWithData(mutableData, "public.heic" as CFString, images.count, nil)
    else { return nil }
    for image in images {
        CGImageDestinationAddImage(
            destination,
            image,
            [
                kCGImageDestinationEmbedThumbnail: true,
                kCGImageDestinationLossyCompressionQuality: 1
            ] as CFDictionary
        )
    }
    guard CGImageDestinationFinalize(destination) else { return nil }
    return mutableData as Data
}

Then I get a thumbnail from the HEIC data with CGImageSourceCreateThumbnailAtIndex:

func thumbnail(from data: Data, size: CGSize) -> CGImage? {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return nil }
    guard let context = CGContext(
        data: nil,
        width: Int(size.width),
        height: Int(size.height),
        bitsPerComponent: 8,
        bytesPerRow: 0,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
    ) else { return nil }
    for index in 0 ..< CGImageSourceGetCount(source) {
        guard let image = CGImageSourceCreateThumbnailAtIndex(
            source,
            index,
            [
                kCGImageSourceCreateThumbnailFromImageIfAbsent: true,
                kCGImageSourceThumbnailMaxPixelSize: Int(max(size.width, size.height))
            ] as CFDictionary
        ) else { continue }
        context.draw(image, in: CGRect(origin: .zero, size: size))
    }
    return context.makeImage()
}

The problem is that the resulting image has a black background instead of a transparent one. I see this only on a device (iPad Pro, iPadOS 14.2), not in the simulator. Also, if I remove kCGImageDestinationEmbedThumbnail: true from the destination properties, so that no thumbnail is saved in the data, I get a correct thumbnail with a transparent background from CGImageSourceCreateThumbnailAtIndex. What am I doing wrong? I want the thumbnails stored in the HEIC image rather than created at runtime on request.
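One way to narrow down whether the embedded thumbnail itself has lost its alpha channel (as opposed to the drawing step) is to inspect the container's per-image properties. A diagnostic sketch, Apple platforms only:

```swift
import Foundation
import ImageIO

// Diagnostic sketch: for each image in the HEIC data, report whether
// Image I/O says it carries an alpha channel.
func alphaFlags(in data: Data) -> [Bool] {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return [] }
    return (0 ..< CGImageSourceGetCount(source)).map { index in
        let properties = CGImageSourceCopyPropertiesAtIndex(source, index, nil) as? [CFString: Any]
        return properties?[kCGImagePropertyHasAlpha] as? Bool ?? false
    }
}
```

If this reports `false` for images whose embedded thumbnail is used, the alpha is being dropped at encode time on the device, not in your thumbnail-drawing code.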
Post not yet marked as solved
1 Reply
1.7k Views
I'm trying to move my current UITableView to UICollectionView with compositional layout and the new configuration API. UICollectionViewDiffableDataSource<SectionIdentifierType, ItemIdentifierType> has two generic types, for section and item, and two providers:

public typealias CellProvider = (UICollectionView, IndexPath, ItemIdentifierType) -> UICollectionViewCell?
public typealias SupplementaryViewProvider = (UICollectionView, String, IndexPath) -> UICollectionReusableView?

The problem is that only CellProvider receives the data (ItemIdentifierType) as a parameter, so I can configure the cell with it. SupplementaryViewProvider doesn't have SectionIdentifierType among its parameters, so I can't see how to properly configure a section header/footer with the section data. I don't want to use the IndexPath to look up the section data in some data array kept alongside the collection view, since that is the same approach as the previous delegate API. What is the solution for creating a list with custom section headers in the new API?
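One common approach is to resolve the section identifier from the data source's own snapshot inside the provider, rather than from a parallel array. A sketch, with hypothetical `Section`, `Item`, and `headerRegistration` stand-ins (on iOS 15+ `dataSource.sectionIdentifier(for:)` does the lookup directly):

```swift
import UIKit

// Hypothetical identifier types; substitute your own.
enum Section: Hashable { case main }
struct Item: Hashable { let title: String }

final class ListController {
    var dataSource: UICollectionViewDiffableDataSource<Section, Item>!

    func configureSupplementaryProvider(
        headerRegistration: UICollectionView.SupplementaryRegistration<UICollectionViewListCell>
    ) {
        dataSource.supplementaryViewProvider = { [weak self] collectionView, kind, indexPath in
            guard let self else { return nil }
            // Snapshot lookup keeps the diffable data source as the single
            // source of truth for section data.
            let section = self.dataSource.snapshot().sectionIdentifiers[indexPath.section]
            _ = section // configure the header with `section` here
            return collectionView.dequeueConfiguredReusableSupplementary(
                using: headerRegistration, for: indexPath)
        }
    }
}
```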
Post marked as solved
1 Reply
1.1k Views
I've added a folder of JSON resources to an SPM package target with .process("Folder"), and I see that in the built bundle these resources are identical to the sources, not even minified. So what is better to do with JSON files: process them or just copy them? .copy looks faster by common sense, and if the files come out unchanged after .process, maybe it would be better to use .copy instead?
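For reference, a minimal Package.swift excerpt (package and target names hypothetical) showing the two resource rules side by side; .copy preserves the named folder verbatim, structure included, while .process applies any platform-appropriate transformation and currently passes JSON through unchanged:

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyPackage",  // hypothetical
    targets: [
        .target(
            name: "MyTarget",  // hypothetical
            resources: [
                // Copied as-is; the "Folder" directory is kept intact
                // and its contents are never transformed.
                .copy("Folder")
                // Alternative: .process("Folder") lets SwiftPM apply
                // type-specific processing where one exists.
                // .process("Folder")
            ]
        )
    ]
)
```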