IOSurface


Share hardware-accelerated buffer data across multiple processes and frameworks using IOSurface.

Posts under IOSurface tag

15 Posts
Post | Replies | Boosts | Views | Activity

IOSurface with System Extensions
Hi all, I'm working on a camera system extension where the main app transfers a video stream to the extension via IOSurface memory sharing. I built a sample app that contains all the logic but no camera extension: it uses IOSurface to render a video in one SwiftUI view and show the result in another SwiftUI view, purely for testing. Everything works fine so far.

When I move the receiver code into the camera extension, however, I can no longer access the IOSurface by its ID. I share the IOSurface ID via UserDefaults, and the logs confirm the ID is transferred correctly. Here is the code that calls IOSurfaceLookup; it fails with the logged error. The error message prints the surface ID, and it matches the one the main app obtains and prints.

```swift
private var surfaceId: Int = -1 {
    didSet {
        logger.info("surfaceId has changed")
        if surfaceId == -1 {
            stopReceivingFrames()
            ioSurface = nil
        } else {
            guard let surface = IOSurfaceLookup(IOSurfaceID(surfaceId)) else {
                logger.error("failed to lookup IOSurface with ID: \(self.surfaceId)")
                return
            }
            self.ioSurface = surface
            logger.info("surface set, now starting receiving frames")
            startReceivingFrames()
        }
    }
}
```

My gut feeling is that this is related to a missing entitlement or to sandboxing. I have a working camera extension in general; I'm just not able to render a video in the main app and send it over to the camera extension to overlay another webcam. The main app and the camera extension are in the same Xcode workspace and share the same App Group.

In short, my questions are: Is any entitlement required to use IOSurface between an app and a camera system extension? Is using IOSurface actually possible in system extensions? Is there any specific setting or requirement I need to handle to make this work?
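IOSurfaceLookup is macOS-specific and can't be sketched portably, but the pattern the question relies on — one process creates a buffer, publishes only an identifier, and a second process attaches by that identifier — can be illustrated with named shared memory. A minimal, hedged sketch in Python (the segment name is made up for illustration; it is an analogy, not the IOSurface API):

```python
from multiprocessing import shared_memory

def publish_frame(name: str, payload: bytes) -> shared_memory.SharedMemory:
    """Producer side: create a named shared buffer and fill it,
    analogous to creating an IOSurface and publishing its ID."""
    shm = shared_memory.SharedMemory(create=True, size=len(payload), name=name)
    shm.buf[: len(payload)] = payload
    return shm  # the caller must keep this alive while consumers read

def lookup_frame(name: str, size: int) -> bytes:
    """Consumer side: attach using the identifier alone, analogous to
    IOSurfaceLookup(surfaceID) in the extension; raises if the segment
    is not visible to this process."""
    shm = shared_memory.SharedMemory(name=name)
    data = bytes(shm.buf[:size])
    shm.close()
    return data
```

As with IOSurfaceLookup, attaching by identifier only works when the OS grants both processes access to the same namespace — which is exactly where sandboxing and entitlements would enter the picture in a camera extension.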
0 · 0 · 64 · 23h

MacOS input buffer size over 1020 bytes
Hello. We are developing an application that collects measurement data from devices over a USB serial port, but we have a problem with the input buffer size that may be reducing read speed. The data is quite large (~1.5 million bytes). Before every read, the app checks the number of bytes available in the input buffer with:

```c
ioctl(serialPort.hHandle, FIONREAD, readBuffSize)
```

That size is then passed as the requested size for the read call. Between reads there is a short sleep (1–3 ms). Please confirm the points below:

Regardless of the size of the measurement data, the reported available size is never more than 1020 bytes — is that expected?
Is that limit defined by the OS or by the driver? Could we increase it?
We found bytes missing between the measurement data and the data actually read. Is there any chance the data in the input buffer is overwritten while the read thread sleeps?
Is there a recommended technique to increase read speed while maintaining data integrity?

Tested OS: Sonoma 14.5
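The FIONREAD-then-read pattern described above can be sketched portably. This is a hedged illustration using a pipe in place of the serial port, assuming POSIX semantics; the point is that a single FIONREAD snapshot may report only part of a large payload, so the reader should loop and accumulate until the expected total arrives (and real code should block in select()/poll() rather than spin):

```python
import array
import fcntl
import os
import termios

def bytes_available(fd: int) -> int:
    """Ask the kernel how many bytes are queued on fd — the same
    FIONREAD ioctl the post issues on its serial handle."""
    buf = array.array("i", [0])
    fcntl.ioctl(fd, termios.FIONREAD, buf, True)  # kernel fills buf[0]
    return buf[0]

def read_exact(fd: int, total: int) -> bytes:
    """Accumulate exactly `total` bytes across many short reads, so no
    data is lost even when FIONREAD reports only a partial buffer."""
    chunks, got = [], 0
    while got < total:
        avail = bytes_available(fd)
        if avail == 0:
            continue  # real code should wait in select()/poll() here
        data = os.read(fd, min(avail, total - got))
        chunks.append(data)
        got += len(data)
    return b"".join(chunks)
```

Whether the 1020-byte ceiling itself is OS- or driver-imposed is the open question in the post; the accumulation loop is simply the standard way to stay correct regardless of how small each reported chunk is.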
1 · 0 · 137 · 2w

CarPlay issues
I am also currently having the same issue after updating to iOS 18, on an iPhone 16 Pro Max. Whether wired or wireless, the sound quality is poor and very mono. At some point it corrects itself, then you touch the phone screen and it's back to mono again. Making phone calls is a big issue. Nothing responds, and nothing else resolves it. My Bluetooth works fine. Please fix this… so tired of this.
1 · 0 · 123 · Oct ’24

IOSurface vs. IOSurfaceRef on Catalyst
I have an IOSurface and I want to turn it into a CIImage. However, the CIImage constructor takes an IOSurfaceRef instead of an IOSurface. On most platforms this is not an issue because the two types are toll-free bridgeable… except on Mac Catalyst, where it fails. I observed the same back in Xcode 13 on macOS, but there I could force-cast the IOSurface to an IOSurfaceRef:

```swift
let image = CIImage(ioSurface: surface as! IOSurfaceRef)
```

This cast fails at runtime on Catalyst. I found that

```swift
unsafeBitCast(surface, to: IOSurfaceRef.self)
```

actually works on Catalyst, but it feels very wrong. Am I missing something? Why aren't the types bridgeable on Catalyst? Ideally there should also be an init for CIImage that takes an IOSurface instead of a ref.
2 · 1 · 577 · Jun ’24

IOSurface objects aren't released in ScreenCaptureKit
I use the ScreenCaptureKit, CoreVideo, CoreImage, and CoreMedia frameworks to capture screenshots on macOS 14.0 and higher. Example of creating a CGImageRef:

```objc
CVImageBufferRef cvImageBufferRef = ...;
CIImage *temporaryImage = [CIImage imageWithCVPixelBuffer:cvImageBufferRef];
CIContext *temporaryContext = [CIContext context];
CGImageRef imageRef = [temporaryContext createCGImage:temporaryImage
                                             fromRect:CGRectMake(0, 0,
                                                                 CVPixelBufferGetWidth(cvImageBufferRef),
                                                                 CVPixelBufferGetHeight(cvImageBufferRef))];
```

Profiling with Xcode Instruments (Memory Leaks & Allocations) shows constantly increasing memory usage, but no memory leaks are detected, and there are many calls that create IOSurface objects that are never released. Most of the memory is All Anonymous VM — "VM: IOSurface". The heaviest stack trace:

```
[RPIOSurfaceObject initWithCoder:]
[IOSurface initWithMachPort:]
IOSurfaceClientLookupFromMachPort
```

I don't create any IOSurface objects myself; these are low-level calls. In the Allocation List I can see many allocations of IOSurface objects, but no information about them being released. Given this, how can I release them to avoid permanently increasing memory consumption?
2 · 0 · 861 · May ’24

"IOSurface creation failed" drawing to CGContext
(more details on StackOverflow) I'm getting messages like the following, SOMETIMES, when I draw to a CGContext:

```
IOSurface creation failed: e00002c2 parentID: 00000000 properties: {
    IOSurfaceAddress = 5207703552;
    IOSurfaceAllocSize = 9461418;
    IOSurfaceCacheMode = 0;
    IOSurfaceName = CMPhoto;
    IOSurfacePixelFormat = 1246774599;
}
```

The call to context.draw():

```swift
context.draw(photo.image, in: CGRect(x: 0, y: top, width: width, height: height), byTiling: false)
```

The results are just fine, so the draw seems to be working. Most often it draws without producing this error, but it fails pretty often, and I'm not sure where to begin looking to sort out what I might need to do differently to avoid this error message in the console.

Complete code:

```swift
import Foundation
import SwiftUI

func generateSpritesImage(thumbPhotos: [Photo], width: Int, filename: URL) -> [Int] {
    var indices = [Int]()
    let totalHeight = thumbPhotos.reduce(0) { $0 + $1.heightOfImage(ofWidth: width) }
    debugPrint("creating context")
    let context = CGContext(data: nil,
                            width: width,
                            height: totalHeight,
                            bitsPerComponent: 8,
                            bytesPerRow: 0,
                            space: CGColorSpace(name: CGColorSpace.sRGB)!,
                            bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue)!
    var top = totalHeight
    for photo in thumbPhotos {
        let height = photo.heightOfImage(ofWidth: width)
        indices.append(top - totalHeight)
        top -= height
        debugPrint("drawing \(photo.filteredFileURL())")
        context.draw(photo.image, in: CGRect(x: 0, y: top, width: width, height: height), byTiling: false)
    }
    debugPrint("write jpeg")
    writeJpegFromContext(context: context, filename: filename)
    return indices
}

func writeJpegFromContext(context: CGContext, filename: URL) {
    let cgImage = context.makeImage()!
    let bitmapRep = NSBitmapImageRep(cgImage: cgImage)
    let jpegData = bitmapRep.representation(using: NSBitmapImageRep.FileType.jpeg, properties: [:])!
    try! jpegData.write(to: filename)
}
```

Sample of output:

```
"drawing 0002-_MG_8542.jpg"
"drawing 0003-_MG_8545.jpg"
"drawing 0004-_MG_8550.jpg"
IOSurface creation failed: e00002c2 parentID: 00000000 properties: { IOSurfaceAddress = 5211357184; IOSurfaceAllocSize = 9983331; IOSurfaceCacheMode = 0; IOSurfaceName = CMPhoto; IOSurfacePixelFormat = 1246774599; }
"drawing 0005-_MG_8555.jpg"
IOSurface creation failed: e00002c2 parentID: 00000000 properties: { IOSurfaceAddress = 5221351424; IOSurfaceAllocSize = 10041215; IOSurfaceCacheMode = 0; IOSurfaceName = CMPhoto; IOSurfacePixelFormat = 1246774599; }
"drawing 0006-_MG_8562.jpg"
"drawing 0007-_MG_8563.jpg"
IOSurface creation failed: e00002c2 parentID: 00000000 properties: { IOSurfaceAddress = 5376163840; IOSurfaceAllocSize = 10109756; IOSurfaceCacheMode = 0; IOSurfaceName = CMPhoto; IOSurfacePixelFormat = 1246774599; }
"drawing 0008-_MG_8584.jpg"
"drawing 0009-_MG_8618.jpg"
IOSurface creation failed: e00002c2 parentID: 00000000 properties: { IOSurfaceAddress = 5394612224; IOSurfaceAllocSize = 8425564; IOSurfaceCacheMode = 0; IOSurfaceName = CMPhoto; IOSurfacePixelFormat = 1246774599; }
"drawing 0010-_MG_8627.jpg"
"drawing 0011-_MG_8649.jpg"
"drawing 0012-_MG_8658.jpg"
"drawing 0013-_MG_8665.jpg"
"drawing 0014-_MG_8677.jpg"
"drawing 0015-_MG_8675.jpg"
"drawing 0016-_MG_8676.jpg"
"drawing 0017-IMGP0873.jpg"
"drawing 0018-_MG_8719.jpg"
"drawing 0019-_MG_8743.jpg"
...
```
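CGContext uses a bottom-left origin, which is why the loop above draws at decreasing y values as it stacks thumbnails from the top. A small sketch of the same bookkeeping (a hypothetical helper, not code from the post) that yields non-negative top-origin offsets alongside the flipped coordinates a CGContext draw would use:

```python
def sprite_layout(heights, total_height=None):
    """Given thumbnail heights ordered top-to-bottom, return
    (top_offsets, draw_ys): each image's offset from the top of the
    sprite sheet, and the bottom-left-origin y a CGContext draw uses."""
    total = sum(heights) if total_height is None else total_height
    top_offsets, draw_ys, y_top = [], [], 0
    for h in heights:
        top_offsets.append(y_top)
        draw_ys.append(total - y_top - h)  # flip to bottom-left origin
        y_top += h
    return top_offsets, draw_ys
```

For example, three thumbnails of heights 100, 50, and 25 start 0, 100, and 150 pixels from the top, and are drawn at y = 75, 25, and 0 in the flipped coordinate space.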
3 · 0 · 1.2k · Aug ’24

IOSurface-based MTLTexture gets corrupted
I have this code to create an IOSurface from a bitmap image:

```objc
auto src = loadSource32f();              // rgba 32-bit float image
const auto desc = src->getDescriptor();  // metadata for that image
auto pixelFmt = CGMTLBufferManager::getCVPixelFormat(desc.channelBitDepth, desc.channelOrder); // returns proper `RGfA`
int width = static_cast<int>(desc.width);
int height = static_cast<int>(desc.height);
int trowbytes = static_cast<int>(desc.trueRowbytes()); // returns proper rowbytes value

CFMutableDictionaryRef properties = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(properties, kIOSurfaceWidth, CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &width));
CFDictionarySetValue(properties, kIOSurfaceHeight, CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &height));
CFDictionarySetValue(properties, kIOSurfacePixelFormat, CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &pixelFmt));
CFDictionarySetValue(properties, kIOSurfaceBytesPerRow, CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &trowbytes));

NSDictionary *nsprops = (__bridge NSDictionary *)properties;
IOSurface *oSurface = [[IOSurface alloc] initWithProperties:nsprops];
CFRelease(properties);
ASSERT_TRUE(oSurface);
auto ioSurface = (IOSurfaceRef)oSurface;
```

I tested that the pixels are properly written into the IOSurface:

```objc
// copy data to surface
memcpy([oSurface baseAddress], src->getRawPtr(), src->getSizeInBytes());
auto surfPtr = (uint8_t *)[oSurface baseAddress];
// extract raw surface data and write it into a file
saveOutputRaw(desc, surfPtr, getFileName("IOSurfaceTestSurfaceRaw"));
```

And I see this: [image — correct output]

Now I want to create an MTLTexture based on the IOSurface:

```objc
// create texture
auto fmt = IOSurfaceGetPixelFormat(ioSurface);
auto w = IOSurfaceGetWidth(ioSurface);
auto h = IOSurfaceGetHeight(ioSurface);
auto rowbytes = IOSurfaceGetBytesPerRow(ioSurface);

MTLTextureDescriptor *textureDescriptor =
    [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:CGMTLBufferManager::getMTLPixelFormat(fmt)
                                                       width:w
                                                      height:h
                                                   mipmapped:NO];
textureDescriptor.usage = MTLTextureUsageShaderRead | MTLTextureUsageShaderWrite;
textureDescriptor.storageMode = MTLStorageModeShared;
auto device = MTLCreateSystemDefaultDevice();
id<MTLTexture> surfaceTex = [device newTextureWithDescriptor:textureDescriptor iosurface:ioSurface plane:0];
```

And now I want to test this:

```objc
auto region = MTLRegionMake2D(0, 0, w, h);
auto bufSize = [oSurface allocationSize];
// get texture bytes
auto outBuf2 = std::vector<uint8_t>(bufSize);
[surfaceTex getBytes:outBuf2.data() bytesPerRow:rowbytes fromRegion:region mipmapLevel:0];
// save to file
saveOutputRaw(desc, outBuf2.data(), getFileName("IOSurfaceTestCreateTex"));
// get bytes
saveOutputRaw(desc, surfPtr, getFileName("IOSurfaceTestCreateRaw"));
```

And I get this result: [image — corrupted output]

I also tried replaceRegion and a blit encoder copyFromTexture:toTexture:, as well as a managed texture with syncing, but the result is always the same: only the first 22 pixels get filled and the rest is transparent. I have no idea what I'm missing. Please help.
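One thing worth checking for this kind of symptom: the surface's actual IOSurfaceGetBytesPerRow can be larger than the kIOSurfaceBytesPerRow that was requested (the system may round it up for alignment). If so, the single whole-buffer memcpy above shifts every row, because source and destination strides differ; copying row by row with both strides avoids that. A hedged, portable sketch of the idea (illustrative helper, not the IOSurface API):

```python
def copy_rows(src: bytes, src_stride: int, dst_stride: int,
              height: int, row_bytes: int) -> bytes:
    """Copy `height` rows of `row_bytes` payload between buffers whose
    row strides differ — e.g. a tightly packed image into a surface
    whose bytes-per-row was rounded up for alignment."""
    assert row_bytes <= min(src_stride, dst_stride)
    dst = bytearray(dst_stride * height)  # padding bytes stay zero
    for r in range(height):
        dst[r * dst_stride : r * dst_stride + row_bytes] = \
            src[r * src_stride : r * src_stride + row_bytes]
    return bytes(dst)
```

The same stride mismatch would also explain the getBytes readback looking wrong if `rowbytes` there differs from the layout the caller assumes when interpreting the output buffer.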
0 · 0 · 763 · Jan ’24