Hi there,

I have a device attached to my Mac. The manufacturer has published a spec that enables third-party developers to communicate with the device and display pixel data on its LCD screen. This is done with frame buffers, which must be transferred using, for example, libusb. They gave us example C code to open/close the device with libusb, as well as sample code for a frame buffer transfer.

So I was wondering what the best strategy would be for feeding the frame buffers with pixel data if I want to create a simple GUI for the device: drawing rectangles, text, and other basic shapes. I don't want to reinvent the wheel, so I would prefer a library that lets me draw shapes and text and get back the pixel data to feed the frame buffer. I would also prefer to do all of this in Swift, but I have no problem using C if it is required.

Is it possible to hack the SpriteKit pipeline to render to a frame buffer? Or MetalKit? I guess I could also create a CGContext, draw with Core Graphics functions, and then read back all the pixel data (roughly as in the sketch at the end of this post), but wouldn't that be very inefficient given that the device's LCD runs at 60 fps?

Any advice would be very much appreciated.

Best regards,
Thierry
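Here is a minimal sketch of what I mean by the CGContext route, assuming a hypothetical 480x272 RGBA8888 panel (the real dimensions and pixel format would come from the manufacturer's spec). It draws into a byte buffer I own, so the rendered bytes could then be handed to the libusb transfer from the sample code:

```swift
import CoreGraphics
import CoreText
import Foundation

// Hypothetical panel size and pixel format; the real values come from the spec.
let width = 480
let height = 272
let bytesPerRow = width * 4   // RGBA, 8 bits per component

// A buffer I own, so the rendered frame can go straight to the transfer code.
var pixels = [UInt8](repeating: 0, count: bytesPerRow * height)

pixels.withUnsafeMutableBytes { buffer in
    // Bitmap context backed by the buffer above.
    guard let ctx = CGContext(
        data: buffer.baseAddress,
        width: width,
        height: height,
        bitsPerComponent: 8,
        bytesPerRow: bytesPerRow,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
    ) else { return }

    // A filled rectangle.
    ctx.setFillColor(CGColor(red: 1, green: 0, blue: 0, alpha: 1))
    ctx.fill(CGRect(x: 20, y: 20, width: 100, height: 60))

    // A line of text via Core Text.
    let attrs: [NSAttributedString.Key: Any] = [
        NSAttributedString.Key(kCTFontAttributeName as String):
            CTFontCreateWithName("Helvetica" as CFString, 24, nil),
        NSAttributedString.Key(kCTForegroundColorAttributeName as String):
            CGColor(red: 1, green: 1, blue: 1, alpha: 1)
    ]
    let line = CTLineCreateWithAttributedString(
        NSAttributedString(string: "Hello LCD", attributes: attrs) as CFAttributedString)
    ctx.textPosition = CGPoint(x: 20, y: 120)
    CTLineDraw(line, ctx)
}

// `pixels` now holds one rendered frame. Note that Core Graphics uses a
// bottom-left origin, so the rows may need flipping, and the bytes may need
// converting to the device's native format before the libusb transfer.
```

The idea is that the context renders directly into the byte array, so there is no extra copy between drawing and the USB transfer.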