Attached USB device (LCD) frame buffer transfer and GUI HOWTO?

Hi there,

I have a device attached to my Mac. The manufacturer has published a spec so that third-party developers can communicate with the device and access its LCD screen to display pixel data. The way you do this is by transferring frame buffers, for example with libusb. They gave us example code in C for opening/closing the device with libusb, plus sample code for a frame buffer transfer.
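
For context, the open/transfer part of their C sample maps to Swift fairly directly if libusb is exposed through a C interop module. In the sketch below, the module name "Clibusb", the vendor/product IDs, the endpoint and the frame size are all placeholders, not values from the actual spec:

import Clibusb  // hypothetical module map wrapping the libusb C headers

let frameSize = 480 * 128 * 2          // placeholder: width * height * bytes per pixel
var frame = [UInt8](repeating: 0, count: frameSize)

var ctx: OpaquePointer?
libusb_init(&ctx)

// Placeholder vendor/product IDs; the real ones come from the device spec.
guard let handle = libusb_open_device_with_vid_pid(ctx, 0x1234, 0x5678) else {
    libusb_exit(ctx)
    fatalError("Device not found")
}

// Push one frame to a (placeholder) bulk OUT endpoint.
var transferred: Int32 = 0
libusb_bulk_transfer(handle, 0x02, &frame, Int32(frame.count), &transferred, 1000)

libusb_close(handle)
libusb_exit(ctx)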

So I was wondering what the best strategy would be for feeding the frame buffers with pixel data if I want to create a simple GUI for the device: drawing rectangles, text and other basic shapes.

I don't want to re-invent the wheel, so I would prefer to use a library that lets me draw shapes and text and then get the pixel data back to feed the frame buffer. I would also prefer to do all of this in Swift, but I have no problem using C if it is required.

Is it possible to hack the SpriteKit pipeline to render into a frame buffer? Or MetalKit? I guess I could also create a CGContext, use CoreGraphics functions to draw things, and then grab all the pixel data, but wouldn't that be very inefficient given that the device's LCD runs at 60 fps?
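
For what it's worth, the CoreGraphics route I have in mind looks roughly like this (the 480x128 size and the BGRA layout are placeholders; the real values depend on what the device expects):

import CoreGraphics

// Placeholder dimensions and pixel layout; the device spec dictates the real ones.
let width  = 480
let height = 128
let bytesPerRow = width * 4

var frame = [UInt8](repeating: 0, count: bytesPerRow * height)

frame.withUnsafeMutableBytes { buf in
    guard let ctx = CGContext(data: buf.baseAddress,
                              width: width,
                              height: height,
                              bitsPerComponent: 8,
                              bytesPerRow: bytesPerRow,
                              space: CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                        | CGBitmapInfo.byteOrder32Little.rawValue)  // BGRA in memory
    else { return }

    // Ordinary CoreGraphics drawing; text could go through CoreText (CTLineDraw) into the same context.
    ctx.setFillColor(CGColor(red: 0, green: 0, blue: 0, alpha: 1))
    ctx.fill(CGRect(x: 0, y: 0, width: width, height: height))
    ctx.setFillColor(CGColor(red: 1, green: 1, blue: 1, alpha: 1))
    ctx.fill(CGRect(x: 10, y: 10, width: 100, height: 40))
}

// `frame` now holds the raw pixels, ready to hand to the frame buffer transfer code.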

Any advice would be very much appreciated.

Best regards, Thierry

Accepted Reply

Hi Thierry!

Technically you have unlimited options for what to render your content into; CoreGraphics seems like a good one to start with. You'd need double- or triple-buffered drawables anyway, so you can pipeline submissions and "freeze" a buffer on the CPU while the send-over-USB is happening. Also, make sure the USB bandwidth you have is enough for your case.
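
A minimal sketch of what I mean by double buffering, where sendFrameOverUSB is just a stand-in for your libusb transfer and Data is used as a plain byte buffer:

import Foundation

// Two CPU-side frame buffers: draw into one while the previous one is still
// being sent over USB, then flip.
final class FramePipeline {
    private var buffers: [Data]
    private var index = 0
    private let inFlight = DispatchSemaphore(value: 2)   // one slot per buffer
    private let usbQueue = DispatchQueue(label: "usb.transfer")

    init(bytesPerFrame: Int) {
        buffers = [Data(count: bytesPerFrame), Data(count: bytesPerFrame)]
    }

    func submitFrame(draw: (inout Data) -> Void,
                     sendFrameOverUSB: @escaping (Data) -> Void) {
        inFlight.wait()                        // block only if both buffers are still on the wire
        draw(&buffers[index])                  // CPU rendering (CoreGraphics etc.) happens here
        let frame = buffers[index]             // value semantics "freeze" this frame
        usbQueue.async {
            sendFrameOverUSB(frame)
            self.inFlight.signal()
        }
        index = (index + 1) % buffers.count    // flip to the other buffer
    }
}

With only two buffers, wait() blocks exactly when the drawing side gets ahead of the USB side, which is the back-pressure you want.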

Really, start with whichever framework gives you a CPU-visible, linear pixel representation in the most straightforward way. I can't recommend a specific one, since it depends on what you are used to.

Regards, Eugene.

Replies


Thank you Eugene,

I will start by implementing a CoreGraphics mockup and see how it goes. Two buffers does indeed seem like the minimum to guarantee immutability. The bandwidth is USB 2.0, and I assume it will be enough to drive the LCD, since the manufacturer already does it with its own software.
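
(Quick sanity check with made-up numbers: a hypothetical 480x128 panel at 16 bits per pixel and 60 fps needs about 480 x 128 x 2 x 60 ≈ 7.4 MB/s, well under the roughly 35 MB/s of practical USB 2.0 bulk throughput, so bandwidth indeed shouldn't be the limiting factor.)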

You said "drawables" in your answer which leads me to think you have a Metal API background, if it is the case, do you think Metal would be any improvement over CG in this case where I'm dealing with 2D only? Can Metal pipeline be set to not render to screen but into a buffer?

Best regards, Thierry

  • Hi again Thierry!

    When I said "drawables" I meant some logical surface you are drawing to. Assuming your content isn't an AAA game or really high in polygon count, CoreGraphics would be a great choice, so there is no absolute need to do it with Metal. But if there is a good reason to use Metal, it fully supports off-screen rendering to linear textures (backed by an MTLBuffer), so you can achieve what you need. Again, though, I doubt there will be any benefit, assuming your content is light.
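
    Roughly, the off-screen setup looks like this. It's a minimal sketch that only clears the target to a colour; a real renderer would also create a render pipeline state and encode draw calls, and the size and pixel format are placeholders:

    import Metal

    let width = 480, height = 128          // placeholders

    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue() else { fatalError("no Metal device") }

    // Off-screen render target; never attached to a screen.
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                        width: width, height: height,
                                                        mipmapped: false)
    desc.usage = [.renderTarget]
    desc.storageMode = .managed            // CPU-readable after a synchronize blit (classic macOS pattern)
    let target = device.makeTexture(descriptor: desc)!

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = target
    pass.colorAttachments[0].loadAction = .clear
    pass.colorAttachments[0].storeAction = .store
    pass.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 1, alpha: 1)

    let cmd = queue.makeCommandBuffer()!
    // A real renderer would set a render pipeline state and encode draw calls here.
    cmd.makeRenderCommandEncoder(descriptor: pass)!.endEncoding()

    // Make the GPU-written contents visible to the CPU.
    let blit = cmd.makeBlitCommandEncoder()!
    blit.synchronize(resource: target)
    blit.endEncoding()

    cmd.commit()
    cmd.waitUntilCompleted()

    // Copy the pixels into plain CPU memory for the USB transfer.
    var frame = [UInt8](repeating: 0, count: width * height * 4)
    frame.withUnsafeMutableBytes { buf in
        target.getBytes(buf.baseAddress!, bytesPerRow: width * 4,
                        from: MTLRegionMake2D(0, 0, width, height), mipmapLevel: 0)
    }

    // Alternatively, the texture can be created directly on an MTLBuffer with
    // makeTexture(descriptor:offset:bytesPerRow:), which avoids the readback copy.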

    Best regards, Eugene.

  • Thank you Eugene. Great advice. Best regards, Thierry
