I'm writing an application that uses Metal to map multiple live video images onto specific regions of a 3D scene and play them back in real time.
On a Mac Pro, I want to place multiple camcorder images captured by a Magewell capture board (Pro Capture DUAL HDMI 4K Plus LT) into the texture memory of the graphics board (AMD Radeon Pro Vega II, 32 GB) over PCIe.
Currently each frame is first brought into an area prepared on the CPU side (allocated with NSMutableData), uploaded with [MTLTexture replaceRegion:...], and then turned into mipmaps, but because the video is 4K this is not as fast as I expected.
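For reference, the per-frame path I am describing is roughly the following. This is only a minimal sketch: the function names, the BGRA8 format, and the placeholder sizes are mine, not the actual capture code.

#import <Metal/Metal.h>

// Texture that will hold one video image, with a full mipmap chain.
id<MTLTexture> makeVideoTexture(id<MTLDevice> device, NSUInteger width, NSUInteger height)
{
    MTLTextureDescriptor *desc =
        [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm
                                                            width:width
                                                           height:height
                                                        mipmapped:YES];
    return [device newTextureWithDescriptor:desc];
}

// Per frame: CPU copy into mip level 0, then rebuild the mipmaps on the GPU.
void uploadFrameViaCPU(id<MTLCommandQueue> queue, id<MTLTexture> texture,
                       NSMutableData *frameData, NSUInteger width, NSUInteger height)
{
    [texture replaceRegion:MTLRegionMake2D(0, 0, width, height)
               mipmapLevel:0
                 withBytes:frameData.bytes
               bytesPerRow:width * 4];

    id<MTLCommandBuffer> cb = [queue commandBuffer];
    id<MTLBlitCommandEncoder> blit = [cb blitCommandEncoder];
    [blit generateMipmapsForTexture:texture];
    [blit endEncoding];
    [cb commit];
}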
Considering the processing speed, I would like to use an API (something like a DMA transfer into a rectangular region) that sends the BGRA data directly from the capture board into the texture memory without going through the CPU. However, there is the problem of how to obtain and pass on the address of the texture memory on the graphics board.
Is there a way on the Mac to get the address of a texture's mipmap storage, or is there another approach that achieves the same effect?
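The closest thing I can picture in Metal is sketched below: wrap the memory the capture board writes into in a no-copy MTLBuffer and let the GPU blit it into a private (VRAM-resident) texture, so the copy is done by the GPU rather than by replaceRegion: on the CPU. This is only an assumption on my part; captureMemory, the function name, and the sizes are placeholders, and I do not know whether this gives the DMA-like behaviour I am after.

#import <Metal/Metal.h>

// captureMemory must be page-aligned and length a multiple of the page size
// for newBufferWithBytesNoCopy: to accept it.
void uploadFrameWithoutCPUCopy(id<MTLDevice> device,
                               id<MTLCommandQueue> queue,
                               id<MTLTexture> privateTexture,  // MTLStorageModePrivate, mipmapped
                               void *captureMemory,
                               NSUInteger width, NSUInteger height)
{
    NSUInteger bytesPerRow = width * 4;        // BGRA8
    NSUInteger length = bytesPerRow * height;

    // No-copy wrapper around the existing capture memory.
    id<MTLBuffer> frameBuffer =
        [device newBufferWithBytesNoCopy:captureMemory
                                  length:length
                                 options:MTLResourceStorageModeShared
                             deallocator:nil];

    id<MTLCommandBuffer> cb = [queue commandBuffer];
    id<MTLBlitCommandEncoder> blit = [cb blitCommandEncoder];

    // GPU copies the frame from the buffer into mip level 0 of the texture.
    [blit copyFromBuffer:frameBuffer
            sourceOffset:0
       sourceBytesPerRow:bytesPerRow
     sourceBytesPerImage:length
              sourceSize:MTLSizeMake(width, height, 1)
               toTexture:privateTexture
        destinationSlice:0
        destinationLevel:0
       destinationOrigin:MTLOriginMake(0, 0, 0)];

    [blit generateMipmapsForTexture:privateTexture];
    [blit endEncoding];
    [cb commit];
}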
I am currently developing an application using Metal for macOS.
When the sample program "CreatingAndSamplingTextures" is run, the memory it uses gradually increases.
Will this memory eventually be released, or does the sample omit a release step that would normally be needed? (The memory use seems to increase in the same way in other samples.)
If production-level code written the same way shows the same growth in memory usage, it will not withstand long-term use, so I would appreciate an answer.
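To make the question concrete, the kind of release step I am wondering about is something like the following: wrapping the per-frame work in an autorelease pool so that transient objects (command buffers, drawables, and so on) created during each draw are freed every frame. This is my own sketch of a common pattern placed inside the sample's renderer class, not code taken from the sample; _commandQueue and the elided encoding are placeholders.

#import <MetalKit/MetalKit.h>

// In the MTKViewDelegate draw callback:
- (void)drawInMTKView:(MTKView *)view
{
    @autoreleasepool {
        id<MTLCommandBuffer> commandBuffer = [_commandQueue commandBuffer];
        // ... encode the render pass exactly as the sample does ...
        [commandBuffer presentDrawable:view.currentDrawable];
        [commandBuffer commit];
    }   // objects autoreleased during this frame are released here
}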
I want to use it as an input source for an existing app, but is it pointless to draw it into a view? Where would it be recognized?
In other words, you want to direct rendering to something other than a view.
When a view is used, there is the problem that you have to wait one frame. …The processing is too slow.
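A sketch of what rendering to something other than a view might look like (my own assumptions and placeholder names, not a confirmed approach): render into a texture-backed target, so there is no drawable to acquire and nothing to wait on.

#import <Metal/Metal.h>

// Offscreen color target that another consumer can read later.
id<MTLTexture> makeOffscreenTarget(id<MTLDevice> device, NSUInteger width, NSUInteger height)
{
    MTLTextureDescriptor *desc =
        [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm
                                                            width:width
                                                           height:height
                                                        mipmapped:NO];
    desc.usage = MTLTextureUsageRenderTarget | MTLTextureUsageShaderRead;
    return [device newTextureWithDescriptor:desc];
}

// Encode the usual draw calls, but aim the render pass at the texture instead of a drawable.
void renderOffscreen(id<MTLCommandQueue> queue, id<MTLTexture> target)
{
    MTLRenderPassDescriptor *pass = [MTLRenderPassDescriptor renderPassDescriptor];
    pass.colorAttachments[0].texture     = target;
    pass.colorAttachments[0].loadAction  = MTLLoadActionClear;
    pass.colorAttachments[0].storeAction = MTLStoreActionStore;
    pass.colorAttachments[0].clearColor  = MTLClearColorMake(0, 0, 0, 1);

    id<MTLCommandBuffer> cb = [queue commandBuffer];
    id<MTLRenderCommandEncoder> enc = [cb renderCommandEncoderWithDescriptor:pass];
    // ... the same draw calls that would normally go to the view ...
    [enc endEncoding];
    [cb commit];
    // No presentDrawable:, so nothing waits on the view's next frame.
}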
I want to prevent the app from launching automatically after I turn off my Mac, kill the app, and then restart the Mac.
I would like to know what settings to make in Xcode when creating the application.