Drawing app using Metal slows down as screen fills up

I have been studying Metal and to help, I used an old drawing app and converted the drawing layer to a MTKView. I got textures working using pdf images and they draw very well on the screen using Apple Pencil. The initial FPS is 120. But as I draw on the screen and begin filling it up with brush strokes, the FPS will gradually drop until it is well below 50 FPS. If I clear the canvas, the FPS goes back up to 120 and the issue will repeat. Here is the basic method I am using that I have found in other drawing app code:

TouchesBegan: I begin the vertex array
TouchesMoved: I append vertices and draw primitives using point with the vertex count; then setNeedsDisplay is called and I present and commit the drawable
TouchesEnded: same as moved, but then I clear the vertices
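To make the flow above concrete, here is a minimal sketch of what that touch handling could look like. This is an illustration, not the poster's actual code; the `vertices` array and the assumption that `draw(_:)` encodes a `drawPrimitives(type: .point, ...)` call over the whole array are inferred from the description.

```swift
import UIKit
import MetalKit

class CanvasView: MTKView {
    // All vertices of the stroke currently being drawn.
    private var vertices: [SIMD2<Float>] = []

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        vertices.removeAll()
        if let p = touches.first?.location(in: self) {
            vertices.append(SIMD2(Float(p.x), Float(p.y)))
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let p = touches.first?.location(in: self) {
            vertices.append(SIMD2(Float(p.x), Float(p.y)))
        }
        // Triggers draw(_:), which re-encodes
        // drawPrimitives(type: .point, vertexStart: 0, vertexCount: vertices.count)
        // over the entire (growing) vertex array every frame.
        setNeedsDisplay()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        touchesMoved(touches, with: event)
        vertices.removeAll()
    }
}
```

Note that the cost of each frame grows with the vertex count, which is consistent with the gradual FPS drop described.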

Why is the FPS drop happening? I presume because it is redrawing the entire view every frame and the textures have alphas. Is there a method I could use to maintain the FPS throughout the life of the drawing?
Answered by Graphics and Games Engineer in 650473022
There could be a number of reasons performance decreases as the vertex count increases. Are you calling draw[Index]Primitives to render each stroke?

The idea of only drawing what's necessary by saving the current state makes a lot of sense. Instead of taking a snapshot with UIImage, I would render to an offscreen texture and maintain that as your canvas without clearing. Then, for each frame, you can copy that texture to the drawable.
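A rough sketch of that approach might look like the following. The property names (`commandQueue`, `pipelineState`, `canvasTexture`, `newVertexBuffer`, `newVertexCount`) are illustrative assumptions, and for the blit to the drawable to be legal the view's `framebufferOnly` must be set to `false` and the canvas texture must match the drawable's pixel format and size.

```swift
import MetalKit

// Create a persistent canvas texture once; never clear it between frames.
func makeCanvasTexture(device: MTLDevice, width: Int, height: Int) -> MTLTexture? {
    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .bgra8Unorm, width: width, height: height, mipmapped: false)
    desc.usage = [.renderTarget, .shaderRead]  // rendered into, then copied from
    return device.makeTexture(descriptor: desc)
}

func draw(in view: MTKView) {
    guard let drawable = view.currentDrawable,
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    // Pass 1: render only the newest stroke vertices into the canvas texture.
    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = canvasTexture
    pass.colorAttachments[0].loadAction = .load    // keep previously drawn strokes
    pass.colorAttachments[0].storeAction = .store
    if let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass) {
        encoder.setRenderPipelineState(pipelineState)
        encoder.setVertexBuffer(newVertexBuffer, offset: 0, index: 0)
        encoder.drawPrimitives(type: .point, vertexStart: 0, vertexCount: newVertexCount)
        encoder.endEncoding()
    }

    // Pass 2: copy the accumulated canvas to the drawable.
    // Requires view.framebufferOnly = false.
    if let blit = commandBuffer.makeBlitCommandEncoder() {
        blit.copy(from: canvasTexture, to: drawable.texture)
        blit.endEncoding()
    }

    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```

With this arrangement the per-frame cost no longer depends on how many strokes have accumulated, only on the size of the newest segment plus a fixed-cost copy.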

As a bonus, if you're using MTKView, to make your app more efficient so that you're only drawing a frame when a new stroke has been added, I would set enableSetNeedsDisplay to YES so that you only draw when a touch event occurs and you need to update the canvas.
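In Swift that configuration is just a couple of properties on the view (both are real MTKView API; pausing the internal timer alongside it is a common pairing):

```swift
// Redraw only on demand instead of on the view's internal 60/120 Hz timer.
mtkView.enableSetNeedsDisplay = true
mtkView.isPaused = true
// draw(in:) now runs only after setNeedsDisplay() is called,
// e.g. from touchesMoved.
```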

Since this is an exercise in learning Metal, this would be a good opportunity to familiarize yourself with the performance analysis tools. I suggest firing up Xcode Instruments and choosing the "Metal System Trace" template to profile your project. With this you can see where your app is spending its time on the GPU with the "GPU" instrument and on the CPU with the "Time Profiler" instrument. This article should give you some ideas on using it. You can also find a number of WWDC videos which demonstrate practical use of the tools, including this one, which gives a nice overview of the current state of the Metal tools.
So far I have determined that my largest brush texture causes performance to drop after 2,500 vertices have been drawn with it, and my smallest texture can go to 110,000 vertices. I have implemented a scheme where the verts are counted after every draw, and if they exceed these predetermined numbers (using a percentage of max verts), I put all the elements on screen into a backup array, take a UIImage snapshot of the canvas, clear the canvas, and then display the snapshot on a UIImageView which sits just below the Metal view. Since the Metal canvas is now empty, performance is maintained. I feel this is a horrible solution, but it is the only thing my mediocre coding brain could come up with. The UIImage snapshot displays a little darker than the Metal view, so when the change happens there's a noticeable, jarring switch. It works, but it's ugly. I wish I knew how the Procreate developers are able to maintain such solid and smooth performance in their drawing app. I'm open to suggestions on how to better handle this scenario.
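For reference, the workaround described above amounts to something like the following sketch. Every name here (`brush.vertexBudget`, `snapshotCanvasAsUIImage`, `backingImageView`, `clearCanvas`) is a hypothetical placeholder for code the poster did not show.

```swift
// Per-brush vertex budget, triggered at a percentage of the measured maximum
// (e.g. the largest brush degraded after ~2,500 vertices).
let threshold = Int(Double(brush.vertexBudget) * 0.9)

func strokeDidFinish() {
    guard totalVertexCount > threshold else { return }
    backupElements.append(contentsOf: onScreenElements) // keep stroke data around
    backingImageView.image = snapshotCanvasAsUIImage()  // hypothetical helper;
                                                        // UIImageView sits under the MTKView
    clearCanvas()                                       // empty Metal canvas restores FPS
    totalVertexCount = 0
}
```

The darker snapshot the poster mentions is a typical symptom of a color-space or premultiplied-alpha mismatch between the Metal render target and the UIImage path, which the offscreen-texture approach avoids entirely.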
Probably there is a better way to implement it, but your idea is the right one.
Thank you! Yes I am calling draw(index)Primitives. I will work on the offscreen texture method, this is extremely helpful for me.
Setting aside that UIKit should really never be used for this, rendering to offscreen textures has its own performance consequences and dozens of adjacent things you have to implement for a typical drawing tool. Sometimes it is necessary, but not always, and the difference is large.

If it's slowing down already, you should first address that, as I did. And you may not want to hear this, but I would bet that your app is also suffering from CPU performance problems in how you manage the stroke data and the memory, and, of course, from the fact that you are using Swift (which by now, if it isn't clear, is in fact ironically slow and not appropriate for a performant drawing tool).

From reading what you wrote in the other threads, I could predict with high certainty that there are dozens of other common, fundamental things causing the performance problems you are experiencing, well before you even confront the ones Apple is responsible for. I've built many drawing tools that optimize for different use-case workflows, and each one required different arrangements based on their specifics to truly get a stable 120 fps (that is, before Apple messed up the Display Link path).

You may not want to hear this, but you are only around 10% there. But, really, there is a bigger problem.

If we are to try to be moral people who care about each other, then we have to pause before talking about all that code, and talk about the serious health damage introduced with the Apple Pencil 2nd generation. If we are making tech demos or toys for ourselves and we accept the health damage for our own body, that is one thing, but when we talk about releasing apps that have a tendency to lure people to use the Pencil more, then these moral health concerns have to be assessed.

Are we not morally responsible for the activities we encourage through the apps we make? If we become aware as third-party developers that the Apple Pencil 2nd gen is a serious health problem, and that the users are not aware of it, don't we then have an obligation to be responsible for that? The science of how bad it really is, is over the head of the average person, and it's not something we can clearly and fully warn them about.

Apple has continued to ignore the reports, and we can assume they have chosen a position of disingenuous morality. Because of this, as developers it really does come down to our personal moral character. You, me, the other developers releasing on the store: our actions and how we choose to respond to the immoral circumstances will determine the extent of the negative impact until Apple listens. So, as much as you may feel attached to your work, and are still struggling with the fundamentals, there are some serious moral concerns with releasing apps for the Apple Pencil 2nd gen today.
I may be a bit naive, MoreLighnting, on the subject you are referring to. First I accept that I am probably only 10% there with my app. I have a grand idea for it, but I know deep down that I need a developer who can rewrite the entire thing and make it shine because my coding skills are not that great. There are too many things I just don't understand. So for now if I can get it working and doing everything that I need it to, then at least I have a working prototype that I can then hand over to an experienced and qualified professional to make it what it needs to be. That is my hope. I have been stashing away money for this very purpose. Finding a developer though for me is a scary process. But that will be my next step.

As for my naivety, what I mean by that is I have never heard of health issues related to the Apple Pencil. What are those issues? It is something that has never crossed my mind. I am an artist by trade and have been very happy with the iPad Pro and Apple Pencil. Can you elaborate?
I'm practically retired and don't really work for money, so I can't help you with that.

But, what I would suggest, since you've been working on it long enough...

Is to reassess your Application Definition Statement.

Take a hard look at what solution and differentiator you are actually providing.

Why do you believe your prototype depicts something that a more experienced developer would need to see? What is the real purpose of making such a prototype?

...

As for the health problems...

It comes from:

A) The touch pad
B) The induction coil
C) The magnet
D) Bluetooth modulation (distinct from older modulation formats)
E) The iPad's touchscreen itself
F) General lowering standards of material choices, like GF2

The last few existed along with the Apple Pencil 1, and they were measurably bad, but the additions and design of the Apple Pencil 2 are in another league of damage.

The thing here, is if I just list these things, no one will actually understand anything. In fact, they might assume that I don't know what I'm talking about, and am just one of those tin foil hat people who are afraid of microwave radiation. (As opposed to being informed with actual measurements and having constructed custom antenna and hardware that replicate the phenomena)

We can only describe things in relation to things you already understand.

If I talk about specific measurements I personally took, and describe how certain very specific kinds of electromagnetic field patterns interact with different specific materials in specific layouts, and how those interact with the body - then we'd have to go very deep in describing the internal hardware and measurements.

The problem with having such a conversation in public, is similar to how we have trouble talking about the fundamentals of programming in Metal. Just like there are very complex 'decision-consequence' chains in writing your Metal program, there are similar phenomena of 'decision-consequence' chains in the hardware for health safety.

Most people don't even understand the surface level of basic signal noise phenomena - where we can even pollute the battery on purpose to demonstrate functional use errors in the form of drawing stroke gaps.

(Also notice how the Apple graphics engineer's responses are not always informed enough.)

Just like how Apple releases obvious bugs in software that show they don't do proper testing, it's the same with the hardware. They are not measuring like I was. Otherwise, fundamental mistakes in material science and antenna design would never have gotten this far out of hand.

The problems of the electronics are a result of the same people problems that are present in the software.

And it's not like we really needed those features in the Apple Pencil 2. Especially the touch pad. That's quite a big health problem for something I personally didn't even want to use.

Using the induction coil to charge means that the signal is always more polluted than what we had in the Apple Pencil 1. And like I pointed out, the signal noise will trigger stroke gaps, because the touch screen is an analog antenna array. It's just outright ignorance for engineers to do these things.

And not understanding how the materials, purity, layout, and fabrication choices influence the interactions is like saying the engineer is missing the fundamental understanding required to be adequate for their role.

Why are we repeating age old mistakes like how the shell is reverberating the signal of the processor at an audible level, and out through the cable?

And I can almost predict that even though they were told years ago, the material choices and construction of the next iPad will be even worse. You would think a rational company that says "we care" would listen to things they could easily avoid ahead of time, and that's what makes me so upset about it.
