Custom shaders and particle systems in RealityKit

Is there a way to apply custom shaders to RealityKit ModelEntity objects?

Also, is it possible to create particle systems in RealityKit?

Accepted Reply

User-provided custom shaders are not supported in RealityKit at present. You can signal your interest by submitting feedback with Feedback Assistant, if you want.

There is no built-in particle system in RealityKit. Submitting feedback is also useful on that front, if it is a feature you would use.
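In the meantime, materials are limited to the built-in types (SimpleMaterial, UnlitMaterial, OcclusionMaterial, and so on). A minimal sketch of assigning one of those to a ModelEntity, where the box and anchor are just placeholders:

```swift
import RealityKit
import UIKit

// Placeholder anchor; in a real app this would come from your ARView's scene.
let anchor = AnchorEntity(world: .zero)

// With custom shaders unavailable, the built-in materials are the current options.
let material = SimpleMaterial(color: .systemBlue, roughness: 0.4, isMetallic: false)

let box = ModelEntity(mesh: .generateBox(size: 0.1), materials: [material])
anchor.addChild(box)
```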

Replies

It seems that there is not currently a way to add custom shaders, according to this thread: https://developer.apple.com/forums/thread/131858
It seems that in RealityKit, video materials work something like custom shaders, which is kind of ingenious. It looks like there is a different video for each UV map of a model, and these video segments are packed together into a video-material atlas that plays as a single video. Take a look at the WWDC 2020 session "What's new in RealityKit." This seems like it would let artists and creatives build video materials instead of writing GPU shader code. That is really out-of-the-box thinking!
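If anyone wants to experiment with this, here's a rough sketch using the VideoMaterial API from that session (iOS 14+; the file name and mesh below are just placeholders):

```swift
import RealityKit
import AVFoundation

// "clip.mp4" is a placeholder; use any video asset bundled with the app.
guard let url = Bundle.main.url(forResource: "clip", withExtension: "mp4") else {
    fatalError("Add a video named clip.mp4 to the app bundle")
}
let player = AVPlayer(url: url)
let videoMaterial = VideoMaterial(avPlayer: player)

// The model's UV mapping determines how the video frames wrap the surface.
let screen = ModelEntity(mesh: .generatePlane(width: 0.4, depth: 0.3),
                         materials: [videoMaterial])
player.play()
```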
Hi,

I believe curveddesign has a good point; the ability to use mapped movie materials will help achieve the effects and results many developers would like, perhaps not served up the way they might ‘normally’ think of applying a ‘shader’ file effect. The movie material is an economical way to deal with some ‘transparency’ needs: apply the transparency needed on the frames of the movie, then apply the movie as the ‘material’ on the object.

During the past 6 months or so since I discovered Reality Composer (with a dash of RealityKit and Reality Converter on the Mac) for the iPad Pro + Pencil 2, I have been consistently pleased with how good a 3-D production environment it is. In general, I have found that ‘there’s usually a way’ to do what you’d like to accomplish, but it isn’t always obvious from the Developer documentation ‘how to do things’. I do understand the documentation focus is on addressing specifics of ‘the what’.

The ‘thinking outside the point cloud’ aspect has been important for me. More than once I have attempted ‘this or that idea’ in a Reality Composer scene; I’ll work on it for a while, and if I hit the wall, I stop. I put that project file away and find another file, with less lofty aspirations, to work on. It may take some re-reading of both Apple Developer resources and others, but within a few days to a week I can usually find at least a ‘different approach’ to try.

Getting ‘outside the box’, for me, has often been coupled with ‘widening my focus’ to include ‘combining’ features and capabilities in ways I hadn’t considered before. It also seems one gets there quicker by knowing when to stop working on one strategy or approach; at least long enough to scramble out of the ‘box’ for a bit!

Somewhere along the way ‘out of the box’, I discovered that ‘Sometimes Reality is the Strangest Fantasy of them all!’ Another good reason to ‘widen the focus’ now and then.


Best,

Catfish


The video materials look pretty interesting, but they seem like a lot more effort to create (for a developer, anyway) than shaders, and less powerful. I'm happy it's something dynamic, though.

I really wish RealityKit had custom shaders and a particle system. The Entity-Component architecture (and the other APIs) is so nice that I'd love to use it for non-AR and possibly AR work. But without those features, it doesn't quite match what SceneKit can do.
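For what it's worth, here's roughly what the Entity-Component side looks like today; the Spin component and the arView in the comment are just illustrative names, not anything built in:

```swift
import RealityKit
import simd

// A made-up component carrying per-entity data.
struct Spin: Component {
    var radiansPerSecond: Float = .pi
}
Spin.registerComponent()

let entity = ModelEntity(mesh: .generateSphere(radius: 0.05),
                         materials: [SimpleMaterial()])
entity.components[Spin.self] = Spin()

// Lacking SceneKit-style shader/particle hooks, per-frame behaviour can hang
// off the scene's update event (keep a reference to the subscription):
// let updateSub = arView.scene.subscribe(to: SceneEvents.Update.self) { event in
//     if let spin = entity.components[Spin.self] {
//         let turn = simd_quatf(angle: spin.radiansPerSecond * Float(event.deltaTime),
//                               axis: [0, 1, 0])
//         entity.transform.rotation = turn * entity.transform.rotation
//     }
// }
```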

I'll file feedback requesting the features, and hopefully others will too.