I have been reading up on creating Metal filters and how they can be used with SCNTechnique to apply a post-processing effect with ARKit. I can get the basic effects working by following other examples.
I would like to use filters that FlexMonkey made some years ago; as far as I can tell they are written in a C-like kernel language. E.g.:
https://github.com/FlexMonkey/Filterpedia/blob/master/Filterpedia/customFilters/TransverseChromaticAberration.swift
https://github.com/FlexMonkey/Filterpedia
Is there a way to adapt such a filter to be used with a Metal shader?
I am quite willing to watch any relevant WWDC videos to learn how to do this.
Ideally, I want to apply a chromatic aberration effect that shifts the RGB channel positions using a time uniform.
For reference:
https://en.wikipedia.org/wiki/Chromatic_aberration
Happy days, I managed to get something working by hacking together some bits from a few places.
Code Block
// Based on code from https://github.com/JohnCoates/Slate
fragment half4 scene_filter_fragment_chromatic_abberation(VertexOut vert [[stage_in]],
                                                           texture2d<half, access::sample> scene [[texture(0)]])
{
    float2 coordinates = vert.texcoord;
    constexpr sampler samp = sampler(coord::normalized, address::repeat, filter::nearest);
    half4 color = scene.sample(samp, coordinates);

    float2 offset = (coordinates - 0.4) * 2.0;
    float offsetDot = dot(offset, offset);
    const float strength = 5.0;
    float2 multiplier = strength * offset * offsetDot;
    float2 redCoordinate = coordinates - 0.003 * multiplier;
    float2 blueCoordinate = coordinates + 0.01 * multiplier;

    half4 adjustedColor;
    adjustedColor.r = scene.sample(samp, redCoordinate).r;
    adjustedColor.g = color.g;
    adjustedColor.b = scene.sample(samp, blueCoordinate).b;
    adjustedColor.a = color.a;

    return adjustedColor;
}
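For context, here is roughly how that fragment function gets hooked up to the view. This is only a sketch written as an inline Swift dictionary rather than the plist I actually load in my repo; the pass keys come from the SCNTechnique documentation, and the vertex function name below is a placeholder for whatever pass-through vertex function your shader file defines.
Code Block Swift
import ARKit
import SceneKit

// Sketch: a single DRAW_QUAD pass that reads the rendered scene ("COLOR"),
// runs it through the fragment function above, and writes back to COLOR.
let techniqueDefinition: [String: Any] = [
    "passes": [
        "chromatic_aberration_pass": [
            "draw": "DRAW_QUAD",
            "metalVertexShader": "scene_filter_vertex", // placeholder name
            "metalFragmentShader": "scene_filter_fragment_chromatic_abberation",
            "inputs": ["scene": "COLOR"],   // binds the scene colour buffer to the `scene` texture argument
            "outputs": ["color": "COLOR"]
        ]
    ],
    "sequence": ["chromatic_aberration_pass"]
]

if let technique = SCNTechnique(dictionary: techniqueDefinition) {
    sceneView.technique = technique
}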
I also needed to turn off the camera grain so the filter applies to the entire scene:
Code Block Swift
if #available(iOS 13.0, *) {
    sceneView.rendersCameraGrain = false
}
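For the time-driven RGB shift from the original question, one approach (a sketch, not necessarily what the repo does) is to declare a float symbol, e.g. "time", under "symbols" in the technique definition, reference it from the pass's "inputs", give the fragment function a matching parameter such as constant float& time [[buffer(0)]], and then push the current time into the technique every frame from the renderer delegate. The class and property names below are assumed from the hosting project.
Code Block Swift
import ARKit
import SceneKit

// Sketch, assuming the technique definition declares a float symbol named "time"
// and the pass lists it in "inputs" so the fragment shader receives it.
extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        // Update the symbol each frame so the shader can animate the channel offsets.
        sceneView.technique?.setObject(NSNumber(value: Float(time)),
                                       forKeyedSubscript: "time" as NSCopying)
    }
}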
I have a repo here: https://github.com/ManjitBedi/SCNTechnique-Experiment
This project on GitHub really helped me out as well: https://github.com/2RKowski/ARSCNViewImageFiltersExample