Happy days, I managed to get something working by hacking together some bits from a few places.
// Based on code from https://github.com/JohnCoates/Slate
fragment half4 scene_filter_fragment_chromatic_abberation(VertexOut vert [[stage_in]],
                                                          texture2d<half, access::sample> scene [[texture(0)]])
{
    float2 coordinates = vert.texcoord;
    constexpr sampler samp = sampler(coord::normalized, address::repeat, filter::nearest);
    half4 color = scene.sample(samp, coordinates);
    float2 offset = (coordinates - 0.4) * 2.0;
    float offsetDot = dot(offset, offset);
    const float strength = 5.0;
    float2 multiplier = strength * offset * offsetDot;
    float2 redCoordinate = coordinates - 0.003 * multiplier;
    float2 blueCoordinate = coordinates + 0.01 * multiplier;
    half4 adjustedColor;
    adjustedColor.r = scene.sample(samp, redCoordinate).r;
    adjustedColor.g = color.g;
    adjustedColor.b = scene.sample(samp, blueCoordinate).b;
    adjustedColor.a = color.a;
    return adjustedColor;
}
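For completeness, the fragment function is wired up through an SCNTechnique that gets assigned to the ARSCNView. A minimal sketch of the loading side, assuming the technique definition lives in a plist in the app bundle (the plist file name here is just a placeholder; its pass has to reference the scene_filter_fragment_chromatic_abberation function):
// "ChromaticAberration.plist" is a placeholder name for the SCNTechnique definition.
if let url = Bundle.main.url(forResource: "ChromaticAberration", withExtension: "plist"),
   let dictionary = NSDictionary(contentsOf: url) as? [String: Any],
   let technique = SCNTechnique(dictionary: dictionary) {
    sceneView.technique = technique
}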
And I need to turn off the camera grain to get the filter to apply to the entire scene:
if #available(iOS 13.0, *) {
    sceneView.rendersCameraGrain = false
}
I have a repo here: https://github.com/ManjitBedi/SCNTechnique-Experiment
This project on GitHub really helped me out as well: https://github.com/2RKowski/ARSCNViewImageFiltersExample
I have been experiencing this issue with a MacBook Pro 14" model since I bought it late last year (2021).
I went to the Apple Store & at the Genius Bar they checked my MacBook & then referred me to technical support; I uploaded a video recording showing the audio issues & also uploaded system logs. This was back in January when I was in contact with technical support; the last I heard was that it was being investigated by engineering at Apple...
I took a look at the Hello World sample & I see I made 2 mistakes:
use Entity(named:in:)
and use the appropriate path to the scene:
import SwiftUI
import RealityKit
import HeadsOnFire // the Reality Composer Pro package that contains the scene

struct ContentView: View {
    var body: some View {
        RealityView { content in
            do {
                let scene = try await Entity(named: "Scenes/HOF", in: HeadsOnFire.headsOnFireBundle)
                content.add(scene)
            } catch is CancellationError {
                // The entity initializer can throw this error if an enclosing
                // RealityView disappears before the model loads. Exit gracefully.
                return
            } catch let error {
                // Other errors indicate unrecoverable problems.
                print("Failed to load cube: \(error)")
            }
        }
    }
}
I am also keen to know if this is possible &/or what the options are; I have a UIKit application that I would like to add RealityKit content to as well. I did watch the Apple developer video; it alludes to what is possible - it would be nice if there was a complete code example.
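For anyone else looking, the most direct route I can see is hosting a SwiftUI RealityView from the UIKit side with UIHostingController - a minimal sketch, assuming a visionOS target; the view and controller names are just placeholders:
import SwiftUI
import RealityKit
import UIKit

// Placeholder SwiftUI view that builds some RealityKit content.
struct ImmersiveContentView: View {
    var body: some View {
        RealityView { content in
            // A simple generated entity so the sketch is self-contained.
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1),
                                     materials: [SimpleMaterial(color: .blue, isMetallic: false)])
            content.add(sphere)
        }
    }
}

// Existing UIKit view controller embedding the SwiftUI/RealityKit content.
final class ContainerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let hosting = UIHostingController(rootView: ImmersiveContentView())
        addChild(hosting)
        hosting.view.frame = view.bounds
        hosting.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(hosting.view)
        hosting.didMove(toParent: self)
    }
}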
I would also like to know if this is possible with Reality Composer Pro v1 - is something possible with shader graph?
Does RCP v2 with Xcode 16 have any new options?
In my case, I created a plane in Blender with a texture image that has an alpha mask. When I view the USDZ in visionOS on a device or in the simulator, the mesh is quite dark. I am loading the scene with Model3D & not a RealityKit view (for various reasons).
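For context, the loading side is just the plain Model3D pattern - a minimal sketch, assuming the exported USDZ ships in the app bundle (the file and view names are placeholders):
import SwiftUI
import RealityKit

struct PlaneModelView: View {
    var body: some View {
        // "TexturedPlane.usdz" is a placeholder name for the file exported from Blender.
        if let url = Bundle.main.url(forResource: "TexturedPlane", withExtension: "usdz") {
            Model3D(url: url) { model in
                model
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            } placeholder: {
                ProgressView()
            }
        }
    }
}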