Though somewhat late with the answer, here is the link to the Swift Playgrounds book that I made recently:
https://github.com/gadirom/Metal-Compute-Offscreen-Render-and-Postprocess
It will run on an iPad or a Mac.
Instead of loading shaders from a file, I load them from a String constant. The downside is that you won't get syntax highlighting.
Anyway, being able to edit and run Metal shader code and experiment with different pipeline logic right on the device is a nice experience! :)
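To give a sense of how the from-a-String approach looks, here is a minimal sketch of compiling a compute pipeline from a Swift string with makeLibrary(source:options:). The kernel and its name are made up for illustration and are not taken from the book:

import Metal

// Illustrative shader source kept in a String constant instead of a .metal file.
let shaderSource = """
#include <metal_stdlib>
using namespace metal;

kernel void fillRed(texture2d<float, access::write> outTexture [[texture(0)]],
                    uint2 gid [[thread_position_in_grid]]) {
    if (gid.x >= outTexture.get_width() || gid.y >= outTexture.get_height()) { return; }
    outTexture.write(float4(1.0, 0.0, 0.0, 1.0), gid);
}
"""

do {
    let device = MTLCreateSystemDefaultDevice()!
    // Compile the source string at runtime; compile errors are thrown here
    // instead of being highlighted in the editor.
    let library = try device.makeLibrary(source: shaderSource, options: nil)
    let kernel = library.makeFunction(name: "fillRed")!
    let pipeline = try device.makeComputePipelineState(function: kernel)
    print("Compiled pipeline, max threads per threadgroup:",
          pipeline.maxTotalThreadsPerThreadgroup)
} catch {
    print("Shader compilation failed:", error)
}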
This means that you are using a light theme, hence white instead of black.
The image on the right should be identical to the one on the left: the coloured gradient should be visible through the transparent parts of the image.
Yes, I tried
format.opaque = true
though this would make no sense, since I want to work with a non-opaque view.
To get a sense of what I need, you could use this function:
extension View {
    /// Renders the view into a UIImage at the given display scale (ImageRenderer is iOS 16+).
    @MainActor func render(scale: CGFloat) -> UIImage? {
        let renderer = ImageRenderer(content: self)
        renderer.scale = scale
        return renderer.uiImage
    }
}
And use it like this:
struct ContentView: View {
    @Environment(\.displayScale) var scale

    var view: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundColor(.accentColor)
            Text("Hello, world!")
        }
    }

    var body: some View {
        HStack {
            view
            Image(uiImage: view.render(scale: scale)!)
        }
        .background(LinearGradient(stops: [.init(color: .green, location: 0),
                                           .init(color: .red, location: 1)],
                                   startPoint: .bottom, endPoint: .top))
    }
}
This should work perfectly, but only on iOS 16, and I need to support earlier iOS versions.
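For reference, on earlier iOS versions the usual route for this kind of snapshot is UIHostingController plus UIGraphicsImageRenderer, which is also where the format.opaque flag mentioned above lives. A rough sketch, with a hypothetical helper name and a simplistic sizing strategy (this is an assumption about the setup, not code from this thread):

import SwiftUI
import UIKit

extension View {
    // Hypothetical pre-iOS 16 fallback: snapshot the view via UIHostingController.
    @MainActor func renderLegacy(scale: CGFloat) -> UIImage {
        let controller = UIHostingController(rootView: self)
        controller.view.backgroundColor = .clear
        let size = controller.view.intrinsicContentSize
        controller.view.bounds = CGRect(origin: .zero, size: size)

        let format = UIGraphicsImageRendererFormat()
        format.scale = scale
        format.opaque = false   // keep the alpha channel; opaque = true would flatten it

        let renderer = UIGraphicsImageRenderer(size: size, format: format)
        return renderer.image { _ in
            // drawHierarchy returns a Bool success flag; ignored here for brevity.
            _ = controller.view.drawHierarchy(in: controller.view.bounds,
                                              afterScreenUpdates: true)
        }
    }
}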
The problem was caused by this line:
int3 mapPos = int3(floor(rayPos + 0.5));
I used ChatGPT for the code conversion and somehow missed that it had silently added the + 0.5.
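For comparison, assuming rayPos is a float3 and the original source used a plain floor with no offset, the before/after looks like this:

// Produced by the conversion: the + 0.5 makes this round to the nearest cell.
int3 mapPosRounded = int3(floor(rayPos + 0.5));
// Without the silent offset: floor alone gives the cell containing rayPos.
int3 mapPos = int3(floor(rayPos));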