Good day, everyone. I have some questions.
I get a color frame from the iPad camera and render it in an SCNView using Metal.
The frame data's pixel format is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
and I can't change it because the frame comes from a private SDK...
I receive a CMSampleBuffer every frame, and a Metal compute shader converts it to an RGBA color texture.
The weird part is that the Metal texture converted from YCbCr is darker or brighter than the original image.
This is my Metal shader code:
// https://github.com/google/filament/blob/main/filament/backend/src/metal/MetalExternalImage.mm
#include <metal_stdlib>
#include <simd/simd.h>
using namespace metal;
kernel void
ycbcrToRgb(texture2d<half, access::read> inYTexture [[texture(0)]],
           texture2d<half, access::read> inCbCrTexture [[texture(1)]],
           texture2d<half, access::write> outTexture [[texture(2)]],
           uint2 gid [[thread_position_in_grid]])
{
    if (gid.x >= outTexture.get_width() || gid.y >= outTexture.get_height()) {
        return;
    }
    half luminance = inYTexture.read(gid).r;
    // The color plane is half the size of the luminance plane.
    half2 color = inCbCrTexture.read(gid / 2).rg;
    half4 ycbcr = half4(luminance, color, 1.0);
    // Full-range BT.601 YCbCr -> RGB matrix (column-major).
    const half4x4 ycbcrToRGBTransform = half4x4(
        half4(+1.0000f, +1.0000f, +1.0000f, +0.0000f),
        half4(+0.0000f, -0.3441f, +1.7720f, +0.0000f),
        half4(+1.4020f, -0.7141f, +0.0000f, +0.0000f),
        half4(-0.7010f, +0.5291f, -0.8860f, +1.0000f)
    );
    outTexture.write(ycbcrToRGBTransform * ycbcr, gid);
}
And this is my render code:
func render(colorFrame: STColorFrame) {
    // var frame = CGImage.create(sampleBuffer: colorFrame.sampleBuffer)!
    let buffer = CMSampleBufferGetImageBuffer(colorFrame.sampleBuffer)!
    convertVideoFrameToImage1(buffer)
    scnview.scene?.background.contents = outTexture
}
private func convertVideoFrameToImage1(_ buffer: CVImageBuffer) {
    // kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    CVPixelBufferLockBaseAddress(buffer, CVPixelBufferLockFlags(rawValue: 0))
    let commandQueue = device!.makeCommandQueue()!
    let library = device!.makeDefaultLibrary()!
    let commandBuffer = commandQueue.makeCommandBuffer()!
    let encoder = commandBuffer.makeComputeCommandEncoder()!
    encoder.setComputePipelineState(
        try! device!.makeComputePipelineState(function:
            library.makeFunction(name: "ycbcrToRgb")!))

    // Input: extract the Y and CbCr planes as Metal textures.
    // https://stackoverflow.com/questions/58175811/how-to-convert-an-rgba-texture-to-y-and-cbcr-textures-in-metal
    let imageTextureY = createTexture(fromPixelBuffer: buffer,
                                      pixelFormat: .r8Unorm, planeIndex: 0)!
    let imageTextureCbCr = createTexture(fromPixelBuffer: buffer,
                                         pixelFormat: .rg8Unorm, planeIndex: 1)!
    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    encoder.setTexture(imageTextureY, index: 0)
    encoder.setTexture(imageTextureCbCr, index: 1)

    // Output: lazily create the RGBA destination texture.
    if outTexture == nil {
        let descriptor = MTLTextureDescriptor()
        descriptor.textureType = .type2D
        descriptor.pixelFormat = .rgba32Float
        descriptor.width = width
        descriptor.height = height
        descriptor.usage = [.shaderWrite, .shaderRead]
        outTexture = device!.makeTexture(descriptor: descriptor)
    }
    encoder.setTexture(outTexture, index: 2)

    // Dispatch one thread per pixel: 32x32 threads per threadgroup,
    // and enough threadgroups to cover the whole image.
    let threadsPerThreadgroup = MTLSize(width: 32, height: 32, depth: 1)
    let numThreadgroups = MTLSize(width: (width + 31) / 32,
                                  height: (height + 31) / 32, depth: 1)
    encoder.dispatchThreadgroups(numThreadgroups,
                                 threadsPerThreadgroup: threadsPerThreadgroup)
    encoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
    CVPixelBufferUnlockBaseAddress(buffer, CVPixelBufferLockFlags(rawValue: 0))
}
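For completeness, createTexture(fromPixelBuffer:pixelFormat:planeIndex:) isn't shown above; it's roughly the usual CVMetalTextureCache-based helper, sketched below. (textureCache is a CVMetalTextureCache I create once with CVMetalTextureCacheCreate, so that detail is an assumption about my setup, not anything from the SDK.)
private func createTexture(fromPixelBuffer pixelBuffer: CVPixelBuffer,
                           pixelFormat: MTLPixelFormat,
                           planeIndex: Int) -> MTLTexture? {
    // Plane dimensions differ: the CbCr plane is half the size of the Y plane.
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex)
    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                           textureCache,
                                                           pixelBuffer,
                                                           nil,
                                                           pixelFormat,
                                                           width,
                                                           height,
                                                           planeIndex,
                                                           &cvTexture)
    guard status == kCVReturnSuccess, let cvTexture = cvTexture else { return nil }
    return CVMetalTextureGetTexture(cvTexture)
}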
The original image is this,
but it renders in the SCNView like this.
They are a little bit different... I've been thinking about it for three days.
Can you help me?
Hello, everyone. I work as an AR graphics programmer.
I have a question about alpha blending in SceneKit with an ARFrame.
I expected that when the shader sets a pixel's alpha to zero, the background color shows through, like this.
And indeed there's no problem when the SCNView scene has no background.
But when I set scnview.scene.background.contents to the AR frame, those pixels look black, like this.
I could use discard in the fragment shader, but I've heard that discarding fragments causes performance issues.
(And I implement the shader using SCNProgram...)
Can changing an alpha-blend option or a transparency option solve this?
I hope to solve it.
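To make the question concrete, here is a minimal sketch of the SceneKit-side options I'm asking about (the shader function names and the material setup are placeholders from my own project, not anything official):
let program = SCNProgram()
program.vertexFunctionName = "myVertex"       // placeholder shader names
program.fragmentFunctionName = "myFragment"
program.isOpaque = false                      // tell SceneKit the program can output transparent pixels

let material = SCNMaterial()
material.program = program
material.blendMode = .alpha                   // source-alpha / one-minus-source-alpha blending
material.transparencyMode = .aOne             // use the alpha channel for transparency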
Hello everyone,
I'm working on an AR medical-solution project and have some questions.
We get a depth frame, a color frame, a camera transform matrix, and a camera projection matrix every frame.
For convenience, I render in an SCNView using the camera matrix and projection matrix.
I want the color frame and the camera matrices that belong to one capture to be applied in the same rendered frame, but they aren't.
Maybe the draw calls happen at different times.
This is my code; can you advise me?
scnview.delegate = self
...
@MainActor func renderer(_: SCNSceneRenderer,
                         didRenderScene _: SCNScene,
                         atTime _: TimeInterval)
{
    if let camTransform = lastFrame.camTransform,
       let camProjection = lastFrame.camProjection,
       let colorFrame = lastFrame.colorFrame
    {
        mars.updateCamera(cameraPose: camTransform,
                          cameraGLProjection: camProjection)
        // update camera transforms in SceneKit
        render(colorFrame: colorFrame)
    }
}

func render(colorFrame: STColorFrame) {
    let buffer = CMSampleBufferGetImageBuffer(colorFrame.sampleBuffer)!
    convertVideoFrameToImage(buffer)
    // YCbCr buffer to RGBA image texture via a Metal compute shader
    scnview.scene?.background.contents = outTexture
}
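For context, this is the per-frame callback order I understand from the SCNSceneRendererDelegate documentation (a sketch; ViewController and the empty bodies are placeholders), which is why I suspect the background update in didRenderScene only becomes visible a frame after the camera matrices are applied:
extension ViewController: SCNSceneRendererDelegate {
    // 1. Update phase: before animations/actions are applied for this frame.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) { }
    // 2. After animations are applied.
    func renderer(_ renderer: SCNSceneRenderer, didApplyAnimationsAtTime time: TimeInterval) { }
    // 3. After physics simulation.
    func renderer(_ renderer: SCNSceneRenderer, didSimulatePhysicsAtTime time: TimeInterval) { }
    // 4. Just before SceneKit renders the scene.
    func renderer(_ renderer: SCNSceneRenderer, willRenderScene scene: SCNScene, atTime time: TimeInterval) { }
    // 5. After the scene was rendered -- this is where my code above runs.
    func renderer(_ renderer: SCNSceneRenderer, didRenderScene scene: SCNScene, atTime time: TimeInterval) { }
}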
Hello everyone,
I've checked the color pixel format (colorPixelFormat) of my SCNView,
but it's a get-only property.
I want to change it from bgra8Unorm to bgra8Unorm_srgb.
How can I change it?
Hello developers,
I have an original image, an ordinary one,
and I set it in an image view like this:
imageview.image = UIImage(cgImage: image)
The result is this.
Then I set the same image as the SceneKit background like this:
scnview.scene.background.contents = image
And the result is this.
I use the same image in two different views, and the one in the SCNView is darker than the original, and I don't know why...
I found someone on Google with the same problem, but there is no answer.
(https://stackoverflow.com/questions/60679819/adding-uiimageview-to-arscnview-scene-background-causes-saturation-hue-to-be-off)
I checked the SCNView's pixel format; it was bgra8Unorm.
What's the problem?
Hello, developers,
I'm implementing slice rendering of a 3D volume,
and I have a simple question...
I use a simple vertex type both in Swift code and in Metal code. At first I defined uv as float2, but it doesn't work; I get weird texture coordinates when I use float2...
public struct VertexIn: sizeable {
    var position = float3()
    var normal = float3()
    var uv = float3()
}
struct VertexIn {
    float3 position [[ attribute(0) ]];
    float3 normal   [[ attribute(1) ]];
    float3 uv       [[ attribute(2) ]];
};
The results look like this.
With float2:
With float3:
The only difference is the uv type. I have the same issue when passing uniforms to the shader: when I pass a uniform that includes float or short fields it doesn't work, so I changed the types to float3... So my question is: are Metal data types really that different from Swift types? Or which types match up and are supported by Metal?
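After digging, my working assumption is that this is a memory-layout mismatch rather than an unsupported type: Swift's SIMD float3 is padded to 16 bytes, so hand-counted offsets and strides drift. Below is a minimal sketch of how I'd describe the float2 uv layout to Metal with a vertex descriptor, letting MemoryLayout report the real offsets and stride (the names mirror my code above; treat the whole thing as an assumption, not an official answer):
import Metal
import simd

struct VertexIn {
    var position = SIMD3<Float>()   // stride 16, not 12
    var normal   = SIMD3<Float>()
    var uv       = SIMD2<Float>()   // the variant I originally wanted
}

let vertexDescriptor = MTLVertexDescriptor()
vertexDescriptor.attributes[0].format = .float3
vertexDescriptor.attributes[0].offset = MemoryLayout<VertexIn>.offset(of: \.position)!
vertexDescriptor.attributes[0].bufferIndex = 0
vertexDescriptor.attributes[1].format = .float3
vertexDescriptor.attributes[1].offset = MemoryLayout<VertexIn>.offset(of: \.normal)!
vertexDescriptor.attributes[1].bufferIndex = 0
vertexDescriptor.attributes[2].format = .float2
vertexDescriptor.attributes[2].offset = MemoryLayout<VertexIn>.offset(of: \.uv)!
vertexDescriptor.attributes[2].bufferIndex = 0
vertexDescriptor.layouts[0].stride = MemoryLayout<VertexIn>.stride
On the Metal side the struct would then declare float2 uv [[ attribute(2) ]] and read it via [[ stage_in ]].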
Hello, everybody.
I'm trying to port graphics code written in Cg in Unity to Metal.
One more thing: I don't want to implement a scene graph manually, so I'm going to use SceneKit.
So I should use SCNProgram or SCNNodeRendererDelegate, and I think SCNProgram is more convenient.
My real question is how to convert this render-state code from Cg/ShaderLab:
Cull Front
ZTest LEqual
ZWrite On
Blend SrcAlpha OneMinusSrcAlpha
I know how to set source-alpha blending on an MTLRenderPipelineDescriptor, the depth test on a render command encoder, and face culling too. But when I use SCNProgram or SCNNodeRendererDelegate, I can't find these options... How do I change them? Help me.
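For reference, the closest SceneKit-level equivalents I've found so far live on SCNMaterial, sketched below (this assumes the material carrying the SCNProgram is the one attached to the geometry; myProgram is a placeholder):
let material = SCNMaterial()
material.program = myProgram               // myProgram: the SCNProgram in question (placeholder)
myProgram.isOpaque = false                 // let SceneKit treat the output as transparent
material.cullMode = .front                 // Cull Front
material.writesToDepthBuffer = true        // ZWrite On
material.readsFromDepthBuffer = true       // depth testing on (ZTest; the compare function isn't exposed here)
material.blendMode = .alpha                // Blend SrcAlpha OneMinusSrcAlpha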
Hello, everyone.
I'm trying to use Metal together with SceneKit: SceneKit's scene graph is great, and I want to implement low-level Metal shaders.
I want to use SCNNodeRendererDelegate, without SCNProgram, because I want low-level control, for example passing extra custom MTLBuffers or doing multi-pass rendering.
So I pass the model-view-projection matrix like this.
In the Metal shader:
struct NodeBuffer {
    float4x4 modelTransform;
    float4x4 modelViewProjectionTransform;
    float4x4 modelViewTransform;
    float4x4 normalTransform;
    float2x3 boundingBox;
};
In Swift code:
struct NodeMatrix: sizeable {
    var modelTransform = float4x4()
    var modelViewProjectionTransform = float4x4()
    var modelViewTransform = float4x4()
    var normalTransform = float4x4()
    var boundingBox = float2x3()
}
...
private func updateNodeMatrix(_ camNode: SCNNode) {
    guard let camera = camNode.camera else {
        return
    }
    let modelMatrix = transform
    let viewMatrix = camNode.transform
    let projectionMatrix = camera.projectionTransform
    let viewProjection = SCNMatrix4Mult(viewMatrix, projectionMatrix)
    let modelViewProjection = SCNMatrix4Mult(modelMatrix, viewProjection)
    nodeMatrix.modelViewProjectionTransform = float4x4(modelViewProjection)
}
...
public func renderNode(_ node: SCNNode,
                       renderer: SCNRenderer,
                       arguments: [String: Any])
{
    guard let renderTexturePipelineState = renderTexturePipelineState,
          let renderCommandEncoder = renderer.currentRenderCommandEncoder,
          let camNode = renderer.pointOfView,
          let texture = texture
    else { return }
    updateNodeMatrix(camNode)
    guard let nodeBuffer
        = renderer.device?.makeBuffer(bytes: &nodeMatrix,
                                      length: NodeMatrix.stride,
                                      options: [])
    else { return }
    renderCommandEncoder.setDepthStencilState(depthState)
    renderCommandEncoder.setRenderPipelineState(renderTexturePipelineState)
    renderCommandEncoder.setFragmentTexture(texture, index: 0)
    renderCommandEncoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
    renderCommandEncoder.setVertexBuffer(nodeBuffer, offset: 0, index: 1)
    renderCommandEncoder.drawIndexedPrimitives(type: .triangle,
                                               indexCount: indexCount,
                                               indexType: .uint16,
                                               indexBuffer: indexBuffer,
                                               indexBufferOffset: 0)
}
But I get the wrong model-view-projection matrix in the shader.
I think SceneKit modifies some intermediate transform behind the scenes.
I can't figure it out; help me...
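In case it clarifies what I'm attempting, here is a sketch of the matrix math I believe should be equivalent, done in simd with the camera node's world transform inverted for the view matrix (this assumes my class is an SCNNode subclass, which is why simdWorldTransform is available; it's my assumption, not something SceneKit documents for SCNNodeRendererDelegate):
private func updateNodeMatrix(_ camNode: SCNNode) {
    guard let camera = camNode.camera else { return }
    let model = simdWorldTransform                        // this node's model (world) matrix
    let view = camNode.simdWorldTransform.inverse         // view = inverse(camera world transform)
    let projection = SCNMatrix4ToMat4(camera.projectionTransform)
    // simd matrices are column-major, so MVP = P * V * M.
    nodeMatrix.modelTransform = model
    nodeMatrix.modelViewTransform = view * model
    nodeMatrix.modelViewProjectionTransform = projection * view * model
}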
Hello,
I want to know which LiDAR depth resolutions Apple supports.
But I couldn't find anything about this.
My device is an 11-inch iPad Pro (3rd generation).
It seems 256 × 192 is the default (I saw this from ARKit).
Where can I find information about the supported resolution options?
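For reference, this is how I checked the resolution I actually receive at runtime (a sketch; it assumes the session configuration has the .sceneDepth frame semantics enabled):
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // sceneDepth is only delivered when frameSemantics includes .sceneDepth.
    if let depthMap = frame.sceneDepth?.depthMap {
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("scene depth resolution: \(width) x \(height)")   // 256 x 192 on my iPad
    }
}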
Hello, everyone,
I'm having trouble building for the iOS simulator.
I'm trying to use Open3D's Python bindings in my project, following this example: https://github.com/kewlbear/Open3D-iOS/issues
I use Open3D-iOS, PythonKit, and a few other packages...
I have no problem when I build for a real device, but this error occurs when I build for the arm64 simulator...
Error message:
Ld /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PythonKitTest.app/PythonKitTest normal (in target 'PythonKitTest' from project 'PythonKitTest')
cd /Users/wonki/Documents/Project/UsingPythonKitTest/PythonKitTest
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -target arm64-apple-ios15.0-simulator -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.0.sdk -L/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PackageFrameworks -F/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator -filelist /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/PythonKitTest.build/Debug-iphonesimulator/PythonKitTest.build/Objects-normal/arm64/PythonKitTest.LinkFileList -Xlinker -rpath -Xlinker @executable_path/Frameworks -dead_strip -Xlinker -object_path_lto -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/PythonKitTest.build/Debug-iphonesimulator/PythonKitTest.build/Objects-normal/arm64/PythonKitTest_lto.o -Xlinker -export_dynamic -Xlinker -no_deduplicate -Xlinker -objc_abi_version -Xlinker 2 -fobjc-link-runtime -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/swift/iphonesimulator -L/usr/lib/swift -Xlinker -add_ast_path -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/PythonKitTest.build/Debug-iphonesimulator/PythonKitTest.build/Objects-normal/arm64/PythonKitTest.swiftmodule -lc++ -lstdc++ -framework Accelerate -framework Accelerate -lz -lsqlite3 -Xlinker -sectcreate -Xlinker __TEXT -Xlinker __entitlements -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/PythonKitTest.build/Debug-iphonesimulator/PythonKitTest.build/PythonKitTest.app-Simulated.xcent -lassimp -lfaiss -lIrrXML -lturbojpeg -ljsoncpp -lOpen3D_3rdparty_lzf -lOpen3D_3rdparty_qhull_r -lOpen3D_3rdparty_qhullcpp -lOpen3D_3rdparty_rply -lOpen3D -lpng16 
/Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/pybind.a -ltbb_static -lassimp -lfaiss -lIrrXML -lturbojpeg -ljsoncpp -lOpen3D_3rdparty_lzf -lOpen3D_3rdparty_qhull_r -lOpen3D_3rdparty_qhullcpp -lOpen3D_3rdparty_rply -lOpen3D -lpng16 /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/pybind.a -ltbb_static -lnumpy -lnpymath -lnpyrandom -lnumpy -lnpymath -lnpyrandom -lpython3 -lssl -lcrypto -lffi -lpython3 -lssl -lcrypto -lffi -llapacke-iOS-simulator -Xlinker -no_adhoc_codesign -Xlinker -dependency_info -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/PythonKitTest.build/Debug-iphonesimulator/PythonKitTest.build/Objects-normal/arm64/PythonKitTest_dependency_info.dat -o /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Products/Debug-iphonesimulator/PythonKitTest.app/PythonKitTest -Xlinker -add_ast_path -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/Open3D-iOS.build/Debug-iphonesimulator/Open3DSupport.build/Objects-normal/arm64/Open3DSupport.swiftmodule -Xlinker -add_ast_path -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/NumPy-iOS.build/Debug-iphonesimulator/NumPySupport.build/Objects-normal/arm64/NumPySupport.swiftmodule -Xlinker -add_ast_path -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/Python-iOS.build/Debug-iphonesimulator/PythonSupport.build/Objects-normal/arm64/PythonSupport.swiftmodule -Xlinker -add_ast_path -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/LAPACKE-iOS.build/Debug-iphonesimulator/_LapackeLink.build/Objects-normal/arm64/_LapackeLink.swiftmodule -Xlinker -add_ast_path -Xlinker /Users/wonki/Library/Developer/Xcode/DerivedData/PythonKitTest-geoicgbvhlgamxbhtzfsbgdqmfft/Build/Intermediates.noindex/PythonKit.build/Debug-iphonesimulator/PythonKit.build/Objects-normal/arm64/PythonKit.swiftmodule
Undefined symbols for architecture arm64:
"_ffi_type_longdouble", referenced from:
_formattable in libpython3.a(cfield.o)
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Storyboard vs SwiftUI
I have only worked with storyboards for two years, so I want to learn about SwiftUI and the XIB style. Could you recommend which approach is popular these days? I'd like a simple summary of SwiftUI's key features, pros, and cons.
Hello developers,
I've already seen parallel code that uses a constant number of concurrent tasks, like this example from the official documentation:
async let firstPhoto = downloadPhoto(named: photoNames[0])
async let secondPhoto = downloadPhoto(named: photoNames[1])
async let thirdPhoto = downloadPhoto(named: photoNames[2])
let photos = await [firstPhoto, secondPhoto, thirdPhoto]
show(photos)
But I want to do this for a variable number of tasks, something like:
for photoName in photoNames {
async let ... = downloadPhoto(named: photoName)
}
let photos = await ...
show(photos)
Does Swift 5.5 support this case?
Should I use a TaskGroup, by any chance?
I already use TaskGroup, but I want to know whether Swift supports the async-let form for this.
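For anyone searching later, the variable-count version I ended up with uses withTaskGroup, sketched below with the current spelling group.addTask (I'm assuming downloadPhoto(named:) returns Data here, matching the documentation example; note the group delivers results in completion order, not request order):
let photos = await withTaskGroup(of: Data.self) { group -> [Data] in
    // Spawn one child task per photo name; the count is decided at runtime.
    for photoName in photoNames {
        group.addTask {
            await downloadPhoto(named: photoName)
        }
    }
    // Collect results as the child tasks finish.
    var results: [Data] = []
    for await photo in group {
        results.append(photo)
    }
    return results
}
show(photos)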