Posts

Post not yet marked as solved
2 Replies
716 Views
Hi, I'm using ARSCNView's delegate (ARSCNViewDelegate) to receive SCNNode (ARSCNFaceGeometry) updates. I've observed cases where the face geometry (blink detection) misses by one frame. I detect this by visually comparing the image and the mesh in the method renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor), where I pull the ARFrame from (renderer as? ARSCNView).session.currentFrame. My question is: is session.currentFrame, in the renderer(_:didUpdate:for:) scope, guaranteed to always be in sync with the face geometry? I suspect a potential problem here, since the geometry is being "pushed" while the current frame is being "pulled".
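One way to make the push/pull mismatch observable is to also adopt ARSessionDelegate, cache the last pushed frame, and compare timestamps when the geometry update arrives. This is a minimal sketch, not an ARKit guarantee; `FaceSyncChecker` and `inSync` are hypothetical names, and the 60 fps frame interval is an assumption.

```swift
import ARKit
import SceneKit

// A minimal sketch: cache the most recent pushed frame from
// ARSessionDelegate, then compare its timestamp to the frame pulled
// via session.currentFrame when the face geometry update arrives.
final class FaceSyncChecker: NSObject, ARSCNViewDelegate, ARSessionDelegate {
    private var lastPushedFrame: ARFrame?

    // Pushed by the session for every captured frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        lastPushedFrame = frame
    }

    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let view = renderer as? ARSCNView,
              let pulled = view.session.currentFrame,
              let pushed = lastPushedFrame else { return }
        // If the pulled frame is newer than the last pushed one,
        // the geometry may lag the image by a frame.
        if !Self.inSync(pulled.timestamp, pushed.timestamp) {
            print("geometry/image mismatch: \(pulled.timestamp - pushed.timestamp)s")
        }
    }

    // Pure helper: two timestamps are "in sync" if they differ by
    // less than one frame interval (60 fps assumed).
    static func inSync(_ a: TimeInterval, _ b: TimeInterval,
                       frameInterval: TimeInterval = 1.0 / 60.0) -> Bool {
        abs(a - b) < frameInterval
    }
}
```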
Post not yet marked as solved
0 Replies
825 Views
I'm animating body + face movements using skeleton animation, SceneKit, and a custom Metal shader. I need to deform some vertices in the vertex shader, so I need skinningJointMatrices in the vertex shader. However, if I just add the line

  float4 skinningJointMatrices[183];

to the NodeBuffer:

struct NodeBuffer {
  float4x4 inverseModelTransform;
  float4x4 inverseModelViewTransform;
  float4x4 modelTransform;
  float4x4 modelViewProjectionTransform;
  float4x4 modelViewTransform;
  float4x4 normalTransform;
  float2x3 boundingBox;
  float4 skinningJointMatrices[765];
};

I get the following assertion:

[SceneKit] Assertion 'C3DSkinnerGetEffectiveCalculationMode(skinner, C3DNodeGetGeometry(context->_nodeUniforms.instanceNode)) == kC3DSkinnerCalculationModeGPUVertexFunction' failed. skinningJointMatrices should only be used when skinning is done in the vertex function

Is there a way to work around this assertion? The code seems to be working fine despite the assertion.
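One possible workaround is to avoid SceneKit's reserved `skinningJointMatrices` symbol entirely and upload the joint palette yourself each frame under a custom uniform name that the vertex shader declares freely. This is a sketch under assumptions, not a confirmed fix: `updateJointPalette` and `jointPalette` are hypothetical names, and binding an array uniform via `setValue(_:forKey:)` with `Data` is the documented custom-variable mechanism for SCNShadable, which may or may not satisfy your shader setup.

```swift
import SceneKit

// A minimal sketch: build the skinning palette on the CPU and hand it to
// the shader under a custom name, sidestepping the reserved
// `skinningJointMatrices` member that triggers the assertion.
func updateJointPalette(for node: SCNNode) {
    guard let skinner = node.skinner else { return }
    var palette: [SCNMatrix4] = []
    for (i, bone) in skinner.bones.enumerated() {
        let inverseBind = skinner.boneInverseBindTransforms?[i].scnMatrix4Value
            ?? SCNMatrix4Identity
        // Row-vector convention: inverse bind is applied first,
        // then the bone's current world transform.
        palette.append(SCNMatrix4Mult(inverseBind, bone.worldTransform))
    }
    // Bind under a custom uniform name the vertex shader can declare itself.
    let data = palette.withUnsafeBufferPointer { Data(buffer: $0) }
    node.geometry?.firstMaterial?.setValue(data, forKey: "jointPalette")
}
```

Call this per frame (e.g. from a renderer delegate) before the node is drawn, so the palette tracks the animated skeleton.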
Post not yet marked as solved
0 Replies
999 Views
I'm trying to override some skeleton animation using renderer(_:didApplyAnimationsAtTime:); here is a link to the API. According to SCNSceneRendererDelegate, "SceneKit executes actions and performs animations attached to the scene graph" before didApplyAnimationsAtTime is called. However, when I change the animated node's properties (for example, position or eulerAngles), the animation seems to continue to (somehow) change the node's properties, although it shouldn't. How can that be?
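A minimal sketch of overriding a property from that delegate callback, assuming the usual pitfalls: writes here happen on the render thread, an implicit transaction can re-animate the change, and a still-attached CAAnimation will simply reapply its values on the next frame. `AnimationOverrider` and `overrideTarget` are hypothetical names.

```swift
import SceneKit

// A minimal sketch: write the override inside a zero-duration
// transaction after the animation pass. Note that an animation still
// attached to the node will overwrite the value again next frame
// unless it is removed or its influence is blended out.
final class AnimationOverrider: NSObject, SCNSceneRendererDelegate {
    var overrideTarget: SCNNode?

    func renderer(_ renderer: SCNSceneRenderer,
                  didApplyAnimationsAtTime time: TimeInterval) {
        guard let node = overrideTarget else { return }
        SCNTransaction.begin()
        SCNTransaction.disableActions = true   // suppress implicit animation
        node.eulerAngles = SCNVector3(0, 0, 0) // the override
        SCNTransaction.commit()
    }
}
```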
Post marked as solved
2 Replies
589 Views
Can we have an ARKit face-capture session with un-mirrored frames, including un-mirrored face geometry and depth? Currently, an ARSession face capture delivers all of the above data mirrored, and there is no option to change it.
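There is no ARKit option for this, but one possible client-side workaround is to flip the face node about the X axis when un-mirrored geometry is needed for export or analysis. This is only a sketch; `unmirror` is a hypothetical helper, and note that a negative scale also flips triangle winding.

```swift
import SceneKit

// A minimal sketch: mirror the delivered face geometry back by
// negating the node's X scale. Because this flips triangle winding,
// single-sided materials may render inside-out; making the material
// double-sided is one crude compensation.
func unmirror(_ faceNode: SCNNode) {
    faceNode.scale = SCNVector3(-1, 1, 1)
    faceNode.geometry?.firstMaterial?.isDoubleSided = true
}
```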
Post not yet marked as solved
1 Reply
709 Views
Where can I find the original sample code, as specified here: "WWDC20 session 10611: Explore ARKit 4 references a prior version of this sample app that accumulates points in a cloud. For the original version of the app as shown in the session, clone the initial commit from the Git repository in the download's root folder." Can someone send a link to the Git repo?
Post not yet marked as solved
1 Reply
491 Views
Setting up an ARSession with the default ARFaceTrackingConfiguration does not seem to support HDR. I'm checking the CVPixelBuffer in ARFrame.capturedImage using CVPixelBufferGetPixelFormatType and get kCVPixelFormatType_420YpCbCr8BiPlanarFullRange.
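You can check what the device actually offers by enumerating the face-tracking video formats, and decode the FourCC of a delivered buffer to confirm what you're getting. A minimal sketch; `fourCC` is a hypothetical helper, and the observation that face-tracking formats are 8-bit 4:2:0 reflects the question above, not a guarantee for all devices.

```swift
import ARKit

// A minimal sketch: list the capture formats ARKit offers for face
// tracking on this device (resolution and frame rate), then decode a
// pixel-format FourCC such as the one from CVPixelBufferGetPixelFormatType.
for format in ARFaceTrackingConfiguration.supportedVideoFormats {
    print(format.imageResolution, format.framesPerSecond)
}

// Pure helper: turn a CoreVideo pixel-format code into its four-character
// string, e.g. 0x34323066 -> "420f"
// (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange).
func fourCC(_ type: UInt32) -> String {
    let chars = [24, 16, 8, 0].map {
        Character(UnicodeScalar(UInt8((type >> $0) & 0xFF)))
    }
    return String(chars)
}
```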
Post not yet marked as solved
1 Reply
675 Views
Is it possible to create a movie file with the capturedDepthData received from ARFrame? I get an error for the following code:

assetWriter = try AVAssetWriter(outputURL: output, fileType: AVFileType.mov)
videoDepthInput = AVAssetWriterInput(mediaType: .depthData, outputSettings: nil) // no settings, to allow passthrough
assetWriter.add(videoDepthInput)

Error: [AVAssetWriter addInput:] media type dpth is not allowed for this asset writer
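AVAssetWriter rejects a `.depthData` input, so one common workaround is to convert each depth map into a pixel format a normal video track accepts and append it through a pixel-buffer adaptor. This is a sketch under assumptions: `DepthMovieWriter` is a hypothetical helper, error handling and session start/finish are elided, and the float-to-BGRA conversion step is left out.

```swift
import AVFoundation

// A minimal sketch: write depth as an ordinary video track by
// converting each AVDepthData map to a writable pixel buffer and
// appending it via AVAssetWriterInputPixelBufferAdaptor.
final class DepthMovieWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.hevc, // keeps more precision than H.264
            AVVideoWidthKey: width,
            AVVideoHeightKey: height,
        ])
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String:
                    kCVPixelFormatType_32BGRA,
            ])
        writer.add(input)
    }

    func append(depth: AVDepthData, at time: CMTime) {
        // Normalize to 32-bit float disparity first; mapping that float
        // buffer into a BGRA buffer (via CoreImage, vImage, etc.) is
        // left out of this sketch.
        let converted = depth.converting(
            toDepthDataType: kCVPixelFormatType_DisparityFloat32)
        _ = converted.depthDataMap
        // adaptor.append(bgraBuffer, withPresentationTime: time)
    }
}
```

Note the trade-off: encoding depth through a video codec is lossy; for exact depth you would need to store the raw float maps in a sidecar file instead.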