Is there a way to scale a RealityKit ShapeResource?

I can generate a ShapeResource from a RealityKit entity's extents. Can I apply some scaling to the generated shape? Is there a way to do that?

// model is a ModelComponent and bounds is a BoundingBox
var shape = ShapeResource.generateConvex(from: model.mesh)
shape = shape.offsetBy(translation: bounds.center)
// How can I scale the shape to fit within the bounds?

The following API only provides rotation and translation support; I cannot find any support for scaling.

offsetBy(rotation: simd_quatf = simd_quatf(ix: 0, iy: 0, iz: 0, r: 1), translation: SIMD3<Float> = SIMD3<Float>())
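For example (reusing the shape and bounds from the snippet above), rotation and translation work, but there is no scale parameter:

// offsetBy(rotation:translation:) handles rotation and translation only.
shape = shape.offsetBy(
    rotation: simd_quatf(angle: .pi / 2, axis: SIMD3<Float>(0, 1, 0)),
    translation: bounds.center
)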

I can put the ShapeResource on an entity and scale the entity. But, I would like to know if it is possible to scale the ShapeResource itself without attaching it to an entity.
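For context, the entity-based workaround mentioned above would look roughly like this (a sketch; the scale factor is a placeholder):

// Workaround: attach the shape to an entity via a CollisionComponent and scale the entity.
let entity = Entity()
entity.components.set(CollisionComponent(shapes: [shape]))
entity.scale = SIMD3<Float>(repeating: 0.5) // placeholder scale factor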

Answered by hale_xie in 822687022

Use the CPU to calculate the updated shape.

import RealityKit

extension ShapeResource {
    /// Returns a copy of this shape with `transform` applied, including scale,
    /// by transforming the shape's mesh on the CPU and regenerating a convex shape.
    func transform(_ transform: Transform) async throws -> ShapeResource {
        if transform.scale == .one {
            // No scale involved, so the built-in offset API is sufficient.
            return self.offsetBy(rotation: transform.rotation, translation: transform.translation)
        } else {
            // Convert the shape to a mesh, transform every vertex position,
            // then generate a new convex shape from the transformed mesh.
            let mesh = await MeshResource(shape: self)
            var contents = mesh.contents
            contents.models.forEach { model in
                var mutableModel = model
                model.parts.forEach { part in
                    var mutablePositions: [SIMD3<Float>] = []

                    part.positions.forEach { position in
                        let updatedPosition = transform.matrix * SIMD4<Float>(position, 1)
                        mutablePositions.append(SIMD3<Float>(updatedPosition.x,
                                                             updatedPosition.y,
                                                             updatedPosition.z))
                    }

                    var mutablePart = part
                    mutablePart.positions = MeshBuffer<SIMD3<Float>>(mutablePositions)
                    mutableModel.parts.update(mutablePart)
                }
                contents.models.update(mutableModel)
            }
            try await mesh.replace(with: contents)
            return try await ShapeResource.generateConvex(from: mesh)
        }
    }
}
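Called on an existing shape, usage would look something like this (a sketch; the scale value is a placeholder):

// Example: shrink an existing convex shape to half size.
var scaling = Transform()
scaling.scale = SIMD3<Float>(repeating: 0.5) // placeholder scale factor
let scaledShape = try await shape.transform(scaling)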

Hi @hale_xie,

What you have done here is probably the most general approach you could take, so it should work in almost any situation. One thing to be aware of, though: when you extract the x, y, and z components from updatedPosition into a SIMD3<Float>, you should divide that SIMD3<Float> by the w component of the SIMD4<Float> result of applying the transform. It's unlikely you'll have a skew in that matrix, but dividing by w just in case ensures you get the expected results.
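A minimal sketch of that adjustment, using the same names as the loop in the answer:

part.positions.forEach { position in
    let updatedPosition = transform.matrix * SIMD4<Float>(position, 1)
    // Divide by w to homogenize the result, in case the matrix has a projective component.
    let homogenized = SIMD3<Float>(updatedPosition.x, updatedPosition.y, updatedPosition.z) / updatedPosition.w
    mutablePositions.append(homogenized)
}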
