RealityKit is making my iPhone thermalStatus go crazy with no apparent reason

I have the following demo app, in which I run an ARWorldTrackingConfiguration in a RealityKit ARView.

I also use plane detection.

When a plane is detected, I add the ability to "fire" a rectangle onto the plane, with a simple box collision shape attached.

After about 100 squares, the app's thermalStatus changes to .serious and my frame rate drops to 30 fps.
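For context, this is how I'm reading the thermal state (a minimal Foundation-only sketch, nothing RealityKit-specific; the prints are just for logging):

```swift
import Foundation

// ProcessInfo posts thermalStateDidChangeNotification whenever the
// state moves between .nominal, .fair, .serious, and .critical.
let observer = NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    print("Thermal state changed to: \(ProcessInfo.processInfo.thermalState)")
}

// The current state can also be polled directly at any time.
print("Current thermal state: \(ProcessInfo.processInfo.thermalState)")
```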

For the life of me, I can't understand why 100 simple shapes in a RealityKit scene, with no special textures or even collision events, would cause this.

Does anyone have any idea?

PS1: I'm running this on an iPhone XS, which should be able to perform better according to its hardware specifications.

PS2: The full code is below.

import UIKit
import RealityKit
import ARKit

let material = SimpleMaterial(color: .systemPink, isMetallic: false)

// Note: despite the name, this is a flat plane mesh, not a sphere.
let sphere: MeshResource = MeshResource.generatePlane(width: 0.1, depth: 0.1)
let box = ShapeResource.generateBox(width: 0.1, height: 0.03, depth: 0.1)

let collider = CollisionComponent(
    shapes: [box],
    mode: .trigger
)

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    @IBOutlet weak var button: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.vertical, .horizontal]
        configuration.worldAlignment = .camera

        // Opt out of frame semantics we don't need.
        configuration.frameSemantics.remove(.bodyDetection)
        configuration.frameSemantics.remove(.personSegmentation)
        configuration.frameSemantics.remove(.personSegmentationWithDepth)

        // Disable rendering effects to reduce GPU load.
        arView.renderOptions.insert(.disableCameraGrain)
        arView.renderOptions.insert(.disableGroundingShadows)
        arView.renderOptions.insert(.disableHDR)
        arView.renderOptions.insert(.disableMotionBlur)
        arView.renderOptions.insert(.disableFaceOcclusions)
        arView.renderOptions.insert(.disableDepthOfField)
        arView.renderOptions.insert(.disablePersonOcclusion)

        arView.debugOptions = [.showAnchorGeometry, .showStatistics]

        let gesture = UITapGestureRecognizer(target: self,
                                             action: #selector(self.tap(_:)))
        arView.addGestureRecognizer(gesture)

        arView.session.run(configuration, options: [.resetSceneReconstruction])
    }

    @objc func tap(_ sender: UITapGestureRecognizer) {
        let point: CGPoint = sender.location(in: arView)

        guard let query = arView.makeRaycastQuery(from: point,
                                                  allowing: .existingPlaneGeometry,
                                                  alignment: .vertical) else {
            return
        }

        guard let raycastResult = arView.session.raycast(query).first else { return }

        // One new anchor per tap, holding one square with the shared collider.
        let anchor = AnchorEntity(raycastResult: raycastResult)
        let ballEntity = ModelEntity(mesh: sphere, materials: [material])
        ballEntity.collision = collider
        anchor.addChild(ballEntity)

        arView.scene.anchors.append(anchor)
    }

    @IBAction func removePlaneDebugging(_ sender: Any) {
        if arView.debugOptions.contains(.showAnchorGeometry) {
            arView.debugOptions.remove(.showAnchorGeometry)
            button.setTitle("Display planes", for: .normal)
            return
        }

        button.setTitle("Remove planes", for: .normal)
        arView.debugOptions.insert(.showAnchorGeometry)
    }
}
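In case it's relevant: each tap above appends a whole new AnchorEntity, so after 100 taps the scene holds 100 anchors. A sketch of a single-anchor variant I could try instead, where every square is parented under one world anchor and positioned from the raycast hit (the name `place(at:under:)` is just for illustration, and I haven't measured whether this changes the thermal behavior):

```swift
import RealityKit
import ARKit

// Created once (e.g. in viewDidLoad) and appended to arView.scene.anchors.
let worldAnchor = AnchorEntity(world: matrix_identity_float4x4)

// Per tap: reuse the shared globals (sphere, material, collider) and the
// single anchor; only the entity's transform comes from the raycast hit.
func place(at raycastResult: ARRaycastResult, under worldAnchor: AnchorEntity) {
    let entity = ModelEntity(mesh: sphere, materials: [material])
    entity.collision = collider
    entity.transform = Transform(matrix: raycastResult.worldTransform)
    worldAnchor.addChild(entity)
}
```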

Can anyone please assist?