
How to avoid the TrueDepth frame rate dropping when the ThermalState is serious?
iPhone 12 Pro, iOS 15.0.1 (occurs on most iPhone devices)

I am using the TrueDepth camera on the front side, via ARKit, in my own application (ARFrame.capturedDepthData):

```swift
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // get capturedDepthData
    guard let depthData = frame.capturedDepthData else { return }
    ...
}
```

There are many other processes, so the CPU gets hot as soon as I launch the app. When the CPU is hot, the TrueDepth frame rate drops from about 15 fps to about 5 fps.

Is there a way to maintain a constant frame rate? Or is there a way to request a lower resolution? QVGA resolution is enough for my app.

A simple reproduction is attached below. How to use:

1. Create a new project in Xcode 13.0 (select the "App", "Storyboard" template)
2. Replace "ViewController.swift" in the generated code with the code below
3. Prepare a suitable UI in "Main" and connect the outlets
4. Build the app
5. Use another app to heat up the CPU

Captures (screenshots): ThermalState.nominal / ThermalState.serious

```swift
//  ViewController.swift

import Foundation
import UIKit
import ARKit
import RealityKit

class ViewController: UIViewController {
    // Outlets
    @IBOutlet weak var arView: ARView!
    @IBOutlet weak var depthImageView: UIImageView!
    @IBOutlet weak var fpsLabel: UILabel!

    private let fpsManager = FpsManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        arView.session.delegate = self
        arView.session.run(ARFaceTrackingConfiguration())
    }
}

extension ViewController: ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // get capturedDepthData
        guard let depthData = frame.capturedDepthData else { return }
        let depthMap = depthData.depthDataMap
        depthImageView.image = UIImage(ciImage: CIImage(cvPixelBuffer: depthMap))

        // update log
        guard let fps = fpsManager.update() else { return }
        let thermalStateStr = ProcessInfo.processInfo.thermalState.toString()
        let depthWidth = CVPixelBufferGetWidth(depthMap)
        let depthHeight = CVPixelBufferGetHeight(depthMap)
        fpsLabel.text = String(format: "[%d, %d], fps: %2.2f, thermalState: \(thermalStateStr)",
                               depthWidth, depthHeight, fps)
    }
}

class FpsManager {
    private var start = Date()
    private var frames = 0
    private let waitTime: TimeInterval = 1.0

    /// Counts a frame; returns the measured fps about once per second, nil otherwise.
    func update() -> Double? {
        let now = Date()
        let secFromStart = now.timeIntervalSince(start)
        frames += 1
        guard secFromStart >= waitTime else { return nil }
        let fps = Double(frames) / secFromStart
        start = now
        frames = 0
        return fps
    }
}

extension ProcessInfo.ThermalState {
    func toString() -> String {
        switch self {
        case .nominal:
            return "nominal"
        case .fair:
            return "fair"
        case .serious:
            return "serious"
        case .critical:
            return "critical"
        @unknown default:
            return "unknown"
        }
    }
}
```
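As far as I know, ARFaceTrackingConfiguration does not expose a resolution or rate setting for the captured depth stream, so one mitigation is to throttle your own per-frame work and let the device heat up more slowly. A minimal sketch under that assumption (`FrameThrottle` is a hypothetical helper, not an ARKit API):

```swift
import Foundation

/// Drops frames so that downstream processing never exceeds a target rate.
/// Hypothetical helper, not part of ARKit.
final class FrameThrottle {
    private let minInterval: TimeInterval
    private var lastAccepted: TimeInterval = -.infinity

    init(targetFps: Double) {
        minInterval = 1.0 / targetFps
    }

    /// Returns true if this frame should be processed, false to skip it.
    func shouldProcess(at timestamp: TimeInterval) -> Bool {
        guard timestamp - lastAccepted >= minInterval else { return false }
        lastAccepted = timestamp
        return true
    }
}
```

In `session(_:didUpdate:)` you would call `shouldProcess(at: frame.timestamp)` before the heavy per-frame work. This caps your CPU cost per second rather than forcing the sensor to a fixed rate, so it only delays thermal throttling; it cannot prevent it.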
Replies: 2 · Boosts: 0 · Views: 1.4k · Dec ’21
How to use Depth in the Object Capture API?
OS: 12.0 beta

I'm interested in the Object Capture API and am trying to get the sample apps to work:

https://developer.apple.com/documentation/realitykit/taking_pictures_for_3d_object_capture
https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app/

After trying a few times, I noticed that the output doesn't change much even if the depth (.TIF) and gravity (.TXT) files aren't in the folder.

Since I wanted depth to be used, I tried the PhotogrammetrySample API, because I noticed PhotogrammetrySample.depthDataMap. The session was created successfully when I passed the CVPixelBuffer obtained from:

```swift
AVCapturePhoto.depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32).depthDataMap
```

However, the session reports .invalidSample for every id whose sample contains depth:

[command-line app log]
```
Successfully created session. (PhotogrammetrySample API)
Using request: modelFile(url: ***, detail: RealityFoundation.PhotogrammetrySession.Request.Detail.full, geometry: nil)
Invalid Sample! id=1 reason="The sample is not supported."
Invalid Sample! id=2 reason="The sample is not supported."
Invalid Sample! id=3 reason="The sample is not supported."
Invalid Sample! id=4 reason="The sample is not supported."
...
```

What is the reason for "The sample is not supported."? Is there sample code that uses depth in the process?
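One thing worth checking (an assumption on my part, not confirmed by the logs above): PhotogrammetrySample.depthDataMap may expect metric depth (kCVPixelFormatType_DepthFloat32) rather than the disparity format (kCVPixelFormatType_DisparityFloat32) used in the question. Disparity is the reciprocal of depth, so the conversion is a per-pixel 1/x. A minimal sketch of that relationship over a plain Float array (real code would convert the CVPixelBuffer itself):

```swift
import Foundation

/// Converts disparity values (1/m) to depth values (m).
/// Sketch only: illustrates the DisparityFloat32 -> DepthFloat32 relationship.
func disparityToDepth(_ disparity: [Float]) -> [Float] {
    disparity.map { d in
        // Guard against divide-by-zero holes in the disparity map.
        d > 0 ? 1.0 / d : 0
    }
}
```

With AVDepthData there is no need to do this by hand: `converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)` produces a depth-format buffer directly, which would be the simpler thing to try first.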
Replies: 2 · Boosts: 0 · Views: 1.2k · Nov ’21