SceneKit


Create 3D games and add 3D content to apps using high-level scene descriptions with SceneKit.

Posts under SceneKit tag

72 Posts

Why does the planeNode in SceneKit flicker when using a class property instead of a local variable?
I am working on a SceneKit project where I use a CAShapeLayer as the content for an SCNMaterial's diffuse.contents to display a progress bar. Here's my initial code:

func setupProgressWithCAShapeLayer() {
    let progressLayer = createProgressLayer()
    progressBarPlane?.firstMaterial?.diffuse.contents = progressLayer
    DispatchQueue.main.async {
        var progress: CGFloat = 0.0
        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { timer in
            progress += 0.01
            if progress > 1.0 {
                progress = 0.0
            }
            progressLayer.strokeEnd = progress // Update progress
        }
    }
}

// MARK: - ARSCNViewDelegate
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    progressBarPlane = SCNPlane(width: 0.2, height: 0.2)
    setupProgressWithCAShapeLayer()
    let planeNode = SCNNode(geometry: progressBarPlane)
    planeNode.position = SCNVector3(x: 0, y: 0.2, z: 0)
    node.addChildNode(planeNode)
}

This works fine, and the progress bar updates smoothly. However, when I change the code to use a class property (self.progressLayer) instead of a local variable, the rendering starts flickering on the screen:

func setupProgressWithCAShapeLayer() {
    self.progressLayer = createProgressLayer()
    progressBarPlane?.firstMaterial?.diffuse.contents = progressLayer
    DispatchQueue.main.async { [weak self] in
        var progress: CGFloat = 0.0
        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] timer in
            progress += 0.01
            if progress > 1.0 {
                progress = 0.0
            }
            self?.progressLayer?.strokeEnd = progress // Update progress
        }
    }
}

After this change, the progressBarPlane in SceneKit starts flickering while being rendered on the screen. My question: why does switching from a local variable (progressLayer) to a class property (self.progressLayer) cause the flickering in SceneKit rendering?
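One thing sometimes worth ruling out when a CAShapeLayer drives a material's contents is the layer's implicit animation on strokeEnd. A minimal sketch, assuming the poster's progressLayer setup (the helper name is made up, and this is not a confirmed fix for the property-vs-local difference described above):

import SceneKit
import QuartzCore

// Hypothetical helper: apply a progress value to a CAShapeLayer used as
// SCNMaterial contents, with implicit CALayer animations disabled so each
// timer tick lands as a single, discrete update.
func applyProgress(_ value: CGFloat, to layer: CAShapeLayer) {
    CATransaction.begin()
    CATransaction.setDisableActions(true) // suppress the implicit strokeEnd animation
    layer.strokeEnd = min(max(value, 0), 1)
    CATransaction.commit()
}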
Replies: 0 · Boosts: 0 · Views: 56 · Activity: 2d
Cannot assign AVCaptureDevice to SCNMaterialProperty.contents
I want to apply an SCNTechnique pipeline to the camera feed. To achieve this, I want to bring the camera input into the SceneKit world. The perfect API seems to be:

let captureDevice = …
scnScene.background.contents = captureDevice

This is demonstrated in "SceneKit: What's New" (WWDC17) (at 44m19s) and is mentioned in the documentation of SCNMaterialProperty's contents. Instead of showing the camera feed, it crashes with these messages:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureVideoDataOutput setVideoSettings:] Unsupported pixel format type - use -availableVideoCVPixelFormatTypes'
*** First throw call stack:
(0x18993c7cc <REDACTED> 0x211e18488)
libc++abi: terminating due to uncaught exception of type NSException

Please advise.

STEPS TO REPRODUCE
Create a new Xcode project, starting from the SceneKit game template.
Add an Info.plist entry for NSCameraUsageDescription.
Add a capture device property to GameViewController:

class GameViewController: UIViewController {
    let captureDevice = AVCaptureDevice.default(for: .video)

Set the background contents:

scene.background.contents = captureDevice

Run the app on device.

PLATFORM AND VERSION
iOS
Development environment: Xcode 16.1, macOS 15.0.1.
Run-time configuration: iOS 18.1
Replies: 1 · Boosts: 0 · Views: 164 · Activity: 1w
SKTexture used for SceneKit object is rendered too bright
I would like to preload and use some images for both SpriteKit and SceneKit models (my game uses SceneKit with a SpriteKit overlay), and as far as I can see the only efficient way would be to create and preload SKTexture objects, which can be supplied to SKSpriteNode(texture:) and SCNMaterial.diffuse.contents. The problem is that SKTextures are rendered too bright in SceneKit, for some unknown reason. Here is a comparison between rendering an image (from URL) and an SKTexture:

And the code that produces it:

let url = Bundle.main.url(forResource: "art.scnassets/texture.png", withExtension: nil)!

let plane1 = SCNPlane(width: 10, height: 10)
plane1.firstMaterial!.diffuse.contents = url.path
let node1 = SCNNode(geometry: plane1)
node1.position.x = -5
scene.rootNode.addChildNode(node1)

let plane2 = SCNPlane(width: 10, height: 10)
plane2.firstMaterial!.diffuse.contents = SKTexture(image: NSImage(byReferencing: url))
let node2 = SCNNode(geometry: plane2)
node2.position.x = 5
scene.rootNode.addChildNode(node2)

This issue was already mentioned in this other post, but since I wasn't notified of the reply from Quinn asking about the feedback number I created at the time, it didn't make any progress.
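Since the image-based path renders correctly in the comparison above, one possible workaround is to preload the image once and hand the raw NSImage to SceneKit, wrapping it in an SKTexture only for the SpriteKit side. This is only a sketch under that assumption, not a fix for the SKTexture brightness behaviour itself:

import SceneKit
import SpriteKit
import AppKit

// Sketch: share one preloaded image between SceneKit and SpriteKit.
let url = Bundle.main.url(forResource: "art.scnassets/texture.png", withExtension: nil)!
let image = NSImage(byReferencing: url)

let material = SCNMaterial()
material.diffuse.contents = image // SceneKit gets the plain image

let overlaySprite = SKSpriteNode(texture: SKTexture(image: image)) // SpriteKit gets the SKTexture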
Replies: 5 · Boosts: 0 · Views: 244 · Activity: 2w
SceneKit app seriously hangs when run in fullscreen
I've been running my SceneKit game for many weeks in Xcode without performance issues. The game itself is finished, so I thought I could go on with publishing it on the App Store, but when archiving it in Xcode and running the archived app, I noticed that it seriously hangs. The hangs only seem to happen when I run the game in fullscreen mode. I tried disabling Game Mode, but the hangs still happen. Only when I run in windowed mode does the game run smoothly. Instruments confirms that there are many serious hangs, but it also reports that CPU usage is quite low during those hangs, on average about 15%. From what I know, hangs happen when the main thread is busy, so how can that be when CPU usage is so low, and why does it only happen in fullscreen mode for release builds?
Replies: 2 · Boosts: 0 · Views: 266 · Activity: 1w
SceneKit custom physics fields using wrong position?
In the simplest case I can come up with, I create a scene (either fully or partially in code) with a single dynamic body, located slightly away from the origin. I give the body a charge as well as adding an electric field to the node. The body does nothing (as expected, since it's the source of the field). However, if I replace that field with a custom field (which does nothing except report back the passed-in position value), the position shown is the location of the body in the local space of its parent (in this case, the root node) rather than of the node the field is attached to (i.e. itself). I've attached the code customising the SwiftUI app template. Hopefully someone can tell me what I'm doing wrong?

ContentView customisation…

struct ContentView: View {
    var body: some View {
        SceneView(scene: ElectricScene(), options: [.allowsCameraControl, .autoenablesDefaultLighting])
    }
}

And the code to create the scene…

import Foundation
import SceneKit

class ElectricScene: SCNScene {
    override init() {
        super.init()
        physicsWorld.gravity = SCNVector3(0, 0, 0)

        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3(0, 0, 10)
        rootNode.addChildNode(cameraNode)

        let ballNode = SCNNode(geometry: SCNSphere(radius: 0.5))
        ballNode.position = SCNVector3(2, 0, 0)
        ballNode.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
        ballNode.physicsBody?.charge = -1
        rootNode.addChildNode(ballNode)

//        ballNode.physicsField = SCNPhysicsField.electric()
        ballNode.physicsField = SCNPhysicsField.customField { position, _, _, _, _ in
            print(position)
            return SCNVector3Zero
        }
    }

    @available(*, unavailable)
    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

This (repeatedly) prints out the following…

SCNVector3(x: 2.0, y: 0.0, z: 0.0)

…which is the position of the node relative to the root node, rather than relative to the source of the field (itself).
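For what it's worth, here is a sketch of re-expressing the reported position in the field node's own space, assuming the value really is delivered in the root node's space as observed above. Whether the built-in fields expect exactly this is not confirmed, and touching SCNNode from the physics callback may itself be thread-sensitive:

import SceneKit

// Sketch: wrap the logging field so its samples are converted into the
// coordinate space of the node the field is attached to.
func loggingField(attachedTo fieldNode: SCNNode, sceneRoot: SCNNode) -> SCNPhysicsField {
    SCNPhysicsField.customField { position, _, _, _, _ in
        let local = fieldNode.convertPosition(position, from: sceneRoot)
        print("sample in field-node space:", local)
        return SCNVector3Zero // no force applied; this field only logs
    }
}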
Replies: 1 · Boosts: 0 · Views: 248 · Activity: 1w
SKAction.removeFromParent() causes crash when run in SCNView.overlaySKScene on iOS
Even when the action is run on the main thread, the following code causes a crash on iOS, but not on macOS. The game launches with a simple yellow rectangle, and when it finishes fading out and should be removed from the overlay scene, the app crashes. The code can be pasted into the file GameController.swift of Xcode's default project for a Multiplatform macOS and iOS game.

import SceneKit
import SpriteKit

@MainActor
class GameController: NSObject {
    let scene: SCNScene
    let sceneRenderer: SCNSceneRenderer

    init(sceneRenderer renderer: SCNSceneRenderer) {
        sceneRenderer = renderer
        scene = SCNScene(named: "Art.scnassets/ship.scn")!
        super.init()
        sceneRenderer.scene = scene
        renderer.overlaySKScene = SKScene(size: CGSize(width: 500, height: 500))
        DispatchQueue.main.async {
            let node = SKShapeNode(rect: CGRect(x: 100, y: 100, width: 100, height: 100))
            node.fillColor = .yellow
            node.run(.sequence([
                .fadeOut(withDuration: 1),
                .removeFromParent()
            ]))
            renderer.overlaySKScene!.addChild(node)
        }
    }
}

The Xcode console shows this stacktrace:

*** Assertion failure in -[UIApplication _performAfterCATransactionCommitsWithLegacyRunloopObserverBasedTiming:block:], UIApplication.m:3246
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Call must be made on main thread'
*** First throw call stack:
(
0   CoreFoundation          0x00000001804ae0f8 __exceptionPreprocess + 172
1   libobjc.A.dylib         0x0000000180087db4 objc_exception_throw + 56
2   Foundation              0x0000000180d17058 _userInfoForFileAndLine + 0
3   UIKitCore               0x00000001853cf678 -[UIApplication _performAfterCATransactionCommitsWithLegacyRunloopObserverBasedTiming:block:] + 376
4   UIKitCore               0x000000018553f7a0 -[_UIFocusUpdateThrottle scheduleProgrammaticFocusUpdate] + 300
5   UIKitCore               0x0000000184e2e22c -[UIFocusSystem _requestFocusUpdate:] + 548
6   UIKitCore               0x0000000184e2dfa4 -[UIFocusSystem requestFocusUpdateToEnvironment:] + 76
7   UIKitCore               0x0000000184e2e864 -[UIFocusSystem _focusEnvironmentWillDisappear:] + 408
8   SpriteKit               0x00000001a3d472f4 _ZL12_removeChildP6SKNodeS0_P7SKScene + 240
9   SpriteKit               0x00000001a3d473b0 -[SKNode removeChild:] + 80
10  SpriteKit               0x00000001a3d466b8 -[SKNode removeFromParent] + 128
11  SpriteKit               0x00000001a3d1678c -[SKRemove updateWithTarget:forTime:] + 64
12  SpriteKit               0x00000001a3d1b740 _ZN11SKCSequence27cpp_updateWithTargetForTimeEP7SKCNoded + 84
13  SpriteKit               0x00000001a3d20e3c _ZN7SKCNode6updateEdf + 156
14  SpriteKit               0x00000001a3d20f20 _ZN7SKCNode6updateEdf + 384
15  SpriteKit               0x00000001a3d26fb8 -[SKScene _update:] + 464
16  SpriteKit               0x00000001a3cf3168 -[SKSCNRenderer _update:] + 80
17  SceneKit                0x000000019c932bf0 -[SCNMTLRenderContext renderSKSceneWithRenderer:overlay:atTime:] + 60
18  SceneKit                0x000000019c9ebd98 -[SCNRenderer _drawOverlaySceneAtTime:] + 204
19  SceneKit                0x000000019cb1a1c0 _ZN3C3D11OverlayPass7executeERKNS_10RenderArgsE + 60
20  SceneKit                0x000000019c8e05ec _ZN3C3D13__renderSliceEPNS_11RenderGraphEPNS_10RenderPassERtRKNS0_9GraphNodeERPNS0_5StageENS_10RenderArgsEbRPU27objcproto16MTLCommandBuffer11objc_object + 2660
21  SceneKit                0x000000019c8e18ac _ZN3C3D11RenderGraph7executeEv + 3808
22  SceneKit                0x000000019c9ed26c -[SCNRenderer _renderSceneWithEngineContext:sceneTime:] + 756
23  SceneKit                0x000000019c9ed544 -[SCNRenderer _drawSceneWithNewRenderer:] + 208
24  SceneKit                0x000000019c9ed9fc -[SCNRenderer _drawScene:] + 40
25  SceneKit                0x000000019c9edce4 -[SCNRenderer _drawAtTime:] + 500
26  SceneKit                0x000000019ca87950 -[SCNView _drawAtTime:] + 368
27  SceneKit                0x000000019c943b74 __83-[NSObject(SCN_DisplayLinkExtensions) SCN_setupDisplayLinkWithQueue:screen:policy:]_block_invoke + 44
28  SceneKit                0x000000019ca50600 -[SCNDisplayLink _displayLinkCallbackReturningImmediately] + 132
29  libdispatch.dylib       0x000000010239173c _dispatch_client_callout + 16
30  libdispatch.dylib       0x0000000102394c14 _dispatch_continuation_pop + 756
31  libdispatch.dylib       0x00000001023aa4e0 _dispatch_source_invoke + 1736
32  libdispatch.dylib       0x00000001023997f0 _dispatch_lane_serial_drain + 340
33  libdispatch.dylib       0x000000010239a774 _dispatch_lane_invoke + 420
34  libdispatch.dylib       0x00000001023a71a8 _dispatch_root_queue_drain_deferred_wlh + 324
35  libdispatch.dylib       0x00000001023a6604 _dispatch_workloop_worker_thread + 488
36  libsystem_pthread.dylib 0x000000010242bb74 _pthread_wqthread + 284
37  libsystem_pthread.dylib 0x000000010242a934 start_wqthread + 8
)
libc++abi: terminating due to uncaught exception of type NSException

Am I doing something wrong?
Replies: 6 · Boosts: 0 · Views: 279 · Activity: 1w
Can SceneKit be used with Swift 6 Concurrency?
I am trying to port SceneKit projects to Swift 6, and I just can't figure out how that's possible. I'm even starting to think SceneKit and Swift 6 concurrency just don't match, and SceneKit projects should - hopefully for the time being only - stick to Swift 5.

The SCNSceneRendererDelegate methods are called on the SceneKit thread. If the delegate is a view controller:

class GameViewController: UIViewController {
    let aNode = SCNNode()

    func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
        aNode.position.x = 10
    }
}

then the compiler generates the error "Main actor-isolated instance method 'renderer(_:updateAtTime:)' cannot be used to satisfy nonisolated protocol requirement", which is fully understandable. The compiler even tells you those methods can't be used for protocol conformance, unless:

Conformance is declared as @preconcurrency SCNSceneRendererDelegate, like this:

class GameViewController: UIViewController, @preconcurrency SCNSceneRendererDelegate {

But that just delays the check to runtime, and therefore a crash on the SceneKit thread happens at runtime... Again, fully understandable.

Or the delegate method is declared nonisolated, like this:

nonisolated func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
    aNode.position.x = 10
}

which generates the compiler error "Main actor-isolated property 'position' can not be mutated from a nonisolated context". Again fully understandable.

If the delegate is not a view controller but a nonisolated class, we also have the problem that SCNNode can't be used. Nearly 100% of the SCNSceneRendererDelegate implementations I've seen do use SCNNode or similar MainActor-bound types, because they are meant for that.

So, where am I wrong? What is the solution to use SceneKit's SCNSceneRendererDelegate methods with full Swift 6 compilation? Is that even possible for now?
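One pattern that compiles under Swift 6 is to keep the requirement nonisolated and hop back to the main actor for the node mutation. This is only a sketch of that trade-off (the update no longer happens synchronously inside the render callback), not an officially sanctioned answer:

import SceneKit
import UIKit

// Sketch: satisfy the nonisolated protocol requirement, then schedule the
// main-actor-bound SCNNode mutation asynchronously.
class GameViewController: UIViewController, SCNSceneRendererDelegate {
    let aNode = SCNNode()

    nonisolated func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
        Task { @MainActor in
            self.aNode.position.x = 10 // runs on the main actor, one hop after the render callback
        }
    }
}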
Replies: 4 · Boosts: 0 · Views: 377 · Activity: 2h
Particle burst with exact amount of particles
Hello dear forum, I need to emit an exact number of particles from an SCNParticleSystem in a burst (this is for a UI effect). This worked perfectly for me in the scene editor by setting the birth rate to the desired amount and the emission duration to 0. Sadly, when I either load such a particle system from a scene or create it in code, the number of emitted particles is sometimes one less, or one more. The first time I run it in the simulator it seems to work fine, but then the number of particles varies as described. Video: https://youtube.com/shorts/MRzqWBy2ypA?feature=share Does anybody know how to make this predictable? Thanks so much in advance, Seb
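For reference, a sketch of the same burst configured in code, mirroring the editor settings described above (the property values are assumptions; this does not by itself explain the off-by-one variation):

import SceneKit

// Sketch: a one-shot burst, set up the way the post describes the editor
// configuration (birth rate = desired count, emission duration = 0).
func makeBurst(count: CGFloat) -> SCNParticleSystem {
    let system = SCNParticleSystem()
    system.birthRate = count        // intended exact particle count
    system.emissionDuration = 0     // emit everything in a single burst
    system.loops = false            // fire once, then stop
    system.particleLifeSpan = 1.0
    return system
}

// Usage sketch: someNode.addParticleSystem(makeBurst(count: 24))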
Replies: 1 · Boosts: 0 · Views: 425 · Activity: Oct ’24
Xcode 16.0 and 16.1 regression: only allows .scn files to be added to .scnassets group
There is a significant regression in Xcode 16.0 and 16.1 (16B5014f): it no longer allows asset files other than .scn files to be added to a .scnassets group and folder. This really wrecks our ability to maintain our app, since we heavily depend on additional assets (.USDZ, .JPEG, .PNG, etc.) being included in the group for on-demand resource tagging and loading. I filed FB15239224, but I'm wondering if this is intentional and if anyone knows of a workaround.
Replies: 0 · Boosts: 0 · Views: 227 · Activity: Sep ’24
How can I pause ARView?
In my app, I have an ARView that has cameraMode set to nonAR. I occasionally hide the ARView when it is not needed and reveal it again later. While the ARView is hidden, I'd like to pause the animation to save iPhone battery life. I'd also like to do this when I know that animation in my scene has paused and the contents of the view, although still visible, are static. This was possible using SceneKit, but I can't seem to find an equivalent way to do it using RealityKit. At least as of iOS 18, a hidden ARView with an empty scene appears to use approximately 30% of the CPU. How can I pause ARView so that it won't use the battery unnecessarily? Thank you for considering this question.
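For comparison, the SceneKit-side behaviour the post refers to boils down to a couple of SCNView properties. This sketch is only that reference point, not a RealityKit answer (ARView does not appear to expose a documented equivalent, which is what the question is about):

import SceneKit

// Sketch: stop an SCNView from advancing and redrawing while it is hidden.
func setPaused(_ paused: Bool, for scnView: SCNView) {
    scnView.isPlaying = !paused          // stop or resume advancing scene time
    scnView.rendersContinuously = false  // redraw only when something changes
}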
Replies: 2 · Boosts: 0 · Views: 493 · Activity: Sep ’24
SceneKit app randomly crashes with EXC_BAD_ACCESS in jet_context::set_fragment_texture
Every now and then my SceneKit game app crashes, and I have no idea why. The SCNView has an overlaySKScene, so it might also be SpriteKit's fault. The stack trace is:

#0  0x0000000241c1470c in jet_context::set_fragment_texture(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, jet_texture*) ()
#27 0x000000010572fd40 in _pthread_wqthread ()

Does anyone have an idea where I could start debugging this, without being able to consistently reproduce it?
Replies: 11 · Boosts: 0 · Views: 598 · Activity: 1w
RoomPlan: textures for walls and floors.
After scanning the room, I use the .export method, passing a ModelProvider. Then I import the USDZ into an SCNView and continue processing the scene. I would like to apply a texture to the walls and floor, but I can't because they don't contain texture coordinates, and creating them from the USDZ file is not easy. I tried to combine the CapturedRoom data for the walls and floor only, adding the texture coordinates myself. I'm managing, but I'm struggling a lot. Isn't there an easier way to do it? Is there a ModelProvider planned for surfaces in the future? If so, where can I access the RoomPlan beta documentation?
Replies: 1 · Boosts: 1 · Views: 378 · Activity: Sep ’24
How to convert FBX to USDZ in-app
Hi everyone, I'm looking for a way to convert an FBX file to USDZ directly within my iOS app. I'm aware of Reality Converter and the Python USDZ converter tool, but I haven't been able to find any documentation on how to do this directly within the app (assuming the user can upload their own file). Any guidance on how to achieve this would be greatly appreciated. I've heard about Model I/O and SceneKit, but I haven't found much information on using them for this purpose either. Thanks!
Replies: 0 · Boosts: 1 · Views: 392 · Activity: Aug ’24
Camera control in an immersive environment
Hello, I’m playing around with making a fully immersive multiplayer, air-to-air dogfighting game, but I’m having trouble figuring out how to attach a camera to an entity. I have a plane that’s controlled with a gamepad, and I want the camera’s position to be pinned to that entity as it moves about space, while maintaining the user’s ability to look around. Is this possible?

From my understanding, the current state of SceneKit, ARKit, and RealityKit is a bit confusing with what can and cannot be done.

SceneKit
- Full control of the camera
- Not sure if it can use RealityKit's ECS system
- 2D window - missing full immersion

ARKit
- Full control of the camera* - but only for non-Vision Pro devices, since visionOS doesn't have an ARView
- Has RealityKit's ECS system
- 2D window - missing full immersion

RealityKit
- Camera is pinned to the device's position and orientation
- Has RealityKit's ECS system
- Allows full immersion
Replies: 0 · Boosts: 1 · Views: 389 · Activity: Aug ’24
Crash on iOS 18 beta 5 when adding an SCNAnimation animationDidStop closure with Swift 6 selected
I'm trying to update my projects to use Swift 6. If I change the project settings to use Swift 6, then my app crashes when I add a closure to the SCNAnimation animationDidStop property. The error is inside the SceneKit renderingQueue and indicates that the callback is being called on the wrong queue. Maybe I need to do something in the code to fix this, but I can't seem to make it work - maybe a SceneKit bug?

If you create a new game template in Xcode using SceneKit and replace the contents of GameViewController.swift with the following, you will see the app crash after it is launched.

import UIKit
import SceneKit

class GameViewController: UIViewController {

    let player: SCNAnimationPlayer = {
        let a = CABasicAnimation(keyPath: "opacity")
        return SCNAnimationPlayer(animation: SCNAnimation(caAnimation: a))
    }()

    override func viewDidLoad() {
        super.viewDidLoad()

        let scnView = self.view as! SCNView
        scnView.scene = SCNScene()

        // Change the project settings to use Swift6
        // Setting this closure will then cause a _dispatch_assert_queue_fail
        // EXC_BREAKPOINT error in the scenekit.renderingQueue.SCNView queue,
        // the only thing on the stack is:
        // "%sBlock was %sexpected to execute on queue [%s (%p)]"
        player.animation.animationDidStop = { (a: SCNAnimation, b: SCNAnimatable, c: Bool) in
            print("stopped")
        }

        scnView.scene?.rootNode.addAnimationPlayer(player, forKey: nil)
        player.play()
    }
}
Replies: 2 · Boosts: 2 · Views: 493 · Activity: Aug ’24
GameKit Access Point causes camera background image in ARKit to be black in iOS 18 beta only
I have an AR game using ARKit with SceneKit that works just fine in iOS 17. In the iOS 18 betas, the AR background image shows black instead of showing the real world. As a result there's no tracking, and obviously the whole game is useless.

I narrowed down the issue to showing the Game Center Access Point. My app has ViewController 1 (VC1) showing the main menu, and that's where I want to show the GC Access Point. From there you open VC2, which shows a list of levels. Selecting any level will open VC3, which has the ARScene.

Following is the code I use to start Game Center in VC1:

GKLocalPlayer.local.authenticateHandler = { gcAuthVC, error in
    let isGameCenterReady = (gcAuthVC == nil) && (error == nil)
    if let viewController = gcAuthVC {
        self.present(viewController, animated: true, completion: nil)
    }
    if error != nil {
        print(error?.localizedDescription ?? "")
    }
    if isGameCenterReady {
        GKAccessPoint.shared.location = .topLeading
        GKAccessPoint.shared.showHighlights = true
        GKAccessPoint.shared.isActive = true
    }
}

When switching to VC2 I run GKAccessPoint.shared.isActive = false so that the Access Point will no longer show in any of the following VCs. I tried running it in VC1, VC2, and again in VC3 - it doesn't change anything. Once I reach VC3, the background is black.

If in VC1 I don't run GKAccessPoint.shared.isActive = true, so I don't activate the Access Point, the behavior is as follows:

If I wait until after the Game Center login animation completes and closes on its own, and then I proceed to VC2 and VC3, the camera image will show correctly.

If I quickly move to VC2 before the Game Center login animation has completed, so my code will close it by setting active = false, and then I continue to VC3, I will see the black background problem.

So it does look like activating the Access Point and then de-activating it causes the issue. BTW, if I activate the Access Point and leave it on in all VCs, the same black background issue persists.

Other than that, when I'm in VC3 with the black background and I switch to another app (so my game moves to the background), when it returns to the foreground the camera suddenly shows the real world correctly! I tried to manually reset the AR session by pausing and restarting it, but that didn't change anything. Also, when I check with the debugger, it looks like when the app comes back to the foreground it doesn't run the session start code. But something does seem to reset itself, so I wonder what that is. Maybe I could trigger the same manually in my code?

I repeat that everything works just fine in iOS 17 and below. This problem only started with the iOS 18 beta (currently on beta 5, but it started in some of the previous betas as well). So could this be a bug in iOS 18?

As a workaround I could check the iOS version and, if it's iOS 18, not activate the Access Point (hoping that the user will not jump to VC2 too quickly) and show my own button which will open Game Center. But I'd rather give the users the full experience with their own avatar and the highlights showing up. Plus, certainly some users will move quickly to VC2 and that will be an awful experience.

Any help would be greatly appreciated. Thanks!
Replies: 2 · Boosts: 0 · Views: 484 · Activity: Aug ’24
Failing to draw a window correctly using SceneKit
Hi guys, I'm integrating the RoomPlan framework into my app. I'm able to scan a room and extract the nodes from the CaptureStructure object. So far I can rebuild the 3D object in the SceneView, but I can't render the openings and the windows correctly. I'm struggling to add these two objects correctly to the wall, in order to make the wall transparent where they are supposed to be. If I export the CaptureStructure into a usda file and then load it directly in the SceneView, all the doors, windows and openings are correctly rendered, therefore I do believe that I'm doing something wrong. Could you please tell me what I'm doing wrong? I added a screenshot of my problem here. I also have a prototype, which you can run and see the problem I'm talking about: https://github.com/renanstig/3d-scenekit-prototype
Replies: 1 · Boosts: 0 · Views: 519 · Activity: Aug ’24
How to add a sort of engraving type of text to a mesh in SceneKit?
I'm trying to figure out how to basically engrave some text into this ellipsoid mesh. So far the only thing I've learned that can sort of come close to what I'm looking for is SCNText, but it floats above the ellipsoid and doesn't conform to the angular shape.

let allocator = MTKMeshBufferAllocator(device: MTLCreateSystemDefaultDevice()!)
let disc = MDLMesh.newEllipsoid(
    withRadii: vector_float3(Float(discDiameter/2), Float(discDiameter/2), Float(discThickness/2)),
    radialSegments: 64,
    verticalSegments: 64,
    geometryType: .triangles,
    inwardNormals: false,
    hemisphere: false,
    allocator: allocator
)
let discGeometry = SCNGeometry(mdlMesh: disc)
let material = createIridescentMaterial()
discGeometry.materials = [material]
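Short of modifying the geometry itself, one texture-based alternative is to draw the text into an image and let the ellipsoid's UV mapping wrap it around the surface, for example as the material's diffuse (or normal) contents. This is only a sketch of that idea, assuming an iOS/UIKit setup; the function name and sizes are made up:

import SceneKit
import UIKit

// Sketch: rasterise the text into an image that can be used as material
// contents; the ellipsoid's UVs then make it follow the curved surface.
func engravingImage(text: String, size: CGSize = CGSize(width: 1024, height: 512)) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { context in
        UIColor.white.setFill()
        context.fill(CGRect(origin: .zero, size: size))
        let attributes: [NSAttributedString.Key: Any] = [
            .font: UIFont.systemFont(ofSize: 120, weight: .bold),
            .foregroundColor: UIColor.darkGray
        ]
        let string = text as NSString
        let textSize = string.size(withAttributes: attributes)
        string.draw(at: CGPoint(x: (size.width - textSize.width) / 2,
                                y: (size.height - textSize.height) / 2),
                    withAttributes: attributes)
    }
}

// Usage sketch: material.diffuse.contents = engravingImage(text: "SAMPLE")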
Replies: 2 · Boosts: 0 · Views: 636 · Activity: Aug ’24
Convert CGPoint into SCNVector3
I want to convert a CGPoint into an SCNVector3. I am using ARFaceTrackingConfiguration for face tracking. Below is my code to convert an SCNVector3 to a CGPoint:

let point = faceAnchor.verticeAndProjection(to: sceneView, facePoint: faceAnchor.geometry.vertices[0])
print(point, faceAnchor.geometry.vertices[0])

which prints the values below:

CGPoint = (350.564453125, 643.4456787109375)
SIMD3<Float>(0.014480735, 0.01397189, 0.04508282)

extension ARFaceAnchor {
    // struct to store the 3d vertex and the 2d projection point
    struct VerticesAndProjection {
        var vertex: SIMD3<Float>
        var projected: CGPoint
    }

    // return a struct with vertices and projection
    func verticeAndProjection(to view: ARSCNView, facePoint: Int) -> CGPoint {
        let point = SCNVector3(geometry.vertices[facePoint])
        let col = SIMD4<Float>(SCNVector4())
        let pos = SIMD4<Float>(SCNVector4(point.x, point.y, point.z, 1))
        let pworld = transform * simd_float4x4(col, col, col, pos)
        let vect = view.projectPoint(SCNVector3(pworld.position.x, pworld.position.y, pworld.position.z))
        let p = CGPoint(x: CGFloat(vect.x), y: CGFloat(vect.y))
        return p
    }
}

extension matrix_float4x4 {
    /// Get the position of the transform matrix.
    public var position: SCNVector3 {
        get {
            return SCNVector3(self[3][0], self[3][1], self[3][2])
        }
    }
}

Now I want to convert the same CGPoint to an SCNVector3. I tried using the code below, but it is not giving the expected values, which are SIMD3<Float>(0.014480735, 0.01397189, 0.04508282):

let projectedOrigin = sceneView.projectPoint(SCNVector3Zero)
let unproject = sceneView.unprojectPoint(SCNVector3(point.x, point.y, CGFloat(projectedOrigin.z)))
let vector = SCNVector3(unproject.x, unproject.y, unproject.z)

Is there any way to convert a CGPoint to an SCNVector3? I cannot use hitTest because this CGPoint is not present on the node. It is present somewhere on the face area.
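As a round-trip sketch of what unprojectPoint expects (this reuses the poster's view name and does not resolve the face-tracking specifics): projectPoint returns the normalized depth in its z component, so unprojecting with that same depth, rather than the depth of the projected origin, recovers the original world position:

import ARKit
import SceneKit

// Sketch: project a world-space point, then unproject the 2D point using the
// projected point's own normalized depth to get the world position back.
func roundTrip(_ worldPoint: SCNVector3, in sceneView: ARSCNView) -> SCNVector3 {
    let projected = sceneView.projectPoint(worldPoint) // x, y in view points; z = normalized depth
    let screenPoint = CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
    return sceneView.unprojectPoint(SCNVector3(screenPoint.x,
                                               screenPoint.y,
                                               CGFloat(projected.z)))
}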
Replies: 3 · Boosts: 0 · Views: 472 · Activity: Jul ’24
USD to SceneKit conversion removes normals
As the title suggests, clicking the "Export to SceneKit" button indeed converts a USD to .scn, but it removes the normals in the process if the mesh has blend shapes. When I export the same file without any blend shapes / morph targets, the normals stay on as expected. If I try to create normals in the SceneKit editor (adding them as a new geometry source), Xcode crashes (no matter whether there are blend shapes or not). I've tried loading the resulting scene with [SCNSceneSource.LoadingOption.createNormalsIfAbsent : true], but this doesn't change anything either. I suppose this is a bug? My last resort is to load my character without any blend shapes and then add the targets from a different scene. Thanks for any insight! seb
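If regenerating the missing normals after import is acceptable, one possible workaround is to round-trip the geometry through Model I/O. This is only a sketch; it has not been verified to play well with blend-shape targets, which is exactly where the export problem appears:

import SceneKit
import ModelIO
import SceneKit.ModelIO

// Sketch: rebuild a geometry's normals from its faces via Model I/O.
func geometryWithGeneratedNormals(from geometry: SCNGeometry) -> SCNGeometry {
    let mesh = MDLMesh(scnGeometry: geometry)
    mesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.5)
    return SCNGeometry(mdlMesh: mesh)
}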
Replies: 0 · Boosts: 0 · Views: 455 · Activity: Jul ’24