Greetings!
I have used Apple's ARKit documentation to create a simple ARKit application that uses SceneKit (I tried Metal too).
I am currently unsure how to use smoothedSceneDepth (or sceneDepth) to acquire the depth data from the depthMap available on the frame in the view.
Is there a particular method or approach I can use to access this data for displaying the depth?
I would be grateful for any inputs or suggestions.
Thanks in advance.
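In case it's useful, here is a minimal sketch of how the depth data can be read from a frame, assuming the .smoothedSceneDepth frame semantic has been enabled on the configuration (the depthMap is a CVPixelBuffer of 32-bit floats, one depth value in meters per pixel):

import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    func run(on session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        // Smoothed scene depth must be requested explicitly (LiDAR devices only).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.smoothedSceneDepth) {
            configuration.frameSemantics.insert(.smoothedSceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.smoothedSceneDepth?.depthMap else { return }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // Sample the depth at the center of the map.
        let row = base.advanced(by: (height / 2) * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        print("Depth at center: \(row[width / 2]) m")
    }
}

From there you can copy the buffer into a texture or an overlay to display it.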
SceneKit crashes very frequently for me on iOS 17, on every device running iOS 17. Here is the crash trace:
Crashed: com.apple.scenekit.renderingQueue.SCNView0x15878c630
0 SceneKit 0x3eee4 C3DMatrix4x4GetAffineTransforms + 344
1 SceneKit 0x30208 C3DAdjustZRangeOfProjectionInfos + 140
2 SceneKit 0x2c0a90 C3DCullingContextSetupPointOfViewMatrices + 700
The attachment has the whole log:
Crash Log
Does anybody know how to fix it?
Hi there -
Where would a dev go these days to get an initial understanding of SceneKit?
The WWDC videos linked in various places seem to be gone. For example, the SceneKit page at developer.apple.com features a link to session videos that comes up without any results: https://developer.apple.com/scenekit/
Any advice?
Cheers,
Jan
The touch-input stutter issue that has existed since iOS 16 on devices with ProMotion displays has not been fixed yet. I filed a bug report in July, but there hasn't been any progress in months.
I see the problem in every game I've tried. My game is fast-paced, so the stutters are quite obvious, and I receive a lot of complaints by email.
My game ran smoothly on ProMotion devices with iOS 15. Is there a known workaround? I see other developers having the same issue, but I can't find any solutions.
Other threads about this issue:
IPhone 14 Pro stuttering in most games when using touch controls
FPS drops when tapping the screen on iPhone 13 Pro Max
I am trying to use my animated model in Xcode with SceneKit. I exported my model from Maya with animation data in .usd format, then converted it to .usdz with Reality Converter. When I open it in the Xcode viewer it is animated and everything is fine. However, when I try to use it in my app it doesn't animate. On the other hand, when I try the robot_walk_idle model from Apple's example models, it is animated. Maybe I am missing an option in the export settings. Thanks for any help.
import SwiftUI
import SceneKit

struct ModelView: View {
    var body: some View {
        VStack {
            SceneView(scene: SCNScene(named: "robot_walk_idle.usdz"))
        }
    }
}
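If the animation players arrive paused on import, one thing worth trying is loading the scene manually and explicitly resuming any animation players found in the node hierarchy. A minimal sketch, with the model name being illustrative:

import SwiftUI
import SceneKit

struct AnimatedModelView: View {
    private let scene: SCNScene? = {
        let scene = SCNScene(named: "robot_walk_idle.usdz")
        scene?.rootNode.enumerateHierarchy { node, _ in
            for key in node.animationKeys {
                // Animations imported from USDZ can arrive paused; resume them.
                node.animationPlayer(forKey: key)?.play()
            }
        }
        return scene
    }()

    var body: some View {
        SceneView(scene: scene, options: [.autoenablesDefaultLighting])
    }
}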
I am trying to control the orientation of a box in SceneKit (iOS) using gestures. I am using the translation in x and y to update the x and y rotation of the SCNNode.
After a long search I have realised that x and y rotation will always lead to z rotation, thanks to this excellent post: https://gamedev.stackexchange.com/questions/136174/im-rotating-an-object-on-two-axes-so-why-does-it-keep-twisting-around-the-thir?newreg=130c66c673f848a7be2873bf675573a9
So I am trying to work out the z rotation this causes and then remove it from my object by applying the inverse quaternion. However, when I rotate the object 90° around x and then 90° around y, it behaves very weirdly.
It almost behaves as if it were in gimbal lock, but I did not think that using quaternions the way I am would cause gimbal lock like this.
I am sure there is something I am missing, or perhaps I am not able to remove the z rotation in this way.
Thanks!
I have added a video of the strange behaviour here: https://github.com/marcusraty/RotationExample/blob/main/Example.MP4
And the code example is here: https://github.com/marcusraty/RotationExample
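For what it's worth, a common way to avoid the accumulated z twist is to skip Euler angles entirely and apply each gesture's delta rotation in world space by quaternion multiplication, so there is no z rotation to undo in the first place. A minimal sketch, with the pan-to-angle mapping and names being illustrative:

import SceneKit
import simd

func handlePan(translation: CGPoint, node: SCNNode) {
    let sensitivity: Float = 0.01
    // Build small rotations about the fixed world axes from the pan deltas.
    let pitch = simd_quatf(angle: Float(translation.y) * sensitivity,
                           axis: simd_float3(1, 0, 0))
    let yaw = simd_quatf(angle: Float(translation.x) * sensitivity,
                         axis: simd_float3(0, 1, 0))
    // Pre-multiplying applies the delta in world space, so no z rotation
    // accumulates in the object's own frame.
    node.simdWorldOrientation = simd_normalize(yaw * pitch * node.simdWorldOrientation)
}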
Hi everyone, I am trying out the Nearby Interaction framework and SceneKit on iOS 17.1.1. I am testing on an iPhone 15 Pro Max and an iPhone 12 Pro Max.
if NISession.deviceCapabilities.supportsDirectionMeasurement {
    print("Interact using device distance and direction.")
} else if NISession.deviceCapabilities.supportsPreciseDistanceMeasurement {
    print("Interact using distance only.")
}
The iPhone 12 Pro Max works normally, but the supportsDirectionMeasurement property on the iPhone 15 Pro Max returns false, and I can't use SceneKit.
Is anyone experiencing the same issue?
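In case it helps: on the newer UWB hardware, direction information may only be available with camera assistance enabled, so supportsDirectionMeasurement returning false does not necessarily mean direction is impossible. A hedged sketch, assuming you already have the peer's discovery token:

import NearbyInteraction

func makeSession(peerToken: NIDiscoveryToken) -> NISession {
    let session = NISession()
    let config = NINearbyPeerConfiguration(peerToken: peerToken)
    // On second-generation UWB chips, direction reportedly requires
    // camera assistance (which needs an ARKit-capable device).
    if NISession.deviceCapabilities.supportsCameraAssistance {
        config.isCameraAssistanceEnabled = true
    }
    session.run(config)
    return session
}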
Regards,
Shin
We are attempting to update the texture on a node. The code below works correctly when we use a color, but it encounters issues when we attempt to use an image. The image is available in the bundle, and it displays correctly in other parts of our application. This texture is being applied to both the floor and the wall. Please assist us with this issue.
for obj in Floor_grp[0].childNodes {
    let node = obj.flattenedClone()
    node.transform = obj.transform
    let imageMaterial = SCNMaterial()
    node.geometry?.materials = [imageMaterial]
    // Works with a color; replacing this with an image is where it fails.
    node.geometry?.firstMaterial?.diffuse.contents = UIColor.brown
    obj.removeFromParentNode()
    Floor_grp[0].addChildNode(node)
}
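Not a definitive fix, but it may help to assign the UIImage directly and set the wrap modes, and to confirm the name lookup actually succeeds. A sketch, with "floorTexture" standing in for your real image name:

for obj in Floor_grp[0].childNodes {
    let node = obj.flattenedClone()
    node.transform = obj.transform

    let imageMaterial = SCNMaterial()
    // If UIImage(named:) returns nil, the material silently falls back to white.
    imageMaterial.diffuse.contents = UIImage(named: "floorTexture")
    // Tile the texture rather than stretching a single copy across the geometry.
    imageMaterial.diffuse.wrapS = .repeat
    imageMaterial.diffuse.wrapT = .repeat
    node.geometry?.materials = [imageMaterial]

    obj.removeFromParentNode()
    Floor_grp[0].addChildNode(node)
}

Also make sure the assignment happens on the main thread (or SceneKit's rendering queue), since scene graph changes made from a background thread can fail to appear.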
Hi there, I have recently started development in SwiftUI. I wanted to ask whether it is possible to design an AR app that generates and tracks a 3D model (.scn) based on a real-world object. For example, I want to generate and track the movement of an aeroplane in AR; I have the .scn file, but I want a real-world object as an anchor, like a pen or pencil, and I want to use its 3D data in .usdz format. I know you can use ARObjects and object tracking, but that uses the .arobject format and does not use LiDAR. The important thing is that I want LiDAR-based tracking, not point clouds. Is it possible? Please point me in the right direction.
Thank you.
I am using Xcode 15 and iOS 17 beta.
Will visionOS support SceneKit?
Does anyone have a working example of how to play OGG files with Swift?
I've been trying for over a year now. I was able to wrap the C Vorbis library in Swift and then used it to parse an OGG file successfully. Then I had to use Objective-C++ to fill the PCM buffers, because that part seems to only be available in C++, and it hangs my app for a good 40 seconds to several minutes depending on the audio file; it then plays for about 2 seconds and crashes.
I can't get the examples on the Vorbis site to work in Objective-C, and I've tried every example on GitHub I could find (most of which are for iOS; I want to play the files on Mac).
I also tried using the Cricket Audio framework below.
https://github.com/sjmerel/ck
It has a Swift example, and it can play their proprietary soundbank format. It is also supposed to play OGG, but it just doesn't do anything when trying to play an OGG file, as you can see in the posted issue:
https://github.com/sjmerel/ck/issues/3
Right now I believe every player that can play OGGs on Mac is written in Objective-C or C++.
Anyway, any help or advice is appreciated. The OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGGs through the Mono framework, but I really, really want to stay in Swift.
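Not a full decoder, but for the hang specifically: instead of decoding the whole file up front, one approach is to stream decoded PCM chunks into AVAudioEngine as they become available. A minimal Swift sketch, assuming your Vorbis wrapper exposes a (hypothetical) decodeNextChunk() function returning interleaved stereo Float32 samples at 44.1 kHz, or nil at end of file:

import AVFoundation

final class OggStreamer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let format = AVAudioFormat(standardFormatWithSampleRate: 44_100,
                                       channels: 2)!

    func start(decodeNextChunk: @escaping () -> [Float]?) throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
        scheduleNext(decodeNextChunk)
    }

    private func scheduleNext(_ decodeNextChunk: @escaping () -> [Float]?) {
        // Decode off the main thread so the app never blocks on the file.
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            guard let self = self, let samples = decodeNextChunk() else { return }
            let frames = AVAudioFrameCount(samples.count / 2)
            guard let buffer = AVAudioPCMBuffer(pcmFormat: self.format,
                                                frameCapacity: frames) else { return }
            buffer.frameLength = frames
            // Deinterleave L/R samples into the buffer's planar channels.
            for frame in 0..<Int(frames) {
                buffer.floatChannelData![0][frame] = samples[frame * 2]
                buffer.floatChannelData![1][frame] = samples[frame * 2 + 1]
            }
            // Schedule this chunk, then request the next one when it finishes.
            self.player.scheduleBuffer(buffer) {
                self.scheduleNext(decodeNextChunk)
            }
        }
    }
}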
I am generating an SKTexture with a GKNoiseMap. When I look at the texture in a Swift playground, it has the expected colours. But when I apply the texture to a material and render it in an SCNView, the colours are different (they appear too bright). What am I doing wrong?
Swift playground to reproduce the issue (look at the texture variable in the playground and compare it to the rendered image): https://developer.apple.com/forums/content/attachment/68210adc-98e9-4984-bca7-01f6e658d555
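One possibility is a colour-space mismatch: SceneKit renders in linear colour space by default, which can make textures authored for sRGB appear brighter. A workaround that is often suggested (note: the option key is undocumented, so treat this as an assumption to verify):

import SceneKit

// Assumption: the brightness shift comes from SceneKit's linear-space
// rendering; this undocumented option key disables it.
let scnView = SCNView(frame: .zero,
                      options: ["SCNDisableLinearSpaceRendering": true])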