Posts

Post not yet marked as solved
2 Replies
493 Views
I have the simplest Playground to display an image. The images are inside Resources, but somehow it does not work and I am really confused. I am using Xcode on macOS; I think this might work in Swift Playgrounds on iPad.

```swift
import SwiftUI
import PlaygroundSupport

struct ContentView: View {
    var body: some View {
        Image("image1")
            .resizable()
            .frame(width: 512, height: 512)
    }
}

let hostingController = UIHostingController(rootView: ContentView())
PlaygroundPage.current.liveView = hostingController
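One hedged guess at the macOS problem: UIHostingController is a UIKit class and is unavailable in a macOS playground. A minimal sketch of a variant that avoids it, using PlaygroundSupport's setLiveView(_:), which accepts a SwiftUI view directly (the image still has to exist in the playground's Resources folder):

```swift
// Sketch, untested: setLiveView(_:) takes a SwiftUI view on both macOS and
// iPad playgrounds, so no UIKit/AppKit hosting controller is needed.
import SwiftUI
import PlaygroundSupport

struct ContentView: View {
    var body: some View {
        Image("image1")            // must be present in Resources
            .resizable()
            .frame(width: 512, height: 512)
    }
}

PlaygroundPage.current.setLiveView(ContentView())
```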
Post marked as solved
29 Replies
35k Views
I found that there is still a lack of documentation for the USDZ Converter, so I thought I would start a thread about it. So far, developer 'Ash' out there has been really helpful 😎 and we are still discovering USDZ features, gotchas and limitations along the way. There is also a USD interest group at Google Groups that talks about USD more specifically.

A bit of information about me: I am an independent developer, mostly using Blender 3D and Xcode only. I have not touched Unity or Unreal. I have old versions of Maya and Houdini, but installing the USD plugins seems non-trivial. I did manage to get some USD tools onto my Mac, but it is not optimal since I do not have an NVIDIA graphics card. However, the USDZ Converter and the intermediate USDA file it generates are pretty helpful.

A couple of interesting things:
- Make sure your 3D mesh has a UV layout. Without UVs, the PBR material will simply display black on iOS devices, although it might show up fine on macOS when using 3D or AR Quick Look.
- USDZ makes linking PBR materials and shaders quite easy.
- When a mesh is too complex, the USDZ is too big, or an animation has too many objects, the USDZ will not display on an iOS device (it might still work on macOS). There is no warning about exceeding iOS RAM.
- Watch out for texture mapping: do not use 8K image textures for now.
- Animation seems to be supported via Alembic export, but for transforms only: no vertex animation, no bones, no blendshapes yet.
- USDZ can be easily imported into Xcode for use with SceneKit; I tested it in a basic AR app.
- There is Model I/O inside Xcode that can apparently be used to unpack USDZ, but I have not gone that far.
- Procedural animation can be embedded into USDZ; check the USD example of the spinning-top animation. It's pretty cool for turntables.

I am currently investigating:
- Materials that are not PBR but can still display a colour. Is this possible? Often I just need a material with a simple colour.
- Can we have lights inside USDZ?
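On the simple-colour question, a minimal sketch of what a constant-colour material can look like in the intermediate USDA (names and prim paths here are illustrative, and note the caveat: UsdPreviewSurface is still a lit PBR shader, so a plain diffuseColor with full roughness only approximates an "unlit" look):

```usda
def Material "FlatRed"
{
    token outputs:surface.connect = </FlatRed/PBRShader.outputs:surface>

    def Shader "PBRShader"
    {
        uniform token info:id = "UsdPreviewSurface"
        color3f inputs:diffuseColor = (1.0, 0.0, 0.0)
        float inputs:roughness = 1.0
        float inputs:metallic = 0.0
        token outputs:surface
    }
}
```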
Post marked as solved
5 Replies
1.5k Views
I just updated to the macOS Monterey 12.3 beta (21E5196i). I have been using the Reality Converter app with Blender for a while, but after this update the app somehow no longer accepts GLB or USDC files from Blender. This is really weird, because I need it to work. Any idea what's going on?
Post marked as solved
3 Replies
1.8k Views
Just wondering about a couple of things regarding SCN inside an ARKit environment. I am testing a Procedural Sky with ARKit, and it does not seem to actually work. I was hoping to get some kind of real-time sunlight affecting the 3D model, casting a shadow that is more or less automatic based on the light conditions. Is there something I am missing? First of all, can a Procedural Sky that affects the model in the SceneKit editor actually work with ARKit? Or does it need to be PBR only?
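For context, a minimal sketch (untested) of the usual way to get lighting that tracks real-world conditions in an ARKit + SceneKit setup: rather than relying on the scene's procedural sky, let ARKit's camera-based light estimation drive the scene lighting.

```swift
// Sketch, untested: ARKit's estimated lighting, instead of a procedural sky,
// is what normally adapts SceneKit content to real light conditions.
import ARKit
import SceneKit

let sceneView = ARSCNView(frame: .zero)
sceneView.automaticallyUpdatesLighting = true   // ARKit updates light intensity

let config = ARWorldTrackingConfiguration()
config.isLightEstimationEnabled = true          // enable camera light estimation
sceneView.session.run(config)
```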
Post not yet marked as solved
6 Replies
2.2k Views
I keep getting this error and am unable to solve it:

```
Traceback (most recent call last):
  File "/Applications/usdpython/usdzconvert/usdzconvert", line 17, in
    usdUtils.printError("failed to import pxr module. Please add path to USD Python bindings to your PYTHONPATH.")
NameError: name 'usdUtils' is not defined
```

In older versions of usdzconvert, like 0.62, I just unzipped the tool, ran USD.command, and usdzconvert simply worked. But with version 0.63 I have not been able to run usdzconvert for a few months now 😟 I have reported this to Apple many times and it has not been resolved. Maybe usdzconvert 0.64 can fix this? Or is there any way I can fix the path error myself? Thanks!
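Worth noting that the NameError masks the real failure: the pxr import at the top of the script fails before usdUtils is even loaded. A possible workaround, assuming the tool is unzipped at /Applications/usdpython (adjust the paths to your actual install location):

```shell
# Assumption: usdpython is unzipped at /Applications/usdpython.
# USD.command normally exports these; setting them by hand in the current
# shell can work around usdzconvert not finding the pxr module.
export PATH="$PATH:/Applications/usdpython/usdzconvert"
export PYTHONPATH="$PYTHONPATH:/Applications/usdpython/USD/lib/python"
usdzconvert -h
```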
Post marked as solved
2 Replies
721 Views
After updating to the third macOS Monterey beta, HelloPhotogrammetry is crashing again:

```
dyld[4495]: Symbol not found: _$s17RealityFoundation21PhotogrammetrySessionC7OutputsV8IteratorVs05AsyncF8ProtocolAAMc
  Referenced from: /Users/jimmygunawan/Library/Developer/Xcode/DerivedData/HelloPhotogrammetry-fljhediczpqycdgoysdrwzqsgxxf/Build/Products/Debug/HelloPhotogrammetry
  Expected in: /System/Library/Frameworks/RealityFoundation.framework/Versions/A/RealityFoundation
zsh: abort      ./HelloPhotogrammetry
```
Post not yet marked as solved
12 Replies
3.7k Views
I am looking into this example of capturing body motion in 3D: https://developer.apple.com/documentation/arkit/capturing_body_motion_in_3d

The example uses a high-fidelity skeleton hierarchy as USDZ. So I had a look at the skeleton inside Xcode and tried to replicate the hierarchy, hoping that my MUCH SIMPLER 3D character skeleton would just work. I kept maybe only 20-30% of the joints and omitted some nodes. But my character does not work; it gives errors like these:

```
2019-07-15 22:36:43.985210+1000 BodyDetection[4434:181726] [API] Created entity character does not contain supplied joint name: spine_4_joint.
2019-07-15 22:36:43.985257+1000 BodyDetection[4434:181726] [API] Cannot get joint count for non-character entity.
Error: Unable to load model: The operation couldn't be completed. (RealityKit.Entity.LoadError error 4.)
```

Is it actually possible to supply a USDZ character with a reduced skeleton, or maybe just from the chest up, or maybe even just a left hand and a right hand? Thanks!
Post marked as solved
1 Reply
976 Views
I am trying to get this to work, but after building the command-line app I keep getting an error when running it: https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app

Example error:

```
jimmygunawan@192-168-1-112 Debug % ./HelloPhotogrammetry -d full /Users/jimmygunawan/Downloads/_APPLESCAN/test ~/Desktop/output.usdz
Using configuration: Configuration(isObjectMaskingEnabled: true, sampleOverlap: RealityFoundation.PhotogrammetrySession.Configuration.SampleOverlap.normal, sampleOrdering: RealityFoundation.PhotogrammetrySession.Configuration.SampleOrdering.unordered, featureSensitivity: RealityFoundation.PhotogrammetrySession.Configuration.FeatureSensitivity.normal)
Error creating session: cantCreateSession("Native session create failed: CPGReturn(rawValue: -11)")
jimmygunawan@192-168-1-112 Debug %
```

RealityKit is cool, but I wonder why this RealityCapture can't just be an app (on Mac, iPad and iPhone) like Reality Composer.
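One avenue worth ruling out, sketched here as an assumption: Object Capture has hardware requirements (Apple's guidance mentions 16 GB of RAM and a GPU that supports barycentric coordinates), and session-creation failures are reportedly common on unsupported machines. A rough check, with the thresholds being assumptions rather than documented exact limits:

```swift
// Sketch, untested: rough hardware sanity check before creating a
// PhotogrammetrySession. Thresholds are assumptions based on Apple's
// published Object Capture guidance, not exact documented limits.
import Metal
import Foundation

let gib = UInt64(1) << 30
let enoughRAM = ProcessInfo.processInfo.physicalMemory >= 16 * gib
let gpuOK = MTLCreateSystemDefaultDevice()?.supportsShaderBarycentricCoordinates ?? false
print("RAM ok: \(enoughRAM), GPU ok: \(gpuOK)")
```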
Post not yet marked as solved
7 Replies
2.3k Views
Ok, finally usdzconvert 0.64 brings the usdzaudioimport tool, and it seems to work! I can actually embed an MP3 into a USDZ easily. However, when I import it into Reality Composer, the USDZ does not seem to recognize any embedded audio. I wonder what the purpose of the MP3 embed in USDZ is. Is it only for sharing AR assets with audio? Wouldn't it be more useful if each USDZ with audio could be opened inside Reality Composer and assigned behaviours?
Post marked as solved
3 Replies
2.5k Views
I need to have a floor or wall that hides whatever AR content overlaps with the anchor. Is this possible in RealityKit, or does it need to be a custom app? Let's suppose the floor is the anchor and I want a 3D AR object that slowly appears out of the floor. This is probably related to another question asked here regarding "face occlusion".
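This effect is usually done with RealityKit's OcclusionMaterial, which renders the camera feed and hides any virtual content behind it. A minimal sketch (untested, names illustrative) of an occluding floor plane:

```swift
// Sketch, untested: a plane with OcclusionMaterial masks virtual content
// behind it, so an object animated upward appears to rise out of the floor.
import RealityKit

let floorMesh = MeshResource.generatePlane(width: 2, depth: 2)
let floor = ModelEntity(mesh: floorMesh, materials: [OcclusionMaterial()])

let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(floor)
// arView.scene.addAnchor(anchor)  // then animate the object up through the plane
```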
Post not yet marked as solved
0 Replies
627 Views
I have a texture that is very dense, almost pixel art: a 1024 × 1024 image of 1024 different colour stripes, applied to 3D pipes. The resulting USDZ has the texture blurred across the pipe, while what I really want is a sharp texture. 3D packages like Blender offer filtering options such as Linear, Closest, Cubic and Smart. Does USDZ have this?

EDIT: Probably related to this: https://developer.apple.com/documentation/spritekit/sktexture/1519659-filteringmode

I am using Reality Converter; the conversion goes from GLB to USDZ.
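As far as I know, the usdz preview material does not expose a texel-filtering option, but when the model is displayed through SceneKit the filtering can be overridden after loading. A sketch (untested; the file path is a placeholder):

```swift
// Sketch, untested: switch every material's diffuse texture to
// nearest-neighbour filtering for a sharp, pixel-art look in SceneKit.
import SceneKit
import Foundation

let modelURL = URL(fileURLWithPath: "pipes.usdz")   // placeholder path
let scene = try SCNScene(url: modelURL, options: nil)

scene.rootNode.enumerateChildNodes { node, _ in
    for material in node.geometry?.materials ?? [] {
        material.diffuse.magnificationFilter = .nearest
        material.diffuse.minificationFilter = .nearest
        material.diffuse.mipFilter = .none      // avoid mip blur on dense stripes
    }
}
```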
Post marked as solved
1 Reply
739 Views
Does ARKit or RealityKit on iOS or iPadOS 14 support making a video texture slightly transparent, or using the alpha channel of an animated video texture? For example, I want to simulate projection-mapping AR on 3D objects (with a static object anchor), but I don't want the AR to blend 100%; I want to keep some layers of reality.
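For what it's worth, a hedged sketch of one workaround via SceneKit (untested; the video path is a placeholder): an AVPlayer can be assigned as a material's contents, and the material's transparency dialled down for a partial, projection-mapping style blend. RealityKit's VideoMaterial on iOS 14 does not appear to offer an equivalent alpha control.

```swift
// Sketch, untested: a semi-transparent video-textured plane in SceneKit.
import SceneKit
import AVFoundation
import Foundation

let videoURL = URL(fileURLWithPath: "projection.mp4")   // placeholder path
let player = AVPlayer(url: videoURL)

let material = SCNMaterial()
material.diffuse.contents = player       // SceneKit renders the player's output
material.transparency = 0.6              // partial blend with the camera feed
material.isDoubleSided = true

let plane = SCNNode(geometry: SCNPlane(width: 1, height: 0.6))
plane.geometry?.materials = [material]
player.play()
```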
Post not yet marked as solved
1 Replies
1.1k Views
I have used usdzconvert in the past and never had issues until this morning, after I upgraded to the latest macOS Catalina Beta 5 (the installation took forever, like 10+ hours). I am not sure what I did wrong, but during the macOS install I might have restarted my MacBook Pro a few times because I thought it was frozen. Anyhow, I am now getting the following:

```
bash-3.2$ usdzconvert
------------------------ 'Python' is dying ------------------------
Python crashed. FATAL ERROR: Failed axiom: ' Py_IsInitialized() '
in operator() at line 148 of /USD/pxr/base/lib/tf/pyTracing.cpp
The stack can be found in 192-168-1-5.tpgi.com.au:/var/folders/r3/13y_zxhn0557ftnbz9xrt7x00000gn/T//st_Python.7546
done.
------------------------------------------------------------------
Abort trap: 6
```

This is similar to what I got before when compiling my own Pixar USD, using brew and changing paths, which ended up with something like the above. Anyway, I don't know how to fix this 😟
Post not yet marked as solved
0 Replies
716 Views
I have a setup using an Image Reference Anchor and multiple scenes, each of them anchored to the same image. I have tested many times, and somehow when I try to switch scenes the anchoring fails: Scene A's position (supposedly attached to the image) is different from Scene B's, Scene C's, and so on. Is this a known bug?
Post not yet marked as solved
0 Replies
632 Views
I have been using Reality Composer on iPhone, iPad and macOS for a few weeks now and have got the hang of it. It is very intuitive and powerful even at this stage with limited interactivity, and great because we can easily publish the AR experience as USDZ / Reality files, provided they don't freeze or crash on iPad / iPhone when viewed.

Anyway, I want to use some features of RealityKit that are not available inside Reality Composer. I started from an Apple Xcode template, and so far I have managed to bring in my own Reality Composer project as a *.rcproject file inside Xcode. But I seem to lose all the interactive setup, like "tap and animate" etc. So I am wondering if I am doing this correctly.

I can see that the template loads a Reality project perfectly from the *.rcproject file. Could I instead bring in *.reality files, keep all interactions as they are, and add more on top of them?

I have watched some of the WWDC videos a couple of times, but even the sun-and-planets project seems slightly difficult to follow. Eventually I would like to use SwiftUI to jump between *.reality scenes, but then I need all the interactions. It would be strange to set everything up in Reality Composer and bring in the asset compositions, only to have to redo all the interactions in code.
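For reference, a hedged sketch (untested; the file and scene names are placeholders) of loading a .reality file directly with RealityKit. Whether Reality Composer behaviours survive this path is exactly the open question here; the generated Experience API for *.rcproject files is the route known to expose them.

```swift
// Sketch, untested: load an anchor from a bundled .reality file.
// "Experience" is a placeholder resource name.
import RealityKit
import Foundation

if let url = Bundle.main.url(forResource: "Experience", withExtension: "reality"),
   let anchor = try? Entity.loadAnchor(contentsOf: url) {
    // arView.scene.addAnchor(anchor)
    print("Loaded anchor: \(anchor.name)")
}
```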