I have the simplest possible playground to display an image. The images are inside Resources, yet somehow it does not work and I am really confused. I am using Xcode on macOS.
I think this might work on iPad Swift Playgrounds.
import SwiftUI
import PlaygroundSupport

struct ContentView: View {
    var body: some View {
        // "image1" must be a file inside the playground's Resources folder
        Image("image1")
            .resizable()
            .frame(width: 512, height: 512)
    }
}

// setLiveView(_:) accepts a SwiftUI view directly and works in both iOS and
// macOS playgrounds; UIHostingController is UIKit-only, so it fails in a
// macOS-platform playground.
PlaygroundPage.current.setLiveView(ContentView())
I found that documentation for the USDZ Converter is still lacking, so I thought I would start a thread about it. So far, developer 'Ash' out there has been really helpful 😎 and we are still discovering USDZ features, gotchas, and limitations along the way. There is also a USD Interest group on Google Groups that talks more specifically about USD.

A bit of information about me: I am an independent developer, mostly using Blender and Xcode only. I have not touched Unity or Unreal. I have old versions of Maya and Houdini, but installing their USD plugins seems non-trivial. I did manage to get some USD tools running on my Mac, but it is not optimal since I do not have an NVIDIA graphics card. However, the USDZ Converter and the intermediate USDA file it generates are pretty helpful.

A couple of interesting findings:
- Make sure your 3D mesh has a UV layout. Without UVs, the PBR material will simply display black on iOS devices, although it might show fine on macOS when using 3D or AR Quick Look.
- USDZ makes linking PBR materials and shaders quite easy.
- When a mesh is too complex, the USDZ is too big, or an animation has too many objects, the USDZ will not display on an iOS device. It might still work on macOS; there is no warning about exceeding iOS RAM.
- Watch out for texture mapping: do not use 8K image textures for now.
- Animation seems to be supported via Alembic export, but for transforms only. No vertex animation, no bones, no blendshapes yet.
- USDZ can be easily imported into Xcode for SceneKit; I tested it using a basic AR app.
- There is Model I/O inside Xcode that can apparently be used to unpack USDZ, but I have not gone that far.
- Procedural animation can be embedded into USDZ; check the USD example of the spinning-top animation. It's pretty cool for turntables.

I am currently investigating:
- Materials that are not PBR but can still display a color. Is this possible? Often I just need a material with a simple color.
- Can we have lights inside USDZ?
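For reference, this is roughly how I invoke the converter to wire up PBR textures from Blender exports (file names here are just placeholders for my own assets):

```shell
# Typical usdzconvert invocation: input mesh, output usdz, then texture
# flags mapping image files to the UsdPreviewSurface PBR inputs.
./usdzconvert model.obj model.usdz \
    -diffuseColor albedo.png \
    -normal normal.png \
    -metallic metal.png \
    -roughness rough.png
```

The intermediate USDA can then be inspected in a text editor to check how the shader network was linked.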
I just updated to the macOS Monterey 12.3 beta (21E5196i). I've been using the Reality Converter app with Blender for a while, but after this update it somehow no longer accepts GLB or USDC files from Blender.
This is really weird, because I need it to work.
Any idea what's going on?
Just wondering about a couple of things regarding SceneKit scenes in an ARKit environment.

I am testing Procedural Sky with ARKit, and it does not seem to work. I was hoping to get some kind of realtime sunlight affecting the 3D model, casting a shadow more or less automatically based on the light conditions.

Is there something I am missing? Does Procedural Sky, which affects the model in the SceneKit editor, actually work with ARKit? Or does it need to be PBR only?
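The closest I have found so far, as a sketch (I am not certain the procedural sky is ever evaluated under ARKit): drive a real directional light from ARKit's light estimation so shadows roughly track the ambient conditions.

```swift
// Sketch, assuming an ARSCNView-based app: instead of relying on the
// procedural sky, add a shadow-casting directional light and let ARKit's
// light estimation update the scene lighting automatically.
import ARKit
import SceneKit

let sceneView = ARSCNView(frame: .zero)
sceneView.autoenablesDefaultLighting = false
sceneView.automaticallyUpdatesLighting = true  // uses ARKit light estimation

let sun = SCNLight()
sun.type = .directional
sun.castsShadow = true
sun.shadowMode = .deferred
sun.shadowColor = UIColor(white: 0, alpha: 0.5)

let sunNode = SCNNode()
sunNode.light = sun
sunNode.eulerAngles = SCNVector3(-Float.pi / 3, 0, 0)  // angled down like sunlight
sceneView.scene.rootNode.addChildNode(sunNode)

let config = ARWorldTrackingConfiguration()
config.environmentTexturing = .automatic  // PBR reflections from the camera feed
sceneView.session.run(config)
```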
Traceback (most recent call last):
  File "/Applications/usdpython/usdzconvert/usdzconvert", line 17, in
    usdUtils.printError("failed to import pxr module. Please add path to USD Python bindings to your PYTHONPATH.")
NameError: name 'usdUtils' is not defined

I keep getting that error and am unable to solve it. In older versions of usdzconvert, like 0.62, I could just unzip the tool, run USD.command, and then usdzconvert would simply work. But with version 0.63, I have not been able to run the latest usdzconvert for a few months now 😟 I have reported this to Apple many times and it has not been resolved. Maybe usdzconvert 0.64 will fix this? Or is there any way I can fix the path error myself? Thanks!
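In case it helps anyone, this is the environment setup I have been trying, based on what USD.command appears to do (the paths assume the default /Applications/usdpython install location, so adjust them if yours differs):

```shell
# Point PYTHONPATH at the bundled USD Python bindings and PATH at the
# converter, then run it from any shell.
export PYTHONPATH="$PYTHONPATH:/Applications/usdpython/USD/lib/python"
export PATH="$PATH:/Applications/usdpython/usdzconvert"
usdzconvert model.obj model.usdz
```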
After updating to the third macOS Monterey beta, HelloPhotogrammetry is broken again.
dyld[4495]: Symbol not found: _$s17RealityFoundation21PhotogrammetrySessionC7OutputsV8IteratorVs05AsyncF8ProtocolAAMc
Referenced from: /Users/jimmygunawan/Library/Developer/Xcode/DerivedData/HelloPhotogrammetry-fljhediczpqycdgoysdrwzqsgxxf/Build/Products/Debug/HelloPhotogrammetry
Expected in: /System/Library/Frameworks/RealityFoundation.framework/Versions/A/RealityFoundation
zsh: abort ./HelloPhotogrammetry
I am looking into this example of capturing body motion in 3D: https://developer.apple.com/documentation/arkit/capturing_body_motion_in_3d

The example uses a high-fidelity skeleton hierarchy as USDZ. I had a look at the skeleton inside Xcode and tried to replicate the hierarchy, hoping that my MUCH SIMPLER 3D character skeleton would just work. I kept maybe only 20-30% of the joints and omitted some nodes. But my character does not work; it gives errors about not finding particular joints:

2019-07-15 22:36:43.985210+1000 BodyDetection[4434:181726] [API] Created entity character does not contain supplied joint name: spine_4_joint.
2019-07-15 22:36:43.985257+1000 BodyDetection[4434:181726] [API] Cannot get joint count for non-character entity.
Error: Unable to load model: The operation couldn’t be completed. (RealityKit.Entity.LoadError error 4.)

Is it actually possible to supply a USDZ character with fewer joints, maybe just from the chest up, or maybe even just the left and right hands? Thanks!
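From the errors, it looks like the loader validates the rig against ARKit's full 3D body skeleton, so every joint name may need to be present. A quick sketch to dump the expected joint list for comparison against your own rig:

```swift
// Sketch: print ARKit's full 3D body joint definition; a replacement USDZ
// rig appears to need every one of these joint names present.
import ARKit

let definition = ARSkeletonDefinition.defaultBody3D
print("Joint count: \(definition.jointCount)")
for name in definition.jointNames {
    print(name)  // includes names like "spine_4_joint", the one from the error
}
```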
I am trying to make this work, but after building the command-line app I keep getting an error when running it:
https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app
Example error:
jimmygunawan@192-168-1-112 Debug % ./HelloPhotogrammetry -d full /Users/jimmygunawan/Downloads/_APPLESCAN/test ~/Desktop/output.usdz
Using configuration: Configuration(isObjectMaskingEnabled: true, sampleOverlap: RealityFoundation.PhotogrammetrySession.Configuration.SampleOverlap.normal, sampleOrdering: RealityFoundation.PhotogrammetrySession.Configuration.SampleOrdering.unordered, featureSensitivity: RealityFoundation.PhotogrammetrySession.Configuration.FeatureSensitivity.normal)
Error creating session: cantCreateSession("Native session create failed: CPGReturn(rawValue: -11)")
jimmygunawan@192-168-1-112 Debug %
RealityKit is cool, but I wonder why this photogrammetry capture can't just be an app (on Mac, iPad, and iPhone) like Reality Composer.
OK, usdzconvert 0.64 finally brings the usdzaudioimport tool, and it seems to be working! I can actually embed an MP3 into a USDZ easily. However, when I import it into Reality Composer, the USDZ does not seem to recognize any embedded audio.

I wonder what the purpose of the MP3 embed in USDZ is. Is it only for sharing AR assets with audio? Wouldn't it be more useful if each USDZ with audio could be seen inside Reality Composer and assigned a behaviour?
I need to have a floor or wall that hides whatever AR content overlaps with the anchor. Is this possible in RealityKit, or does it need to be a full app? Let's suppose the floor is the anchor and I want a 3D AR object that slowly rises out of the floor. This is probably related to another question asked regarding "Face Occlusion".
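For what it's worth, RealityKit does have an occlusion material, so a sketch of the idea might look like this (the plane size and anchor choice are just placeholders):

```swift
// Sketch: a horizontal plane with OcclusionMaterial renders the camera
// feed instead of virtual content, hiding anything behind/below it. An
// object rising through it would then appear to emerge from the floor.
import RealityKit

let occluder = ModelEntity(
    mesh: .generatePlane(width: 1.0, depth: 1.0),
    materials: [OcclusionMaterial()]
)

let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(occluder)
// arView.scene.addAnchor(anchor)  // assuming an ARView named arView
```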
I have a very dense, almost pixel-like texture in which each pixel column is a different color stripe. The texture is 1024 x 1024, for example, and I have 1024 color stripes applied to 3D pipes.
The resulting USDZ has the texture blurred across the pipe, while what I really want is a sharp texture.
In 3D packages like Blender, there are filtering options such as Linear, Closest, Cubic, and Smart.
Does USDZ have this?
EDIT:
Probably related to this:
https://developer.apple.com/documentation/spritekit/sktexture/1519659-filteringmode
I am using Reality Converter; the conversion happens from GLB to USDZ.
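USDZ itself does not appear to expose a filtering switch, but as a workaround sketch, after loading the asset into SceneKit you can force nearest-neighbor sampling on the materials (the asset name here is hypothetical):

```swift
// Sketch: load a USDZ into SceneKit and switch every material's diffuse
// sampling to nearest-neighbor for a sharp, pixel-like look.
import SceneKit

let url = Bundle.main.url(forResource: "pipes", withExtension: "usdz")!
let scene = try! SCNScene(url: url, options: nil)

scene.rootNode.enumerateHierarchy { node, _ in
    for material in node.geometry?.materials ?? [] {
        material.diffuse.magnificationFilter = .nearest
        material.diffuse.minificationFilter = .nearest
        material.diffuse.mipFilter = .nearest
    }
}
```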
Does ARKit or RealityKit 4 on iOS or iPadOS 14 support making a video texture slightly transparent, or using the alpha channel of an animated video texture?
For example, I want to simulate projection-mapping AR on 3D objects (with a static object anchor), but I don't want the AR to blend 100%; I want to keep some layers of reality.
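As far as I can tell, RealityKit on iOS 14 added VideoMaterial backed by AVPlayer; whether transparency is honored seems to depend on the video itself (e.g. HEVC with an alpha channel) rather than a material-level opacity switch. A minimal sketch, with a hypothetical video file:

```swift
// Sketch: apply an AVPlayer-backed VideoMaterial to a plane that could be
// placed on an object anchor for projection-mapping-style AR.
import RealityKit
import AVFoundation

let player = AVPlayer(url: URL(fileURLWithPath: "projection.mov"))  // placeholder file
let material = VideoMaterial(avPlayer: player)

let surface = ModelEntity(
    mesh: .generatePlane(width: 1.0, depth: 1.0),
    materials: [material]
)
player.play()
```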