Hello Dev Community,
I've been thinking about Apple's preference for USDZ for AR and 3D content, especially given the widely used glTF. I'm keen to discuss and hear your insights on this choice.
USDZ, backed by Apple, has seen a surge in the AR community. It boasts advantages like compactness, animation support, and ARKit compatibility. glTF is also a popular format with its own merits, such as being an open standard and offering flexibility.
Here are some of my questions about the use of USDZ:
Why did Apple choose USDZ over other 3D file formats like glTF?
What benefits does USDZ bring to Apple's AR and 3D content ecosystem?
Are there any limitations of USDZ compared to other file formats?
Could factors like compatibility, security, or integration ease have influenced Apple's decision?
I would love to hear your thoughts on this. Feel free to share any experiences with USDZ or other 3D file formats within Apple's ecosystem!
I've got Sonoma and Xcode 15 with the macOS and iOS SDKs.
I launched Xcode, but there is no way to find Reality Composer Pro in the Open Developer Tool menu or anywhere else.
Hi,
Will the Apple Vision Pro developer kit be sold only in the US? I live in Japan. Is it possible to get the Apple Vision Pro developer kit in Japan?
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
While it seems like this is being aimed at indoor seated use initially, is there going to be anything stopping us from making an outdoor art experience? I'd love to make an installation that a user could experience in a soccer field, etc.
I saw in the WWDC23 session "Meet Object Capture for iOS" that Reality Composer Pro, the new tool released today alongside Xcode 15 beta 2, would be capable of creating 3D models with Apple's PhotogrammetrySession. However, I do not see any such feature in the tool. Has anyone managed to find the model-creation feature shown in the session?
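For reference, here is a minimal sketch of the PhotogrammetrySession API the session refers to, which can be used from code independently of the Reality Composer Pro UI. The input and output paths below are placeholders, and a real app would also observe session.outputs for progress:

```swift
import Foundation
import RealityKit

// Minimal sketch (paths are placeholders): reconstruct a USDZ model
// from a folder of captured images using PhotogrammetrySession.
func reconstructModel() throws {
    let session = try PhotogrammetrySession(
        input: URL(fileURLWithPath: "/path/to/ImagesFolder", isDirectory: true),
        configuration: PhotogrammetrySession.Configuration())

    // Request a reduced-detail USDZ file as the output.
    try session.process(requests: [
        .modelFile(url: URL(fileURLWithPath: "/path/to/model.usdz"),
                   detail: .reduced)
    ])
    // In a real app, iterate session.outputs to observe progress
    // and completion before the session goes out of scope.
}
```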
Hey everyone,
In the WWDC23 session "Explore Materials in Reality Composer Pro":
Starting at 14:17, the instructor uses a "Disk Model" to modify the base surface mesh; however, the video does not show or explain how that model was created. Was this done in Blender or another 3D software tool, or can it be done within Reality Composer Pro using the primitive shapes? All suggestions are really appreciated! Thank you!
I haven't even edited the template file and this happens.
Steps:
Create a visionOS app from the template
The immersive view previews and shows 3D objects
Select the RealityKit package and click 'Open in Reality Composer Pro'
Change the color of a material and save
Go back to Xcode
Result: the file no longer previews and I get the error in the title.
That's it. How do I edit 3D files?
I made a scene in Reality Composer Pro and used the "Light Blue Denim Fabric" material.
Saving the scene and exporting to USDZ resulted in that material using this line:
asset inputs:file = @0/LightBlueDenimFabric_basecolor.png@ (
colorSpace = "Input - Texture - sRGB - sRGB"
)
Question 1:
Was this colorSpace wrongly exported from a different tool?
The updated Apple page https://developer.apple.com/documentation/realitykit/validating-usd-files
mentions these new known colorSpace(s): “srgb_texture”, “lin_srgb”, “srgb_displayp3”, or “lin_displayp3”.
To be honest, this confused me even more.
My understanding is that the prefix "lin_" means linear (i.e., no gamma) and the established "srgb_" prefix means gamma-corrected.
But the word srgb could also describe the encoding (sRGB vs. Display P3).
As a 3D AR developer who has been struggling to put the pieces together to create a decent interface and quality 3D in an AR format for the past couple of years, I've really been looking forward to Apple's glasses or AR goggles.
There has been, and I can't imagine that I'm in any way alone in this feeling, a complete lack of help from Apple in getting people started in a meaningful way.
The WWDC content, while offering perfect snippets to explain new classes and functions, is the absolute worst at giving any meaning to these snippets of code, at letting us know what to do with them. The sample projects are great, but a few pages of written explanation to walk us through the maze of functions, and what calls what, would make these sample projects gold instead of semi-useless eye candy. It's pure programmer hubris to toss us the box of puzzle pieces and expect miracles. I have world-changing app ideas that would make people scramble to buy the Vision Pro, but I keep hitting the usual wall.
There have been a few rays of light in the Apple AR world, and certainly not from Apple. Ryan Kopinsky, with his Reality School YouTube channel, is number one. A handful of others create some helpful tutorials as well.
What if Apple put a pittance of its might into actually creating something useful to teach a comprehensive AR course? Hire a few Ryan Kopinskys and put together a course that strings together all those gold nuggets on display at WWDC and puts them to actual use.
Now is the time, the exact time, to do this right. You have a world-changing product here that could languish in the relative doldrums of simple 3D windows. We need actual help to help you.
Attempting to load an entity via Entity.loadAsync(contentsOf: url) throws an error when passing a .realitycomposerpro file:
"Cannot determine file format for ...Package.realitycomposerpro"
AdditionalErrors=(
"Error Domain=USDKitErrorDomain Code=3 "Failed to open layer
}
I have a visionOS app that references the same Reality Composer Pro file. That project automatically builds a .reality file out of the package, and it includes the Reality Composer Pro package as a library in the Link Binary build phase.
I duplicated this setup in my iPhone app using RealityKit. Attempting to include the Reality Composer Pro package in my existing app causes a build error:
RealityAssetsCompile:
Error: for --platform, value must be one of [xros, xrsimulator], not 'iphoneos'
Usage: realitytool compile --output-reality [--schema-file ] [--derived-data ] --platform --deployment-target [--use-metal ]
See 'realitytool compile --help' for more information.
Lastly, I extracted the .reality file generated by a separate, working visionOS app that uses the Reality Composer Pro package. Attempting to actually load an entity from this file results in an error:
Reality File version 9 is not supported. (Latest supported version is 7.)
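For what it's worth, my understanding (an assumption, not confirmed by Apple) is that a .realitycomposerpro document is a source package rather than a runtime file format, which would explain the "Cannot determine file format" error. On visionOS, the supported route appears to be loading a named scene from the Swift package that Reality Composer Pro generates, along these lines:

```swift
import RealityKit
import RealityKitContent  // the generated package; its name is project-specific

// Sketch: load a scene by name from the Reality Composer Pro package's
// generated bundle instead of passing the .realitycomposerpro file's URL.
// "Scene" is a placeholder for a scene name inside the package.
func loadScene() async throws -> Entity {
    try await Entity(named: "Scene", in: realityKitContentBundle)
}
```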
The version of Reality Composer Pro is 1.0 from Xcode 15 beta 2. Every time I click the 'Particle Emitter' button, it crashes. I can't open Diorama, the demo from the documentation, either.
How can I take a model, such as a human model, and use code to make it walk to a certain position?
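One minimal approach, assuming a RealityKit Entity for the model is already loaded: Entity.move(to:relativeTo:duration:timingFunction:) animates the transform to a target position. Note this only translates the entity; a walk-cycle animation would have to be played separately from an AnimationResource bundled with the model.

```swift
import RealityKit

// Sketch: translate an already-loaded entity to a target position
// over two seconds. "character" is assumed to be your loaded model.
func walk(_ character: Entity, to position: SIMD3<Float>) {
    var target = character.transform
    target.translation = position
    character.move(to: target,
                   relativeTo: character.parent,
                   duration: 2.0,
                   timingFunction: .easeInOut)
}
```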
Hi, would a Mac mini with an M2 chip and 8 GB of RAM be fine?
Hi,
I would like to learn how to create custom materials using Shader Graph in Reality Composer Pro. I would like to know more about Shader Graph in general, including node descriptions and how the material's appearance changes when nodes are connected. However, I cannot find a manual for Shader Graph in Reality Composer Pro. This leaves me totally clueless about how to create custom materials.
Thanks.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
I want to see the entire code for the tutorial made in this session, please.
For the MaterialX Shader Graph, the given example hard-codes two textures for blending at runtime ( https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro#Build-materials-in-Shader-Graph )
Can I instead generate textures at runtime and set what those textures are as dynamic inputs for the material, or must all used textures be known when the material is created? If the procedural texture-setting is possible, how is it done, since the example shows a material with those hard-coded textures?
EDIT: It looks like the answer is "yes", since setParameter accepts TextureResource values https://developer.apple.com/documentation/realitykit/materialparameters/value/textureresource(_:)?changes=l_7
However, how do you turn an MTLTexture into a TextureResource?
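One workaround I'm aware of (a sketch, not an official answer): RealityKit has no direct MTLTexture-to-TextureResource initializer as of visionOS 1.0, so you can round-trip through Core Image and CGImage. For per-frame updates, TextureResource.DrawableQueue is the intended dynamic-texture path instead.

```swift
import CoreImage
import Metal
import RealityKit

// Sketch: convert an MTLTexture to a TextureResource by way of CGImage.
// This copies the texture data, so it suits one-off conversions, not
// per-frame streaming (use TextureResource.DrawableQueue for that).
func makeTextureResource(from mtlTexture: MTLTexture) throws -> TextureResource {
    guard let ciImage = CIImage(mtlTexture: mtlTexture, options: nil) else {
        throw NSError(domain: "TextureConversion", code: 1)
    }
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        throw NSError(domain: "TextureConversion", code: 2)
    }
    return try TextureResource.generate(from: cgImage,
                                        options: .init(semantic: .color))
}
```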
I have newly installed Xcode 15 beta 2 on two machines: a Mac Studio running 13.4.1, and an M1 Mini running the 14.0 beta. On both machines, when I start a new project as an iOS Augmented Reality App and, without doing anything else, try to run it with the Apple Vision Pro simulator, the simulation crashes.
A short section of the crash report is below.
I've sent these reports to Apple from both machines. I'm at a loss. I can't imagine that this is the case for everyone, but I'm seeing it on both of my machines. (It also crashes when aimed at an iPadOS 17.0 beta simulator.)
Does anyone have suggestions of how to get past this starting point?
Thanks, in advance.
Translated Report (Full Report Below)
Incident Identifier: B19370C3-21F5-4AA8-A977-BFF69FD9732A
CrashReporter Key: 3BA15700-B5BA-5C3B-0F30-39509BCFDE58
Hardware Model: Macmini9,1
Process: Tet5 [66050]
Path: /Users/USER/Library/Developer/Xcode/UserData/Previews/Simulator Devices/3CC9EFB1-EFDA-4622-A04F-067FE72BDF40/data/Containers/Bundle/Application/54BB34F8-0D07-4ED5-9F2E-6FA61E0990B6/Tet5.app/Tet5
Identifier: com.apple-bbbbbb.Tet5
Version: 1.0 (1)
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd_sim [65662]
Coalition: com.apple.CoreSimulator.SimDevice.3CC9EFB1-EFDA-4622-A04F-067FE72BDF40 [84827]
Responsible Process: SimulatorTrampoline [14070]
Date/Time: 2023-06-30 11:39:35.9221 -0400
Launch Time: 2023-06-30 11:39:33.7148 -0400
OS Version: macOS 14.0 (23A5276g)
Release Type: User
Report Version: 104
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: DYLD 4 Symbol missing
Symbol not found: _$s21DeveloperToolsSupport15PreviewRegistryPAAE04makeD0AA0D0VyKFZ
Referenced from: <9E12C915-D6B1-3DD3-83B8-6921502C7F73> /Users/USER/Library/Developer/Xcode/UserData/Previews/Simulator Devices/3CC9EFB1-EFDA-4622-A04F-067FE72BDF40/data/Containers/Bundle/Application/54BB34F8-0D07-4ED5-9F2E-6FA61E0990B6/Tet5.app/Tet5
Expected in: <31FB64EE-D651-3287-9607-1ED38855E80F> /Library/Developer/CoreSimulator/Volumes/xrOS_21N5165g/Library/Developer/CoreSimulator/Profiles/Runtimes/xrOS 1.0.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/DeveloperToolsSupport.framework/DeveloperToolsSupport
(terminated at launch; ignore backtrace)
When I create a USDZ file from the original Reality Composer (non-Pro) and view it in the visionOS simulator, the transforms and rotations don't look the same.
For example, a simple Tap and Flip behavior does not rotate the same way in visionOS.
Should we regard Reality Composer as discontinued software and only work with Reality Composer Pro?
Hopefully Apple will combine the features from the original Reality Composer into the new Reality Composer Pro!
I've been playing around with visionOS and am currently trying to work with audio, but somehow following the steps from WWDC isn't working for me.
What I'm trying to do is play audio when you navigate to a certain view.
In the WWDC session, they do this to set up audio programmatically:
let audioSource = Entity()
audioSource.spatialAudio = SpatialAudioComponent(directivity: .beam(focus: 0.75))
audioSource.orientation = .init(angle: .pi, axis: [0, 1, 0])
if let audio = try? await AudioFileResource(named: "audioFile",
                                            configuration: .init(shouldLoop: true)) {
    // Play the loaded resource on the entity.
    audioSource.playAudio(audio)
}
But I'm getting this error and cannot proceed:
'AudioFileResource' cannot be constructed because it has no accessible initializers
I couldn't quite follow how to do it from Reality Composer Pro either, so I'm stuck.
Any help is really appreciated.
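If the error comes from building against an SDK where the async AudioFileResource initializers aren't available yet (an assumption; they appeared alongside the visionOS SDK), the older factory method may work instead:

```swift
import RealityKit

// Sketch: load a looping audio file with the long-standing factory
// method rather than the newer async initializer. "audioFile" is a
// placeholder for a resource name in the app bundle.
func loadLoopingAudio() throws -> AudioFileResource {
    try AudioFileResource.load(named: "audioFile", shouldLoop: true)
}
```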
@Apple Really excited by the updates on the Reality Composer and Object Capture side. Is it possible to publish the sample apps and project material referenced in both the Reality Composer Pro sessions and the "Meet Object Capture for iOS" session? It looks like they haven't yet been posted to the Sample Code site.
Thank you!
https://developer.apple.com/videos/play/wwdc2023/10273/
https://developer.apple.com/videos/play/wwdc2023/10191/