With WebXR and event exchange maybe coming next summer, would you please consider adding Live Text support to ARQL?
Currently an interactive AR scene cannot redirect the user to any webpage other than the originating one, and only a single tapToPay event can be sent at the end of an ARQL session, without any reasonable payload.
I've come up with an in-house workflow to build interactive USDZ archives for product configuration. They dynamically show and update an https URL representing the user's choices, which forwards to an order form / webshop.
A sample project can be evaluated here:
https://kreativekk.de/Swivel.html
Currently, users have to take a screenshot of their configuration (including this URL) and switch over to the Photos app, which recognises the text and offers to open the link.
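For context, the URL baked into the texture is just the configuration serialized as query parameters. A minimal sketch of that step (the endpoint and parameter names are made up for illustration, not the actual shop's API):

```python
from urllib.parse import urlencode

def configuration_url(base: str, choices: dict) -> str:
    """Serialize the user's configuration choices into an order-form URL.

    Sorting the items keeps the URL stable for identical configurations.
    """
    return f"{base}?{urlencode(sorted(choices.items()))}"

url = configuration_url(
    "https://example.com/order",  # hypothetical webshop endpoint
    {"model": "Swivel", "fabric": "denim", "color": "lightblue"},
)
print(url)
```

This string is then rendered into the texture the AR scene displays; Live Text in ARQL would let users open it directly instead of via a screenshot.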
I made a scene in Reality Composer Pro and used the "Light Blue Denim Fabric" material.
Saving the scene and exporting to USDZ resulted in that material using this line:
asset inputs:file = @0/LightBlueDenimFabric_basecolor.png@ (
    colorSpace = "Input - Texture - sRGB - sRGB"
)
Question 1:
Was this colorSpace wrongly exported from a different tool?
The updated Apple page https://developer.apple.com/documentation/realitykit/validating-usd-files
mentions these newly documented colorSpace values: “srgb_texture”, “lin_srgb”, “srgb_displayp3”, or “lin_displayp3”.
To be honest, this confused me even more.
My understanding is that the prefix "lin_" means linear (i.e. no gamma) and the established "srgb_" prefix means gamma-corrected.
But the word "srgb" could also describe the color primaries (sRGB vs. Display P3).
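Reading the four documented values as (transfer function, primaries) pairs resolves the ambiguity: the prefix names the transfer function and the suffix names the primaries. A small sketch of that decomposition (mapping the legacy OCIO-style name onto the new scheme is my assumption, not confirmed by the docs):

```python
# The four colorSpace tokens from the validation page, read as
# (transfer function, primaries) pairs.
COLOR_SPACES = {
    "srgb_texture":   ("sRGB gamma", "sRGB / Rec.709"),
    "lin_srgb":       ("linear",     "sRGB / Rec.709"),
    "srgb_displayp3": ("sRGB gamma", "Display P3"),
    "lin_displayp3":  ("linear",     "Display P3"),
}

# Assumption: the OCIO-style name Reality Composer Pro exported
# ("Input - Texture - sRGB - sRGB") denotes a gamma-encoded sRGB texture,
# i.e. it would correspond to "srgb_texture" in the new scheme.
LEGACY_ALIASES = {
    "Input - Texture - sRGB - sRGB": "srgb_texture",
}

def describe(color_space: str) -> str:
    """Translate a colorSpace string into its transfer/primaries reading."""
    token = LEGACY_ALIASES.get(color_space, color_space)
    transfer, primaries = COLOR_SPACES[token]
    return f"{token}: {transfer} transfer, {primaries} primaries"

print(describe("Input - Texture - sRGB - sRGB"))
```

Under that reading, "lin_srgb" is linear data in sRGB primaries, not gamma-encoded sRGB, which is where my confusion came from.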
I tried to train CreateML on some of my USDZs.
For some, the viewport shows the object's dimensions and enables the Train button.
Many others, including the official USDZ of the iPhone 15 Pro (iphone_15_pro_blue_titanium_5G.usdz), do not show the dimensions and keep [Train] disabled.
What do I have to author inside the USDZ to make it work?
There are no lights or cameras in the asset file, as recommended in the release notes, and no animations either.
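Since the viewport fails to show dimensions, my working assumption is that real-world scale metadata (metersPerUnit, upAxis, defaultPrim) is involved, though the docs don't say so. A quick pure-Python sketch to see what stage metadata an archive actually declares (ASCII .usda layers only; which fields matter is my guess):

```python
import re
import zipfile

def stage_metadata(usdz_path: str) -> dict:
    """Peek at the stage metadata of the default layer in a USDZ archive.

    Only works when the default (first) layer is ASCII .usda; a binary
    .usdc layer needs the pxr (OpenUSD) module instead. metersPerUnit,
    upAxis and defaultPrim are the fields I *suspect* CreateML uses to
    derive real-world dimensions -- an assumption, not documented.
    """
    with zipfile.ZipFile(usdz_path) as z:
        first = z.namelist()[0]  # per the usdz spec, the default layer comes first
        if not first.endswith(".usda"):
            raise ValueError(f"{first} is not ASCII USD; inspect it with pxr.Usd instead")
        text = z.read(first).decode("utf-8")
    meta = {}
    for key in ("metersPerUnit", "upAxis", "defaultPrim"):
        m = re.search(rf'{key}\s*=\s*("?[^\s")]+"?)', text)
        meta[key] = m.group(1).strip('"') if m else None
    return meta
```

Comparing the output for a USDZ that enables [Train] against one that keeps it disabled might narrow down the missing authoring step.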
For years, the preliminary behaviours provided a way to trigger an action sequence (now called a timeline) when the user came close to an object.
I could not find an equivalent in the new Reality Composer Pro.
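For reference, in the preliminary AR schemas such a trigger was plain USD. A sketch from memory of roughly what the old exports contained (prim types and token ids as I recall them from the preliminary schemas; the distance attribute name is an assumption, so treat this as illustrative, not authoritative):

```usda
def Preliminary_Behavior "OnApproach"
{
    rel triggers = </OnApproach/Trigger>
    rel actions = </OnApproach/Action>

    def Preliminary_Trigger "Trigger"
    {
        uniform token info:id = "ProximityToCamera"
        # distance at which the behaviour fires (attribute name assumed)
        float distance = 0.5
    }

    def Preliminary_Action "Action"
    {
        uniform token info:id = "Visibility"
        # ... action payload elided
    }
}
```

An equivalent trigger type for timelines in Reality Composer Pro is what I'm looking for.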