I'm using the code below to show an image in a .usdz picture-frame model:
var imageMaterial = UnlitMaterial()
imageMaterial.color = .init(tint: .white, texture: .init(texture))
modelEntity.model?.materials = [imageMaterial] // replaces every material slot on the model
anchorEntity.addChild(modelEntity)
The issue is that the image also appears on the frame's border, but we need it to show only inside the frame, not on the border.
frame.usdz file
frame after adding image
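A likely cause: a .usdz model usually has one material slot per mesh part, and assigning a single-element array applies that texture to every part, border included. A minimal sketch of replacing only the picture-surface slot (the index `pictureSlotIndex` is an assumption — inspect the model, e.g. by printing `materials.count` or opening it in Reality Composer, to find the right one):

```swift
var imageMaterial = UnlitMaterial()
imageMaterial.color = .init(tint: .white, texture: .init(texture))

// Hypothetical index of the inner picture part; check your model's slots.
let pictureSlotIndex = 1
if var model = modelEntity.model,
   model.materials.indices.contains(pictureSlotIndex) {
    // Swap only that slot; the border keeps its original material.
    model.materials[pictureSlotIndex] = imageMaterial
    modelEntity.model = model
}
```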
I'm using ReplayKit's RPScreenRecorder for screen recording, but it records the whole screen, which is not my requirement — I need to record only the ARView area.
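As far as I know, RPScreenRecorder has no public API to restrict capture to one view. One workaround (a sketch, not a drop-in solution) is to skip ReplayKit and write video yourself with AVAssetWriter, feeding it each ARFrame's camera buffer. Caveat: `capturedImage` is the raw camera image only and does not include RealityKit's rendered virtual content; if you need the composited result, crop ReplayKit's sample buffers to the ARView's rect instead.

```swift
import ARKit
import AVFoundation

final class ARFrameRecorder: NSObject, ARSessionDelegate {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var started = false

    init(outputURL: URL, size: CGSize) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: size.width,
            AVVideoHeightKey: size.height,
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input, sourcePixelBufferAttributes: nil)
        writer.add(input)
    }

    // Set arView.session.delegate to this recorder to receive frames.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let time = CMTime(seconds: frame.timestamp, preferredTimescale: 600)
        if !started {
            writer.startWriting()
            writer.startSession(atSourceTime: time)
            started = true
        }
        if input.isReadyForMoreMediaData {
            // Camera image only — no virtual content.
            adaptor.append(frame.capturedImage, withPresentationTime: time)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```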
In my project, I have a requirement to show feature points (circle dots based on rawFeaturePoints) in AR so the user can be confident about the scan accuracy of their surroundings.
I've seen Android do the same with the ARCore API, and my client is asking for the same in the iOS app. From my research, every frame exposes rawFeaturePoints, but how can I show them in a RealityKit ARView?
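One approach is to render each point as a small sphere from the session delegate. A sketch, assuming `pointsAnchor` is an AnchorEntity you've already added to `arView.scene` (that name is mine, not an API). Recreating entities every frame is expensive; a production version should reuse a pool of entities.

```swift
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let points = frame.rawFeaturePoints?.points else { return }
    pointsAnchor.children.removeAll()              // clear last frame's dots
    let mesh = MeshResource.generateSphere(radius: 0.004)
    let material = UnlitMaterial(color: .yellow)
    for p in points.prefix(200) {                  // cap the count for frame rate
        let dot = ModelEntity(mesh: mesh, materials: [material])
        dot.position = p                           // rawFeaturePoints are in world space
        pointsAnchor.addChild(dot)
    }
}
```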
My requirement is to show an image in a Material and play audio. The user can place any number of images, each with its own audio, and when he/she taps an image I have to play that image's audio file.
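One way is to attach the audio to each placed entity as a custom component, then resolve the tapped entity with a hit test. A sketch — `AudioInfoComponent`, `placeImage`, and the file name are my assumptions, and `AudioInfoComponent.registerComponent()` must be called once at launch:

```swift
struct AudioInfoComponent: Component {
    let audio: AudioFileResource
}

func placeImage(texture: TextureResource, audioNamed name: String,
                under anchor: AnchorEntity) throws {
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    let plane = ModelEntity(mesh: .generatePlane(width: 0.3, height: 0.3),
                            materials: [material])
    plane.generateCollisionShapes(recursive: true) // needed so taps can hit it
    plane.components.set(AudioInfoComponent(audio: try AudioFileResource.load(named: name)))
    anchor.addChild(plane)
}

@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let location = gesture.location(in: arView)
    if let entity = arView.entity(at: location),
       let info = entity.components[AudioInfoComponent.self] {
        entity.playAudio(info.audio)               // spatial audio from the entity
    }
}
```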
I have a requirement to hold some data in a ModelEntity/Entity and show it as info when the user taps that entity. Currently I'm doing this by stuffing the data into the name property, because I haven't found any property meant for storing data. It's not a good approach, and it breaks in some cases: the 3D model has its own name, which overwrites my stored data, and the user then gets the wrong info.
So what is the proper way to store data in a ModelEntity/Entity?
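Rather than abusing `name`, RealityKit's intended mechanism for per-entity state is a custom Component. A sketch with an assumed `InfoComponent` type and example field names:

```swift
// Define your own component type to carry arbitrary data.
struct InfoComponent: Component {
    var title: String
    var details: String
}

// Register once, e.g. at app launch.
InfoComponent.registerComponent()

// Store data on the entity (the model's own name is untouched):
modelEntity.components.set(InfoComponent(title: "Chair", details: "Oak, 1950s"))

// Read it back when the entity is tapped:
if let info = modelEntity.components[InfoComponent.self] {
    showInfoPopup(info.title, info.details)   // hypothetical UI helper
}
```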