Hi all,
I'm trying to create a simple 3D application using RealityKit, but for some reason in my Xcode project this code yields an error:
let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: true)
saying:
Cannot infer contextual base in reference to member 'nonAR'
Extra arguments at positions #2, #3 in call
Basically, the project suddenly won't accept this way of initializing ARView. This is confusing, because I tried it in Swift Playgrounds and it works fine!
I've tried updating Xcode, doing a clean install, and even trying on several MacBooks, but it's still the same. I've looked at the documentation and there doesn't seem to be any information about this initializer being deprecated or changed.
So here's the funny thing: this code totally works in Swift Playgrounds for Mac, but it doesn't work in a playground in Xcode:
import PlaygroundSupport
import RealityKit

let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: true)

// Point light
let point = PointLight()
point.light.intensity = 1000
let lightAnchor = AnchorEntity(world: [0, 0, 0])
lightAnchor.addChild(point)
arView.scene.addAnchor(lightAnchor)

// White sphere
let sphereMesh = MeshResource.generateSphere(radius: 0.5)
let material = SimpleMaterial(color: .white, isMetallic: false)
let entity = ModelEntity(mesh: sphereMesh, materials: [material])
let anchor = AnchorEntity(world: [0, 0.25, 0])
anchor.addChild(entity)
arView.scene.addAnchor(anchor)

// Camera looking at the scene
let camera = PerspectiveCamera()
let camAnchor = AnchorEntity(world: [0, 0, 5])
camAnchor.addChild(camera)
arView.scene.addAnchor(camAnchor)

PlaygroundPage.current.setLiveView(arView)
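The only workaround I can think of in the Xcode project is to guard that one line by platform, in case the three-argument initializer is only available where ARKit is (that's just my guess, I haven't confirmed it anywhere):
#if canImport(ARKit)
// Targets that have ARKit: the full initializer with an explicit camera mode
let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: true)
#else
// Everything else: the plain initializer
let arView = ARView(frame: .zero)
#endif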
Help?? Any idea?
I have an .arobject file that's already been tested and works perfectly in Reality Composer as an anchor. But for whatever reason, when I try it either as an AR Resource Group in my asset catalog or by loading it directly from its URL, it always fails (returns nil). I've double-checked all the file and group names and they seem fine; I couldn't find the error, it's just always nil.
This is my code:
var referenceObject: ARReferenceObject?

// First, try loading from the AR Resource Group in the asset catalog
if let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "TestAR", bundle: Bundle.main) {
    referenceObject = referenceObjects[referenceObjects.startIndex]
}

if let referenceObject = referenceObject {
    delegate.didFinishScan(referenceObject, false)
} else {
    // Otherwise, fall back to loading the .arobject file directly from the bundle
    do {
        if let url = Bundle.main.url(forResource: "Dragonball1", withExtension: "arobject") {
            referenceObject = try ARReferenceObject(archiveURL: url)
        }
    } catch let myError {
        let error = myError as NSError
        print("try \(error.code)")
    }
}
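The only extra sanity check I can think of is dumping the bundle to confirm the .arobject file is actually copied into the build (just a sketch):
// Does the .arobject actually end up in the app bundle?
let arobjectPaths = Bundle.main.paths(forResourcesOfType: "arobject", inDirectory: nil)
print("arobject files in bundle: \(arobjectPaths)")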
Any idea? Thanks
I'm stuck trying to debug my shader because it says "Unable to create shader debug session" (something about my Metal build settings not having debugging information), and I've tried to follow this: developer.apple.com/documentation/xcode/…
I've set my Metal build setting to produce debugging information ("Yes, include source code") as instructed.
Is it because I'm building an iOS app instead of a macOS one?
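For reference, this is the setting I believe I've changed, as it would appear in an xcconfig (assuming MTL_ENABLE_DEBUG_INFO is the setting behind that build option; I'm not sure anything else is required):
// Metal Compiler - Build Options > "Produce debugging information" = "Yes, include source code"
MTL_ENABLE_DEBUG_INFO = INCLUDE_SOURCE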
I have a simple material with a texture that has transparency baked in (a PNG file). This is the code I use to load it:
var triMat = SimpleMaterial(color: .orange, isMetallic: false)
triMat.color = SimpleMaterial.BaseColor( tint: .white.withAlphaComponent(0.9), texture: MaterialParameters.Texture(try! .load(named: "whity_4x4")))
self.components[ModelComponent.self] = try! ModelComponent(mesh: .generate(from: [meshDesc]), materials: [triMat])
It's pretty straightforward and it works fine: the geometry is a plane, and it shows the texture correctly with transparency.
But when I try to use a custom surface shader (I want to animate the UVs so the texture looks like it scrolls), the transparent areas become black. This is the part where I sample the texture for color and opacity:
auto color = params.textures().base_color().sample(textureSampler, float2(u1,v1));
auto opacity = params.textures().opacity().sample(textureSampler, float2(u1,v1)).r;
and this is where I set the surface color and opacity:
params.surface().set_base_color(color.rgb);
params.surface().set_opacity(opacity);
Upon debugging, it seems the opacity value is wrong (always 1), which is why it doesn't show as transparent. Did I do something wrong when sampling the texture's opacity? From what I can tell in the Metal API documentation, that's how you're supposed to do it?
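For context, this is roughly how I swap in the custom material built from that same SimpleMaterial (a sketch; "uvScrollSurface" is just the name of my surface shader function, and I'm not 100% sure these are the exact initializers that matter here):
import Metal
import RealityKit

let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let surfaceShader = CustomMaterial.SurfaceShader(named: "uvScrollSurface", in: library)

// My guess: the PNG's alpha only lives in the base color texture, so maybe the
// separate opacity texture slot is empty and sampling it always returns 1?
let customMat = try! CustomMaterial(from: triMat, surfaceShader: surfaceShader)
self.components[ModelComponent.self] = try! ModelComponent(mesh: .generate(from: [meshDesc]), materials: [customMat])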
Thanks for any help
I use a simple Transform and the move method to animate my entity. Something like this:
let transform = Transform(scale: .one,
                          rotation: simd_quatf(angle: .pi, axis: SIMD3(x: 0, y: 0, z: 1)),
                          translation: .zero)
myEntity.move(to: transform, relativeTo: myEntity, duration: 1)
All is well, but when I try to rotate by any more than 180 degrees, the rotation stays still.
How do I animate something that needs to turn 360 degrees?
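To show what I mean, this is the kind of full turn I'm after (a sketch); with an angle of 2 * .pi the quaternion describes the same orientation as not rotating at all, so nothing visibly moves:
// A full 360-degree turn expressed as a single target transform does nothing,
// presumably because the target orientation equals the current one.
let fullTurn = Transform(scale: .one,
                         rotation: simd_quatf(angle: 2 * .pi, axis: SIMD3(x: 0, y: 0, z: 1)),
                         translation: .zero)
myEntity.move(to: fullTurn, relativeTo: myEntity, duration: 1)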
Thanks
So I have this simple test code to see how I can communicate between two views using @Binding.
It's supposed to be simple: there's a list of buttons, and when one is pressed it opens a view showing which value was picked.
struct SimpleButtonView: View {
    @Binding var bindingValueBool: Bool
    @Binding var bindingValueInt: Int
    var myNumber: Int

    var body: some View {
        Button(action: {
            // Change the values
            bindingValueBool = true
            bindingValueInt = myNumber
        }, label: {
            Text("Press me \(myNumber)")
        })
    }
}

struct TestView: View {
    @State var bindingValueBool: Bool = false
    @State var bindingValueInt: Int = 0

    var body: some View {
        ForEach(1...10, id: \.self) {
            SimpleButtonView(bindingValueBool: $bindingValueBool, bindingValueInt: $bindingValueInt, myNumber: $0)
        }.sheet(isPresented: $bindingValueBool, content: {
            // This should show the number selected?
            // How come it's sometimes correct and sometimes shows the default value 0,
            // as if bindingValueInt hasn't got the updated value yet?
            Text("This is the selected number \(bindingValueInt)")
        })
    }
}
Pretty straightforward, right? I thought I did it according to what I understood about property wrappers and SwiftUI, but the result is weird. It seems bindingValueInt doesn't update properly.
I need to tap two different buttons to make it work, as if the first tap and the first update to those binding properties don't get propagated to the parent view.
Which is weird, because bindingValueBool always changes (it never fails to show the new sheet); it's just the other binding property that somehow stays at its default until I tap a different button.
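The only workaround I've come up with is switching to sheet(item:) with a small Identifiable wrapper (a sketch, the type and names are mine), but I'd still like to understand why the @Binding version behaves the way it does:
import SwiftUI

struct PickedNumber: Identifiable {
    let id = UUID()
    let value: Int
}

struct TestViewAlternative: View {
    @State private var picked: PickedNumber?

    var body: some View {
        ForEach(1...10, id: \.self) { number in
            Button("Press me \(number)") {
                picked = PickedNumber(value: number)
            }
        }
        // sheet(item:) hands the selected value straight to the sheet content,
        // so there's no separate Bool/Int pair that has to stay in sync.
        .sheet(item: $picked) { picked in
            Text("This is the selected number \(picked.value)")
        }
    }
}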
Can someone help me to understand this?
Thanks
Hi, I'm doing research on AR using a real-world object as the anchor. For this I'm using ARKit's ability to scan and detect 3D objects.
Here's what I've found so far after scanning and testing object detection on many objects:
It works best on 8-10" objects (basically objects you can place on your desk)
It works best on objects that have many visual features/details (which makes sense, just like plane detection)
Although things seem to work well and this is exactly the behavior I need, I noticed detection issues with:
Different lighting setups, meaning just the direction of the light. I always try to maintain bright room lighting, but I noticed that testing in the morning versus in the evening sometimes, if not most of the time, makes detection harder or makes it fail.
Different environments, meaning simply moving the object from one place to another makes detection fail or become harder (it takes a significant amount of time to detect it). This isn't about the scanning process; this is purely anchor detection from the same .arobject file on the same real-world object, with the stock detection setup sketched below.
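For clarity, the detection setup itself is the standard one, nothing custom (a sketch; "TestAR" is my resource group and arView is my ARView):
import ARKit

// Standard object detection: load the scanned reference objects and run world tracking.
let configuration = ARWorldTrackingConfiguration()
configuration.detectionObjects = ARReferenceObject.referenceObjects(inGroupNamed: "TestAR", bundle: nil) ?? []
arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])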
These two difficulties make me wonder whether scanning and detecting 3D objects will ever be reliable enough for real-world use cases. For example, say you want to ship an AR app containing the manual for your product, where the app detects the product and points out the location and explanation of its features.
Has anyone tried this before? Does your research show the same behavior as mine? Would using LiDAR help with scanning and detection accuracy?
So far there doesn't seem to be much information on what ARKit actually does when scanning and detecting; if anyone knows more, maybe I can learn how to make better scans.
Any help or information on this matter that any of you are willing to share would be really appreciated.
Thanks
I just updated my Xcode since 15 was released today (I'd been using the 15 beta for the last couple of months), and this part of the code doesn't compile anymore:
let anchor = AnchorEntity(anchor: objectAnchor)
The error says "no exact match in call to initializer". It seems like it no longer accepts an ARAnchor parameter, or it's only expecting an AnchoringComponent.Target.
The code was okay and ran well before the update, and I'd been using the 15 beta to write and test my current code.
I checked the documentation and it seems that initializing an AnchorEntity from an existing ARAnchor hasn't been deprecated:
https://developer.apple.com/documentation/realitykit/anchorentity/init(anchor:)
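The only fallback I can think of while this doesn't compile is pinning the entity to the anchor's current transform instead (a sketch; it's a static world anchor, so it wouldn't track the detected object the way AnchorEntity(anchor:) does):
// Fallback idea: use the detected anchor's transform as a fixed world anchor.
let anchor = AnchorEntity(world: objectAnchor.transform)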
What's happening here?