Hi,
I'm developing an AR app using Apple's ARKit.
At the moment, my AugmentedView uses the boilerplate that Apple provides with the Augmented Reality App project template.
When I run the app in debug mode, I see this warning in the console:
ARSession is being deallocated without being paused. Please pause running sessions explicitly.
Is it possible to pause an ARSession in SwiftUI? As far as I know, all of this is managed by the OS by default. Is that correct?
Note that I have two views: a parent view and, reached via a NavigationStack / NavigationLink, a child "subView".
Here is my code snippet for the parent View:
import SwiftUI

struct ArIntroView: View {
    var body: some View {
        NavigationStack {
            NavigationLink(destination: AugmentedView(), label: {
                HStack {
                    Text("Go to ARView")
                }
                .padding()
            })
        }
    }
}

struct ArIntroView_Previews: PreviewProvider {
    static var previews: some View {
        ArIntroView()
    }
}
Here is my code for the child View:
import SwiftUI
import RealityKit

struct AugmentedView: View {
    @State private var showingSheet = false
    // add state to try explicitly ending AR
    // @State private var isExitTriggered: Bool = false

    var body: some View {
        ZStack {
            ARViewContainer()
                // to hide toolbar in View
                .toolbar(.hidden, for: .tabBar)
                .ignoresSafeArea(.all)
            VStack {
                Spacer()
                HStack {
                    Button {
                        // toggle bottom sheet
                        showingSheet.toggle()
                    } label: {
                        Text("Menu")
                            .frame(maxWidth: .infinity)
                            .padding()
                    }
                    .background()
                    .foregroundColor(.black)
                    .cornerRadius(10)
                    // present the bottom sheet
                    .sheet(isPresented: $showingSheet) {
                        HStack {
                            Button {
                                // dismiss bottom sheet
                                showingSheet.toggle()
                            } label: {
                                Label("Exit", systemImage: "cross")
                            }
                        }
                        .presentationDetents([.medium])
                        .presentationDragIndicator(.visible)
                    }
                }
                .padding()
            }
            Spacer()
        }
    }
}
struct ARViewContainer: UIViewRepresentable {
    // @Binding var isExitTriggered: Bool
    let arView = ARView(frame: .zero)

    func makeUIView(context: Context) -> ARView {
        // Load the "Box" scene from the "Experience" Reality file
        let boxAnchor = try! Experience.loadBox()

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
        return arView
    }

    // This doesn't work: it removes the view, and SwiftUI then re-creates it via makeUIView
    func updateUIView(_ uiView: ARView, context: Context) {
        // if isExitTriggered {
        //     uiView.session.pause()
        //     uiView.removeFromSuperview()
        // }
    }
}

#if DEBUG
struct Augmented_Previews: PreviewProvider {
    static var previews: some View {
        AugmentedView()
    }
}
#endif
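One thing I'm considering: UIViewRepresentable has a static dismantleUIView(_:coordinator:) method that SwiftUI calls when it tears the view down, so maybe that's the place to pause the session. A rough sketch of the idea (ARViewContainerPausing is just a hypothetical rename of my container, and the ARView is created inside makeUIView instead of being stored as a property):

import SwiftUI
import RealityKit

struct ARViewContainerPausing: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        // Create the view here instead of storing it as a property,
        // so SwiftUI owns its lifecycle.
        let arView = ARView(frame: .zero)
        let boxAnchor = try! Experience.loadBox()
        arView.scene.anchors.append(boxAnchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    // Called when SwiftUI removes the view, e.g. when navigating back.
    // Pausing here should address the "deallocated without being paused" warning.
    static func dismantleUIView(_ uiView: ARView, coordinator: ()) {
        uiView.session.pause()
    }
}

Is that the right place to do it, or is there a more idiomatic SwiftUI approach?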
In addition, when I run the app and check performance with Apple Instruments, I see memory leaks during the AR session, apparently related to the CoreRE library; here is the snapshot from Instruments:
Any suggestion or critique is welcome.
I'm trying to scan a real-world object with Apple's ARKit Scanner.
Sometimes the scan is not perfect, so I'm wondering whether I can obtain an .arobject in other ways, for example with other scanning apps, and then merge all the scans into one single, more accurate scan. I know that merging is possible: during an ARKit Scanner session, the app asks whether I want to merge multiple scans, and in that case I can select a previous scan from the Files app. In this context, I would like to add scans from other sources.
Is that possible? And if so, are there other ways to obtain an .arobject, and is merging a practical way to improve the quality of object detection?
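For what it's worth, I noticed that ARKit exposes ARReferenceObject with init(archiveURL:) and merging(_:), so I imagine scans could also be combined in code. A minimal sketch, where "scanA.arobject" and "scanB.arobject" are placeholder file names in the app's Documents directory:

import ARKit

// Load two saved scans, merge them, and export the combined .arobject.
func mergeScans() throws -> ARReferenceObject {
    let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]

    // Load each scan from disk ("scanA"/"scanB" are placeholder names).
    let scanA = try ARReferenceObject(archiveURL: docs.appendingPathComponent("scanA.arobject"))
    let scanB = try ARReferenceObject(archiveURL: docs.appendingPathComponent("scanB.arobject"))

    // Merging can throw if the two scans cannot be aligned with each other.
    let merged = try scanA.merging(scanB)

    // Write the combined scan out as a new .arobject file.
    try merged.export(to: docs.appendingPathComponent("merged.arobject"), previewImage: nil)
    return merged
}

If that's right, could an .arobject produced by another app be merged the same way?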
Thanks
Hi,
I've implemented an ARKit app that displays a .usdz object in the real world. In this scenario, the placement is done via image recognition (a Reality Composer scene).
Obviously, when the image (QR marker) is not visible, the app cannot detect the anchor and will not place the object in the real world.
Is it possible to recognize an image (QR marker), place the object on it, and then leave the object there?
So basically:
detect the marker
place the object
leave the object there, no longer depending on image (marker) recognition (see the sketch below)
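To make the idea concrete, here is a rough sketch of what I have in mind, based on my understanding that a RealityKit AnchorEntity can be pinned to a fixed world transform. ImagePlacementDelegate and "myModel" are hypothetical names, not from my actual project:

import ARKit
import RealityKit

// Hypothetical session delegate: when the image anchor is first detected,
// attach the model to a world anchor at the image's pose, so the model
// stays in place even when the marker leaves the camera frame.
final class ImagePlacementDelegate: NSObject, ARSessionDelegate {
    weak var arView: ARView?
    private var hasPlacedObject = false

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard !hasPlacedObject,
              let imageAnchor = anchors.compactMap({ $0 as? ARImageAnchor }).first,
              let arView = arView else { return }

        // Anchor at the image's current world transform, not at the image itself.
        let worldAnchor = AnchorEntity(world: imageAnchor.transform)
        if let model = try? Entity.load(named: "myModel") { // placeholder asset name
            worldAnchor.addChild(model)
        }
        arView.scene.addAnchor(worldAnchor)
        hasPlacedObject = true
    }
}

Is something along these lines the intended approach?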
Thanks
I was using Reality Composer for my AR App.
Today I'm receiving this error:
Unable to export Reality Composer project: Object anchor scenes require an object to recognize
Any advice?
Thanks
Hi,
I'm working on an AR app. With Reality Composer and Xcode 14, I was triggering custom behaviors with just:
myScene.notifications.myBox.post()
where myScene came from:
let myScene = try! Experience.loadBox()
Now, in Xcode 15, I don't have an Experience class; instead, with the .reality file, I have to use entities, so:
let objectAR = try! Entity.load(named: "myProject.reality")
How can I trigger the custom behavior I previously exported from Reality Composer on that entity?
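From digging into the Experience.swift that older Xcode versions generated, it looks like those notification triggers were fired with a plain NotificationCenter post, so I'm wondering whether something like this sketch would still work. The notification name and userInfo keys below are my assumption, copied from that generated code, and "myBox" is the trigger identifier I set in Reality Composer:

import Foundation
import RealityKit

// Post the notification that a Reality Composer "notification" trigger
// listens for; the scene must be the one containing the loaded entity.
// (Name and keys assumed from the old auto-generated Experience.swift.)
func postNotificationTrigger(named identifier: String, in scene: RealityKit.Scene) {
    NotificationCenter.default.post(
        name: NSNotification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}

// Usage sketch, after adding objectAR to the ARView's scene:
// let anchor = AnchorEntity()
// anchor.addChild(objectAR)
// arView.scene.addAnchor(anchor)
// postNotificationTrigger(named: "myBox", in: arView.scene)

Can anyone confirm whether this still works with .reality files in Xcode 15?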