Hello, I'm setting up an AR view using scene anchors from Reality Composer. The scenes load fine the first time I enter the AR view, but when I go back to the previous screen and re-enter it, the app crashes before any of the scenes appear. I've tried pausing and resuming the session (sketch at the end of this post) and am still getting the following error:
validateFunctionArguments:3536: failed assertion `Fragment Function(fsRenderShadowReceiverPlane): incorrect type of texture (MTLTextureTypeCube) bound at texture binding at index 0 (expect MTLTextureType2D) for projectiveShadowMapTexture[0].'
Any help would be very much appreciated.
Thanks
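For reference, the pause/resume pattern I tried looks roughly like this. It's a minimal sketch assuming a UIKit-hosted ARView; the configuration type is a stand-in for whatever the Reality Composer scene actually runs with.
import UIKit
import RealityKit
import ARKit

class ARHostViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Stop the session when leaving the screen.
        arView.session.pause()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Re-run the session on re-entry, discarding stale state.
        let config = ARWorldTrackingConfiguration() // stand-in configuration
        arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
    }
}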
Hi all,
I have an .obj model which I converted using Reality Converter.
When I try to load it in Safari Quick Look, an error pops up: "Object requires a newer version of iOS."
I'm on an iPhone X with iOS 13.6.
Any thoughts?
Thanks
Are there any good tutorials or suggestions for creating models in Blender and exporting them with the associated materials and nodes? Specifically, I'm looking to see whether translucency associated with an object (e.g. a glass bottle) can be exported. I have created a simple cube with a Principled BSDF shader, but the Transmission and IOR settings are not porting over. Any tips or suggestions would be helpful.
Hello,
I'm working on an application that will take measurements of parts of the human body (hand, foot, ...).
Now that LiDAR has been integrated into the iPhone Pro and ARKit 4 is out, I would like to know whether it seems feasible to combine the framework's features, such as model creation and geometry measurement, to get precise measurements of a body part; a sketch of the kind of building block I have in mind is below.
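This is a rough sketch, not a working solution: it measures the straight-line distance between two raycast hits, assuming the two screen points are supplied elsewhere (e.g. from taps).
import UIKit
import RealityKit
import ARKit
import simd

// Measure the real-world distance (in meters) between two screen points,
// using raycasts against estimated surfaces. LiDAR devices return much
// more reliable hits here.
func measuredDistance(in arView: ARView, from pointA: CGPoint, to pointB: CGPoint) -> Float? {
    guard let hitA = arView.raycast(from: pointA, allowing: .estimatedPlane, alignment: .any).first,
          let hitB = arView.raycast(from: pointB, allowing: .estimatedPlane, alignment: .any).first
    else { return nil }
    let a = hitA.worldTransform.columns.3
    let b = hitB.worldTransform.columns.3
    return simd_distance(SIMD3(a.x, a.y, a.z), SIMD3(b.x, b.y, b.z))
}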
Thanks for your help,
Tom
I am developing an app using ARKit 4's ARGeoTrackingConfiguration, following https://developer.apple.com/documentation/arkit/content_anchors/tracking_geographic_locations_in_ar.
I am outside of the US, so my location is not supported. I simulate a supported location in Xcode, but the coaching state always stays at .initializing.
Is there a way to test geotracking apps outside of the US?
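One thing worth checking first: ARKit can report whether geotracking is supported at an arbitrary coordinate, independent of where the device actually is. A minimal sketch (the coordinate below is a placeholder):
import ARKit
import CoreLocation

// Ask ARKit whether geotracking is available at a specific coordinate.
let coordinate = CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194) // placeholder
ARGeoTrackingConfiguration.checkAvailability(at: coordinate) { available, error in
    if let error = error {
        print("Availability check failed: \(error.localizedDescription)")
    }
    print("Geotracking available at coordinate: \(available)")
}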
I'm trying to replay a recorded session, but I keep getting the errors below.
2021-10-02 14:44:57.687259-0700 arZero[11120:1137758] MOVReaderInterface - ERROR - Error Domain=com.apple.videoeng.streamreaderwarning Code=0 "Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'." UserInfo={NSLocalizedDescription=Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'.}
2021-10-02 14:44:58.798050-0700 arZero[11120:1137758] [Session] ARSession <0x111d066d0>: did fail with error: Error Domain=com.apple.arkit.error Code=101 "Required sensor unavailable." UserInfo={NSLocalizedDescription=Required sensor unavailable., NSLocalizedFailureReason=A required sensor is not available on this device.}
The same video works on an iPad Pro, but not on an iPhone 12 Pro or an iPhone 13 Pro. I've tried recording the video with each of the different phones.
Hi,
I am using a world-tracking configuration with userFaceTrackingEnabled turned on. I am also setting the frame semantics to contain scene depth.
With such a combined world (rear) and face (front) configuration, is it possible to access both cameras' depth and color images? A sketch of the configuration I mean is below.
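For reference, a minimal sketch of the configuration I mean, assuming a device that supports both options:
import ARKit

let session = ARSession() // or an ARView's session
let configuration = ARWorldTrackingConfiguration()

// Track the user's face with the front camera while world tracking runs.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}
// Request the LiDAR depth map alongside the rear camera image.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}
session.run(configuration)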
Thank you,
Robert
I'm creating a custom scanning solution for iOS and using the RealityKit Object Capture PhotogrammetrySession API to build a 3D model. I'm finding that the session ignores the depth data I send it and does not build the model to scale. The documentation is a little light on how to format the depth, so I'm wondering if someone could take a look at some example files I send to the PhotogrammetrySession (linked below) and tell me what I'm not doing correctly.
https://drive.google.com/file/d/1-GoeR_KMhX_i7-y8M8ElDRrySasOdoHy/view?usp=sharing
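For what it's worth, my understanding of the API surface is that depth can also be handed to the session directly through PhotogrammetrySample rather than embedded in image files. A minimal sketch, assuming the color and depth buffers are read back from my capture with the depth still in kCVPixelFormatType_DepthFloat32:
import RealityKit
import CoreVideo

// Build a sample that carries the depth map alongside the color image.
func makeSample(id: Int, image: CVPixelBuffer, depth: CVPixelBuffer) -> PhotogrammetrySample {
    var sample = PhotogrammetrySample(id: id, image: image)
    sample.depthDataMap = depth // assumed: DepthFloat32 straight from ARKit
    return sample
}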
Thank you!
I'm using ARKit to collect LiDAR data. I have read the depth map, and its format is kCVPixelFormatType_DepthFloat32. I saved the depth map to a PNG image by converting it to a UIImage, but I found that the PNG depth map is incorrect, since the PNG format only supports 16-bit data.
let context = CIContext()
let ciImage = CIImage(cvPixelBuffer: pixelBuf)
let cgImage = context.createCGImage(ciImage, from: ciImage.extent)
let pngData = UIImage(cgImage: cgImage!).pngData()
So, I have to save the depth map to a TIFF image.
let ciImage = CIImage(cvPixelBuffer: pixelBuf)
do {
    try context.writeTIFFRepresentation(of: ciImage,
                                        to: path,
                                        format: context.workingFormat,
                                        colorSpace: context.workingColorSpace!,
                                        options: [:])
} catch {
    self.showInfo += "Save TIFF failed;"
    print(error)
}
How do I convert the depth map from kCVPixelFormatType_DepthFloat32 to Float16? Is there a correct way to save the depth map as a PNG image?
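The closest I've come is converting the buffer manually. A minimal sketch, assuming an arm64 device (where Swift's Float16 is available), that copies a DepthFloat32 buffer into a new kCVPixelFormatType_DepthFloat16 buffer row by row:
import CoreVideo

func convertToDepthFloat16(_ input: CVPixelBuffer) -> CVPixelBuffer? {
    let width = CVPixelBufferGetWidth(input)
    let height = CVPixelBufferGetHeight(input)

    var maybeOutput: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_DepthFloat16, nil, &maybeOutput)
    guard let output = maybeOutput else { return nil }

    CVPixelBufferLockBaseAddress(input, .readOnly)
    CVPixelBufferLockBaseAddress(output, [])
    defer {
        CVPixelBufferUnlockBaseAddress(input, .readOnly)
        CVPixelBufferUnlockBaseAddress(output, [])
    }

    guard let inBase = CVPixelBufferGetBaseAddress(input),
          let outBase = CVPixelBufferGetBaseAddress(output) else { return nil }
    let inRow = CVPixelBufferGetBytesPerRow(input)
    let outRow = CVPixelBufferGetBytesPerRow(output)

    for y in 0..<height {
        let src = (inBase + y * inRow).assumingMemoryBound(to: Float32.self)
        let dst = (outBase + y * outRow).assumingMemoryBound(to: Float16.self)
        for x in 0..<width {
            dst[x] = Float16(src[x]) // lossy: half precision
        }
    }
    return output
}
As for PNG: it stores 16-bit integers, not half floats, so even Float16 won't round-trip through PNG directly; one common workaround is to quantize depth to millimeters as UInt16 and write a 16-bit grayscale PNG.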
I'm making an app that captures data using ARKit and will ultimately send the images, depth, and gravity to an Object Capture photogrammetry agent. I need to use the depth data to produce a model with correct scale, so from what I understand I need to send the depth file and set the proper EXIF data in the image. Since I'm getting the images and depth from ARKit, I'll need to set the EXIF data manually before saving the images. Unfortunately the documentation on this is a bit light, so would you be able to let me know what EXIF data needs to be set for the photogrammetry to create the model at the proper scale?
If I try to set up my photogrammetry sample with manual metadata like this:
var sample = PhotogrammetrySample(id: id, image: image)
var dict: [String: Any] = [:]
dict["FocalLength"] = 23.551325
dict["PixelWidth"] = 1920
dict["PixelHeight"] = 1440
sample.metadata = dict
I get the following error in the output and depth is ignored:
[Photogrammetry] Can't use FocalLenIn35mmFilm to produce FocalLengthInPixel! Punting...
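Since those keys mirror ImageIO property names, one guess I'm testing is that the session expects ImageIO-style nesting, with the focal-length keys inside an EXIF sub-dictionary. All keys and values below are illustrative assumptions, not confirmed requirements:
import ImageIO

let exif: [String: Any] = [
    kCGImagePropertyExifFocalLength as String: 5.7,       // illustrative value, mm
    kCGImagePropertyExifFocalLenIn35mmFilm as String: 26, // illustrative value
]
let metadata: [String: Any] = [
    kCGImagePropertyPixelWidth as String: 1920,
    kCGImagePropertyPixelHeight as String: 1440,
    kCGImagePropertyExifDictionary as String: exif,
]
sample.metadata = metadata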
Hello!
I am having trouble calculating accurate real-world distances using the camera's returned intrinsic matrix and pixel coordinates/depths captured from the iPhone's LiDAR. For example, in the image below, I set a mug 0.5 m from the phone. The mug is 8.5 cm wide. The intrinsic matrix returned from the phone's AVCameraCalibrationData class has focalx = 1464.9269, focaly = 1464.9269, cx = 960.94916, and cy = 686.3547. Selecting the two pixel locations denoted in the image below, I calculated each one's xyz coordinates using the formula:
x = d * (u - cx) / focalx
y = d * (v - cy) / focaly
z = d
where d is the depth from the appropriate pixel in the depth map (I've verified that both depths were 0.5 m). I then calculate the distance between the two points to get the mug width. This gives me a calculated width of 0.0357 m, or about 3.5 cm, instead of the 8.5 cm I was expecting. What could account for this discrepancy?
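For reference, here is the unprojection as code, together with the one thing I'm double-checking: the intrinsics describe the full-resolution camera image, so pixel locations picked in the much smaller LiDAR depth map (256×192 on iPhone) have to be rescaled to the intrinsics' reference resolution before applying the formula. The 1920×1440 reference size below is an assumption:
import simd

// Back-project a pixel (u, v) with depth d (meters) into camera space.
// (u, v) must be in the same pixel coordinate system as the intrinsics.
func unproject(u: Float, v: Float, d: Float,
               fx: Float, fy: Float, cx: Float, cy: Float) -> SIMD3<Float> {
    SIMD3<Float>(d * (u - cx) / fx,
                 d * (v - cy) / fy,
                 d)
}

// Rescale a point picked in the 256x192 depth map to the assumed
// 1920x1440 resolution the intrinsics were calibrated for.
let scaleX: Float = 1920.0 / 256.0
let scaleY: Float = 1440.0 / 192.0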
Thank you so much for your help!
I've tried converting a glTF to USDZ, but it only puts out a USDZ conversion error. Under Details it only says, "An unexpected error occurred while converting this file to USDZ. Please fix any other errors and try again."
This error just showed up recently. I'm on macOS 12.3.1 and Reality Converter 1.0 (47.1).
Hi
Is it possible to have a RoomCaptureSession and an ARSession together, since we need the feature points? A sketch of what I mean is below.
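To frame the question: RoomCaptureSession exposes the ARSession it runs on, so one approach I'm considering is reading the feature points from that session's frames. A sketch, assuming iOS 16+ and a RoomPlan-capable device:
import RoomPlan
import ARKit

final class ScanCoordinator: NSObject, ARSessionDelegate {
    let captureSession = RoomCaptureSession()

    func start() {
        // RoomCaptureSession runs on top of an ARSession it owns.
        captureSession.arSession.delegate = self
        captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Raw feature points detected in the current frame, if any.
        if let points = frame.rawFeaturePoints {
            print("Feature points this frame: \(points.points.count)")
        }
    }
}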
Hello, I created a simple cube in Blender with two animations: one moves the cube up and down, and the other rotates it in place.
I exported this file in glb format and tried to convert it using Reality Converter; unfortunately, I can only see one animation.
Is there a limitation in Reality Converter? Can I include more than one animation?
The original glb file has both animations inside; as you can see from the screenshot, I checked the file using an online glb viewer and there is no problem, both animations are in.
The converter unfortunately sees only the last one created.
Any reason or explanation? I believe it is a limitation of Reality Converter.
Regards
I have an .arobject file that's already been tested and works perfectly in Reality Composer as an anchor. But for whatever reason, when I try it either as an AR Resource Group in my asset catalog or by loading it directly from a URL, it always fails (returns nil). I've double-checked all the file and group names and they seem fine, but I couldn't find the error; it's just always nil.
This is my code:
var referenceObject: ARReferenceObject?

// First, try to load the object from the "TestAR" AR Resource Group
// in the asset catalog.
if let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "TestAR", bundle: Bundle.main) {
    referenceObject = referenceObjects[referenceObjects.startIndex]
}

if let referenceObject = referenceObject {
    delegate.didFinishScan(referenceObject, false)
} else {
    // Fall back to loading the .arobject file directly from the bundle.
    do {
        if let url = Bundle.main.url(forResource: "Dragonball1", withExtension: "arobject") {
            referenceObject = try ARReferenceObject(archiveURL: url)
        }
    } catch let myError {
        let error = myError as NSError
        print("try \(error.code)")
    }
}
Any idea? Thanks
I have a single FBX model with several animations (idle, walk, run, eat, ...), but after I convert it to USDZ format I can only play one animation. Where can I find the other animations and how can I play them, or does USDZ not support this yet?
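In case it helps to check what actually survived the conversion: RealityKit reports whatever animations a loaded entity has. A minimal sketch; the file name is a placeholder:
import RealityKit

do {
    // Load the converted USDZ and list the animations RealityKit finds.
    let entity = try Entity.loadModel(named: "character") // placeholder name
    print("Animations found: \(entity.availableAnimations.count)")

    // Play one of them (the first here).
    if let first = entity.availableAnimations.first {
        entity.playAnimation(first, transitionDuration: 0.3, startsPaused: false)
    }
} catch {
    print("Load failed: \(error)")
}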
Thank you.
Cyan
Hi, I am not sure what is going on...
I have been working on this model for a while in Reality Composer and had no problem testing it that way; it always worked perfectly.
So I imported the file into a brand-new Xcode project: I created a new AR app using SwiftUI.
I actually did it twice...
And I tested the version Apple ships with the box. In Apple's version, the app appears, but the whole part where it tries to detect planes doesn't show up. So I am confused.
I found a question that mentions the error messages I am getting, but I am not sure how to get around it:
https://developer.apple.com/forums/thread/691882
//
//  ContentView.swift
//  AppToTest-02-14-23
//
//  Created by M on 2/14/23.
//

import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        // Load the first scene from the Reality file.
        //let boxAnchor = try! Experience.loadBox()
        let anchor = try! MyAppToTest.loadFirstScene()

        // Add the anchor to the scene.
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

#if DEBUG
struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
#endif
This is what I get at the bottom, in the Xcode console:
2023-02-14 17:14:53.630477-0500 AppToTest-02-14-23[21446:1307215] Metal GPU Frame Capture Enabled
2023-02-14 17:14:53.631192-0500 AppToTest-02-14-23[21446:1307215] Metal API Validation Enabled
2023-02-14 17:14:54.531766-0500 AppToTest-02-14-23[21446:1307215] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
2023-02-14 17:14:54.716866-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.743580-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.744961-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/drPostAndComposition.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.745988-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.747245-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute0.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.748750-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute1.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.749140-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute2.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761189-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute3.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761611-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute4.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761983-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute5.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.762604-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute6.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.763575-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute7.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.764859-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
2023-02-14 17:14:54.764902-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
2023-02-14 17:14:55.531748-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534559-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534633-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534680-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534733-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534777-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534825-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534871-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534955-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:56.207438-0500 AppToTest-02-14-23[21446:1307383] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [2]
2023-02-14 17:17:15.741931-0500 AppToTest-02-14-23[21446:1307414] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
2023-02-14 17:22:07.075990-0500 AppToTest-02-14-23[21446:1308137] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
I'm looking for the standard ARKit face mesh with blend shapes as a downloadable asset. Is this available?
I've got Sonoma and Xcode 15 with the macOS and iOS SDKs.
I launched Xcode, but there is no way to find Reality Composer Pro in the Open Developer Tool menu or anywhere else.