My RealityKit app uses an ARView with camera mode .nonAR. Later it places a second ARView, with camera mode .ar, on top of the first.
When I apply layout constraints to the second view, the program aborts with the following messages. If both views are of type .ar this doesn't occur; it happens only when the first view is .nonAR and the second is presented over it.
So far I have been unable to reproduce this behavior in a demo program to provide to you, and the original code is complex and proprietary.
Does anyone know what is happening? I've seen other questions concerning this situation, but not under the same circumstances.
2021-12-01 17:59:11.974698-0500 MyApp[10615:6672868] -[MTLTextureDescriptorInternal validateWithDevice:], line 1325: error ‘Texture Descriptor Validation
MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has invalid pixelFormat (0).
’
-[MTLTextureDescriptorInternal validateWithDevice:]:1325: failed assertion `Texture Descriptor Validation
MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has invalid pixelFormat (0).
I don’t understand what the times are that are returned in forecasts.
guard let (current, hourly, daily) = try? await weatherService.weather(for: location, including: .current, .hourly, .daily(startDate: startDate, endDate: endDate)) else { return }
I’m expecting this to be the time to which the forecast applies.
let utc = hourly.forecast[hour].date
When I convert it from UTC to local time I get something unusual for localDateString:
let dateFormatter = DateFormatter()
dateFormatter.timeZone = TimeZone.current
dateFormatter.dateFormat = "ha"
let localDateString = dateFormatter.string(from: utc)
What are these times? Shouldn't the hourly forecast contain times increasing in one-hour increments from startDate? Am I doing something incorrectly?
Thanks
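For reference, a minimal Foundation-only sketch of the conversion I expect to work (a Date is an absolute instant with no time zone attached; the fixed date, zone, and locale here are my own example values, not WeatherKit output):

```swift
import Foundation

// A Date is an absolute instant; the formatter decides which time zone
// it is rendered in. Using a fixed instant rather than a live forecast.
let utc = Date(timeIntervalSince1970: 0) // 1970-01-01 00:00:00 UTC

let formatter = DateFormatter()
formatter.locale = Locale(identifier: "en_US_POSIX") // stable "a" symbol
formatter.timeZone = TimeZone(identifier: "America/New_York")
formatter.dateFormat = "ha"

// 00:00 UTC is 7 PM the previous evening in New York.
print(formatter.string(from: utc)) // "7PM"
```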
Hi, when I import a USDZ from the RoomPlan demo code into Blender, it results in no geometry. Xcode, on the other hand, has no problem with the model, nor does Preview. Has anyone else had this issue? Apparently the forum won't let me upload a model here.
Hi, is it possible to disable scrolling behavior in the SwiftUI List view? I'd like to take advantage of the new grouping features
List(content, children: \.children)
in List and want the list to be part of a larger scrolling view. As it stands I get an embedded scroll view for the list which is not my intent.
Thanks!
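In case it's relevant, iOS 16 adds a scrollDisabled(_:) modifier that may do this; a sketch assuming an iOS 16 deployment target (the Item type and the fixed height are my own illustration):

```swift
import SwiftUI

// Hypothetical hierarchical data type for the List's children: key path.
struct Item: Identifiable {
    let id = UUID()
    let name: String
    var children: [Item]?
}

struct ContentView: View {
    let content = [
        Item(name: "Parent",
             children: [Item(name: "Child", children: nil)])
    ]

    var body: some View {
        ScrollView {
            List(content, children: \.children) { item in
                Text(item.name)
            }
            .scrollDisabled(true) // iOS 16+: stop the List's own scrolling
            .frame(height: 300)   // a List inside ScrollView needs a height
        }
    }
}
```

Before iOS 16 I'm not aware of a supported way to do this in pure SwiftUI.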
Using the code below to fetch data for multiple urls I encounter a number of errors such as:
2021-12-01 20:20:32.090690-0500 foo[63170:6750061] [assertion] Error acquiring assertion: <Error Domain=RBSAssertionErrorDomain Code=2 "Specified target process does not exist" UserInfo={NSLocalizedFailureReason=Specified target process does not exist}>
021-12-01 20:20:32.115662-0500 yerl[63170:6749861] [ProcessSuspension] 0x10baf8cc0 - ProcessAssertion: Failed to acquire RBS assertion 'ConnectionTerminationWatchdog' for process with PID=63200, error: Error Domain=RBSServiceErrorDomain Code=1 "target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit" UserInfo={NSLocalizedFailureReason=target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit}
I've added the following to my entitlements file:
<key>com.apple.security.network.client</key>
<true/>
with no change in the result. I gather that these are errors from a WKWebView but don't know how to resolve them.
@State private var metadataProvider: LPMetadataProvider?
...
metadataProvider?.startFetchingMetadata(for: url) { (linkMetadata, error) in
    guard let linkMetadata = linkMetadata, let imageProvider = linkMetadata.iconProvider else { return }
    imageProvider.loadObject(ofClass: UIImage.self) { (fetchedImage, error) in
        if let error = error {
            print(error.localizedDescription)
            return
        }
        if let uiimage = fetchedImage as? UIImage {
            DispatchQueue.main.async {
                let image = Image(uiImage: uiimage)
                self.image = image
                print("cache: miss \(url.absoluteString)")
                model.set(uiimage, for: url)
            }
        } else {
            print("no image available for \(url.absoluteString)")
        }
    }
}
Thanks in advance.
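One thing I've since noticed in the LinkPresentation documentation: an LPMetadataProvider instance can make only one request, so reusing a single stored provider across multiple URLs is itself a problem. A minimal sketch of creating a fresh provider per URL (the fetchIcon name and completion shape are my own):

```swift
import LinkPresentation
import UIKit

// LPMetadataProvider is single-use: startFetchingMetadata(for:) may be
// called at most once per instance, so make a new provider for each URL.
func fetchIcon(for url: URL, completion: @escaping (UIImage?) -> Void) {
    let provider = LPMetadataProvider()
    provider.startFetchingMetadata(for: url) { metadata, error in
        guard let iconProvider = metadata?.iconProvider else {
            completion(nil)
            return
        }
        iconProvider.loadObject(ofClass: UIImage.self) { image, _ in
            // loadObject completes on a background queue; hop to main.
            DispatchQueue.main.async { completion(image as? UIImage) }
        }
    }
}
```

I can't say whether this also silences the RunningBoard assertion messages.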
Hi, I'm embedding the QLPreviewController in a UIViewControllerRepresentable. When I view .usdz models I don't see the AR/Object selector at the top, nor the sharing button. I have tried presenting modally with a .sheet modifier and had the same result. What do I need to do to get the controls? Thanks, code attached.
Spiff
I'm suddenly getting link errors on the RoomPlan framework. Even the Apple sample code is not building. I'm on 13.0 Beta (22A5311f) with Xcode Version 14.0 beta 3 (14A5270f) on a Mac Studio. Is anyone else seeing this?
dyld[1232]: Symbol not found: _$s8RoomPlan08CapturedA0V6export2toy10Foundation3URLV_tKF
Referenced from: <24F0F645-03CF-3834-BB80-F51E9CAEC10D> /private/var/containers/Bundle/Application/041C1232-EE67-42F3-AE47-F351FE81CE9A/RoomPlanExampleApp.app/RoomPlanExampleApp
Expected in: <FFA44726-2CCB-3BE3-ABDE-26B7F6214D68> /System/Library/Frameworks/RoomPlan.framework/RoomPlan
Symbol not found: _$s8RoomPlan08CapturedA0V6export2toy10Foundation3URLV_tKF
Referenced from: <24F0F645-03CF-3834-BB80-F51E9CAEC10D> /private/var/containers/Bundle/Application/041C1232-EE67-42F3-AE47-F351FE81CE9A/RoomPlanExampleApp.app/RoomPlanExampleApp
Expected in: <FFA44726-2CCB-3BE3-ABDE-26B7F6214D68> /System/Library/Frameworks/RoomPlan.framework/RoomPlan
dyld config: DYLD_LIBRARY_PATH=/usr/lib/system/introspection DYLD_INSERT_LIBRARIES=/usr/lib/libBacktraceRecording.dylib:/usr/lib/libMainThreadChecker.dylib:/usr/lib/libRPAC.dylib:/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib
In the WWDC video the code includes a reference to the visualizer object.
var previewVisualizer: Visualizer!
I'd like to create my own ViewController using RoomCaptureSession and incorporate the visualizer. This doesn't seem to be available in the RoomPlan framework; have I missed something, or has this been removed from the public interface?
Thanks
When using the RoomPlan UI (RoomCaptureView), one obtains the final result using
public func captureView(didPresent processedResult: CapturedRoom, error: Error?)
which then gets exported via
finalResults.export(to: url)
What is the best way to do this if only using RoomCaptureSession?
Should I just keep track of each CapturedRoom coming back in the delegate methods and use the final one?
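For reference, here's a sketch of the flow I assume is intended when driving RoomCaptureSession directly: take the CapturedRoomData from the delegate's didEndWith callback and run it through RoomBuilder (the ScanController class and the output location are my own illustration):

```swift
import RoomPlan
import Foundation

// Sketch: with RoomCaptureSession the final model comes from the
// delegate's didEndWith callback; RoomBuilder turns the raw
// CapturedRoomData into a CapturedRoom suitable for export.
class ScanController: RoomCaptureSessionDelegate {
    let builder = RoomBuilder(options: [.beautifyObjects])

    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData,
                        error: Error?) {
        guard error == nil else { return }
        Task {
            do {
                let finalRoom = try await builder.capturedRoom(from: data)
                let url = FileManager.default.temporaryDirectory
                    .appendingPathComponent("room.usdz")
                try finalRoom.export(to: url)
            } catch {
                print("room processing failed: \(error)")
            }
        }
    }
}
```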
The RoomCaptureView seems to have a coaching controller analogous to the ARCoachingOverlayView. The content is available via
public func captureSession(_ session: RoomCaptureSession, didProvide instruction: RoomCaptureSession.Instruction)
Is the view that presents these instructions available if not using the RoomCaptureView?
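As far as I can tell the instruction view itself isn't public, so I'm sketching a custom mapping from RoomCaptureSession.Instruction to user-facing text for my own coaching label; the wording of the strings is my own:

```swift
import RoomPlan

// Sketch: translate each coaching instruction into text for a custom
// overlay label, returning nil when no guidance is needed.
func text(for instruction: RoomCaptureSession.Instruction) -> String? {
    switch instruction {
    case .normal:           return nil
    case .moveCloseToWall:  return "Move closer to the wall"
    case .moveAwayFromWall: return "Move away from the wall"
    case .slowDown:         return "Slow down"
    case .turnOnLight:      return "Turn on more light"
    case .lowTexture:       return "Point at an area with more detail"
    @unknown default:       return nil
    }
}
```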
I’m implementing my first Entity Component System (ECS) and am having an issue. I require that some component properties be dynamic, and I do not want to create a subclass that conforms to HasExampleComponent, so I took the approach below. My issue is that even though the entity contains the component, I can’t cast the entity to HasExampleComponent.
When I create the entity I set the component like this:
entity.components[ExampleComponent.self] = .init()
I'd appreciate a template for an ECS with component properties that can be updated from the app.
Thanks
public struct ExampleComponent: Component {
    public var value = 0
}

public protocol HasExampleComponent: Entity {
    var value: Int { get set }
}

public class ExampleSystem: System {
    private static let query = EntityQuery(where: .has(ExampleComponent.self))

    public required init(scene: Scene) {}

    public func update(context: SceneUpdateContext) {
        context.scene.performQuery(Self.query).forEach { entity in
            // this won’t work because entity doesn’t conform to HasExampleComponent
            entity.value += 1
        }
    }
}

extension Entity {
    @available(iOS 15.0, *)
    public var value: Int {
        get { components[ExampleComponent.self]?.value ?? 0 }
        set { components[ExampleComponent.self]?.value = newValue }
    }
}
I’m loading a USDZ model using Entity.loadAsync(contentsOf:)
I’d like to get the dimensions of the model and I find that visualBounds(relativeTo: nil).extents returns dimensions larger than the actual dimensions while I see the correct dimensions when viewing the USDZ in Blender or when instantiating it as a MDLAsset(url:). What is the method to get the actual dimensions from an Entity?
Thanks
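For comparison, a sketch of reading both sets of bounds side by side; note that visualBounds reflects the entity's current transform, including any scale (logDimensions is my own helper name):

```swift
import RealityKit
import ModelIO
import Foundation

// Sketch: compare RealityKit's visual bounds (which include the
// entity's current scale) against the raw asset bounds from ModelIO.
func logDimensions(entity: Entity, usdzURL: URL) {
    let extents = entity.visualBounds(relativeTo: nil).extents
    print("RealityKit extents: \(extents), scale: \(entity.scale)")

    let asset = MDLAsset(url: usdzURL)
    let bounds = asset.boundingBox
    print("ModelIO extents: \(bounds.maxBounds - bounds.minBounds)")
}
```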
I'd like to know the location of the pivot point of a ModelEntity, is there any way to get this? Alternatively can I get it from a USDZ file?
I want to place models in a specific location and some models have the pivot in the center while others have it at the bottom. If I can detect this I can adjust accordingly and place them correctly. I don't have control over the models, alas.
Thanks,
Spiff
I'm recreating the ARQuickLook controller in code. One of its behaviors is to move the model to the visible center when entering Obj mode. I've hacked the ARViewContainer of the default Xcode Augmented Reality App to demonstrate what I'm trying to do.
I think that moving the entity to 0,0,0 will generally not do the right thing because the world origin will be elsewhere. What I'm not clear on is how to specify the translation for entity.move() in the code. I'm assuming I'll need to raycast using a CGPoint describing view center to obtain the appropriate translation but I'm not sure about the details. Thanks for any help with this.
struct ARViewContainer: UIViewRepresentable {
    let arView = ARView(frame: .zero)
    let boxAnchor = try! Experience.loadBox()

    func makeUIView(context: Context) -> ARView {
        arView.scene.anchors.append(boxAnchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        DispatchQueue.main.asyncAfter(deadline: .now() + 4) {
            arView.environment.background = .color(.white)
            arView.cameraMode = .nonAR
            if let entity = boxAnchor.children.first {
                let translation = SIMD3<Float>(x: 0, y: 0, z: 0)
                let transform = Transform(scale: .one, rotation: simd_quatf(), translation: translation)
                entity.move(to: transform, relativeTo: nil, duration: 2, timingFunction: .easeInOut)
            }
        }
    }
}
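Here's a sketch of the raycast-based placement I have in mind, using ARView.raycast(from:allowing:alignment:) from the view's center while still in .ar mode (moveToViewCenter is my own helper name, not something from the framework):

```swift
import ARKit
import RealityKit
import UIKit

// Sketch: raycast from the view's center to find a world-space point,
// then animate the entity to it. Only meaningful while cameraMode is
// .ar, so run this before switching to .nonAR.
func moveToViewCenter(_ entity: Entity, in arView: ARView) {
    let center = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
    guard let result = arView.raycast(from: center,
                                      allowing: .estimatedPlane,
                                      alignment: .any).first else { return }
    // Translation lives in the last column of the hit's world transform.
    let t = result.worldTransform.columns.3
    let transform = Transform(scale: .one,
                              rotation: simd_quatf(),
                              translation: SIMD3<Float>(t.x, t.y, t.z))
    entity.move(to: transform, relativeTo: nil,
                duration: 2, timingFunction: .easeInOut)
}
```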
AR Quick Look has two modes:
Object Mode: one can view a model in an empty space with a ground plane and a shadow
AR Mode: one can view the model in an AR context, within a real environment
Does the developer have access to this functionality (moving between camera and non-camera modes)? I'm really asking if the camera can be disabled and reenabled in the same session.
Thanks
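For what it's worth, ARView.cameraMode is a settable property, so toggling within one session might look like the sketch below (setObjectMode is my own helper name, and I haven't verified that the underlying AR session survives the round trip):

```swift
import RealityKit
import UIKit

// Sketch: switch an ARView between a non-camera "object" presentation
// and the normal camera-backed AR presentation.
func setObjectMode(_ arView: ARView, enabled: Bool) {
    if enabled {
        arView.cameraMode = .nonAR
        arView.environment.background = .color(.white)
    } else {
        arView.cameraMode = .ar
        arView.environment.background = .cameraFeed()
    }
}
```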