Posts

Post not yet marked as solved
1 Reply
911 Views
I saw that there is a way to track hands with Vision, but is there also a way to record that movement and export it to FBX? And is there a way to record only one hand, or both at the same time? The implementation will be in SwiftUI.
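For reference, a minimal sketch of the detection side in Vision, assuming frames arrive as CVPixelBuffers (for example from ARFrame.capturedImage): VNDetectHumanHandPoseRequest has a maximumHandCount property, which is how you would limit tracking to a single hand. The recordedPoses array is just a placeholder for whatever storage you use; Vision itself has no FBX export, so that part would need a separate exporter.

import Vision
import CoreVideo

// Sketch: detect hand poses per frame and keep the observations for later export.
final class HandRecorder {
    private let request = VNDetectHumanHandPoseRequest()
    private(set) var recordedPoses: [[VNHumanHandPoseObservation]] = []

    init(trackBothHands: Bool) {
        // maximumHandCount limits detection to one hand or two per frame.
        request.maximumHandCount = trackBothHands ? 2 : 1
    }

    func process(_ pixelBuffer: CVPixelBuffer) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up, options: [:])
        try? handler.perform([request])
        if let observations = request.results {
            recordedPoses.append(observations)
        }
    }
}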
Posted by chaert-s.
Post not yet marked as solved
1 Reply
945 Views
Hello everyone, I want to add FBX capabilities to my app, so I downloaded and installed the FBX SDK for iOS from the Autodesk website. But when it came to setting up the SDK for my Xcode project, the only article I could find was from 2014, and that guide is outdated and doesn't work anymore. I do not know a lot about C or about working with frameworks/APIs, so I need some help getting this set up... Thanks in advance for any help!
Posted by chaert-s.
Post not yet marked as solved
0 Replies
975 Views
Hello all, I am trying to achieve the following: I have an app that lets the user record positional data of the device via an ARKit session, and I would like to also add a way to record depth information using the dual camera setup, as seen in the Taking pictures for photogrammetry example app. The reason for this is that the depth data provided by ARKit's sceneDepth is only 256x192 pixels in resolution, which is not really usable for compositing. Also, it only exports as an 8-bit PNG or JPEG, while the depth information from the dual camera setup appears to be a 32-bit TIFF. The actual question, I guess, is whether it is possible to add a (dual camera) depth configuration to an ARKit session, or whether it is possible to get a higher-resolution depth map from ARKit (for a rear camera experience) and export it as 32-bit, as I haven't been able to figure that out. Any help would be greatly appreciated! : )
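One caveat to keep in mind: ARKit and a second AVCaptureSession generally cannot use the camera at the same time, so a dual-camera depth capture would likely have to happen with the ARSession paused or as a separate capture step; that is an assumption about the approach rather than documented behaviour. For what it's worth, a minimal sketch of the AVFoundation side, along the lines of what the photogrammetry sample does, with depth delivery enabled on an AVCapturePhotoOutput:

import AVFoundation

// Sketch: a separate capture session that can deliver depth data from the dual camera.
// Error handling is minimal; a real implementation would also enable depth on the
// AVCapturePhotoSettings for each individual shot.
func makeDepthCaptureSession() throws -> (AVCaptureSession, AVCapturePhotoOutput) {
    let session = AVCaptureSession()
    session.sessionPreset = .photo

    guard let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back) else {
        throw NSError(domain: "DepthCapture", code: -1)  // this device has no dual camera
    }
    let input = try AVCaptureDeviceInput(device: device)
    let output = AVCapturePhotoOutput()
    guard session.canAddInput(input), session.canAddOutput(output) else {
        throw NSError(domain: "DepthCapture", code: -2)
    }
    session.addInput(input)
    session.addOutput(output)

    // Depth delivery must be enabled on the output before requesting it per photo.
    output.isDepthDataDeliveryEnabled = output.isDepthDataDeliverySupported
    return (session, output)
}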
Posted by chaert-s.
Post not yet marked as solved
0 Replies
611 Views
I am trying to record a video from my camera while an ARView is running, and I was wondering how I could do this, as documentation on doing this with an ARView is sparse to nonexistent. I only need the actual camera feed, not the content that my ARView is rendering on top. I know I can get a CVPixelBuffer for each frame of the ARView, but I don't quite understand how to package these up into a video and save it to the Documents directory. I have this code to manage updates from my ARView in my SwiftUI view:

arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
    if recording {
        // do stuff if the record button is pushed
    }
}.store(in: &subs)

What do I need to do to package up all the CVPixelBuffers as a video once the stop recording button is pushed? I'm totally new to this field, so any help is greatly appreciated!
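One common approach (a sketch, not an ARKit-specific API) is to feed each CVPixelBuffer into an AVAssetWriter through an AVAssetWriterInputPixelBufferAdaptor and finish the writer when recording stops. The sketch below assumes fixed 1920x1080 buffers and a constant frame rate; in practice you would derive the presentation times from the frame timestamps.

import AVFoundation
import CoreVideo

// Sketch: append CVPixelBuffers to an .mp4 file, one frame at a time.
final class PixelBufferRecorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var frameIndex: Int64 = 0

    init(outputURL: URL, width: Int = 1920, height: Int = 1080) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
        writer.add(input)
        guard writer.startWriting() else {
            throw writer.error ?? NSError(domain: "PixelBufferRecorder", code: -1)
        }
        writer.startSession(atSourceTime: .zero)
    }

    func append(_ pixelBuffer: CVPixelBuffer, fps: Int32 = 60) {
        guard input.isReadyForMoreMediaData else { return }
        let time = CMTime(value: frameIndex, timescale: fps)
        if !adaptor.append(pixelBuffer, withPresentationTime: time) {
            print("Failed to append frame \(frameIndex)")
        }
        frameIndex += 1
    }

    // Call this when the stop recording button is pushed.
    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}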
Posted by chaert-s.
Post not yet marked as solved
0 Replies
531 Views
Hey everyone, I want to animate an MDLObject and then export it to a .dae file. Currently, only the object's mesh is exported, without any animation. I could not find any articles online on how to animate objects correctly, so I came here to ask for help. Here is the code I have. First, I set up a camera transform:

var cameraTransform = MDLTransform()

Then I write to the transform:

let transform = frame.camera.transform
let rotation = frame.camera.eulerAngles
let position = transform.columns.3
let elapsedTime = frame.timestamp - firstUpdateTime!
cameraTransform.setTranslation(position[SIMD3(0,1,2)], forTime: elapsedTime)
cameraTransform.setRotation(rotation, forTime: elapsedTime)

And assign the transform to a mesh and a scene:

let object = MDLMesh(boxWithExtent: .init(0.1, 0.1, 0.1),
                     segments: .init(10, 10, 10),
                     inwardNormals: false,
                     geometryType: .triangles,
                     allocator: nil)
object.name = "Camera Transform"
object.transform = ARSessionManager().cameraTransform
let asset = MDLAsset()
asset.add(object)

And lastly, I export the file:

try! asset.export(to: colladaFileURL)

This spits out a file that contains a cube, but no animation. The cameraTransform update runs in the frame update function of my ARSessionDelegate, and when I add a print statement there I also see values being printed (e.g. SIMD3<Float>(-0.522029, -0.40510324, -1.5561341)). Any help would be greatly appreciated!
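A small debugging sketch that may help narrow this down (an assumption about approach, not a known fix): sample the MDLTransform at a few times just before exporting, to confirm that keyframes were actually recorded on the instance the mesh ends up using.

import ModelIO

// Sketch: print a handful of sampled translations/rotations between the transform's
// first and last keyframe times. If minimumTime equals maximumTime, nothing was keyed.
func dumpKeyframes(of transform: MDLTransform, samples: Int = 5) {
    guard transform.maximumTime > transform.minimumTime else {
        print("No keyframes recorded")
        return
    }
    for i in 0...samples {
        let t = transform.minimumTime
            + (transform.maximumTime - transform.minimumTime) * Double(i) / Double(samples)
        print("t=\(t) translation=\(transform.translation(atTime: t)) rotation=\(transform.rotation(atTime: t))")
    }
}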
Posted by chaert-s.
Post not yet marked as solved
0 Replies
563 Views
Hello everyone, This post shows how you can record device motion inside Reality Composer to then use as a "testing" environment for AR experiences. Is there a way to implement such recording in another app? I basically want to create a tool that records the phone's position and rotation over time and then exports it to a 3D file format for later use. Recording the pose as a 4x4 matrix and then manually writing a Collada file has turned out to be quite tedious, and it hasn't been working for me. That is when I came across the article mentioned above. If there were a native ARKit implementation of this that I could use, it would help a lot. Any help is greatly appreciated!
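As far as I know, Reality Composer's motion recording is not exposed as public ARKit API, so the recording side would have to be built by hand. A minimal sketch, assuming an ARSessionDelegate that collects one camera transform per frame:

import ARKit

// Sketch: store (timestamp, camera transform) pairs while recording is on.
final class MotionRecorder: NSObject, ARSessionDelegate {
    struct Keyframe {
        let time: TimeInterval
        let transform: simd_float4x4
    }

    private(set) var keyframes: [Keyframe] = []
    var isRecording = false

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard isRecording else { return }
        keyframes.append(Keyframe(time: frame.timestamp, transform: frame.camera.transform))
    }
}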
Posted by chaert-s.
Post not yet marked as solved
2 Replies
1.6k Views
I would like to integrate a function into my iOS app that allows the user to use his or her device as a virtual camera: they hit record, the motion of the iPhone or iPad is captured, and when they are done, it is saved as an FBX (or similar file format) virtual camera move. Is there a more or less easy way to do this? Thanks in advance!
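There is no system API that writes FBX directly; FBX would need the Autodesk SDK or an offline converter. One pragmatic option (a sketch under that assumption) is to serialise the recorded camera transforms to a simple intermediate format such as JSON and convert it to a camera move in a DCC tool like Blender with a small import script:

import Foundation
import simd

// Sketch: flatten each 4x4 transform to 16 column-major floats and write the whole
// recording as a JSON array of { time, matrix } entries.
struct ExportableKeyframe: Codable {
    let time: TimeInterval
    let matrix: [Float]   // 16 values, column-major
}

func exportKeyframes(_ keyframes: [(time: TimeInterval, transform: simd_float4x4)], to url: URL) throws {
    let rows = keyframes.map { kf -> ExportableKeyframe in
        let c = kf.transform.columns
        let flat = [c.0, c.1, c.2, c.3].flatMap { [$0.x, $0.y, $0.z, $0.w] }
        return ExportableKeyframe(time: kf.time, matrix: flat)
    }
    let data = try JSONEncoder().encode(rows)
    try data.write(to: url)
}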
Posted by chaert-s.
Post not yet marked as solved
0 Replies
652 Views
Hello everyone, a few days ago I coded a file picker struct in UIKit, and it was working fine; the app I implemented it in even ran on my device with no problems. Now, all of a sudden, it gives me an error (Type 'FilePicker' does not conform to protocol 'UIViewControllerRepresentable') and refuses to compile. I changed absolutely nothing in the file... Does anybody know why this is suddenly giving errors?

import SwiftUI
import UIKit

struct FilePicker: UIViewControllerRepresentable {
    @Binding var isShown: Bool
    @Binding var fileURL: URL?

    func makeUIViewController(context: UIViewControllerRepresentableContext<FilePicker>) -> UIDocumentPickerViewController {
        let picker = UIDocumentPickerViewController(documentTypes: ["public.data"], in: .import)
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: UIDocumentPickerViewController, context: UIViewControllerRepresentableContext<FilePicker>) {
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, UIDocumentPickerDelegate {
        var parent: FilePicker

        init(_ parent: FilePicker) {
            self.parent = parent
        }

        func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
            parent.fileURL = urls[0]
            parent.isShown = false
        }

        func documentPickerWasCancelled(_ controller: UIDocumentPickerViewController) {
            parent.isShown = false
        }
    }
}
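For completeness, a usage sketch for presenting the picker from SwiftUI, assuming a sheet is the intended presentation:

import SwiftUI
import Foundation

// Sketch: drive the FilePicker above with two @State values and a sheet.
struct ContentView: View {
    @State private var showPicker = false
    @State private var pickedURL: URL?

    var body: some View {
        Button("Pick a file") { showPicker = true }
            .sheet(isPresented: $showPicker) {
                FilePicker(isShown: $showPicker, fileURL: $pickedURL)
            }
    }
}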
Posted by chaert-s.
Post marked as solved
1 Reply
1.7k Views
Hello y'all, a friend of mine and I are working on a project together and are using Git source control to collaborate. When I commit (and push) new files or changes to pre-existing files, everything works fine and my friend can pull them without a problem. But when he adds new files and then commits and pushes them, my Xcode doesn't recognise that new files have been added after I pull. They show up in Finder in the right place and directory, but they do not show up in Xcode. Changes he makes to pre-existing files show up fine; it's only when new files are added that those changes do not show up. I don't know if I have something configured incorrectly or if it's on his end, but even deleting the project off my laptop and cloning it again from GitHub as a new project results in the exact same problem: the files are all on my drive, just not in my Xcode Project Navigator. Cleaning the build folder also does nothing. Is there a file that keeps track of which files belong to the project that my friend is not committing? That's about the best guess I've got right now... He is on an Intel Mac and I am on an M1 Mac, if that's important. Thanks in advance for any help!
Posted by chaert-s.
Post not yet marked as solved
0 Replies
577 Views
Hello everyone, I am trying to implement a way for users to record images or videos on their iPhone or iPad that are then sent to the Mac for processing with the photogrammetry API (Swift and SwiftUI). I was wondering if I could somehow use AirDrop to quickly and easily send the data to the Mac, but I cannot seem to find an API for that. Is there some other way? I don't want to put the files in the user's iCloud storage, as that would fill up their storage too much, especially if they are on the base 5 GB plan. Thanks in advance for any help! : )
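One option (a sketch, under the assumption that the system share sheet is acceptable) is UIActivityViewController, which offers AirDrop as one of its targets when you pass it file URLs; there is no public API for triggering AirDrop directly.

import SwiftUI
import UIKit

// Sketch: wrap the system share sheet for use from SwiftUI and hand it the recorded files.
struct ShareSheet: UIViewControllerRepresentable {
    let fileURLs: [URL]

    func makeUIViewController(context: Context) -> UIActivityViewController {
        UIActivityViewController(activityItems: fileURLs, applicationActivities: nil)
    }

    func updateUIViewController(_ uiViewController: UIActivityViewController, context: Context) {}
}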
Posted by chaert-s.
Post not yet marked as solved
3 Replies
1.1k Views
Hello all, I want to create an app that can generate SwiftUI views from CSV tables. They have two columns, one with a name and the other with a hex color code (e.g. Apple,#FFFFFF). I want to read the CSV file and then use the first column of each row as the text and the second column of each row as the color of that text. But I cannot seem to get this to work. Currently I read the CSV with:

let filecontent = try String(contentsOf: file)

where file is defined as URL(fileURLWithPath: "colors.csv"). How do I now separate the two columns from each other, and how do I implement this in a ForEach to get this kind of "procedural" item generation?
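A minimal sketch of one way to do this, assuming a two-column CSV with no header row. Note that SwiftUI's Color has no built-in hex initializer, so the small extension below is a hypothetical helper, not part of the framework.

import SwiftUI
import Foundation

struct ColorRow: Identifiable {
    let id = UUID()
    let name: String
    let hex: String
}

// Split the CSV text into lines, then each line into its two fields.
func parseColorCSV(_ contents: String) -> [ColorRow] {
    contents
        .split(whereSeparator: \.isNewline)
        .compactMap { line -> ColorRow? in
            let fields = line.split(separator: ",", maxSplits: 1)
                .map { $0.trimmingCharacters(in: .whitespaces) }
            guard fields.count == 2 else { return nil }
            return ColorRow(name: fields[0], hex: fields[1])
        }
}

extension Color {
    // Hypothetical helper: minimal hex parser for strings like "#FFFFFF" (no alpha handling).
    init(hex: String) {
        let cleaned = hex.hasPrefix("#") ? String(hex.dropFirst()) : hex
        let value = UInt64(cleaned, radix: 16) ?? 0
        self.init(red: Double((value >> 16) & 0xFF) / 255,
                  green: Double((value >> 8) & 0xFF) / 255,
                  blue: Double(value & 0xFF) / 255)
    }
}

struct ColorListView: View {
    let rows: [ColorRow]

    var body: some View {
        // One Text per CSV row, colored by its hex value.
        List(rows) { row in
            Text(row.name)
                .foregroundColor(Color(hex: row.hex))
        }
    }
}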
Posted by chaert-s.