Posts

Post not yet marked as solved
6 Replies
2.7k Views
I tried animating scrollTo() like so, as described in the docs (https://developer.apple.com/documentation/swiftui/scrollviewreader):

```swift
withAnimation {
    scrollProxy.scrollTo(index, anchor: .center)
}
```

The result is the same as if I do:

```swift
withAnimation(Animation.easeIn(duration: 20)) {
    scrollProxy.scrollTo(progress.currentIndex, anchor: .center)
}
```

I also tried this with the example from the ScrollViewReader docs, with the result that scrolling up and down uses exactly the same animation, even though one button uses the default animation and the other a 20-second linear one:

```swift
struct ScrollingView: View {
    @Namespace var topID
    @Namespace var bottomID

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                Button("Scroll to Bottom") {
                    withAnimation {
                        proxy.scrollTo(bottomID)
                    }
                }
                .id(topID)

                VStack(spacing: 0) {
                    ForEach(0..<100) { i in
                        color(fraction: Double(i) / 100)
                            .frame(height: 32)
                    }
                }

                Button("Top") {
                    withAnimation(Animation.linear(duration: 20)) {
                        proxy.scrollTo(topID)
                    }
                }
                .id(bottomID)
            }
        }
    }

    func color(fraction: Double) -> Color {
        Color(red: fraction, green: 1 - fraction, blue: 0.5)
    }
}

struct ScrollingView_Previews: PreviewProvider {
    static var previews: some View {
        ScrollingView()
    }
}
```
Posted by Bersaelor. Last updated.
Post not yet marked as solved
0 Replies
1.1k Views
Today I was surprised that the latest version of Three.js (the browser-based 3D engine) actually supports a transmission property for transparent materials that looks pretty good and is really performant. So far I have only used this material property when rendering in Blender, not in a real-time engine. But if Three.js can run this at 60 fps in the browser, there must be a way of achieving the same in ARKit/SceneKit/RealityKit? I just haven't found anything in the SCNMaterial documentation about transmission, nor anyone mentioning relevant shader modifiers. Especially if we want to present objects with frosted glass effects in AR, this would be super useful. (The Three.js app for reference: https://q2nl8.csb.app/; I learned about this in an article by Kelly Mulligan: https://tympanus.net/codrops/2021/10/27/creating-the-effect-of-transparent-glass-and-plastic-in-three-js/)
Posted by Bersaelor. Last updated.
Post not yet marked as solved
2 Replies
2.5k Views
Hey there,

When I run the following 50 lines of code in release mode, or turn on Optimization in Build Settings > Swift Compiler - Code Generation, I get the crash below. Does anyone have any idea why that happens? (Xcode 13.4.1; happens on device as well as in the simulator, on iOS 15.5 and 15.6.)

Example project: https://github.com/Bersaelor/ResourceCrashMinimalDemo

```
#0 0x000000010265dd58 in assignWithCopy for Resource ()
#1 0x000000010265d73c in outlined init with copy of Resource<VoidPayload, String> ()
#2 0x000000010265d5dc in specialized Resource<>.init(url:method:query:authToken:headers:) [inlined] at /Users/konradfeiler/Source/ResourceCrashMinimalDemo/ResourceCrashMinimalDemo/ContentView.swift:51
#3 0x000000010265d584 in specialized ContentView.crash() at /Users/konradfeiler/Source/ResourceCrashMinimalDemo/ResourceCrashMinimalDemo/ContentView.swift:18
```

Code needed to reproduce:

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        Button(action: { crash() }, label: { Text("Create Resource") })
    }

    /// crashes in `outlined init with copy of Resource<VoidPayload, String>`
    func crash() {
        let testURL = URL(string: "https://www.google.com")!
        let r = Resource<VoidPayload, String>(url: testURL, method: .get, authToken: nil)
        print("r: \(r)")
    }
}

struct VoidPayload {}

enum HTTPMethod<Payload> {
    case get
    case post(Payload)
    case patch(Payload)
}

struct Resource<Payload, Response> {
    let url: URL
    let method: HTTPMethod<Payload>
    let query: [(String, String)]
    let authToken: String?
    let parse: (Data) throws -> Response
}

extension Resource where Response: Decodable {
    init(
        url: URL,
        method: HTTPMethod<Payload>,
        query: [(String, String)] = [],
        authToken: String?,
        headers: [String: String] = [:]
    ) {
        self.url = url
        self.method = method
        self.query = query
        self.authToken = authToken
        self.parse = {
            return try JSONDecoder().decode(Response.self, from: $0)
        }
    }
}
```
Posted by Bersaelor. Last updated.
Post not yet marked as solved
1 Reply
694 Views
So, we use ARKit's ARFaceTrackingConfiguration for a magic-mirror-like experience in our apps, augmenting users' faces with digital content. On the 5th-generation iPad Pro, customers are complaining that the camera image is too wide; I'm assuming that is because of the new wide-angle front camera needed for Apple's Center Stage FaceTime calls. I have looked through "Tracking and Visualizing Faces" and the WWDC 2021 videos, but I must have missed any API that would let us disable the wide-angle behavior on the new iPads programmatically.
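For what it's worth, the only lever I've found so far is picking a different capture format from ARFaceTrackingConfiguration.supportedVideoFormats. This is just a sketch (configureFaceTracking(for:) is a made-up helper, and picking the smallest resolution as a proxy for "least wide" is an assumption; I don't know whether any of the offered formats actually avoids the ultra-wide camera):

```swift
import ARKit

func configureFaceTracking(for session: ARSession) {
    let configuration = ARFaceTrackingConfiguration()

    // List the capture formats this device offers; on the 5th-gen iPad Pro the
    // new front-camera format should show up here alongside the others.
    for format in ARFaceTrackingConfiguration.supportedVideoFormats {
        print("format:", format.imageResolution, "fps:", format.framesPerSecond)
    }

    // Guess: pick the format with the smallest horizontal resolution.
    // Resolution is not the same as field of view, so this may not help at all.
    if let narrowest = ARFaceTrackingConfiguration.supportedVideoFormats
        .min(by: { $0.imageResolution.width < $1.imageResolution.width }) {
        configuration.videoFormat = narrowest
    }

    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```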
Posted by Bersaelor. Last updated.
Post marked as solved
1 Reply
1.1k Views
Working with an M1 MacBook Air on macOS 12.4. Any time I open a new Terminal window or just a new tab, it takes a really long time until I can type. I have commented out my entire ~/.zshrc, and when I run

```
for i in $(seq 1 10); do /usr/bin/time $SHELL -i -c exit; done
```

directly in an already open terminal window, it finishes in 0.1 s. So it must be something macOS is doing before zsh is even starting.

PS: In Activity Monitor I can only see a spike in kernel_task CPU usage when opening a new terminal.
Posted by Bersaelor. Last updated.
Post not yet marked as solved
0 Replies
1k Views
So, I'm using a small Python script that starts with

```python
from pxr import Usd, Sdf, UsdGeom, Kind
```

which used to work fine a few months ago. I got the USD tools from https://developer.apple.com/augmented-reality/tools/ and always had to run the script in a Rosetta terminal (plus making sure the USD tools are in PATH and PYTHONPATH), but it worked. Now, whatever I do, I always get:

```
  File "create_scene.py", line 2, in <module>
    from pxr import Usd, Sdf, UsdGeom, Kind
  File "/Applications/usdpython/USD/lib/python/pxr/Usd/__init__.py", line 24, in <module>
    import _usd
ModuleNotFoundError: No module named '_usd'
```

Does anyone have any clue what this _usd module is about?
Posted by Bersaelor. Last updated.
Post not yet marked as solved
0 Replies
789 Views
I added a basic Hello World SwiftUI view to an existing UIKit project, yet I cannot get the preview to work. The usual error is:

```
MessageSendFailure: Message send failure for send render message to agent

==================================

|  RemoteHumanReadableError: Could not connect to agent
|
|  Bootstrap timeout after 8.0s waiting for connection from 'Identity(pid: 30286, sceneIdentifier: Optional("XcodePreviews-30286-133-static"))' on service com.apple.dt.uv.agent-preview-service
```

Neither this nor the generated report is very helpful. I also created a new Xcode project to check whether the same view works in a fresh test project, which it does. If my project compiles without warnings and errors but the SwiftUI preview still fails, what options do I have left? (My deployment target is iOS 14, and I'm on the fresh Xcode 13.0.)
Posted by Bersaelor. Last updated.
Post marked as solved
1 Reply
1.1k Views
Aloha Quick Lookers,

I'm using usdzconvert preview 0.64 to create *.usdz files, which I then edit in the ASCII *.usda format and repackage as *.usdz. This way I was able to fix the scale (the original glTFs are usually in m=1, while usdzconvert 0.64 always sets the result to m=0.01). Now I'm trying to follow the docs to anchor my Glasses prim to the user's face, but whatever I try, it only ever gets placed on my table's surface. If I import my *.usdz file into Reality Composer and export it as usdz, it does get anchored to the face correctly. When I open the Reality Composer export, it really doesn't look that different from my manually edited usdz (it just wraps the geometry in another layer, I assume because of the import/export round trip through Reality Composer). What am I doing wrong?

Here's the Reality Composer generated usda:

```
#usda 1.0
(
    autoPlay = false
    customLayerData = {
        string creator = "com.apple.RCFoundation Version 1.5 (171.5)"
        string identifier = "9AAF5C5D-68AB-4034-8037-9BBE6848D8E5"
    }
    defaultPrim = "Root"
    metersPerUnit = 1
    timeCodesPerSecond = 60
    upAxis = "Y"
)

def Xform "Root"
{
    def Scope "Scenes" (
        kind = "sceneLibrary"
    )
    {
        def Xform "Scene" (
            customData = {
                bool preliminary_collidesWithEnvironment = 0
                string sceneName = "Scene"
            }
            sceneName = "Scene"
        )
        {
            token preliminary:anchoring:type = "face"
            quatf xformOp:orient = (0.70710677, 0.70710677, 0, 0)
            double3 xformOp:scale = (1, 1, 1)
            double3 xformOp:translate = (0, 0, 0)
            uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:orient", "xformOp:scale"]
            // ...
```

and here is my own, minimal, usdz with the same face-anchoring token:

```
#usda 1.0
(
    autoPlay = false
    customLayerData = {
        string creator = "usdzconvert preview 0.64"
    }
    defaultPrim = "lupetto_local"
    metersPerUnit = 1
    timeCodesPerSecond = 60
    upAxis = "Y"
)

def Xform "lupetto_local" (
    assetInfo = {
        string name = "lupetto_local"
    }
    kind = "component"
)
{
    def Scope "Geom"
    {
        def Xform "Glasses"
        {
            token preliminary:anchoring:type = "face"
            double3 xformOp:translate = (0, 0.01799999736249447, 0.04600000008940697)
            uniform token[] xformOpOrder = ["xformOp:translate"]
            // ...
```

I'd add the full files, but they are 2 MB in size and the max file size here is 200 KB. I could create a minimal example file with less geometry in case the snippets above are not enough.
Posted by Bersaelor. Last updated.
Post not yet marked as solved
0 Replies
897 Views
In our app, we're streaming short HLS streams to local AVPlayers. The view has an AVPlayerLayer connected to the AVPlayer we start, and usually we hear audio and see the video. Sometimes, however, we see a grey screen instead of the video while still being able to hear the audio. The player layer shows a shade of grey that does not come from our app; if, for testing purposes, we don't connect the player to the player layer, the grey color is not there, so it is definitely coming from the AVPlayerLayer. If this is somehow related to an HLS stream becoming corrupted, which APIs in AVFoundation could we use to debug this? So far, when this happens, we see no peculiar catch blocks or errors being thrown in our debug logs.
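For reference, this is the direction I was planning to take for diagnostics, a sketch using the item's error/access logs and the standard notifications (addDiagnostics(to:) is just a made-up helper name); whether these actually surface anything for the grey-layer case is exactly what I'm unsure about:

```swift
import AVFoundation

/// Attach some diagnostics to a player item so the grey-screen case can be
/// correlated with whatever AVFoundation reports. Returns the observer tokens,
/// which the caller should keep alive while the item plays.
func addDiagnostics(to item: AVPlayerItem) -> [NSObjectProtocol] {
    let center = NotificationCenter.default

    // New entries in the error log (e.g. HLS segment or network problems) post this.
    let errorLogToken = center.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry, object: item, queue: .main
    ) { _ in
        for event in item.errorLog()?.events ?? [] {
            print("HLS error:", event.errorStatusCode, event.errorComment ?? "-")
        }
    }

    // Playback failing entirely posts this one, with the error in the userInfo.
    let failureToken = center.addObserver(
        forName: .AVPlayerItemFailedToPlayToEndTime, object: item, queue: .main
    ) { note in
        let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error
        print("Failed to play to end:", error?.localizedDescription ?? "unknown")
    }

    // The access log records bitrate switches and stalls that never throw an error.
    for event in item.accessLog()?.events ?? [] {
        print("indicated bitrate:", event.indicatedBitrate, "stalls:", event.numberOfStalls)
    }

    return [errorLogToken, failureToken]
}
```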
Posted by Bersaelor. Last updated.
Post not yet marked as solved
1 Reply
1.1k Views
In an app with multiple AVPlayers (think of a TikTok-style UX, for example), where:

• one video is currently streaming
• other videos, which users will move to next, have already been loaded into AVPlayers for prebuffering

is there an API to give the currently playing stream higher priority than the pre-buffering ones that are not on screen? It's nice to have instant playback while scrolling, but we also don't want to starve the currently playing player of bandwidth.
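For context, the only knobs I'm aware of live on AVPlayerItem, so my current sketch throttles the off-screen players rather than boosting the active one. The helper names (markAsPrebuffering/markAsActive) and the concrete numbers are placeholders, not recommendations:

```swift
import AVFoundation

/// Throttle a prebuffering player so it competes less with the one on screen.
func markAsPrebuffering(_ player: AVPlayer) {
    player.automaticallyWaitsToMinimizeStalling = true
    player.currentItem?.preferredPeakBitRate = 1_000_000       // cap quality while off screen
    player.currentItem?.preferredForwardBufferDuration = 5     // only buffer a few seconds ahead
}

/// Lift the limits once the player scrolls on screen and starts playing.
func markAsActive(_ player: AVPlayer) {
    player.currentItem?.preferredPeakBitRate = 0                // 0 = no artificial cap
    player.currentItem?.preferredForwardBufferDuration = 0      // 0 = let AVFoundation decide
    player.play()
}
```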
Posted by Bersaelor. Last updated.
Post not yet marked as solved
3 Replies
1.7k Views
So, I've been following along with the video "Distribute binary frameworks as Swift packages" (https://developer.apple.com/videos/play/wwdc2020/10147/), but now I'm sort of stuck. I have a Swift package that works fine when used as a source package, and I want to ship it as a binary now. The video says I'm supposed to add a binary target:

```swift
import PackageDescription

let package = Package(
    name: "Test",
    defaultLocalization: "de",
    platforms: [
        .iOS(.v13)
    ],
    products: [
        .library(
            name: "Test",
            targets: ["Test"]
        ),
    ],
    dependencies: [
    ],
    targets: [
//        .target(name: "Test")
        .binaryTarget(
            name: "Test",
            url: "https://static.looc.io/Test/Test-1.0.0.xcframework.zip",
            checksum: "9848327892347324789432478923478"
        )
    ]
)
```

But the xcframework is what I'm trying to build; I don't have an xcframework yet. With the above, if I run:

```bash
[konrad@iMac-2 Source]$ xcodebuild archive -workspace Test -scheme Test \
    -archivePath "tmp/iOS" \
    -destination "generic/platform=iOS" \
    SKIP_INSTALL=NO BUILD_LIBRARY_FOR_DISTRIBUTION=YES
```

I get the error:

```bash
xcodebuild: error: Could not resolve package dependencies:
  artifact of binary target 'Test' failed download: invalid status code 403
```
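In case it clarifies the question, the workaround I'm considering is a two-phase manifest: archive from the plain source target first (or reference a local artifact), and only switch to the URL-based binary target once the zip actually exists at its final location. A sketch, where "Artifacts/Test.xcframework" is a made-up local path:

```swift
// swift-tools-version:5.3
import PackageDescription

// Hypothetical two-phase setup: while the xcframework is still being produced,
// the binary target points at a local artifact instead of the not-yet-uploaded zip.
let package = Package(
    name: "Test",
    defaultLocalization: "de",
    platforms: [.iOS(.v13)],
    products: [
        .library(name: "Test", targets: ["Test"]),
    ],
    targets: [
        // Phase 1: archive from the ordinary source target to produce the binary.
        // .target(name: "Test"),

        // Phase 2: ship the prebuilt artifact; use a local path until the zip
        // is actually reachable at its final URL, then swap in url:/checksum:.
        .binaryTarget(
            name: "Test",
            path: "Artifacts/Test.xcframework"
        ),
    ]
)
```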
Posted by Bersaelor. Last updated.
Post not yet marked as solved
0 Replies
556 Views
Is anyone aware of any work on, or has anyone ever written, an anisotropic shader in GLSL for SceneKit? Anisotropic shading is commonplace in renderers like Blender or Maya (I can't link to the docs since the dev forums don't let me post URLs, but it's easy to google). Unity has anisotropic shader packages too, so it doesn't seem impossible to achieve in a game engine.
Posted by Bersaelor. Last updated.
Post not yet marked as solved
1 Reply
695 Views
Hey there, does someone here have a translucency shader (modifier) for SceneKit they'd be willing to share? Obviously we can do clear transparency via node.opacity or material.transparency, but that's for clear materials. What about translucent materials, like a blurry bathroom window or a page of paper (or UIKit's UINavigationBar.translucent)? I.e., the light goes through the material but is partly scattered. There is a tutorial for such a shader in Unity. Did anyone port a shader like this to SceneKit? PS: Wikipedia also has a good article on translucency.
Posted by Bersaelor. Last updated.