I am working on a proof-of-concept app that loads some 3D shapes and displays them on an Apple Watch using SceneKit. I am using an Apple Watch Series 1 running watchOS 4 Beta 3 and get the following error at run time:

failed to unarchive scene at file: (omitted) food.scnassets/burger.scn (*** -[SCNKeyedUnarchiver decodeInt32ForKey:]: value (9223372036854775807) for key (primitiveRangeLocation) too large to fit in 32-bit integer)

This also happens in the simulator. It does not happen when I use Xcode 8.3.3 and a watch simulator for watchOS 3.2 with the same project, and I can also load the SCN file on an iPhone running the iOS 11 beta. Is anyone else having this problem? I should add that I am creating the SCN file by converting a DAE file within Xcode.
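For context, the scene is loaded in the WatchKit extension with something like the sketch below; the controller and outlet names are illustrative, not the actual project code, and the SCNScene(named:) call is where the unarchive failure above is logged.

import WatchKit
import SceneKit

class BurgerInterfaceController: WKInterfaceController {
    // Hypothetical outlet name for the scene interface object in the storyboard.
    @IBOutlet var sceneInterface: WKInterfaceSCNScene!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // Load the converted SCN file from the asset catalog folder.
        if let scene = SCNScene(named: "food.scnassets/burger.scn") {
            sceneInterface.scene = scene
        }
    }
}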
I have been reading up on creating Metal filters and how they can be used with SCNTechnique to apply a post-processing effect in ARKit. I can get the basic effects working by following other examples.
I would like to use some filters that FlexMonkey made years ago; as far as I can tell, their kernels are written in a C-like language. For example:
https://github.com/FlexMonkey/Filterpedia/blob/master/Filterpedia/customFilters/TransverseChromaticAberration.swift
https://github.com/FlexMonkey/Filterpedia
Is there a way to adapt such a filter so it can be used as a Metal shader?
I am quite willing to watch any relevant WWDC videos to learn how to do this.
Ideally, I want to apply a chromatic aberration effect that shifts the RGB channels over time using a time uniform.
For reference:
https://en.wikipedia.org/wiki/Chromatic_aberration
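For what it's worth, here is a minimal sketch of the direction I am exploring: an SCNTechnique built from a Swift dictionary that draws a full-screen quad and feeds the scene color plus a time symbol to a Metal fragment function. The pass and symbol names, and the passThroughVertex / chromaticAberrationFragment Metal functions, are assumptions on my part, not working code.

import SceneKit

// Sketch only: assumes passThroughVertex and chromaticAberrationFragment exist
// in the app's default Metal library, and that the fragment function offsets
// the R and B samples based on the "time" symbol.
func makeAberrationTechnique() -> SCNTechnique? {
    let passDefinition: [String: Any] = [
        "draw": "DRAW_QUAD",
        "inputs": ["colorSampler": "COLOR", "time": "time"],
        "outputs": ["color": "COLOR"],
        "metalVertexShader": "passThroughVertex",
        "metalFragmentShader": "chromaticAberrationFragment"
    ]
    let technique: [String: Any] = [
        "passes": ["aberration": passDefinition],
        "sequence": ["aberration"],
        "symbols": ["time": ["type": "float"]]
    ]
    return SCNTechnique(dictionary: technique)
}

// Usage on an ARSCNView (or any SCNView); the time symbol would be updated each frame:
// sceneView.technique = makeAberrationTechnique()
// sceneView.technique?.setObject(NSNumber(value: CACurrentMediaTime()),
//                                forKeyedSubscript: "time" as NSCopying)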
I have an older app that is a mix of Swift and Objective-C. It has two sets of storyboards, one for iPhone and one for iPad, linked with storyboard references.
There seems to be a bug: when running in the visionOS simulator, the app loads the storyboard specified by the key "Main storyboard file base name" instead of the one specified by "Main storyboard file base name (iPad)". If I change the first key to point at the iPad storyboard, the app then works as expected in the visionOS simulator.
The raw keys are:
UIMainStoryboardFile
UIMainStoryboardFile~ipad
What should I do?
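For reference, the two keys look roughly like this in the Info.plist source (the storyboard names here are placeholders):

<key>UIMainStoryboardFile</key>
<string>Main_iPhone</string>
<key>UIMainStoryboardFile~ipad</key>
<string>Main_iPad</string>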
I am trying to establish a workflow for using Reality Composer Pro to make scenes; at the moment I am grey-boxing a scene with primitives.
I have set up a cube with a textured material and a simple spin animation.
I am confused as to what I should be loading. I have created what I think is a scene asset in the package for the Reality Composer Pro project.
Here is a code snippet:
import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        RealityView { content in
            do {
                // "HOF" is the scene asset created in the Reality Composer Pro package.
                let scene = try await ModelEntity(named: "HOF")
                content.add(scene)
            } catch {
                print("Error loading scene: \(error.localizedDescription)")
            }
        }
    }
}
Here is the project layout in Reality Composer Pro:
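The alternative I am wondering about is loading through the package's own bundle rather than the app's main bundle. A rough sketch, assuming the default RealityKitContent module name and its generated realityKitContentBundle accessor:

import SwiftUI
import RealityKit
import RealityKitContent  // assumed package/module name from the project template

struct PackageContentView: View {
    var body: some View {
        RealityView { content in
            // Entity(named:in:) loads the asset from the Reality Composer Pro
            // package bundle instead of the app's main bundle.
            if let scene = try? await Entity(named: "HOF", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}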
I wanted to try running my project using Xcode 16 (I am using beta 4). It seems I cannot have both Xcode 15 and 16 installed at the same time.
I have a project with visionOS & iOS build targets.
The following error occurs in Xcode 16:
/Applications/Xcode-beta.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include/swift/bridging.modulemap:13:8 redefinition of module 'SwiftBridging'
I did try removing Xcode 15 as a test and that worked, but that is not a solution.
Is there a way I can have both apps installed?
I have a plane with a texture that was made in Blender and then converted from USDC to USDZ using Reality Converter.
The texture looks fully translucent in Reality Composer Pro, but in a RealityKit scene in visionOS it looks a bit like glass, with some reflection.
Is there a material setting that I need to change?
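To show the kind of material setting I mean, here is a sketch of overriding the imported material in RealityKit with an explicitly transparent blending mode; the function name and opacity value are placeholders, not something I have confirmed fixes the issue.

import RealityKit

// Sketch only: swap in a physically based material whose blending is
// explicitly transparent, to test whether the glass-like look comes from
// the material settings baked into the USDZ.
func applyTransparentMaterial(to plane: ModelEntity) {
    var material = PhysicallyBasedMaterial()
    material.blending = .transparent(opacity: .init(floatLiteral: 0.0)) // placeholder opacity
    plane.model?.materials = [material]
}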