I'm looking at performance around large Codable nested structures that come in from HTTP/JSON.
We are seeing stalls on the main thread. After reviewing all the code, the web requests and parsing are async and run in the background; only the assignment of the new struct value is handled on the main thread.
When I looked at the nested structures, they are about 80 KB.
Several articles and posts I read suggested that observing structs causes a refresh on any change, that large structures take longer because they have to be copied when passed to each observer, and that more observers slow things down.
So I made a test app to verify these premises.
The app has a timer animating a slider.
A VM with a structure containing a byte array.
Sliders to scale the size of the byte array from 10K to 200K and to scale the number of observers from 1 to 100.
It also measures the actual duration between timer ticks. My intention is to be able to visually see main-thread stalls, measure them, and see the average and maximum frame delays.
Using this to test, I found little difference in performance across structure sizes or observer counts. I'm not certain if this is expected or if I'm missing something in how I built the test app.
I have also created a variation where the top-level type is an observable class. I see no difference between the struct and class versions.
I'm wondering if this is due to copy-on-write, causing the struct's storage to effectively be passed by reference under the hood.
I also wonder if other optimizations are minimizing the effect of scaling from 1 to 100 observers.
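To check the copy-on-write suspicion, here is a minimal standalone sketch (my own experiment, separate from the test app) that compares the array's underlying buffer address before and after the first mutation. The `sharesBuffer` helper is mine, just for illustration:

```swift
import Foundation

// A large array-backed value, like the struct's `data` field.
var original = [UInt8](repeating: 0, count: 80_000)
var copy = original  // O(1): both values share one heap buffer until a write occurs

// Compare the underlying buffer base addresses to see whether a real copy happened.
func sharesBuffer(_ a: [UInt8], _ b: [UInt8]) -> Bool {
    a.withUnsafeBufferPointer { pa in
        b.withUnsafeBufferPointer { pb in
            pa.baseAddress == pb.baseAddress
        }
    }
}

let beforeMutation = sharesBuffer(original, copy)  // true: storage is shared
copy[0] = 1                                        // first write triggers the actual copy
let afterMutation = sharesBuffer(original, copy)   // false: buffers now differ

print(beforeMutation, afterMutation)
```

If this holds, passing the 80 KB struct around unmutated is cheap pointer-sharing, which would explain why scaling the size had little effect.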
I appreciate any insights & critiques.
import SwiftUI
import CryptoKit          // SHA256
import RealityKit         // Model3D (visionOS)
import RealityKitContent  // realityKitContentBundle

#if CLASS_BASED
// Same type name as the struct variant so ViewModel compiles either way.
class LargeStruct: ObservableObject {
    @Published var data: [UInt8]

    init(size: Int = 80_000) {
        self.data = [UInt8](repeating: 0, count: size)
    }

    func regenerate(size: Int) {
        // Note: the random byte is drawn once, so every element gets the same value.
        self.data = [UInt8](repeating: UInt8.random(in: 0...255), count: size)
    }

    var hashValue: String {
        let hash = SHA256.hash(data: Data(data))
        return hash.map { String(format: "%02x", $0) }.joined()
    }
}
#else
struct LargeStruct {
    var data: [UInt8]

    init(size: Int = 80_000) {
        self.data = [UInt8](repeating: 0, count: size)
    }

    mutating func regenerate(size: Int) {
        // Note: the random byte is drawn once, so every element gets the same value.
        self.data = [UInt8](repeating: UInt8.random(in: 0...255), count: size)
    }

    var hashValue: String {
        let hash = SHA256.hash(data: Data(data))
        return hash.map { String(format: "%02x", $0) }.joined()
    }
}
#endif

class ViewModel: ObservableObject {
    // Caution: in the CLASS_BASED variant the reference stored here never changes,
    // so this @Published will not emit when the class's data mutates; nested
    // ObservableObjects are not observed automatically.
    @Published var largeStruct = LargeStruct()
}

struct ContentView: View {
    @StateObject var vm = ViewModel()
    @State private var isRotating = false
    @State private var counter = 0.0
    @State private var size: Double = 80_000
    @State private var observerCount: Double = 10

    // Variables to track time intervals between timer ticks
    @State private var lastTickTime: Date?
    @State private var minInterval: Double = .infinity
    @State private var maxInterval: Double = 0
    @State private var totalInterval: Double = 0
    @State private var tickCount: Int = 0

    var body: some View {
        VStack {
            Model3D(named: "Scene", bundle: realityKitContentBundle)
                .padding(.bottom, 50)

            // A rotating square to visualize stalling
            Rectangle()
                .fill(Color.blue)
                .frame(width: 50, height: 50)
                .rotationEffect(isRotating ? .degrees(360) : .degrees(0))
                .animation(.linear(duration: 2).repeatForever(autoreverses: false), value: isRotating)
                .onAppear {
                    isRotating = true
                }

            Slider(value: $counter, in: 0...100)
                .padding()
                .onAppear {
                    // Timer is never invalidated; acceptable for this throwaway test app.
                    Timer.scheduledTimer(withTimeInterval: 0.005, repeats: true) { _ in
                        let now = Date()
                        if let lastTime = lastTickTime {
                            let interval = now.timeIntervalSince(lastTime)
                            minInterval = min(minInterval, interval)
                            maxInterval = max(maxInterval, interval)
                            totalInterval += interval
                            tickCount += 1
                        }
                        lastTickTime = now
                        counter += 0.2
                        if counter > 100 {
                            counter = 0
                        }
                    }
                }

            HStack {
                Text(String(format: "Min: %.3f ms", minInterval * 1000))
                Text(String(format: "Max: %.3f ms", maxInterval * 1000))
                // Guard against division by zero before the first tick is recorded.
                Text(String(format: "Avg: %.3f ms", tickCount > 0 ? (totalInterval / Double(tickCount)) * 1000 : 0))
            }
            .padding()

            Text("Hash: \(vm.largeStruct.hashValue)")
                .padding()

            Button("Regenerate") {
                vm.largeStruct.regenerate(size: Int(size)) // Trigger the regeneration with the selected size
            }

            Button("Clear Stats") {
                minInterval = .infinity
                maxInterval = 0
                totalInterval = 0
                tickCount = 0
                lastTickTime = nil
            }
            .padding(.bottom)

            Text("Size: \(Int(size)) bytes")
            Slider(value: $size, in: 10_000...200_000, step: 10_000)
                .padding()

            Text("Number of Observers: \(Int(observerCount))")
            Slider(value: $observerCount, in: 1...100, step: 5)
                .padding()

            HStack {
                ForEach(0..<Int(observerCount), id: \.self) { index in
                    Text("Observer \(index + 1): \(vm.largeStruct.data[index])")
                        .padding(5)
                }
            }
        }
        .padding()
    }
}
I am using Model3D to display an RCP scene/model in my UI.
How can I get to the entities so I can set material properties to adjust the appearance?
I looked at interfaces for Model3D and ResolvedModel3D and could not find a way to get access to the RCP scene or RealityKit entity.
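For comparison, here is a RealityView sketch (assuming the same "Scene" asset in the RealityKitContent bundle; `SceneView` is just an illustrative name) that does hand back the loaded entity tree, which is the kind of access I was hoping Model3D would offer:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct SceneView: View {
    var body: some View {
        RealityView { content in
            // Async load of the RCP scene; the returned Entity hierarchy is
            // fully accessible, so materials can be adjusted before adding it.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                // e.g. walk scene.children and modify ModelComponent materials here
                content.add(scene)
            }
        }
    }
}
```

The trade-off is losing Model3D's simpler loading/placeholder behavior, which is why I'd still like a way to reach the entities from Model3D itself.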
Where can I find sample video, sample encoding apps, viewing apps, etc?
I see specs, high-level explanations, etc., but I'm not finding any samples, command lines, or app documentation that explain how to make and view these files.
Thank you, looking forward to promoting a spatial video rich future.
I have a stereoscopic plane, so it presents depth to the user that extends beyond the plane itself.
I would like the option either to write per-pixel depth for the plane or to write nothing into the depth buffer for it.
I cannot see any option in a Shader Graph material to affect the depth buffer during rendering. I also cannot see any way in RealityKit to skip depth-buffer writes for an entity.
I'm open to any suggestions.
We deliver an SDK that enables rich spatial computing experiences.
We want to enable our customers to develop apps using Swift or RealityComposer Pro.
Reality Composer Pro allows the creation of custom components via the Add Component button in the inspector panel. The generated source files are placed into the Reality Composer Pro package directory.
We would like our customers to be able to import our SDK's components into their application's Reality Composer Pro package, with our components visible so customers can apply them in their scene compositions.
How can we achieve this? We believe this would lead to a rich ecosystem of component extensions for Reality Composer Pro.
Posting here as I did not see a section for Dev Documentation portal
Using the search box in the documentation portal, I searched for "frustum" hoping to find any APIs that gave me control over frustum culling.
https://developer.apple.com/search/?q=frustum&type=Documentation
The search came up empty for hits in RealityKit.
Hours later I found the boundsMargin API, whose documentation explains how it affects frustum culling.
I went back and tried the search again to verify that the documentation search results were incomplete.
site:developer.apple.com/documentation/realitykit frustum
on Google worked fine.
Fixing this can save everyone time and stress.
I'm waiting to buy my next Apple Watch until I can take calls on it. That can only work if it goes through my hearing aids like the iPhone does.
I see it lightly documented that the first variant listed in the manifest is the first to play, and then adaptive streaming takes over to pick subsequent variants.
When we do this, we find the 3rd variant is loaded first.
How can I investigate the reason for this?
I've run the media analyzer and there were some suggestions, but nothing that made the 3rd variant different from the first two.
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-STREAM-INF:BANDWIDTH=14985704,AVERAGE-BANDWIDTH=6811200,CODECS="avc1.4d4033,mp4a.40.2",RESOLUTION=3840x2160,FRAME-RATE=29.970
clip4_2160p30-6M.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=24835374,AVERAGE-BANDWIDTH=11211200,CODECS="avc1.4d4033,mp4a.40.2",RESOLUTION=3840x2160,FRAME-RATE=29.970
clip4_2160p30-10M.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=37147462,AVERAGE-BANDWIDTH=16711200,CODECS="avc1.4d4033,mp4a.40.2",RESOLUTION=3840x2160,FRAME-RATE=29.970
clip4_2160p30-15M.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=49459550,AVERAGE-BANDWIDTH=22211200,CODECS="avc1.640033,mp4a.40.2",RESOLUTION=3840x2160,FRAME-RATE=29.970
clip4_2160p30-20M.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=74013326,AVERAGE-BANDWIDTH=33140800,CODECS="avc1.640033,mp4a.40.2",RESOLUTION=3840x2160,FRAME-RATE=29.970
clip4_2160p30-30M.m3u8
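One way I'd investigate which variant the player actually fetches is AVPlayerItem's access log. This is a generic sketch (the URL is a placeholder, not the real stream), and the commented-out `preferredPeakBitRate` line shows how to cap startup selection to help isolate the behavior:

```swift
import AVFoundation

// Placeholder URL; substitute the real master playlist.
let url = URL(string: "https://example.com/clip4.m3u8")!
let item = AVPlayerItem(url: url)

// Capping preferredPeakBitRate keeps the player at or below a given
// variant bitrate, which helps isolate the startup selection logic.
// item.preferredPeakBitRate = 7_000_000

let player = AVPlayer(playerItem: item)

// Each access-log entry records a variant the player pulled, including
// its URI and the bitrate the manifest indicated for it.
let observer = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewAccessLogEntry,
    object: item,
    queue: .main
) { _ in
    for event in item.accessLog()?.events ?? [] {
        print(event.uri ?? "?", event.indicatedBitrate)
    }
}

player.play()
```

Watching the logged URIs during the first seconds of playback should show exactly when the 3rd variant is chosen and what the player believed its bitrate was.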