I am trying to build a visionOS app that uses a map as its central user interface.
This works fine at high zoom levels, when only a couple of markers are present. But as soon as I zoom out and the number of markers grows into the hundreds or even thousands, performance gets really bad: the map takes seconds to render, and panning is laggy too. What makes things worse is that the SwiftUI Map does not support clustering yet.
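For context, this is a simplified version of what I currently have; `Place` just stands in for my real model type:

```swift
import SwiftUI
import MapKit

// Place stands in for my real model type.
struct Place: Identifiable {
    let id = UUID()
    let name: String
    let coordinate: CLLocationCoordinate2D
}

// Simplified version of my current view: one Marker per place, no clustering.
struct PlacesMapView: View {
    let places: [Place]   // grows into the thousands once the user zooms out

    var body: some View {
        Map {
            ForEach(places) { place in
                Marker(place.name, coordinate: place.coordinate)
            }
        }
    }
}
```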
Has anyone found a solution to this?
I found this example by Apple showing how to implement clustering:
It works, but it uses UIKit and storyboards, and I couldn't translate it into SwiftUI-compatible code.
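For reference, this is roughly as far as I got when I tried to wrap MKMapView in a UIViewRepresentable myself, following the idea of Apple's sample. `ClusteredMapView`, `PlaceAnnotation` and `PlaceMarkerView` are just placeholder names of mine, and this is a sketch rather than something I have fully working:

```swift
import SwiftUI
import MapKit

final class PlaceAnnotation: MKPointAnnotation {}

// Marker view that opts its annotations into clustering.
final class PlaceMarkerView: MKMarkerAnnotationView {
    override var annotation: MKAnnotation? {
        didSet { clusteringIdentifier = "place" }
    }
}

struct ClusteredMapView: UIViewRepresentable {
    var annotations: [PlaceAnnotation]

    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        // Registering under the default reuse identifiers lets MKMapView pick
        // these view classes without a mapView(_:viewFor:) delegate method.
        mapView.register(PlaceMarkerView.self,
                         forAnnotationViewWithReuseIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier)
        mapView.register(MKMarkerAnnotationView.self,
                         forAnnotationViewWithReuseIdentifier: MKMapViewDefaultClusterAnnotationViewReuseIdentifier)
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        // Naive update: remove everything and re-add whenever SwiftUI pushes new data.
        mapView.removeAnnotations(mapView.annotations)
        mapView.addAnnotations(annotations)
    }
}
```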
I also found this blog post, which builds a neat SwiftUI integration for a clusterable map: https://www.linkedin.com/pulse/map-clustering-swiftui-dmitry-%D0%B2%D0%B5l%D0%BEv-j3x7f/
However, I wasn't able to adapt it so that the map updates itself reactively: I want to fetch new data from our server whenever the user changes the visible region of the map by panning or zooming in and out. I have no clue how to translate my .onChange(of:) and .onMapCameraChange() modifiers into the UIKit world.
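To make the question more concrete, this is how I imagine the bridge could look: a plain callback that the coordinator fires from mapView(_:regionDidChangeAnimated:), which seems to be the closest UIKit counterpart to .onMapCameraChange() I could find. `RegionReportingMapView` and `fetchAnnotations(in:)` are placeholder names I made up for this sketch, and I'm not sure this is the right approach at all:

```swift
import SwiftUI
import MapKit

struct RegionReportingMapView: UIViewRepresentable {
    var annotations: [MKPointAnnotation]
    var onRegionChange: (MKCoordinateRegion) -> Void

    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        mapView.delegate = context.coordinator
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        mapView.removeAnnotations(mapView.annotations)
        mapView.addAnnotations(annotations)
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(onRegionChange: onRegionChange)
    }

    final class Coordinator: NSObject, MKMapViewDelegate {
        let onRegionChange: (MKCoordinateRegion) -> Void

        init(onRegionChange: @escaping (MKCoordinateRegion) -> Void) {
            self.onRegionChange = onRegionChange
        }

        // MKMapView calls this after every pan/zoom, similar to .onMapCameraChange().
        func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
            onRegionChange(mapView.region)
        }
    }
}

// Placeholder for my real networking call.
func fetchAnnotations(in region: MKCoordinateRegion) async -> [MKPointAnnotation] { [] }

struct PlacesScreen: View {
    @State private var annotations: [MKPointAnnotation] = []

    var body: some View {
        RegionReportingMapView(annotations: annotations) { region in
            // Re-fetch from the server whenever the visible region changes.
            Task { annotations = await fetchAnnotations(in: region) }
        }
    }
}
```

Is a coordinator callback like this the right way to do it, or is there a cleaner pattern for combining clustering with region-driven fetching in SwiftUI?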