Map view in SwiftUI on visionOS: Bad performance with many markers

I am trying to build a visionOS app that uses a map as a central user interface.

This works fine at high zoom levels, when only a handful of markers are visible. But as soon as I zoom out and the number of markers reaches the hundreds or even thousands, performance gets super, super bad: the map takes seconds to render, and panning is laggy too. What makes things worse is that the SwiftUI Map does not support clustering yet.

Has anyone found a solution to this?

I found this sample from Apple showing how to implement clustering:

https://developer.apple.com/documentation/mapkit/mkannotationview/decluttering_a_map_with_mapkit_annotation_clustering

It works, but it uses UIKit and storyboards, and I could not translate it into SwiftUI-compatible code.
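For context, the direction I've been attempting is to wrap an MKMapView in a UIViewRepresentable and opt annotations into MapKit's built-in clustering via clusteringIdentifier. Here is a minimal sketch of that idea (ClusteredMapView and the "poi" identifier are my own illustrative names, and this assumes MKMapView is available on the visionOS target):

```swift
import SwiftUI
import MapKit

// Sketch: MKMapView with built-in annotation clustering, exposed to SwiftUI.
struct ClusteredMapView: UIViewRepresentable {
    var annotations: [MKPointAnnotation]

    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        mapView.delegate = context.coordinator
        // Register marker views for both single annotations and clusters.
        mapView.register(MKMarkerAnnotationView.self,
                         forAnnotationViewWithReuseIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier)
        mapView.register(MKMarkerAnnotationView.self,
                         forAnnotationViewWithReuseIdentifier: MKMapViewDefaultClusterAnnotationViewReuseIdentifier)
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        // Naive refresh: replace all annotations. A real implementation
        // should diff against the existing set to avoid churn.
        mapView.removeAnnotations(mapView.annotations)
        mapView.addAnnotations(annotations)
    }

    func makeCoordinator() -> Coordinator { Coordinator() }

    final class Coordinator: NSObject, MKMapViewDelegate {
        func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
            if annotation is MKClusterAnnotation {
                // MapKit synthesizes these when nearby markers collapse into a cluster.
                return mapView.dequeueReusableAnnotationView(
                    withIdentifier: MKMapViewDefaultClusterAnnotationViewReuseIdentifier,
                    for: annotation)
            }
            let view = mapView.dequeueReusableAnnotationView(
                withIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier,
                for: annotation)
            // Annotations sharing a clusteringIdentifier get clustered together.
            (view as? MKMarkerAnnotationView)?.clusteringIdentifier = "poi"
            return view
        }
    }
}
```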

I also found this blog post with a neat SwiftUI integration for a clusterable map: https://www.linkedin.com/pulse/map-clustering-swiftui-dmitry-%D0%B2%D0%B5l%D0%BEv-j3x7f/

However, I wasn't able to adapt it so that the map updates reactively. I want to fetch new data from our server whenever the user changes the visible region of the map by panning or zooming in or out, and I have no clue how to translate my .onChange(of:) and .onMapCameraChange() modifiers into the UIKit world.
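The closest UIKit analogue to .onMapCameraChange() that I'm aware of is MKMapViewDelegate's mapView(_:regionDidChangeAnimated:). Below is a sketch of forwarding that callback into SwiftUI through the representable's Coordinator (RegionReportingMapView and onRegionChange are illustrative names, not an established API):

```swift
import SwiftUI
import MapKit

// Sketch: bridge MKMapView region changes back into SwiftUI via a closure,
// playing the role of .onMapCameraChange().
struct RegionReportingMapView: UIViewRepresentable {
    var onRegionChange: (MKCoordinateRegion) -> Void

    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        mapView.delegate = context.coordinator
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        // Keep the closure current if SwiftUI recreates the view with new state.
        context.coordinator.onRegionChange = onRegionChange
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(onRegionChange: onRegionChange)
    }

    final class Coordinator: NSObject, MKMapViewDelegate {
        var onRegionChange: (MKCoordinateRegion) -> Void

        init(onRegionChange: @escaping (MKCoordinateRegion) -> Void) {
            self.onRegionChange = onRegionChange
        }

        // Fires after every pan/zoom settles, the UIKit counterpart
        // of .onMapCameraChange in SwiftUI.
        func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
            onRegionChange(mapView.region)
        }
    }
}
```

From SwiftUI you could then kick off the server fetch inside the closure, e.g. `RegionReportingMapView { region in Task { await fetchMarkers(in: region) } }`, where fetchMarkers stands in for whatever your networking layer provides.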

I'm finding the same thing. On an iPhone 14 Pro, the map is unusable with 2,000 annotations even when none of them are on screen. Unless I'm missing something, it's regenerating all of the annotations every time the user interacts with the map. (An MKMapView-based version of my map handles 6,000+ annotations without becoming unusably laggy, even fully zoomed out.)
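One partial mitigation worth trying in pure SwiftUI: only hand the Map the annotations that fall inside the visible rect (plus some padding), recomputed once the camera settles. A sketch, assuming the SwiftUI Map with .onMapCameraChange(frequency: .onEnd); Place and FilteredMap are illustrative types of my own:

```swift
import SwiftUI
import MapKit

// Illustrative model type for this sketch.
struct Place: Identifiable {
    let id = UUID()
    let name: String
    let coordinate: CLLocationCoordinate2D
}

// Sketch: filter the full data set down to the visible region so the Map
// never has to lay out thousands of off-screen markers.
struct FilteredMap: View {
    let allPlaces: [Place]
    @State private var visiblePlaces: [Place] = []

    var body: some View {
        Map {
            ForEach(visiblePlaces) { place in
                Marker(place.name, coordinate: place.coordinate)
            }
        }
        // .onEnd avoids re-filtering on every frame of a pan gesture.
        .onMapCameraChange(frequency: .onEnd) { context in
            // Pad the visible rect by 20% so markers appear slightly
            // before they scroll into view.
            let rect = context.rect.insetBy(dx: -context.rect.width * 0.2,
                                            dy: -context.rect.height * 0.2)
            visiblePlaces = allPlaces.filter { place in
                rect.contains(MKMapPoint(place.coordinate))
            }
        }
    }
}
```

This doesn't fix the underlying cost of many simultaneously visible markers when zoomed out, so it works best combined with clustering or server-side aggregation.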

For clustering, you might want to look at https://github.com/vospennikov/ClusterMap.
