Display map or satellite imagery within your app's interface, call out points of interest, and determine placemark information for map coordinates using MapKit.

Posts under MapKit tag

142 Posts
Post | Replies | Boosts | Views | Activity

How to search for locations globally rather than only locally?
I'm building a weather app where users can search for locations to get their weather, but the results only show locations in my own country, not worldwide. For example, I'm in China and I can't search for New York; it just shows nothing. Here's my code:

```swift
@Observable
class SearchPlaceManager: NSObject {
    var searchText: String = ""
    let searchCompleter = MKLocalSearchCompleter()
    var searchResults: [MKLocalSearchCompletion] = []

    override init() {
        super.init()
        searchCompleter.resultTypes = .address
        searchCompleter.delegate = self
    }

    @MainActor
    func seachLocation() {
        if !searchText.isEmpty {
            searchCompleter.queryFragment = searchText
        }
    }
}

extension SearchPlaceManager: MKLocalSearchCompleterDelegate {
    func completerDidUpdateResults(_ completer: MKLocalSearchCompleter) {
        withAnimation {
            self.searchResults = completer.results
        }
    }
}
```

I've also tried setting

```swift
searchCompleter.region = MKCoordinateRegion(
    center: CLLocationCoordinate2D(latitude: 0, longitude: 0),
    span: MKCoordinateSpan(latitudeDelta: 180, longitudeDelta: 360)
)
```

but it doesn't work.
Replies: 2 · Boosts: 0 · Views: 200 · Activity: 1d
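A small sketch of the step that usually follows the completer code above: resolving a selected MKLocalSearchCompletion into full MKMapItems (coordinates, names, etc.) with MKLocalSearch. The `resolve(_:)` helper name is mine, not part of the question's code.

```swift
import MapKit

// Resolve a completion chosen from searchResults into full map items
// (coordinates, names, time zones) for the weather lookup.
func resolve(_ completion: MKLocalSearchCompletion) async throws -> [MKMapItem] {
    let request = MKLocalSearch.Request(completion: completion)
    let response = try await MKLocalSearch(request: request).start()
    return response.mapItems
}
```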
Separate RealityView inside a RealityView attachment
Hi, I am having some trouble creating "nested" RealityView content using a MapKit attachment. I am building a visionOS app that has a horizontal MapKit map as an attachment to a RealityView. I want to display 3D pins on that map, so I am using native map annotations, and inside each annotation I create a new RealityView just for the 3D pin. This worked completely fine, until I wanted those RealityViews to interact with each other. By interaction I mean that I wanted to group entities from the first "main" RealityView's content with the 3D pins using ModelSortGroupComponent. Why do I want this? I want to make the map circular, which is not a problem. The problem is that when I move the map with 3D pins, those pins have their own RealityView space and are only bounded by the volumetric window dimensions. What happens is that the pins float next to the map (shown in the attached image). So I came up with this solution: create a custom "toroid"-like 3D entity model that occludes the pins that go outside the map region. In order to occlude only the pins, I need to use ModelSortGroupComponent to group the "toroid" entity with the 3D pin entities (as described in another forum thread). To summarize: I need the content of the outer RealityView to interact with the map attachment annotations' RealityView content in order to group them. There might of course be another, better way to achieve my overall goal, so I would naturally appreciate any help or guidance. The image below shows 3D pins on a circular map. Since the pins' RealityView does not know anything about the other RealityViews, it just overflows and hangs in space until it is cropped by the volumetric window boundary. Simplified code:

```swift
var body: some View {
    let modelSortGroup = ModelSortGroup(depthPass: .prePass)

    RealityView { content, attachments in
        var mainEntity = Entity()
        // My other entities here...

        if let mapAttachment = attachments.entity(for: "mapAttachment") {
            // Edit map properties, position, horizontal layout etc.
            mainEntity.addChild(mapAttachment)
        }

        // Create and add to content a mask "toroid" entity, mapMaskEntity.
        // Use OcclusionMaterial() for its material.
        mapMaskEntity.components.set(ModelSortGroupComponent(group: modelSortGroup, order: 0))

        // For all pins, somehow also set the group:
        // 3DPinEntity.components.set(ModelSortGroupComponent(group: modelSortGroup, order: 1))

        content.add(mainEntity)
    } attachments: {
        Attachment(id: "mapAttachment") {
            Map {
                ForEach(mapViewModel.clusters, id: \.id) { cluster in
                    Annotation("", coordinate: cluster.coordinate) {
                        MapPin3DView(cluster: cluster)
                    }
                }
            }
            .clipShape(Circle())
        }
    }
}

// MapPin3DView is a map annotation view that includes a 3D pin model and
// some details like an image; it uses its own RealityView.
struct MapPin3DView: View {
    var body: some View {
        RealityView { content in
            // 3D pin entities...
        }
    }
}
```
Replies: 1 · Boosts: 0 · Views: 103 · Activity: 3d
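For reference, this is what the occlusion grouping itself looks like when both entities live in the same RealityView content, which is exactly the constraint the question is trying to work around. A minimal sketch; the entity shapes, sizes, and names are placeholders.

```swift
import RealityKit

// Group an occluder with a pin so the occluder is drawn first (prePass)
// and masks whatever parts of the pin it covers.
let group = ModelSortGroup(depthPass: .prePass)

let mask = ModelEntity(mesh: .generateBox(size: 0.3),
                       materials: [OcclusionMaterial()])
mask.components.set(ModelSortGroupComponent(group: group, order: 0))

let pin = ModelEntity(mesh: .generateSphere(radius: 0.02),
                      materials: [SimpleMaterial(color: .red, isMetallic: false)])
pin.components.set(ModelSortGroupComponent(group: group, order: 1))
```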
Is MapKit.mapCameraKeyframeAnimator broken on macOS 15.2?
Hi! I'm attempting to run the Quakes sample app on macOS. I set breakpoints and confirmed that the mapCameraKeyframeAnimator closure is being called:

```swift
.mapCameraKeyframeAnimator(trigger: selectedId) { initialCamera in
    let start = initialCamera.centerCoordinate
    let end = quakes[selectedId]?.location.coordinate ?? start
    let travelDistance = start.distance(to: end)

    let duration = max(min(travelDistance / 30, 5), 1)
    let finalAltitude = travelDistance > 20 ? 3_000_000 : min(initialCamera.distance, 3_000_000)
    let middleAltitude = finalAltitude * max(min(travelDistance / 5, 1.5), 1)

    KeyframeTrack(\MapCamera.centerCoordinate) {
        CubicKeyframe(end, duration: duration)
    }
    KeyframeTrack(\MapCamera.distance) {
        CubicKeyframe(middleAltitude, duration: duration / 2)
        CubicKeyframe(finalAltitude, duration: duration / 2)
    }
}
```

But I don't actually see any map animation taking place when the selection changes. Running the application in the iPhone simulator does show the animations. I am building with Xcode 16.2 on macOS 15.2. Are there known issues with this API on macOS?
Replies: 0 · Boosts: 0 · Views: 76 · Activity: 5d
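Not confirmation of a bug either way, but as a stop-gap while testing on macOS, the same camera move can be driven without the keyframe animator by animating a bound MapCameraPosition. A minimal sketch, assuming the view owns the position as @State; the duration and altitude are placeholders.

```swift
import SwiftUI
import MapKit

struct QuakeMapFallback: View {
    @State private var position: MapCameraPosition = .automatic

    var body: some View {
        Map(position: $position)
    }

    // Move the camera to a quake without the keyframe animator.
    func focus(on coordinate: CLLocationCoordinate2D) {
        withAnimation(.easeInOut(duration: 2)) {
            position = .camera(MapCamera(centerCoordinate: coordinate,
                                         distance: 3_000_000,
                                         heading: 0,
                                         pitch: 0))
        }
    }
}
```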
MapKit super slow loading tiles stored on device
Loading tile overlays is slow even when the raster data is available locally on the device, running iOS 18.2 and built with Xcode 16.2. In this video (https://3dtopo.com/superSlowTileLoading.mov) it takes 38 seconds to load tiles readily available on the device. Then the whole screen flashes when tiles that are already drawn are redrawn, making for a very poor user experience. 38 seconds to load a dozen or so small images (512x512) stored locally on the device is simply unacceptable. I can't release a product like this that I've spent the last 1.5 years building, and many years developing the maps themselves. This severe issue is new since I committed to basing my app on MapKit. Note that this issue does not occur with Apple's base map tiles. I created a Feedback Assistant case, FB16110803, for this issue. For the video, I disabled loading any tiles from the network and disabled loading any other data, such as polylines. Essentially all I am doing is loading the tiles stored on the device and returning them, like this:

```swift
public func loadTile(at path: MKTileOverlayPath, result: @escaping (Data?, Error?) -> Void) {
    fetchData(forKey: key,
              failure: { error in result(nil, error) },
              success: { data in result(data, nil) })
}

open func fetchData(forKey key: String,
                    failure fail: ((Error?) -> ())? = nil,
                    success succeed: @escaping (Data) -> ()) {
    let path = self.path(forKey: key)
    do {
        let data = try Data(contentsOf: URL(fileURLWithPath: path),
                            options: Data.ReadingOptions())
        succeed(data)
        self.updateDiskAccessDate(atPath: path)
    } catch {
        if let block = fail {
            block(error)
        }
    }
}
```
Replies: 6 · Boosts: 0 · Views: 154 · Activity: 1h
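This won't explain a 38-second first load, but for the flashing/redraw part, one thing sometimes worth trying is an in-memory cache in front of the disk read so tiles that were already drawn never touch the file system again. A sketch, assuming an MKTileOverlay subclass; the localFileURL(for:) helper and its Documents-based layout stand in for the path(forKey:) lookup in the post.

```swift
import MapKit

final class CachedTileOverlay: MKTileOverlay {
    private let memoryCache = NSCache<NSString, NSData>()

    /// Stand-in for the app's existing disk lookup (path(forKey:) in the post);
    /// here tiles are assumed to live under Documents/tiles/z/x/y.png.
    private func localFileURL(for path: MKTileOverlayPath) -> URL {
        let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        return docs.appendingPathComponent("tiles/\(path.z)/\(path.x)/\(path.y).png")
    }

    override func loadTile(at path: MKTileOverlayPath,
                           result: @escaping (Data?, Error?) -> Void) {
        let key = "\(path.z)/\(path.x)/\(path.y)" as NSString

        // Serve already-loaded tiles from memory so redraws skip the file system.
        if let cached = memoryCache.object(forKey: key) {
            result(cached as Data, nil)
            return
        }

        DispatchQueue.global(qos: .userInitiated).async {
            do {
                let data = try Data(contentsOf: self.localFileURL(for: path))
                self.memoryCache.setObject(data as NSData, forKey: key)
                result(data, nil)
            } catch {
                result(nil, error)
            }
        }
    }
}
```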
Deriving Apple Place ID from CLPlacemark or MKMapItem
I am trying to derive the Apple Place ID from a CLPlacemark (or from an MKMapItem derived from it) created via either CLGeocoder().reverseGeocodeLocation() or CLGeocoder().geocodeAddressString(). In many cases the placemark returned from these functions contains detailed information (name, address, coordinates, etc.), implying that the Apple Place ID is known, but the identifier is not present. The only way I have found to get a Place ID is via MKLocalSearch. I'm wondering if I am missing something here.
Replies: 1 · Boosts: 0 · Views: 139 · Activity: 1w
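For reference, a sketch of the MKLocalSearch route mentioned at the end of the question: resolving a geocoded placemark into a nearby MKMapItem that carries an identifier (iOS 18). The 500 m region and name-based matching are illustrative assumptions, not a guaranteed match.

```swift
import MapKit

// Given a geocoded placemark, look up a nearby MKMapItem that carries
// an identifier (Place ID). Matching by name is approximate.
func placeIdentifier(for placemark: CLPlacemark) async throws -> MKMapItem.Identifier? {
    guard let name = placemark.name, let location = placemark.location else { return nil }

    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = name
    request.region = MKCoordinateRegion(center: location.coordinate,
                                        latitudinalMeters: 500,
                                        longitudinalMeters: 500)

    let response = try await MKLocalSearch(request: request).start()
    return response.mapItems.first?.identifier
}
```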
How to set the ZPriority of a MapKit Annotation with SwiftUI
I'm displaying a number of annotations on a map; they vary their location over time and can inherently cluster together. I'd like to be able to set the Z order on the annotations, as I have simple logic for how to order them in the Z direction. I see the zPriority property on the UIKit annotation view, but it's unclear how I can do that in SwiftUI. Is this exposed in any manner? I thought I could create a ViewModifier and, using the content parameter, drop down into the UIKit view to apply the value to the property, but alas I'm not seeing a clear way to do so. Is this at all possible without dropping down entirely to creating an NSViewRepresentable/UIViewRepresentable implementation of Map?
Replies: 0 · Boosts: 0 · Views: 117 · Activity: 2w
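I'm not aware of a SwiftUI-only hook for this either. Below is a sketch of the fallback the question mentions: a minimal UIViewRepresentable MKMapView whose delegate assigns zPriority per annotation. The priority value, the ordering comment, and the plain MKPointAnnotation model are illustrative.

```swift
import SwiftUI
import MapKit

// A bare-bones MKMapView wrapper that orders annotations by zPriority.
struct PrioritizedMapView: UIViewRepresentable {
    var annotations: [MKPointAnnotation]

    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        mapView.delegate = context.coordinator
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        mapView.removeAnnotations(mapView.annotations)
        mapView.addAnnotations(annotations)
    }

    func makeCoordinator() -> Coordinator { Coordinator() }

    final class Coordinator: NSObject, MKMapViewDelegate {
        func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
            let view = mapView.dequeueReusableAnnotationView(withIdentifier: "pin")
                ?? MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: "pin")
            view.annotation = annotation
            // Custom ordering logic would go here, based on the annotation's data.
            view.zPriority = MKAnnotationViewZPriority(rawValue: 750)
            return view
        }
    }
}
```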
iOS 18 MapKit discrepancy between physical devices and simulators
I have noticed a discrepancy between behavior on physical devices and simulators in iOS 18. I am using the latest MapKit APIs to fetch MKMapItems with the following MKLocalSearch:

```swift
private func performLocalSearch(_ query: String) async throws -> [MKMapItem] {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = query

    let search = MKLocalSearch(request: request)
    return try await search.start().mapItems
}
```

This returns an array of MKMapItem on both the simulator and a physical device. The key difference is that on my physical device (iOS 18.1.1) the MKMapItem's identifier value is missing. On the simulator, identifier is always populated for my search. Any ideas on how to resolve this? The new MapKit API, for those curious:

```swift
@available(iOS 6.0, *)
open class MKMapItem : NSObject {

    @available(iOS 18.0, *)
    open var identifier: MKMapItem.Identifier? { get }
    // ...
}
```
Replies: 1 · Boosts: 0 · Views: 205 · Activity: 3w
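I can't reproduce the device/simulator difference, but one variable worth isolating (an assumption, not a documented cause) is whether the missing identifiers correlate with address-style results rather than points of interest. A small sketch that constrains the search and logs which results carry an identifier; names are illustrative.

```swift
import MapKit

// Constrain results to points of interest and log which items carry a Place ID.
func performPOISearch(_ query: String) async throws -> [MKMapItem] {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = query
    request.resultTypes = .pointOfInterest

    let items = try await MKLocalSearch(request: request).start().mapItems
    for item in items {
        print(item.name ?? "(unnamed)", "identifier:", item.identifier as Any)
    }
    return items
}
```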
Map in SwiftUI won't follow the user's heading when first shown
I want to use a Map to show the current location and heading, but when the map is first shown, it just doesn't work. When I switch it off and back on again, it works. Code below:

BackgroundMapView.swift

```swift
import SwiftUI
import MapKit

struct BackgroundMapView: View {
    var scope: Namespace.ID?
    private var cameraPosition: MapCameraPosition = .userLocation(
        followsHeading: true,
        fallback: .automatic
    )
    private var locations: [Location]

    init(_ scope: Namespace.ID? = nil, locations: [Location]) {
        self.scope = scope
        self.locations = locations
    }

    var body: some View {
        Map(
            initialPosition: cameraPosition,
            scope: scope
        ) {
            MapPolyline(coordinates: locations.map({ $0.coordinate.toGCJ02 }))
                .stroke(
                    .red,
                    lineWidth: 3
                )
        }
        .mapControlVisibility(.hidden)
    }
}

#Preview {
    MainView()
}
```

HomeView.swift

```swift
import SwiftUI

struct HomeView: View {
    @StateObject private var locationManager = LocationManager()
    @State private var isMapEnabled = UserDefaults.isHomeMapEnabled {
        didSet {
            UserDefaults.isHomeMapEnabled = isMapEnabled
        }
    }
    @Namespace private var mapScope

    var body: some View {
        if isMapEnabled {
            BackgroundMapView(mapScope, locations: locationManager.locations)
                .mapScope(mapScope)
        }
    }
}
```
Replies: 1 · Boosts: 0 · Views: 123 · Activity: 4w
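I can't say whether this is the cause of the first-show failure, but one difference worth testing is owning the camera position as @State and passing it to Map(position:) as a binding rather than once as initialPosition, so the follow mode stays live after the view appears. A reduced sketch; the UserAnnotation content is illustrative.

```swift
import SwiftUI
import MapKit

// Variation on BackgroundMapView that binds the camera position as @State
// instead of passing it once as initialPosition.
struct FollowingMapView: View {
    @State private var cameraPosition: MapCameraPosition = .userLocation(
        followsHeading: true,
        fallback: .automatic
    )

    var body: some View {
        Map(position: $cameraPosition) {
            UserAnnotation()   // shows the user's location/heading puck
        }
        .mapControlVisibility(.hidden)
    }
}
```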
Horizontal window/map on visionOS
Hi, would anyone be so kind as to guide me on which technologies, kits, APIs, approaches, etc. are useful for creating a horizontal window with a map (preferably MapKit) on visionOS using SwiftUI? I was hoping to achieve this scenario: the user can walk around and interact with a horizontal map window, and also interact with (3D) pins on the map. A similar thing was done by SAP in their "SAP Analytics Cloud" app (second image from top). Since I am a complete beginner in this area, I was looking for a clean, simple solution. I need to know whether AR/RealityKit is necessary or whether this is achievable using only native SwiftUI. I tried using just Map() with .rotation3DEffect(), which actually makes the map horizontal, but gestures on the map are out of sync, and I really don't know whether this approach is valid or complete rubbish. Any feedback appreciated.
Replies: 2 · Boosts: 0 · Views: 199 · Activity: Nov ’24
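For reference, a minimal version of the rotation3DEffect experiment mentioned in the question (plain SwiftUI, no RealityKit). Whether gestures track the rotated surface correctly is exactly the open question, so treat this as a starting point rather than a recommendation; the frame size and clipping are placeholders.

```swift
import SwiftUI
import MapKit

// Lay a SwiftUI Map flat by rotating it about the x axis.
struct FlatMapView: View {
    var body: some View {
        Map()
            .frame(width: 600, height: 600)
            .clipShape(Circle())
            .rotation3DEffect(.degrees(-90), axis: (x: 1, y: 0, z: 0))
    }
}
```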
MapKit - render a line above everything, including annotations
I have a map application that needs to show a line (representing a direct route) above everything, including annotations. This is important because the map has lots of annotations (possibly hundreds), and the line represents a route from one point to another. With that many annotations on top, the line/route is basically useless because you can't see it. I've looked at things like MKOverlayLevel, but it only supports .aboveRoads or .aboveLabels. Is there a way to set the z-axis of a map overlay so that it truly is on top of everything else on the map, including annotations? And if not directly in MapKit, what other options might I have? Worth noting that I'm targeting iOS 16.4 and above, so that's my limitation here.
Replies: 1 · Boosts: 0 · Views: 192 · Activity: 1w
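As far as I know, MKMapView always draws annotation views above overlay renderers, so no overlay level will put the polyline truly on top. A sketch of a partial mitigation (an assumption, not a documented pattern): add the route at .aboveLabels and lower the annotations' display and z priorities so crowded markers thin out instead of burying the line. The delegate class, reuse identifier, and styling are illustrative.

```swift
import MapKit

final class RouteMapDelegate: NSObject, MKMapViewDelegate {
    // Add the route at the highest overlay level MapKit offers.
    func showRoute(_ coordinates: [CLLocationCoordinate2D], on mapView: MKMapView) {
        let route = MKPolyline(coordinates: coordinates, count: coordinates.count)
        mapView.addOverlay(route, level: .aboveLabels)
    }

    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        guard let polyline = overlay as? MKPolyline else { return MKOverlayRenderer(overlay: overlay) }
        let renderer = MKPolylineRenderer(polyline: polyline)
        renderer.strokeColor = .systemBlue
        renderer.lineWidth = 4
        return renderer
    }

    func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
        guard !(annotation is MKUserLocation) else { return nil }
        let view = mapView.dequeueReusableAnnotationView(withIdentifier: "poi")
            ?? MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: "poi")
        view.annotation = annotation
        view.displayPriority = .defaultLow   // crowded markers may hide instead of stacking
        view.zPriority = .min                // keep point annotations low in the annotation stack
        return view
    }
}
```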
SwiftUI Color Issue
I have run into an issue that is illustrated by the code in the following GitHub repository: https://github.com/dougholland/ColorTest. When a SwiftUI color originates from the ColorPicker, it can be persisted correctly and renders the same as the original color. When the color originates from MapFeature.backgroundColor, it is always rendered with the light-appearance version of the color and not the dark-appearance version. The readme in the GitHub repo has screenshots that show this. Any assistance would be greatly appreciated, as this is affecting an app that is in development and I'd like to resolve it before the app is released. If this is caused by a framework bug, any possible workaround would also be greatly appreciated. I suspect it may be a framework issue, possibly with some code related to MapFeature.backgroundColor, because the issue does not occur when the color originates from the ColorPicker.
Replies: 2 · Boosts: 1 · Views: 257 · Activity: 2w
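A possible workaround sketch, under the assumption that the dynamic color is being flattened to its light variant at persistence time: resolve the color explicitly for each color scheme and store the numeric components instead. The helper name and the persistence side are mine, not from the project above.

```swift
import SwiftUI

// Resolve a dynamic Color (e.g. MapFeature.backgroundColor) for a specific
// appearance so both variants can be persisted as plain RGBA values.
func resolvedComponents(of color: Color, scheme: ColorScheme) -> Color.Resolved {
    var environment = EnvironmentValues()
    environment.colorScheme = scheme
    return color.resolve(in: environment)
}

// Example: capture both appearances before saving, then rebuild later with
// Color(red: Double(dark.red), green: Double(dark.green),
//       blue: Double(dark.blue), opacity: Double(dark.opacity)).
// let light = resolvedComponents(of: featureColor, scheme: .light)
// let dark  = resolvedComponents(of: featureColor, scheme: .dark)
```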
How can I use the .p8 key to get a Maps Server API token without the 7-day duration?
https://developer.apple.com/documentation/applemapsserverapi/creating-and-using-tokens-with-maps-server-api

This doesn't really say what to do with the .p8 private key. I know I am a noob, but I really don't understand the Apple docs at all; there is no example or anything like that.

```javascript
const appleMapKit = await fetch("https://maps-api.apple.com/v1/token", {
  headers: {
    Authorization: `Bearer ${server_api_token}`,
  },
});
```

This is what I did, but because I created the token on the website, server_api_token only lasts 7 days. So I tried to use a Profile - Key to replace that. I have the .p8 file; how can I use it to create the token for the server API?
Replies: 1 · Boosts: 0 · Views: 217 · Activity: Nov ’24
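As I understand the linked page, the flow is: sign a short JWT (the Maps auth token) with the .p8 key using ES256, then exchange it at /v1/token for an access token, which is the short-lived one your fetch call already retrieves. A server-side Swift sketch of that flow using CryptoKit; teamID, keyID, and the 30-minute expiry are placeholders to adjust, and the response parsing is left as raw JSON.

```swift
import CryptoKit
import Foundation

// Base64URL without padding, as required for JWT segments.
func base64URL(_ data: Data) -> String {
    data.base64EncodedString()
        .replacingOccurrences(of: "+", with: "-")
        .replacingOccurrences(of: "/", with: "_")
        .replacingOccurrences(of: "=", with: "")
}

// Build the Maps auth token: an ES256-signed JWT carrying your team ID (iss)
// and signed with the downloaded .p8 private key (identified by its key ID).
func makeMapsAuthToken(teamID: String, keyID: String, p8PEM: String) throws -> String {
    let header = #"{"alg":"ES256","kid":"\#(keyID)","typ":"JWT"}"#
    let now = Int(Date().timeIntervalSince1970)
    let payload = #"{"iss":"\#(teamID)","iat":\#(now),"exp":\#(now + 1800)}"#

    let signingInput = base64URL(Data(header.utf8)) + "." + base64URL(Data(payload.utf8))
    let key = try P256.Signing.PrivateKey(pemRepresentation: p8PEM)
    let signature = try key.signature(for: Data(signingInput.utf8))
    return signingInput + "." + base64URL(signature.rawRepresentation)
}

// Exchange the auth token for a short-lived access token at /v1/token.
func fetchAccessToken(authToken: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "https://maps-api.apple.com/v1/token")!)
    request.setValue("Bearer \(authToken)", forHTTPHeaderField: "Authorization")
    let (data, _) = try await URLSession.shared.data(for: request)
    return data   // JSON containing the access token and its expiry
}
```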
MapCameraPosition & userLocation
I have a Map within a SwiftUI view. I initialize a variable like this:

```swift
var cameraPosition: MapCameraPosition = .userLocation(followsHeading: true, fallback: .automatic)
```

and then I want to zoom the map, with code like this:

```swift
let userRegion = MKCoordinateRegion(
    center: userLocation.coordinate,
    span: MKCoordinateSpan(
        latitudeDelta: 0.01,
        longitudeDelta: 0.01
    )
)
cameraPosition = .region(userRegion)
```

But it doesn't work. How can I solve this problem?
Replies: 0 · Boosts: 1 · Views: 168 · Activity: Nov ’24
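The snippet doesn't show how cameraPosition reaches the Map; programmatic camera changes only take effect when the position is view state passed to Map(position:) as a binding. A reduced sketch of that wiring; the zoomIn(on:) helper name is illustrative.

```swift
import SwiftUI
import MapKit

struct ZoomableMapView: View {
    // The position must be view state bound to the Map, otherwise
    // assigning a new value has no effect on what the map shows.
    @State private var cameraPosition: MapCameraPosition =
        .userLocation(followsHeading: true, fallback: .automatic)

    var body: some View {
        Map(position: $cameraPosition)
    }

    func zoomIn(on coordinate: CLLocationCoordinate2D) {
        withAnimation {
            cameraPosition = .region(MKCoordinateRegion(
                center: coordinate,
                span: MKCoordinateSpan(latitudeDelta: 0.01, longitudeDelta: 0.01)
            ))
        }
    }
}
```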
MapSelection of MapFeatures and MKMapItems
I have a Map within a SwiftUI app that has the selection parameter set to an optional MapSelection. I need to be able to select MapFeatures, which works as expected, as well as MKMapItems that are added to the map. The MKMapItems are not selectable. Is it possible to make the Map allow the user to select MKMapItems as well as MapFeatures?
Replies: 1 · Boosts: 0 · Views: 207 · Activity: Nov ’24
How to set followsUserLocation and define a custom camera distance dynamically using MapKit for SwiftUI ?
Hello, I've created an app that follows the user as they navigate through public transport. I want the camera to follow the user, and at the same time I want the camera distance (elevation) to change dynamically depending on speed and other factors. I tried a first approach using camera keyframes, but I've noticed a lot of crashes when the app comes back to the foreground.

```swift
.mapCameraKeyframeAnimator(trigger: cameraFrame) { camera in
    KeyframeTrack(\MapCamera.centerCoordinate) {
        LinearKeyframe(cameraFrame.center, duration: cameraFrame.duration, timingCurve: cameraFrame.curve)
    }
    KeyframeTrack(\MapCamera.distance) {
        LinearKeyframe(cameraFrame.distance, duration: cameraFrame.duration, timingCurve: cameraFrame.curve)
    }
}
```

Some logs mention wrong center coordinates (nan, nan), but when I print them they show valid coordinates. I've also tried manually setting the camera for each position update, but the result is not smooth.

```swift
position = .camera(
    .init(.init(
        lookingAtCenter: newPosition,
        fromDistance: cameraElevation,
        pitch: .zero,
        heading: .zero)
    )
)
```

What's the best way to achieve this?
Replies: 0 · Boosts: 0 · Views: 155 · Activity: Nov ’24
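One variation on the "manual update" approach above (an assumption, not an Apple-recommended pattern): wrap each camera assignment in withAnimation with a duration matching the expected gap between location updates, so consecutive moves interpolate instead of jumping. A small sketch; the parameter names are illustrative.

```swift
import SwiftUI
import MapKit

// Animate each camera update over the expected interval between location
// updates so successive movements blend together.
func updateCamera(position: Binding<MapCameraPosition>,
                  center: CLLocationCoordinate2D,
                  distance: CLLocationDistance,
                  interval: TimeInterval) {
    withAnimation(.linear(duration: interval)) {
        position.wrappedValue = .camera(
            MapCamera(centerCoordinate: center, distance: distance, heading: 0, pitch: 0)
        )
    }
}
```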
Center Map in SwiftUI to specific latitude and longitude.
I have a Map in SwiftUI using MapKit, and the map has several annotations and MapCircles added to it. I need the ability to center the map on a specific latitude and longitude. The issue is that the map instead centers itself so that all annotations, MapCircles, etc. are visible. How can I have it disregard the items added to the map and center at a specific latitude and longitude, and ideally also control the zoom level of the map?
Replies: 1 · Boosts: 0 · Views: 238 · Activity: Nov ’24
MapkitJS - not able to click on PointOfInterest and use it for actions
Hi, I've been trying to work with MapKit JS for a website, but I'm stumped on one basic capability: I want to be able to click on a point of interest and perform some actions, such as getting its coordinates and attaching an annotation to it (e.g. a callout). In my code, points of interest are selectable:

```javascript
map.selectableMapFeatures = [
  mapkit.MapFeatureType.PointOfInterest,
];
```

But when I click on one, I do see the marker pop up but nothing else (which is not much help, since there is no additional information in the marker itself). I see no event getting triggered that I can do something with. I am using an event listener as follows:

```javascript
map.addEventListener('single-tap', (event) => {
  const coordinate = map.convertPointOnPageToCoordinate(event.pointOnPage);
  console.log('Map tapped at:', coordinate);
  console.log('Map tapped event:', event);
  ...
```

I guess I have to grab the Place ID somehow, but I don't know how to. Thanks for any help.
Replies: 0 · Boosts: 0 · Views: 245 · Activity: Nov ’24
MapKit polygon render bug when single-vertex self-intersection
We have to draw polygons inside an MKMapView based on coordinates retrieved from an external source. It seems that MapKit does not behave correctly when polygons have a single-vertex self-intersection. Here is a simple point list example (every element is a pair of latitude and longitude values):

[(0, 0), (20, 0), (10, 10), (20, 20), (0, 20), (10, 10), (0, 0)]

The next image shows the rendering issue. But if the list is slightly changed in this way

[(0, 0), (20, 0), (10, 10), (20, 20), (0, 20), (15, 10), (0, 0)]

the issue disappears. The next image shows it. So it's not a self-intersection and self-tangency problem; we think a single-vertex self-intersection is a buggy edge case for MapKit. Right now we have fixed the problem by finding the duplicated coordinates and applying a small offset (1e-8) to one of them, but it's a temporary solution and adds rendering delay. The problem is not due to iOS versions, since iOS 17 and 18 show the same issue. It also happens on simulators and real devices. Here is a playground example, based mostly on the "Map Playground" template Xcode offers. If you run it without modifying it, the playground shows the "bugged" polygon. If you use notBugPoints instead of the default bugPoints for the polygon, the playground shows the "not-bugged" polygon.

```swift
import MapKit
import PlaygroundSupport

// Create an MKMapViewDelegate to provide a renderer for our overlay
class MapViewDelegate: NSObject, MKMapViewDelegate {
    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        if let overlay = overlay as? MKPolygon {
            let polygonRenderer = MKPolygonRenderer(overlay: overlay)
            polygonRenderer.fillColor = .red
            return polygonRenderer
        }
        return MKOverlayRenderer(overlay: overlay)
    }
}

// Create a strong reference to a delegate
let delegate = MapViewDelegate()

// Create an MKMapView
let mapView = MKMapView(frame: CGRect(x: 0, y: 0, width: 800, height: 800))
mapView.delegate = delegate

// Configure the map elevation and emphasis style
let configuration = MKStandardMapConfiguration(elevationStyle: .realistic, emphasisStyle: .default)
mapView.preferredConfiguration = configuration

// Create an overlay
let bugPoints = [
    MKMapPoint(CLLocationCoordinate2DMake(0, 0)),
    MKMapPoint(CLLocationCoordinate2DMake(20, 0)),
    MKMapPoint(CLLocationCoordinate2DMake(10, 10)),
    MKMapPoint(CLLocationCoordinate2DMake(20, 20)),
    MKMapPoint(CLLocationCoordinate2DMake(0, 20)),
    MKMapPoint(CLLocationCoordinate2DMake(10, 10)),
    MKMapPoint(CLLocationCoordinate2DMake(0, 0))
]
let notBugPoints = [
    MKMapPoint(CLLocationCoordinate2DMake(0, 0)),
    MKMapPoint(CLLocationCoordinate2DMake(20, 0)),
    MKMapPoint(CLLocationCoordinate2DMake(10, 10)),
    MKMapPoint(CLLocationCoordinate2DMake(20, 20)),
    MKMapPoint(CLLocationCoordinate2DMake(0, 20)),
    MKMapPoint(CLLocationCoordinate2DMake(15, 10)),
    MKMapPoint(CLLocationCoordinate2DMake(0, 0))
]
let polygon = MKPolygon(points: bugPoints, count: bugPoints.count)
mapView.addOverlay(polygon)

// Frame our annotation and overlay
mapView.camera = MKMapCamera(lookingAtCenter: CLLocationCoordinate2DMake(10, 10), fromDistance: 5000000, pitch: 0, heading: 0)

// Add the created mapView to our Playground Live View
PlaygroundPage.current.liveView = mapView
```
Replies: 2 · Boosts: 0 · Views: 270 · Activity: Nov ’24
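For anyone hitting the same rendering artifact, here is a sketch of the temporary workaround described in the post: offset repeated interior vertices by a tiny epsilon before building the MKPolygon. The epsilon value, the string-based duplicate key, and the handling of the closing vertex are illustrative choices, not part of the original report.

```swift
import MapKit

// Nudge repeated interior vertices by a tiny offset so the polygon no
// longer has a single-vertex self-intersection.
func deduplicatedForRendering(_ coordinates: [CLLocationCoordinate2D],
                              epsilon: CLLocationDegrees = 1e-8) -> [CLLocationCoordinate2D] {
    var seen = Set<String>()
    return coordinates.enumerated().map { index, coordinate in
        // Leave the closing vertex (same as the first) alone.
        let isClosingVertex = index == coordinates.count - 1
        let key = "\(coordinate.latitude),\(coordinate.longitude)"
        if !isClosingVertex, !seen.insert(key).inserted {
            return CLLocationCoordinate2D(latitude: coordinate.latitude + epsilon,
                                          longitude: coordinate.longitude + epsilon)
        }
        return coordinate
    }
}
```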
Apple Map Snapshot API broken??
I have been using the following Python library to generate Apple Maps snapshots. It has worked fine until about 12 hours ago; now all I'm getting is "bad request" for any snapshot with overlays. If it's just a snapshot with a defined center and no polyline overlays, it still works. Perhaps something has changed with the API's way of parsing percent-encoded parameters? It's super irritating that there's no changelog or source code to view. What the heck??? https://pypi.org/project/mapsnap/
Replies: 0 · Boosts: 2 · Views: 236 · Activity: Oct ’24