On iOS 18, starting this October, the location is not syncing in real time. You might drive for several kilometers and the location displayed in Waze will remain at the same position. This can put you in a position to miss your exit on the highway and have to drive another 40-50 km, losing time and energy, to get back on the original route. Reinstalling the application twice did not correct the behavior, nor did changing the device; the same issue is present on an iPhone 16.
The Waze support team shared the following:
"Hey there! We'd like to apologize for any hassle that this has caused you. This article: https://*******.com/3fu88jwj might help solve your issue. If you still need help, please open a support ticket by copying and pasting this link: https://*******.com/mrx77ukz in a web browser, and someone from the team will get back to you soon."
Based on the above facts, this is clearly an iOS 18 issue that needs to be prioritized.
Maps and Location
Build maps and location awareness capabilities into your apps.
Posts under Maps and Location tag
Hi all, I have a delivery app that requires the user's location on startup, for example to determine the list of services available to the user. So on startup the app asks for location permission.
While the app does have a concept of creating addresses, we still require location services so that the user can only select an address close to their current GPS position.
This is done to prevent fake orders. Many apps with similar functionality allow you to pick any location on the map even if you decline location services, which in turn causes a lot of trouble since fake orders are constantly being created. So to counter that, we're asking for the user's location.
Now my question is: will Apple force me to never require location access, or will explaining the problem of fake orders to them allow me to pass review?
Another feature I have is a driving mode: a map for drivers to see routes to a specific order. Obviously I need location services for that to function. Or will Apple's review team also object to that and force me to implement some awkward way of letting the user choose their current location?
There are a lot of trolls who create fake orders and then have 20 delivery drivers come to a place in the middle of a road. This is a serious issue and will kill our business if not countered somehow.
I am trying to derive the Apple Place ID from a CLPlacemark (or via a MKMapItem derived from it) created via either CLGeocoder().reverseGeocodeLocation() or CLGeocoder().geocodeAddressString(). In many cases, the placemark returned from these functions contains detailed information (name, address, coordinates, etc), implying that the Apple Place ID is known, but the identifier is not present.
The only way I have found to get a Place ID is via MKLocalSearch. Wondering if I am missing something here.
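In case it helps anyone else, this is roughly the MKLocalSearch workaround I'm using for now. It is only a sketch: it assumes the top result near the geocoded coordinate corresponds to the original placemark (which is not guaranteed), and placeIdentifier(for:) is just an illustrative name.
import MapKit
import CoreLocation

@available(iOS 18.0, *)
func placeIdentifier(for placemark: CLPlacemark) async throws -> MKMapItem.Identifier? {
    guard let location = placemark.location else { return nil }
    // Search for the placemark's name in a small region around its coordinate.
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = placemark.name ?? ""
    request.region = MKCoordinateRegion(center: location.coordinate,
                                        latitudinalMeters: 200,
                                        longitudinalMeters: 200)
    let response = try await MKLocalSearch(request: request).start()
    // identifier is the Place ID when Apple Maps knows it (iOS 18+).
    return response.mapItems.first?.identifier
}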
I have noticed a discrepancy between behavior on physical devices and simulators in iOS 18.
I am using the latest MapKit APIs to fetch MKMapItems using the following MKLocalSearch:
private func performLocalSearch(_ query: String) async throws -> [MKMapItem] {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = query
    let search = MKLocalSearch(request: request)
    return try await search.start().mapItems
}
This returns an array of MKMapItem on both the simulator and a physical device. The key difference is that on my physical device (iOS 18.1.1) the MKMapItem's identifier value is missing. On the simulator, identifier is always populated for the same search. Any ideas on how to resolve this?
The new MapKit API for those curious:
@available(iOS 6.0, *)
open class MKMapItem : NSObject {
@available(iOS 18.0, *)
open var identifier: MKMapItem.Identifier? { get }
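In case it helps with reproducing, the comparison I'm running on both targets is just logging which results carry an identifier. A quick diagnostic sketch (the query string is arbitrary):
let items = try await performLocalSearch("coffee")
for item in items {
    // On the simulator each item prints an identifier; on my device it prints nil.
    print(item.name ?? "unnamed", String(describing: item.identifier))
}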
I am trying to get my head around how to implement a MapKit view using UIViewRepresentable (I want the map to rotate to align with heading, which Map() can't handle yet, to my knowledge). I am also experimenting with making my LocationManager an actor and setting up a listener. But when combined with UIViewRepresentable this seems to create a rather convoluted data flow, since the @State var of the view model needs to be passed and bound in the UIViewRepresentable. And the listener having for await location in await lm.$lastLocation.values seems at least like a code smell; that double await just feels wrong. But I am also new to Swift, so perhaps what I have here actually is a good approach?
import SwiftUI
import MapKit
import CoreLocation
import os

struct MapScreen: View {
    @State private var vm = ViewModel()

    var body: some View {
        VStack {
            MapView(vm: $vm)
        }
        .task {
            vm.startWalk()
        }
    }
}

extension MapScreen {

    @Observable
    final class ViewModel {
        private var lm = LocationManager()
        private var listenerTask: Task<Void, Never>?

        var course: Double = 0.0
        var location: CLLocation?

        func startWalk() {
            Task {
                await lm.startLocationUpdates()
            }
            listenerTask = Task {
                for await location in await lm.$lastLocation.values {
                    await MainActor.run {
                        if let location {
                            withAnimation {
                                self.location = location
                                self.course = location.course
                            }
                        }
                    }
                }
            }
            Logger.map.info("started Walk")
        }
    }

    struct MapView: UIViewRepresentable {
        @Binding var vm: ViewModel

        func makeCoordinator() -> Coordinator {
            Coordinator(parent: self)
        }

        func makeUIView(context: Context) -> MKMapView {
            let view = MKMapView()
            view.delegate = context.coordinator
            view.preferredConfiguration = MKHybridMapConfiguration()
            return view
        }

        func updateUIView(_ view: MKMapView, context: Context) {
            context.coordinator.parent = self
            if let coordinate = vm.location?.coordinate {
                if view.centerCoordinate != coordinate {
                    view.centerCoordinate = coordinate
                }
            }
        }
    }

    class Coordinator: NSObject, MKMapViewDelegate {
        var parent: MapView

        init(parent: MapView) {
            self.parent = parent
        }
    }
}

actor LocationManager {
    private let clManager = CLLocationManager()
    private(set) var isAuthorized: Bool = false
    private var backgroundActivity: CLBackgroundActivitySession?
    private var updateTask: Task<Void, Never>?
    @Published var lastLocation: CLLocation?

    func startLocationUpdates() {
        updateTask = Task {
            do {
                backgroundActivity = CLBackgroundActivitySession()
                let updates = CLLocationUpdate.liveUpdates()
                for try await update in updates {
                    if let location = update.location {
                        lastLocation = location
                    }
                }
            } catch {
                Logger.location.error("\(error.localizedDescription)")
            }
        }
    }

    func stopLocationUpdates() {
        updateTask?.cancel()
        updateTask = nil
    }

    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        switch clManager.authorizationStatus {
        case .authorizedAlways, .authorizedWhenInUse:
            isAuthorized = true
            // clManager.requestLocation() // ??
        case .notDetermined:
            isAuthorized = false
            clManager.requestWhenInUseAuthorization()
        case .denied:
            isAuthorized = false
            Logger.location.error("Access Denied")
        case .restricted:
            Logger.location.error("Access Restricted")
        @unknown default:
            let statusString = clManager.authorizationStatus.rawValue
            Logger.location.warning("Unknown Access status not handled: \(statusString)")
        }
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        Logger.location.error("\(error.localizedDescription)")
    }
}
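For comparison, here is a minimal sketch of an alternative I'm considering, where the actor hands back an AsyncStream instead of exposing a publisher, so the consumer only needs a single await to obtain the stream. StreamingLocationManager and its stream-based API are my own assumptions, not existing types.
import CoreLocation

actor StreamingLocationManager {
    private var continuation: AsyncStream<CLLocation>.Continuation?
    private var updateTask: Task<Void, Never>?

    // Starts live updates and returns a stream the caller can iterate directly.
    func startLocationUpdates() -> AsyncStream<CLLocation> {
        let (stream, continuation) = AsyncStream<CLLocation>.makeStream()
        self.continuation = continuation
        updateTask = Task {
            do {
                for try await update in CLLocationUpdate.liveUpdates() {
                    if let location = update.location {
                        continuation.yield(location)
                    }
                }
            } catch {
                continuation.finish()
            }
        }
        return stream
    }

    func stopLocationUpdates() {
        updateTask?.cancel()
        continuation?.finish()
    }
}

// Consumption from the view model then needs only one await:
// let stream = await lm.startLocationUpdates()
// for await location in stream { /* hop to the main actor and update state */ }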
According to the following article, the CLCircularGeographicCondition has a limit whereby only 20 conditions can be monitored by any single app.
Monitoring the user’s proximity to geographic regions
While I understand the rationale behind this limit, 20 conditions seems quite low for some apps. It would be good if an app could request that the user opt in to allowing more conditions if they understand the impact this might have on battery life, etc.
I'm presently migrating an app to use CLCircularGeographicCondition instead of the now-deprecated CLCircularRegion. It would be good if there were more guidance on how to use the new Core Location APIs to monitor how many conditions are in use within an app and how they can be deactivated when no longer required, allowing the app to free up more of the 20 available conditions.
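To make the request concrete, this is the kind of bookkeeping I mean, using CLMonitor's identifiers list to stay under the limit. It is a sketch only; the monitor name, the hard-coded 20, and addConditionIfRoomAvailable are illustrative assumptions.
import CoreLocation

func addConditionIfRoomAvailable(center: CLLocationCoordinate2D,
                                 radius: CLLocationDistance,
                                 identifier: String) async {
    let monitor = await CLMonitor("GeoConditions")

    // identifiers lists every condition currently registered with this monitor.
    let inUse = await monitor.identifiers
    guard inUse.count < 20 else {
        print("Condition limit reached (\(inUse.count)); remove one before adding more.")
        return
    }

    let condition = CLCircularGeographicCondition(center: center, radius: radius)
    await monitor.add(condition, identifier: identifier)
}

// Freeing a slot once a condition is no longer needed:
// await monitor.remove(identifier)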
Sometimes, I receive old location data from the location manager.
In my app, I am using geofencing to perform an action when the user enters or leaves a specified location. The geofencing (CLMonitor) is active permanently and should work across multiple app sessions or after the device is restarted. It should also work after the app has been minimized or terminated. This worked perfectly with iOS 17 and earlier, but with iOS 18, things changed. As soon as iOS 18 dropped, users informed me that the app no longer performs the entry/exit action reliably (without me making any changes to the app). Most of the time, events are missed entirely. Sometimes, after the user opens or resumes the app, duplicate events are delivered and/or events with the current time instead of the correct time of entry/exit.
I am making sure that the app has the "Always" location permission before geofencing is enabled
The geofence radius is between 20 and 500 m, but even with the maximum radius specified, the geofencing is unreliable
For the same user and geofence, the entry/exit event is delivered occasionally, but not always
I am currently not using CLLocationManager.allowsBackgroundLocationUpdates (even though it's documented as "Apps that receive location updates when running in the background must include the UIBackgroundModes key (with the location value) in their app’s Info.plist file") because it wasn't necessary on iOS 17 and in my tests, using it didn't yield any improvements
In my search for what could have caused this change, I found this WWDC video about location authorization: . It appears that with iOS 18, it is now required to have an active CLServiceSession to ensure that location updates are delivered to my app. Even though the video is long (and I've watched it multiple times), some things are still unclear. For example, the docs state:
If your app actively receives and processes location updates and terminates, it should restart those APIs upon launch in order to continue receiving updates.
Also, in the video it is stated that:
... So your job, ..., is to make sure that your process launch logic knows what features it has been tasked with pursuing, and re-takes session objects...
But on the other hand it's also said that:
you can only start holding one (a CLServiceSession) when your app is in the foreground
and also
... CLMonitor.events won’t yield results when it is not in use, unless a session which was started in the foreground, ....
To summarize my questions, for the geofencing to work as described above:
when exactly do I need to create a CLServiceSession if the app is launched into the background? Immediately in the applicationDidFinishLaunching method, even though the app is still in the background (applicationState is background)? Or later on, when the app is opened again by the user, e.g. in applicationDidBecomeActive (and applicationState is active)?
do I need to specify the background mode capability as noted in the Handling location updates in the background article?
do I need to create a CLBackgroundActivitySession as noted in the Handling location updates in the background article?
does it matter, which of the four initializer methods I am using to create the CLServiceSession (with CLServiceSessionAuthorizationRequirementAlways)?
does it matter whether I specify NSLocationRequireExplicitServiceSession in the Info.plist when I already ensure that the app has the "Always" location permission when the feature is being enabled?
Does a CLServiceSession last indefinitely and should it only be invalidated once the user disables the feature?
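For reference, this is my current understanding of what the restart logic should look like. It is a sketch built on my assumptions above (re-taking the session at every launch, including background launches, and invalidating it only when the user turns the feature off), which is exactly what I'd like to have confirmed.
import CoreLocation

final class GeofenceController {
    private var session: CLServiceSession?
    private var monitoringTask: Task<Void, Never>?

    // Called at every launch while the feature is enabled, including background launches.
    func resume() {
        session = CLServiceSession(authorization: .always)
        monitoringTask = Task {
            let monitor = await CLMonitor("Geofences")
            do {
                for try await event in await monitor.events {
                    // event.date should carry the time of the entry/exit transition.
                    print(event.identifier, event.state, event.date)
                }
            } catch {
                print("Monitoring ended: \(error)")
            }
        }
    }

    // Only when the user disables the feature.
    func disable() {
        monitoringTask?.cancel()
        session?.invalidate()
        session = nil
    }
}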
Can I integrate a Timer along with liveUpdates(_:) in order to control the update frequency? Let's say, for example, I need the user's location every 1 minute. Is it possible to do so?
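It should be possible without mixing a Timer into the async sequence: keep iterating liveUpdates() and only act on an update when enough time has passed since the last one you used. A sketch follows; the 60-second interval is just an example, and the stream itself keeps running at the system's cadence, so this only filters how often you react.
import CoreLocation

func consumeThrottledUpdates() async throws {
    var lastUsed = Date.distantPast
    for try await update in CLLocationUpdate.liveUpdates() {
        guard let location = update.location else { continue }
        // Ignore updates that arrive less than a minute after the last one we used.
        guard Date.now.timeIntervalSince(lastUsed) >= 60 else { continue }
        lastUsed = .now
        print("Location at \(location.timestamp): \(location.coordinate)")
    }
}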
I encountered a crash in iOS 17 related to CLBackgroundActivitySession, which appears to be due to misleading guidance in an Apple WWDC video.
Crash sample code: https://github.com/steve-ham/AppleLocationCrash
Simplified Reproduction Steps:
1. Open the GitHub sample app.
2. Archive and export (Distribute App -> Custom -> (Release Testing, Enterprise, or Debugging) -> Export).
3. Open the app.
4. Tap enableBackgroundLocation -> select Allow While Using App on the system popup.
5. Tap disableBackgroundLocation.
6. Go to the iOS home screen.
7. Wait for 10 seconds.
8. Reopen the app -> crash occurs.
The crash happens because setting CLBackgroundActivitySession to nil does not end the session, despite Apple’s guidance suggesting it should. Below is the exact quote from WWDC 2023, which explicitly states that both calling invalidate() or letting the object get destroyed (i.e., setting to nil) would end the session:
WWDC 2023 Discover Streamlined Location Updates (https://developer.apple.com/videos/play/wwdc2023/10180/)
“Before starting the updates, you should instantiate a CLBackgroundActivitySession object to start a new session. Note, we are assigning the session to self.backgroundActivity, which is a property and not to a local variable. And this is important because if we used a local variable, then when it goes out of scope, the object it holds would be deallocated, invalidating the session and potentially ending your app’s access to location. Then when we want to end our session, we can do that by sending the invalidate message or by letting the object be destroyed.”
I’ve submitted this to Apple for resolution but wanted to share this with the community. This misguidance has caused issues in my app’s release. If Apple could reply to confirm or provide clarification, it would be greatly appreciated.
P.S. Even a minimal implementation in viewDidLoad triggers the crash:
let session = CLBackgroundActivitySession()
print("session \(session)")
Hi,
Have been trying to work with MapKit JS for a website, but I'm stumped on one basic capability: I want to be able to click on a point of interest and perform some actions such as:
Get its coordinates
Attach an annotation to it (e.g. a callout)
In my code, points of interest are selectable:
map.selectableMapFeatures = [
    mapkit.MapFeatureType.PointOfInterest,
];
But when I click on one, I do see the marker pop up but nothing else (which is not much help since there is no additional information in the marker itself). I see no event getting triggered that I can do something with.
I am using an event listener as follows:
map.addEventListener('single-tap', (event) => {
    const coordinate = map.convertPointOnPageToCoordinate(event.pointOnPage);
    console.log('Map tapped at:', coordinate);
    console.log('Map tapped event:', event);
    ...
I guess I have to grab the Place ID somehow but I don't know how to.
Thanks for any help.
We have to draw polygons inside an MKMapView based on coordinates retrieved from an external source.
It seems that MapKit does not behave correctly where polygons have single-vertex self-intersection.
Here is a simple point list example (every element is a pair of latitude and longitude values):
[(0, 0), (20, 0), (10, 10), (20, 20), (0, 20), (10, 10), (0, 0)]
The next image shows the rendering issue.
But if the list is slightly changed in this way
[(0, 0), (20, 0), (10, 10), (20, 20), (0, 20), (15, 10), (0, 0)]
the issue disappears. The next image shows it.
So it doesn't seem to be a general self-intersection or self-tangency problem; we think single-vertex self-intersection is a buggy edge case for MapKit.
Right now we work around this problem by finding the duplicated coordinates and applying a small offset (1e-8) to one of them, but it's a temporary solution and it adds rendering delay.
The problem is not due to iOS versions, since iOS 17 and 18 have the same issue. Also it happens on simulators and real devices.
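For completeness, this is roughly what our temporary fix looks like (a sketch; the epsilon value and the exception for the closing vertex are our own choices):
import CoreLocation

func nudgingDuplicateVertices(_ coordinates: [CLLocationCoordinate2D],
                              epsilon: CLLocationDegrees = 1e-8) -> [CLLocationCoordinate2D] {
    var seen = Set<String>()
    return coordinates.enumerated().map { index, coordinate in
        // The closing vertex is allowed to repeat the first one.
        let isClosingVertex = index == coordinates.count - 1
        let key = "\(coordinate.latitude),\(coordinate.longitude)"
        if seen.contains(key) && !isClosingVertex {
            // Offset repeated interior vertices so MapKit no longer sees a
            // single-vertex self-intersection.
            return CLLocationCoordinate2D(latitude: coordinate.latitude + epsilon,
                                          longitude: coordinate.longitude + epsilon)
        }
        seen.insert(key)
        return coordinate
    }
}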
Here is the playground example, based mostly on the "Map Playground" template Xcode offers. If you run it without modifying it, the playground shows the "bugged" polygon. If you use notBugPoints instead of the default bugPoints for the polygon, the playground shows the "not-bugged" polygon.
import MapKit
import PlaygroundSupport
// Create an MKMapViewDelegate to provide a renderer for our overlay
class MapViewDelegate: NSObject, MKMapViewDelegate {
    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        if let overlay = overlay as? MKPolygon {
            let polygonRenderer = MKPolygonRenderer(overlay: overlay)
            polygonRenderer.fillColor = .red
            return polygonRenderer
        }
        return MKOverlayRenderer(overlay: overlay)
    }
}
// Create a strong reference to a delegate
let delegate = MapViewDelegate()
// Create an MKMapView
let mapView = MKMapView(frame: CGRect(x: 0, y: 0, width: 800, height: 800))
mapView.delegate = delegate
// Configure The Map elevation and emphasis style
let configuration = MKStandardMapConfiguration(elevationStyle: .realistic, emphasisStyle: .default)
mapView.preferredConfiguration = configuration
// Create an overlay
let bugPoints = [
    MKMapPoint(CLLocationCoordinate2DMake(0, 0)),
    MKMapPoint(CLLocationCoordinate2DMake(20, 0)),
    MKMapPoint(CLLocationCoordinate2DMake(10, 10)),
    MKMapPoint(CLLocationCoordinate2DMake(20, 20)),
    MKMapPoint(CLLocationCoordinate2DMake(0, 20)),
    MKMapPoint(CLLocationCoordinate2DMake(10, 10)),
    MKMapPoint(CLLocationCoordinate2DMake(0, 0))
]
let notBugPoints = [
    MKMapPoint(CLLocationCoordinate2DMake(0, 0)),
    MKMapPoint(CLLocationCoordinate2DMake(20, 0)),
    MKMapPoint(CLLocationCoordinate2DMake(10, 10)),
    MKMapPoint(CLLocationCoordinate2DMake(20, 20)),
    MKMapPoint(CLLocationCoordinate2DMake(0, 20)),
    MKMapPoint(CLLocationCoordinate2DMake(15, 10)),
    MKMapPoint(CLLocationCoordinate2DMake(0, 0))
]
let polygon = MKPolygon(points: bugPoints, count: bugPoints.count)
mapView.addOverlay(polygon)
// Frame our annotation and overlay
mapView.camera = MKMapCamera(lookingAtCenter: CLLocationCoordinate2DMake(10, 10), fromDistance: 5000000, pitch: 0, heading: 0)
// Add the created mapView to our Playground Live View
PlaygroundPage.current.liveView = mapView
On iPhone, we can use iBeacon to wake up the app in the background for a Bluetooth scanning connection. Now we want to port this function to our Apple Watch app, but the iBeacon-related API is not available on watchOS. Does watchOS have a similar wake-up mechanism?
We create new development maps for builders and developers. Is it possible for us to add the new streets and addresses to Apple Maps so that visitors can find these brand-new streets and addresses?
I have an application that uses geolocation to track the user’s location and trigger actions when the app is in either the foreground or background. Currently, it seems that region entry is not triggered unless an app like Maps (which actively uses location services) is opened. The location permissions are correctly set to “Always” with precise location enabled.
We are using geofencing to set up regions and trigger actions when entering or leaving them.
Is there something I’m missing in the configuration that could be preventing region monitoring from triggering properly when the app is in use or in background?
My app has been using MKLocalSearch.Request for keyword-based location searches, and it has worked smoothly for a long time. However, starting last Wednesday, I began receiving an error from MKLocalSearch.start: MKErrorDomain (error code 4).
This issue only occurs when the network environment is based in mainland China (where the API uses the Amap data source). When the network switches to other regions and another Apple Maps data source is used, the error does not occur.
Another complication is that the API doesn't always fail; certain keywords still work (for example, "Huawei").
Already filed a ticket in Feedback Assistant: https://feedbackassistant.apple.com/feedback/15544549
For some background, my app is a Flutter app, and I have opened it in Xcode to try the following; I can also build and run it on a simulator from Xcode.
I want to be able to simulate custom moving locations for my app. I tried going to Debug -> Simulate Location as online tutorials show, but it is all greyed out.
I have location simulation enabled (I have tried changing the default location too):
Within the simulator itself I only have the predefined routes or a custom static location with no option for a moving location.
How can I simulate a custom moving location?
If it isn't possible, is there any way to simulate the location for a Mac app, since I can build the Flutter app for Mac as well?
I am currently developing an application that requires access to GPS coordinates from photos on iOS. However, with the recent update to iOS 18 beta, I have encountered a challenge: I can only view photos within maps, and I am unable to access the GPS coordinates directly.
Could you please provide guidance on how to enable or retrieve GPS coordinates from photos in the current iOS 18 beta version? Any insights or resources you could share would be greatly appreciated.
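For reference, this is the standard PhotoKit route to per-photo GPS data that I would expect to use (a sketch assuming the app has been granted photo library access); what I'm unsure about is whether, and how, this still works on the iOS 18 beta:
import Photos
import CoreLocation

func printLocationsOfRecentPhotos(limit: Int = 10) {
    let options = PHFetchOptions()
    options.fetchLimit = limit
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

    let assets = PHAsset.fetchAssets(with: .image, options: options)
    assets.enumerateObjects { asset, _, _ in
        if let location = asset.location {
            print("Photo \(asset.localIdentifier): \(location.coordinate)")
        } else {
            print("Photo \(asset.localIdentifier): no GPS data")
        }
    }
}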
Thank you for your assistance!
Hello.
I'm trying to use the Core Location service in several applications on my Mac (Safari, Maps, Weather, Chrome with its Core Location configuration, or just triggering the API from a standalone app), and it seems that all of them constantly fail to provide the geolocation of my device with the same error (kCLErrorLocationUnknown).
I'm using MacBook Pro (M3 Max) with Sonoma 14.7.
I’m sure the apps have access in the settings and everything is in order except the Core Location response.
I've tested the issue on multiple MacBooks and it reproduces 100% of the time on all of them.
Did anyone encounter a similar issue?
Hi,
Where can I get more information on the expected behaviour of the liveUpdates() configuration options: .default, .automotiveNavigation, .fitness, .automotive, and .otherNavigation?
I'm looking for the expected accuracy, frequency of updates, stationary/non-stationary transitions, etc.
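For context, this is where the configuration gets plugged in; the part I can't find documented is how the cases actually differ in accuracy, update frequency, and the stationary transition (the .fitness choice below is just an example):
import CoreLocation

func startFitnessUpdates() async throws {
    // The LiveConfiguration is passed when the async sequence is created.
    for try await update in CLLocationUpdate.liveUpdates(.fitness) {
        if let location = update.location {
            print("±\(location.horizontalAccuracy) m at \(location.timestamp)")
        }
    }
}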
Thanks