If you are encountering this via wireless debugging, turn off Wi-Fi and Bluetooth on both your Mac and iPhone. Not just the Control Center toggles, which only stop new connections--go into Settings and disable them outright. After both are off, turn both radios back on. Something else that sometimes helps: swipe to reveal Control Center and just let it sit there for a few moments.
When this happens to me, I typically plug in and debug over a cable. If that doesn't work, reboot both devices.
Same here. Xcode 12.1. Even though I'm a team of one, it's still bothersome trying to keep commits clean. I always discard the changes on it. Strangely, the next time I open the project, the same changes aren't made again.
I added elevation gain tracking to my fitness app earlier this year in my Hiking update. I haven't seen anything like this in development, nor heard any reports from the field. I have a Series 3 on 6.2.8 and have never noticed anything like this.
As far as I know, CMAltimeter solely uses the barometric pressure sensor under the hood--at least until the new watches, which also use GPS and Wi-Fi for improved data in the always-on scenario. Barometric pressure sensors have a sampling time during which they 'accumulate' change, so we are limited by the update frequency of both the hardware and the software.
I'll probably put something like this in my code to account for it, so I appreciate the share! I'd start by keeping track of my last known presumed-good (initially the first) CMAltitudeData, then put the following guards (pseudo-code) into my update handler:
guard newData.timestamp > lastData.timestamp else { /* BAD */ }
guard abs(newData.relativeAltitude - lastData.relativeAltitude) < Constants.reasonableAltitudeDelta else { /* BAD */ }
Unless your app supports cliff jumping or some other crazy activity, and assuming the altimeter API updates at roughly 1 Hz, maybe a reasonable bound on how fast the user can change elevation is, say, 2 m/s? So in guard 2, compare your elevation change over elapsed time against that velocity threshold. Furthermore, you'd need to understand and handle the cases where your lastData is actually the bad sample, or where enough time has passed that you just want to give up and start accumulating +/- elevation again.
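Here's a rough sketch of those guards in Swift (the function name and the 2 m/s threshold are illustrative, not from any Apple API):

import CoreMotion

// Track the last presumed-good sample; the first sample seeds it.
var lastGoodData: CMAltitudeData?

func handleAltitudeUpdate(_ newData: CMAltitudeData) {
    guard let lastData = lastGoodData else {
        lastGoodData = newData
        return
    }
    // Guard 1: reject stale or out-of-order samples.
    guard newData.timestamp > lastData.timestamp else { return }

    // Guard 2: reject implausible vertical velocity (delta over elapsed time).
    let elapsed = newData.timestamp - lastData.timestamp
    let delta = abs(newData.relativeAltitude.doubleValue - lastData.relativeAltitude.doubleValue)
    guard delta / elapsed < 2.0 else { return } // 2 m/s; tune for your activity

    lastGoodData = newData
    // ...accumulate +/- elevation here...
}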
I applied similar filtering (plus more) to my CLLocation data before adding to my workout route.
Questions: By chance, do you have any CLLocation data from the 'paused' portion of your workout to correlate against?
Could you share an example print out of your [CMAltitudeData] as well as markers where you paused your workout?
Roughly how many seconds after you paused the workout session did you see the elevation return to 'normal' values?
Any difference when you are doing a running workout with your watch set to auto-pause, vs you programmatically calling session.pause()?
As I mentioned, I have a Series 3, and a new Series 6 on the way. I will do some testing to see if I can reproduce what you're seeing and reply with my results.
Think about Apple's new hand-washing detection: a developer probably couldn't achieve this with the public API set since, as you pointed out, we don't get 'always-on' motion updates. To speculate, their ML model probably uses acceleration, hand position, sound, and other factors as inputs to the classifier.
Is your application health or fitness related? If so, given your legitimate use case for accessing the user's Health and Fitness data, you could perform a long-running query with background updates for something like step count or distanceWalkingRunning.
https://developer.apple.com/documentation/healthkit/hkhealthstore/1614175-enablebackgrounddelivery
Do note, as the documentation states, some of the data is 'throttled' to so many updates over time. Hope this helps.
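A rough sketch of what that could look like (step count, the hourly frequency, and the function name are illustrative):

import HealthKit

let healthStore = HKHealthStore()
let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount)!

func startBackgroundStepUpdates() {
    // The observer query fires whenever matching samples change.
    let query = HKObserverQuery(sampleType: stepType, predicate: nil) { _, completionHandler, error in
        guard error == nil else { completionHandler(); return }
        // Fetch the new samples here, then tell HealthKit you're done.
        completionHandler()
    }
    healthStore.execute(query)

    // Ask for background delivery; step count is one of the throttled types.
    healthStore.enableBackgroundDelivery(for: stepType, frequency: .hourly) { success, error in
        print("Background delivery enabled: \(success), error: \(String(describing: error))")
    }
}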
Fitness and navigation apps commonly use what is called a "Kalman filter" for positional updates over time.
In summary, you can use GPS to determine your position in 2D or 3D coordinate space. For the x/y position use case, you then use velocity and acceleration to account for error in your sampling. CLLocation has a speed value, but it's meant to be informational only.
So the idea here is to combine CoreLocation + CMDeviceMotion data to get a better location fix. Will it get the accuracy you're looking for? No promises there.
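For a feel of the technique, here's a minimal one-dimensional Kalman filter sketch (the noise values are illustrative and would need tuning):

// Scalar Kalman filter: blends a noisy measurement stream into a smoothed estimate.
struct ScalarKalman {
    var estimate: Double
    var errorCovariance: Double
    let processNoise: Double      // how much we expect the true value to wander
    let measurementNoise: Double  // how noisy each sample is

    mutating func update(measurement: Double) -> Double {
        errorCovariance += processNoise                                 // predict
        let gain = errorCovariance / (errorCovariance + measurementNoise)
        estimate += gain * (measurement - estimate)                     // correct
        errorCovariance *= (1 - gain)
        return estimate
    }
}

// Usage: feed raw GPS-derived values in, read smoothed values out.
var filter = ScalarKalman(estimate: 0, errorCovariance: 1, processNoise: 0.01, measurementNoise: 4)
let smoothed = filter.update(measurement: 12.3)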
For discussion purposes, is your iOS device free-body or fixed-body? I.e., in someone's hand, or on a robot where the axes won't change? If the latter, where the orientation is always the same relative to the x/y plane, you could look into accumulating linear acceleration data.
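A rough sketch of that accumulation, under the fixed-body assumption (names are hypothetical; drift grows fast without filtering, so treat it as illustration only):

import CoreMotion

final class LinearAccumulator {
    private let motionManager = CMMotionManager()
    private var velocity = 0.0  // m/s along x
    private var position = 0.0  // m along x

    func start() {
        motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            let dt = self.motionManager.deviceMotionUpdateInterval
            // userAcceleration has gravity removed and is reported in g's.
            let ax = motion.userAcceleration.x * 9.81
            self.velocity += ax * dt            // integrate once for velocity
            self.position += self.velocity * dt // integrate twice for position
        }
    }
}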
I haven't gotten this far in my prototyping yet, but I was going to take the approach of using an app group between my main app target and the widget extension. I believe this is the correct way to share file space / a sandbox container between extensions and the main app target.
If this assumption is correct and works for you, then do the following:
Put your heavy CoreData stuff in your main app--it doesn't belong in your widget.
Determine what you want your widget to display.
Serialize it to simple files (use the Codable protocol, or images!).
Use the WidgetCenter API to request that your timeline provider get an update.
In your timeline provider, decode those files and create your timeline entries.
Let the widget configuration do the rest.
Delete the file from the timeline provider only if it can independently determine that the data is no longer relevant. I'd defer anything heavyweight to your main app so your timeline provider can return its timeline entries as fast as possible--you never know when the user may interact with your widget and your timeline gets reloaded, so keep the file around until it's no longer relevant.
The getSnapshot and getTimeline methods give you a closure so you can do things async--but you should still return fast, as there are probably underlying timeouts / quality-of-service expectations in the OS around providing those entries quickly.
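Here's a minimal sketch of that handoff (the group identifier, file name, and kind string are all hypothetical):

import WidgetKit
import Foundation

struct WidgetSnapshot: Codable {
    let title: String
    let updated: Date
}

// In the main app: write the serialized snapshot into the shared App Group
// container, then ask WidgetKit to reload the widget's timeline.
func publishSnapshot(_ snapshot: WidgetSnapshot) throws {
    let container = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.myapp")!
    let url = container.appendingPathComponent("widget-snapshot.json")
    try JSONEncoder().encode(snapshot).write(to: url, options: .atomic)
    WidgetCenter.shared.reloadTimelines(ofKind: "MyWidget")
}

The timeline provider then just decodes widget-snapshot.json and builds its entries--no CoreData stack required in the extension.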
I recreated the iOS 14 Weather widget in all size families and played around with the widgetFamily environment value, but that only helps if you want to reuse views across a single widget type. Notes and Reminders both ship two different widgets, and without official support for an environment value out of the box, I'd do something like this:
In your Widget definition, pass something into the parent-most view of the hierarchy, then pass it around in properties, or even a custom environment value/object!
struct WeatherWidget: Widget {
    let kind: String = "WeatherWidget"

    var body: some WidgetConfiguration {
        IntentConfiguration(kind: kind, intent: ConfigurationIntent.self, provider: WeatherTimelineProvider(), content: { entry in
            WeatherWidgetEntryView(entry: entry, kind: kind)
        })
        .configurationDisplayName("Forecast")
        .description("See the current weather conditions and forecast for a location.")
        .supportedFamilies([.systemSmall, .systemMedium, .systemLarge])
    }
}
BS_Developer,
Check out "Custom Apps" and see if that fits your white label need for businesses.
https://developer.apple.com/business/custom-apps/
https://developer.apple.com/wwdc20/10667
Custom Apps might be a preferred way to get the app to your business clients' employees. From the reading I did, it seems far easier than an enterprise store.
This was something Apple announced at WWDC and might fit your needs better than having multiple versions of the same app published to the store. However, like all new things, it isn't available until sometime this fall--so it might not fit the urgency for your client(s).
Hey @liazkam,
Edit: An Apple Engineer replied to another thread here stating that the Smart Banner won't be live until iOS 14 is released: https://developer.apple.com/forums/thread/657921
I got my App Clip submission working, I think, the day after beta 6 dropped (compiled/uploaded via Xcode 12 beta 6). So far, I've managed to get the Advanced App Clip submitted with one of my domains for demo purposes after much trial and error with the associated domains. From what I can tell, everything is set up properly on my end for the banner / App Clip card to show when visiting my domain, but I don't see either showing up. Additionally, when I send my domain URL over iMessage to an iOS 14 device, the experience isn't showing up there either. It's hard to tell what is live and what is still pending developer release on Apple's side.
My "Experience URL Status" under the Advanced App Clip Experiences is set to "Received". I couldn't find documentation on the different status values. What does yours show? I'm wondering if there is a status that represents something to the effect of 'live in production'.
With that having been said, I can invoke the App Clip itself via test flight, and then it shows up in Settings > App Clips and can be removed from there.
NOTE: If anyone has issues submitting their binary and having it show up with the correct Associated Domains capability: your Associated Domains app clip entry should NOT include https://, but when you set up your Advanced App Clip experience, you DO need to specify the https protocol.
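To illustrate with a placeholder domain (example.com stands in for yours):
Associated Domains entitlement entry: appclips:example.com
Advanced App Clip Experience URL in App Store Connect: https://example.com/clip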
I created a suggestion feedback FB8581967 requesting more documentation details around the associated domain app clip tag requirements.
The App Delegate still exists; you just need to use the new property wrapper to wire up your subclass.
https://developer.apple.com/documentation/swiftui/uiapplicationdelegateadaptor
I couldn't find much info on how to implement it myself, and haven't come across the magic video yet, but Paul Hudson had a good writeup last week titled "What is the @UIApplicationDelegateAdaptor property wrapper?"
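For reference, here's a minimal sketch of the wiring (MyApp/MyAppDelegate are placeholder names):

import SwiftUI
import UIKit

class MyAppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Launch-time work that still belongs in a delegate goes here.
        return true
    }
}

@main
struct MyApp: App {
    // Tells SwiftUI to instantiate and manage this delegate class.
    @UIApplicationDelegateAdaptor(MyAppDelegate.self) var appDelegate

    var body: some Scene {
        WindowGroup { Text("Hello") }
    }
}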
The new Multiplatform template just doesn't have the checkbox; I'd love an "add CloudKit" option in addition to the CoreData one. Workaround: create a regular iOS app using the old template, check the box, then copy/paste the result into your new project.
I'm interested in this too. I assume you have upgraded an existing app from watchOS 6 and have an extension delegate class around somewhere--that is, you're not using the new SwiftUI App struct:
@main
struct FrutaApp: App {
Would making your extended runtime session delegate object static, or a property of your extension delegate, be an acceptable solution for you? That object's lifecycle should be 100% isolated from any SwiftUI view.
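Something like this sketch (the controller name is hypothetical), where the session lives in a shared object rather than any view:

import WatchKit

final class RuntimeSessionController: NSObject, WKExtendedRuntimeSessionDelegate {
    static let shared = RuntimeSessionController()
    private var session: WKExtendedRuntimeSession?

    func start() {
        let session = WKExtendedRuntimeSession()
        session.delegate = self
        session.start()
        self.session = session
    }

    func extendedRuntimeSessionDidStart(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
    func extendedRuntimeSessionWillExpire(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
    func extendedRuntimeSession(_ extendedRuntimeSession: WKExtendedRuntimeSession,
                                didInvalidateWith reason: WKExtendedRuntimeSessionInvalidationReason,
                                error: Error?) {}
}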
Effectively, one way or another you'd be navigated to the website registered with your NFC chip, as described above. If your site uses the User-Agent header to do redirects, that's up to you. For example, you could check whether the user agent says it's Android and redirect to a page telling them to get an iPhone if they want to use your App Clip :)
So it is ultimately up to you what they see. It is your site, and their web browser.
I tried Core NFC with some of the tags I bought when I got my Xs, which supports background tag reading. iOS beautifully shows a notification at the top of the screen, and tapping it opens Safari directly to my page--or presumably follows the universal link into my app, although I didn't try that. For my Android friends, the experience wasn't the same and varied for each of them. Some OEMs went straight to the browser, while others prompted the user to choose what they wanted to do (Chrome or x, y, z other browsers).
How often do you advertise?
What is the Connection Interval of your BLE device?
What is the Supervisory Timeout of your BLE device?
Do you use Indications or Notifications?
Did you enable the background mode for connecting to BLE accessories?
The above questions drive at discoverability and connections staying alive in the background.
If you require your device to STAY connected, good luck. What if the user sets down their phone and walks away?
If you are okay with a connection loss in the background followed by a reconnection, then you're in business. I don't see why you'd not be able to do this with the API.
Note: If your app uses the Data Protection capability (file system locked/encrypted when the phone is locked), you may need to add an ignore flag on the files you'd presumably be interacting with during a background BLE connection.
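For example, a sketch using the no-protection write option (the file name is hypothetical):

import Foundation

// Write the cache without Data Protection so it stays readable during a
// background BLE event while the device is locked.
let cacheURL = FileManager.default.temporaryDirectory.appendingPathComponent("ble-cache.json")
let payload = Data("cached sensor state".utf8)
try? payload.write(to: cacheURL, options: .noFileProtection)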
Implementation advice: when the device disconnects with an error, issue a reconnect.
Also opt in to State Restoration (init your central manager with the restore identifier options key).
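A minimal sketch of both points (the restore identifier is a placeholder):

import CoreBluetooth

final class BLEManager: NSObject, CBCentralManagerDelegate {
    // Opt in to state restoration by passing the restore identifier at init.
    private(set) lazy var central = CBCentralManager(
        delegate: self,
        queue: nil,
        options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.ble-central"]
    )

    func centralManagerDidUpdateState(_ central: CBCentralManager) {}

    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // Reclaim peripherals the system hands back after relaunching your app.
    }

    func centralManager(_ central: CBCentralManager,
                        didDisconnectPeripheral peripheral: CBPeripheral,
                        error: Error?) {
        guard error != nil else { return } // intentional disconnects need no action
        central.connect(peripheral, options: nil) // pending connects survive backgrounding
    }
}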
Some helpful links.
State Restore Key:
https://developer.apple.com/documentation/corebluetooth/cbcentralmanageroptionrestoreidentifierkey
Info on restore conditions:
https://developer.apple.com/library/archive/qa/qa1962/_index.html
General Bluetooth background guide:
https://developer.apple.com/library/archive/documentation/NetworkingInternetWeb/Conceptual/CoreBluetooth_concepts/CoreBluetoothBackgroundProcessingForIOSApps/PerformingTasksWhileYourAppIsInTheBackground.html
iOS 13.0 -- no.
iOS 13.1 and newer -- yes. They added a way to do this.
The property was moved from an instance property to a class property. You're seeing the prompt because you initialize a CBManager subclass, which is needed to access the authorization instance property. Refer to this API change: https://developer.apple.com/documentation/corebluetooth/cbmanager/3377595-authorization?changes=latest_major
Before you construct your CBManager subclass, check CBManager.authorization. Then handle the case where it's not authorized and present your user with whatever onboarding experience you choose.
In my workout app, I connect to heart rate monitors (and some cycling sensors too, in a future update!). For my user experience, I live with it on iOS 13.0. On iOS 13.1 and newer, I only set up my scanner in my AppDelegate if the user has already faced the prompt (i.e., auth status != .notDetermined). Since watchOS 6 wasn't released when iOS 13.0 was, they managed to get the better API in for watchOS 6.0! Yay!
func application(_ application: UIApplication,
                 didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    ...
    if #available(iOS 13.1, *), CBManager.authorization != .notDetermined {
        let sensorManager = BluetoothSensorManager.shared
        sensorManager.configure(healthStore: .shared)
        sensorManager.loadSavedSensors()
    }
    ...
}
So for iOS 13.0: if you can manage to lazily instantiate your CBManager subclass, deferring it until the exact moment you need it--do so. That is more or less the only workaround you have for iOS 13.0.
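Something like this sketch (the class name is hypothetical)--the manager, and therefore the prompt, doesn't exist until scanning is requested:

import CoreBluetooth

final class LazyScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager?

    func startScanning() {
        if central == nil {
            central = CBCentralManager(delegate: self, queue: nil) // prompt appears here on iOS 13.0
        }
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil)
    }
}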
For best practices, many WWDC videos that cover permission-based APIs say the same thing: defer, defer, defer. Don't ask for permission until you need it. Check out the new precise location API in iOS 14, or the WWDC 2019 Bluetooth video.
Regards
Too bad you're not working with the CoreBluetooth API itself. I'm not familiar with Flutter's wrapper around CoreBluetooth, but there is a behavioral change in iOS 13: a permission/authorization change requiring users to grant permission for BLE usage.
When you install from the App Store, do you see the prompt asking for permission to use BLE? If the Flutter plugin does not handle the unauthorized scenarios introduced in iOS 13 (and then revised in the iOS 13.1 API), that might explain why you wouldn't be seeing discoveries. It wouldn't, however, explain why you can't see the device when deploying with Xcode.