So for the SensorKit framework, first, as I understand it, you must be approved as a research study to obtain the entitlements. Then, the way SensorKit works, once the user provides authorization, is that you start a recorder, and this allows the device to store the events. But you only have access to the events after 24 hours have passed. Given these constraints, I don't think it is possible to detect and react in real time to an SRWristDetection event. We have also wished we could use SensorKit data in real time to perform JITAIs (Just-In-Time Adaptive Interventions) - maybe someday :)
I doubt Apple would provide SensorKit entitlements outside of a research study. And one thing that is not very apparent is that the data provided via SensorKit is only available at least 24 hours after it was recorded. You may be able to gather some real time data such as raw accelerometer/gyroscope data and semi-processed data about likely activity (driving, cycling, walking) via the CoreMotion framework.
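To illustrate the CoreMotion alternative mentioned above, here is a minimal sketch (class and method names are my own, not from the original post) that reads raw accelerometer data via CMMotionManager and semi-processed activity classification via CMMotionActivityManager while the app is running:

```swift
import CoreMotion

// Hypothetical helper showing the two CoreMotion sources described above.
final class MotionSampler {
    private let motionManager = CMMotionManager()
    private let activityManager = CMMotionActivityManager()

    func start() {
        // Raw accelerometer at ~50 Hz (only while the app is allowed to run).
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 1.0 / 50.0
        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            guard let data = data else { return }
            print("x: \(data.acceleration.x) y: \(data.acceleration.y) z: \(data.acceleration.z)")
        }

        // Semi-processed activity classification (driving, cycling, walking, ...).
        guard CMMotionActivityManager.isActivityAvailable() else { return }
        activityManager.startActivityUpdates(to: .main) { activity in
            guard let activity = activity else { return }
            if activity.automotive { print("likely driving") }
            if activity.cycling { print("likely cycling") }
            if activity.walking { print("likely walking") }
        }
    }

    func stop() {
        motionManager.stopAccelerometerUpdates()
        activityManager.stopActivityUpdates()
    }
}
```

Both sources require the appropriate usage-description keys in Info.plist, and motion-activity data additionally requires the user's permission.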
SensorKit does not provide real-time data. It all has a 24-hour "embargo"; I suspect this is to give the research participant the chance to revoke data sharing for 24 hours after any event. If you are looking for real-time heart rates, your best bet is HealthKit and HKAnchoredObjectQuery, which can wake your app in the background as new data is written to HealthKit. There are, of course, some challenges/caveats - you must request, and the user must grant, read permission for your app, and the phone must be unlocked for background updates to be readable.
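The HKAnchoredObjectQuery approach above can be sketched roughly like this (class and method names are illustrative, and this assumes heart-rate read authorization has already been granted):

```swift
import HealthKit

// Sketch: observe heart-rate samples as they are written to HealthKit.
final class HeartRateObserver {
    private let healthStore = HKHealthStore()
    private var anchor: HKQueryAnchor?

    func start() {
        let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!

        let query = HKAnchoredObjectQuery(type: heartRateType,
                                          predicate: nil,
                                          anchor: anchor,
                                          limit: HKObjectQueryNoLimit) { [weak self] _, samples, _, newAnchor, _ in
            self?.anchor = newAnchor
            self?.handle(samples)
        }
        // The update handler re-fires as new samples arrive while the query runs.
        query.updateHandler = { [weak self] _, samples, _, newAnchor, _ in
            self?.anchor = newAnchor
            self?.handle(samples)
        }
        healthStore.execute(query)

        // Ask HealthKit to deliver updates in the background.
        healthStore.enableBackgroundDelivery(for: heartRateType, frequency: .immediate) { success, error in
            if !success { print("Background delivery not enabled: \(String(describing: error))") }
        }
    }

    private func handle(_ samples: [HKSample]?) {
        let bpm = HKUnit.count().unitDivided(by: .minute())
        for sample in samples as? [HKQuantitySample] ?? [] {
            print("Heart rate: \(sample.quantity.doubleValue(for: bpm)) bpm at \(sample.startDate)")
        }
    }
}
```

As noted above, background delivery is subject to the device being unlocked, and real devices may batch updates rather than delivering each sample instantly.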
From my experience with sensor data, you can really only get 24/7 sensor coverage from SensorKit, which requires a special entitlement and must be for medical research purposes. Even so, it strongly considers privacy in its output, so some sensor data is generalized or anonymized to some degree. I think you may be able to fire up the sensors in the background, but we haven't attempted to do so due to the multitude of limitations on what an app can do while backgrounded. So for our use case, we fire up sensors for active measurement of specific scenarios, and for other research use cases, we rely on SensorKit to provide a 24/7 picture of what is happening around the user.
I'm sorry @tongxingx, but I don't think you are able to get real-time data for ambient light, at least from SensorKit. I believe Apple purposefully makes the data available only after 24 hours for privacy.
I couldn't easily find a client that could decode lz4, so I wrote some code to do it using Apple's sample code for Compression and Decompression on this page.
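A minimal buffer-based decode along the lines of Apple's Compression sample code might look like this. Note the caveats: COMPRESSION_LZ4 expects Apple's framed LZ4 variant (not raw LZ4 blocks), and the output-capacity estimate here is a placeholder you'd size for your own data:

```swift
import Compression
import Foundation

// Sketch: decode LZ4-compressed data with Apple's Compression framework.
// The capacity estimate is an assumption; pick one that fits your payloads.
func decompressLZ4(_ compressed: Data, estimatedDecodedCapacity: Int = 8_000_000) -> Data? {
    let destination = UnsafeMutablePointer<UInt8>.allocate(capacity: estimatedDecodedCapacity)
    defer { destination.deallocate() }

    let decodedCount = compressed.withUnsafeBytes { (source: UnsafeRawBufferPointer) -> Int in
        guard let sourcePtr = source.bindMemory(to: UInt8.self).baseAddress else { return 0 }
        // Returns the number of bytes written, or 0 on failure.
        return compression_decode_buffer(destination, estimatedDecodedCapacity,
                                         sourcePtr, compressed.count,
                                         nil, COMPRESSION_LZ4)
    }
    guard decodedCount > 0 else { return nil }
    return Data(bytes: destination, count: decodedCount)
}
```

For payloads whose decoded size is unknown or large, the streaming `compression_stream` API from the same sample code is the safer choice.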
A lot of this is possible with publicly available APIs, but each element may have its own unique limitations. For example:
ECG - currently, this is only recorded on a recent (Series 4 or higher) Apple Watch via the ECG app and is then published to HealthKit, from which you can extract it with permission.
SpO2 - similar to ECG, but also randomly sampled when the user appears to be resting; also extracted through HealthKit.
accelerometer - you have a few options: if you want to fire up the sensors for a short period of time, you can use CoreMotion in a Watch app, which requires some user intervention. If you qualify for a research study, you may be able to use SensorKit, which can extract 24/7 accelerometer data from the watch and/or phone, but you only have access to the data after 24 hours.
HRV - heart rate variability is another intriguing measurement that may relate to sleep/stress. It works a lot like SpO2: you can force a measurement (and a heartbeat series) via a Mindfulness session in the watch app, and the OS will occasionally passively measure it throughout the day when the user is at rest.
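As an example of the HealthKit extraction route mentioned for several of these, here is a sketch (function name is my own) that reads recent passively recorded HRV (SDNN) samples, assuming read authorization was already granted:

```swift
import HealthKit

// Sketch: fetch the most recent HRV (SDNN) samples from HealthKit.
func fetchRecentHRV(healthStore: HKHealthStore, completion: @escaping ([HKQuantitySample]) -> Void) {
    let hrvType = HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN)!
    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)

    let query = HKSampleQuery(sampleType: hrvType,
                              predicate: nil,
                              limit: 20,
                              sortDescriptors: [newestFirst]) { _, samples, error in
        if let error = error { print("HRV query failed: \(error)") }
        completion(samples as? [HKQuantitySample] ?? [])
    }
    healthStore.execute(query)
}

// Usage: print SDNN values in milliseconds.
// fetchRecentHRV(healthStore: HKHealthStore()) { samples in
//     for s in samples {
//         print("SDNN: \(s.quantity.doubleValue(for: .secondUnit(with: .milli))) ms")
//     }
// }
```

The same pattern works for the ECG and SpO2 types, swapping in the appropriate `HKQuantityTypeIdentifier` (or `HKElectrocardiogramQuery` for ECG voltage data).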
For the ambient light in this example, the entitlement looks like this:
...
<key>com.apple.developer.sensorkit.reader.allow</key>
<array>
<string>ambient-light-sensor</string>
</array>
...
Not sure this will work exactly, but this is the basic gist after gutting our model to an MVP. It assumes you've already requested permissions. You create an SRSensorReader with the sensor you want and set a delegate for the reader's callbacks, then request the devices for the sensor. In the callback with the devices, you can request samples by creating an SRFetchRequest. When the data starts coming back, you cast each sample to the type you're looking for and extract the data from it. FWIW, the readers do not appear to be thread-safe - i.e., you can only work with one reader at a time; don't attempt to read in parallel from different sensors.
import SensorKit

class SensorKitDataExtractor: NSObject {
    let reader: SRSensorReader
    let fetchRequest = SRFetchRequest()
    // Fetch a window that ends more than 24 hours ago (SensorKit's holding period).
    let fromDate = Date().addingTimeInterval(-48 * 60 * 60)
    let toDate = Date().addingTimeInterval(-24 * 60 * 60)

    override init() {
        reader = SRSensorReader(sensor: .ambientLightSensor)
        super.init()
        reader.delegate = self
        reader.fetchDevices()
    }
}

extension SensorKitDataExtractor: SRSensorReaderDelegate {
    func sensorReader(_ reader: SRSensorReader, didFetch devices: [SRDevice]) {
        // Pick the device you want to read from; here we just take the first one.
        fetchSamples(device: devices.first)
    }

    func sensorReader(_ reader: SRSensorReader, fetchDevicesDidFailWithError error: Error) {
        print("Error fetching devices: \(error)")
    }

    func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        switch result.sample {
        case let lightSample as SRAmbientLightSample:
            // do something with the data
            print("Ambient light: \(lightSample.lux)")
        default:
            print("Unhandled sample type: \(result.sample)")
            return false
        }
        return true
    }

    func sensorReader(_ reader: SRSensorReader, didCompleteFetch fetchRequest: SRFetchRequest) {
        print("Reader did complete fetch")
    }

    func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, failedWithError error: Error) {
        print("Reader fetch failed: \(error)")
    }

    private func fetchSamples(device: SRDevice?) {
        guard let device = device else {
            print("No device found for this sensor")
            return
        }
        fetchRequest.device = device
        fetchRequest.from = SRAbsoluteTime.fromCFAbsoluteTime(_cf: fromDate.timeIntervalSinceReferenceDate)
        fetchRequest.to = SRAbsoluteTime.fromCFAbsoluteTime(_cf: toDate.timeIntervalSinceReferenceDate)
        reader.fetch(fetchRequest)
    }
}
I think this is because the NSManagedObject's hash value doesn't change when a property changes, and the hash changing is what forces the UITableViewDiffableDataSource to draw a change. Unfortunately, it appears you cannot override the hash function for NSManagedObject. My workaround was to make a wrapper that includes the values I want to trigger an update, so the hash changes when any of these values change. It works well for me, but it would be nicer if the hash value could change when a property changes, or if the diffable data source had another mechanism to work more closely with NSFetchedResultsController.
struct WrappedItem: Hashable { /* wrapper to trigger model change updates */
    let item: Item    /* this is an NSManagedObject */
    let name: String
}

/* this is the declaration for the snapshot */
var diffableDataSourceSnapshot = NSDiffableDataSourceSnapshot<Int, WrappedItem>()

func updateSnapshot() {
    diffableDataSourceSnapshot = NSDiffableDataSourceSnapshot<Int, WrappedItem>()
    diffableDataSourceSnapshot.appendSections([0])
    let wrappedItems = (fetchedResultsController.fetchedObjects ?? []).map { item in
        WrappedItem(item: item, name: item.name)
    }
    diffableDataSourceSnapshot.appendItems(wrappedItems)
    diffableDataSource?.apply(diffableDataSourceSnapshot)
}