Hey
We are testing a project on Xcode 14 beta 5 and we have an issue with a model that is simply Apple's Vision Feature Print (embeddings). The model takes a 299x299 image as input, passes it through a visionFeaturePrint layer, and outputs a float64[2048] embedding.
The model is a Core ML Package v3 and was created with Core ML Tools by cutting off the classification layer that Create ML adds to a classification model.
On the simulator (Apple M1 chip), the result depends solely on the iteration that invokes the prediction, regardless of the input image. On the device it works as expected.
let config = MLModelConfiguration()
#if targetEnvironment(simulator)
config.computeUnits = .cpuOnly
#else
config.computeUnits = .all
#endif
model = try! ImageSemanticInfo_iOS(configuration: config)

let buffer = thumb!.toCVPixelBuffer()!
for _ in 0..<3 {
    let results = try! model!.prediction(image: buffer).sceneprint
}
For example, if we take just the first entry of the embedding, we always get the following values, regardless of the input image used:
0.474750816822052 - First call
0.3231460750102997 - Second call
0.37376347184181213 - Third call
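A quick way to demonstrate the issue is to run the same prediction twice on the same pixel buffer and compare the leading embedding components. This is a hedged sketch reusing the names from the snippet above (ImageSemanticInfo_iOS, the sceneprint output, and a prepared CVPixelBuffer are assumptions from this post, not a general API); on device the two calls should match, while on the simulator with this bug they differ on every call:

```swift
import CoreML

// Sketch: predict twice on the *same* buffer and compare the first few
// components of the returned embedding (an MLMultiArray here).
func embeddingIsStable(model: ImageSemanticInfo_iOS,
                       buffer: CVPixelBuffer) throws -> Bool {
    let first = try model.prediction(image: buffer).sceneprint
    let second = try model.prediction(image: buffer).sceneprint
    for i in 0..<5 {
        // Identical input should yield identical output components.
        if abs(first[i].doubleValue - second[i].doubleValue) > 1e-9 {
            return false
        }
    }
    return true
}
```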
Multi-selection of photos is usually handled either by keeping the order in which the user selected them or by following the chronological order of the photos.
PHPicker - I have not yet found an enum or similar to define the order of photos while iterating through the results;
PhotoKit picker - the nature of this picker does not seem to allow it. We have to assume a "random" order in photoLibraryDidChange and always add a second selection step to define the order.
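For the PHPicker case, it may be worth checking the results array itself: as far as I can tell it is delivered in the order the user tapped the photos, and on iOS 15+ PHPickerConfiguration gained a selection property that can be set to .ordered to show numbered badges. A hedged sketch (the presenting view controller and delegate conformance are assumed):

```swift
import PhotosUI

// Sketch, assuming iOS 15+ and a UIViewController conforming to
// PHPickerViewControllerDelegate.
func presentOrderedPicker(from viewController: UIViewController & PHPickerViewControllerDelegate) {
    // Using the shared photo library so results carry asset identifiers.
    var config = PHPickerConfiguration(photoLibrary: .shared())
    config.selectionLimit = 0        // 0 = no limit
    config.selection = .ordered      // iOS 15+: numbered, ordered selection
    config.filter = .images

    let picker = PHPickerViewController(configuration: config)
    picker.delegate = viewController
    viewController.present(picker, animated: true)
}

// In the delegate, iterating `results` preserves the selection order.
// Note: PHAsset.fetchAssets(withLocalIdentifiers:options:) does not
// guarantee the requested order, so re-sort the fetch result against
// the identifier array if you need the user's order back as PHAssets.
```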
Let's take the example of an app for which photo organization is an important component of its functionality. Let's also imagine that photo meta-information is crucial for the app. In this situation it makes sense to request permission to access the camera roll.
PHPicker & the PhotoKit picker (if that is the correct name) have quite appealing filters that can be very useful for the user. If the user grants full access to the camera roll, life is beautiful and we can use PHPicker + PhotoKit to obtain PHAssets.
If the user opts for limited access mode, we can end up with convoluted flows. In this situation we have to call the PhotoKit picker to request access to the photos (and their metadata) the user wants to select.
Imagine now that in one session the user wants to do something around people. We can call presentLimitedLibraryPicker in a place that makes sense for the user.
In a second session the user wants to do something with photos of a specific trip. We call presentLimitedLibraryPicker again to request access to the desired photos.
In fact, some of those photos may already be selected: 2 of the 30 previously selected photos belong to that trip.
To know the new selection, the one that is currently in the user's mind, we have to handle photoLibraryDidChange. The problem is that those 2 photos did not change, so no change is reported for them.
This means we'll have to build a second custom "picker" for the "true" selection. For the user this will be a selection of a selection...
Are we missing something, or will we be forced to do something like this?
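One thing that may help here: iOS 15 added a completion-handler variant of presentLimitedLibraryPicker that passes back asset local identifiers when the picker is dismissed. My understanding (worth verifying against the docs) is that it reports the assets the user just added to the limited selection, which avoids diffing photoLibraryDidChange for that session; a full fetch afterwards returns everything currently accessible. A hedged sketch:

```swift
import Photos
import UIKit

// Sketch, assuming iOS 15+. The handler receives local identifiers for the
// assets the user added in this picker session (per my reading of the docs);
// a plain fetch afterwards returns the whole limited-library selection.
func pickLimitedSelection(from viewController: UIViewController) {
    PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController) { identifiers in
        // Assets added in this session.
        let added = PHAsset.fetchAssets(withLocalIdentifiers: identifiers,
                                        options: nil)
        print("added this session:", added.count)

        // Everything currently accessible under limited access.
        let all = PHAsset.fetchAssets(with: nil)
        print("total accessible:", all.count)
    }
}
```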
Does it give access to the albums created automatically with People? I'm not able to see it in the simulator. The search area suggests it is available, though.
And what about the Memories available in the Photos app (Months tab)? If the user wants to use a trip to Paris as the entry point for the selection, is this possible?
If you set it, does it mean that the user will see an "old style" alert when a permission request is made?