I have been on the 18.2 beta since it launched and I cannot seem to get in; Image Playground keeps saying I've requested access. Why? Please help.
Explore the power of machine learning and Apple Intelligence within apps. Discuss integrating features, share best practices, and explore the possibilities for your app here.
Apple Intelligence is not working at all. It was working fine under 18.1, but nothing works under 18.2.
It has been hours and I still have no access to Image Playground.
Hello,
I’m attempting to convert a TensorFlow model to Core ML using the coremltools package, but I’m encountering an error during the conversion process. The error traceback points to an issue within the Cast operation in MIL (the Model Intermediate Language) when it tries to perform type inference:
AttributeError: 'float' object has no attribute 'astype'
Here is the relevant part of the error traceback:
File "~/.pyenv/versions/3.10.12/lib/python3.10/site-packages/coremltools/converters/mil/mil/ops/defs/iOS15/elementwise_unary.py", line 896, in get_cast_value
return input_var.val.astype(dtype=type_map[dtype_val])
I’ve tried converting a model from the yamnet-tensorflow2 repository, and this error occurs when CoreML tries to cast a float type during the conversion of certain operations. I’m currently using Python 3.10 and coremltools version 6.0.1, with TensorFlow 2.x.
Has anyone encountered a similar issue or can offer suggestions on how to resolve this?
I’ve also considered that this might be related to mismatches in the model’s data types, but I’m not sure how to proceed.
Platform and package versions:
coremltools 6.1
tensorflow 2.10.0
tensorflow-estimator 2.10.0
tensorflow-hub 0.16.1
tensorflow-io-gcs-filesystem 0.37.1
Python 3.10.12
pip 24.3.1 from ~/.pyenv/versions/3.10.12/lib/python3.10/site-packages/pip (python 3.10)
Darwin MacBook-Pro.local 24.1.0 Darwin Kernel Version 24.1.0: Thu Oct 10 21:02:27 PDT 2024; root:xnu-11215.41.3~2/RELEASE_X86_64 x86_64
Any help or pointers would be greatly appreciated!
I have been attempting to get early access to image creation (iPad, iOS 18.2), and it has taken days; I still don't have it and feel I never will. Someone help!
https://developer.apple.com/machine-learning/models/
Adding the DepthAnythingV2SmallF16.mlpackage to a new project in Xcode 16.1 and invoking the class crashes the app.
Anyone else having the same issue?
I tried Xcode 16.2 beta and it has the same response.
Code
import UIKit
import CoreML

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        do {
            // Use a default model configuration.
            let defaultConfig = MLModelConfiguration()

            // The app crashes on this initializer call.
            let model = try DepthAnythingV2SmallF16(configuration: defaultConfig)
            print("Loaded model: \(model)")
        } catch {
            print("Failed to load model: \(error)")
        }
    }
}
Response
/AppleInternal/Library/BuildRoots/4b66fb3c-7dd0-11ef-b4fb-4a83e32a47e1/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShadersGraph/mpsgraph/MetalPerformanceShadersGraph/Core/Files/MPSGraphExecutable.mm:129: failed assertion 'Error: unhandled platform for MPSGraph serialization'
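Not a confirmed fix, but one way to narrow this down is to load the model with restricted compute units and check whether the GPU/Neural Engine path that MPSGraph serves is what trips the assertion. A minimal sketch, reusing the generated DepthAnythingV2SmallF16 class from the snippet above; whether .cpuOnly actually avoids the crash on a given machine is an assumption to verify:

import CoreML

func loadDepthModel() -> DepthAnythingV2SmallF16? {
    let config = MLModelConfiguration()
    // Restrict execution to the CPU to bypass the GPU/Neural Engine path;
    // try .cpuAndGPU or .all afterwards to see which compute unit triggers
    // the MPSGraph assertion.
    config.computeUnits = .cpuOnly

    do {
        return try DepthAnythingV2SmallF16(configuration: config)
    } catch {
        print("Failed to load model: \(error)")
        return nil
    }
}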
I am working to add Spotlight indexing for my app entities as discussed in WWDC24's video "What's New in App Intents".
That video goes over the IndexedEntity protocol and the integration with Spotlight via CSSearchableItemAttributeSet.
What I'm seeing, though, does not match the video. In the video, the presenter walks through the progressive approach you can take to getting this data into Spotlight, starting with the basics and then expanding to include more support depending on how much the developer wants to do.
What I'm seeing is that if you conform to IndexedEntity, your entities will appear in Spotlight using the name derived from
public var displayRepresentation: DisplayRepresentation
So, that works. Name appears... BUT the next part of the video goes into how to expand your implementation with more metadata for Spotlight via CSSearchableItemAttributeSet. The issue I'm seeing is that once that's implemented, the items disappear from Spotlight, almost like that implementation is overriding the base implementation in a way that no longer functions.
My expectation is that an item with custom attributes would use them in Spotlight as appropriate, not disappear from search, i.e. what's shown in the video should work.
I've got a sample project here:
https://hanchor.s3.amazonaws.com/misc/IndexingTest.zip
To reproduce with the sample:
Build and run. Indexing is set up in the init() method, so it runs automatically.
Go to Spotlight and search for 'Huntersblau', a string included in the content set. At this point you should see a result - good!
Stop the app and go back and uncomment the var attributeSet: CSSearchableItemAttributeSet implementation in IndexingTestApp.swift. This will provide custom attributes to Spotlight.
Repeat steps 1 and 2. You'll see that it no longer appears in the search results; when CSSearchableItemAttributeSet is implemented, the item drops out of Spotlight.
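For reference, here is a minimal sketch of the conformance pattern being described; the entity, its fields, and the query are hypothetical and not taken from the sample project, and the sketch mirrors the setup that triggers the problem rather than offering a fix:

import AppIntents
import CoreSpotlight
import UniformTypeIdentifiers

// Hypothetical entity used only to illustrate the IndexedEntity pattern.
struct CityEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "City")
    static var defaultQuery = CityQuery()

    var id: UUID
    var name: String
    var summary: String

    // The base behavior: Spotlight shows the entity under this name.
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    // The customization step from the session: extra Spotlight metadata.
    // Per the session this should augment the default name-based entry,
    // not make the item vanish from results.
    var attributeSet: CSSearchableItemAttributeSet {
        let attributes = CSSearchableItemAttributeSet(contentType: .content)
        attributes.title = name
        attributes.contentDescription = summary
        return attributes
    }
}

struct CityQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [CityEntity] { [] }
}

// Indexing at launch, as in the session:
// try await CSSearchableIndex.default().indexAppEntities([CityEntity(id: UUID(), name: "Huntersblau", summary: "Test content")])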
Image Playground Error: Cannot find protocol declaration for 'ImageGenerationViewControllerDelegate'
@available(iOS 18.1, macCatalyst 18.1, *)
extension CKImageSelectionManager: ImagePlaygroundViewController.Delegate {

    public func imagePlaygroundViewController(_ imagePlaygroundViewController: ImagePlaygroundViewController, didCreateImageAt imageURL: URL) {
        // Handle the generated image at imageURL.
    }

    func presentImagePlayground() {
        let imagePlaygroundVC = ImagePlaygroundViewController()
        // Set the delegate to self to receive the callback.
        imagePlaygroundVC.delegate = self
        imagePlaygroundVC.isModalInPresentation = true // Prevents dismissal with swipe if needed.
        self.delegate?.presentImageSelectionViewController(imagePlaygroundVC)
    }
}
This generates an error in the Xcode-generated Swift header.
Before I updated to 18.2, I was missing Apple Intelligence features such as ChatGPT integration. Now that I have updated to 18.2, I'm still missing Apple Intelligence and ChatGPT integration altogether, and my Siri has turned into the old Siri. What can I do?
I updated to iOS 18.2 beta 2, and with it I am able to see behind the "early access requested" screen by just swiping down, but when I do, the app closes. Does this mean I'm close to getting access?
After installing the iOS 18.2 public beta, Apple Intelligence has been stuck on downloading, and now I'm back to the old Siri, which kind of sucks.
Any help or tips to get Apple Intelligence back?
Once it was available, I immediately installed the macOS 15.2 beta and configured macOS and Siri to enable Apple Intelligence.
I joined the waiting list, and soon after the downloading process started.
I have since then been stuck in Pending.
I have recently (yesterday) installed 15.2 Beta 2 (Public) - and there is no difference.
I have restarted multiple times; I have left my Mac on, connected to Wi-Fi, overnight several times; and I have changed the language, not only for Siri but also for my Mac (which requires a restart), multiple times.
I am frustrated that I cannot see what is causing the Pending status (pending on what?), and that I cannot simply restart the Apple Intelligence enrollment from scratch; there is no reset button.
Any help and advice would be greatly welcome.
The simulator should support the new Siri UI and Apple Intelligence features; at the very least, they should work if Apple Intelligence is enabled on the Mac itself.
Feedback ID: FB15699827
Ever since I updated to iOS 18.2 beta 1, Apple Intelligence has been downloading and still is now:
“Apple Intelligence is downloading. Some Apple Intelligence and Siri features may be unavailable until the download is complete”
How can I solve this problem?
Hello, I am trying to show an imagePlaygroundSheet, but when I open it in the iPhone 16 Pro and 16 Pro Max iOS 18.2 simulators, I see "Image Playground is not available. Image Playground is not available on this iPhone." The supportsImagePlayground environment value always returns false.
I changed the simulator language and region to US but still see the error. Is this expected?
Screenshot
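For comparison, here is a minimal SwiftUI sketch of the pattern in question; the view and concept string are hypothetical, and it assumes the supportsImagePlayground environment value and the imagePlaygroundSheet(isPresented:concept:onCompletion:) modifier from the iOS 18.1+ SDK. On a simulator that reports no support, the fallback branch below is what you end up showing:

import SwiftUI
import ImagePlayground

struct PlaygroundDemoView: View {
    // Reports whether Image Playground is available on this device/simulator.
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground
    @State private var isPresented = false

    var body: some View {
        Group {
            if supportsImagePlayground {
                Button("Create Image") { isPresented = true }
            } else {
                Text("Image Playground is not available on this device.")
            }
        }
        .imagePlaygroundSheet(isPresented: $isPresented, concept: "A watercolor mountain lake") { url in
            // The generated image is delivered as a file URL.
            print("Generated image at \(url)")
        }
    }
}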
I have been trying image creation for two hours. Genmoji and Playground worked well for about 10 minutes. After that, when I run Playground and start to generate an Animation-style photo, it gets stuck and produces no output. Does anyone know what happened?
Hiya,
I have downloaded the update and requested early access for Playground, yet almost a week in I'm still waiting. Is anyone else having this problem?
I have been waitlisted for Playground for almost two weeks and I still don't have access. Whenever I open the app and tap Done after it tells me I am on the waitlist, it shows me the app and then force-closes. Does anyone know why I haven't gotten access when people who have been on the waitlist for less time than me already have?
I requested access to Playground when iOS 18.2 beta 1 was released, and now on iOS 18.2 beta 2 it is still waiting. How long does this take?
I fell for the "writing power" AI advertising and purchased a MacBook Air, only to learn that the power is limited to Notes and texts. Pages needs writing power yesterday. When will that actual writing app get AI?