The simulator should support the new Siri UI and Apple Intelligence features, or at least they should work if Apple Intelligence is enabled on the Mac itself.
Feedback ID: FB15699827
Explore the power of machine learning and Apple Intelligence within apps. Discuss integrating features, share best practices, and explore the possibilities for your app here.
Since I updated to iOS 18.2 beta 1, Apple Intelligence is still downloading!
“Apple Intelligence is downloading. Some Apple Intelligence and Siri features may be unavailable until the download is complete”
How can I solve this problem?
Hello, I am trying to show an imagePlaygroundSheet, but when I open it in the iPhone 16 Pro and 16 Pro Max iOS 18.2 simulators, I see "Image Playground is not available. Image Playground is not available on this iPhone". The supportsImagePlayground environment value always returns false.
I changed the simulator language and region to US but still see the error. Is this expected?
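For reference, here is a minimal version of what I'm doing (the view and handler are placeholders for my real code):

import SwiftUI
import ImagePlayground

struct PlaygroundButton: View {
    // Always reads false on the iOS 18.2 simulator, even with language/region set to US.
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground
    @State private var showPlayground = false

    var body: some View {
        Button("Create Image") { showPlayground = true }
            .disabled(!supportsImagePlayground)
            .imagePlaygroundSheet(isPresented: $showPlayground) { url in
                // On success this hands back a file URL for the generated image.
                print("Generated image at \(url)")
            }
    }
}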
So, I've declared an AppIntent that indicates my app can "Open files" that conform to UTType.Image.
I've got an @AssistantEntity(schema: .files.file) and an @AssistantIntent(schema: .files.openFile) declared.
So I navigate to the Files app, Quick Look an image, and open Type to Siri.
I tell Siri "open this in " and all it does is act like "open ". No breakpoint is hit in my intent's perform method.
Am I doing something wrong? How can I test these cross-app behaviors?
Are they... not actually possible? Does an "OpenIntent" only work on my app's own URLs and not on file URLs from other apps?
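For context, here is a trimmed-down sketch of the intent I've declared (everything other than the schema is simplified, and the entity declaration is omitted; the @AssistantIntent macro enforces the exact required shape at compile time):

import AppIntents

@AssistantIntent(schema: .files.openFile)
struct OpenFileIntent: OpenIntent {
    // FileEntity is my @AssistantEntity(schema: .files.file) type (declaration omitted).
    var target: FileEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        // This is never reached when invoking Type to Siri from another app's Quick Look.
        return .result()
    }
}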
I have been using image creation for 2 hours. Genmoji and Playground worked well for about 10 minutes.
After that, when I run Playground and start to generate an Animation-style photo, it gets stuck and no result is output. Does anyone know what happened?
I requested access to Playground when iOS 18.2 beta 1 was released, and now on iOS 18.2 beta 2 it is still waiting. How long does this take?
Does your app store/platform support AI apps?
Is there any difference in the submission and review process for AI apps vs. non-AI apps?
Can you share any documentation supporting the above?
I have seen a lot of tutorials on converting torchvision models to Core ML models, but I have not been able to find any tutorials for torchaudio models. Is converting a torchaudio model to a Core ML model even possible? Does anybody have links that show how to do it?
Trying to get on the waitlist for the above, and the computer is saying: “Apple Intelligence is not available when Mac is set to English (Singapore)”.
Yet just a few bullet points below, my Language selection shows “English (United States)”.
That’s the only thing I can see; of course, you guys are the experts.
I would like to be part of this AI experiment/experience.
Thanks for any help you can give to this 35+ year Mac user.
Lee W
I'm trying to disable Writing Tools for a specific TextField using .writingToolsBehavior(.disabled), but when running the app on my iPhone 16 Pro with Apple Intelligence enabled, I can still use Writing Tools on the text box. I also see no difference with .writingToolsBehavior(.limited).
Is there something I'm doing wrong or is this a bug?
Sample code below:
import SwiftUI

struct ContentView: View {
    @State var text = ""

    var body: some View {
        VStack {
            TextField("Enter Text", text: $text)
                .writingToolsBehavior(.disabled)
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
On 25 Oct I updated and requested early access for the Image Playground feature, and it has been stuck on the same status for 10 days. Now the new 18.2 beta 2 is available and I still haven't been given access to a feature that is one version old.
I am working on the updatable neural network classifier example from coremltools.readme.io (https://coremltools.readme.io/docs/updatable-neural-network-classifier-on-mnist-dataset).
I am using the same code, but I get an error saying that coremltools.converters.keras.convert does not exist, which I understand can be a coremltools version issue. Right now I am using coremltools version 6.2, so I converted the model to an mlmodel with coremltools.convert instead, and the conversion succeeded.
But I hit an error in the make_updatable function saying the loss layer input must be a softmax layer output. From the coremltools API reference I found this is because the layer type is softmaxND, whereas it needs to be softmax.
The problem is that when I convert the Keras Sequential model to a Core ML model, the layer names and types change, and the softmax layer becomes softmaxND.
Has anyone faced this issue?
If I execute builder.inspect_layers(last=4), I get this output:
[Id: 32], Name: sequential/dense_1/Softmax (Type: softmaxND)
Updatable: False
Input blobs: ['sequential/dense_1/MatMul']
Output blobs: ['Identity']
[Id: 31], Name: sequential/dense_1/MatMul (Type: batchedMatmul)
Updatable: False
Input blobs: ['sequential/dense/Relu']
Output blobs: ['sequential/dense_1/MatMul']
[Id: 30], Name: sequential/dense/Relu (Type: activation)
Updatable: False
Input blobs: ['sequential/dense/MatMul']
Output blobs: ['sequential/dense/Relu']
In the make_updatable function when I execute
builder.set_categorical_cross_entropy_loss(name='lossLayer', input='Identity')
I get this error
ValueError: Categorical Cross Entropy loss layer input (Identity) must be a softmax layer output.
I recently upgraded my iPad Pro (M4) to iOS 18.1 and the Apple Intelligence waitlist option popped up. I've been trying to tap Join Waitlist, but nothing happens and the Apple Intelligence pop-up menu just disappears each time I tap it. I made sure my region is set to the States.
I’ve had the beta and been signed up since October 23rd, and I'm still sitting on "Early Access Requested". It’s a 16 Pro Max. My son has my old 15 Pro Max; he downloaded the 18.2 beta 4 days ago, requested access, and already has it. So is it a lottery system, or only certain phone models at a given time? Does anyone else have the same issue?
Thanks in advance,
Jeremy
I recently updated my iPhone to iOS 18.2 and the Playground app appeared, but I couldn't open it because access hadn't been released to me yet. Today it was released and I saw the notification, but when I went to look for the app to test it, I couldn't find it anymore. How do I get the Playground app back from Apple Intelligence? How do I download the app again?
I installed the 18.2 beta on the day of release and requested access to Playground two minutes after I got in. It is now a week later and I still do not have access. Am I missing something?
iPhone 16 Pro Max
What are the major differences in the review process for AI-based apps versus normal apps on the App Store?
I've been eagerly awaiting access to the Playground app since its launch day. Is anyone else still stuck on the waitlist?
I requested early access more than 10 days ago. Why do I still not have access? Is there a problem with the system? Why does granting access to Image Playground take so long?
I have an existing photo generation app that I was looking to switch over to this new API to create specific images. How are people finding it to develop with, in terms of ease of implementation?
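For context, the kind of integration I'm considering looks roughly like this, assuming the Image Playground sheet API is the right one to use (the concept text and view are just placeholders):

import SwiftUI
import ImagePlayground

struct GenerateView: View {
    @State private var showSheet = false
    @State private var generatedImageURL: URL?

    var body: some View {
        Button("Generate") { showSheet = true }
            // Concepts steer the generated image toward a specific subject.
            .imagePlaygroundSheet(
                isPresented: $showSheet,
                concepts: [.text("a lighthouse on a rocky coast at sunset")]
            ) { url in
                // The sheet returns a file URL for the generated image.
                generatedImageURL = url
            }
    }
}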