Our App Clips have been approved for about a week. However, our App Clips are not loading or previewing in iMessage.
My friend and I both have iPhones running iOS 18, and our App Clip's minimum deployment target is iOS 17. For both of us, the App Clip previews aren't loading in iMessage (see image below). This isn't expected behavior: App Clips generally SHOULD preview in iMessage. For reference, when I send my friend an Instagram post URL, that App Clip DOES load and preview in iMessage, which suggests the issue is specific to 222's App Clips.
In the iMessage screenshot, the top URL is https://rsvp.222.place/?id=0034d4c8-7dcd-44bf-89df-6dc1acd806d2
The bottom URL is https://appclip.apple.com/id?p=place.twotwotwo.twotwotwo-ios.Clip (the default App Clip link).
When I run either of these URLs through App Clip Diagnostics in Settings -> Developer, both are marked as approved and ready to go.
We're adjusting to the new iOS 18 .limited contacts access mode.
In our app, we don't request contacts access right away. We have a search bar where users can search through their own contacts and select one using the ContactAccessButton. If they do select one, they're prompted to "Grant Limited Access", not "Grant Full Access", as the screenshot below shows.
Later on, we want to offer users the ability to sync their entire contact book with our app. This would improve their experience by automatically finding all their friends already on the app, without the manual work of tapping every single contact in the contact access picker.
Is this possible right now?
It doesn't seem like it: when I call CNContactStore().requestAccess(for: .contacts) while in .limited access mode, nothing happens. But I would like to show a prompt that gives the user the ability to grant access to all their contacts to improve their experience.
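One possible stopgap, sketched below under the assumption that requestAccess(for:) will not re-prompt once the user has granted .limited access, is to detect the limited state and route the user to the app's page in Settings, where they can switch to full access themselves. The function name and alert copy here are hypothetical, not part of any official API:

```swift
import Contacts
import UIKit

// Sketch only: if authorization is .limited (new in iOS 18), offer to
// open the app's Settings page, since the system prompt won't reappear.
func promptForFullContactAccess(from presenter: UIViewController) {
    let status = CNContactStore.authorizationStatus(for: .contacts)
    guard status == .limited else { return }

    let alert = UIAlertController(
        title: "Sync Your Contacts",
        message: "To find all your friends automatically, allow full contact access in Settings.",
        preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "Open Settings", style: .default) { _ in
        if let url = URL(string: UIApplication.openSettingsURLString) {
            UIApplication.shared.open(url)
        }
    })
    alert.addAction(UIAlertAction(title: "Not Now", style: .cancel))
    presenter.present(alert, animated: true)
}
```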
In Xcode 16, there's a Canvas section within the Editor menu.
This is Xcode 16.1 beta: the "Canvas" subitem appears in Search but does not appear under the Editor menu.
Previews do not work for me without the Legacy Previews Extension, so I can no longer use Previews at all with Xcode 16.1 beta.
Why was the Canvas subsection removed from the Editor menu?
When using the ContactAccessButton demo project, as well as my own implementation, the whole app freezes after I enter a few characters and search through contacts.
I don't know if this is reproducible for others, since it's probably related to the contacts in my contact book. Typing "Lex" does not freeze the app, but typing "Adam No" freezes it.
I get the following console errors before my app freezes and I have to force quit it:
#ContactsButton Failed to get service proxy: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 3387 named com.apple.ContactsUI.ContactsButtonXPCService" UserInfo={NSDebugDescription=connection to service with pid 3387 named com.apple.ContactsUI.ContactsButtonXPCService}
#ContactsButton Failed to get remote content: nil (I got this one a number of times.)
ContactAccessButton really doesn't seem production ready...
Apple revealed the ContactAccessButton in WWDC24 session 10121: Meet the Contact Access Button.
After watching the video and reading through the documentation as well as the sample code, I can only find a SwiftUI ContactAccessButton.
However, our code base is written largely in UIKit, and our team prefers to do complex list work and customization with UITableView rather than SwiftUI's List. So we would greatly prefer a UIKit ContactAccessButton.
Is there a UIKit equivalent to ContactAccessButton? If there is, where can we find it?
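In the meantime, one possible workaround (a sketch, not an official UIKit API) is to wrap the SwiftUI ContactAccessButton in a UIHostingController and embed it in a UIKit hierarchy. The query string and completion handler shape here are assumptions based on the SwiftUI initializer:

```swift
import SwiftUI
import ContactsUI
import UIKit

// Sketch: host the SwiftUI ContactAccessButton inside a UIKit view
// controller. The query string "Adam" is a placeholder.
final class ContactButtonViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let button = ContactAccessButton(queryString: "Adam") { identifiers in
            // Handle the identifiers of the contacts the user granted.
            print("Granted contacts:", identifiers)
        }
        let hosting = UIHostingController(rootView: button)
        addChild(hosting)
        view.addSubview(hosting.view)
        hosting.view.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            hosting.view.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            hosting.view.centerYAnchor.constraint(equalTo: view.centerYAnchor),
        ])
        hosting.didMove(toParent: self)
    }
}
```

This doesn't give UITableView-level customization, but it lets a largely UIKit code base adopt the button without rewriting the surrounding screens in SwiftUI.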
I have tried cleaning my project and restarting my app.
Unfortunately, the error message is not readable, so I can't get anything useful out of it.
I am using Managed Settings to block all apps during a set period of time using the method below.
override func intervalDidStart(for activity: DeviceActivityName) {
    super.intervalDidStart(for: activity)
    ManagedSettingsStore().shield.applicationCategories = .all()
}
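For context, intervalDidStart fires when a monitored schedule begins. A minimal sketch of scheduling such an interval might look like the following; the activity name "blockingPeriod" and the times are assumptions for illustration, not part of the original post:

```swift
import DeviceActivity

// Hypothetical repeating nightly blocking window (9 PM to 7 AM).
let blockingActivity = DeviceActivityName("blockingPeriod")
let schedule = DeviceActivitySchedule(
    intervalStart: DateComponents(hour: 21, minute: 0),
    intervalEnd: DateComponents(hour: 7, minute: 0),
    repeats: true
)
do {
    // Starts monitoring; intervalDidStart(for:) fires in the extension.
    try DeviceActivityCenter().startMonitoring(blockingActivity, during: schedule)
} catch {
    print("Could not start monitoring: \(error)")
}
```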
When a user opens an app during that time, they should be allowed to press a button on the shield which lets them "continue using that app", just like the default iOS screen time allows for.
However, this doesn't seem possible right now when the delegate below is called for a ShieldAction on an ActivityCategoryToken:
override func handle(action: ShieldAction, for category: ActivityCategoryToken, completionHandler: @escaping (ShieldActionResponse) -> Void) {
    switch action {
    case .primaryButtonPressed:
        completionHandler(.close)
    case .secondaryButtonPressed:
        ManagedSettingsStore().shield.applicationCategories = .all(except: category) // this is not possible right now
        completionHandler(.none)
    @unknown default:
        fatalError()
    }
}
This is because the .all(except:) method takes a set of ApplicationTokens, but the handle delegate method for ActivityCategoryTokens only provides an ActivityCategoryToken.
Is there no way to get around this? It would be very helpful if either
a) the delegate method for ActivityCategoryToken ShieldAction also provided the ApplicationToken that was blocked (this would be preferred), or
b) the .all(except:) method also accepted ActivityCategoryTokens as input.
I am trying to mimic the flash behavior of a disposable camera on iPhone.
With a disposable camera, when the capture button is pressed, the flash instantly and briefly fires once, right as the photo is captured. With the iPhone's camera, however, when the capture button is pressed, the torch turns on for roughly a second, and then the shutter sound plays and the torch flashes on and off as the photo is captured.
When using the straightforward Apple API of setting the AVCapturePhotoSettings.flashMode to on:
var photoSettings = AVCapturePhotoSettings()
photoSettings.flashMode = .on
the system's default flash behavior is applied. Instead, I would like to create my own custom flash behavior, without the torch being turned on for roughly a second before it flashes again, in order to mimic a disposable camera.
My idea was to manually toggle the torch either right before or during the AVCapturePhotoOutput.capturePhoto() process.
private let flashSessionQueue = DispatchQueue(label: "flash session queue")

func toggleTorch(_ on: Bool) {
    // Avoid force-unwrapping: the device may be unavailable or have no torch.
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch could not be used: \(error)")
    }
}
func photoOutput(_ output: AVCapturePhotoOutput, willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
    flashSessionQueue.async {
        self.toggleTorch(true)
    }
    flashSessionQueue.asyncAfter(deadline: .now() + 0.2) {
        self.toggleTorch(false)
    }
}
However, when I do this, the photo is not actually captured; instead I get the error "AVFoundationErrorDomain Code=-11800". The flash does occur (the torch turns on and off), but no photo is captured.
Would it even be possible to create a custom flash by manually toggling the torch before or during a camera capture? I have been researching AVCaptureMultiCamSession and wonder whether that could provide a solution.