For a few days (or maybe even weeks) now, I cannot use App Store Connect in Chrome anymore. I get to the page where my apps should appear, but none are listed. If I open the same page in Safari, it works. But I dislike Safari, since it asks me for my password every time I visit that site.
I recently saw a post in the Unity forums, from a Unity staff member, saying that Apple requires an Apple Silicon Mac (M1/M2) in order to build apps for the Vision Pro. This confused me, since the simulator works just fine on my Intel Mac. Is there any official statement from Apple on this? It would be odd to have to buy a new Mac just because of this.
On my MacBook Pro 2019 with an Intel processor, I could run apps in the visionOS simulator without any problems while I was on macOS Ventura. But since I upgraded the Mac to Sonoma, the visionOS simulator seems to be broken.
Xcode is stuck at "Loading visionOS 1.0", and the simulator entry under "Devices and Simulators" says "No runtime".
This is independent of which Xcode version I use. I started with Xcode 15 beta 2, but also tried more recent versions.
Could it be that developing on Intel Macs was dropped with macOS Sonoma without any notice? I can see that the Xcode 15.1 specs state you need an Apple Silicon Mac, but the Xcode 15 specs don't. And it worked for me, at least on Ventura. The only change I made since then was upgrading the OS to Sonoma.
Our iOS app relies heavily on the ability to place objects at arbitrary locations, and we would like to know if this is possible on visionOS as well.
It should work like this: the user faces in a certain direction, and we place an object approx. 5 m in front of them. The object is then pinned to this position in mid-air and won't move any more. It should not be anchored to a real-world item like a wall, the floor, or a desk.
Placing the object should even work if the user looks down while placing it; the object should then appear 5 m in front of them once they look up.
On iOS, we implemented this using Unity and AR Foundation. For visionOS, we haven't decided yet whether to go native instead, so if this is only possible using native code, that's also fine. A rough sketch of what we imagine natively follows below.
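For reference, this is roughly how we picture it with ARKit and RealityKit on visionOS (an untested sketch, running inside an ImmersiveSpace; `ObjectPlacer` and its method names are just ours):

```swift
import ARKit
import QuartzCore
import RealityKit

// Sketch: query the headset pose via WorldTrackingProvider and pin an
// entity 5 m ahead of the user. In an immersive space, unanchored
// entities stay fixed relative to the space origin.
@MainActor
final class ObjectPlacer {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()

    func start() async throws {
        try await session.run([worldTracking])
    }

    /// Places `entity` 5 m in front of the device, projected onto the
    /// horizontal plane so it ends up at eye level even if the user is
    /// looking down at the moment of placement.
    func place(_ entity: Entity) {
        guard let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else { return }
        var transform = device.originFromAnchorTransform
        var forward = -simd_make_float3(transform.columns.2) // device forward is -Z
        forward.y = 0                                        // ignore pitch
        forward = simd_normalize(forward)
        transform.columns.3 += simd_float4(forward * 5, 0)   // shift 5 m ahead
        entity.transform = Transform(matrix: transform)      // pinned; no anchor attached
    }
}
```

Since the entity is not attached to any anchor, it should simply stay pinned in mid-air, which is exactly what we need.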
In the WWDC23 sessions it was mentioned that the device won't support taking photos or recording videos through the cameras, which I think is a huge limitation. However, in another forum I read that it actually works using AVFoundation. So I went back into the docs, and they say it is not possible.
Hence, I am pretty confused. Has anyone tried this out yet and can confirm whether camera access is blocked completely or not? For our app, it would be a bummer if it were.
I recently had a chat with a company in the manufacturing business. They were asking whether the Vision Pro could be used to guide maintenance workers through maintenance processes, a use case that is already established on other platforms. I thought the Vision Pro would be perfect for this as well, until I read in this article from Apple that object detection is not supported:
https://developer.apple.com/documentation/visionos/bringing-your-arkit-app-to-visionos#Update-your-interface-to-support-visionOS
To me, this sounds like sacrificing a lot of potential for business scenarios just for the sake of data privacy. Is this really the case, i.e. is there no way to detect real-world objects and place content on top of them? Image recognition alone would not be enough for this use case.
Is it possible to render a Safari-based web view in a fully immersive space, so an app can show web pages there?
Hi, if I run an app in the visionOS simulator, I get tons of "garbage" messages in the Xcode logs (samples below). Because of these messages, I can hardly see the logs that are actually relevant. Is there any way to get rid of them?
```
[0x109015000] Decoding completed without errors
[0x1028c0000] Decoding: C0 0x01000100 0x00003048 0x22111100 0x00000000 11496
[0x1028c0000] Options: 1x-1 [FFFFFFFF,FFFFFFFF] 00054060
[0x1021f3200] Releasing session
[0x1031dfe00] Options: 1x-1 [FFFFFFFF,FFFFFFFF] 00054060
[0x1058eae00] Releasing session
[0x10609c200] Decoding: C0 0x01000100 0x00003048 0x22111100 0x00000000 10901
[0x1058bde00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 20910
[0x1028d5200] Releasing session
[0x1060b3600] Releasing session
[0x10881f400] Decoding completed without errors
[0x1058e2e00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 9124
[0x1028d1e00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 20778
[0x1031dfe00] Decoding completed without errors
[0x1031fe000] Decoding completed without errors
[0x1058e2e00] Options: 256x256 [FFFFFFFF,FFFFFFFF] 00025060
```
We're an AR startup in the US, but our founders live in Europe. We definitely want to order the Vision Pro once it becomes available in the States, but I just saw in my mail that Apple requires a US prescription if you wear glasses. This is a bummer for us: we can forward the Vision Pro to Europe, but we won't be able to travel to the States just to get such a prescription. Why can't Apple just accept any prescription from an optician?!
I am trying to create a map with markers that can be tapped. It should also be possible to create markers by tapping a location on the map.
Adding a tap gesture to the map works. However, if I place an image as an annotation (marker) and add a tap gesture to it, that tap is not recognized; instead, the tap gesture of the underlying map fires.
How can I
a) react to annotation/marker taps, and
b) prevent the underlying map from receiving the tap as well (i.e. how can I stop the event from bubbling)? A minimal repro sketch follows.
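This is roughly my setup (simplified, assuming the iOS 17 / visionOS 1 SwiftUI MapKit API):

```swift
import MapKit
import SwiftUI

// Minimal repro sketch: taps on the map create markers via MapReader,
// but taps on an annotation are not recognized; the map's gesture fires instead.
struct TappableMap: View {
    @State private var markers: [CLLocationCoordinate2D] = []

    var body: some View {
        MapReader { proxy in
            Map {
                ForEach(Array(markers.enumerated()), id: \.offset) { index, coordinate in
                    Annotation("Marker \(index)", coordinate: coordinate) {
                        Image(systemName: "mappin.circle.fill")
                            .font(.title)
                            .onTapGesture {
                                // Expected to fire on marker taps; in my testing it doesn't.
                                print("Tapped marker \(index)")
                            }
                    }
                }
            }
            .onTapGesture { screenPoint in
                // This fires even when an annotation was tapped.
                if let coordinate = proxy.convert(screenPoint, from: .local) {
                    markers.append(coordinate)
                }
            }
        }
    }
}
```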
Just walked through the order process, only to realize that I can't get one because my glasses have a prism. Come on, Apple, are you kidding???
I'm experimenting with MapKit on visionOS and would like to try out different locations. However, I cannot find a way to simulate them: neither setting a location in Xcode nor setting it in the Simulator works.
If I tap the MapUserLocationButton, I get this error message:
CLLocationManager(<CLLocationManager: 0x600000008a10>) for <MKCoreLocationProvider: 0x60000302a400> did fail with error: Error Domain=kCLErrorDomain Code=1 "(null)"
Also, adding MapCompass and MapScaleView to .mapControls has no effect, which is a pity, since map scaling does not work very well with a mouse in the simulator. How can I get these controls to work?
Last but not least, the MapUserLocationButton shows up in the very upper right and is cut off a bit, so I would love to pad it, but .padding has no effect either. A sketch of my setup follows below.
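For reference, this is roughly what I have (simplified):

```swift
import MapKit
import SwiftUI

// Sketch of my setup: the location button appears (cut off, padding
// ignored), while the compass and scale never show up in the
// visionOS simulator.
struct LocationMap: View {
    @State private var position: MapCameraPosition = .userLocation(fallback: .automatic)

    var body: some View {
        Map(position: $position)
            .mapControls {
                MapUserLocationButton()
                    .padding()   // no visible effect
                MapCompass()     // never appears
                MapScaleView()   // never appears
            }
    }
}
```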
I love the new SwiftUI APIs for Apple Maps. However, quite a number of features seem to be missing (or I haven't found them), particularly on visionOS.
Besides an easy way to zoom the map, the most important one for me is marker clustering. If you have a lot of markers on a map, this is an absolute must.
Is there any way to accomplish this? Absent a built-in API, I could do something naive myself (sketch below), but I'd much prefer proper support.
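To illustrate what I mean: a manual fallback would be to bucket coordinates into a grid before rendering, roughly like this (just a naive sketch; UIKit's MKClusterAnnotation does this far better, but as far as I can tell it isn't available in the SwiftUI Map):

```swift
import MapKit

// Naive grid-based clustering sketch: bucket coordinates into cells of
// `cellDegrees` and render one marker (with a count) per bucket.
struct Cluster {
    let coordinate: CLLocationCoordinate2D
    let count: Int
}

func cluster(_ coordinates: [CLLocationCoordinate2D],
             cellDegrees: Double) -> [Cluster] {
    var buckets: [String: [CLLocationCoordinate2D]] = [:]
    for c in coordinates {
        // Cell key from truncated lat/lon; good enough for a sketch.
        let key = "\(Int(c.latitude / cellDegrees)):\(Int(c.longitude / cellDegrees))"
        buckets[key, default: []].append(c)
    }
    return buckets.values.map { members in
        let lat = members.map(\.latitude).reduce(0, +) / Double(members.count)
        let lon = members.map(\.longitude).reduce(0, +) / Double(members.count)
        return Cluster(coordinate: CLLocationCoordinate2D(latitude: lat, longitude: lon),
                       count: members.count)
    }
}
```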
I have implemented a custom view that shows a page in a WKWebView:

```swift
import SwiftUI
import WebKit

struct WebView: UIViewRepresentable {
    let urlString: String

    func makeUIView(context: Context) -> WKWebView {
        let webView = WKWebView()
        webView.navigationDelegate = context.coordinator
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
        // Note: this runs on every SwiftUI update, so the page is reloaded each time.
        if let url = URL(string: urlString) {
            let request = URLRequest(url: url)
            uiView.load(request)
        }
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, WKNavigationDelegate {
        var parent: WebView

        init(_ parent: WebView) {
            self.parent = parent
        }
    }
}
```
It works, but it shows a grey button in the upper left with no icon. If I click that button, nothing happens, but I can see this error message in the Xcode logs:
Trying to convert coordinates between views that are in different UIWindows, which isn't supported. Use convertPoint:fromCoordinateSpace: instead.
What is this button, and how can I get rid of it?
As a second question: I also tried to open Safari in a separate window, using this view:
```swift
import SafariServices
import SwiftUI

struct SafariView: UIViewControllerRepresentable {
    let url: URL

    func makeUIViewController(context: Context) -> SFSafariViewController {
        SFSafariViewController(url: url)
    }

    func updateUIViewController(_ uiViewController: SFSafariViewController, context: Context) {
        // No update logic needed for a simple web view.
    }
}
```
This works, but the Safari window shows up behind the view that presents it. Instead, I would want Safari to show up in front, or even better, next to my main view (either left or right). Is there a way to do this? For context, the sketch below shows how I open it.
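This is roughly the multi-window route I'm using (a sketch; `MyApp` and `ContentView` are just illustrative names, and as far as I can tell the actual window placement is left to the system):

```swift
import SwiftUI

// Sketch: SafariView gets its own WindowGroup and is opened via the
// openWindow environment action; where the new window appears seems
// to be decided by the system, not the app.
@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        WindowGroup(id: "safari", for: URL.self) { $url in
            if let url {
                SafariView(url: url)
            }
        }
    }
}

struct ContentView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Open web page") {
            openWindow(id: "safari", value: URL(string: "https://www.example.com")!)
        }
    }
}
```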
Is it possible to show a map (or a WKWebView) in a fully immersive AR or VR view, so that it surrounds the user like a panorama?