I have two Apple Vision Pros so that I can build and test a multiplayer immersive reality game. But I am one developer, so I need to be able to take one Apple Vision Pro off and put the other one on to see what the other device is seeing, and to make sure my game data is being sent correctly over the network with Multipeer Connectivity.
But when I take one off, the Apple Vision Pro immediately goes to sleep. With visionOS 1, I could put a piece of paper inside the headset and it would stay on for hours, and I could take the devices on and off and debug my game. But now with visionOS 2, even with the paper, they soon go to sleep.
Is there a setting I can change or override as a developer to stop this auto sleep or auto lock?
I need to check things like:
When two devices are on the network, can I see them both so that players can select each other from a menu?
Can I send object positions back and forth? (A sketch of both checks is below.)
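A rough sketch of how both checks might look with MultipeerConnectivity. The "avp-game" service type, the PositionMessage payload, and the auto-accept invitation handling are placeholder choices for illustration, and as far as I know the app also needs NSLocalNetworkUsageDescription and NSBonjourServices entries in its Info.plist for local-network discovery to work:

import Foundation
import Combine
import MultipeerConnectivity
import simd

// Each headset advertises and browses the same service, publishes the peers it
// finds (to drive a "select your opponent" menu), and sends entity positions
// as Codable JSON to every connected peer.
final class GamePeer: NSObject, ObservableObject {
    struct PositionMessage: Codable {
        var entityName: String
        var position: SIMD3<Float>   // SIMD3<Float> is Codable out of the box
    }

    @Published var discoveredPeers: [MCPeerID] = []

    private let myPeerID: MCPeerID
    private let serviceType = "avp-game"   // placeholder service type
    private lazy var session = MCSession(peer: myPeerID, securityIdentity: nil, encryptionPreference: .required)
    private lazy var advertiser = MCNearbyServiceAdvertiser(peer: myPeerID, discoveryInfo: nil, serviceType: serviceType)
    private lazy var browser = MCNearbyServiceBrowser(peer: myPeerID, serviceType: serviceType)

    init(displayName: String) {
        myPeerID = MCPeerID(displayName: displayName)
        super.init()
        session.delegate = self
        advertiser.delegate = self
        browser.delegate = self
        advertiser.startAdvertisingPeer()
        browser.startBrowsingForPeers()
    }

    // Send an entity's position to every connected peer.
    func send(position: SIMD3<Float>, for entityName: String) {
        guard !session.connectedPeers.isEmpty,
              let data = try? JSONEncoder().encode(PositionMessage(entityName: entityName, position: position))
        else { return }
        try? session.send(data, toPeers: session.connectedPeers, with: .reliable)
    }
}

extension GamePeer: MCNearbyServiceBrowserDelegate {
    func browser(_ browser: MCNearbyServiceBrowser, foundPeer peerID: MCPeerID, withDiscoveryInfo info: [String: String]?) {
        DispatchQueue.main.async { self.discoveredPeers.append(peerID) }   // feeds the player-selection menu
    }
    func browser(_ browser: MCNearbyServiceBrowser, lostPeer peerID: MCPeerID) {
        DispatchQueue.main.async { self.discoveredPeers.removeAll { $0 == peerID } }
    }
}

extension GamePeer: MCNearbyServiceAdvertiserDelegate {
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser, didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?, invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)   // auto-accept, fine for single-developer debugging
    }
}

extension GamePeer: MCSessionDelegate {
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        if let message = try? JSONDecoder().decode(PositionMessage.self, from: data) {
            print("\(peerID.displayName) moved \(message.entityName) to \(message.position)")
        }
    }
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, with error: Error?) {}
}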
Thank you.
iPad and iOS apps on visionOS
Discussion about running existing iPad and iOS apps directly on Apple Vision Pro.
I'm working on a school project that allows users to open a .USDZ file (using Quick Look) on a webpage while using Apple Vision Pro, so they can place the object in their physical environment. The project is deployed on Vercel. When I test the page on my Apple Vision Pro and tap to open the .USDZ file, I see a triangle with an exclamation mark while it tries to load, but it never loads. Does anybody know how to troubleshoot this issue?
Hi everyone. I am working on a small project that requires world anchors so that I can persist my content whenever the user chooses to leave or close the app. However, I can't get my ARKit session to run, even though I think all the privacy permissions have been set and allowed correctly. Here is sample code in an empty scene:
//
//  WorldTrackingView.swift
//  SH_AVP_Demo
//
//  Created by 李希 on 9/19/24.
//

import SwiftUI
import RealityKit
import RealityKitContent
//import VisionKit
import ARKit
import Foundation
import UIKit
import simd

struct WorldTrackingView_test: View {
    @State var myCube = Entity()
    @Environment(\.scenePhase) var myScenePhase

    var body: some View {
        RealityView { content in
            // Load the scene from the RealityKitContent bundle
            if let scene = try? await Entity.load(named: "WorldTrackingScene", in: realityKitContentBundle) {
                // Add the scene to the view
                content.add(scene)
                // Look for the cube entity
                if let cubeEntity = scene.findEntity(named: "Cube") {
                    myCube = cubeEntity
                    // Create collision shapes for the cube
                    myCube.generateCollisionShapes(recursive: true)
                    // Allow indirect input to interact with it
                    myCube.components.set(InputTargetComponent(allowedInputTypes: .indirect))
                    // Set some ground shadows
                    myCube.components.set(GroundingShadowComponent(castsShadow: true))
                }
            }
        }
        // Add a drag gesture that targets any entity in the scene
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                // Do something when the cube position changes
                .onChanged { value in
                    value.entity.position = value.convert(value.location3D, from: .local, to: value.entity.parent!)
                    myCube = value.entity

                    // Test whether ARKit runs with different data providers
                    let session = ARKitSession()
                    let worldData = WorldTrackingProvider()
                    let planeData = PlaneDetectionProvider()
                    let sceneData = SceneReconstructionProvider()

                    Task {
                        do {
                            try await session.run([worldData])
                            for await update in worldData.anchorUpdates {
                                switch update.event {
                                case .added, .updated:
                                    // Update the app's understanding of this world anchor.
                                    print("Anchor position updated.")
                                case .removed:
                                    // Remove content related to this anchor.
                                    print("Anchor position now unknown.")
                                }
                            }
                        } catch {
                            print("session not running \(error.localizedDescription)")
                        }
                    }
                }
                // At the end of the gesture, save the anchor
                .onEnded { value in
                }
        )
    }
}

#Preview(immersionStyle: .mixed) {
    WorldTrackingView_test()
}
All it does is generate a cube in an immersive view. The cube has collision and input components added so that I can interact with it using a drag gesture. I decided to start an ARKit session with a WorldTrackingProvider(), but I keep getting the following error:
ARPredictorRemoteService <0x117e0c620>: Service configured with error: Error Domain=com.apple.arkit.error Code=501 "(null)"
Remote Service was invalidated: <ARPredictorRemoteService: 0x117e0c620>, will stop all data_providers.
ARRemoteService: remote object proxy failed with error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service with pid 81 named com.apple.arkit.service.session was invalidated from this process." UserInfo={NSDebugDescription=The connection to service with pid 81 named com.apple.arkit.service.session was invalidated from this process.}
ARRemoteService: weak self released before invalidation
ARRemoteService: remote object proxy failed with error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service with pid 81 named com.apple.arkit.service.prediction was invalidated from this process." UserInfo={NSDebugDescription=The connection to service with pid 81 named com.apple.arkit.service.prediction was invalidated from this process.}
ARRemoteService: weak self released before invalidation
If I switch it to a PlaneDetectionProvider() or a SceneReconstructionProvider() I get print statements in my terminal, but none with a WorldTrackingProvider(). Any idea what could be causing this? The same code was working before a recent Xcode update, I believe.
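Not an answer to the 501 error itself, but one thing that may be worth ruling out: in the code above, the ARKitSession and WorldTrackingProvider are created as locals inside the drag-gesture closure, so nothing keeps them alive after the closure returns, and (as far as I know) ARKit data providers only deliver updates while an immersive space is open. A minimal sketch of keeping them alive for the life of the view and starting them from .task; the names are illustrative:

import SwiftUI
import RealityKit
import ARKit

struct WorldTrackingLifetimeView: View {
    // Keep the session and provider alive for as long as the view exists.
    @State private var session = ARKitSession()
    @State private var worldTracking = WorldTrackingProvider()

    var body: some View {
        RealityView { content in
            // ... load and add your scene here ...
        }
        .task {
            do {
                try await session.run([worldTracking])
                for await update in worldTracking.anchorUpdates {
                    switch update.event {
                    case .added, .updated:
                        print("World anchor updated: \(update.anchor.id)")
                    case .removed:
                        print("World anchor removed: \(update.anchor.id)")
                    }
                }
            } catch {
                print("ARKitSession failed to run: \(error)")
            }
        }
    }
}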
My application supports iOS and visionOS. In some cases the image asset is slightly different on visionOS, so to handle this difference we use the visionOS image set to assign a different image.
Here is the screenshot of the asset, the warning and the detailed warning.
I am a student developer
We are trying to implement an application that allows users to take photos in visionOS mixed reality (MR) mode and access the photos they took.
Can the contents of the link below be used on visionOS?
https://developer.apple.com/tutorials/sample-apps/capturingphotos-captureandsave/
I would really appreciate your reply.
For reference, we plan to package the methods in Swift and import the framework into Unity to use them.
navigationController.popToRootViewController(animated: true) does not work on Xcode 16 / iOS 18 Simulator. However, setting animated: to false works fine.
This is only happening on iOS 18 / Xcode 16.
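A minimal UIKit reproduction sketch of the behavior described above; the controller names are illustrative, and RootViewController is assumed to be embedded in a UINavigationController:

import UIKit

final class RootViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground
        let push = UIButton(type: .system, primaryAction: UIAction(title: "Push child") { [weak self] _ in
            self?.navigationController?.pushViewController(ChildViewController(), animated: true)
        })
        push.frame = CGRect(x: 40, y: 120, width: 200, height: 44)
        view.addSubview(push)
    }
}

final class ChildViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground
        let pop = UIButton(type: .system, primaryAction: UIAction(title: "Pop to root") { [weak self] _ in
            // Reported broken on the iOS 18 simulator when animated is true;
            // passing animated: false reportedly works.
            self?.navigationController?.popToRootViewController(animated: true)
        })
        pop.frame = CGRect(x: 40, y: 120, width: 200, height: 44)
        view.addSubview(pop)
    }
}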
Hi team,
Our app is going to upgrade its backend infrastructure, and we need to announce this to our users.
It seems there may be a limitation that doesn't allow a message about the decommissioning to be put up on the app's home page.
I have checked some docs on the internet but still haven't found an official one that covers this topic.
Can we simply put up the message described above, or is there actually a restriction that doesn't allow it?
Thanks
I have an application made with Flutter that can run on visionOS as a "Designed for iPad" app, and I would like to be able to go into mixed reality from inside this application somehow. What I have tried so far is embedding my visionOS project inside the Swift application that Flutter generates, but with this attempt I got an error from Xcode telling me that this is not possible. Is there another way I could achieve my goal?
Hi All,
I am trying to build a new iOS app by following https://developer.apple.com/videos/play/wwdc2024/10163/?time=67
When I try to remove all the legacy VN-prefixed code I get errors. I would appreciate it if someone could help me get up to speed with the new Vision API.
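For anyone making the same migration, here is a sketch of what the new Swift-first Vision API from that session looks like for text recognition, replacing the legacy VNRecognizeTextRequest. The exact type and method names here are based on that session and should be checked against the current documentation:

import Foundation
import Vision

// New-style Vision: construct a request value and await perform(on:),
// instead of VNImageRequestHandler + VNRecognizeTextRequest + completion handlers.
func recognizeText(in imageURL: URL) async throws -> [String] {
    let request = RecognizeTextRequest()

    // perform(on:) accepts image sources such as a file URL, Data, or CGImage.
    let observations = try await request.perform(on: imageURL)

    // Each observation exposes candidates much like the legacy API did.
    return observations.compactMap { $0.topCandidates(1).first?.string }
}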
Hi,
we've been developing an iOS game using OpenGL ES and tried running the compatible iOS app in the Vision Pro simulator on an M1 Mac, but we hit a strange EXC_BAD_ACCESS crash in glDrawElements(), while the app runs normally in iOS or iPad simulators with the same development environment. The stack trace is as follows:
Our Mac is a Mac Studio (2022) with an Apple M1 Max, running macOS 14.1.2.
Xcode: Version 15.3 (15E204a)
Simulator OS: visionOS 1.1
iOS app running in compatibility mode on Apple Vision Pro does not apply hoverEffect to SwiftUI View
We tested our iOS app on visionOS and found that the hover effect works on most UIKit views, but does not work on most SwiftUI views.
SwiftUI views are used within UIHostingController.
I created a new project (a storyboard-based iOS app), displayed the following SwiftUI view in a UIHostingController, and found that the button in the List was highlighted, but not the standalone button.
struct SwiftUIView: View {
    var body: some View {
        List {
            // This button has the hover effect.
            Button {
                print("Hello")
            } label: {
                Text("Hello")
            }
        }

        // This button doesn't have the hover effect.
        Button {
            print("Hello")
        } label: {
            Text("Hello")
        }
        .hoverEffect()
    }
}
Is there a way to highlight any SwiftUI view?
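One pattern that may help, though it is not confirmed to cover every case in compatibility mode: make sure the view is actually interactive and give it an explicit hover shape. The gesture, shape, and effect choices below are assumptions for illustration:

import SwiftUI

struct HoverableCard: View {
    var body: some View {
        Text("Hello")
            .padding()
            // Hover effects generally appear only on views the system treats
            // as interactive, so give it a gesture (or wrap it in a Button).
            .onTapGesture { print("Hello") }
            // Define the region the hover effect should use...
            .contentShape(.hoverEffect, RoundedRectangle(cornerRadius: 12))
            // ...and ask for a highlight-style effect.
            .hoverEffect(.highlight)
    }
}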
I’m trying to add screenshot for my new app, I did it the same way I have been doing this and it seems like there’s a issue with the website causing it to say that the images are still processing after hitting add to review. Is anyone having this issue, I’ve been trying on 3 different apple devices since 6pm EST.
In the visionOS simulator, a ContactPicker for multiple-contact selection is shown without the Done button. Can I assume this behavior will be OK on an actual Vision Pro? I could not get a list of contacts to be selected.
On iOS, the Done button is shown correctly with the following code:
import SwiftUI
import UIKit
import ContactsUI
import Combine

struct ContactPickerView: View {
    @State private var pickedNumber: String?
    @StateObject private var coordinator = Coordinator()

    var body: some View {
        VStack {
            Button("Open Contact Picker") {
                openContactPicker()
            }
            .padding()

            Text(pickedNumber ?? "")
                .padding()
        }
        .onReceive(coordinator.$pickedNumber) { phoneNumber in
            self.pickedNumber = phoneNumber
        }
        .environmentObject(coordinator)
    }

    func openContactPicker() {
        let contactPicker = CNContactPickerViewController()
        contactPicker.delegate = coordinator
        let scenes = UIApplication.shared.connectedScenes
        let windowScene = scenes.first as? UIWindowScene
        let window = windowScene?.windows.first
        window?.rootViewController?.present(contactPicker, animated: true, completion: nil)
    }

    class Coordinator: NSObject, ObservableObject, CNContactPickerDelegate {
        @Published var pickedNumber: String?

        func contactPicker(_ picker: CNContactPickerViewController, didSelect contacts: [CNContact]) {
            print(contacts)
            contacts.forEach { contact in
                for number in contact.phoneNumbers {
                    let phoneNumber = number.value
                    print("number is = \(phoneNumber)")
                }
            }
        }
    }
}
I want to run my custom application on my Vision Pro.
I have paired my Vision Pro with Xcode successfully, but when I run the app it tells me to enable Developer Mode on the Vision Pro.
I followed the instructions and went to Settings -> Privacy & Security,
but there is no Developer Mode option visible anywhere.
Please let me know how I can enable Developer Mode on the Vision Pro.
Thanks
I am developing an iPhone app, but I've been targeting the AVP as well. In fact, since I got the AVP, I've mainly been building and running my app on it. This morning, I upgraded to Xcode 15.4 (15F31d). Ever since, I have not been able to see my AVP as a run destination.
It does show up in the device list, although there are no provisioning files on it for some reason. But I can't target it for building. I've tried unpairing and turning developer mode off and on.
Has anyone else seen this problem after upgrading Xcode? Any help is appreciated.
I'm converting my iOS/iPadOS app so it also runs on visionOS, and I'm trying to build it for both visionOS and iOS. When I build for an iPhone or iPad simulator, I get the following error:
 Building for 'iphonesimulator', but realitytool only supports [xros, xrsimulator]
I’m thinking I might need to do a # if conditional compilation statement for visionOS so iOS doesn’t try to build lines of code but I can’t for this particular error find out for which file or code I need to do the conditional compilation. Anyone know how to get rid of this error? 
I have a Unity scene that I created for Vision Pro, and I have also created a biometric-authentication application for visionOS using Xcode and Swift. What I want to do is launch the Unity scene from the Xcode side after authentication has taken place. I have seen a Medium post, but it only shows how to do that for iOS apps; I am not able to do it for Vision Pro.
I have followed this post : https://medium.com/mop-developers/launch-a-unity-game-from-a-swiftui-ios-app-11a5652ce476
I'm doing all this because, as far as I know, Apple Vision Pro does not currently support Optic ID authentication with Unity's PolySpatial plugin.
Any help on this will be appreciated.
Thank you in advance.
Today I tried to add a second archive action for visionOS. I added a visionOS destination to my app target a while back and can build and archive my app for visionOS locally in Xcode 15.3, and also run it on the device.
Xcode Cloud is giving me the following errors in the Archive - visionOS action (Archive - iOS works):
Invalid Info.plist value. The value for the key 'DTPlatformName' in bundle MyApp.app is invalid.
Invalid sdk value. The value provided for the sdk portion of LC_BUILD_VERSION in MyApp.app/MyApp is 17.4 which is greater than the maximum allowed value of 1.2.
This bundle is invalid. The value provided for the key MinimumOSVersion '17.0' is not acceptable.
Type Mismatch. The value for the Info.plist key CFBundleIcons.CFBundlePrimaryIcon is not of the required type for that key. See the Information Property List Key Reference at https://developer.apple.com/library/ios/documentation/general/Reference/InfoPlistKeyReference/Introduction/Introduction.html#//apple_ref/doc/uid/TP40009248-SW1
All 4 errors are annotated with "Prepare Build for App Store Connect" and I get them for both "TestFlight (Internal Testing Only)" and "TestFlight and App Store" deployment preparation options.
I have tried removing the visionOS destination and adding it back, but this does not change the project at all.
Any ideas what I am missing?
At first, everything was OK. I had been connecting my Apple Vision Pro to Xcode over the wireless network and building my app for the past several weeks. But since yesterday, I can no longer connect the Apple Vision Pro to Xcode. The device is not listed in the Devices and Simulators window. I have tried:
Updating Xcode to 15.3
Rebooting my Mac and Apple Vision Pro
Resetting the Apple Vision Pro and erasing all data
Other Macs on the same network also do not list any Vision Pro device.
I'm sure the Vision Pro and the Mac are on the same network, and this worked before. I go to Settings -> General -> Remote Devices and open Xcode's Devices and Simulators window, but I still can't see any Apple Vision Pro device.
I am attempting to integrate visionOS support into my existing iOS app, which uses SwiftUI and CocoaPods. However, upon adding visionOS as a supported platform and attempting to run the app, I encounter two errors:
"'jot/jot.h' file not found" at "/Users/xxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxxxx/xxxxxxxxx-Bridging-Header.h:17:9".
"Failed to emit precompiled header" at "/Users/xxxxxx/Library/Developer/Xcode/DerivedData/xxxxxxxxx-bnhvaxypgfhmvqgklzjdnxxbrdhu/Build/Intermediates.noindex/PrecompiledHeaders/xxxxxxxxx-Bridging-Header-swift_6TTOG1OAZB5F-clang_21TRHDW14EDOZ.pch" for bridging header "/Users/xxxxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxx/xxxxxxxx-Bridging-Header.h".
I'm seeking assistance with resolving these errors. Below is my Podfile configuration:
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '15.0'

target 'xxxxxxxxxx' do
  use_frameworks!

  pod 'RealmSwift'
  pod 'JGProgressHUD'
  pod 'BadgeLabel'
  pod 'jot'
  pod 'MaterialComponents/Chips'
  pod 'GoogleMaps'
  pod 'Firebase/Crashlytics'
  pod 'Firebase/Analytics' # Firebase pod for Google Analytics
  # Add pods for any other desired Firebase products
  # https://firebase.google.com/docs/ios/setup#available-pods
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '15.0'
    end
  end
end
Any assistance in resolving these errors would be greatly appreciated.