iPad and iOS apps on visionOS


Discussion about running existing iPad and iOS apps directly on Apple Vision Pro.

Posts under iPad and iOS apps on visionOS tag

93 Posts
0 Replies
125 Views
Until recently everything was fine: my Apple Vision Pro was connected to Xcode over the wireless network, and I have been building my app on it for the past several weeks. But since yesterday I can no longer connect the Apple Vision Pro to Xcode; the device is not listed in the Devices and Simulators window. I have tried:
- Updating Xcode to 15.3
- Rebooting my Mac and the Apple Vision Pro
- Resetting the Apple Vision Pro and erasing all data

Other Macs on the same network also do not list any Vision Pro device. I'm sure the Vision Pro and the Mac are on the same network, and it worked before. I go to Settings > General > Remote Devices on the device and open Xcode's Devices and Simulators window, but still can't see any Apple Vision Pro device.
Posted by A00a.
0 Replies
137 Views
I am attempting to add visionOS support to my existing iOS app, which uses SwiftUI and CocoaPods. However, after adding visionOS as a supported platform and attempting to run the app, I encounter two errors:
- "'jot/jot.h' file not found" at "/Users/xxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxxxx/xxxxxxxxx-Bridging-Header.h:17:9"
- "Failed to emit precompiled header" at "/Users/xxxxxx/Library/Developer/Xcode/DerivedData/xxxxxxxxx-bnhvaxypgfhmvqgklzjdnxxbrdhu/Build/Intermediates.noindex/PrecompiledHeaders/xxxxxxxxx-Bridging-Header-swift_6TTOG1OAZB5F-clang_21TRHDW14EDOZ.pch" for bridging header "/Users/xxxxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxx/xxxxxxxx-Bridging-Header.h"

I'm seeking assistance with resolving these errors. Below is my Podfile configuration:

```ruby
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '15.0'

target 'xxxxxxxxxx' do
  use_frameworks!

  pod 'RealmSwift'
  pod 'JGProgressHUD'
  pod 'BadgeLabel'
  pod 'jot'
  pod 'MaterialComponents/Chips'
  pod 'GoogleMaps'
  pod 'Firebase/Crashlytics'
  pod 'Firebase/Analytics' # Firebase pod for Google Analytics
  # Add pods for any other desired Firebase products
  # https://firebase.google.com/docs/ios/setup#available-pods
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '15.0'
    end
  end
end
```

Any assistance in resolving these errors would be greatly appreciated.
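One possible direction, offered only as a sketch rather than a verified fix: if a pod such as jot will not build for visionOS, a common workaround is to remove its import from the bridging header and guard every use of it with conditional compilation, so the visionOS build simply omits that feature. The module name `jot` and the `AnnotationSupport` wrapper below are assumptions for illustration:

```swift
// Hypothetical guard: only pull the pod in where it actually builds.
#if canImport(jot) && !os(visionOS)
import jot
#endif

enum AnnotationSupport {
    /// True on platforms where the jot-based drawing feature can be offered.
    static var isAvailable: Bool {
        #if canImport(jot) && !os(visionOS)
        return true
        #else
        return false
        #endif
    }
}
```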
1 Reply
111 Views
Is there a way to find APIs that are compatible with both iOS and visionOS (e.g., hoverStyle)? I'm encountering difficulties developing for visionOS, although I can successfully build for 'Apple Vision (Designed for iPad)'. Are there any methods for discovering APIs that support both platforms? I'm looking to enhance my application and would appreciate any guidance on where to find such APIs. Additionally, I'm interested in changing the background to a glass style, but it seems this may not be supported by the available APIs, particularly those designed for visionOS. Any suggestions or insights would be greatly appreciated.
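A hedged sketch of one way to share a view across both platforms: modifiers available on both iOS and visionOS (such as .hoverEffect) can be applied unconditionally, while visionOS-only modifiers such as .glassBackgroundEffect() can be wrapped in #if os(visionOS) so the same SwiftUI view compiles for a native visionOS target and for iOS. The view name here is just an example:

```swift
import SwiftUI

struct InfoCard: View {
    var body: some View {
        VStack(spacing: 8) {
            Text("Shared UI")
            Text("Runs on iOS and visionOS")
        }
        .padding()
        .hoverEffect(.highlight)   // available on iOS 13.4+ (pointer) and visionOS
        #if os(visionOS)
        .glassBackgroundEffect()   // visionOS-only glass background
        #endif
    }
}
```

For discovering which platforms a given API supports, the Availability section on each symbol's page in the Apple Developer documentation lists the supported platforms and minimum versions.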
Posted by soosoosoo.
3 Replies
210 Views
Looking for an iOS Vision Developer for a Private Project

We are seeking a talented iOS Vision developer to join our team for an exciting private project. The project involves the development of an interactive space-modeling application.

Requirements:
- Proficiency in iOS development with a strong focus on the Vision framework.
- Experience in developing interactive applications.
- Ability to collaborate effectively within a team environment.
- Strong problem-solving skills and attention to detail.

Responsibilities:
- Designing and implementing features using the Vision framework.
- Collaborating with the team to ensure the smooth integration of features.
- Testing and debugging applications to ensure optimal performance.

If you are passionate about iOS development and have experience with the Vision framework, we would love to hear from you.
1 Reply
233 Views
I want to show only spatial videos when opening the photo library in my app. How can I achieve this? One more thing: if the user selects a video from the photo library, how can I identify whether the selected video is a spatial video or not?

```swift
self.presentPicker(filter: .videos)

/// - Tag: PresentPicker
private func presentPicker(filter: PHPickerFilter?) {
    var configuration = PHPickerConfiguration(photoLibrary: .shared())

    // Set the filter type according to the user's selection.
    configuration.filter = filter
    // Set the mode to avoid transcoding, if possible, if your app supports arbitrary image/video encodings.
    configuration.preferredAssetRepresentationMode = .current
    // Set the selection behavior to respect the user's selection order.
    configuration.selection = .ordered
    // Set the selection limit to enable multiselection.
    configuration.selectionLimit = 1

    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = self
    present(picker, animated: true)
}

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true) {
        // do something on dismiss
    }

    guard let provider = results.first?.itemProvider else { return }

    provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { url, error in
        guard error == nil else {
            print(error!)
            return
        }

        // receiving the video-local-URL / filepath
        guard let url = url else { return }

        // create a new filename
        let fileName = "\(Int(Date().timeIntervalSince1970)).\(url.pathExtension)"
        // create new URL
        let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
        print(newUrl)
        print("===========")
        // copy item to APP Storage
        //try? FileManager.default.copyItem(at: url, to: newUrl)
        // self.parent.videoURL = newUrl.absoluteString
    }
}
```
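For the detection part, a sketch under these assumptions: the picked movie is readable at the returned URL, and the deployment target is new enough (iOS 17 / visionOS 1) for the stereo-multiview media characteristic. A spatial video's video track should report AVMediaCharacteristic.containsStereoMultiviewVideo, so you can check the tracks after loading the asset. For restricting the picker itself, I am not aware of a dedicated PHPickerFilter for spatial media, so filtering after selection may be necessary:

```swift
import AVFoundation

/// Sketch: returns true if the movie at `url` contains stereoscopic (spatial) video.
func isSpatialVideo(at url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    // Load the video tracks asynchronously and look for the stereo multiview characteristic.
    let videoTracks = try await asset.loadTracks(withMediaType: .video)
    for track in videoTracks {
        let characteristics = try await track.load(.mediaCharacteristics)
        if characteristics.contains(.containsStereoMultiviewVideo) {
            return true
        }
    }
    return false
}
```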
0 Replies
178 Views
Hi. I am trying to implement a drag-and-drop system using RealityView, with a model that pins to a plane in the scene. For that I want to do a raycast from the center of the object I am currently dragging to a plane in the scene. My idea was to convert the object's position to screen coordinates first and then raycast into the scene from the screen to determine whether the object I am dragging is above a certain plane. The issue I am having is that I can't seem to find a way to convert the position of an object in the scene to screen space. Is there an API for that?
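A hedged alternative sketch that avoids screen space entirely: RealityKit's Scene.raycast(origin:direction:length:query:mask:relativeTo:) can be cast straight down (or along any direction) from the dragged entity's world position, so you can test whether a plane with a collision shape lies beneath it without converting to screen coordinates. The function name and the 5-metre search length below are illustrative assumptions:

```swift
import RealityKit

/// Sketch: returns the first collision hit directly below `dragged`, if any.
/// Assumes the target plane entity has a CollisionComponent so the ray can hit it.
func surfaceBelow(_ dragged: Entity, in scene: RealityKit.Scene) -> CollisionCastHit? {
    let origin = dragged.position(relativeTo: nil)   // world-space position of the dragged model
    let down = SIMD3<Float>(0, -1, 0)                // cast straight down
    let hits = scene.raycast(origin: origin,
                             direction: down,
                             length: 5,              // metres to search
                             query: .nearest,
                             mask: .all,
                             relativeTo: nil)
    return hits.first
}
```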
Posted by vlsim.
0 Replies
176 Views
Hi! I'm currently showing the Apple Vision Pro to my clients. Sharing the screen is challenging with Guest Mode on the AVP. You can share the screen, but you need to tether the AVP and MBP to an iPhone and then AirPlay. A lot of big corporate Wi-Fi networks may not allow AirPlay, so tethering is the better option. Does anyone know if Apple is making their AVP demo app available to developers? This would be super helpful for showing off the AVP's capabilities. Thanks! JB
4 Replies
498 Views
Hi guys, if you have started using Vision Pro, I'm sure you have already found some limitations. Let's join forces and make feature requests. A Feedback request from one person may not get any attention from Apple, but if more of us make the same request, we might just push those ideas through. Feel free to add your ideas, and don't forget to file Feedback:

- App windows can only be moved forward to a distance of about 20 ft / 6 m. I'm pretty sure some users would like to push a window as far as a few miles away and make it large enough to still be legible. This would be very interesting especially when using Environments and the 360-degree view. I really want to put some apps up in the sky above the mountains and around me, even iOS apps that were just made compatible with Vision Pro.
- When capturing the screen, I always get the message "Video capture not possible due to insufficient lighting". Why? I have an Environment loaded and extended 360 degrees with some apps open, so there should be no need for external lighting (at least I think it's not needed). I just want to capture what I see. Imagine creating tutorials, recording lessons for various subjects, etc. Actual Vision Pro users might prefer loading their own Environments and setting apps up in the spatial domain, but for those who don't have the device yet, or when creating videos to be watched on antique 2D computer screens, it may be useful to create 2D videos this way.
- 3D video recording is not very good; it is kind of shaky, not when the Vision Pro is static, but when walking and especially when turning the head left/right/up/down (even relatively slowly). I think the hardware should be able to capture and create nice, smooth video. It's possible that Apple just designed a simple Camera app and wants to give developers a chance to create a better one, but it would still be nice to have something better out of the box.
- I would like to be able to walk through Environments. I understand the safety purpose of the see-through effect, so users don't hit any obstacles, but perhaps obstacles could be detected: when the user gets within 6 ft / 2 m of an obstacle, it could first present a warning (there is already "You are close to an object") and then make the surroundings visible. But if there are no obstacles (the user can be in a large space and can place a tape or a thread around the safe area), I should be able to walk around and take a look inside that crater on the Moon.
- We need Environments, Environments, Environments, and yet more of them. I was hoping for hundreds, so we could even pick some of them and use them in our apps, like games where you want to set up a specific environment.

Well, that's just a beginning and I could go on and on, but tell me what you guys think. Regards and enjoy the new virtual adventure! Robert
1 Reply
440 Views
As the title already suggests: is it possible with the current Apple Vision Pro simulator to recognize objects or humans, as is currently possible on the iPhone? I am not even sure whether there is an API for accessing the cameras of the Vision Pro. My goal is to recognize, for example, a human and attach a 3D object to them, for example a hat. Can this be done?
Posted by wladislaw.
0 Replies
251 Views
Hello! I have an app that has both iPhone and iPad targets. I want to use the iPhone version on Vision Pro, but since the app has an iPad version, the system always uses the "Designed for iPad" version. Is there any way to set the "Designed for iPhone" version as the one to use on Vision Pro (while keeping the iPad target)? Thank you!
0 Replies
284 Views
My project is an iOS app written in Objective-C, and now I need to bring it to visionOS (as a native visionOS app, not the unmodified "Designed for iPhone" version). Question one: how can I differentiate visionOS in code? I need to use macro definitions or conditional compilation, otherwise it cannot be compiled. Question two: are there any other tips, or other issues I should know about? Thanks.
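A hedged sketch of the usual approach: in Swift the platform check is the os(visionOS) compilation condition, and on the Objective-C/C side the corresponding check is, as far as I know, the TARGET_OS_VISION macro from TargetConditionals.h when building against a visionOS-capable SDK. A minimal Swift illustration of the compile-time branch (the type name is just an example):

```swift
enum PlatformInfo {
    /// Compile-time platform check; the visionOS branch is only built for a native visionOS target.
    static var description: String {
        #if os(visionOS)
        return "Running as a native visionOS app"
        #elseif os(iOS)
        return "Running as an iOS app"
        #else
        return "Other platform"
        #endif
    }
}
```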
Posted by lowinding.
0 Replies
201 Views
After using the Vision Pro and experiencing the impressive features of visionOS, I couldn't ignore the discomfort of wearing the device for an extended period. The strain on my face and eyes made me frequently take off the headset. While VR applications can provide an immersive experience that helps one overlook some of the discomfort, if the AVP is positioned not as a VR device but rather as an MR (mixed reality) device, interaction with the real world will be significantly affected by the wearing experience. I'm wondering if Apple has any plans to optimize hardware weight and screen radiation.
Posted by Gocobnus.