I am attempting to integrate visionOS support into my existing iOS app, which uses SwiftUI and CocoaPods. However, after adding visionOS as a supported platform and attempting to run the app, I encounter two errors:
"'jot/jot.h' file not found" at "/Users/xxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxxxx/xxxxxxxxx-Bridging-Header.h:17:9".
"Failed to emit precompiled header" at "/Users/xxxxxx/Library/Developer/Xcode/DerivedData/xxxxxxxxx-bnhvaxypgfhmvqgklzjdnxxbrdhu/Build/Intermediates.noindex/PrecompiledHeaders/xxxxxxxxx-Bridging-Header-swift_6TTOG1OAZB5F-clang_21TRHDW14EDOZ.pch" for bridging header "/Users/xxxxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxx/xxxxxxxx-Bridging-Header.h".
Below is my Podfile configuration:
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '15.0'

target 'xxxxxxxxxx' do
  use_frameworks!

  pod 'RealmSwift'
  pod 'JGProgressHUD'
  pod 'BadgeLabel'
  pod 'jot'
  pod 'MaterialComponents/Chips'
  pod 'GoogleMaps'
  pod 'Firebase/Crashlytics'
  pod 'Firebase/Analytics' # Firebase pod for Google Analytics
  # Add pods for any other desired Firebase products
  # https://firebase.google.com/docs/ios/setup#available-pods
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '15.0'
    end
  end
end
Any assistance in resolving these errors would be greatly appreciated.
iPad and iOS apps on visionOS
Discussion about running existing iPad and iOS apps directly on Apple Vision Pro.
Is there a method for finding APIs that are compatible with both iOS and visionOS (e.g., hoverStyle)?
I'm encountering difficulties in developing for visionOS, although I can successfully build for 'Apple Vision (Designed for iPad)'. Is there any way to discover which APIs support both platforms? I'm looking to enhance my application and would appreciate any guidance on where to find such APIs.
Additionally, I'm interested in changing the background to a glass style. However, it seems that this feature may not be supported by the available APIs, particularly those designed for visionOS. Any suggestions or insights would be greatly appreciated.
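A minimal sketch of one pattern for this, assuming SwiftUI and that the visionOS-only glassBackgroundEffect() modifier is what is meant by a glass-style background: guard the visionOS-specific modifier with conditional compilation so the same view still builds for iOS. The view name and the iOS fallback below are illustrative, not from the post, and API availability should be verified in the documentation.

import SwiftUI

// Sketch: one view that builds for both iOS and visionOS. The visionOS-only
// glass background modifier is guarded by conditional compilation; iOS falls
// back to a material background. Names here are illustrative.
struct CrossPlatformCard: View {
    var body: some View {
        #if os(visionOS)
        label.glassBackgroundEffect()   // visionOS-only modifier; verify availability in the docs
        #else
        label.background(.thinMaterial) // plain iOS fallback
        #endif
    }

    private var label: some View {
        Text("Hello from iOS and visionOS")
            .padding()
    }
}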
With iOS 17.4.1 we have noticed that the "Disallow AirPrint" restriction is not working; it works on 17.3.1. The restriction gets pushed down to both devices, but it has no effect on iOS 17.4.1.
Looking for an iOS Vision Developer for Private Project
We are seeking a talented iOS Vision Developer to join our team for an exciting private project. The project involves the development of an interactive space modeling application.
Requirements:
Proficiency in iOS development with a strong focus on Vision framework.
Experience in developing interactive applications.
Ability to collaborate effectively within a team environment.
Strong problem-solving skills and attention to detail.
Responsibilities:
Designing and implementing features using the Vision framework.
Collaborating with the team to ensure the smooth integration of features.
Testing and debugging applications to ensure optimal performance.
If you are passionate about iOS development and have experience with the Vision framework, we would love to hear from you.
App A wants to know whether the iPhone user is looking at a photo of one of their family members, and then display some information related to that person. Is this possible?
Hi, I am a musician and I tried the Vision Pro a few days ago. I have ideas for a related app on Vision Pro. I would like to find a developer who can help me build the app :) Thank you all!
I want to get only spatial videos when opening the photo library in my app. How can I achieve this?
One more thing: if I select a video using the photo library, how can I identify whether the selected video is a spatial video or not?
self.presentPicker(filter: .videos)

/// - Tag: PresentPicker
private func presentPicker(filter: PHPickerFilter?) {
    var configuration = PHPickerConfiguration(photoLibrary: .shared())
    // Set the filter type according to the user’s selection.
    configuration.filter = filter
    // Set the mode to avoid transcoding, if possible, if your app supports arbitrary image/video encodings.
    configuration.preferredAssetRepresentationMode = .current
    // Set the selection behavior to respect the user’s selection order.
    configuration.selection = .ordered
    // Set the selection limit to enable multiselection.
    configuration.selectionLimit = 1
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = self
    present(picker, animated: true)
}
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true) {
        // do something on dismiss
    }
    guard let provider = results.first?.itemProvider else { return }
    provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { url, error in
        if let error = error {
            print(error)
            return
        }
        // receiving the video-local-URL / filepath
        guard let url = url else { return }
        // create a new filename
        let fileName = "\(Int(Date().timeIntervalSince1970)).\(url.pathExtension)"
        // create new URL
        let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
        print(newUrl)
        print("===========")
        // copy item to app storage
        //try? FileManager.default.copyItem(at: url, to: newUrl)
        // self.parent.videoURL = newUrl.absoluteString
    }
}
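A hedged sketch of one way the spatial-video check is sometimes attempted once a file URL is available: inspect the asset for stereo multiview video tracks, which is how spatial videos are encoded. It assumes AVMediaCharacteristic.containsStereoMultiviewVideo and the async loadTracks(withMediaCharacteristic:) API are available in the deployment target (iOS 17 / visionOS era); verify before relying on it.

import AVFoundation

// Hedged sketch (not from the original post): spatial videos are stereo
// multiview HEVC, so the presence of a track with that characteristic is one
// plausible indicator that a picked video is spatial.
func isLikelySpatialVideo(at url: URL) async -> Bool {
    let asset = AVURLAsset(url: url)
    do {
        let stereoTracks = try await asset.loadTracks(
            withMediaCharacteristic: .containsStereoMultiviewVideo
        )
        return !stereoTracks.isEmpty
    } catch {
        print("Could not inspect asset: \(error)")
        return false
    }
}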
Hi. I am trying to implement a drag-and-drop system using RealityView, with a model pinned to a plane in the scene. For that I want to do a raycast from the center of the object I am currently dragging to a plane in the scene. My idea was to convert the object's position to screen coordinates first and then raycast into the scene from the screen to determine whether the object I am dragging is above a certain plane.
The issue I am having is I can't seem to find a way to convert the position of the object in the scene to screen space. Is there an API for that?
Hi!
Currently showing the Apple Vision Pro to my clients. Sharing the screen is challenging with Guest Mode on the AVP. You can share the screen, but you need to tether the AVP and MBP to an iPhone, and then you can AirPlay. A lot of big corporate Wi-Fi networks may not allow AirPlay, so tethering is a better option.
Does anyone know if Apple is making their AVP demo app available to developers?
This would be super helpful for showing off the AVP's capabilities.
Thanks!
JB
I made a model with an AssetBundle, and it was based on the iOS platform. On Vision Pro, I wanted to load this model, but it failed to load. I need some help.
WindowGroups have ids, like this: WindowGroup(id: "window_id"), but DocumentGroups do not. So opening like this does not work: openWindow(id: "window_id").
The problem is that if the user dismisses the DocumentGroup, there is no way to reopen it.
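For contrast, a minimal sketch of the id-based pattern the post refers to, which works for WindowGroup but has no DocumentGroup counterpart; "window_id" comes from the post, everything else is illustrative.

import SwiftUI

// Sketch of the WindowGroup pattern mentioned above: the scene has an id, so
// it can be reopened with openWindow(id:). DocumentGroup scenes take no id,
// which is the gap the post describes.
@main
struct ExampleApp: App {
    var body: some Scene {
        WindowGroup(id: "window_id") {
            ContentView()
        }
    }
}

struct ContentView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Reopen window") {
            openWindow(id: "window_id") // works for WindowGroup, not DocumentGroup
        }
    }
}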
Hi guys,
if you have started using Vision Pro, I'm sure you have already found some limitations. Let's join forces and make feature requests. When creating Feedback, a request from one person may not get any attention from Apple, but if more of us make the same request, we might just push those ideas through. Feel free to add your ideas, and don't forget to file feedback:
App windows can only be moved forward to a distance of about 20 ft / 6 m. I'm pretty sure some users would like to push a window as far as a few miles away and make it large enough to still be legible. This would be very interesting, especially when using Environments and the 360-degree view. I really want to put some apps up in the sky above the mountains and around me, even those iOS apps just made compatible with Vision Pro.
When capturing the screen, I always get the message "Video capture not possible due to insufficient lighting". Why? I have an Environment loaded and extended to 360 degrees with some apps opened, so there is no need for external lighting (at least I think it's not needed). I just want to capture what I see. Imagine creating tutorials, recording lessons for learning various subjects, etc. Actual Vision Pro users might prefer loading their own environments and setting up apps in the spatial domain, but for those who don't have the device yet, or when creating videos to be viewed on antique 2D computer screens, it may be useful to create 2D videos this way.
3D video recording is not very good: it is kind of shaky, not when the Vision Pro is static, but when walking and especially when turning the head left/right/up/down (even relatively slowly). I think the hardware should be able to capture and create nice, smooth video. It's possible that Apple deliberately designed a simple camera app and wants to give developers a chance to create a better one, but it would still be nice to have something better out of the box.
I would like to be able to walk through Environments. I understand the safety purpose of the see-through effect, so users don't hit any obstacles, but perhaps obstacles could be detected: when the user gets within 6 ft / 2 m of an obstacle, it could first present a warning (there is already "You are close to an object") and then make the surroundings visible. But if there are no obstacles (the user can be in a large space and can place a tape or a thread around the safe area), I should be able to walk around and take a look inside that crater on the Moon.
We need Environments, Environments, Environments, and yet more of them. I was hoping for hundreds, so we could even pick some of them and use them in our apps, like games where you want to set up a specific environment.
Well, that's just the beginning and I could go on and on, but tell me what you guys think.
Regards and enjoy new virtual adventure!
Robert
I'm struggling to get my app through validation for visionOS. I'm not sure why I'm getting this error, as all of the app icons are filled in.
Hello!
I have an app that has both iPhone and iPad targets.
I want to use the iPhone version on Vision Pro. But since the app has an iPad version, the system always uses the Designed for iPad version.
Is there any way to set the Designed for iPhone version as the one used on Vision Pro (while keeping the iPad target)?
Thank you!
My project uses Objective-C and is an iOS app, and now I need to bring it to visionOS (not an unmodified "Designed for iPhone" build).
Question one: how can I differentiate visionOS in code? I need to use macro definitions, otherwise it cannot be compiled.
Question two: are there any other tips, or anything else I should know?
Thanks.
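For the Swift side of question one, the usual conditional-compilation check is sketched below; the function is illustrative only. For Objective-C sources, recent SDKs expose a platform macro in TargetConditionals.h (TARGET_OS_VISION), but verify that against your Xcode version before depending on it.

// Hedged sketch of Swift conditional compilation for visionOS-specific code paths.
// The function body is illustrative only.
func platformDescription() -> String {
    #if os(visionOS)
    return "Running on visionOS"
    #elseif os(iOS)
    return "Running on iOS"
    #else
    return "Running on another platform"
    #endif
}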
After using the Vision Pro and experiencing the impressive features of the visionOS software, I couldn't ignore the discomfort of wearing the device for an extended period. The strain on my face and eyes made me frequently take off the headset. While VR applications can provide an immersive experience that helps overlook some of the discomfort, if the AVP is positioned not as a VR device but rather as an MR (Mixed Reality) device, the interaction with the real world will be significantly affected by the wearing experience. I'm wondering if Apple has any plans to optimize the hardware weight and screen radiation.
Will the RoomPlan API be available for visionOS? Currently it's not available.
The AVP is paired with Xcode and all software is up to date. It is trying to copy shared cache symbols; I haven't been able to debug on-device yet.
It repeatedly gets stuck at 2% or 5%, and has made it as high as 8%, but it always comes back to this:
Previous preparation error: Unable to copy shared cache files.
I'm sharing more visuals and strings in case others are having a similar issue. I've checked the Console and the "Show Recent Logs" that can be reached from Xcode's Device window but there's nothing there.
Following this thread, I'm able to render a simple picture in a Plane material; however, I'm unable to scale it to show bigger than the window itself, or to move it behind the window.
Here's my relevant code so far:
var body: some View {
    ZStack {
        RealityView { content in
            var material = UnlitMaterial()
            material.color = try! .init(tint: .white,
                                        texture: .init(.load(named: "image",
                                                             in: nil)))
            let entity = Entity()
            let component = ModelComponent(
                mesh: .generatePlane(width: 1, height: 1),
                materials: [material]
            )
            entity.components.set(component)
            let currentTransform = entity.transform
            let newTransform = Transform(scale: currentTransform.scale,
                                         rotation: currentTransform.rotation,
                                         translation: SIMD3(0, 0, -0.2))
            entity.move(to: newTransform, relativeTo: nil)
            /*
            let scalingPivot = Entity()
            scalingPivot.position.y = entity.visualBounds(relativeTo: nil).center.y
            scalingPivot.addChild(entity)
            content.add(scalingPivot)
            scalingPivot.scale *= .init(x: 1, y: 1, z: 1)
            */
        }
    }
}
It belongs to an ImmersiveSpace I'm opening directly from my main window, but I have several issues:
The texture always shows in front of the window.
I'm unable to scale it (scaling seems to affect the texture coordinates inside the material instead of scaling the mesh itself; see the sketch after this list).
I can only see the texture in the canvas preview (not in the simulator).
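Not a confirmed fix, just a hedged sketch of the direction usually tried for the first two issues: scale and place the entity's transform rather than the material, and make sure the entity is actually added to the RealityView content (in the snippet above, content.add only appears inside the commented-out block, which may also explain the third issue). The names continue from the snippet; the values are illustrative.

// Hedged sketch, continuing inside the RealityView closure above: scale the
// plane by scaling the entity's transform (not the material) and push it
// further along -Z so it sits farther away, behind the window.
entity.setScale(SIMD3<Float>(repeating: 3), relativeTo: nil)
entity.position = SIMD3<Float>(0, 0, -2) // about 2 m away from the space origin
content.add(entity) // the entity must be added to the content to render outside previews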
Aloha,
I'm wondering where the documentation for the Vision Pro "Developer Strap" is located. I have the Vision Pro devices and the developer straps, but I'm not sure how to use the developer straps for visionOS development in Xcode and Unity.