As per the Apple Developer Human Interface Guidelines for visionOS (https://developer.apple.com/design/human-interface-guidelines/immersive-experiences), if a person moves more than about a meter, the system automatically makes all displayed content translucent to help them navigate their surroundings.
What exactly is meant by this "translucent behavior"? Will the app content become fully invisible, or will it be displayed with some transparency?
VisionKit
Scan documents with the camera on iPhone and iPad devices using VisionKit.
Vision Pro has a primary user mode and guest modes. Is it possible to change the primary user through a factory reset or a similar mechanism once it has been configured?
I heard the Vision Pro is expected to be available in the US market very soon, but that it will be delayed for other markets.
Does anyone know whether Apple still accepts applications for the Vision Pro Developer Kit loan program?
In our app, we need to use the VisionKit framework to lift a subject from an image and crop it. Here is the piece of code:
// Requires: import VisionKit (and import UIKit for UIImage)
if #available(iOS 17.0, *) {
    // Analyze the image for subjects.
    let analyzer = ImageAnalyzer()
    let analysis = try? await analyzer.analyze(image, configuration: self.visionKitConfiguration)

    // Hand the analysis to an interaction so we can query its subjects.
    // Note: ImageAnalysisInteraction is main-actor-isolated.
    let interaction = ImageAnalysisInteraction()
    interaction.analysis = analysis
    interaction.preferredInteractionTypes = [.automatic]

    // Grab the first detected subject, or fall back to the original image.
    guard let subject = await interaction.subjects.first else {
        return image
    }

    // Bounds of the detected subject (this is what comes back as all zeros).
    let s = await interaction.subjects
    print(s.first?.bounds as Any)

    // Extract the subject as a cropped image.
    guard let cropped = try? await subject.image else { return image }
    return cropped
}
But s.first?.bounds always returns a CGRect with all zero values. Is there any other way to get the position of the cropped subject? I need the position in the original image from which the subject was cropped. Can anyone help?
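One possible workaround, in case the VisionKit interaction never reports usable bounds in this setup, is to fall back to the Vision framework's foreground instance mask request (iOS 17) and compute the subject's bounding box from the mask yourself. The sketch below is only an untested outline of that idea; liftSubjectBounds is a hypothetical helper name, and it assumes the scaled mask comes back as a one-component float pixel buffer.

```swift
import Vision
import CoreVideo
import UIKit

/// Hypothetical helper: returns the bounding box (in pixel coordinates of the
/// original image) of the foreground subject, using Vision's instance mask.
@available(iOS 17.0, *)
func liftSubjectBounds(in image: UIImage) throws -> CGRect? {
    guard let cgImage = image.cgImage else { return nil }

    // Ask Vision for a foreground instance mask (the same segmentation that
    // powers subject lifting).
    let request = VNGenerateForegroundInstanceMaskRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    guard let observation = request.results?.first else { return nil }

    // Scale the mask up to the original image's resolution.
    let mask = try observation.generateScaledMaskForImage(forInstances: observation.allInstances,
                                                          from: handler)

    // Scan the mask for non-zero pixels to find the subject's extent.
    CVPixelBufferLockBaseAddress(mask, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(mask, .readOnly) }

    let width = CVPixelBufferGetWidth(mask)
    let height = CVPixelBufferGetHeight(mask)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(mask)
    guard let base = CVPixelBufferGetBaseAddress(mask) else { return nil }

    var minX = width, minY = height, maxX = -1, maxY = -1
    for y in 0..<height {
        // Assumption: one-component 32-bit float mask pixels.
        let row = (base + y * bytesPerRow).assumingMemoryBound(to: Float.self)
        for x in 0..<width where row[x] > 0 {
            minX = min(minX, x); maxX = max(maxX, x)
            minY = min(minY, y); maxY = max(maxY, y)
        }
    }
    guard maxX >= minX, maxY >= minY else { return nil }

    return CGRect(x: minX, y: minY, width: maxX - minX + 1, height: maxY - minY + 1)
}
```

The rectangle this produces is in pixel coordinates of the source image, so it may need converting depending on the coordinate space you ultimately need.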
Environment:
- Intel Core i7
- macOS 14.0
- Xcode 15 beta 8
- visionOS simulator 1.0 beta 3 (21N5233e)
- iOS simulator 17.0.1, iOS 17.0 beta 8
Steps:
Create a new visionOS demo project in Xcode; the build fails with:
[macosx] error: Failed to find newest available Simulator runtime
Command RealityAssetsCompile failed with a nonzero exit code
Hi guys,
Has any individual developer received a Vision Pro dev kit, or is it just aimed at big companies?
Basically, I would like to start with one or two of my apps that I have already removed from the store, just to get familiar with the visionOS platform and gain knowledge and skills on a small but real project.
After that I would like to use the dev kit on another project. I work on contract for a multinational communications company on a pilot project in a small country, and extending that project to visionOS might be a very interesting introduction of this new platform and could excite users of their services. However, I cannot quite reveal the details to Apple for reasons of confidentiality. After completing that contract (or during it, if I manage), I would like to start working on a great idea I have for Vision Pro (as many of you do).
Is it worth applying for the dev kit as an individual developer? I have read some posts saying people were rejected.
Is it better to start in the simulator and just wait for the actual hardware to go on sale? I would prefer to just get the device, rather than start working with a device that I may need to return in the middle of an unfinished project.
Any info on when pre-orders might be possible?
Any idea what Mac specs are appropriate for developing for visionOS, especially for 3D scenes? I just got a MacBook Pro M3 Max with 96 GB RAM, and I'm wondering if I should have maxed out the config. Is anybody using that config with the Vision Pro dev kit?
Thanks.
I'm looking for an IP camera that is not very expensive.
The key point is that I should be able to convert its frames to CMSampleBuffer.
I would like to use the images for some basic analysis using Vision.
So far I have not been able to find any IP camera manufacturer that offers a Swift/iOS SDK for this kind of work.
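For the conversion step itself, a vendor SDK may not be strictly necessary: if you can get decoded frames as CVPixelBuffers (for example from an RTSP/HTTP stream you decode yourself), you can wrap them in CMSampleBuffers with Core Media. The sketch below only illustrates that wrapping step; makeSampleBuffer is a hypothetical helper, and the timing values are assumptions you would replace with the stream's real timestamps.

```swift
import CoreMedia
import CoreVideo

/// Hypothetical helper: wraps a decoded video frame (CVPixelBuffer) in a
/// CMSampleBuffer so it can be passed to APIs that expect sample buffers.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      presentationTime: CMTime) -> CMSampleBuffer? {
    // Describe the pixel buffer's format.
    var formatDescription: CMVideoFormatDescription?
    guard CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                       imageBuffer: pixelBuffer,
                                                       formatDescriptionOut: &formatDescription) == noErr,
          let format = formatDescription else { return nil }

    // Assumed timing: 30 fps frame duration; use the stream's real timestamps in practice.
    var timing = CMSampleTimingInfo(duration: CMTime(value: 1, timescale: 30),
                                    presentationTimeStamp: presentationTime,
                                    decodeTimeStamp: .invalid)

    var sampleBuffer: CMSampleBuffer?
    let status = CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                                          imageBuffer: pixelBuffer,
                                                          formatDescription: format,
                                                          sampleTiming: &timing,
                                                          sampleBufferOut: &sampleBuffer)
    return status == noErr ? sampleBuffer : nil
}
```

For Vision specifically you can often skip the wrapping entirely, since VNImageRequestHandler can be created directly from a CVPixelBuffer.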