The AR Quick Look button cannot be pressed on some iOS devices.
On the iPhone 12 Pro running iOS 15, the button cannot be pressed when the page is opened from the standard iOS QR code reader, but on the same device running iOS 14 it can.
On other devices, the button can be pressed when the page is launched from Safari, but not when it is opened from the standard QR code reader. On some iOS devices the button cannot be pressed at all.
The problem also seems to occur on pages other than https://konabeer.jp/beach-diorama-ar/.
Is there any way to avoid the AR Quick Look button becoming unpressable?
Any information beyond just a solution to this problem would also be very helpful.
iPhone 7 (iOS 13): OK
iPhone X (iOS 15): NG (QR code reader)
iPhone 11 (iOS 14): OK
iPhone 12 Pro (iOS 14): OK
iPhone 12 Pro (iOS 15): NG (QR code reader)
iPhone 12 (iOS 15.1): NG (QR code reader)
iPad Air (iOS 15): NG (QR code reader)
iPhone SE (iOS 15): NG (QR code reader)
Based on our own verification, the AR Quick Look button is disabled on iOS 15 when the QR code is scanned from Control Center.
If you know of any known bugs or workarounds, or of any sites with information on this issue, I would appreciate hearing about them.
Has the problem of the black background when taking a picture with the AR Quick Look function in iOS 15 been resolved? I think it is a rather serious bug.
On an iOS 15 device, open the Quick Look page below and display any of the 3D models in AR Quick Look; the background comes out black when you take a picture with AR Quick Look's capture function.
https://developer.apple.com/augmented-reality/quick-look/
A similar report exists below, but it does not seem to have been addressed at all.
https://developer.apple.com/forums/thread/691784
Hi,
I have one question I would like to ask.
What specs are needed, and which Mac would you recommend, for Apple Vision Pro development? We use Xcode, RealityKit, ARKit, Reality Composer Pro, the Unity Editor with visionOS development support, and MaterialX.
If possible, which notebook and desktop models would you recommend?
Best regards
Sadao Tokuyama
https://1planet.co.jp/
Hi,
I have one question: you explained the Shared Space at the beginning of the session video, but I didn't really understand it.
Is the Shared Space something like the Dock on a Mac?
Are applications placed in the Shared Space, and is the basic operation to launch an application placed there? Why does the name include "Shared"? Is there some sharing functionality?
"By default, apps launch into Shared Space."
By default, apps launch into the Shared Space, but what does "default" mean here, and what is the non-default state?
"People remain connected to their surroundings through passthrough."
What does the above mean on visionOS?
Incidentally, are the applications that run in the Shared Space things like the Clock app, or does the Safari browser also run in the Shared Space?
What kinds of applications can only run in a Full Space?
I don't have a clear picture of the role each of these plays on visionOS; my current guess is sketched in the code below.
If possible, it would be easier to understand with an actual image of an application running, not just a diagram.
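For reference, here is my current (possibly wrong) understanding expressed as code, a minimal sketch assuming the visionOS SwiftUI scene types; ExampleSpacesApp, ContentView, and ImmersiveView are placeholder names:

import SwiftUI

@main
struct ExampleSpacesApp: App {
    var body: some Scene {
        // My assumption: a plain WindowGroup appears in the Shared Space,
        // side by side with windows from other running apps.
        WindowGroup {
            ContentView()
        }

        // My assumption: an ImmersiveSpace is only shown after the app calls
        // openImmersiveSpace(id:), and while it is open the app has a
        // Full Space to itself.
        ImmersiveSpace(id: "Immersive") {
            ImmersiveView()
        }
    }
}

Is this roughly the right mental model of the default (Shared Space) versus the non-default (Full Space) state?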
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
Volumes allow an app to display 3D content in defined bounds, sharing the space with other apps
What does it mean for Volumes to share the space, and what is the benefit of being able to do this?
Does this refer to the Shared Space?
I don't understand the Shared Space very well to begin with.
they can be viewed from different angles.
Does this mean that because the content is 3D and has depth, changing the viewing angle lets me see that depth?
That seems obvious for 3D content.
Is this specifically related to Volumes?
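For reference, here is how I currently picture a volume in code, only a sketch based on my assumptions; VolumeExampleApp and VolumeContentView are placeholder names:

import SwiftUI

@main
struct VolumeExampleApp: App {
    var body: some Scene {
        // My assumption: a volumetric window gives the app a bounded 3D box
        // that sits in the Shared Space next to other apps, and its contents
        // can be looked at from different angles.
        WindowGroup(id: "Volume") {
            VolumeContentView()
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}

Is this the kind of volume the session is describing?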
Hi,
I have a question about Apple Vision Pro specifications.
What are the vertical, horizontal, and diagonal FOV angles of Vision Pro?
How close, in centimeters, can Vision Pro's near clipping plane be?
How far, in meters, does Vision Pro's far clipping plane extend?
Do eye tracking, authentication, and so on work properly for people wearing contact lenses?
From what age can Vision Pro be worn?
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
After watching the WWDC23 session Meet Core Location for spatial computing, I was wondering: does the Apple Vision Pro have GPS, or does it provide Core Location functionality via Wi-Fi?
Also, in Unity we use Input.location to get latitude and longitude. When developing for Apple Vision Pro in Unity, do we still use Input.location to get latitude and longitude?
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
Will the Apple Vision Pro developer kit be sold only in the US? I live in Japan. Is it possible to get the Apple Vision Pro developer kit in Japan?
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
I have one question.
When creating a web page, is there a way to determine that it is being accessed from Safari on visionOS? I would also like to know the user-agent string of Safari on visionOS. If there is more than one way to detect this, for example in JavaScript or on the web server, please describe them all. Use cases include changing the page layout for Safari on visionOS, changing how HTML pages are dynamically generated on a web server, and deciding whether Quick Look is available.
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
I am currently watching the Create immersive Unity apps video from WWDC23, and a question came up while I was watching it.
First, consider the following points explained in the session:
Because you're using Unity to create volumetric content that participates in the shared space, a new concept called a volume camera lets you control how your scene is brought into the real world.
A volume camera can create two types of volumes, bounded and unbounded, each with different characteristics.
Your application can switch between the two at any time.
https://developer.apple.com/videos/play/wwdc2023/10088/?time=465
Your unbounded volume displays in a full space on this platform and allows your content to fully blend with passthrough for a more immersive experience.
https://developer.apple.com/videos/play/wwdc2023/10088/?time=568
At first, the session explains that there are two types of volumes for volumetric content in the Shared Space: bounded and unbounded.
However, when the unbounded volume is described, the space is referred to as a Full Space instead.
Is Full Space, rather than Shared Space, the correct term for where an unbounded volume is displayed?
Best regards.
P.S.
I felt uncomfortable with the title Create immersive Unity apps. The first half of the presentation was about Unity development and the Shared Space's bounded volumes, and bounded-volume apps feel far from immersive to me.
Apple's definition of "immersive" in spatial computing seems vague.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
I have a question about Apple Vision Pro's support for Unity programmable shaders.
Shaders applied to Material are not supported.
RenderTextures are supported. (Can be used as texture input to Shader Graph for display through RealityKit.)
Regarding the statements above, do they apply to the Shared Space, Full Space, and fully immersive space alike?
Or is the fully immersive space not affected because it uses Metal rather than RealityKit?
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
If I write the gesture sample code from the following document and try to build it, an error occurs.
https://developer.apple.com/documentation/visionos/adding-3d-content-to-your-app
Value of type '(@escaping (TapGesture.Value) -> Void) -> _EndedGesture' has no member 'targetedToAnyEntity'
Is something missing?
Xcode 15.0 beta 2 (15A516b)
visionOS 1.0 beta
import SwiftUI
import RealityKit
import RealityKitContent

struct SphereSecondView: View {
    @State var scale = false

    /*var tap: some Gesture {
        TapGesture()
            .onEnded { _ in
                print("Tap")
                scale.toggle()
            }
    }*/

    var body: some View {
        RealityView { content in
            let model = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .yellow, isMetallic: true)])
            content.add(model)
        } update: { content in
            if let model = content.entities.first {
                model.transform.scale = scale ? [2.0, 2.0, 2.0] : [1.0, 1.0, 1.0]
            }
        }
        .gesture(TapGesture().onEnded
            .targetedToAnyEntity() { _ in
                scale.toggle()
            })
    }
}

#Preview {
    SphereSecondView()
}
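For what it's worth, the following variant of the .gesture modifier builds for me. It is only my guess at what the document intends, applying targetedToAnyEntity() to the gesture itself before onEnded, not a confirmed fix:

        .gesture(
            TapGesture()
                .targetedToAnyEntity()   // target the gesture at entities first
                .onEnded { _ in          // value is an EntityTargetValue<TapGesture.Value>
                    scale.toggle()
                }
        )

I assume the entity may also need an InputTargetComponent and collision shapes for the tap to actually reach it, but I have not confirmed that either.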
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
I run a fully immersive space in the visionOS simulator and enable full immersion, but passthrough is still shown; the behavior is the same as the mixed style.
Is there a bug in the visionOS simulator that prevents passthrough from being disabled in the full immersion style? The setup I am using is sketched below.
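This is a minimal sketch of how I declare the space; FullImmersiveApp and ImmersiveView are placeholder names:

import SwiftUI

@main
struct FullImmersiveApp: App {
    @State private var style: ImmersionStyle = .full

    var body: some Scene {
        // My understanding: restricting the style to .full should hide
        // passthrough entirely once the space is opened.
        ImmersiveSpace(id: "Full") {
            ImmersiveView()
        }
        .immersionStyle(selection: $style, in: .full)
    }
}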
Thanks.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
I would like to learn how to create custom materials using Shader Graph in Reality Composer Pro. I would like to know more about Shader Graph in general, including what each node does and how a material's appearance changes when nodes are connected. However, I cannot find a manual for Shader Graph in Reality Composer Pro, which leaves me with no idea how to create custom materials.
Thanks.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hi,
I have a question about Immersion Style, specifically the progressive style.
I understand that by specifying progressive it is possible to sit between mixed and full, but when is it used? For example, is it for cases like the WWDC23 video where someone watching a movie on a screen has the room gradually replaced by space, or for cases where the surroundings get darker and darker as the Digital Crown is turned, perhaps until the room goes completely dark?
Please let me know if there is a video, sample code, or explanation that shows an example of the progressive style.
Incidentally, is it possible for an application to receive events when the Digital Crown is operated? The way I am declaring the progressive style is sketched below.
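This is a minimal sketch of my current setup; ProgressiveImmersionApp and ImmersiveView are placeholder names:

import SwiftUI

@main
struct ProgressiveImmersionApp: App {
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        // My assumption: with .progressive, the Digital Crown controls how
        // much of the surroundings is replaced, and the app can also allow
        // .full as an end state.
        ImmersiveSpace(id: "Progressive") {
            ImmersiveView()
        }
        .immersionStyle(selection: $style, in: .progressive, .full)
    }
}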
Thanks.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug