Topic : AR - UI - Accessibility - App Design - Development
Background: Having virtually attended WWDC2020 (my 1st WWDC, at the ripe old age of 68), I have come away with a wealth of treasures in terms of new directions and what to prepare for. The Keynote, the Platforms State of the Union, the Sessions, and the lab-support WebEx sessions were all of invaluable help to me. I would like to extend my thanks to Apple for making ‘remote’ support like this possible for us in a ‘very weird year’.
What brings me here is this: I’m designing an app for Apple AR.
Reality Composer for iOS was introduced in October 2019. I haven’t viewed the real or the AR world in the same way since. Apple’s implementation of AR (without requiring being ‘tethered’ to the ‘interGlue’—good job on that!) is the best I’ve seen. Period.
As a disabled ‘user’ of Apple products for 30+ years, I find myself in ‘yet another unique position’ in life: now I’m a Developer. This shift in perspective has caused me to widen my field of view when it comes to creating an ‘Accessible AR experience simulation/game’ environment targeted for iOS 14 on the iPad Pro.
1) AR: Is the AR itself the ‘interface’ for that particular ‘operations space’ in an app?
One of the big questions I still wrestle with is what the interface should be when AR is the targeted destination user space. I’m not sure there is any consensus on this, even among the Apple Engineers I’ve spoken with (this was my sense after having several Lab session talks with a few of them). What is perhaps most important for all of us to remember is that AR is still ‘quite new to us’ as a ‘media’ format for communication. We will have to experiment, explore, devise, refine, and implement our way through the ‘unanswered questions’ and other stones we plow up.
‘It depends on what your app does’ is a revered answer, and it’s a good place to start asking the correct set of questions.
2) Creating the Magical Experience: As Developers in this new communications space of AR, I understand we each have different apps, with differing capabilities and focus. I think I can assume that each and every one of us has a similar goal: to make our app offer the best user experience possible. In that sense, I feel like those of us developing for AR are in the ‘same boat together’.
That being true, I ask myself: ‘Why isn’t there more discussion and activity in the Accessibility, AR, RealityKit, and Reality Composer forums relating to UI in AR and Accessibility in AR?’
Am I the only one who isn’t quite sure about how ‘best’ to add as much ‘Accessibility’ as possible to the AR experience? I don’t think so.
I can only speak for myself in this regard; after my 1st WWDC, I am convinced that I need to take advantage of forum posting and Feedback Assistant ticketing as I wade further into the waters of ‘design’ and ‘development’ for my app.
3) Things I’m considering for AR Accessibility: I know my own disability better than anybody else. I do not know as much about supporting other types of disability, but I want to do everything that makes sense and can be done to provide a ‘rich user experience’ in AR, no matter what type of ‘end user processing challenges’ may be involved.
The basic choices for ‘visual’ UI design for AR seem to fall into:
- 2-D ‘pancake’ interfaces, with ‘hover’ controls overlaying the AR experience, built using SwiftUI.
- 3-D UI elements, built in RealityKit/Reality Composer, with functionality added in Xcode.
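To make the 2-D option concrete, here is a minimal sketch (my own hypothetical names, not from any Apple sample) of a SwiftUI ‘pancake’ control layered over a RealityKit ARView, with accessibility label and hint attached so VoiceOver can reach it:

```swift
import SwiftUI
import RealityKit

// Wrap RealityKit's ARView so SwiftUI can host it.
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        ARView(frame: .zero)
    }
    func updateUIView(_ uiView: ARView, context: Context) {}
}

struct ContentView: View {
    var body: some View {
        ZStack(alignment: .bottom) {
            ARViewContainer()
                .edgesIgnoringSafeArea(.all)
            // A 2-D 'hover' control floating over the AR scene.
            Button("Place Object") {
                // Placement logic would go here.
            }
            .padding()
            .accessibilityLabel("Place a virtual object in the scene")
            .accessibilityHint("Adds the object on the detected surface")
        }
    }
}
```

Because the overlay is ordinary SwiftUI, the existing accessibility modifiers (iOS 14’s `.accessibilityLabel`/`.accessibilityHint`) work unchanged; the open question remains how to expose the 3-D content underneath.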
Once again, there was no consensus I could come away with from having spoken with several Apple folks on this topic. I feel this once again reflects that we are all still figuring out how ‘best to use’ this great technology.
What I have come away with from WWDC20 in this regard is:
- I will most likely need to prototype a solution using both a 2-D and a 3-D UI for my AR-based app.
- The 2-D one may well prove to be faster to implement; Swift has grown more capable, thoughtful, and mature as a modern programming language over the past 3 years.
- The 3-D approach: let’s face it, 3D was ****, even before Metal and Metal 2. Nowadays, ‘WOW’ seems an appropriate descriptor.
4) Spatial Audio Considerations: When AR is the targeted user environment, using spatial audio requires consideration in 3 interconnected ways: the design content, the UI, and the Accessibility.
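As a starting point for experimenting, RealityKit can spatialize a sound at an entity’s position, which can double as an accessibility cue (the listener can locate an object by ear). A rough sketch, assuming a hypothetical asset named “chime.mp3” bundled with the app:

```swift
import RealityKit

// Attach a spatialized, looping sound to an entity so its audio
// pans and attenuates with the entity's position in the AR scene.
func attachSpatialChime(to entity: Entity) {
    do {
        let resource = try AudioFileResource.load(
            named: "chime.mp3",      // hypothetical bundled asset
            inputMode: .spatial,     // render as a 3-D point source
            shouldLoop: true
        )
        entity.playAudio(resource)
    } catch {
        print("Could not load audio resource: \(error)")
    }
}
```

The same mechanism could mark interactive objects audibly for low-vision users, which is one place where the design content, the UI, and the Accessibility considerations genuinely overlap.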
I’m assuming from the lack of any forum posting on this topic (I haven’t checked the audio forums) that others are happy and content with their implementations in this regard.
5) Speech and TTS Audio: This seems almost a different set of audio concerns, spanning both UI design and Accessibility uses of ‘audio’ within an app.
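One pattern worth prototyping here (a sketch of my own, not an Apple recommendation): speak scene events aloud with AVSpeechSynthesizer for all users, but hand the text to VoiceOver as an announcement when it is running, so the two voices don’t talk over each other:

```swift
import AVFoundation
import UIKit

let synthesizer = AVSpeechSynthesizer()  // keep a strong reference alive

// Describe a scene event audibly. VoiceOver users get a standard
// accessibility announcement; everyone else gets synthesized speech.
func announce(_ text: String) {
    if UIAccessibility.isVoiceOverRunning {
        UIAccessibility.post(notification: .announcement, argument: text)
    } else {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

announce("A red cube has appeared one meter in front of you.")
```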
6) An AR HIG - Accessibility: a set of ‘rules of the road’, or at the very least, ‘stuff you ought to think about’.
There are helpful aspects of the Accessibility HIG which are ‘transferable’ to AR:
Gestures
Navigation
Text Size & Weight
Color & Contrast
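Color & Contrast is one item on that list that already has a precise, transferable definition: the WCAG 2.1 contrast-ratio formula. A small, self-contained Swift sketch for checking overlay text against a sampled background color (useful in AR, where the ‘background’ is whatever the camera sees):

```swift
import Foundation

// WCAG 2.1 relative luminance for sRGB components in 0...1.
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func channel(_ c: Double) -> Double {
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)
}

// WCAG contrast ratio between two luminances: (L1 + 0.05) / (L2 + 0.05).
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    let (lighter, darker) = (max(l1, l2), min(l1, l2))
    return (lighter + 0.05) / (darker + 0.05)
}

// Black text on a white backdrop: ratio is approximately 21:1,
// comfortably above the 4.5:1 minimum for normal text.
let ratio = contrastRatio(
    relativeLuminance(r: 1, g: 1, b: 1),
    relativeLuminance(r: 0, g: 0, b: 0)
)
```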
What I would like to avoid (and I assume others would, as well) is having ‘my interpretation’ of how the Accessibility and HIG elements apply in the Apple AR user space only to have to tear it down or rebuild it if Apple comes out with ‘do things this way’ pronouncements at a future date.
What I find most lacking is any specific language or reference in the Accessibility guidelines or the HIG which states that they directly apply to Apple’s AR user space. What is there now seems to be centered on 2-D pancake design and implementation issues.
7) SF Symbols for AR: Should I make them? Should Apple already have thought of making them for this ‘user space’?
Many of the same potential ‘hazards of development’ exist in this topic, as well.
Apple wants us to adopt SF Symbols for good solid reasons. Yet, I do not see any discussion, questions, or awareness of this ‘gap’ in making these valuable ‘communication assets’ available to us in the AR user space. Maybe it’s just me?
If Apple ‘intends’ on making SF Symbols part of the AR user space in the next year or so, it would be extremely helpful for Developers to know it is indeed on the ‘roadmap’.
Reinventing various circular objects is not the highlight of my day. I don’t wish to spend the time creating things for my app which Apple may well tell me I can’t use because they didn’t ‘make it’. I really couldn’t give a rat’s furry behind about ‘who’ makes SF Symbols 2 for AR, so long as we have them for that ‘user space’ as well.
This may be larger than the average post, but I’m hoping I have offered sufficient reasons for my related set of questions.
Thank you for your consideration.
Larry ‘Catfish’ Kuhn