Custom Input Triggers for Apple Vision Pro

Hello,

Does the Apple Vision Pro have an API for creating custom triggers for selecting things on the screen instead of the hand pinch gesture? For instance, using an external button/signal/controller instead of pinching fingers?

Answered by Vision Pro Engineer in 796201022

Yes, visionOS supports the Game Controller framework. Here's an example that uses GCController to let a person move a sphere along the x and y axes with a game controller. The example uses the Entity Component System (ECS) paradigm; Understanding RealityKit’s modular architecture explains ECS in detail.

Create a RealityView containing a sphere entity and listen for game controller connections. When a controller's values change, update the entity's ControllerComponent to reflect the controller's state. More on ControllerComponent in the next step.

import SwiftUI
import RealityKit
import GameController

struct ImmersiveView: View {
    @State private var entity: ModelEntity?

    var body: some View {
        RealityView { content in
            let entity = ModelEntity(mesh: .generateSphere(radius: 0.05),
                                     materials: [SimpleMaterial(color: .green, isMetallic: true)])
            entity.position = [0, 1.2, -1.0]
            content.add(entity)
            self.entity = entity
        }
        .task {
            NotificationCenter.default.addObserver(
                forName: .GCControllerDidConnect,
                object: nil, queue: nil) { _ in

                    // Wait until you get a controller.
                    for controller in GCController.controllers() {

                        // Update the entity's component when a person interacts with the controller.
                        // This fires many times a second, so store the controller state
                        // in a component, then use a system to apply it every frame.
                        controller.extendedGamepad?.valueChangedHandler = { pad, _ in
                            Task { @MainActor in
                                guard let entity else { return }
                                entity.components.set(ControllerComponent(pad: pad))
                            }
                        }
                    }
                }
        }
    }
}

Create a component to store the state of the controller.

struct ControllerComponent: Component {
    let speed: Float = 1
    let pad: GCExtendedGamepad
}

Create a system to adjust the entity's position based on the input received from the game controller.

struct ControllerSystem: System {
    static let query = EntityQuery(where: .has(ControllerComponent.self))

    public init(scene: RealityKit.Scene) {
    }

    // Read the thumbstick values every frame
    // and use them to update the entity's position.
    public func update(context: SceneUpdateContext) {
        let entities = context.entities(matching: Self.query,
                                        updatingSystemWhen: .rendering)

        for entity in entities {
            guard let component = entity.components[ControllerComponent.self] else { continue }

            let leftThumbstick = component.pad.leftThumbstick
            let xAxis: Float = leftThumbstick.xAxis.value
            let yAxis: Float = leftThumbstick.yAxis.value

            entity.position += [xAxis, yAxis, 0] * component.speed * Float(context.deltaTime)
        }
    }
}
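The position update above scales the thumbstick input by a speed (in meters per second) and by the frame's delta time, so the sphere moves at the same rate regardless of frame rate. A minimal sketch of just that math, with `step` as a hypothetical helper mirroring the system's update line:

```swift
import simd

// Hypothetical helper mirroring the system's update math: scale the
// thumbstick input by a speed (meters per second) and the frame's
// delta time so movement is frame-rate independent.
func step(position: SIMD3<Float>, xAxis: Float, yAxis: Float,
          speed: Float, deltaTime: Float) -> SIMD3<Float> {
    position + SIMD3<Float>(xAxis, yAxis, 0) * speed * deltaTime
}

// Pushing the stick fully right for one 90 Hz frame at 1 m/s
// moves the sphere about 11 mm along x.
let p = step(position: [0, 1.2, -1.0], xAxis: 1, yAxis: 0,
             speed: 1, deltaTime: 1.0 / 90.0)
```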

Register the component and system. You can do this in your app's initializer.

@main
struct ControllerDemoApp: App {

    init() {
        ControllerComponent.registerComponent()
        ControllerSystem.registerSystem()
    }

    var body: some Scene {
        ImmersiveSpace {
            ImmersiveView()
        }
    }
}

Pair the game controller to your Vision Pro

See Connect headphones, game controllers, and other Bluetooth accessories to Apple Vision Pro to learn how to pair a game controller to your Vision Pro.

Note: The source for Happy Beam contains additional example code for enabling game controllers.
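To answer the original question directly: besides thumbstick movement, you can treat any controller button as a custom "select" trigger in place of the pinch gesture, using the button's pressedChangedHandler. A minimal sketch, where installSelectTrigger and performSelection are hypothetical names for whatever your app does on selection:

```swift
import GameController

// Hypothetical: call this once a controller has connected, passing
// the selection action your app performs instead of a pinch.
func installSelectTrigger(on controller: GCController,
                          performSelection: @escaping () -> Void) {
    // Fires whenever the A button's pressed state changes;
    // act only on the press, not the release.
    controller.extendedGamepad?.buttonA.pressedChangedHandler = { _, _, pressed in
        if pressed {
            performSelection()
        }
    }
}
```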

