SceneKit on visionOS works flawlessly.
My only complaint is that Apple doesn’t have an official hardware game controller, so any game utilizing a keyboard, MFi controller, or GCVirtualController cannot be easily ported to visionOS.
For example, say you have an FPS: macOS can use the keyboard, Apple TV can use the remote as a controller, and iOS and iPadOS can use the GCVirtualController. visionOS, however, has no way to present a D-pad without physical hardware, so shipping the game with gesture controls is incredibly frustrating.
I tried using spatial swipes and pinches and had to put all visionOS game development on hold.
What would make visionOS ports super simple would be some kind of remote or controller packaged with Vision Pro 2.
Otherwise, you're limited to tap-based games...
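For reference, here is a minimal sketch of the GCVirtualController setup that works on iOS and iPadOS (the element choice here is an assumption for illustration, not from my actual game):

import GameController

// A sketch, assuming iOS/iPadOS 15+: create an on-screen virtual
// controller with a left thumbstick and two face buttons.
let configuration = GCVirtualController.Configuration()
configuration.elements = [
    GCInputLeftThumbstick,
    GCInputButtonA,
    GCInputButtonB
]
let virtualController = GCVirtualController(configuration: configuration)

// Present the overlay; the virtual controller then appears to the app
// as a regular GCController via the usual connection notifications.
virtualController.connect()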
The only thing you can really do on the visionOS simulator is develop a SwiftUI 2D app. For a 2D app you don’t even need the visionOS simulator; you can make a SwiftUI 2D app for any platform and it should mostly just work on visionOS.
Everything else that’s based on spatial computing requires the physical device.
Although you can roughly guesstimate 3D physics, USDZ lighting, or how an immersive view might appear, the simulator shows an emulated 2D view of the application that looks nothing like what the end user sees.
It is kind of like the iPhone camera: the simulator cannot emulate the camera. You need to test on actual devices.
You could try applying for a dev kit:
https://developer.apple.com/visionos/developer-kit/
For SwiftUI
Note: The touch location doesn't map 1:1 to the SKScene, so you have to convert the CGPoint based on the anchorPoint and size of your SKScene. Print longPressLocation to find the coordinates; a conversion sketch follows the example below.
import SwiftUI
import SpriteKit

struct ContentView: View {

    // MARK: - Properties -

    // GAME_WIDTH and GAME_HEIGHT are constants defined elsewhere in the project
    @State var gameScene = GameScene(
        size: CGSize(width: GAME_WIDTH, height: GAME_HEIGHT)
    )

    // location of the last touch, captured by the long press + drag gesture
    @State var longPressLocation = CGPoint.zero

    // MARK: - Lifecycle -

    var body: some View {
        SpriteView(scene: gameScene)
            .ignoresSafeArea()
            .onAppear {
                // adjust the game scene
                gameScene.scaleMode = .aspectFit
            }
            .gesture(
                LongPressGesture(minimumDuration: 0.0)
                    .sequenced(before: DragGesture(minimumDistance: 0.0, coordinateSpace: .global))
                    .onEnded { value in
                        switch value {
                        case .second(true, let drag):
                            // touches ended; capture the location of the touch
                            longPressLocation = drag?.location ?? .zero
                            // find the SKSpriteNode under the touch
                            if let node = gameScene.atPoint(longPressLocation) as? SKSpriteNode {
                                // do something with the SpriteKit node
                            }
                        default:
                            break
                        }
                    }
            )
    }
}

#Preview {
    ContentView()
}
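As mentioned in the note above, here is a rough sketch of the coordinate conversion, assuming the scene's default anchorPoint of (0, 0), matching view/scene aspect ratios, and a touch point already in the SpriteView's local coordinate space (the helper name and parameters are hypothetical):

// Hypothetical helper: map a SwiftUI touch point to SKScene coordinates.
// Assumes anchorPoint (0, 0), scaleMode .aspectFit, matching aspect ratios.
func scenePoint(from viewPoint: CGPoint, viewSize: CGSize, sceneSize: CGSize) -> CGPoint {
    let scaleX = sceneSize.width / viewSize.width
    let scaleY = sceneSize.height / viewSize.height
    // SwiftUI's y-axis points down; SpriteKit's y-axis points up.
    return CGPoint(
        x: viewPoint.x * scaleX,
        y: (viewSize.height - viewPoint.y) * scaleY
    )
}

With a different anchorPoint, or letterboxing from .aspectFit, you would also need to offset the point accordingly.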
Here are my thoughts from making an iTunes Media app:
If you are only going to display iTunes content, I would just use MPMediaQuery, since MusicKit is still tied to MPMusicPlayerController as of June 2024:
https://developer.apple.com/documentation/mediaplayer/mpmusicplayercontroller
You need to pass in the MPMediaItem anyway, so converting from a MusicKit Song to an MPMediaItem doesn’t make a lot of sense.
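For context, a minimal sketch of the MPMediaQuery path, assuming media library authorization has already been granted:

import MediaPlayer

// A sketch: fetch local library songs with MPMediaQuery and queue them
// on the system music player, which takes MPMediaItemCollection directly.
let query = MPMediaQuery.songs()
if let items = query.items {
    let player = MPMusicPlayerController.systemMusicPlayer
    player.setQueue(with: MPMediaItemCollection(items: items))
    player.play()
}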
If you’re going to play Apple Music subscription content, then clearly you need to use MusicKit.
If you want to get a head start, then by all means use MusicLibraryRequest.
I haven’t found a way to use MusicLibraryRequest without MPMusicPlayerController, so it seems redundant to use both.
I keep MusicLibraryRequest in a separate Xcode project for when Apple finally fixes MusicKit’s systemMusicPlayer:
https://developer.apple.com/documentation/musickit/systemmusicplayer
That said, I cannot get it to work with iTunes Content.
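And for reference, a minimal sketch of the MusicLibraryRequest side, assuming iOS 16+ and MusicKit authorization (the limit value is arbitrary):

import MusicKit

// A sketch: fetch songs from the local music library via MusicKit.
func fetchLibrarySongs() async throws -> MusicItemCollection<Song> {
    var request = MusicLibraryRequest<Song>()
    request.limit = 25
    let response = try await request.response()
    return response.items
}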
Here's how you would generate a convex physics shape. That won't help with the torus, but it may work for other models.
if let modelEntity = try? await ModelEntity(
    named: "model.usdz",
    in: Bundle.main
) {
    // generate simple collision shapes first
    modelEntity.generateCollisionShapes(recursive: false)

    // create a convex physics shape from the model's mesh
    if let mesh = modelEntity.model?.mesh,
       let convexShape = try? await ShapeResource.generateConvex(from: mesh) {
        let modelCollisionComponent = CollisionComponent(shapes: [convexShape])
        modelEntity.components.set(modelCollisionComponent)
        modelEntity.components[PhysicsBodyComponent.self] = PhysicsBodyComponent(
            shapes: modelCollisionComponent.shapes,
            mass: 0,
            material: nil,
            mode: .static
        )
    }
}
You have to add it to the Main Bundle as a resource in a folder within the visionOS project. It doesn't work with realityKitContentBundle.
if let modelEntity = try? await ModelEntity(
    named: "model.usdz",
    in: Bundle.main
) {
    // generate collision shapes for the model
    modelEntity.generateCollisionShapes(recursive: false)

    // reuse the generated collision shapes for a static physics body
    if let modelCollisionComponent = modelEntity.components[CollisionComponent.self] {
        modelEntity.components[PhysicsBodyComponent.self] = PhysicsBodyComponent(
            shapes: modelCollisionComponent.shapes,
            mass: 0,
            material: nil,
            mode: .static
        )
    }
}
You cannot use a CIImage for the diffuse contents; you must use a CGImage.
A UIImage initialized from a CIImage won't work either.
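A minimal sketch of the conversion, assuming a valid CIImage (CIContext.createCGImage is the standard route):

import CoreImage

// A sketch: render a CIImage into a CGImage so it can be used
// where CGImage-backed content is required.
func makeCGImage(from ciImage: CIImage) -> CGImage? {
    let context = CIContext()
    return context.createCGImage(ciImage, from: ciImage.extent)
}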
Could someone please explain what an ARKit app for iPad (one that uses the iPad's camera) would look like running as a "Designed for iPad" app on Vision Pro?
Update: Apparently you can't put your iOS ARKit apps on Vision Pro.
The version of ARKit in iOS is incompatible with the one in visionOS and visionOS can’t display windows that contain ARKit views. For information about how to bring an ARKit app to visionOS, see Bringing your ARKit app to visionOS.
To add my two cents: if you're using the AppDelegate to register a physical MFi game controller, you need to write exception logic to make sure the virtual controller isn't included. Otherwise it will create bugs and send each button event twice.
A virtual controller has a vendorName of "Apple Touch Controller".
// MARK: - Controller -

@objc private func controllerWasConnected(_ notification: Notification) {
    guard let controller = notification.object as? GCController else { return }
    let status = "MFi Controller: \(String(describing: controller.vendorName)) is connected"
    print(status)
    // make sure it's not the virtual controller
    if controller.vendorName != "Apple Touch Controller" {
        gcSharedInstance.reactToInput(controller: controller)
    }
}

@objc private func controllerWasDisconnected(_ notification: Notification) {
    guard let controller = notification.object as? GCController else { return }
    let status = "MFi Controller: \(String(describing: controller.vendorName)) is disconnected"
    print(status)
    // make sure it's not the virtual controller
    if controller.vendorName != "Apple Touch Controller" {
        gcSharedInstance.deallocController(controller: controller)
    }
}
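For completeness, a sketch of how these handlers would be registered, using the GameController framework's standard NotificationCenter names (registering from the AppDelegate is an assumption):

import GameController

// Register for controller connect/disconnect notifications,
// e.g. in application(_:didFinishLaunchingWithOptions:).
NotificationCenter.default.addObserver(
    self,
    selector: #selector(controllerWasConnected(_:)),
    name: .GCControllerDidConnect,
    object: nil
)
NotificationCenter.default.addObserver(
    self,
    selector: #selector(controllerWasDisconnected(_:)),
    name: .GCControllerDidDisconnect,
    object: nil
)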
You can query iTunes movies using ITLibrary from the iTunesLibrary framework in a Cocoa app.
ITLibrary
AVPlayer seems to have issues with the DRM, even when the item reports that it's DRM-free. Maybe Apple can fix this in the future.
You will also need to download the movie in the macOS Apple TV app first; otherwise item.location will always return nil.
import Cocoa
import iTunesLibrary
import AVFoundation
import AVKit

class ViewController: NSViewController {

    @IBOutlet var videoView: AVPlayerView!

    var videoUrl: URL?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Find the local, DRM-free movie in the iTunes library.
        do {
            let library = try ITLibrary(apiVersion: "1.0")
            for item in library.allMediaItems {
                if item.isVideo && item.isDRMProtected == false {
                    // "<Video Title>" is a placeholder for the movie's title
                    if item.title == "<Video Title>" {
                        print(item.title)
                        print("item.location: \(String(describing: item.location))")
                        print("item.isDRMProtected: \(item.isDRMProtected)")
                        print("item.isUserDisabled: \(item.isUserDisabled)")
                        print("item.isCloud: \(item.isCloud)")
                        print("item.locationType: \(item.locationType)")
                        videoUrl = item.location
                    }
                }
            }
        } catch {
            print("error: \(error)")
        }
        DispatchQueue.main.async {
            guard let videoUrl = self.videoUrl else { return }
            if videoUrl.startAccessingSecurityScopedResource() {
                print("access")
                do {
                    // Copy the movie into the user's Documents folder and play the copy.
                    let data = try Data(contentsOf: videoUrl)
                    let path = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] + "/video.mov"
                    let url = URL(filePath: path)
                    try data.write(to: url)
                    let player = AVPlayer(url: url)
                    let playerLayer = AVPlayerLayer(player: player)
                    playerLayer.backgroundColor = NSColor.black.cgColor
                    playerLayer.frame = self.view.bounds
                    playerLayer.videoGravity = .resizeAspect
                    // Remove any previous player layers before inserting a new one.
                    self.view.layer?.sublayers?
                        .filter { $0 is AVPlayerLayer }
                        .forEach { $0.removeFromSuperlayer() }
                    self.view.layer?.insertSublayer(playerLayer, at: 0)
                    playerLayer.magnificationFilter = .nearest
                    playerLayer.player?.play()
                } catch {
                    print("playback error: \(error)")
                }
                videoUrl.stopAccessingSecurityScopedResource()
            }
        }
    }

    override var representedObject: Any? {
        didSet {
            // Update the view, if already loaded.
        }
    }
}
Once you have the various .dae files, each containing a single animation, you load an animation from the .dae file that contains it and play it on a node loaded from another .dae that doesn't:
// MARK: - Animations -

func loadAnimation(sceneName: String, extensionName: String, targetNode: SCNNode) {
    // source .dae containing the animation
    guard let sceneURL = Bundle.main.url(forResource: sceneName, withExtension: extensionName),
          let sceneSource = SCNSceneSource(url: sceneURL, options: nil) else { return }
    for key in sceneSource.identifiersOfEntries(withClass: CAAnimation.self) {
        if let animationObj = sceneSource.entryWithIdentifier(key, withClass: CAAnimation.self) {
            if animationObj.isKind(of: CAAnimationGroup.self) {
                animationObj.repeatCount = .infinity
                animationObj.fadeInDuration = 0.0
                animationObj.fadeOutDuration = 0.0
                // play the animation on the target .dae node
                playAnimation(animation: animationObj, node: targetNode)
                return
            }
        }
    }
}

func playAnimation(animation: CAAnimation, node: SCNNode) {
    let player = SCNAnimationPlayer(animation: SCNAnimation(caAnimation: animation))
    node.addAnimationPlayer(player, forKey: "myAnimation")
    player.play()
}
Example
// idle
self.loadAnimation(sceneName: "assets.scnassets/models/bird/bird_idle", extensionName: "dae", targetNode: birdNode)
// walk
self.loadAnimation(sceneName: "assets.scnassets/models/bird/bird_walk", extensionName: "dae", targetNode: birdNode)
// fly
self.loadAnimation(sceneName: "assets.scnassets/models/bird/bird_fly", extensionName: "dae", targetNode: birdNode)
1. Download Blender for macOS: https://blender.org
2. Install the Collada Exporter plugin: https://github.com/godotengine/collada-exporter
3. Download a .blend file containing animations, e.g.: https://opengameart.org/content/bird
4. Open the .blend file and click the Editor Type icon.
5. Select Nonlinear Animation.
6. Make sure the NLA Action strip (the animation) is unmuted by clicking the check icon next to the animation name. DAE models only support one animation, so only one can be checked.
7. Click File > Export > Better Collada (.dae).
8. Toggle the Export Animation option and click Export DAE.
Collada (DAE) only supports one animation per .dae file, so from Blender, save each animation (NLA Action strip) as a unique .dae file. For example:
Bird_Idle.dae
Bird_Walk.dae
Bird_Fly.dae
You could write your own logic and store the value (a Date) in iCloud or locally in the app. I'm not sure you can use the keychain from iOS in a way that would persist a start date.
However, if you store the start date locally or in iCloud, a user could delete their iCloud container, or the app itself, and reset the trial indefinitely each time they erase the contents.
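Still, a minimal sketch of the local/iCloud approach (the key name and the 14-day window are hypothetical):

import Foundation

// A sketch: record the trial start date once in the iCloud key-value store.
// "trialStartDate" is a hypothetical key name.
let store = NSUbiquitousKeyValueStore.default
if store.object(forKey: "trialStartDate") == nil {
    store.set(Date(), forKey: "trialStartDate")
    store.synchronize()
}

// Later: check whether the trial window (e.g. 14 days, an assumption) has expired.
if let startDate = store.object(forKey: "trialStartDate") as? Date {
    let trialExpired = Date() > startDate.addingTimeInterval(14 * 24 * 60 * 60)
    print("trial expired: \(trialExpired)")
}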
Without Apple's help, you cannot implement a free trial for IAPs or consumables unless you use your own web server. But running your own server to create user accounts sort of defeats the purpose of trial-based IAP, since it incurs server-maintenance overhead.
This is disappointing, since Apple offers this functionality in Final Cut Pro (a 90-day free trial with a one-time purchase to unlock).