Posts

Post not yet marked as solved
0 Replies
310 Views
Hello! I'm trying to display an AVPlayerViewController in a separate WindowGroup. My main window opens a new window whose only element is a struct that implements UIViewControllerRepresentable for AVPlayerViewController:

```swift
import SwiftUI
import AVKit

@MainActor
public struct AVPlayerView: UIViewControllerRepresentable {
    public let assetName: String

    public init(assetName: String) {
        self.assetName = assetName
    }

    public func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = AVPlayer()
        return controller
    }

    public func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
        Task {
            // Swap in the new asset only when the name actually changes.
            if context.coordinator.assetName != assetName {
                let url = Bundle.main.url(forResource: assetName, withExtension: ".mp4")
                guard let url else { return }
                controller.player?.replaceCurrentItem(with: AVPlayerItem(url: url))
                controller.player?.play()
                context.coordinator.assetName = assetName
            }
        }
    }

    public static func dismantleUIViewController(_ controller: AVPlayerViewController, coordinator: Coordinator) {
        controller.player?.pause()
        controller.player = nil
    }

    public func makeCoordinator() -> Coordinator {
        return Coordinator()
    }

    public class Coordinator: NSObject {
        public var assetName: String?
    }
}
```

The window scene is declared as:

```swift
WindowGroup(id: Window.videoPlayer.rawValue) {
    AVPlayerView(assetName: "wwdc")
        .onDisappear {
            print("DISAPPEAR")
        }
}
```

This displays the video player in non-inline mode and plays the video. The problem appears when I close the video player's window using the close button: the video's audio keeps playing in the background. I've tried to clean up the state myself in dismantleUIViewController and onDisappear, but the system never calls them (both work correctly when a window doesn't contain an AVPlayerView). This happens on Xcode 15.1 Beta 3 (I haven't tested other versions). Is there something I'm doing incorrectly that causes this issue, or is it a bug and I need to wait until it's fixed?
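In the meantime, a workaround I'm experimenting with is to own the AVPlayer in the window's root SwiftUI view and stop playback on a scenePhase change, since scene-phase updates still seem to be delivered when the window closes. This is only a sketch; PlayerControllerView and VideoPlayerWindowContent are my own placeholder names:

```swift
import SwiftUI
import AVKit

// Variant of the representable that takes an externally owned player,
// so teardown does not depend on the representable's own lifecycle.
struct PlayerControllerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}

struct VideoPlayerWindowContent: View {
    let assetName: String

    // Owning the player here keeps it reachable after the controller goes away.
    @State private var player = AVPlayer()
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        PlayerControllerView(player: player)
            .onAppear {
                guard let url = Bundle.main.url(forResource: assetName, withExtension: "mp4") else { return }
                player.replaceCurrentItem(with: AVPlayerItem(url: url))
                player.play()
            }
            .onChange(of: scenePhase) { _, newPhase in
                // Closing the window sends its scene to .background;
                // stop playback explicitly since dismantle isn't invoked.
                if newPhase == .background {
                    player.pause()
                    player.replaceCurrentItem(with: nil)
                }
            }
    }
}
```

The point is that cleanup no longer relies on the UIViewControllerRepresentable lifecycle callbacks that aren't firing here.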
Posted by kmoczala.
Post not yet marked as solved
1 Reply
541 Views
Hello! I would like to develop a visionOS application that tracks a single object in the user's environment. Skimming through the documentation, I found that this feature is currently unsupported in ARKit (we can only recognize images), but it seems it should be doable by combining the Core ML and Vision frameworks. So I have a few questions: Is that the best approach, or is there a simpler solution? What is the best way to train a Core ML model without access to the device? Will videos recorded by an iPhone 15 be enough? Thank you in advance for all the answers.
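To make the idea concrete, here is a minimal sketch of the Core ML + Vision combination I have in mind, assuming a trained object-detection model named ObjectDetector (a placeholder, e.g. produced by Create ML's object-detection template) is bundled with the app:

```swift
import Vision
import CoreML
import CoreVideo

// "ObjectDetector" is a placeholder for the generated class of a
// bundled .mlmodel trained for object detection.
func makeDetectionRequest() throws -> VNCoreMLRequest {
    let coreMLModel = try ObjectDetector(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)
    return VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNRecognizedObjectObservation] else { return }
        for observation in results {
            // boundingBox is normalized to the image, origin at bottom-left.
            let label = observation.labels.first?.identifier ?? "unknown"
            print(label, observation.boundingBox, observation.confidence)
        }
    }
}

// Run the request against a single frame supplied as a pixel buffer.
func detectObjects(in pixelBuffer: CVPixelBuffer, using request: VNCoreMLRequest) throws {
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
    try handler.perform([request])
}
```

This only sketches per-frame inference; it assumes frames are available as pixel buffers, which is the part I'm unsure about on device.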
Posted by kmoczala.