
Reply to How to play Music Videos with MusicKit for Swift?
We'd love to understand why it's not possible via the API. Video previews are available, and it seems to me that if an Apple Music subscriber authenticates within a MusicKit app, developers should be able to write MusicKit apps that also play video content. The suggested workaround is NOT a great experience: the preloading sequence brings the user to a random-looking view state before the video loads, and the video's audio is clipped and pops during this loading sequence. Developers wanting to re-skin the interface need full access to the application's player. MusicKit is awesome, with this major oversight and omission as the exception. Calling the Apple Music API also fails to return the actual video URL. Please explain a better workaround.
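For context, the workaround under discussion amounts to fetching the MusicVideo from the catalog and handing its short preview asset to an AVPlayer. A minimal sketch, assuming an authorized MusicKit session; the catalog ID is a hypothetical placeholder, and previewAssets exposes only the clipped preview, not the full video stream:

```swift
import MusicKit
import AVFoundation

// Sketch of the preview-based workaround. Requires a prior
// MusicAuthorization.request() that returned .authorized.
func playMusicVideoPreview() async throws {
    let request = MusicCatalogResourceRequest<MusicVideo>(
        matching: \.id,
        equalTo: MusicItemID("1613600188") // hypothetical catalog ID
    )
    let response = try await request.response()

    guard let video = response.items.first,
          let previewURL = video.previewAssets?.first?.hlsURL else {
        print("No preview asset available")
        return
    }

    // Only the short preview is exposed; the API never returns
    // the full-length video URL.
    let player = AVPlayer(url: previewURL)
    player.play()
}
```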
Aug ’24
Reply to VisionFramework does not work with VisionOS2.0
The Vision framework's VNGenerateForegroundInstanceMaskRequest does NOT work on visionOS 2.0 (beta 5). See example code:

```swift
import Vision
import CoreImage
import CoreML
import UIKit

func analyse() async {
    guard let selectedImage = selectedImage,
          let ciImage = CIImage(image: selectedImage) else { return }

    let handler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    let request = VNGenerateForegroundInstanceMaskRequest { request, error in
        DispatchQueue.main.async {
            self.isProcessing = false
            if let error = error {
                print("Error: \(error.localizedDescription)")
                return
            }
            // This request produces VNInstanceMaskObservation results
            // (not VNPixelBufferObservation).
            guard let observation = request.results?.first as? VNInstanceMaskObservation else {
                print("No results found")
                return
            }
            do {
                // Generate a soft mask for all detected instances,
                // scaled to the input image.
                let pixelBuffer = try observation.generateScaledMaskForImage(
                    forInstances: observation.allInstances,
                    from: handler
                )
                let maskImage = CIImage(cvPixelBuffer: pixelBuffer)
                let context = CIContext()
                if let cgImage = context.createCGImage(maskImage, from: maskImage.extent) {
                    self.processedImage = UIImage(cgImage: cgImage)
                }
            } catch {
                print("Failed to generate mask: \(error.localizedDescription)")
            }
        }
    }

    // Configure the Vision request with a preferred compute device.
    do {
        // Query the supported devices for each compute stage.
        let supportedDevices = try request.supportedComputeStageDevices
        print("Supported Devices: \(supportedDevices)")

        // Check the available devices for the main compute stage.
        if let mainStageDevices = supportedDevices[.main] {
            print("Main Stage Devices: \(mainStageDevices)")

            // Prefer the Neural Engine, then the GPU, then the CPU.
            var selectedDevice: MLComputeDevice?
            if let neuralEngineDevice = mainStageDevices.first(where: { "\($0)".contains("NeuralEngine") }) {
                selectedDevice = neuralEngineDevice
                print("Selected Neural Engine: \(neuralEngineDevice)")
            } else if let gpuDevice = mainStageDevices.first(where: { "\($0)".contains("GPU") }) {
                selectedDevice = gpuDevice
                print("Selected GPU: \(gpuDevice)")
            } else if let cpuDevice = mainStageDevices.first(where: { "\($0)".contains("CPU") }) {
                selectedDevice = cpuDevice
                print("Selected CPU: \(cpuDevice)")
            } else {
                print("No preferred device found, using default.")
            }

            // Set the selected compute device, if any.
            if let selectedDevice = selectedDevice {
                try request.setComputeDevice(selectedDevice, for: .main)
            }
        }
    } catch {
        print("Failed to configure Vision request compute device: \(error)")
    }

    // The simulator has no Neural Engine; fall back to the CPU there only,
    // so the device selection above still applies on hardware.
    #if targetEnvironment(simulator)
    request.usesCPUOnly = true
    #endif

    do {
        try handler.perform([request])
    } catch {
        DispatchQueue.main.async {
            self.isProcessing = false
            print("Failed to perform request: \(error.localizedDescription)")
        }
    }
}
```
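For anyone reproducing this: once the request does succeed, the usual way to apply the resulting soft mask is Core Image's blendWithMask filter. A minimal sketch, where inputImage and maskImage are illustrative placeholders for the original CIImage and the mask generated above:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Cut out the foreground by blending the input against a transparent
// background, using the generated soft mask.
func applyForegroundMask(to inputImage: CIImage, mask maskImage: CIImage) -> CIImage? {
    let filter = CIFilter.blendWithMask()
    filter.inputImage = inputImage
    // Transparent background cropped to the input's extent.
    filter.backgroundImage = CIImage(color: .clear).cropped(to: inputImage.extent)
    filter.maskImage = maskImage
    return filter.outputImage
}
```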
Aug ’24