I recently created a project, originally in Xcode 13.3 if I'm not mistaken. I then updated it to 13.4 and updated to macOS 15.6 as well.
Previously the project worked fine, but now I notice that if I activate breakpoints, the project stops when I run it in Xcode; with breakpoints deactivated it works fine, as before.
Also, when I build a .app from the project, the .app crashes exactly where the breakpoints previously stopped.
I'm very confused about how to continue and would like help from anyone regarding this issue. From what I can gather from the breakpoints and the crash report, it has something to do with UUID registration? AVFoundation?
I am so confused. Please help!
Thanks.
We have developed a custom tvOS player application using AVPlayer (AVFoundation). When we invoke the Siri command "What did they say?", playback skips backwards, but subtitles are not shown temporarily as expected. Can anyone suggest a solution for this issue? :)
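As a possible fallback (a sketch only, under the assumption that the stream carries a legible media selection group; the 15-second display window is also an assumption), one could select a subtitle option manually when the skip-back is detected and restore the previous selection afterwards:

import AVFoundation

// Hypothetical workaround: force-select the first subtitle option on the
// current item, then restore the previous selection after a delay.
func temporarilyShowSubtitles(on item: AVPlayerItem) async throws {
    guard let group = try await item.asset.loadMediaSelectionGroup(for: .legible),
          let option = group.options.first else { return }
    let previous = item.currentMediaSelection.selectedMediaOption(in: group)
    item.select(option, in: group)
    try await Task.sleep(nanoseconds: 15 * NSEC_PER_SEC) // assumed display window
    item.select(previous, in: group) // previous may be nil, which deselects
}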
Hi, I love the VideoMaterial API that gives so much power to play video on any mesh. But I am trying to play a side-by-side 3D video using VideoMaterial:
RealityView { content in
    // Generate the mesh
    let mesh = MeshResource.generatePlane(width: 300.0, height: 300.0, cornerRadius: 0)
    // VideoMaterial
    let vidMaterial = VideoMaterial(avPlayer: AVPlayer(url: URL(string: "https://someurl/test/master.m3u8")!))
    vidMaterial.controller.preferredViewingMode = .stereo // <-- no idea why it doesn't work for SBS video in the simulator
    vidMaterial.avPlayer?.play()
    let planeEntity = Entity() // new entity
    planeEntity.components.set(ModelComponent(mesh: mesh, materials: [vidMaterial])) // set a new ModelComponent on the entity
    content.add(planeEntity)
}
This code works well for plain 2D video playback, but how do I display a side-by-side or top-bottom 3D video?
I found GeometrySwitchCameraIndex in a custom ShaderGraphMaterial, but if I use an image-texture input node, how do I pass the video frames as the texture into my custom shader to achieve the 3D effect? Or maybe there is an even better way to deal with this?
There is also the additional .preferredViewingMode API on VideoMaterial's controller that can be set to .stereo, but it doesn't give any stereo effect. Perhaps it only applies to MV-HEVC media playback?
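One way this is sometimes approached (a sketch with several assumptions: a ShaderGraphMaterial at the path "/Root/SBSMaterial" in "Scene.usda" exposing a texture parameter named "VideoTexture", an assumed side-by-side frame size, and a Metal blit pass that is only outlined in a comment) is to stream decoded frames from an AVPlayerItemVideoOutput into a TextureResource backed by a DrawableQueue, and let the shader graph crop the left/right half per eye:

import RealityKit
import AVFoundation

// Sketch: back a ShaderGraphMaterial texture with a DrawableQueue so each
// decoded video frame can be pushed into the custom shader.
let descriptor = TextureResource.DrawableQueue.Descriptor(
    pixelFormat: .bgra8Unorm,
    width: 3840, height: 1080, // assumed SBS frame size
    usage: [.shaderRead, .renderTarget],
    mipmapsMode: .none)
let drawableQueue = try TextureResource.DrawableQueue(descriptor)

var material = try await ShaderGraphMaterial(named: "/Root/SBSMaterial", // assumed material path
                                             from: "Scene.usda",
                                             in: realityKitContentBundle)
let videoTexture = try await TextureResource(named: "placeholder") // any bundled image
videoTexture.replace(withDrawables: drawableQueue)
try material.setParameter(name: "VideoTexture", // assumed parameter name
                          value: .textureResource(videoTexture))

// Per frame (e.g. from a scene-update subscription): copy the pixel buffer
// obtained from an AVPlayerItemVideoOutput into drawableQueue.nextDrawable().texture
// with a Metal blit/compute pass, then call drawable.present().

The shader graph would then sample the left or right half of the texture depending on the camera index, which is where GeometrySwitchCameraIndex comes in.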
After experiencing the official Destination Video demo (https://developer.apple.com/documentation/visionos/destination-video) on a real device, I tried changing the poster list of upcoming videos displayed by customOverlayViewController in the demo to a VStack layout. The posters in customOverlayViewController no longer fully respond to click events.
1. Issue found in Native App or Hybrid App: Native
2. OS Version: Any
3. Device: Any
4. Description:
We are using AVPlayer for streaming videos in our iOS application. Streaming works fine in the lower sandbox environment, but we encounter a "server not properly configured" error in the production environment.
5. Steps to Reproduce:
Configure AVPlayer with a video URL from the production server.
Attempt to play the video.
6. Expected Behavior:
The video should stream successfully, as it does in the sandbox environment.
7. Actual Behavior:
AVPlayer fails to stream the video and reports a "server not properly configured" error.
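When chasing an error like this, it can help to dump the item's error log, which often carries the underlying CoreMediaErrorDomain code and the failing URI (a generic diagnostic sketch, not specific to this report):

import AVFoundation

// Observe playback failures and print AVPlayerItem's error log for diagnosis.
// Keep the returned observer token alive for as long as you need the logging.
func observeStreamingErrors(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemFailedToPlayToEndTime,
        object: item, queue: .main) { note in
        if let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error {
            print("Playback failed:", error)
        }
        if let log = item.errorLog() {
            for event in log.events {
                print("errorLog:", event.errorDomain, event.errorStatusCode,
                      event.errorComment ?? "", event.uri ?? "")
            }
        }
    }
}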
I have AVPlayer loading an MP3 from a URL. While it is playing, how can I tell how much of the MP3 has actually been downloaded so far? I have tried using item.loadedTimeRanges, but it does not seem to be accurate. I get a few notifications, but they usually stop around 80 seconds in and don't even keep up with the current position of the player.
Any ideas?
Also, is there any way to get the total duration of the audio? All the methods I've tried return NaN.
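Two things that may help (a minimal sketch; the URL is a placeholder): observe loadedTimeRanges with KVO rather than waiting for notifications, and load the duration asynchronously, since asset.duration reads as NaN until the property has actually loaded:

import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/audio.mp3")!) // placeholder URL
let player = AVPlayer(playerItem: item)

// KVO fires whenever the buffered ranges change; the end of the first range
// is how far playback can proceed without stalling. Keep `observation` alive.
let observation = item.observe(\.loadedTimeRanges, options: [.new]) { item, _ in
    if let range = item.loadedTimeRanges.first?.timeRangeValue {
        print("Buffered up to", CMTimeGetSeconds(range.end), "s")
    }
}

// Duration is NaN until loaded; request it explicitly.
Task {
    let duration = try await item.asset.load(.duration)
    print("Total duration:", CMTimeGetSeconds(duration), "s")
}

Note that for progressive (non-HLS) downloads, loadedTimeRanges only reflects what AVFoundation chose to buffer, not the raw bytes received; byte-level progress would require an AVAssetResourceLoaderDelegate or downloading the file yourself.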
I have an MP4 file (around 25 min long) that I need to play, but if I open it in QuickTime Player or AVPlayer, it shows as around 55 min on both platforms. I don't understand why this happens. The duration shows correctly when opening it with a browser or with VS Code's built-in video playback.
Here is the link to the file:
https://www.dropbox.com/scl/fi/hbg59uqx8xdpiqbnx5wz8/videoplayback-1.mp4?rlkey=7o8l8m7j8dhq0o6f3zssgv9bd&st=5lt6apug&dl=0
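One way to see where the two durations disagree (a diagnostic sketch; the file path is a placeholder) is to compare the container-level duration with the per-track time ranges, since AVFoundation reports the duration recorded in the movie header while some players derive it from the media tracks:

import AVFoundation

let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/videoplayback-1.mp4")) // placeholder path
Task {
    let movieDuration = try await asset.load(.duration)
    print("Container duration:", CMTimeGetSeconds(movieDuration), "s")
    for track in try await asset.load(.tracks) {
        let range = try await track.load(.timeRange)
        print(track.mediaType.rawValue, "track:", CMTimeGetSeconds(range.duration), "s")
    }
}

A large mismatch between the container duration and the track durations would point at a bad duration written into the file's movie header.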
My app needs a specific flow: play a video when my iPhone is held near an NFC tag. My app can read the data from the NFC tag, and that data tells us which video to play.
I tried writing a URL scheme or a universal link to the NFC tags, but both approaches pop up a system notification instead of launching my app and playing the video. How should I design my app?
Please give me some advice, thanks!
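If the app is already running in the foreground, one option (a sketch using Core NFC; the payload format and the mapping from payload to video are assumptions) is to read the tag directly with an in-app NFCNDEFReaderSession, which shows the system scan sheet rather than a notification:

import CoreNFC

final class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?
    var onVideoKey: ((String) -> Void)? // map the payload to a video elsewhere

    func beginScanning() {
        session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // Assumes the tag stores an NDEF well-known text record naming the video.
        for record in messages.flatMap(\.records) {
            let (text, _) = record.wellKnownTypeTextPayload()
            if let key = text {
                DispatchQueue.main.async { self.onVideoKey?(key) }
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        self.session = nil
    }
}

Background tag reading, by contrast, always routes through the system notification, so scanning from inside the running app is the usual way to get a tap-to-play flow.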
I am trying to make the immersive version of AVPlayerViewController bigger, but I can't find any information on how to go about it. It seems that if I want to change the immersive video-viewing experience, the only thing I can do is use VideoMaterial on a ModelEntity created with .generatePlane. Is there a way to change the video size in immersive mode for AVPlayerViewController?
Hello,
I want to play an MP4 file in a VideoMaterial's avPlayer.
First, I used Reality Composer Pro: I created a material using the sphere provided by default in Reality Composer Pro and exported it to USDZ. When I play the MP4 file on that sphere material, it plays well.
But with a custom-made material (e.g., 3D modeling created in Shapr3D), it does not play well. I made a custom curved material in Shapr3D and exported it to USDZ, then placed it in a Reality Composer Pro scene and exported that to USDZ as well.
When I play the MP4 file on the curved material, it does not play correctly; the video doesn't fit the screen.
How can I adjust and display the video properly on a custom USDA file?
When using .mov video files to create video materials in RealityKit, they display correctly on my modelEntity. However, when I tried a video file in .mp4 format, I only get a solid black material. Does AVKit support playing .mp4 video files on visionOS 1.2?
How do I display the user's own persona in a view?
I'm developing an app which uses the "System Audio Recording Only" API to capture system audio.
Is there an API to check whether the app is authorized, so I can instruct the user to grant my app this permission?
Thanks.
A user of my app, which shows subtitles loaded from a text file above a video loaded from another file, reported that it crashes within minutes of launching. The user confirmed that the crash only happens when they load a video within the app; without a video it doesn't crash. The subtitles are shown in an NSTextView added to AVPlayerView.contentOverlayView.
What could cause such a crash? I'm not able to reproduce it.
(I tried attaching the full crash report but I always got a validation error "This post contains sensitive language. Please revise it in order to continue.")
Crashed Thread: 0 Dispatch queue: com.apple.main-thread
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000001, 0x000000019c50cae8
Termination Reason: Namespace SIGNAL, Code 5 Trace/BPT trap: 5
Terminating Process: exc handler [20614]
Application Specific Backtrace 0:
0 CoreFoundation 0x0000000198a472ec __exceptionPreprocess + 176
1 libobjc.A.dylib 0x000000019852e788 objc_exception_throw + 60
2 Foundation 0x0000000199b2caa0 -[NSKeyValueNestedProperty object:withObservance:didChangeValueForKeyOrKeys:recurse:forwardingValues:] + 664
3 Foundation 0x0000000199af4e08 -[NSKeyValueUnnestedProperty object:withObservance:didChangeValueForKeyOrKeys:recurse:forwardingValues:] + 196
4 Foundation 0x0000000199b8bc54 NSKeyValueDidChange + 200
5 Foundation 0x0000000199acde1c -[NSObject(NSKeyValueObservingPrivate) _changeValueForKeys:count:maybeOldValuesDict:maybeNewValuesDict:usingBlock:] + 684
6 Foundation 0x0000000199af7484 -[NSObject(NSKeyValueObservingPrivate) _changeValueForKey:key:key:usingBlock:] + 64
7 Foundation 0x0000000199b110e8 _NSSetObjectValueAndNotify + 284
8 AVKit 0x00000001bd3ff694 -[AVPlayerControlsViewController setPlayerController:] + 376
9 Foundation 0x0000000199acddd0 -[NSObject(NSKeyValueObservingPrivate) _changeValueForKeys:count:maybeOldValuesDict:maybeNewValuesDict:usingBlock:] + 608
10 Foundation 0x0000000199af7484 -[NSObject(NSKeyValueObservingPrivate) _changeValueForKey:key:key:usingBlock:] + 64
11 Foundation 0x0000000199b110e8 _NSSetObjectValueAndNotify + 284
12 AVKit 0x00000001bd3c6e68 -[AVPlayerView setPlaybackControlsViewController:] + 88
13 Foundation 0x0000000199b11074 _NSSetObjectValueAndNotify + 168
14 AVKit 0x00000001bd40f4b8 -[AVPlayerView _updatePlaybackControlsViewControllerIfNeeded] + 536
15 AVKit 0x00000001bd3cc0b4 -[AVPlayerView viewDidMoveToWindow] + 136
16 AppKit 0x000000019c24456c -[NSView _setWindow:] + 1788
17 AppKit 0x000000019ccce4a0 __21-[NSView _setWindow:]_block_invoke.146 + 268
18 AppKit 0x000000019c244564 -[NSView _setWindow:] + 1780
19 AppKit 0x000000019ccce4a0 __21-[NSView _setWindow:]_block_invoke.146 + 268
20 AppKit 0x000000019c244564 -[NSView _setWindow:] + 1780
...
35 AppKit 0x000000019ccce4a0 __21-[NSView _setWindow:]_block_invoke.146 + 268
36 AppKit 0x000000019c244564 -[NSView _setWindow:] + 1780
37 AppKit 0x000000019c428ec0 -[NSWindow dealloc] + 684
38 Foundation 0x000000019a1fa118 _NSKVOPerformWithDeallocatingObservable + 172
39 Foundation 0x0000000199b17404 NSKVODeallocate + 180
40 Foundation 0x0000000199b122f0 empty + 88
41 Foundation 0x0000000199af77fc dealloc + 60
42 Foundation 0x0000000199af7740 -[NSConcreteMapTable dealloc] + 76
43 AppKit 0x000000019c936e80 ___NSTouchBarFinderSetNeedsUpdateOnMain_block_invoke_2 + 1388
44 AppKit 0x000000019c2d4c4c NSDisplayCycleObserverInvoke + 168
45 AppKit 0x000000019c2d48a8 NSDisplayCycleFlush + 644
46 QuartzCore 0x00000001a0bc3f64 _ZN2CA11Transaction19run_commit_handlersE18CATransactionPhase + 120
47 QuartzCore 0x00000001a0bc2d04 _ZN2CA11Transaction6commitEv + 320
48 AppKit 0x000000019c3589d0 __62+[CATransaction(NSCATransaction) NS_setFlushesWithDisplayLink]_block_invoke + 272
49 AppKit 0x000000019cd18208 ___NSRunLoopObserverCreateWithHandler_block_invoke + 64
50 CoreFoundation 0x00000001989d187c __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__ + 36
51 CoreFoundation 0x00000001989d1768 __CFRunLoopDoObservers + 536
52 CoreFoundation 0x00000001989d0d94 __CFRunLoopRun + 776
53 CoreFoundation 0x00000001989d0434 CFRunLoopRunSpecific + 608
54 HIToolbox 0x00000001a317419c RunCurrentEventLoopInMode + 292
55 HIToolbox 0x00000001a3173e2c ReceiveNextEventCommon + 220
...
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 AppKit 0x19c50cae8 -[NSApplication _crashOnException:] + 240
1 AppKit 0x19c358b44 __62+[CATransaction(NSCATransaction) NS_setFlushesWithDisplayLink]_block_invoke + 644
2 AppKit 0x19cd18208 ___NSRunLoopObserverCreateWithHandler_block_invoke + 64
3 CoreFoundation 0x1989d187c __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__ + 36
4 CoreFoundation 0x1989d1768 __CFRunLoopDoObservers + 536
5 CoreFoundation 0x1989d0d94 __CFRunLoopRun + 776
6 CoreFoundation 0x1989d0434 CFRunLoopRunSpecific + 608
7 HIToolbox 0x1a317419c RunCurrentEventLoopInMode + 292
8 HIToolbox 0x1a3173e2c ReceiveNextEventCommon + 220
9 HIToolbox 0x1a3173d30 _BlockUntilNextEventMatchingListInModeWithFilter + 76
10 AppKit 0x19c22fd68 _DPSNextEvent + 660
11 AppKit 0x19ca25808 -[NSApplication(NSEventRouting) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 700
12 AppKit 0x19c22309c -[NSApplication run] + 476
13 AppKit 0x19c1fa2e0 NSApplicationMain + 880
14 Underword 0x102c099ac 0x102c08000 + 6572
15 dyld 0x19856a0e0 start + 2360
Using the hardware volume buttons on the iPhone, you have 16 steps to which you can adjust the volume. I want to implement a volume control slider in my app, updating the slider's value from AVAudioSession.sharedInstance().outputVolume. The problem is that this returns values rounded so the final digit is a 0 or a 5, which makes the slider jump around. (.formatted() is not causing this problem.)
You can recreate the problem using the code below.
import SwiftUI
import AVFoundation

@main
struct VolumeTestApp: App {
    init() {
        try? AVAudioSession.sharedInstance().setActive(true)
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var volume = Double()
    @State private var difference = Double()

    var body: some View {
        VStack {
            Text("The volume changed by \(difference.formatted())")
            Slider(value: $volume, in: 0...1)
        }
        .onReceive(AVAudioSession.sharedInstance().publisher(for: \.outputVolume)) { value in
            volume = Double(value)
        }
        .onChange(of: volume) { oldValue, newValue in // Only used to make the problem more obvious
            if oldValue > newValue {
                difference = oldValue - newValue
            } else {
                difference = newValue - oldValue
            }
        }
    }
}
Here is a video of the problem in action:
https://share.icloud.com/photos/00fmp7Vq1AkRetxcIP5EXeAZA
What am I doing wrong or what can I do to avoid this?
Thank you
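One possible workaround (a sketch, not a fix for outputVolume's granularity) is to embed MPVolumeView, the system volume slider, which tracks the hardware volume steps directly and can be wrapped for SwiftUI:

import MediaPlayer
import SwiftUI

// Wraps the system volume slider; it follows the hardware volume steps
// without polling AVAudioSession.outputVolume.
struct SystemVolumeSlider: UIViewRepresentable {
    func makeUIView(context: Context) -> MPVolumeView {
        MPVolumeView(frame: .zero)
    }
    func updateUIView(_ uiView: MPVolumeView, context: Context) {}
}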
I want to add additional UI outside of AVPlayerViewController, such as likes and comments, but I found that once AVPlayerViewController is wrapped in other views, it switches to inline mode and plays in 2D rather than 3D; the 3D effect only appears in full screen. Is there any way to meet my needs: while still being able to play 3D videos, add additional layout outside the AVPlayerViewController, or use AVPlayerViewController in a way that both plays 3D videos and allows additional layout outside the player? If anyone knows, could you give me some guidance? Thank you.
Hi guys,
I'm investigating a failure to play a low-latency live HLS stream, and I'm getting the following error:
<AVPlayerItemErrorLog: 0x30367da10>
#Version: 1.0
#Software: AppleCoreMedia/1.0.0.21L227 (Apple TV; U; CPU OS 17_4 like Mac OS X; en_us)
#Date: 2024/05/17 13:11:46.046
#Fields: date time uri cs-guid s-ip status domain comment cs-iftype
2024/05/17 13:11:16.016 https://s2-h21-nlivell01.cdn.xxxxxx.***/..../xxxx.m3u8 -15410 "CoreMediaErrorDomain" "Low Latency: Server must support http2 ECN and SACK" -
2024/05/17 13:11:17.017 -15410 "CoreMediaErrorDomain" "Invalid server blocking reload behavior for low latency" -
2024/05/17 13:11:17.017
The stream works when loading from the dev server with TLS 1.3, but fails on CDN servers with TLS 1.2.
Regular Live streams and VOD streams work normally on those CDN servers.
I tried to configure TLSv1.2 in Info.plist, but that didn't help.
When running
nscurl --ats-diagnostics --verbose
it passes for the server with TLS 1.3 but fails for the CDN servers with TLS 1.2, with the error Code=-1005 "The network connection was lost."
Is TLS 1.3 required or just recommended?
Referring to
https://developer.apple.com/documentation/http-live-streaming/enabling-low-latency-http-live-streaming-hls
and
https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis
Is it possible to configure AVPlayer to skip ECN and SACK validation?
Thanks.
I've created a fully immersive visionOS project and added a spatial video player in the ImmersiveView Swift file. I have a few buttons in a floating window in a separate VideosView Swift file, and I'd like to switch the video playing in ImmersiveView when a button in VideosView is clicked.
The video player works great in ImmersiveView:
RealityView { content in
    if let videoEntity = try? await Entity(named: "Video", in: realityKitContentBundle) {
        guard let url = Bundle.main.url(forResource: "video1", withExtension: "mov") else {
            fatalError("Video was not found!")
        }
        let asset = AVURLAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer()
        videoEntity.components[VideoPlayerComponent.self] = .init(avPlayer: player)
        content.add(videoEntity)
        player.replaceCurrentItem(with: playerItem)
        player.play()
    } else {
        print("file not found!")
    }
}
Buttons in floating window from VideosView:
struct VideosView: View {
    var body: some View {
        VStack {
            Button(action: {}) {
                Text("video 1").font(.title)
            }
            Button(action: {}) {
                Text("video 2").font(.title)
            }
            Button(action: {}) {
                Text("video 3").font(.title)
            }
        }
    }
}
In general, how do I control the video player across views, and how do I replace the video when each button is selected? Any help/code/links would be greatly appreciated.
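A common pattern (a sketch, assuming "video1"…"video3" ship in the main bundle as .mov files) is to hold a single AVPlayer in an observable model that both views share through the environment; ImmersiveView attaches it to the VideoPlayerComponent once, and the buttons just swap the current item:

import SwiftUI
import Observation
import AVFoundation

@Observable
final class VideoModel {
    let player = AVPlayer() // the one player ImmersiveView attaches to the entity

    func play(resource: String) {
        guard let url = Bundle.main.url(forResource: resource, withExtension: "mov") else { return }
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}

// Inject one instance at the App level via .environment(model) on both
// the window scene and the immersive space.
struct VideosView: View {
    @Environment(VideoModel.self) private var model

    var body: some View {
        VStack {
            ForEach(["video1", "video2", "video3"], id: \.self) { name in
                Button(name) { model.play(resource: name) }.font(.title)
            }
        }
    }
}

ImmersiveView would then pass model.player into VideoPlayerComponent(avPlayer:) instead of creating its own player.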
After numerous trials and errors, we finally succeeded in implementing VR180.
However, there is a problem.
Videos played via a URL (internet) connection lag significantly. Initially I thought it was a bitrate issue, but after various tests I began to suspect that the problem might be with the internet-connection processing itself.
I tested the same video both by opening it as a file (over a network drive) and over a URL (AWS) connection. Since AWS provides stable speeds, I concluded there is no issue there.
The video files are 8K, with a bitrate between 80 and 90 Mbps, so the conditions for decoding 8K are the same in both cases, and both AFP and URL playback used the same wireless network. When mirroring, URL playback lagged significantly, while the AFP connection had no lag at all.
Could it be that visionOS's URL (internet-connection) handling causes a high system load? I noticed that an app called AmazeVR lets videos be downloaded before playing. Could this be because of the same URL issue?
If anyone knows, please respond.
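If downloading first turns out to be the answer, a simple approach (a sketch; the remote URL is a placeholder) is to fetch the file with URLSession, move it into the caches directory, and hand the local URL to AVPlayer:

import AVFoundation

// Download the file to local storage, then play it without network pressure.
func downloadAndPlay(from remote: URL) async throws -> AVPlayer {
    let (tempURL, _) = try await URLSession.shared.download(from: remote)
    let localURL = URL.cachesDirectory.appending(path: remote.lastPathComponent)
    try? FileManager.default.removeItem(at: localURL) // replace any stale copy
    try FileManager.default.moveItem(at: tempURL, to: localURL)
    let player = AVPlayer(url: localURL)
    player.play()
    return player
}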
I'm using something similar to this example.
import SwiftUI

struct ContentView: View {
    @State private var toggle = false

    var body: some View {
        CustomParentView {
            Button {
                toggle.toggle()
            } label: {
                Text(toggle.description)
            }
        }
    }
}

struct CustomParentView<Content: View>: UIViewRepresentable {
    let content: Content

    @inlinable init(@ViewBuilder content: () -> Content) {
        self.content = content()
    }

    func makeUIView(context: Context) -> UIView {
        let view = UIView()
        let hostingController = context.coordinator.hostingController
        hostingController.view.frame = view.bounds
        hostingController.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(hostingController.view)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        context.coordinator.hostingController.rootView = self.content
    }

    class Coordinator: NSObject {
        var hostingController: UIHostingController<Content>

        init(hostingController: UIHostingController<Content>) {
            self.hostingController = hostingController
        }
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(hostingController: UIHostingController(rootView: content))
    }
}
The only difference is that I'm using UIScrollView.
When I have a @State width and call .frame(width:) on the content, the content stays at its initial width even when width changes.
I tried:
hostingController.sizingOptions = .intrinsicContentSize
This time the size does change to the correct size if I pinch-zoom the content, but the initial size that triggers updateUIView is .zero, which prevents me from centering the content.
Is there a way to dynamically set the size and get correct rendering, just like any child view of a normal SwiftUI view?
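One thing that sometimes helps (a sketch, assuming the UIScrollView variant of the representable above) is to re-measure the hosting view inside updateUIView after assigning the new rootView, instead of relying on the initial intrinsic size:

func updateUIView(_ uiView: UIScrollView, context: Context) {
    let host = context.coordinator.hostingController
    host.rootView = content
    // The intrinsic size is stale until the hosting view lays out, so ask the
    // hosting controller for a fitting size explicitly and push it into the
    // scroll view.
    let size = host.sizeThatFits(in: UIView.layoutFittingExpandedSize)
    host.view.frame = CGRect(origin: .zero, size: size)
    uiView.contentSize = size
}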