Max HLS segment file size?
The media services used for HLS streaming in an AVPlayer seem to crash if your segments are too large; anything over 20 Mbps seems to cause a crash. I have also tried reducing the segment length to 1 second, and it didn't help. I am remuxing Dolby Vision and HDR video and want to avoid transcoding and losing any metadata, but that leaves the segments too large. Is there a workaround for this? Otherwise it seems AVFoundation is not suited to high-bitrate HLS and I should be using MPV or similar.
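For scale, segment file size is just bitrate × duration: at 20 Mbps a typical 6-second segment works out to 20,000,000 × 6 / 8 ≈ 15 MB, while a 1-second segment is only about 2.5 MB. Since shrinking the segments that much didn't help, the crash seems tied to the bitrate itself rather than the segment file size alone.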
Replies: 2 · Boosts: 0 · Views: 294 · Nov ’24
AVPlayer and HLS streams timeout
I find the default timeout of 1 second to download a segment unreasonable when playing an HLS stream from a server that is transcoding. Does anyone know if it's possible to change this networking timeout?

Error status: -12889
Error domain: CoreMediaErrorDomain
Error comment: No response for map in 1s
Event: <AVPlayerItemErrorLogEvent: 0x301866250>

Also, there is a delegate for controlling HLS downloads for offline viewing, but no delegate for plain HLS streaming.
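In case it helps anyone reproduce this, here is roughly how I read that event off the player (a minimal sketch; player stands in for my AVPlayer instance):

    import AVFoundation

    // Dump the item's error log; the -12889 "No response for map in 1s"
    // event above shows up here.
    if let errorLog = player.currentItem?.errorLog() {
        for event in errorLog.events {
            print(event.errorDomain, event.errorStatusCode, event.errorComment ?? "")
        }
    }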
Replies: 0 · Boosts: 0 · Views: 526 · Jul ’24
Error 15517 when playing HLS
Playing an fMP4 HLS stream on visionOS beta. This is the stream, HEVC Main 10 and E-AC-3 6-channel:

#EXT-X-STREAM-INF:BANDWIDTH=6760793,AVERAGE-BANDWIDTH=6760793,VIDEO-RANGE=PQ,CODECS="hvc1.2.4.L150.B0,mp4a.a6",RESOLUTION=3840x2160,FRAME-RATE=23.976,SUBTITLES="subs"

This is what AVPlayer says:

Error Domain=AVFoundationErrorDomain Code=-11848 "Cannot Open" UserInfo={NSLocalizedFailureReason=The media cannot be used on this device., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x3009e37b0 {Error Domain=CoreMediaErrorDomain Code=-15517 "(null)"}}

I can't find any documentation for the underlying error (-15517). Is it because "mp4a.a6" is declared in the codec list and not "ec-3"?

hlsreport has these MUST FIX issues:

1. Measured peak bitrate compared to multivariant playlist declared value exceeds error tolerance (Multivariant Playlist Stream Definition for All Variants)
2. Stereo audio in AAC-LC, HE-AAC v1, or HE-AAC v2 format MUST be provided (Multivariant Playlist)
3. If Dolby Digital Plus is provided then Dolby Digital MUST be provided also (Multivariant Playlist)
4. I-frame playlists (EXT-X-I-FRAME-STREAM-INF) MUST be provided to support scrubbing and scanning UI (Multivariant Playlist)
5. The server MUST deliver playlists using gzip content-encoding (All Variants, All Renditions, Multivariant Playlist)
6. You MUST provide multiple bit rates of video (Multivariant Playlist)
7. Playlist codec type doesn't match content codec type (All Variants)
8. (Segment) The operation couldn't be completed. (HTTPPumpErrorDomain error -16845 - HTTP 400: (unhandled)) (list of subtitle renditions)
9. (Segment) HTTP 400 - HTTP/2.0 400 Bad Request (list of subtitle renditions)
10. Multichannel audio MUST be separate audio stream (All Variants)
11. If EXT-X-INDEPENDENT-SEGMENTS is not in the multivariant playlist, then you MUST use the EXT-X-INDEPENDENT-SEGMENTS tag in all video media playlists (All Variants)
12. The CODECS attribute MUST include every media format present (All Variants; does not declare EC-3)
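If the mp4a.a6 declaration really is the problem, I'd expect the variant line to declare ec-3 for the E-AC-3 track instead, something like this (my guess, unverified):

    #EXT-X-STREAM-INF:BANDWIDTH=6760793,AVERAGE-BANDWIDTH=6760793,VIDEO-RANGE=PQ,CODECS="hvc1.2.4.L150.B0,ec-3",RESOLUTION=3840x2160,FRAME-RATE=23.976,SUBTITLES="subs"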
Replies: 1 · Boosts: 1 · Views: 761 · Jun ’24
TextureResource.DrawableQueue broken in visionOS 2?
I have an input texture in a ShaderGraphMaterial. I use .replace(withDrawables:) to replace the texture with a drawable queue. When I present drawables to this queue, nothing happens on visionOS 2: the drawables are never presented, and I can't get any more via nextDrawable() because the unpresented ones are holding things up. This happens with both the bgra8Unorm_srgb and rgba16Float formats. I have confirmed that the material applied to my object has the modified texture resource on it. It was working in visionOS 1.2. What changed in visionOS 2 to cause this?
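For reference, this is roughly the setup (a minimal sketch; material is the ShaderGraphMaterial already applied to my entity, "image" is its texture parameter, and the dimensions are arbitrary):

    import RealityKit
    import Metal

    // Create the drawable queue (formats are the two mentioned above).
    let descriptor = TextureResource.DrawableQueue.Descriptor(
        pixelFormat: .bgra8Unorm_srgb, // also tried .rgba16Float
        width: 1920,
        height: 1080,
        usage: [.renderTarget, .shaderRead],
        mipmapsMode: .none
    )
    let queue = try TextureResource.DrawableQueue(descriptor)

    // Swap the material's texture input for the queue.
    if case let .textureResource(texture)? = material.getParameter(name: "image") {
        texture.replace(withDrawables: queue)
    }

    // Present a frame. On visionOS 2 the drawable is never consumed, so after
    // a few frames nextDrawable() has nothing left to hand out.
    let drawable = try queue.nextDrawable()
    // ... encode a Metal render pass into drawable.texture ...
    drawable.present()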
Replies: 3 · Boosts: 0 · Views: 543 · Jun ’24
Does anyone know if HDR video is supported in a RealityView?
I have attempted to use VideoMaterial with an HDR HLS stream, and also a TextureResource.DrawableQueue with rgba16Float in a ShaderGraphMaterial. I'm capturing to 64RGBAHalf with AVPlayerItemVideoOutput and converting that to rgba16Float. I don't believe it's displaying HDR properly or behaving like a raw AVPlayer. Since we can't configure any EDR metadata or color space for a RealityView, how do we display HDR video? Is using rgba16Float supposed to be enough? Is expecting the 64RGBAHalf capture to handle HDR properly a mistake, and should I capture YUV and do the conversion myself? Thank you
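The VideoMaterial attempt looks roughly like this (a minimal sketch; the URL and screenEntity are placeholders for my stream and the ModelEntity showing it):

    import RealityKit
    import AVFoundation

    let player = AVPlayer(url: URL(string: "https://example.com/hdr/master.m3u8")!)
    let material = VideoMaterial(avPlayer: player)
    screenEntity.model?.materials = [material]
    player.play()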
Replies: 7 · Boosts: 0 · Views: 1k · Jun ’24
EXC_BAD_ACCESS crash with .replace(withDrawables:) on MaterialParameters.Value.textureResource
I get a crash every time I try to swap this texture for a drawable queue. I have a DrawableQueue leftQueue created on the main thread, and I invoke this block on the main thread. Scene.usda contains a reference to a model, cinema. It crashes on the line with the replace().

if let shaderMaterial = try? await ShaderGraphMaterial(named: "/Root/cinema/_materials/Screen",
                                                       from: "Scene.usda",
                                                       in: theaterBundle) {
    if let imageParam = shaderMaterial.getParameter(name: "image"),
       case let .textureResource(imageTexture) = imageParam {
        imageTexture.replace(withDrawables: leftQueue)
    }
}

One weird thing: imageParam has an invalid value when it crashes.

imageParam RealityFoundation.MaterialParameters.Value <invalid> (0x0)

Here is the stack trace:

* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x9)
    frame #0: 0x0000000191569734 libobjc.A.dylib`objc_release + 8
    frame #1: 0x00000001cb9e5134 CoreRE`re::SharedPtr<re::internal::AssetReference>::reset(re::internal::AssetReference*) + 64
    frame #2: 0x00000001cba77cf0 CoreRE`re::AssetHandle::operator=(re::AssetHandle const&) + 36
    frame #3: 0x00000001ccc84d14 CoreRE`RETextureAssetReplaceDrawableQueue + 228
    frame #4: 0x00000001acf9bbcc RealityFoundation`RealityKit.TextureResource.replace(withDrawables: RealityKit.TextureResource.DrawableQueue) -> () + 164
  * frame #5: 0x00000001006d361c Screenlit`TheaterView.setupMaterial(self=Screenlit.TheaterView @ 0x000000011e2b7a30) at TheaterView.swift:245:74
    frame #6: 0x00000001006e0848 Screenlit`closure #1 in closure #1 in closure #1 in closure #1 in TheaterView.body.getter(self=Screenlit.TheaterView @ 0x000000011e2b7a30) at TheaterView.swift:487
    frame #7: 0x00000001006f1658 Screenlit`partial apply for closure #1 in closure #1 in closure #1 in closure #1 in TheaterView.body.getter at <compiler-generated>:0
    frame #8: 0x00000001004fb7d8 Screenlit`thunk for @escaping @callee_guaranteed @Sendable @async () -> (@out τ_0_0) at <compiler-generated>:0
    frame #9: 0x0000000100500bd4 Screenlit`partial apply for thunk for @escaping @callee_guaranteed @Sendable @async () -> (@out τ_0_0) at <compiler-generated>:0
Replies: 2 · Boosts: 0 · Views: 704 · May ’24
Capturing HDR pixel buffers with AVPlayerItemVideoOutput
On a Vision Pro I load an HDR video served over HLS using AVPlayer. Per FFmpeg, the video has:

pixel format: yuv420p10le
color space / YCbCr matrix: bt2020nc
color primaries: bt2020
transfer function: smpte2084

I wanted to try out letting AVFoundation do all of the color conversion instead of making my own YUV -> RGB shader. To display a 10-bit texture in a drawable queue, the destination Metal texture format must be MTLPixelFormat.rgba16Float (no other formats above 8-bit are supported), so the pixel format I am capturing in is kCVPixelFormatType_64RGBAHalf, since it's pretty close. It's worth noting that the AVAsset shows no track information... must be because it's HLS? I am using AVPlayerItemVideoOutput to get pixel buffers:

AVPlayerItemVideoOutput(outputSettings: [
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_SMPTE_ST_2084_PQ,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ],
    kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_64RGBAHalf),
    kCVPixelBufferMetalCompatibilityKey as String: true
])

I can change these settings in real time and see that they have an effect on my drawable queue. The BT.2020 primaries do not look correct to me; the image is very bright and washed out. When I switch to BT.709 it looks closer to the output of the AVPlayer. The AVPlayer by itself doesn't look terrible, just a little dark maybe. When I leave out the outputSettings and let the AVPlayerItemVideoOutput choose its own color settings, it appears to choose BT.2020 as well. Is it enough to put in these outputSettings and expect an RGB pixel buffer that exactly matches those settings? Or do I have to capture in YUV and do all of the conversion manually? Am I misunderstanding something related to color settings here? I am definitely not an expert. Thanks
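The capture side looks roughly like this (a minimal sketch; videoOutput is the AVPlayerItemVideoOutput above, polled from a display-link callback):

    import AVFoundation
    import QuartzCore

    let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
    if videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
       let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                     itemTimeForDisplay: nil) {
        // pixelBuffer arrives as kCVPixelFormatType_64RGBAHalf; I wrap it in an
        // rgba16Float Metal texture and copy it into the drawable queue.
    }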
Replies: 0 · Boosts: 0 · Views: 573 · Apr ’24
Custom material getting converted to PhysicallyBasedMaterial
I have a custom material in Reality Composer. When I attach it to a cube and try loading the scene in Xcode, the material cannot be cast to a ShaderGraphMaterial because it has been changed to a PhysicallyBasedMaterial. The material was always a Custom material; I did not change the type in Reality Composer. Does anyone know how to fix this?
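This is roughly how I hit it (a minimal sketch; the entity and bundle names are placeholders):

    import RealityKit

    let scene = try await Entity(named: "Scene", in: myBundle)
    if let cube = scene.findEntity(named: "Cube") as? ModelEntity,
       let material = cube.model?.materials.first {
        print(type(of: material))               // PhysicallyBasedMaterial
        print(material is ShaderGraphMaterial)  // false, so the cast fails
    }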
Replies: 1 · Boosts: 0 · Views: 1k · Mar ’24