Hello,
Is there an example from Apple of how to extract the data needed to create an I-frame playlist using AVAssetSegmentTrackReport?
I'm following the HLS authoring example from the WWDC 2020 session "Author fragmented MPEG-4 content with AVAssetWriter".
It states:
"You can create the playlist and the I-frame playlist based on the information AVAssetSegmentReport provides."
I've examined AVAssetSegmentTrackReport, and it only appears to provide firstVideoSampleInformation, which is fine for the first frame, but the content I'm creating contains an I-frame every second within 6-second segments.
I've tried parsing the data object from the asset writer delegate's didOutputSegmentData parameter, but I only get so far parsing the NALUs: the length prefixes seem to go wrong when I hit the first NALU of type 8 (PPS) in the first segment.
Alternatively, I could parse the output from ffmpeg, but I'm hoping there's a solution within Swift.
Many thanks
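For context, this is roughly how I'm reading the report in the delegate callback (a minimal sketch; the writer and output setup are omitted, and the class name is just for illustration):

import AVFoundation

final class SegmentCollector: NSObject, AVAssetWriterDelegate {
    func assetWriter(_ writer: AVAssetWriter,
                     didOutputSegmentData segmentData: Data,
                     segmentType: AVAssetSegmentType,
                     segmentReport: AVAssetSegmentReport?) {
        // ... persist segmentData (initialization or separable segment) to disk here ...

        guard segmentType == .separable, let report = segmentReport else { return }
        for trackReport in report.trackReports where trackReport.mediaType == .video {
            // Only the first video sample of the segment is described here, which is
            // why I can't see the other I-frames that sit inside a 6-second segment.
            if let info = trackReport.firstVideoSampleInformation, info.isSyncSample {
                // offset/length are byte positions within this segment's data, i.e.
                // what an EXT-X-BYTERANGE entry in an I-frame playlist would need.
                print("I-frame at \(info.presentationTimeStamp.seconds)s, "
                    + "BYTERANGE \(info.length)@\(info.offset)")
            }
        }
    }
}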
Description:
The HLS VOD stream contains several audio tracks that are marked with the same LANGUAGE tag but different NAME tags.
https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8
e.g.
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="bipbop_audio",LANGUAGE="eng",NAME="BipBop Audio 1",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="bipbop_audio",LANGUAGE="eng",NAME="BipBop Audio 2",AUTOSELECT=NO,DEFAULT=NO,URI="alternate_audio_aac/prog_index.m3u8"
You set up AirPlay from, e.g., an iPhone, iPad, or Mac to an Apple TV or a Mac.
Expected behavior:
In AVPlayer and QuickTime, the audio track dropdown shows both LANGUAGE and NAME information on the AirPlay sender as well as on the AirPlay receiver; the user interface is consistent between playing back a local stream and an AirPlay stream.
Current status:
The player UI on the AirPlay receiver shows only the information from the LANGUAGE tag.
Question:
Do you have an idea whether this is a missing feature of AirPlay itself or a bug?
Background:
We'd like to offer an additional audio track with enhanced audio characteristics for better understanding of spoken words: "Klare Sprache" ("clear speech").
Technically, "Klare Sprache" works by using an AI-based algorithm that separates speech from other audio elements in the broadcast. This algorithm enhances the clarity of the dialogue by amplifying the speech and diminishing the volume of background sounds like music or environmental noise. The technology was introduced by ARD and ZDF in Germany and is available on select programs, primarily via HD broadcasts and digital platforms like HbbTV.
Users can enable this feature directly from their television's audio settings, where it may be labeled as "deu (qks)" or "Klare Sprache" depending on the device. The feature is available on a growing number of channels and is part of a broader effort to make television more accessible to viewers with hearing difficulties.
It can be correctly signaled in HLS via:
e.g.
https://ccavmedia-amd.akamaized.net/test/bento4multicodec/airplay1.m3u8
# Audio
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="stereo-aac",LANGUAGE="de",NAME="Deutsch",DEFAULT=YES,AUTOSELECT=YES,CHANNELS="2",URI="ST.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="stereo-aac",LANGUAGE="de",NAME="Deutsch (Klare Sprache)",DEFAULT=NO,AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.enhances-speech-intelligibility",CHANNELS="2",URI="KS.m3u8"
Still, the problem remains that with an AirPlay stream you don't get this extra information, only the LANGUAGE tag.
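For what it's worth, on the sender side the NAME and CHARACTERISTICS are visible through the media selection API, which is roughly the information we'd hope the AirPlay receiver UI would surface too. A minimal sketch (assuming an already-created AVURLAsset):

import AVFoundation

// List the audible renditions of an HLS asset, including NAME (displayName),
// LANGUAGE, and the enhances-speech characteristic, as seen on the AirPlay sender.
func dumpAudioOptions(for asset: AVURLAsset) async throws {
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }
    for option in group.options {
        let name = option.displayName                        // NAME attribute
        let language = option.extendedLanguageTag ?? "?"     // LANGUAGE attribute
        // Raw characteristic string matching the playlist's CHARACTERISTICS value.
        let enhancesSpeech = option.hasMediaCharacteristic(
            AVMediaCharacteristic(rawValue: "public.accessibility.enhances-speech-intelligibility"))
        print("\(name) [\(language)] enhances speech: \(enhancesSpeech)")
    }
}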
Hello,
I'm trying to stream stereoscopic side-by-side (SBS) video on the Apple Vision Pro. I see that AVPlayerViewController supports MV-HEVC video playback, but it's not clear how to play SBS video on the Apple Vision Pro.
Are there any docs or examples you can share?
For my use case, SBS is the only format I can support.
Hi, I'm trying to download an encrypted video segmented with mediafilesegmenter using SAMPLE-AES, not FairPlay...
I can play the video online without any problems.
When I try to download the video using AVAssetDownloadTask,
I get an error:
Error Domain=CoreMediaErrorDomain Code=-12160 "(null)"
Also, if I use a ClearKey system to deliver the key via a custom scheme in the m3u8, AirPlay doesn't work either.
Does SAMPLE-AES only work with FairPlay?
I can't find any information about it, does anyone know if it is a bug?
I hope someone can help me :)
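For reference, the download is started roughly like this (a minimal sketch; the delegate, key delivery, and the real stream URL are omitted, and "https://example.com/stream.m3u8" is a placeholder):

import AVFoundation

// Minimal sketch of how the download task is created and started.
let configuration = URLSessionConfiguration.background(withIdentifier: "hls-download")
let downloadSession = AVAssetDownloadURLSession(configuration: configuration,
                                                assetDownloadDelegate: nil,  // delegate omitted for brevity
                                                delegateQueue: .main)
let asset = AVURLAsset(url: URL(string: "https://example.com/stream.m3u8")!)
let task = downloadSession.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "Sample",
                                                 assetArtworkData: nil,
                                                 options: nil)
task?.resume()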
I find the default timeout of 1 second for downloading a segment unreasonable when playing an HLS stream from a server that is transcoding.
Does anyone know if it's possible to change this networking timeout?
Error status: -12889, Error domain: CoreMediaErrorDomain, Error comment: No response for map in 1s. Event: <AVPlayerItemErrorLogEvent: 0x301866250>
Also, there is a delegate for controlling HLS downloads for offline viewing, but no delegate for just streaming HLS.
Hi!
I am working with a team developing a multichannel audio web application. The whole structure is based on multiple tracks playing in sync, so after some research and failed attempts, we ended up going with the solution of having one audio buffer (HTMLAudioElement) containing a multichannel file (specifically 8 channels) that we play, split into its channels, process separately, and play back to the user.
We started doing this with multichannel WAV files, and it worked great, but the files were far too large to be scalable, so we started looking into other multichannel-capable formats. So far we have tried AAC, Opus, and WebM, none of which has worked properly in Safari. I've looked in the Apple developer documentation, HLS and all that, and it seems like the only option is E-AC-3, but I haven't been able to convert any of my files to that format, and I have been really trying.
The other option we have been exploring is decoding Opus files manually with WASM, but with little success to date.
Has anyone been able to achieve anything similar to this?
Thanks!
Hi
I'm trying to play a 4K video on my Apple TV 4K, but I get an error in AVPlayer.
Error Domain=CoreMediaErrorDomain Code=-16170
I can't get any more information.
Example HLS manifest with a 4K video track:
#EXT-X-STREAM-INF:AUDIO="aud_mp4a.40.2",AVERAGE-BANDWIDTH=11955537,BANDWIDTH=12256000,VIDEO-RANGE=SDR,CODECS="hvc1.1.6.L153.90,mp4a.40.2",RESOLUTION=3840x2160,FRAME-RATE=50,HDCP-LEVEL=TYPE-1
video_4/stream.m3u8
Maybe the problem is with hvc1? But as far as I know, Apple TV supports HEVC.
Hi Guys,
I'm working on adding LL-HLS support to the Ant Media Server. I'm following the documentation in hlstools for streaming, and testing mediastreamsegmenter and tsrecompressor. What I wonder about is that the sample uses 1002 ms for --part-target-duration-ms (-w in short form), as below:
mediastreamsegmenter -w 1002 -t 4 224.0.0.50:9123 -s 16 -D -T -f /Library/WebServer/Documents/2M/
It works in this way.
mediastreamsegmenter -w 1000 -t 4 224.0.0.50:9123 -s 16 -D -T -f /Library/WebServer/Documents/2M/
It works in this way.
mediastreamsegmenter -w 1000 -t 4 224.0.0.50:9123 -s 16 -D -T --iso-fragmented -f /Library/WebServer/Documents/2M/
It crashes when I add --iso-fragmented, and mediastreamsegmenter gives the following error:
encountered failure write segment failed (-17543) - exiting
It works if I use 1001 or 1003.
I'm wondering if there is a reason for that, or is it a bug?
I am trying to use HLS to play audio and video on iOS devices. The video is H.265, the audio format is Opus, and the sliced files use the fMP4 format. On an iPhone 13 running iOS 17.5.1 it plays fine, but on an iPhone X running iOS 16.5.1 it does not play. Is it because HLS does not support the Opus audio format on some iOS versions?
How to ensure current SKScene has fully loaded before engaging it with the GamePad Controller?
MAJOR REWRITE FOR THE SAKE OF HOPEFULLY (?) INCREASED CLARITY
The problem is this: when stopping sound is involved as I switch SKScenes, if I press the buttons on the game controller (which cycle through these other SKScenes) too fast, the movement of the game pieces fails to resume when I return to the game scene after the above cycling.
This problem occurs only with the os(tvOS) version, not with the iPad version. The reason for this distinction is that each SKScene for the iPad has to fully load, because the button I press to switch SKScenes is at the top-left corner of the iPad; so, by definition, by the time I have access to this button, the current SKScene has fully loaded.
By definition, there is no such button for os(tvOS).
Given this button's absence, I need the Swift version of jQuery's $(document).ready(function() { ... });
Any help will be appreciated to the rafters ...
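To frame what I'm after, this is the kind of gate I have in mind, using didMove(to:) as the "scene has been presented" signal (a minimal sketch; names like isSceneReady and handleControllerButtonPress are mine, not from my actual project):

import SpriteKit

final class GameScene: SKScene {
    /// Set once the scene has been presented; controller input is ignored until then.
    private(set) var isSceneReady = false

    override func didMove(to view: SKView) {
        super.didMove(to: view)
        // didMove(to:) is called after the scene is presented by the SKView,
        // which is the closest SpriteKit analogue to jQuery's $(document).ready.
        isSceneReady = true
    }

    func handleControllerButtonPress() {
        guard isSceneReady else { return }  // ignore presses until the scene is up
        // ... resume movement / switch scenes here ...
    }
}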
The same H.265 FairPlay-encrypted content can be played on all Apple devices except the A1625.
Clear H.265 content does play on the A1625.
The question is: will this model (A1625) support H.265 FairPlay-encrypted content?
A ticket was created here:
https://discussions.apple.com/thread/255658006?sortBy=best
I've seen an equalizer in apps like Musi and Spotify. I think (but am not sure) they use HLS streaming. If so, how can such an equalizer be implemented for HLS?
I searched and tried several approaches, but so far none works:
AVAudioEngine seems to support only local files;
Downloading the .ts segments and merging them into an .mp3 to make the audio local cannot guarantee a real-time effect;
MTAudioProcessingTap needs the audio track; for a remote .mp3 I can extract the audio track, but not for HLS.
Any suggestion?
Hello everyone!
I'm planning to buy an Apple Vision Pro (to replace a Varjo XR-3).
I want to use it for a professional project, and I want to know if it can fit our needs.
I want to develop a program on the Vision Pro to play live streaming video from our local network cameras (using RTSP).
Is it possible to get and play more than one live stream video?
One of those videos comes from a stereo camera, streaming a side-by-side 3D stereo video.
Is it possible to have a classic 2D video on one ultra-wide virtual screen and another virtual screen displaying a 3D video with depth simultaneously?
Thank you everyone in advance.
Regards.
I'm playing an fMP4 HLS stream on the visionOS beta. This is the stream, HEVC Main 10 and E-AC-3 6-channel:
#EXT-X-STREAM-INF:BANDWIDTH=6760793,AVERAGE-BANDWIDTH=6760793,VIDEO-RANGE=PQ,CODECS="hvc1.2.4.L150.B0,mp4a.a6",RESOLUTION=3840x2160,FRAME-RATE=23.976,SUBTITLES="subs"
This is what AVPlayer says:
Error Domain=AVFoundationErrorDomain Code=-11848 "Cannot Open" UserInfo={NSLocalizedFailureReason=The media cannot be used on this device., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x3009e37b0 {Error Domain=CoreMediaErrorDomain Code=-15517 "(null)"}}
I can't find any documentation for the underlying error -15517.
Is it because "mp4a.a6" is declared in the codec list and not "ec-3"?
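For comparison, the HLS authoring guidelines use "ec-3" as the codec string for Dolby Digital Plus, so the variant declaration I'd expect to work looks more like the following (all other attributes unchanged from the stream above):
#EXT-X-STREAM-INF:BANDWIDTH=6760793,AVERAGE-BANDWIDTH=6760793,VIDEO-RANGE=PQ,CODECS="hvc1.2.4.L150.B0,ec-3",RESOLUTION=3840x2160,FRAME-RATE=23.976,SUBTITLES="subs"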
hlsreport has these MUST FIX issues:
1. Measured peak bitrate compared to multivariant playlist declared value exceeds error tolerance
Multivariant Playlist Stream Definition for All Variants
2. Stereo audio in AAC-LC, HE-AAC v1, or HE-AAC v2 format MUST be provided
Multivariant Playlist
3. If Dolby Digital Plus is provided then Dolby Digital MUST be provided also
Multivariant Playlist
4. I-frame playlists ( EXT-X-I-FRAME-STREAM-INF ) MUST be provided to support scrubbing and scanning UI
Multivariant Playlist
5. The server MUST deliver playlists using gzip content-encoding
All Variants
All Renditions
Multivariant Playlist
6. You MUST provide multiple bit rates of video
Multivariant Playlist
7. Playlist codec type doesn't match content codec type
All Variants
8. (Segment) The operation couldn’t be completed. (HTTPPumpErrorDomain error -16845 - HTTP 400: (unhandled))
(list of subtitle renditions)
9. (Segment) HTTP 400 - HTTP/2.0 400 Bad Request
(list of subtitle renditions)
10. Multichannel audio MUST be separate audio stream
All Variants
11. If EXT-X-INDEPENDENT-SEGMENTS is not in the multivariant playlist, then you MUST use the EXT-X-INDEPENDENT-SEGMENTS tag in all video media playlists
All Variants
12. The CODECS attribute MUST include every media format present
All Variants, does not declare EC-3
Hi, I am working on an app that is very similar to TikTok in terms of video experience. There is an infinite scroll feed of videos, and I am using HLS URLs as the video source.
My requirement is to cache the initial few seconds of each video on the disk while the video is playing. The next time a user views the video, it should play the initial few seconds from the cache, with the subsequent chunks coming from the network. Additionally, when there is no network connection, the video should still play the initial few seconds from the cache.
I was able to achieve this with MP4 using AVAssetResourceLoaderDelegate, but the same approach is not possible with HLS.
What are some other ways through which I can implement this feature?
Thanks.
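One approach I'm considering, though I haven't confirmed it's viable: start an AVAssetDownloadTask for the HLS asset and cancel it once the first few seconds have loaded, keeping the partially downloaded asset for startup/offline playback. A rough sketch (the session identifier, asset title, and 5-second threshold are arbitrary; error handling is omitted):

import AVFoundation

final class Prefetcher: NSObject, AVAssetDownloadDelegate {
    private var task: AVAssetDownloadTask?
    private(set) var localAssetURL: URL?

    func prefetch(_ url: URL) {
        let config = URLSessionConfiguration.background(withIdentifier: "prefetch-\(url.hashValue)")
        let session = AVAssetDownloadURLSession(configuration: config,
                                                assetDownloadDelegate: self,
                                                delegateQueue: .main)
        task = session.makeAssetDownloadTask(asset: AVURLAsset(url: url),
                                             assetTitle: "prefetch",
                                             assetArtworkData: nil,
                                             options: nil)
        task?.resume()
    }

    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didLoad timeRange: CMTimeRange, totalTimeRangesLoaded: [NSValue],
                    timeRangeExpectedToLoad: CMTimeRange) {
        // Stop once roughly the first 5 seconds have been fetched.
        let loaded = totalTimeRangesLoaded.reduce(0.0) { $0 + $1.timeRangeValue.duration.seconds }
        if loaded >= 5 { assetDownloadTask.cancel() }
    }

    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Location of the downloaded (possibly partial) .movpkg on disk.
        localAssetURL = location
    }
}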
I am using HTTP Live Streaming Tools to segment a spatial video (MV-HEVC) recorded by Vision Pro. I first used the macOS build on my MacBook, it works beautifully with the command:
mediafilesegmenter -r -f path/to/destination path/to/movie.MOV
But when I tried to use the CentOS build in a Docker container and segment the exact same file using the exact same command, it gives the following error:
can't create format reader /path/to/movie.MOV 561211770
Unable to find any valid tracks to segment.
I looked up the error code, it seems to correspond to kAudioSessionBadPropertySizeError. Any idea why?
Hi guys,
I'm investigating a failure to play a low-latency live HLS stream, and I'm getting the following error:
<AVPlayerItemErrorLog: 0x30367da10>
#Version: 1.0
#Software: AppleCoreMedia/1.0.0.21L227 (Apple TV; U; CPU OS 17_4 like Mac OS X; en_us)
#Date: 2024/05/17 13:11:46.046
#Fields: date time uri cs-guid s-ip status domain comment cs-iftype
2024/05/17 13:11:16.016 https://s2-h21-nlivell01.cdn.xxxxxx.***/..../xxxx.m3u8 -15410 "CoreMediaErrorDomain" "Low Latency: Server must support http2 ECN and SACK" -
2024/05/17 13:11:17.017 -15410 "CoreMediaErrorDomain" "Invalid server blocking reload behavior for low latency" -
2024/05/17 13:11:17.017
The stream works when loading from dev server with TLS 1.3, but fails on CDN servers with TLS 1.2.
Regular Live streams and VOD streams work normally on those CDN servers.
I tried to configure TLSv1.2 in Info.plist, but that didn't help.
When running
nscurl --ats-diagnostics --verbose
it passes for the server with TLS 1.3, but fails for the CDN servers with TLS 1.2 with error Code=-1005 "The network connection was lost."
Is TLS 1.3 required or just recommended?
Referring to
https://developer.apple.com/documentation/http-live-streaming/enabling-low-latency-http-live-streaming-hls
and
https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis
Is it possible to configure AVPlayer to skip ECN and SACK validation?
Thanks.
Hi everyone, I am having a problem with AVPlayer when I try to play some videos.
The video starts for a few seconds, but immediately afterwards I see a black screen, and in the console there is the following error:
<__NSArrayM 0x14dbf9f30>(
{
StreamPlaylistError = "-12314";
comment = "have audio audio-aacl-54 in STREAMINF without EXT-X-MEDIA audio group";
date = "2024-05-13 20:46:19 +0000";
domain = CoreMediaErrorDomain;
status = "-12642";
uri = "http://127.0.0.1:8080/master.m3u8";
},
{
"c-conn-type" = 1;
"c-severity" = 2;
comment = "Playlist parse error";
"cs-guid" = "871C1871-D566-4A3A-8465-2C58FDC18A19";
date = "2024-05-13 20:46:19 +0000";
domain = CoreMediaErrorDomain;
status = "-12642";
uri = "http://127.0.0.1:8080/master.m3u8";
}
)
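Reading the first entry, the complaint seems to be that a variant references the audio group "audio-aacl-54" without a matching EXT-X-MEDIA entry. If that is the cause, I'd guess the master playlist needs something along these lines (the rendition NAME, URIs, bandwidth, and codecs below are made up for illustration):
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-aacl-54",NAME="English",LANGUAGE="en",DEFAULT=YES,AUTOSELECT=YES,URI="audio/prog_index.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=2000000,CODECS="avc1.4d401f,mp4a.40.2",AUDIO="audio-aacl-54"
video/prog_index.m3u8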
I'm trying to cast the screen from an iOS device to an Android device.
I'm leveraging ReplayKit on iOS to capture the screen and VideoToolbox for compressing the captured video data into H.264 format using CMSampleBuffers. Both iOS and Android are configured for H.264 compression and decompression.
While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying container formats for H.264 might differ between iOS and Android.
Data transmission over the TCP socket seems to be functioning correctly.
My question is:
Is there a way to ensure a common container format for H.264 compression and decompression across iOS and Android platforms?
Here's a breakdown of the iOS sender details:
Device: iPhone 13 mini running iOS 17
Development Environment: Xcode 15 with a minimum deployment target of iOS 16
Screen Capture: ReplayKit for capturing the screen and obtaining CMSampleBuffers
Video Compression: VideoToolbox for H.264 compression
Compression Properties:
kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bitrate)
kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
kVTCompressionPropertyKey_RealTime: true (real-time encoding)
kVTCompressionPropertyKey_Quality: 1 (highest quality)
NAL Unit Handling: Custom header is added to NAL units
Android Receiver Details:
Device: RedMi 7A running Android 10
Video Decoding: MediaCodec API for receiving and decoding the H.264 stream
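My current suspicion is that VideoToolbox hands back AVCC-formatted buffers (4-byte length prefixes, with SPS/PPS in the format description), while the Android decoder expects Annex B (start codes, parameter sets inline). A minimal conversion sketch of what I mean (not my actual code, and it ignores the custom header mentioned above):

import Foundation
import CoreMedia

// Convert an H.264 sample buffer from AVCC (length-prefixed NAL units) to Annex B
// (start-code-delimited NAL units with SPS/PPS prepended).
func annexBData(from sampleBuffer: CMSampleBuffer) -> Data? {
    let startCode = Data([0x00, 0x00, 0x00, 0x01])
    var output = Data()

    // 1. Copy SPS/PPS out of the format description (in practice only needed before keyframes).
    guard let format = CMSampleBufferGetFormatDescription(sampleBuffer) else { return nil }
    var parameterSetCount = 0
    CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
        format, parameterSetIndex: 0, parameterSetPointerOut: nil,
        parameterSetSizeOut: nil, parameterSetCountOut: &parameterSetCount,
        nalUnitHeaderLengthOut: nil)
    for index in 0..<parameterSetCount {
        var pointer: UnsafePointer<UInt8>?
        var size = 0
        CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            format, parameterSetIndex: index, parameterSetPointerOut: &pointer,
            parameterSetSizeOut: &size, parameterSetCountOut: nil,
            nalUnitHeaderLengthOut: nil)
        if let pointer = pointer, size > 0 {
            output.append(startCode)
            output.append(pointer, count: size)
        }
    }

    // 2. Rewrite each length-prefixed NAL unit with a start code.
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    var totalLength = 0
    var dataPointer: UnsafeMutablePointer<CChar>?
    CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0, lengthAtOffsetOut: nil,
                                totalLengthOut: &totalLength, dataPointerOut: &dataPointer)
    guard let base = dataPointer else { return nil }
    var offset = 0
    while offset + 4 <= totalLength {
        var nalLength: UInt32 = 0
        memcpy(&nalLength, base + offset, 4)
        nalLength = UInt32(bigEndian: nalLength)   // AVCC lengths are big-endian
        offset += 4
        output.append(startCode)
        base.withMemoryRebound(to: UInt8.self, capacity: totalLength) { bytes in
            output.append(bytes + offset, count: Int(nalLength))
        }
        offset += Int(nalLength)
    }
    return output
}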