Integrate video and other forms of moving visual media into your apps.

Posts under the Video tag

90 Posts
Post not yet marked as solved
3 Replies
69 Views
I am a bit confused about whether certain Video Toolbox (VT) encoders support hardware acceleration. When I query the list of VT encoders with VTCopyVideoEncoderList(nil, &encoderList) on an iPhone 14 Pro, the kVTVideoEncoderList_IsHardwareAccelerated flag is absent for both the avc1 (AVC / H.264) and hvc1 (HEVC / H.265) encoders, which, based on the documentation in VTVideoEncoderList.h, means the encoders do not support hardware acceleration: "optional. CFBoolean. If present and set to kCFBooleanTrue, indicates that the encoder is hardware accelerated." In fact, no encoder in this list returns the flag as true, and most omit it from their dictionaries entirely.

On the other hand, when I create a compression session with VTCompressionSessionCreate() and pass kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder as true in the encoder specification, querying kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder afterwards returns a CFBoolean value of true for both the H.264 and H.265 encoders. In fact, I get true (for both of those encoders) even if I don't specify kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder when creating the compression session (note that this flag was introduced in iOS 17.4 ^1).

So the question is: are those encoders actually hardware accelerated on my device, and if so, why isn't that reflected in the VTCopyVideoEncoderList() call?
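For reference, a minimal sketch of the two checks described above (this is not the poster's original query code, which isn't shown; the session dimensions and codec are assumptions):

    import Foundation
    import VideoToolbox

    // 1. Enumerate the available encoders and inspect the
    //    kVTVideoEncoderList_IsHardwareAccelerated flag.
    var encoderList: CFArray?
    VTCopyVideoEncoderList(nil, &encoderList)
    if let encoders = encoderList as? [[String: Any]] {
        for encoder in encoders {
            let name = encoder[kVTVideoEncoderList_DisplayName as String] as? String ?? "?"
            let isHW = encoder[kVTVideoEncoderList_IsHardwareAccelerated as String] as? Bool
            print(name, "hardware accelerated:", isHW?.description ?? "flag absent")
        }
    }

    // 2. Create a session requesting hardware acceleration, then ask
    //    whether the session actually uses a hardware encoder.
    var session: VTCompressionSession?
    let spec = [kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder: true] as CFDictionary
    VTCompressionSessionCreate(allocator: nil, width: 1920, height: 1080,
                               codecType: kCMVideoCodecType_H264,
                               encoderSpecification: spec,
                               imageBufferAttributes: nil,
                               compressedDataAllocator: nil,
                               outputCallback: nil, refcon: nil,
                               compressionSessionOut: &session)
    if let session {
        var usingHW: CFTypeRef?
        VTSessionCopyProperty(session,
                              key: kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder,
                              allocator: nil, valueOut: &usingHW)
        print("Using hardware encoder:", (usingHW as? NSNumber)?.boolValue ?? false)
    }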
Posted by
Post not yet marked as solved
1 Reply
166 Views
extension Entity {
    func addPanoramicImage(for media: WRMedia) {
        let subscription = TextureResource.loadAsync(named: "image_20240425_201630").sink(
            receiveCompletion: {
                switch $0 {
                case .finished: break
                case .failure(let error): assertionFailure("\(error)")
                }
            },
            receiveValue: { [weak self] texture in
                guard let self = self else { return }
                var material = UnlitMaterial()
                material.color = .init(texture: .init(texture))
                self.components.set(ModelComponent(
                    mesh: .generateSphere(radius: 1E3),
                    materials: [material]
                ))
                self.scale *= .init(x: -1, y: 1, z: 1)
                self.transform.translation += SIMD3(0.0, -1, 0.0)
            }
        )
        components.set(Entity.WRSubscribeComponent(subscription: subscription))
    }

    func updateRotation(for media: WRMedia) {
        let angle = Angle.degrees(0.0)
        let rotation = simd_quatf(angle: Float(angle.radians), axis: SIMD3<Float>(0, 0.0, 0))
        self.transform.rotation = rotation
    }

    struct WRSubscribeComponent: Component {
        var subscription: AnyCancellable
    }
}

The failure lands in the completion handler:

    case .failure(let error): assertionFailure("\(error)")

Thread 1: Fatal error: Error Domain=MTKTextureLoaderErrorDomain Code=0 "Image decoding failed" UserInfo={NSLocalizedDescription=Image decoding failed, MTKTextureLoaderErrorKey=Image decoding failed}
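When TextureResource fails with an MTKTextureLoader "Image decoding failed" error, one way to narrow it down is to check whether Core Graphics can decode the file at all. A minimal diagnostic sketch (the file name comes from the post; the "jpg" extension is an assumption):

    import Foundation
    import ImageIO

    // Try decoding the image outside RealityKit. Very large panoramas
    // (e.g. more than 8192 px on a side) or unusual pixel formats may
    // decode here yet still fail in the texture loader.
    if let url = Bundle.main.url(forResource: "image_20240425_201630", withExtension: "jpg"),
       let source = CGImageSourceCreateWithURL(url as CFURL, nil),
       let image = CGImageSourceCreateImageAtIndex(source, 0, nil) {
        print("Decoded \(image.width) x \(image.height) image")
    } else {
        print("Core Graphics could not decode the file either")
    }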
Posted by
Post not yet marked as solved
0 Replies
74 Views
Sandbox link: https://codesandbox.io/p/sandbox/webrtc-ios-lasted-issue-jzx9h5
Issue: black video screen when the URL changes.
Steps to reproduce:
1. Get the source code from the sandbox repo above.
2. Install packages with "npm install".
3. Start the local web app under HTTPS with "HTTPS=true npm start".
4. Update the URL by clicking the "Update URL search param" button.
OS: iOS 17.4.1
Browser: Safari
Device: iPhone 11 Pro
Can anyone help? Note: it works on an iPhone X running iOS 16.
Video of the issue: https://streamable.com/rj07u8
Posted by
Post not yet marked as solved
0 Replies
234 Views
I have an AVPlayer() that loads a video and places it on a screen ModelEntity in the immersive view using VideoMaterial. This also makes the video untappable, since it is a VideoMaterial. Here's the code:

    let screenModelEntity = model.garageScreenEntity as! ModelEntity
    let modelEntityMesh = screenModelEntity.model!.mesh
    let url = Bundle.main.url(forResource: "<URL>", withExtension: "mp4")!
    let asset = AVURLAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)
    let player = AVPlayer()
    let material = VideoMaterial(avPlayer: player)
    screenModelEntity.components[ModelComponent.self] = .init(mesh: modelEntityMesh, materials: [material])
    player.replaceCurrentItem(with: playerItem)
    return player

I was able to load and play the video. However, I cannot figure out how to show the player controls (AVPlayerViewController) to the user, similar to the DestinationVideo sample app. How can I add the video player controls in this case?
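One common approach is to wrap AVPlayerViewController in a SwiftUI representable and hand it the same AVPlayer that drives the VideoMaterial; a sketch under the assumption that the app can present SwiftUI content alongside the immersive view (this is not necessarily how DestinationVideo does it):

    import SwiftUI
    import AVKit

    // Standard transport controls bound to the same AVPlayer instance
    // that feeds the VideoMaterial on the screen entity.
    struct PlayerControlsView: UIViewControllerRepresentable {
        let player: AVPlayer

        func makeUIViewController(context: Context) -> AVPlayerViewController {
            let controller = AVPlayerViewController()
            controller.player = player  // shared with the VideoMaterial
            return controller
        }

        func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
    }

Because both views share one AVPlayer, play/pause and scrubbing in the controls are reflected on the textured entity.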
Posted by
Post not yet marked as solved
0 Replies
276 Views
<div class="container" style="background-size: contain; user-select: none; pointer-events: none; height: 787.5px; width: 1400px;">
  <div class="container__header">header</div>
  <span>
    <div class="video-container" style="inset: 17.853% 68% 11.747% 1%; z-index: 2; opacity: 1;">
      <div class="video-container__placeholder-image">image</div>
      <div class="video-container__content">
        <div class="some-info"></div>
        <div class="video-canvas"></div>
        <div class="other-info"></div>
      </div>
    </div>
    <div class="video-container" style="inset: 17.853% 1% 11.747% 33%; z-index: 1; opacity: 1;">
      <div class="video-container__placeholder-image">image</div>
      <div class="video-container__content">
        <div class="video-canvas">
          <div class="player" style="width: 100%; height: 100%; position: relative; overflow: hidden; background-color: black;">
            <video playsinline="" muted="" style="object-fit: cover; width: 100%; height: 100%; position: absolute; left: 0px; top: 0px;"></video>
          </div>
        </div>
      </div>
    </div>
  </span>
</div>

The page looks like the first screenshot (not included here). Then the HTML changes as follows:

<div class="container" style="background-size: contain; user-select: none; pointer-events: none; height: 787.5px; width: 1400px;">
  <div class="container__header">header</div>
  <span>
    <div class="video-container" style="inset: 100% 100% 0% 0%; z-index: 2; opacity: 0;">
      <div class="video-container__placeholder-image">image</div>
      <div class="video-container__content">
        <div class="some-info"></div>
        <div class="video-canvas"></div>
        <div class="other-info"></div>
      </div>
    </div>
    <div class="video-container" style="inset: 6.106% 5.98719% 0%; z-index: 3; opacity: 1;">
      <div class="video-container__placeholder-image">image</div>
      <div class="video-container__content">
        <div class="video-canvas">
          <div class="player" style="width: 100%; height: 100%; position: relative; overflow: hidden; background-color: black;">
            <video playsinline="" muted="" style="object-fit: cover; width: 100%; height: 100%; position: absolute; left: 0px; top: 0px;"></video>
          </div>
        </div>
      </div>
    </div>
  </span>
</div>

In the Mac developer tools the width of the video is 1400px, but it renders at the same size as before on iOS 17+ (iOS 17.1 and 17.3.1). The expected and actual results are shown in the screenshots (not included here). I tried the same steps on iOS 14.6 and 16.4 and it worked as expected; this problem seems to exist only on iOS 17+. Please help me resolve this problem. Thanks.
Posted by zcl
Post not yet marked as solved
0 Replies
227 Views
We use AVFragmentedAssetMinder to refresh the player data. Notifications for AVAssetDurationDidChange were consistently received whenever the player duration changed. However, since the release of iOS 17, AVAssetDurationDidChange notifications are no longer received. Could anyone advise why this notification is not being triggered, and what we have to change?

    NotificationCenter.default.addObserver(self, selector: #selector(self.onVideoUpdate), name: .AVAssetDurationDidChange, object: nil)

#AVPlayer #AVMutableMovie
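For reference, a minimal sketch of a minder setup that posts this notification (the URL, minding interval, and handler are assumptions; it mirrors the registration above but scopes the observer to one asset):

    import AVFoundation

    // A fragmented (growing) asset must be added to a minder for
    // duration changes to be detected and the notification posted.
    let asset = AVFragmentedAsset(url: URL(string: "https://example.com/growing.mp4")!)
    let minder = AVFragmentedAssetMinder(asset: asset, mindingInterval: 2.0)  // keep a strong reference

    let observer = NotificationCenter.default.addObserver(
        forName: .AVAssetDurationDidChange,
        object: asset,  // scoped to this asset instead of nil
        queue: .main
    ) { _ in
        print("Duration is now \(asset.duration.seconds) s")
    }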
Posted by
Post not yet marked as solved
0 Replies
340 Views
Hi guys, I'm implementing FairPlay support for a video streaming application. I've managed to get as far as generating the SPC and acquiring a license from the license server. However, when it comes to parsing the license (CKC) returned from the server, the FPS module returns error code -42671. Has anyone else faced this before and/or knows what the fix is? I thought passing it the license should be enough, unless additional data is required?
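For comparison, in the usual AVContentKeySession flow the raw CKC bytes are handed straight back without any app-side parsing; a minimal sketch (the certificate, content identifier, and requestCKC license-server call are hypothetical placeholders, and ckcData is assumed to be the exact bytes the server returned, not wrapped in JSON or base64):

    import AVFoundation

    final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
        let certificateData: Data  // FPS application certificate (assumed loaded elsewhere)
        let contentID: Data        // content identifier bytes (assumed)

        init(certificateData: Data, contentID: Data) {
            self.certificateData = certificateData
            self.contentID = contentID
        }

        func contentKeySession(_ session: AVContentKeySession,
                               didProvide keyRequest: AVContentKeyRequest) {
            keyRequest.makeStreamingContentKeyRequestData(
                forApp: certificateData,
                contentIdentifier: contentID,
                options: nil
            ) { spcData, error in
                guard let spcData else { return }
                let ckcData = requestCKC(spcData)  // hypothetical call to the license server
                let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckcData)
                keyRequest.processContentKeyResponse(response)
            }
        }
    }

Since no parsing of the CKC is required on the app side, one thing worth verifying is that the server response reaches processContentKeyResponse as raw bytes and isn't re-encoded (e.g. base64 or JSON-wrapped) in transit.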
Posted by
Post not yet marked as solved
2 Replies
362 Views
Hi everyone, I need to add a spatial video maker to my app, which is written in Objective-C. I found some reference code in Swift; can you help me convert it to Objective-C?

    let left = CMTaggedBuffer(
        tags: [.stereoView(.leftEye), .videoLayerID(leftEyeLayerIndex)],
        pixelBuffer: leftEyeBuffer)
    let right = CMTaggedBuffer(
        tags: [.stereoView(.rightEye), .videoLayerID(rightEyeLayerIndex)],
        pixelBuffer: rightEyeBuffer)
    let result = adaptor.appendTaggedBuffers(
        [left, right], withPresentationTime: leftPresentationTs)
Posted by
Post not yet marked as solved
1 Reply
347 Views
Does the new MV-HEVC Vision Pro spatial video format support an alpha channel? I've tried converting a side-by-side video with an alpha channel using this Apple example project, but the alpha channel is removed. https://developer.apple.com/documentation/avfoundation/media_reading_and_writing/converting_side-by-side_3d_video_to_multiview_hevc
Posted by
Post not yet marked as solved
0 Replies
300 Views
Is there any way to play panoramic or 360° videos in an immersive space without using VideoMaterial on a sphere? I've tried local videos in 4K and 8K quality, and all of them look pixelated with this approach. I tried both the simulator and the real device, and I can never get high-quality playback. If the same video is played in a regular 2D player, on the other hand, it shows the expected quality.
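For reference, a sketch of the sphere-plus-VideoMaterial approach described above (the file name, radius, and setup are assumptions):

    import RealityKit
    import AVFoundation

    // 360° video mapped onto the inside of a large sphere.
    let player = AVPlayer(url: Bundle.main.url(forResource: "panorama", withExtension: "mp4")!)
    let sphere = ModelEntity(
        mesh: .generateSphere(radius: 1000),
        materials: [VideoMaterial(avPlayer: player)]
    )
    sphere.scale = .init(x: -1, y: 1, z: 1)  // invert so the video faces inward
    player.play()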
Posted by
Post not yet marked as solved
1 Reply
384 Views
I've encountered an issue with the seek bar time display in the video player on iOS 17, specifically affecting live-stream videos using HLS manifests with the time displayed in AM/PM format. As the video progresses, the displayed start time shifts backwards in time with a peculiar pattern:

Displayed Start Time = Actual Start Time - Viewed Duration

For instance, if a program begins at 9:00 AM, at 9:30 AM the start time shown will erroneously be 8:30 AM. Similarly, at 9:40 AM the displayed start time will be 8:20 AM. This issue is observed with both VideoPlayer and AVPlayerViewController on iOS 17. The same video player implementation on iOS 16 displays the viewed program's duration and doesn't have any issues. Please advise on any known workarounds or solutions for this issue on iOS 17.
Posted by
Post marked as solved
1 Reply
475 Views
Hey devs! I recently started a project, a macOS app, which is like a remote desktop app but only on the local network. For this I wanted to use the MultipeerConnectivity framework; it's my first time using it. I have already done the device-discovery side, which is working well, as it is the easier part. Now I'm looking for someone who knows how OutputStream and InputStream work in MultipeerConnectivity and has time to explain it (as I couldn't find much documentation about this), and whether it's a good choice for my needs. It has to be low latency and high resolution... I have also seen other frameworks, such as WebRTC, that I could combine with a local WebSocket server, but as I'm new to live video streaming and don't know anyone experienced with it, I wanted to ask here for your advice. Thank you in advance, TR-MZ (just an unknown indie dev).
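For what it's worth, the stream API surface in MultipeerConnectivity is small; a minimal sketch (session and peer setup omitted, the stream name is an assumption):

    import MultipeerConnectivity

    // Sender: open a byte stream to an already-connected peer.
    func startSending(to peer: MCPeerID, over session: MCSession) throws -> OutputStream {
        let stream = try session.startStream(withName: "screen", toPeer: peer)
        stream.schedule(in: .main, forMode: .default)
        stream.open()
        return stream  // write encoded video frames with write(_:maxLength:)
    }

    // Receiver: the MCSessionDelegate is handed the matching InputStream.
    func session(_ session: MCSession, didReceive stream: InputStream,
                 withName streamName: String, fromPeer peerID: MCPeerID) {
        stream.schedule(in: .main, forMode: .default)
        stream.open()
        // read with read(_:maxLength:) and feed the bytes to a decoder
    }

The streams carry raw bytes only, so a screen-sharing app still needs its own framing and codec on top (which is where WebRTC bundles more of the work for you).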
Posted by
Post not yet marked as solved
0 Replies
316 Views
Hey, all! I've been trying to upload a video preview to the AVP storefront for our app, but some of the export requirements seem to contradict one another. For the AVP, a resolution of 4K is needed... which would require H.264 level 5.2. Yet the H.264 level can't be any higher than 4... which caps it at 1080p. It seems like a catch-22 where either the H.264 level is too high or the resolution is too low. Does anyone have a fix or a way around this issue?
Posted by
Post not yet marked as solved
1 Reply
323 Views
Does Video Toolbox's compression session yield data I can decompress on a different device that doesn't have Apple's decompressor, i.e. so I can send the data over the network to devices that aren't necessarily Apple's? Or is the format proprietary rather than just regular H.264 (for example)? If I can decompress without Video Toolbox, could I have a reference to some examples of how to do this using cross-platform APIs? Maybe FFmpeg has something?
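For context: with kCMVideoCodecType_H264 the session emits standard H.264, but in AVCC form (length-prefixed NAL units, with the parameter sets kept in the format description), while cross-platform decoders such as FFmpeg's libavcodec usually expect an Annex B stream. A sketch of the usual conversion, assuming well-formed 4-byte length prefixes:

    import CoreMedia

    // Convert one compressed sample buffer from AVCC (length-prefixed)
    // to Annex B (start-code-delimited) for cross-platform decoders.
    func annexBData(from sampleBuffer: CMSampleBuffer) -> Data? {
        guard let dataBuffer = CMSampleBufferGetDataBuffer(sampleBuffer),
              let format = CMSampleBufferGetFormatDescription(sampleBuffer) else { return nil }

        let startCode: [UInt8] = [0, 0, 0, 1]
        var out = Data()

        // Prepend the SPS/PPS stored in the format description.
        var parameterSetCount = 0
        CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            format, parameterSetIndex: 0, parameterSetPointerOut: nil,
            parameterSetSizeOut: nil, parameterSetCountOut: &parameterSetCount,
            nalUnitHeaderLengthOut: nil)
        for index in 0..<parameterSetCount {
            var pointer: UnsafePointer<UInt8>?
            var size = 0
            CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
                format, parameterSetIndex: index, parameterSetPointerOut: &pointer,
                parameterSetSizeOut: &size, parameterSetCountOut: nil,
                nalUnitHeaderLengthOut: nil)
            if let pointer {
                out.append(contentsOf: startCode)
                out.append(pointer, count: size)
            }
        }

        // Replace each 4-byte big-endian length prefix with a start code.
        let length = CMBlockBufferGetDataLength(dataBuffer)
        var raw = [UInt8](repeating: 0, count: length)
        CMBlockBufferCopyDataBytes(dataBuffer, atOffset: 0, dataLength: length, destination: &raw)
        var offset = 0
        while offset + 4 <= length {
            let nalLength = raw[offset..<offset + 4].reduce(0) { ($0 << 8) | Int($1) }
            out.append(contentsOf: startCode)
            out.append(contentsOf: raw[(offset + 4)..<(offset + 4 + nalLength)])
            offset += 4 + nalLength
        }
        return out
    }

The resulting bytes decode with any standard H.264 decoder, e.g. by piping them to ffplay -f h264 -.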
Posted by