We are trying to build a video recording app using AVFoundation and AVCaptureDevice.
No custom settings such as ISO or exposure duration are used; everything is left on auto.
But when video is captured using the front camera at 1080x1920, the video captured from the app and from the native Camera app does not match.
In Settings I have set the video format to 30 fps at 1080x1920.
The video captured from the app includes more area than the native app, and some values such as ISO and exposure duration do not match.
So is there any way to capture video exactly the same as the native Camera app using AVFoundation and AVCaptureDevice?
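For reference, a minimal sketch of the capture setup described above, with everything left on auto (the preset and device lookup are standard AVFoundation; the rest is illustrative):

import AVFoundation

// front camera, 1080p preset, locked to 30 fps; exposure and ISO stay automatic
let session = AVCaptureSession()
session.sessionPreset = .hd1920x1080

guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
      let input = try? AVCaptureDeviceInput(device: camera),
      session.canAddInput(input) else { fatalError("front camera unavailable") }
session.addInput(input)

try? camera.lockForConfiguration()
camera.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 30)
camera.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 30)
camera.unlockForConfiguration()

One likely contributor to the wider field of view: the native Camera app enables video stabilization, which crops the frame, while a bare AVCaptureSession may not.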
I have attached screenshots from the video for reference.
[Screenshots attached for comparison: Native vs. AVCapture]
AVKit provides the SwiftUI view VideoPlayer, and allows you to add an interactive overlay. But that overlay is normally placed behind the system-provided playback controls.
Is there any way to suppress those controls, without resorting to wrapping AVPlayerView?
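Short of an official toggle, the usual answer is exactly the wrapper this question hopes to avoid; a sketch for context (iOS flavor, using AVPlayerViewController rather than the macOS AVPlayerView):

import SwiftUI
import AVKit

// hides the system controls so only a custom overlay remains
struct PlainVideoPlayer: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.showsPlaybackControls = false // suppress the system-provided controls
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
}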
Hi, when recording videos with AVAssetWriter, the capture fps (the camera's output fps) is fine, but the final video's fps is lower. The reason is that AVAssetWriterInput.isReadyForMoreMediaData is sometimes false.
Yes, I have read the documentation many times; it says to set expectsMediaDataInRealTime to true, and so on.
This problem has tortured me for a long time. How can I debug it? Any advice?
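For context, a minimal sketch of the real-time writing pattern the documentation describes (names here are illustrative):

import AVFoundation

let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil) // nil = pass-through
writerInput.expectsMediaDataInRealTime = true // required for live camera output

func append(_ sampleBuffer: CMSampleBuffer) {
    // if this is false, the writer is falling behind; appending anyway would fail,
    // so the frame has to be dropped, which is what lowers the final fps
    guard writerInput.isReadyForMoreMediaData else { return }
    _ = writerInput.append(sampleBuffer)
}

Counting how often that guard branch drops frames is one way to see when and how badly the input stalls.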
Hi all,
we want to play downloaded encrypted HLS fragmented MP4 files on iPhone, and we are using the UsingAVFoundationToPlayAndPersistHTTPLiveStreams sample to test.
In the HLSCatalog app, we can play back the encrypted HLS fragmented MP4 stream, but when we download the encrypted HLS fragmented MP4 to the device/iPhone and then try to play it back on the device, it fails with: Error: Optional("The operation couldn’t be completed. (CoreMediaErrorDomain error 1718449215.)")
So we want to know how we can play back downloaded encrypted HLS fMP4 on iPhone.
and you can try below url:
http://69.234.244.220/prod/vod/HDR10_2D_LEFT_48FPS_FMP4_Encrypted/prog_index.m3u8
Steps to reproduce:
1: create an HLS fragmented MP4 with mediafilesegmenter:
mediafilesegmenter --iso-fragmented -t 4 --encrypt-key-file=BT709-2D-48FPS.key --encrypt-key-url=http://69.234.244.220:5000/download/BT709-2D-48FPS.key -f prod/vod/HDR10_2D_LEFT_48FPS_FMP4_Encrypted HDR10_2D_LEFT_48FPS.mp4
2: upload to content server
3: download UsingAVFoundationToPlayAndPersistHTTPLiveStreams from https://developer.apple.com/documentation/avfoundation/offline_playback_and_storage/using_avfoundation_to_play_and_persist_http_live_streams
4: in HLSCatalog app, replace playlist_url of Item-1 of Streams to http://69.234.244.220/prod/vod/HDR10_2D_LEFT_48FPS_FMP4_Encrypted/prog_index.m3u8
5: in the HLSCatalog app, tap the icon of the Advanced Stream, then tap download; when the download succeeds, try to play it... at this point it can NOT play back on the iPhone.
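For reference, a hedged sketch of the download path the HLSCatalog sample wraps (AVAssetDownloadURLSession); note that for encrypted streams the key must also be retrievable at playback time, which is a common source of download-then-play failures:

import AVFoundation

let config = URLSessionConfiguration.background(withIdentifier: "hls-download") // identifier is arbitrary
let downloadSession = AVAssetDownloadURLSession(configuration: config,
                                                assetDownloadDelegate: nil, // set a real delegate to observe progress
                                                delegateQueue: .main)
let asset = AVURLAsset(url: URL(string: "http://69.234.244.220/prod/vod/HDR10_2D_LEFT_48FPS_FMP4_Encrypted/prog_index.m3u8")!)
let task = downloadSession.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "HDR10_2D_LEFT_48FPS",
                                                 assetArtworkData: nil,
                                                 options: nil)
task?.resume()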
Hello Apple,
I am yet again concerned about the new iOS Screen Mirroring that is going to be available in the stable release of iOS 18.
I have an app that is only meant to be viewed on iPhones (not Macs or other computers, due to various security reasons).
I have raised this in Feedback Assistant and Apple has ignored it.
Other apps that might benefit from an API for disabling or detecting iOS Screen Mirroring may be Snapchat, DAZN, Sony, Netflix, Amazon Prime, etc.
Are there still plans for an API that can disable this functionality, now or in the future? I am sure that a company like Snapchat doesn't want people capturing photos via iOS Screen Mirroring without the app knowing.
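In the meantime, a partial sketch of what is already detectable: UIScreen.isCaptured is true while the screen is being recorded, mirrored, or AirPlayed (it can only detect, not disable; whether the new iOS 18 Screen Mirroring sets this flag is part of the open question here):

import UIKit

// react when capture/mirroring starts or stops
NotificationCenter.default.addObserver(forName: UIScreen.capturedDidChangeNotification,
                                       object: nil, queue: .main) { _ in
    if UIScreen.main.isCaptured {
        // hide sensitive content here
    }
}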
Thanks.
Hello,
I'm trying to load a video in an immersive space.
Using VideoMaterial applied to a plane surface, I'm able to load a video that I have locally.
If I want to load something from an external link, like YouTube or another service, how can I do that?
Remember, obviously, that I'm loading it in an immersive space.
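A minimal sketch, assuming the external service exposes a direct stream URL (YouTube does not; it requires its embedded player, which VideoMaterial cannot use): VideoMaterial accepts any AVPlayer, so a remote HLS or MP4 URL works the same way as a local file:

import RealityKit
import AVFoundation

let player = AVPlayer(url: URL(string: "https://example.com/stream/master.m3u8")!) // hypothetical URL
let material = VideoMaterial(avPlayer: player)
let plane = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9), materials: [material])
player.play()
// add `plane` to the immersive space's RealityView content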
Thanks ;)
Hello,
I'm trying to stream stereoscopic SBS video on the Apple Vision Pro. I see that AVPlayerViewController supports MV-HEVC video playback, but it's not clear how to play SBS video on the Apple Vision Pro.
Are there any docs or examples you can share?
For my use case, SBS is the only format I can support.
Hi, I would like to put a video in an ImmersiveSpace but, when people tap the maximize button in the top left corner, the app crashes. Is there a way to remove that button?
I need it to be in an ImmersiveSpace because I would like to be able to instantiate it wherever I want.
Hi,
We are currently building an app for immersive experiences of our custom content, displayed from a video on custom geometry in the immersive space on the Vision Pro.
I have enabled the AVPlayerViewController system controls that detach when entering the immersive space, as in the sample:
https://developer.apple.com/documentation/visionos/building-an-immersive-media-viewing-experience
For our case, we do not need the 2D screen to remain after entering the immersive space, only the environment.
So my question is how to remove the screen with the video and keep the controls, like in the Apple TV app for immersive experiences?
Thanks in advance
I encountered an issue when playing WebRTC video using the H5 video tag on an iPhone with iOS 17.
After performing the following operations, the video automatically plays in fullscreen mode:
1. Create the video tag, specify the playback source, and set the autoplay attribute to true.
2. Call the API to enter fullscreen playback mode.
3. Exit fullscreen by swiping with a finger.
4. Stop playback and destroy the video tag.
5. Repeat step 1; at this point, the video does not correctly default to inline playback but instead automatically plays in fullscreen.
It seems as if the system has cached the previous fullscreen state, even though I have exited fullscreen, and the same result occurs with different browsers.
I have set the playsinline and webkit-playsinline attributes on the video tag.
Has anyone encountered the same issue? I would appreciate any solutions that can be shared.
I want my app to allow the user to search for certain words in a video file and in that video's transcript. I found a Transcript class, but I don't remember which framework it is in. Would someone point me in the right direction? Which framework and classes should I use?
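Possibly the Speech framework is the one being remembered; a hedged sketch of transcribing a video file with SFSpeechRecognizer so the resulting text can be searched (requires speech-recognition authorization, and file-length limits may apply):

import Speech

func transcribe(videoURL: URL, completion: @escaping (String) -> Void) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized, let recognizer = SFSpeechRecognizer() else { return }
        let request = SFSpeechURLRecognitionRequest(url: videoURL) // accepts audio/video file URLs
        _ = recognizer.recognitionTask(with: request) { result, _ in
            if let result, result.isFinal {
                completion(result.bestTranscription.formattedString) // searchable transcript text
            }
        }
    }
}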
We have a service providing identification services via video, i.e. users get on a WebRTC video call and follow some instructions, all the while seeing themselves full screen. So this web site displays the user's camera full screen with a few overlays. In iOS 18 beta, this stopped working; the video isn't rendered anymore. The remote end of the conversation can see the video fine, so the camera itself and WebRTC are working. No errors in the console.
When tapping the site settings button in the address bar so that the font size/reader/translate dialog pops up, the video is suddenly rendered just fine, immediately. This suggests it was there the entire time; Safari just didn't bother actually showing it.
It's hard to create a minimal reproducible example for this. The behaviour just suddenly changed starting with iOS 18 beta. Has anyone observed something similar or has any tricks which may enable a workaround?
Hi, I love the VideoMaterial API that gives so much power to play video on any mesh. But I am trying to play a side-by-side 3D video using VideoMaterial:
import SwiftUI
import RealityKit
import AVFoundation

RealityView { content in
    let mesh = MeshResource.generatePlane(width: 300.0, height: 300.0, cornerRadius: 0) // generate mesh
    let vidMaterial = VideoMaterial(avPlayer: AVPlayer(url: URL(string: "https://someurl/test/master.m3u8")!)) // VideoMaterial
    vidMaterial.controller.preferredViewingMode = .stereo // <-- no idea why it doesn't work for SBS video in simulator
    vidMaterial.avPlayer?.play()
    let planeEntity = Entity() // new entity
    planeEntity.components.set(ModelComponent(mesh: mesh, materials: [vidMaterial])) // set a new ModelComponent on the entity
    content.add(planeEntity)
}
This code works well for plain 2D video playback, but how do I display a side-by-side or top-bottom 3D video?
I found GeometrySwitchCameraIndex in a custom ShaderGraphMaterial, but if I use an image texture as the input node, how do I pass the video frame as a texture into my custom shader to achieve the 3D effect? Or maybe there is an even better way to deal with this?
There also seems to be an additional API, .preferredViewingMode on the VideoMaterial's controller, that can be set to .stereo, but it doesn't give any stereo effect. Perhaps it's only for MV-HEVC media playback?
Is it possible to autoplay a video on a loop on visionOS? We used AVPlayerViewController.
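A minimal sketch of loop-plus-autoplay using AVQueuePlayer with AVPlayerLooper (the looper must be kept alive for looping to continue; videoURL is a placeholder):

import AVKit

let item = AVPlayerItem(url: videoURL) // videoURL is a placeholder
let player = AVQueuePlayer()
let looper = AVPlayerLooper(player: player, templateItem: item) // keep a strong reference to the looper
let controller = AVPlayerViewController()
controller.player = player
player.play() // autoplay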
Hi, I'm developing a simple app to visualize embedded VR180 3D video.
I used a semisphere and projected the video as its material. The semisphere sits at a fixed y value of 1.35, which is good for a seated person but not ideal for a standing person, because the stereoscopic vision is not correct. In the Apple TV+ and Kandao applications, I noticed that the translation of the video is anchored to the Apple Vision Pro. I tried using an AnchorEntity on the head with trackingMode .once, but there is a problem with rotation: the semisphere starts with the rotation of the head.
Is there a solution, for example, to anchor the semisphere only to the translation and not to the rotation of the head?
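One hedged approach: instead of AnchorEntity(.head), query the device pose once via ARKit's WorldTrackingProvider and apply only its translation, leaving rotation untouched (a sketch, assuming a visionOS immersive space):

import ARKit
import RealityKit
import QuartzCore

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func positionSemisphere(_ semisphere: Entity) async throws {
    try await session.run([worldTracking])
    if let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
        let deviceTransform = Transform(matrix: device.originFromAnchorTransform)
        semisphere.position = deviceTransform.translation // translation only, rotation ignored
    }
}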
I have an HDR10+ encoded video that plays back on the Apple Vision Pro when loaded as a .mov, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro and I only get a "Cannot Open" error.
To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 plays back on the Mac using VLC, but not on the Apple Vision Pro.
The relevant part of the m3u8 is:
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
We noticed that AVPlayerViewController does not always show the "Multi-channel" label in the audio setting in the player when playing a video asset with surround sound as an audio track. (see image)
We only serve a multichannel audio track in the HLS master manifest, like this:
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio_0",CHANNELS="6",NAME="Surround",LANGUAGE....
Different tvOS versions will give us different outcomes on whether or not the "multi-channel" label is shown
DOES NOT SHOW (the Multi-channel label will not show):
Model A1842 (tvOS 17.5.1)
Model A1625 (tvOS 16.6)
DOES SHOW (see image):
Model A1625 (tvOS 15.6)
This gives us the impression that whether the label is shown depends on the tvOS version. Any reason why? This is an ideal way for the user to see that the audio track has surround sound.
I recently bought an Insta360 Flow gimbal. When recording video with the Insta360 app, I cannot see the location in the Apple Photos app or in any other Apple app. However, I can see the location in the Windows Photos app once I have downloaded the videos to my Windows PC. The location is also visible in an Android app once I share the video through my Google account.
With an EXIF app, I can see the location metadata in the EXIF table as well, but again it is not shown as a location.
exiftool on my PC can also see the metadata, including the location, as in the attached screenshot.
Compared to video shot with the built-in Camera app, I cannot find any difference in terms of location metadata.
What could be wrong? I contacted Insta360 app support; they do not seem to understand what's going on and just ask very simple questions again and again, like: did you enable GPS location access, are you shooting video?
I also contacted Apple support; they just say it's a third-party issue and refuse to help further. If it's really a third-party issue, how come the location data is actually embedded as metadata, and a Windows PC and an Android device can see it? BTW, I AirDropped this video to all my Apple devices, like an iPhone 15 Ultra, an iPad Air, and a very old iPhone, and none of them can see the location.
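For what it's worth, Apple's apps read video location from the QuickTime metadata key com.apple.quicktime.location.ISO6709, which recording apps must write themselves; a hedged illustration of how such an item is built for AVAssetWriter (the format string is an assumption):

import AVFoundation
import CoreLocation

func locationMetadataItem(for location: CLLocation) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.identifier = .quickTimeMetadataLocationISO6709
    item.keySpace = .quickTimeMetadata
    // ISO 6709 string such as "+48.8577+002.2950/" (assumed formatting)
    item.value = String(format: "%+08.4f%+09.4f/",
                        location.coordinate.latitude,
                        location.coordinate.longitude) as NSString
    return item
}
// attach via assetWriter.metadata = [locationMetadataItem(for: currentLocation)] before writing

If the Insta360 app stores GPS only in some other atom (which exiftool may still display), Photos would not pick it up, which would match the symptoms described.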
I'm sure the app already has camera access, but in the embedded HTML I still cannot open the camera. The HTML page works in Safari, but it doesn't work when embedded in the app.
This is the error message:
DOMException: undefined is not an object (evaluating 'navigator.mediaDevices.getUserMedia')
I also tried 'navigator.getUserMedia' and 'navigator.mediaDevices.enumerateDevices()'; none of these work.
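One hedged thing to check on the native side: getUserMedia in WKWebView is only supported from iOS 14.3 onward, and a configuration like the one below (plus NSCameraUsageDescription in Info.plist) is typically needed; the names are the standard WebKit ones, the URL is a placeholder:

import WebKit

let config = WKWebViewConfiguration()
config.allowsInlineMediaPlayback = true // required for inline camera/video
config.mediaTypesRequiringUserActionForPlayback = []
let webView = WKWebView(frame: .zero, configuration: config)
webView.load(URLRequest(url: URL(string: "https://example.com")!)) // hypothetical page URL

Also note that navigator.mediaDevices is undefined in non-secure contexts, so the embedded page must be served over HTTPS.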
The same H.265 FairPlay-encrypted content can be played on all Apple devices except the A1625.
Clear (unencrypted) H.265 content does play on the A1625.
The question is: will this model (A1625) support H.265 FairPlay-encrypted content?
A ticket was created here:
https://discussions.apple.com/thread/255658006?sortBy=best