These snippets might be of use to you:
if let captureConnection = videoDataOutput.connection(with: .video) {
    captureConnection.isEnabled = true
    captureConnection.isCameraIntrinsicMatrixDeliveryEnabled = true
}
[God almighty. Why is it so impossible to format code in this editor?]
This function pulls the intrinsics out of the sample buffer and computes the field of view. The FOV part was for something else I was doing; the intrinsics extraction alone might be all you need:
import AVFoundation  // CMSampleBuffer, CMGetAttachment
import simd          // matrix_float3x3

nonisolated func computeFOV(_ sampleBuffer: CMSampleBuffer) -> Double? {
    guard let camData = CMGetAttachment(sampleBuffer,
                                        key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
                                        attachmentModeOut: nil) as? Data else { return nil }
    // The attachment is the raw bytes of a matrix_float3x3.
    let intrinsics: matrix_float3x3? = camData.withUnsafeBytes { pointer in
        guard let baseAddress = pointer.baseAddress else { return nil }
        return baseAddress.assumingMemoryBound(to: matrix_float3x3.self).pointee
    }
    guard let intrinsics else { return nil }
    let fx = intrinsics[0][0]      // focal length in pixels
    let w  = 2 * intrinsics[2][0]  // principal point x is roughly half the image width
    // Horizontal FOV in radians: 2 * atan((w / 2) / fx)
    return 2 * Double(atan2(w / 2, fx))
}
Again, sorry for the totally ****** formatting. If someone can tell me how this is supposed to work, I'm all ears. I pasted code and hit "code block" but it didn't help much.
Be sure to pass in the camera intrinsics. Rather than compute them yourself, pull them from the AVCaptureDevice.
I've seen something similar: when the zoom is at its default, it's fine, but as it increases, the tracking system doesn't know your view is zoomed in because the intrinsics are incorrect. So a small offset at low zoom becomes a big offset at higher zoom, and the system tells the accessory to rotate too much. Feedback loop.
Thank you for your reply (and, indeed, for all the advice and monitoring you've been doing for the past few years; I've read most of your posts).
One thing I just can't discern, from either the documentation or the discussions, is what happens when I try to send a large message over UDP through the Network framework. Given that Network places no explicit constraints on message size, let's suppose I (stupidly) just send 1 megabyte of Data as a single message.
Does Network:
- Say "forget it!" (i.e. return an error) because it's just too big?
- Break it into large packets (well over the MTU) and send it, which would require reassembling it on the other side?
- Break it into packets of approximately MTU size (say, between 500 and 1500 bytes), which again requires Network to reassemble on the other side?
Suppose instead I ask to send 30K as a single message:
- Does Network just send this as a single large packet? (I assume "yes.")
- Or does Network still break it into MTU-sized chunks, again requiring reassembly by Network on the other side? (I assume "no.")
Last question: I understand fragmentation is "bad." Does this mean that if I opt to use Network and UDP, I should always try to break my messages into chunks of between 500 and 1500 bytes (depending on what I believe the MTU is)?
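In case it helps, here is roughly what chunking could look like if you end up doing it yourself. This is only a sketch under my own assumptions: the 1400-byte payload size and the 4-byte big-endian sequence header are inventions for illustration, not anything the Network framework prescribes, and `makeChunks` is a hypothetical helper:

```swift
import Foundation

// Hypothetical helper: split a payload into datagram-sized chunks,
// each prefixed with a 4-byte big-endian sequence number so the
// receiver can detect loss and reassemble in order.
let maxChunkPayload = 1400  // a conservative guess at usable space under a 1500-byte MTU

func makeChunks(_ payload: Data, maxPayload: Int = maxChunkPayload) -> [Data] {
    var chunks: [Data] = []
    var seq: UInt32 = 0
    var offset = 0
    while offset < payload.count {
        let end = min(offset + maxPayload, payload.count)
        var chunk = Data()
        withUnsafeBytes(of: seq.bigEndian) { chunk.append(contentsOf: $0) }
        chunk.append(payload.subdata(in: offset..<end))
        chunks.append(chunk)
        seq += 1
        offset = end
    }
    return chunks
}
```

Each chunk is then small enough to send as its own UDP message; the receiver strips the 4-byte header, sorts by sequence number, and concatenates.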
Thanks for the note about QUIC, which I'm unfamiliar with. I'll read up on it; maybe that's what I want.
—————————-
My actual scenario is using my iPad to view the camera stream from my iPhone, which sits perhaps 10 to 20 feet away on a tripod (possibly in a DockKit accessory). I want to both see what the iPhone sees and possibly control the iPhone's positioning in the DockKit accessory.
So yes, it is streaming video, but at extremely close range. Assuming Network can form a peer-to-peer connection in places where I don't have a Wi-Fi network (say, outside!), this would be fantastic.
This is a bit off-topic, but I'm hoping one of you might reply. I just learned about the Network framework. In an introductory WWDC talk, they show a live-streaming video example, but the code isn't available (sadly).
There's a reference to "breaking the frame up into blocks" because of delivery over UDP: I assume this is because of message-length limits?
At any rate, if someone can give me a quick idea of the strategy for sending video frames from device to device over UDP (which must assume some things can get lost), I'd greatly appreciate it. I assume UDP has message-length constraints in Network, but I don't see them mentioned.
Surely I can't just send an entire 2K JPEG image (i.e. 1920x1080 pixels) in one UDP message. Or can I?
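For what it's worth, the usual strategy is exactly the "blocks" idea: split each encoded frame into datagram-sized pieces, send them individually, and tolerate loss on the receiving side. A minimal sketch, assuming you already have an established UDP `NWConnection` and the encoded frame as `Data` (the function name and the 1400-byte limit are my own, not from the talk):

```swift
import Foundation
import Network

// Sketch: send one encoded frame as a series of datagram-sized blocks.
// A real protocol would also tag each block with frame and block numbers
// so the receiver can reassemble frames and drop incomplete ones.
func sendFrame(_ frameData: Data, over connection: NWConnection) {
    let maxPayload = 1400  // stay safely under a typical 1500-byte MTU
    var offset = 0
    while offset < frameData.count {
        let end = min(offset + maxPayload, frameData.count)
        let block = frameData.subdata(in: offset..<end)
        connection.send(content: block, completion: .contentProcessed { error in
            if let error {
                print("send failed: \(error)")
            }
        })
        offset = end
    }
}
```

With video, losing a block usually means discarding that whole frame and waiting for the next one, which is why UDP (or QUIC datagrams) suits live streaming better than TCP retransmission.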
OK, it's almost two years later: any signs or indications that an async API might be showing up in the Network framework? (I've only just learned about this framework, so obviously I'm a little late to the party, and wow, it looks exciting.)
But async/await makes things so much easier that I was really hoping it would be part of the framework before I start coding with it.
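In the meantime, the callback-based calls wrap cleanly in continuations. A minimal sketch (the extension method names are my own, not API that ships with Network):

```swift
import Foundation
import Network

extension NWConnection {
    // Await completion of a single send.
    func sendAsync(_ data: Data) async throws {
        try await withCheckedThrowingContinuation { (cont: CheckedContinuation<Void, Error>) in
            send(content: data, completion: .contentProcessed { error in
                if let error {
                    cont.resume(throwing: error)
                } else {
                    cont.resume()
                }
            })
        }
    }

    // Await the next inbound message (nil if the connection is done).
    func receiveMessageAsync() async throws -> Data? {
        try await withCheckedThrowingContinuation { (cont: CheckedContinuation<Data?, Error>) in
            receiveMessage { data, _, _, error in
                if let error {
                    cont.resume(throwing: error)
                } else {
                    cont.resume(returning: data)
                }
            }
        }
    }
}
```

One caveat with this pattern: each continuation must be resumed exactly once, so be careful if you adapt it to APIs that can invoke their handler multiple times (those want `AsyncStream` instead).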