I am talking about AVCaptureVideoDataOutput.recommendedVideoSettings.
I found that it sometimes returns nil; here are my test results.
hevc .mov with activeColorSpace sRGB
60FPS -> ok
120FPS -> ok
hevc .mov with activeColorSpace displayP3_HLG
60FPS -> nil
120FPS -> nil
h264 .mov
30FPS -> ok
60FPS -> nil
120FPS -> nil
So if the API doesn't return a recommended setting, and the documentation doesn't say when or why it can be nil, how is a developer supposed to use it?
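For reference, this is roughly how I call it; a minimal sketch where videoDataOutput, the codec/file type, and the fallback dictionary are just placeholders for my own configuration:

import AVFoundation

func makeWriterInput(videoDataOutput: AVCaptureVideoDataOutput) -> AVAssetWriterInput {
    // This is the call that comes back nil for some codec / color space / fps combinations above.
    let recommended = videoDataOutput.recommendedVideoSettings(
        forVideoCodecType: .hevc,
        assetWriterOutputFileType: .mov)

    // Settings I have to guess at when the API returns nothing.
    let fallback: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: 1920,
        AVVideoHeightKey: 1080
    ]

    return AVAssetWriterInput(mediaType: .video, outputSettings: recommended ?? fallback)
}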
I made a Lock Screen ControlWidget with CameraCaptureIntent, but I found that when I launch my main app from the Control widget, SceneDelegate is called like below:
sceneWillEnterForeground
sceneDidBecomeActive
sceneWillResignActive
sceneDidBecomeActive
Is it normal that sceneWillResignActive is called here?
It makes my app's camera launch with a delay.
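For reference, this is roughly how I observe that sequence; a minimal sketch where the logging is just for illustration:

import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func sceneWillEnterForeground(_ scene: UIScene) {
        print("sceneWillEnterForeground")
    }

    func sceneDidBecomeActive(_ scene: UIScene) {
        print("sceneDidBecomeActive")
        // I start the camera session around here, which is why the extra resign/active bounce adds a delay.
    }

    func sceneWillResignActive(_ scene: UIScene) {
        print("sceneWillResignActive")
    }
}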
My custom control widget shows up and I can add it to the Lock Screen, but it doesn't launch my app when I tap it. Any idea what the problem is?
In the A.swift file, the code is like below:
@available(iOS 18.0, *)
struct LockScreenAppLaunchWidget: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "abc") {
            ControlWidgetButton(action: LaunchAppIntent()) { // <-- HERE
                Label("Something", systemImage: "arrow.up")
            }
        }
        .displayName("Open app")
    }
}

@available(iOS 18, *)
struct LaunchAppIntent: AppIntent {
    static var title: LocalizedStringResource = "ABC"
    static var description: IntentDescription? = "abcd"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult & OpensIntent {
        return .result()
    }
}
In iOS 18 we can customize the two bottom buttons on the Lock Screen; in the past they were Flash and Camera.
So I want my app's function to be available there. I made a custom control and the Controls list shows up, but I can not find my app's icon in it.
How do I make my own Control item and add it to the Controls list?
Any guidelines or tutorials are welcome :D
Hi, I am recording videos with AVAssetWriter. The capture fps (camera output fps) is fine, but the final video's fps is lower; the reason is that AVAssetWriterInput.isReadyForMoreMediaData is sometimes false.
Yes, I have read the documentation many times; it says to set expectsMediaDataInRealTime to true, and so on.
I have been tortured by this problem for a long time. How can I debug it? Any advice?
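For reference, this is roughly my writer setup; a simplified sketch where writerInput and the sample-buffer source are placeholders:

import AVFoundation
import CoreMedia

let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
writerInput.expectsMediaDataInRealTime = true   // set as the documentation says

func appendVideoFrame(_ sampleBuffer: CMSampleBuffer) {
    // For a real-time source the frame has to be dropped instead of waiting,
    // and these drops are exactly where the output fps falls below the capture fps.
    if writerInput.isReadyForMoreMediaData {
        writerInput.append(sampleBuffer)
    } else {
        // dropped; I count these to see how often it happens
    }
}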
If there is a huge image (10000x10000), loading it into memory crashes every time.
So, can I load a portion of the image into memory, process it, and write that part back to the image file?
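For reference, the only workaround I know so far is downsampling at decode time with ImageIO, which avoids holding the full bitmap in memory but is not the tile-based read/modify/write I'm asking about; a sketch, assuming a file URL and a target pixel size:

import ImageIO
import UIKit

func downsampledImage(at url: URL, maxPixelSize: Int) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }

    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary

    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}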
I am using the AudioServices API to play a sound effect before starting video recording, and I found something weird: the final video's volume depends on the sound effect's volume. If the sound effect volume is high, the final video's volume becomes very low; if the sound effect volume is quiet, the final video's volume is high.
I think the iPhone's microphone or the audio framework causes this; maybe it detects the peak volume and balances the recorded audio level.
I use the 'playAndRecord' category to record video and don't change it when playing the sound effect.
I need help solving this issue; I want the video's volume to stay constant.
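For reference, this is roughly how I configure the session; a simplified sketch where the mode and the extra options are my own choices, and only .playAndRecord matches what I described above:

import AVFoundation

func configureAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .videoRecording,
                            options: [.defaultToSpeaker, .allowBluetooth])
    try session.setActive(true)
}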
Recently, we received many reports of a portrait mode issue where the camera couldn't be launched.
Here is the error report from the notification:
notification: name = AVCaptureSessionRuntimeErrorNotification, object = Optional(<AVCaptureSession: 0x282f5c830 [AVCaptureSessionPresetPhoto]>
<AVCaptureDeviceInput: 0x282d1fc40 [Back Dual Cam]>[vide] -> <AVCapturePhotoOutput: 0x282d2adc0>
<AVCaptureDeviceInput: 0x282d1fc40 [Back Dual Cam]>[vide] -> <AVCaptureVideoDataOutput: 0x282d26f40>
<AVCaptureDeviceInput: 0x282d1fc40 [Back Dual Cam]>[dpth] -> <AVCaptureDepthDataOutput: 0x282d361a0>), userInfo = Optional([AnyHashable("AVCaptureSessionErrorKey"): Error Domain=AVFoundationErrorDomain Code=-11819 "The operation could not be completed" UserInfo={NSLocalizedDescription=The operation could not be completed, NSLocalizedRecoverySuggestion=}])
2022-04-06 12:11:15.969 < i > Thread: main [CaptureManager] onAVCaptureSessionRuntimeError: related decl 'e' for AVError(_nsError: Error Domain=AVFoundationErrorDomain Code=-11819 "The operation could not be completed" UserInfo={NSLocalizedDescription=The operation could not be completed, NSLocalizedRecoverySuggestion=}) code: -11819 session.isRunning: false
I don't know how to fix this; other devices are fine on iOS 15.4.
We can't reproduce this problem.
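For reference, the recovery path we are considering is to observe the runtime error and try to restart the session; a sketch where captureSession and sessionQueue are placeholders for our own objects, and restarting may not address the root cause:

import AVFoundation

func observeRuntimeErrors() {
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(sessionRuntimeError(_:)),
                                           name: .AVCaptureSessionRuntimeError,
                                           object: captureSession)
}

@objc func sessionRuntimeError(_ notification: Notification) {
    guard let error = notification.userInfo?[AVCaptureSessionErrorKey] as? AVError else { return }
    print("capture session runtime error:", error.code.rawValue)
    // -11819 corresponds to AVError.Code.mediaServicesWereReset; try to bring the session back.
    if error.code == .mediaServicesWereReset {
        sessionQueue.async {
            if !self.captureSession.isRunning {
                self.captureSession.startRunning()
            }
        }
    }
}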
I am using a color lookup table to create a 3D texture and using the input color to look up the output color, but I found that the output color shows discontinuities between layers (banding). Is the sampler sampling the 3D texture with trilinear filtering?
This is my code for creating the 3D texture.
lutData is an array of r, g, b, a, r, g, b, a, ....
lutSize is the lookup table size; it is 32 or 64.
func create3DTextureFromLutData(lutData: [Float], lutSize: Int) -> MTLTexture? {
    let desc = MTLTextureDescriptor()
    desc.textureType = .type3D
    desc.pixelFormat = .rgba32Float
    desc.width = lutSize
    desc.height = lutSize
    desc.depth = lutSize
    desc.usage = .shaderRead
    guard let texture = _device.makeTexture(descriptor: desc) else { return nil }
    texture.replace(region: MTLRegion(origin: .init(x: 0, y: 0, z: 0),
                                      size: .init(width: lutSize, height: lutSize, depth: lutSize)),
                    mipmapLevel: 0,
                    slice: 0,
                    withBytes: lutData,
                    bytesPerRow: lutSize * MemoryLayout<Float>.size * 4,
                    bytesPerImage: lutSize * lutSize * MemoryLayout<Float>.size * 4)
    return texture
}
And in my Metal shader, I just want to sample it like this:
constexpr sampler textureSampler = sampler(mag_filter::linear, min_filter::linear, mip_filter::linear, address::clamp_to_edge);

half4 lutWith3dTexture(half4 inputColor, short lutSize, half lutStrength, texture3d<float, access::sample> lut3dTexture, sampler textureSampler) {
    half4 originColor = inputColor;
    float3 coor = float3(((inputColor.rgb * (lutSize - 1)) + 0.5) / lutSize);
    half4 resultColor = half4(lut3dTexture.sample(textureSampler, coor));
    return mix(originColor, resultColor, lutStrength);
}
Trying different filter::xxxx options didn't help at all.
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x1102b4940 {Error Domain=NSOSStatusErrorDomain Code=-16415 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-16415), AVErrorRecordingFailureDomainKey=4, NSLocalizedDescription=The operation could not be completed}
I am using an iPhone 12 Pro to test my camera app. When I record 4K 60FPS video, captureOutput didDrop is invoked many times, and the final fps is only about 45.
I found some weird things.
If I move the camera or let something move in front of it so the viewport is changing, the frame dropping stops; if I keep the camera stable and static, the frame dropping starts again.
I display some debug info at the top of the view; if I show it, the frame dropping stops, and if I hide the debug view, the frame dropping starts again.
If I don't record (with AVAssetWriter), it drops a frame about every 0.5 s; once I start recording, it drops frames very fast.
The drop-frame reason is "OutOfBuffers".
This problem only shows up on the iPhone 12 Pro; everything is fine on the iPhone X.
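For reference, this is how I read the drop reason in the didDrop callback; a simplified sketch, the rest of the delegate is omitted:

import AVFoundation
import CoreMedia

func captureOutput(_ output: AVCaptureOutput,
                   didDrop sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // This attachment is where the "OutOfBuffers" reason mentioned above comes from.
    let reason = CMGetAttachment(sampleBuffer,
                                 key: kCMSampleBufferAttachmentKey_DroppedFrameReason,
                                 attachmentModeOut: nil)
    print("dropped frame, reason:", reason ?? "unknown")
}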
I faced the same problem as this Stack Overflow question: metal-vertexfunction-defined-in-metal-file-becomes-nil-once-setting-compiler - https://stackoverflow.com/questions/57391441/metal-vertexfunction-defined-in-metal-file-becomes-nil-once-setting-compiler-a
Is there an official solution?
iPhone 7 (iPhone9,2): taking a RAW photo crashes the capture.
The raw pixel type is: 1919379252
processedFormat is hvc1
This error was thrown in photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?):
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x12360ec60 {Error Domain=NSOSStatusErrorDomain Code=-16415 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-16415), AVErrorRecordingFailureDomainKey=4, NSLocalizedDescription=The operation could not be completed}
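For context, this is roughly how the RAW settings are created; a simplified sketch where makeRawSettings is just a placeholder name:

import AVFoundation

func makeRawSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // rawType ends up as 1919379252 (the value logged above) on this device.
    guard let rawType = photoOutput.availableRawPhotoPixelFormatTypes.first else { return nil }
    // processedFormat hvc1 corresponds to the HEVC codec.
    return AVCapturePhotoSettings(rawPixelFormatType: rawType,
                                  processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc])
}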
I have an image with a pixel size of 3024 x 3024.
I have a custom UIView with a logical size of 207 x 207 and layer.contentsScale of 3, so its pixel size is 621 x 621.
I want to draw this UIImage in my custom UIView; the code in draw(_:) looks like this:
var image: UIImage? {
    didSet {
        setNeedsDisplay()
    }
}

override func draw(_ rect: CGRect) {
    guard let ctx = UIGraphicsGetCurrentContext() else { return }
    ctx.addRect(bounds)
    ctx.setFillColor(UIColor.black.cgColor)
    ctx.fillPath()
    if let im = image {
        let size = Math.getMaxSizeWithAspect(size: CGSize(width: bounds.width, height: bounds.height), radioWidthToHeight: im.size.width / im.size.height)
        im.draw(in: CGRect(x: (bounds.width - size.width) / 2, y: (bounds.height - size.height) / 2, width: size.width, height: size.height))
    }
}
and the result is very bad: the picture is aliased. I tried many solutions, but none of them worked well.
// things that did not help:
ctx.setShouldAntialias(true)
ctx.setAllowsAntialiasing(true)
ctx.interpolationQuality = .high
layer.allowsEdgeAntialiasing = true
layer.minificationFilter = .trilinear
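For reference, the workaround I'm currently experimenting with is pre-scaling the image before draw(_:); a sketch, not a confirmed fix, and the helper name is just for illustration:

import UIKit

func scaledImage(_ image: UIImage, toPixelSize targetSize: CGSize) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = 1 // targetSize is already expressed in pixels here
    let renderer = UIGraphicsImageRenderer(size: targetSize, format: format)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}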
I am using AVCaptureVideoDataOutput to capture each frame from the camera.
I display these frames in an MTKView.
That works fine.
But if I call AVCapturePhotoOutput.capturePhoto, the delegate method AVCaptureVideoDataOutput.captureOutput is paused, so my preview freezes for a moment, which is very bad.
I don't know if I am doing something wrong that causes this.
This is my capturePhoto code:
func createTakePhotoSettingForNormalPhoto() -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.isHighResolutionPhotoEnabled = true
    return settings
}

func takePhoto() {
    let settings = createTakePhotoSettingForNormalPhoto()
    settings.flashMode = ...
    _outputStillImage.capturePhoto(with: settings, delegate: self)
}
I noticed that if I tap the shutter button repeatedly and quickly in the iOS native Camera app, there is a black flash at each shot, but the preview still displays frames smoothly.