Please help me.
I am using the FFmpeg library with Swift, trying to display real-time H.264 video on an iPhone, but I have run into several problems.
One of them is the type conversion from "AVFrame" to "AVPicture".
Another is that a negative value is assigned to "size".
Below is the relevant part of my source code.
--------------------------------
av_register_all()
avcodec_register_all()
avformat_network_init()
let path = "rtsp://example_"
var formatContext = UnsafeMutablePointer<AVFormatContext>(avformat_alloc_context())
avformat_open_input(&formatContext, path, nil, nil)
avformat_find_stream_info(formatContext, nil)
av_dump_format(formatContext, 0, path, 0)
let target = AVMEDIA_TYPE_VIDEO
var codecContext = formatContext.pointee.streams[0].pointee.codec
let codec = avcodec_find_decoder(codecContext.pointee.codec_id)
avcodec_open2(codecContext, codec, nil)
var frame = av_frame_alloc()
var convertFrame = av_frame_alloc()
let width = codecContext.pointee.width
let height = codecContext.pointee.height
let format = AV_PIX_FMT_VDPAU_H264
let size = av_image_get_buffer_size(format, width, height, 1)
var buffer = av_malloc(Int(size))
avpicture_fill(UnsafeMutablePointer<AVPicture>(convertFrame), UnsafePointer<UInt8>(buffer), format, width, height)
--------------------------------
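For the negative "size", my guess is that av_image_get_buffer_size() returns a negative AVERROR code because AV_PIX_FMT_VDPAU_H264 is a hardware pixel format with no defined byte layout. When I instead pass a software format (I am assuming here that the decoder outputs yuv420p), the size comes out positive:

```swift
// Assumption: the decoder's software output format is yuv420p.
// With a hardware format such as AV_PIX_FMT_VDPAU_H264,
// av_image_get_buffer_size() returned a negative AVERROR for me.
let swFormat = AV_PIX_FMT_YUV420P
let size = av_image_get_buffer_size(swFormat, width, height, 1)
if size < 0 {
    print("av_image_get_buffer_size failed: \(size)")
}
```

Is that the correct explanation for the negative value?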
The first argument of avpicture_fill() is a pointer to AVPicture.
Therefore I need to convert my AVFrame pointer to an AVPicture pointer, but the compiler does not accept the conversion.
Am I making a mistake?
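For the conversion itself, I read that in C one simply casts the pointer, i.e. (AVPicture *)frame, because AVPicture's data and linesize members mirror the beginning of AVFrame. I assume the Swift equivalent is withMemoryRebound(to:capacity:), but I am not sure this is the intended approach:

```swift
// Assumption: AVPicture's layout matches the start of AVFrame,
// as the C cast (AVPicture *)frame relies on, so rebinding the
// frame's memory to AVPicture should be safe here.
convertFrame.withMemoryRebound(to: AVPicture.self, capacity: 1) { picture in
    avpicture_fill(picture,
                   UnsafePointer<UInt8>(buffer),
                   format, width, height)
}
```

Is this the right way to do it in Swift, or is there a cleaner conversion?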