Image quality taken by iPhone camera using AVCaptureVideoDataOutputSampleBufferDelegate

I use the iPhone camera to capture images in my iOS app.

AVCaptureVideoDataOutputSampleBufferDelegate is used to get images from the camera. Part of the program is shown below.


AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo
                                                     preferringPosition:AVCaptureDevicePositionBack];
NSError *error = nil;
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
                                                                               error:&error];
if (error) {
    NSLog(@"%@", error);
}
if ([session canAddInput:videoDeviceInput]) {
    [session addInput:videoDeviceInput];
    [self setVideoDeviceInput:videoDeviceInput];
    dispatch_async(dispatch_get_main_queue(), ^{
        [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection]
            setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
    });
}
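For context, the output side of this setup (not shown in the snippet above) would typically look something like the sketch below. The queue name and the BGRA pixel-format choice are illustrative assumptions, not taken from the question's code; `session` and `self` are assumed to be the same objects as in the snippet.

```objc
// Hypothetical continuation: attach a video data output so the
// AVCaptureVideoDataOutputSampleBufferDelegate callback receives sample buffers.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// BGRA is a common choice when the buffers will be converted into images.
videoOutput.videoSettings =
    @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

// Deliver buffers on a private serial queue rather than the main queue.
dispatch_queue_t captureQueue =
    dispatch_queue_create("camera.capture", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:captureQueue];

if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}
```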


The quality of the images delivered through AVCaptureVideoDataOutputSampleBufferDelegate differs noticeably from that of photos taken with the built-in Camera app. How can I adjust the capture settings so the image quality matches, or at least comes closer to, the Camera app's?

Replies

Are you hoping to get buffers for still image taking? Video? In general you get different levels of quality either by using AVCaptureSession's setSessionPreset: or by using AVCaptureDevice's setActiveFormat: API. These two techniques are mutually exclusive.
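To make the two options concrete, here is a hedged sketch of each; `session` and `videoDevice` are assumed to be the same objects as in the question's code, and the "pick the highest-resolution format" loop is just one example of a selection policy.

```objc
// Option 1: session preset. The session configures the device to a
// matching format for you.
if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    [session setSessionPreset:AVCaptureSessionPresetPhoto];
}

// Option 2: active format. You pick a specific AVCaptureDeviceFormat
// yourself; setting activeFormat takes the session out of preset mode,
// which is why the two techniques are mutually exclusive.
NSError *lockError = nil;
if ([videoDevice lockForConfiguration:&lockError]) {
    AVCaptureDeviceFormat *best = nil;
    int32_t bestPixels = 0;
    for (AVCaptureDeviceFormat *format in videoDevice.formats) {
        CMVideoDimensions dims =
            CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        int32_t pixels = dims.width * dims.height;
        if (pixels > bestPixels) {
            best = format;       // keep the highest-resolution format
            bestPixels = pixels;
        }
    }
    if (best) {
        videoDevice.activeFormat = best;
    }
    [videoDevice unlockForConfiguration];
}
```

Note that `lockForConfiguration:` is required before touching `activeFormat`, and the preset route needs no device lock at all.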

Thanks. Which has better quality? What are the other differences between the two? I'm asking so that I can decide which one to use.