I use the iPhone camera to capture images in my iOS app.
AVCaptureVideoDataOutputSampleBufferDelegate
is used to get images from the camera. Part of the program is shown below:

AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error) { NSLog(@"%@", error); }
if ([session canAddInput:videoDeviceInput])
{
    [session addInput:videoDeviceInput];
    [self setVideoDeviceInput:videoDeviceInput];
    dispatch_async(dispatch_get_main_queue(), ^{
        [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
    });
}
The quality of the images I get through
AVCaptureVideoDataOutputSampleBufferDelegate is noticeably worse than that of images taken with the iPhone's built-in Camera app. How can I adjust the capture settings so that the image quality matches, or at least comes closer to, the Camera app?
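One likely cause: AVCaptureVideoDataOutput delivers frames from the video pipeline (at whatever sessionPreset the session uses, typically a video preset), while the Camera app captures through the still-photo pipeline, which uses the full sensor resolution and extra processing. A sketch of one possible approach, assuming the same `session` object as above (the `stillImageOutput` variable name is illustrative; AVCaptureStillImageOutput matches the iOS 7-era AVCam API implied by `AVCamViewController`, and is superseded by AVCapturePhotoOutput on iOS 10+):

// 1) Request photo-quality capture instead of the default video preset.
if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    session.sessionPreset = AVCaptureSessionPresetPhoto;
}

// 2) For stills, capture through AVCaptureStillImageOutput rather than
//    grabbing video frames from the sample-buffer delegate.
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
if ([session canAddOutput:stillImageOutput]) {
    [session addOutput:stillImageOutput];
}

// 3) Trigger a capture; the completion handler receives a photo-pipeline frame.
AVCaptureConnection *connection =
    [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (sampleBuffer) {
            NSData *jpegData =
                [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            UIImage *image = [UIImage imageWithData:jpegData];
            // Use `image` here.
        }
    }];

If you must keep the video-frame delegate (e.g. for live processing), setting AVCaptureSessionPresetPhoto alone may already raise the frame resolution, but the video pipeline will still apply less processing than a still capture.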