Posts

Post not yet marked as solved
1 Reply
317 Views
Hello. I'm having problems with a narrow video aspect ratio displayed on an iPad Air. In an outlet labeled ImagePreviewOutlet I want to display a video feed from the camera facing the user, so the user will see their own face. I did all the necessary work in the storyboard and in the code, and I got what I wanted, except it is in the wrong aspect ratio. The image doesn't fill the screen of the iPad even though the dimensions of the ImagePreviewOutlet do. It's narrow, showing white space on either side of the video. I want a wider aspect ratio. I understand that the iPad camera supports two ratios: 4:3 and 16:9. Can anybody help me? I can supply the code that I'm using, but I don't want my first post to be overwhelming. Thank you for your time. JR
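One common fix for letterboxing like this (a sketch, not code from the post itself, and assuming the preview is drawn by an AVCaptureVideoPreviewLayer attached to ImagePreviewOutlet) is to set the layer's videoGravity so it crops to fill the view instead of fitting inside it:

```objc
// Assumes self.captureSession is already configured and running.
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
previewLayer.frame = self.ImagePreviewOutlet.bounds;

// Fill the view, cropping the edges of the 4:3 feed
// instead of letterboxing it with white bars.
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[self.ImagePreviewOutlet.layer addSublayer:previewLayer];
```

The default gravity is AVLayerVideoGravityResizeAspect, which preserves the camera's aspect ratio and produces exactly the side bars described above; AspectFill trades those bars for a slight crop.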
Post not yet marked as solved
0 Replies
189 Views
In an app I am working on that runs on an iPad Air, the camera facing the user turns on, and a live video feed of the user is placed in an outlet called ImagePreviewOutlet. It works, but the aspect ratio is too narrow, leaving two white spaces on the left and right of the video. I want the aspect ratio to be wider so the video fills the whole of the iPad. Here is the code for the three methods that handle the video:

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Convert CMSampleBufferRef to CIImage
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Calculate the fitting rect based on 4:3 aspect ratio
    CGFloat targetAspectRatio = 4.0 / 3.0; // 4:3 aspect ratio
    CGFloat targetWidth = CGRectGetHeight(self.ImagePreviewOutlet.bounds) * targetAspectRatio;
    CGFloat targetX = (CGRectGetWidth(self.ImagePreviewOutlet.bounds) - targetWidth) / 2.0;
    CGRect fittingRect = CGRectMake(targetX, 0, targetWidth, CGRectGetHeight(self.ImagePreviewOutlet.bounds));

    // Apply filters (if needed)

    // Resize the CIImage to fit the ImagePreviewOutlet
    CIImage *resizedImage = [ciImage imageByApplyingTransform:CGAffineTransformMakeScale(fittingRect.size.width / CGRectGetWidth(ciImage.extent), fittingRect.size.height / CGRectGetHeight(ciImage.extent))];

    // Render the CIImage into a CGImage
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:resizedImage fromRect:resizedImage.extent];

    // Create a UIImage from the CGImage
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    // Update the UI on the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        self.ImagePreviewOutlet.image = image; // Replace with your UI element
    });

    // Release resources
    CGImageRelease(cgImage);
}

- (void)showImagePreviewOutlet {
    self.ImagePreviewOutlet.hidden = NO;

    // Create a capture session
    self.captureSession = [[AVCaptureSession alloc] init];

    // Configure the capture device (use front camera)
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionFront];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
    [self.captureSession addInput:input];

    // Create a preview layer to display the live video
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    previewLayer.frame = self.ImagePreviewOutlet.bounds;
    [self.ImagePreviewOutlet.layer addSublayer:previewLayer];

    // Start the capture session on a background thread
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self.captureSession startRunning];
    });

    // Schedule a timer to stop the capture session after 10 seconds
    [NSTimer scheduledTimerWithTimeInterval:10 target:self selector:@selector(hideImagePreviewOutlet) userInfo:nil repeats:NO];
}

- (void)setupCaptureSession {
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (input) {
        if ([captureSession canAddInput:input]) {
            [captureSession addInput:input];
        }
        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        if ([captureSession canAddOutput:videoOutput]) {
            [captureSession addOutput:videoOutput];
        }
        [captureSession startRunning];
    } else {
        NSLog(@"Error setting up capture session: %@", error.localizedDescription);
    }
}

I want the video to be wider. Any ideas? JR
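A sketch of one possible fix, assuming the session and the previewLayer created in showImagePreviewOutlet above. AVCaptureSessionPreset1280x720 is a standard 16:9 preset; whether it alone removes the side bars depends on the device, so combining it with a crop-to-fill videoGravity covers both cases:

```objc
// Ask the session for a 16:9 format instead of the default 4:3 one.
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
}

// Make the preview layer crop to fill its bounds rather than letterbox.
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
```

With AspectFill, any remaining mismatch between the feed's ratio and the view's ratio is cropped away instead of being padded with white space.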
Post not yet marked as solved
0 Replies
182 Views
I'm using Ventura 13.3.1 with Xcode 14.3. Working on my iPad app, everything is going smoothly, but when I test it on my iPad, everything works just fine except that the graphic representing my app has been replaced on the home screen by Apple's default blank icon. My graphic is in the Appleton.appiconset folder, and that folder is in the folder with my app's name. I'm sure this problem is easy to solve. Thank you for your time. JR
Post not yet marked as solved
5 Replies
2.7k Views
I hope I can talk about this subject on this Apple forum. As I learned more and more about programming and app development, I became aware of a thing called GitHub. I looked at it and read about it, but I am still not sure what good it is to people like us. I think it is a place where we can post all of the files of a project's code so other app developers can scrutinize the code written. For example, I know it would be impossible to display all the code of all the classes that go into my current comic book app project here. But if I could do just that somewhere else on the web and then post a link here in the Apple forums, it would be very handy! Can someone here enlighten me on this? JR
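To make that workflow concrete, publishing a project to GitHub from the command line looks roughly like this (the repository URL and names below are hypothetical placeholders, not from the post):

```shell
# Run inside the folder that holds the Xcode project:
git init                                  # turn the folder into a Git repository
git add .                                 # stage every file in the project
git commit -m "Initial commit of comic book app"

# Create an empty repository on github.com first, then connect and upload:
git remote add origin https://github.com/your-username/comic-book-app.git
git push -u origin main
```

After the push, the repository's web address can be pasted into a forum post, and readers can browse every class in the project without any code being pasted inline.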