
How to share the 'back facing' iOS camera while the Eye Tracking app needs the 'front facing' camera?
While using my Xmas present of a new iPhone running iOS 18.2, I figured I'd try the Eye Tracking app. I've been working with clients successfully using Tobii and other existing eye trackers, and in my limited tests Apple has room for improvement.

My main issue is that the camera app cannot be used at the same time as the Eye Tracking app. I get an error popup from Apple: "Camera in use by another app". The image below is from my app showing that popup message, but the same error occurs with the installed Camera app. This error is from Apple, not my app.

For terminology: the 'front' camera is the one pointing at the user (the selfie camera), while the 'back' camera is the main one with multiple lenses. Eye tracking needs the 'front' camera.

It seems that when an app uses the camera, it takes over both the front- and back-facing cameras (since you might swap between them). Thus another app, in particular Eye Tracking, cannot use just the front-facing camera at the same time. That limits the use of Eye Tracking: one cannot take pictures, or click any buttons, in an app that uses the camera.

Does anyone know of a way for an app to not take over both front and back cameras at the same time? If I could separate them, Eye Tracking could use the front camera while the camera app uses the back camera.
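For reference, here is a minimal sketch (placeholder names, untested assumption) of the kind of session setup I mean: it only ever asks for the back camera, yet the front camera still seems to be reserved while it runs.

import AVFoundation

// Sketch: a capture session that references only the back wide-angle
// camera. Whether iOS still reserves the front camera with a session
// configured this way is exactly what I'm unsure about.
final class BackCameraOnlyCapture {
    let session = AVCaptureSession()

    func configure() throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // Ask for the back camera explicitly; the front camera is
        // never mentioned anywhere in this session.
        guard let backCamera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                       for: .video,
                                                       position: .back) else {
            throw NSError(domain: "BackCameraOnlyCapture", code: 1, userInfo: nil)
        }

        let input = try AVCaptureDeviceInput(device: backCamera)
        if session.canAddInput(input) {
            session.addInput(input)
        }

        let photoOutput = AVCapturePhotoOutput()
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
    }
}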
Replies: 1 · Boosts: 0 · Views: 169 · Activity: 3w
"Unexpected Failure" compiling with Xcode and code generated with CocoaPods
I've been using CocoaPods.org very successfully for a number of years. My applications are almost all Objective-C, but sometimes I include Swift. Just recently, with the upgrade to macOS Sonoma 14.2.1, almost all of my Xcode 15.1 builds are getting the error:

Unexpected Failure

I tracked this down to a file, generated by CocoaPods whenever Swift is included, with a name like:

./Pods/Target Support Files/Pods-APPLICATION/Pods-APPLICATION-resources.sh

This is the shell script segment that is generating the error:

function on_error {
  echo "$(realpath -mq "${0}"):$1: error: Unexpected failure"
}
trap 'on_error $LINENO' ERR

The "Unexpected failure" is occurring in this portion of the script itself, because realpath -mq is a wrong way of invoking realpath here: -mq is rejected as an invalid option. This might be a macOS change. But it really just masks the real reason that on_error was called in the first place.

All of my applications that were working are getting this same error, without my making any changes. Has anyone else encountered this recent problem? I'm stuck.
Replies: 0 · Boosts: 1 · Views: 1.1k · Activity: Dec ’23
No UI for deleting table row in tvOS
My UITableView code works great on iOS, allowing a user to select Edit and then swipe to delete a row. Unfortunately, on tvOS I cannot find an equivalent user interface for deleting a UITableView row: there is no Edit button (on the navigator menu bar), and no swipe feature. Is there something I'm missing? I saw a post saying to hold-press the volume down button, but that doesn't work. I'm using Xcode 14.0 and the latest tvOS.
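In case it helps frame the question, this is the kind of workaround I'm experimenting with instead (a sketch with placeholder data, not a built-in UI): track the focused row and offer Delete from a long press on the Siri Remote.

import UIKit

// Sketch: tvOS has no swipe-to-delete, so remember which row has focus
// and confirm deletion via an alert on a long press. `items` and the
// cell identifier are placeholders.
final class DeletableTableViewController: UITableViewController {
    private var items = ["One", "Two", "Three"]
    private var focusedRow: IndexPath?

    override func viewDidLoad() {
        super.viewDidLoad()
        let longPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(confirmDelete(_:)))
        tableView.addGestureRecognizer(longPress)
    }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        items.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell")
            ?? UITableViewCell(style: .default, reuseIdentifier: "Cell")
        cell.textLabel?.text = items[indexPath.row]
        return cell
    }

    // Remember which row currently has focus on tvOS.
    override func tableView(_ tableView: UITableView,
                            didUpdateFocusIn context: UITableViewFocusUpdateContext,
                            with coordinator: UIFocusAnimationCoordinator) {
        focusedRow = context.nextFocusedIndexPath
    }

    @objc private func confirmDelete(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began, let indexPath = focusedRow else { return }
        let alert = UIAlertController(title: "Delete \"\(items[indexPath.row])\"?",
                                      message: nil, preferredStyle: .actionSheet)
        alert.addAction(UIAlertAction(title: "Delete", style: .destructive) { _ in
            self.items.remove(at: indexPath.row)
            self.tableView.deleteRows(at: [indexPath], with: .automatic)
        })
        alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
        present(alert, animated: true)
    }
}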
Replies: 0 · Boosts: 0 · Views: 732 · Activity: Sep ’22
VNRequest internalPerformInContext has not been implemented
I've been using the Vision object tracking code from the Apple example and others. That has been in Swift, and the programs work nicely. I just ported the image recognizer to Objective-C with a few lines of code, but I get an error from the VNRequest handler:

error=Error Domain=com.apple.vis Code=8 "-[VNRequest internalPerformInContext:error:] has not been implemented"

My code is very simple, taking a UIImage and looking for objects. As I'm not processing video (yet) there wasn't a processor to initialize, but maybe there is something that needs to be initialized? Any ideas?

- (void)processImageWithVision:(UIImage *)image {
    CIImage *ciImage = image.CIImage;
    if (!ciImage)
        ciImage = [[CIImage alloc] initWithCGImage:image.CGImage];

    NSDictionary<VNImageOption, id> *dict =
        [[NSDictionary alloc] initWithObjectsAndKeys:@(0.8), @"confidenceThreshold", nil];

    // Handler to process the image
    VNImageRequestHandler *imageRequestHandler =
        [[VNImageRequestHandler alloc] initWithCIImage:ciImage
                                           orientation:kCGImagePropertyOrientationUp
                                               options:dict];

    // The request whose completion handler gets called
    VNRequest *vnRequest = [[VNRequest alloc] initWithCompletionHandler:
        ^(VNRequest * _Nonnull request, NSError * _Nullable error) {
            NSLog(@"vnRequest called %@, %@, error=%@", request, request.results, error);
        }];
    vnRequest.preferBackgroundProcessing = YES; // doesn't matter

    // Requests to pass to performRequests:error:.
    NSArray<VNRequest *> *vnRequests = [NSArray arrayWithObject:vnRequest];

    // Schedules Vision requests to be performed.
    [imageRequestHandler performRequests:vnRequests error:nil];
}
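For comparison, the working Swift version looks roughly like this (a sketch reconstructed from memory; VNClassifyImageRequest is a stand-in for the concrete request subclass the example actually uses). I notice it never instantiates VNRequest itself, which makes me wonder whether creating the abstract VNRequest base class directly in the Objective-C port is the problem.

import UIKit
import Vision

// Sketch of the Swift equivalent. Note the concrete request subclass
// (VNClassifyImageRequest as a stand-in), not the abstract VNRequest.
func processImageWithVision(_ image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNClassifyImageRequest { request, error in
        print("request finished: \(String(describing: request.results)), "
              + "error: \(String(describing: error))")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage,
                                        orientation: .up,
                                        options: [:])
    do {
        try handler.perform([request])
    } catch {
        print("perform failed: \(error)")
    }
}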
Replies: 1 · Boosts: 0 · Views: 1.2k · Activity: Apr ’21