Automatically take a picture when a certain pattern is found in the camera preview

Hello... we are just getting started with a project and would like a little help getting pointed in the right direction.

We would like to create an app that, when the camera is pointed at a surface, scans for a certain pattern and, when it sees it, takes a picture (sort of like one of those check-scanning bank apps, but with a unique pattern involving fiducials).

So we would need to analyze the live preview data for specific features, and then save the snapshot when we see the feature we are looking for.

Is this (a) possible, and (b) is there a sample project that does this, in part or whole?

Thanks very much!


Replies

Hello,

What you are describing is possible. As long as you can identify when the object you are trying to detect is in the frame, you can trigger a capture. For that you would need an ML model that performs object detection and provides a bounding box for the object in the frame. You could create such a model using the Object Detection template in Create ML.
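On the app side, whatever the model ends up looking like, you will also need a stream of camera frames to feed it and a way to take the still once the pattern shows up. Here is a minimal AVFoundation sketch of that pipeline; the `onFrame` callback and the error handling are just placeholders for illustration:

```swift
import AVFoundation

/// Minimal camera pipeline: a video data output feeds pixel buffers to the
/// detector, and a photo output captures the still once the pattern is found.
final class CameraController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let photoOutput = AVCapturePhotoOutput()
    private let frameQueue = DispatchQueue(label: "camera.frames")

    // Called for every frame; hand the pixel buffer to Vision here.
    var onFrame: ((CVPixelBuffer) -> Void)?

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo

        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else {
            throw NSError(domain: "Camera", code: -1)
        }
        session.addInput(input)

        videoOutput.alwaysDiscardsLateVideoFrames = true
        videoOutput.setSampleBufferDelegate(self, queue: frameQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

        session.commitConfiguration()
    }

    // AVCaptureVideoDataOutputSampleBufferDelegate
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        onFrame?(pixelBuffer)
    }

    // Call this when the detector reports that the pattern is in frame.
    func takePhoto(delegate: AVCapturePhotoCaptureDelegate) {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: delegate)
    }
}
```

You would start the session on a background queue and point `onFrame` at whatever runs the Vision request on each frame.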

Once you have your ML model, you would then use Vision to run it on the pixel buffers you are receiving from the camera. See this example of running a VNCoreMLRequest on a camera preview: https://developer.apple.com/documentation/vision/classifying_images_with_vision_and_core_ml
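As a rough sketch of that piece, assuming the Create ML model is compiled into the app as a class named `FiducialDetector` that labels the pattern `"fiducial"` (both names are placeholders, as are the 0.8 confidence threshold and the `.right` orientation), the per-frame check could look something like this:

```swift
import CoreML
import Vision

/// Runs the object detection model on each camera frame and reports when the
/// target pattern is seen with enough confidence.
final class PatternDetector {
    private let request: VNCoreMLRequest
    private let onPatternFound: () -> Void

    init(onPatternFound: @escaping () -> Void) throws {
        // "FiducialDetector" stands in for the class Xcode generates from the
        // Create ML model; substitute the real model name here.
        let coreMLModel = try FiducialDetector(configuration: MLModelConfiguration()).model
        self.request = VNCoreMLRequest(model: try VNCoreMLModel(for: coreMLModel))
        self.onPatternFound = onPatternFound
    }

    /// Call this with each pixel buffer delivered by the video data output.
    func analyze(_ pixelBuffer: CVPixelBuffer) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right)
        do {
            try handler.perform([request])
        } catch {
            return
        }

        // Object detection models surface their results as
        // VNRecognizedObjectObservation: a bounding box plus ranked labels.
        let observations = request.results as? [VNRecognizedObjectObservation] ?? []
        let patternInFrame = observations.contains { observation in
            observation.labels.contains { $0.identifier == "fiducial" && $0.confidence > 0.8 }
        }
        if patternInFrame {
            onPatternFound()
        }
    }
}
```

Wiring it to the camera sketch above could be as simple as `camera.onFrame = { detector.analyze($0) }`, with the photo capture triggered from `onPatternFound` (debounced so it only fires once per sighting).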