
Object Detection Model Preview and Model in App Produce Different Results on the Same Image
Hello, I have an object detection model that I integrated into an app. When I put an image into the preview for the object detection model file, it classifies the image correctly. However, if I put the same image into the app, it classifies it differently, with different confidence values. I am confused about how this can happen. Here is my code:

```swift
import UIKit
import CoreML
import Vision
import ImageIO

class SecondViewController: UIViewController, UINavigationControllerDelegate {

    @IBOutlet weak var photoImageView: UIImageView!
    @IBOutlet weak var results: UILabel!

    lazy var detectionRequest: VNCoreMLRequest = {
        do {
            let model = try VNCoreMLModel(for: EarDetection2().model)
            let request = VNCoreMLRequest(model: model, completionHandler: { [weak self] request, error in
                self?.processDetections(for: request, error: error)
            })
            request.imageCropAndScaleOption = .scaleFit
            return request
        } catch {
            fatalError("Failed to load Vision ML model: \(error)")
        }
    }()

    @IBAction func testPhoto(_ sender: UIButton) {
        let vc = UIImagePickerController()
        vc.sourceType = .photoLibrary
        vc.delegate = self
        present(vc, animated: true)
    }

    func updateDetections(for image: UIImage) {
        let orientation = CGImagePropertyOrientation(rawValue: UInt32(image.imageOrientation.rawValue))
        guard let ciImage = CIImage(image: image) else {
            fatalError("Unable to create \(CIImage.self) from \(image).")
        }
        DispatchQueue.global(qos: .userInitiated).async {
            let handler = VNImageRequestHandler(ciImage: ciImage, orientation: orientation!)
            do {
                try handler.perform([self.detectionRequest])
            } catch {
                print("Failed to perform detection.\n\(error.localizedDescription)")
            }
        }
    }

    func processDetections(for request: VNRequest, error: Error?) {
        DispatchQueue.main.async {
            guard let results = request.results else {
                print("Unable to detect anything.\n\(error!.localizedDescription)")
                return
            }
            let detections = results as! [VNRecognizedObjectObservation]
            self.drawDetectionsOnPreview(detections: detections)
        }
    }

    func drawDetectionsOnPreview(detections: [VNRecognizedObjectObservation]) {
        guard let image = self.photoImageView?.image else { return }
        let imageSize = image.size
        let scale: CGFloat = 0
        UIGraphicsBeginImageContextWithOptions(imageSize, false, scale)
        for detection in detections {
            image.draw(at: CGPoint.zero)
            print(detection.labels.map({ "\($0.identifier) confidence: \($0.confidence)" }).joined(separator: "\n"))
            print("------------")
            results.text = detection.labels.map({ "\($0.identifier) confidence: \($0.confidence)" }).joined(separator: "\n")
            // The coordinates are normalized to the dimensions of the processed image,
            // with the origin at the image's lower-left corner.
            let boundingBox = detection.boundingBox
            let rectangle = CGRect(x: boundingBox.minX * image.size.width,
                                   y: (1 - boundingBox.minY - boundingBox.height) * image.size.height,
                                   width: boundingBox.width * image.size.width,
                                   height: boundingBox.height * image.size.height)
            UIColor(red: 0, green: 1, blue: 0, alpha: 0.4).setFill()
            UIRectFillUsingBlendMode(rectangle, CGBlendMode.normal)
        }
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        self.photoImageView?.image = newImage
    }
}

extension SecondViewController: UIImagePickerControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        picker.dismiss(animated: true)
        guard let image = info[.originalImage] as? UIImage else { return }
        self.photoImageView?.image = image
        updateDetections(for: image)
    }
}
```

I attached pictures of the model preview and the app preview (it may be hard to tell, but they are the same image). I have also attached pictures of my files and storyboard. Any help would be great! Thanks in advance!
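One thing worth double-checking in the code above (a hypothesis, not confirmed from the post): `UIImage.Orientation` and `CGImagePropertyOrientation` do not share raw values (`.up` is 0 in UIKit but 1 in ImageIO), so converting with `CGImagePropertyOrientation(rawValue:)` can hand Vision the wrong orientation, or return `nil` and crash on the force unwrap. A minimal explicit mapping looks like:

```swift
import UIKit
import ImageIO

// UIImage.Orientation and CGImagePropertyOrientation use different raw values,
// so convert case by case instead of casting the raw value.
extension CGImagePropertyOrientation {
    init(_ orientation: UIImage.Orientation) {
        switch orientation {
        case .up:            self = .up
        case .down:          self = .down
        case .left:          self = .left
        case .right:         self = .right
        case .upMirrored:    self = .upMirrored
        case .downMirrored:  self = .downMirrored
        case .leftMirrored:  self = .leftMirrored
        case .rightMirrored: self = .rightMirrored
        @unknown default:    self = .up
        }
    }
}
```

With this in place, `updateDetections(for:)` could use `let orientation = CGImagePropertyOrientation(image.imageOrientation)` and drop the force unwrap.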
0 replies · 0 boosts · 815 views · Aug ’21
Connect an iPhone to an External Camera over Wi-Fi
I have been trying to figure out how to connect an iPhone to an external camera over Wi-Fi. The phone connects to the camera's Wi-Fi network, but I am unsure how to display the camera's output in a custom-made application. Is this even possible? Any help would be appreciated. Essentially, I want to stream the video and take pictures in a custom app built in Xcode.
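Whether this works depends on what the camera actually serves over Wi-Fi, which the post does not say. As one illustrative case, many consumer Wi-Fi cameras expose an HTTP MJPEG stream; the sketch below (the URL and endpoint are assumptions, check the camera's documentation) shows how such a stream could be rendered in a UIKit view by splitting the byte stream on JPEG start/end markers:

```swift
import UIKit

// Sketch: render an MJPEG stream (a sequence of JPEG images over HTTP) from a
// hypothetical camera endpoint into a UIImageView. Assumes the camera serves
// MJPEG; RTSP or proprietary protocols would need a different approach.
final class MJPEGStreamView: UIImageView, URLSessionDataDelegate {
    private var buffer = Data()
    private lazy var session = URLSession(configuration: .default,
                                          delegate: self, delegateQueue: nil)

    func start(url: URL) {
        session.dataTask(with: url).resume()
    }

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask,
                    didReceive data: Data) {
        buffer.append(data)
        // Scan for complete JPEG frames: start marker FFD8, end marker FFD9.
        while let start = buffer.range(of: Data([0xFF, 0xD8])),
              let end = buffer.range(of: Data([0xFF, 0xD9]),
                                     in: start.lowerBound..<buffer.endIndex) {
            let frame = buffer.subdata(in: start.lowerBound..<end.upperBound)
            buffer.removeSubrange(buffer.startIndex..<end.upperBound)
            if let image = UIImage(data: frame) {
                DispatchQueue.main.async { self.image = image }
            }
        }
    }
}

// Usage (placeholder address — the real URL depends on the camera):
// streamView.start(url: URL(string: "http://192.168.1.1/stream.mjpg")!)
```

Taking a still picture could then be as simple as saving the most recent decoded frame.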
5 replies · 0 boosts · 1.3k views · Nov ’21