Increase speed of reading barcodes?

Hi everyone. I am new to coding in Xcode. I wrote an app in Android Studio and I am porting it over to iOS. I have the app working, but I am not very happy with the barcode reading speed. I was hoping someone could suggest what I can do to speed up reading.


Here is my code, thank you guys so much.




#import "ScanViewController.h"
#import "Utils.h"
#import "LoginToken.h"
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>


#define COLOR_SUCCESS [UIColor colorWithDisplayP3Red:44.0f/255.0f green:1.0f blue:0 alpha:1.0f]
#define COLOR_ERROR [UIColor colorWithDisplayP3Red:1.0f green:0 blue:0 alpha:1.0f]


@interface ScanViewController () <AVCaptureMetadataOutputObjectsDelegate>


@property (nonatomic, readwrite) AVCaptureSession *captureSession;
@property (nonatomic, readwrite) AVCaptureVideoPreviewLayer *videoPreviewLayer;
@property (nonatomic, readwrite) UIView *qrCodeFrameView;
@property (nonatomic, readwrite) UILabel *qrCodeTextView;
@property (nonatomic, readwrite) NSArray *supportedCodeTypes;
@property (nonatomic, readwrite) long long lastScanTime;
@property (nonatomic, readwrite) NSString *lastScanCode;
@property (nonatomic, readwrite) AVAudioPlayer *audioPlayer;

@end

@implementation ScanViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.



    self.captureSession = [AVCaptureSession new];

    self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
    self.supportedCodeTypes = @[AVMetadataObjectTypeUPCECode,
                                AVMetadataObjectTypeEAN13Code,
                                AVMetadataObjectTypeEAN8Code];
//                               AVMetadataObjectTypeCode39Code,
//                               AVMetadataObjectTypeCode39Mod43Code,
//                               AVMetadataObjectTypeCode93Code,
//                               AVMetadataObjectTypeCode128Code,
//                               AVMetadataObjectTypeAztecCode,
//                               AVMetadataObjectTypePDF417Code,
//                               AVMetadataObjectTypeITF14Code,
//                               AVMetadataObjectTypeDataMatrixCode,
//                               AVMetadataObjectTypeInterleaved2of5Code,
//                               AVMetadataObjectTypeQRCode];
    self.lastScanTime = 0;

     AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if(captureDevice == nil) {
            NSLog(@"Failed to get the camera device");
            return;
        }
    
        @try {
            // Get an instance of the AVCaptureDeviceInput class using the previous device object.
            AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
        
            // Set the input device on the capture session.
            [self.captureSession addInput:input];
        
            // Initialize a AVCaptureMetadataOutput object and set it as the output device to the capture session.
            AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
            [self.captureSession addOutput:captureMetadataOutput];
        
            // Set delegate and use the default dispatch queue to execute the call back
            [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
            captureMetadataOutput.metadataObjectTypes = self.supportedCodeTypes;
            //            captureMetadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.qr]
        } @catch (NSException *error) {
            // If any error occurs, simply print it out and don't continue any more.
            NSLog(@"%@", error);
            return;
        }
    
        // Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
        self.videoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        self.videoPreviewLayer.frame = self.view.layer.bounds;
        [self.view.layer addSublayer:self.videoPreviewLayer];
    
        // Start video capture.
        [self.captureSession startRunning];
    
        // Move the result view and loading view to the front
        [self.view bringSubviewToFront:self.resultView];
        [self.view bringSubviewToFront:self.loadingView];
    
        // Initialize QR Code Frame to highlight the QR code
        self.qrCodeFrameView = [[UIView alloc] init];
        if (self.qrCodeFrameView) {
            self.qrCodeFrameView.layer.borderColor = UIColor.greenColor.CGColor;
            self.qrCodeFrameView.layer.borderWidth = 2;
            [self.view addSubview:self.qrCodeFrameView];
            [self.view bringSubviewToFront:self.qrCodeFrameView];
        }
    
        self.qrCodeTextView = [[UILabel alloc] init];
        if (self.qrCodeTextView) {
            [self.qrCodeTextView setTextColor:UIColor.greenColor];
            [self.qrCodeTextView setFont:[UIFont systemFontOfSize:20]];
            [self.qrCodeFrameView addSubview:self.qrCodeTextView];
        }
    
        [self rotateLoadingImage];
        [self setResultType:RESULT_TYPE_WORKING codeContent:@"Ready" price:0.00];
        [self.loadingView setHidden:YES];
    
    }


-(void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];

    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"notification" ofType:@"wav"];
    NSURL *soundFileUrl = [NSURL fileURLWithPath:soundFilePath];
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileUrl error:nil];

    // Check if valid verified login exists
    LoginToken *token = [LoginToken getInstance];
    [token load];
}


-(void)viewWillDisappear:(BOOL)animated {
    if (self.audioPlayer != nil) {
        [self.audioPlayer stop];
        self.audioPlayer = nil;
    }

    [super viewWillDisappear:animated];
}


/*
#pragma mark - Navigation


// In a storyboard-based application, you will often want to do a little preparation before navigation
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
    // Get the new view controller using [segue destinationViewController].
    // Pass the selected object to the new view controller.
}
*/
-(void) updatePreviewLayer:(AVCaptureConnection*)layer orientation:(AVCaptureVideoOrientation)orientation {
    layer.videoOrientation = orientation;
    self.videoPreviewLayer.frame = self.view.bounds;
}


-(void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];

    if(self.videoPreviewLayer.connection != nil) {
        UIDevice *currentDevice = [UIDevice currentDevice];
        UIDeviceOrientation orientation = [currentDevice orientation];
        AVCaptureConnection *previewLayerConnection = self.videoPreviewLayer.connection;
    
        if(previewLayerConnection.isVideoOrientationSupported) {
            switch (orientation) {
                case UIDeviceOrientationPortrait:
                    [self updatePreviewLayer:previewLayerConnection orientation:AVCaptureVideoOrientationPortrait];
                    break;
                case UIDeviceOrientationLandscapeRight:
                    [self updatePreviewLayer:previewLayerConnection orientation:AVCaptureVideoOrientationLandscapeLeft];
                    break;
                case UIDeviceOrientationLandscapeLeft:
                    [self updatePreviewLayer:previewLayerConnection orientation:AVCaptureVideoOrientationLandscapeRight];
                    break;
                case UIDeviceOrientationPortraitUpsideDown:
                    [self updatePreviewLayer:previewLayerConnection orientation:AVCaptureVideoOrientationPortraitUpsideDown];
                    break;
                default:
                    [self updatePreviewLayer:previewLayerConnection orientation:AVCaptureVideoOrientationPortrait];
                    break;
            }
        }
    }
}


-(void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    // Check if the metadataObjects array is not nil and it contains at least one object.
    if (metadataObjects.count == 0) {
        self.qrCodeFrameView.frame = CGRectZero;
        return;
    }

    // Get the metadata object.
    AVMetadataMachineReadableCodeObject *metadataObj = (AVMetadataMachineReadableCodeObject*)(metadataObjects[0]);


    if ([self.supportedCodeTypes containsObject:metadataObj.type]) {
        // If the found metadata is equal to the QR code metadata (or barcode) then update the status label's text and set the bounds
        AVMetadataObject *barCodeObject = [self.videoPreviewLayer transformedMetadataObjectForMetadataObject:metadataObj];
        NSString *code = metadataObj.stringValue;
    
        if (code != nil) {
            // check upc a code
            if ([self checkUpcACode:metadataObj.type code:code] == NO) {
                self.qrCodeTextView.text = @"";
                return;
            }
        
            // Advance past leading zeros (keeping the last one).
            int i = 0;
            for (i = 0; i < code.length; i++) {
                char ch = [code characterAtIndex:i];
                if (ch != '0') break;
            }
            if (i > 0) i--;
            code = [code substringFromIndex:i];
        
            self.qrCodeFrameView.frame = barCodeObject.bounds;
            [self.qrCodeTextView setText:code];
            self.qrCodeTextView.frame = CGRectMake(0, self.qrCodeFrameView.frame.size.height-20, self.qrCodeFrameView.frame.size.width, 20);
            NSLog(@"%@", code);
        
            [self handleBarcode:code];
        } else {
            self.qrCodeTextView.text = @"";
        }
    }
}


-(BOOL)checkUpcACode:(AVMetadataObjectType)type code:(NSString*)code {
    // AVMetadataObjectType is a string constant, so compare with isEqualToString:
    // rather than ==.
    if ([type isEqualToString:AVMetadataObjectTypeEAN13Code]) {
        if ([code length] > 0 && [code hasPrefix:@"0"]) {
            return YES;
        }
    }
    return NO;
}


-(void)handleBarcode:(NSString*)code {
    // check duration for scan
    long long timeSinceLastScan = [Utils currentMilliTime] - self.lastScanTime;
    if (timeSinceLastScan < 1000 * 10 && [code isEqualToString:self.lastScanCode]) {
        return;
    }
    NSLog(@"time since last scan : %lld", timeSinceLastScan);
    self.lastScanTime = [Utils currentMilliTime];
    self.lastScanCode = [NSString stringWithFormat:@"%@", code];

    long long nCode = [code longLongValue];
    if (nCode > 0) {
        [self setResultType:RESULT_TYPE_WORKING codeContent:@"Working" price:0.00];
        [self.mApi getPrice:nCode handler:^(bool ok, long long code, double value) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self handlePriceResult:ok code:code price:value];
            });
        }];
    } else {
        [self handleScanFeedback:RESULT_TYPE_ERROR];
        [self setResultType:RESULT_TYPE_ERROR codeContent:@"Failed to parse code" price:0.00];
    }
}


-(void)rotateLoadingImage {
    [UIView animateWithDuration:2 delay:0 options:UIViewAnimationOptionCurveLinear animations:^{
        [self.loadingImageView setTransform:CGAffineTransformMakeRotation(M_PI)];
    } completion:^(BOOL finished){
        [self rotateLoadingImageAgain];
    }];
}


-(void)rotateLoadingImageAgain {
    [UIView animateWithDuration:2 delay:0 options:UIViewAnimationOptionCurveLinear animations:^{
        [self.loadingImageView setTransform:CGAffineTransformMakeRotation(M_PI * 2)];
    } completion:^(BOOL finished){
        [self rotateLoadingImage];
    }];
}


-(void)setResultType:(ResultType)type codeContent:(NSString*)codeContent price:(double)price {
    switch (type) {
        case RESULT_TYPE_DEFAULT:
        {
            [self.loadingView setHidden:YES];
            [self.resultView setBackgroundColor:[UIColor whiteColor]];
            [self.resultCodeDisplay setText:@"Ready"];
            [self.resultPriceDisplay setText:@"$ 0.00"];
            [self.resultIconDisplay setImage:[UIImage imageNamed:@"ic_default_24dp"]];
            break;
        }
        case RESULT_TYPE_NOT_FOUND:
        {
            [self.loadingView setHidden:YES];
            [self.resultView setBackgroundColor:[UIColor whiteColor]];
            [self.resultCodeDisplay setText:[NSString stringWithFormat:@"%@ - not found try rescanning", codeContent]];
            [self.resultPriceDisplay setText:@"$ 0.00"];
            [self.resultIconDisplay setImage:[UIImage imageNamed:@"ic_not_found_24dp"]];
            break;
        }
        case RESULT_TYPE_SUCCESS_NORMAL:
        {
            [self.loadingView setHidden:YES];
            [self.resultView setBackgroundColor:[UIColor whiteColor]];
            [self.resultCodeDisplay setText:codeContent];
            [self.resultPriceDisplay setText:[NSString stringWithFormat:@"$ %.02f", price]];
            [self.resultIconDisplay setImage:[UIImage imageNamed:@"ic_default_24dp"]];
            break;
        }
        case RESULT_TYPE_SUCCESS_TARGET:
        {
            [self.loadingView setHidden:YES];
            [self.resultView setBackgroundColor:COLOR_SUCCESS];
            [self.resultCodeDisplay setText:codeContent];
            [self.resultPriceDisplay setText:[NSString stringWithFormat:@"$ %.02f", price]];
            [self.resultIconDisplay setImage:[UIImage imageNamed:@"ic_success_24dp"]];
            break;
        }
        case RESULT_TYPE_ERROR:
        {
            [self.loadingView setHidden:YES];
            [self.resultView setBackgroundColor:COLOR_ERROR];
            [self.resultCodeDisplay setText:@"Error"];
            [self.resultPriceDisplay setText:@"$ 0.00"];
            [self.resultIconDisplay setImage:[UIImage imageNamed:@"ic_error_black_24dp"]];
            break;
        }
        case RESULT_TYPE_WORKING:
        {
            [self.loadingView setHidden:NO];
            break;
        }
        default:
            break;
    }
}


-(void)handleScanFeedback:(ResultType)type {
    switch (type) {
        case RESULT_TYPE_SUCCESS_TARGET:
        {
            if (self.audioPlayer != nil)
                [self.audioPlayer play];
            AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
            break;
        }
        case RESULT_TYPE_SUCCESS_NORMAL:
        case RESULT_TYPE_ERROR:
        case RESULT_TYPE_NOT_FOUND:
        {
            AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
            break;
        }
        default:
            break;
    }
}

@end

Accepted Reply

try:

// change
[self handleBarcode:code];

// to
dispatch_async(dispatch_get_main_queue(), ^{
    [self handleBarcode:code];
});

Replies

TMI.

One cause of a "too slow" response is that the app is updating the UI from a blocked thread. A blocked thread can submit its view changes, but the view doesn't actually refresh until that thread is unblocked. You get around that by:



  
// code placed here will not show up on the device until the thread it is on gets unblocked
//therefore -----
  dispatch_async(dispatch_get_main_queue(), ^{
        
       //  place in here code that displays something to the user.  
       //     This code will be displayed even though the other thread is blocked  
    });



EDIT:

Note, per the documentation:

https://developer.apple.com/documentation/avfoundation/avcapturevideodataoutputsamplebufferdelegate/1385775-captureoutput?language=objc

"This method is called on the dispatch queue specified by the output's sampleBufferCallbackQueue property."


I suspect the solution is above.
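
One way to put that advice together (a sketch only, reusing the class and property names from the code above; the queue label is made up): deliver metadata callbacks on a dedicated serial queue instead of the main queue, and dispatch only the UI work back to main.

```objc
// In viewDidLoad, replace the main-queue delegate setup with a dedicated
// serial queue so metadata processing never competes with UI drawing:
dispatch_queue_t metadataQueue =
    dispatch_queue_create("com.example.scanner.metadata", DISPATCH_QUEUE_SERIAL);
[captureMetadataOutput setMetadataObjectsDelegate:self queue:metadataQueue];

// Then, inside captureOutput:didOutputMetadataObjects:fromConnection:,
// hop back to the main queue for anything that touches views:
dispatch_async(dispatch_get_main_queue(), ^{
    self.qrCodeFrameView.frame = barCodeObject.bounds;
    [self.qrCodeTextView setText:code];
    [self handleBarcode:code];
});
```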

Thank you so much for your reply. I am such a noob at coding in Xcode; where would I place this in my code above to try it out?


I did forget to add one thing. It seems to POST to the API pretty fast, but it seems very slow at actually picking up the barcode itself. I think it may have something to do with a focus setting? I am also using a very antique iPhone 5c for testing (it's my first iPhone ever).



I really appreciate your help.
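
On the slow pickup: older hardware can spend a long time hunting for focus. If the barcode is always held close to the phone, restricting the autofocus range may help it lock faster. A sketch, assuming `captureDevice` is the AVCaptureDevice obtained in viewDidLoad; each capability is guarded because older devices may not support it:

```objc
// Restrict autofocus to the near range for close-up barcode scanning.
NSError *lockError = nil;
if ([captureDevice lockForConfiguration:&lockError]) {
    if (captureDevice.isAutoFocusRangeRestrictionSupported) {
        captureDevice.autoFocusRangeRestriction =
            AVCaptureAutoFocusRangeRestrictionNear;
    }
    if ([captureDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        captureDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
    }
    [captureDevice unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", lockError);
}
```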

try:

// change
[self handleBarcode:code];

// to
dispatch_async(dispatch_get_main_queue(), ^{
    [self handleBarcode:code];
});

Holy crap!! you are a genius! It's faster than the Android version on an old 5c! I cannot thank you enough!

Pay it forward.