
        Implementing a Broadcast Upload Extension with a SampleHandler.swift and invoking it with a BroadcastPicker works fine. However, when I try to find out how to process the samples coming into processSampleBuffer, the trail goes cold. Does anyone know how to upload the samples to a streaming server?

         

        In a nutshell: How does one convert sample data (CMSampleBuffer) to a standard format, for audio and video, ready to send to a server?

        Apple's WWDC code suggests:

         

        // session: add frame (session is an existing VTCompressionSession; requires VideoToolbox)
        let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let pts: CMTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        VTCompressionSessionEncodeFrame(session!,
                                        imageBuffer: imageBuffer,
                                        presentationTimeStamp: pts,
                                        duration: CMTime.invalid,
                                        frameProperties: nil,
                                        sourceFrameRefcon: nil,
                                        infoFlagsOut: nil)

         

        in processSampleBuffer, but the actual uploading of video and audio sample frames is not mentioned.
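
        For reference, I assume the session used above has to be created up front, for example in broadcastStarted(withSetupInfo:). Here is my rough sketch; it is not from the WWDC material, and the H.264 codec choice and the 1920x1080 dimensions are my own assumptions:

        import ReplayKit
        import VideoToolbox

        class SampleHandler: RPBroadcastSampleHandler {

            var session: VTCompressionSession?

            override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
                // Create an H.264 compression session once, before frames arrive.
                // With a nil outputCallback, frames are later encoded with the
                // outputHandler variant of VTCompressionSessionEncodeFrame.
                let status = VTCompressionSessionCreate(allocator: kCFAllocatorDefault,
                                                        width: 1920,   // assumed output size
                                                        height: 1080,
                                                        codecType: kCMVideoCodecType_H264,
                                                        encoderSpecification: nil,
                                                        imageBufferAttributes: nil,
                                                        compressedDataAllocator: nil,
                                                        outputCallback: nil,
                                                        refcon: nil,
                                                        compressionSessionOut: &session)
                guard status == noErr, let session = session else { return }

                // Real-time mode seems right for a live broadcast.
                VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
            }
        }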

         

        override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        
            switch sampleBufferType {
            case RPSampleBufferType.video:
                // Handle video sample buffer
        
                // What now? (see the video sketch after this function)
        
                break
            case RPSampleBufferType.audioApp:
                // Handle audio sample buffer for app audio
        
                // What now? (see the audio sketch after this function)
        
                break
            case RPSampleBufferType.audioMic:
                // Handle audio sample buffer for mic audio
        
                // What now? (see the audio sketch after this function)
        
                break
            @unknown default:
                // Handle other sample buffer types
                fatalError("Unknown type of sample buffer")
            }
        
        }
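
        My best guess so far, in case it sharpens the question: for the video case, encode the buffer with the outputHandler variant of VTCompressionSessionEncodeFrame, then pull the raw bytes out of the encoded CMSampleBuffer. Below is a sketch; sendToServer is a hypothetical placeholder for whatever transport the upload would actually use:

        import CoreMedia
        import VideoToolbox

        // Hypothetical upload hook, standing in for an RTMP/WebSocket/etc. client.
        func sendToServer(_ data: Data, pts: CMTime) { /* ... */ }

        func encodeAndSend(_ sampleBuffer: CMSampleBuffer, with session: VTCompressionSession) {
            guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

            VTCompressionSessionEncodeFrame(session,
                                            imageBuffer: imageBuffer,
                                            presentationTimeStamp: pts,
                                            duration: CMTime.invalid,
                                            frameProperties: nil,
                                            infoFlagsOut: nil) { status, _, encodedBuffer in
                // The encoded frame arrives as a CMSampleBuffer whose data buffer
                // holds length-prefixed (AVCC) H.264 NAL units.
                guard status == noErr,
                      let encodedBuffer = encodedBuffer,
                      let dataBuffer = CMSampleBufferGetDataBuffer(encodedBuffer) else { return }

                var totalLength = 0
                var dataPointer: UnsafeMutablePointer<CChar>?
                guard CMBlockBufferGetDataPointer(dataBuffer,
                                                  atOffset: 0,
                                                  lengthAtOffsetOut: nil,
                                                  totalLengthOut: &totalLength,
                                                  dataPointerOut: &dataPointer) == noErr,
                      let dataPointer = dataPointer else { return }

                sendToServer(Data(bytes: dataPointer, count: totalLength), pts: pts)
            }
        }

        Even if that much is right, I assume the parameter sets (SPS/PPS) still need to be read from the buffer's CMVideoFormatDescription, and the AVCC length prefixes converted to Annex B start codes, depending on what the server expects. That is the part I cannot find documented anywhere.

        For the two audio cases, the furthest I have got is copying the raw PCM bytes out of the buffer; presumably they would then need an AAC encode (AudioConverter?) before most streaming servers would accept them. Again, only a sketch:

        import CoreMedia

        // Copy the raw LPCM bytes out of an audio CMSampleBuffer.
        func pcmData(from sampleBuffer: CMSampleBuffer) -> Data? {
            var audioBufferList = AudioBufferList()
            var blockBuffer: CMBlockBuffer?

            let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
                sampleBuffer,
                bufferListSizeNeededOut: nil,
                bufferListOut: &audioBufferList,
                bufferListSize: MemoryLayout<AudioBufferList>.size,
                blockBufferAllocator: nil,
                blockBufferMemoryAllocator: nil,
                flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                blockBufferOut: &blockBuffer)
            guard status == noErr else { return nil }

            // A single interleaved buffer is assumed here; deinterleaved audio
            // would need every entry in the list walked, not just the first.
            let buffer = audioBufferList.mBuffers
            guard let rawData = buffer.mData else { return nil }
            return Data(bytes: rawData, count: Int(buffer.mDataByteSize))
        }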

         

        Any help would be much appreciated.