Recording a Short Video on iOS

Time: 2021-04-30
  • Requirement

    Our project uses hybrid development, and shooting short videos on the uni-app side is far from ideal. To get close to the WeChat recording experience, we turn to a native plugin.

  • Approach

    Step 1: one AVCaptureSession and one AVCaptureVideoPreviewLayer

    Step 2: video & audio recording each require a corresponding AVCaptureDeviceInput; likewise an AVCaptureVideoDataOutput and an AVCaptureAudioDataOutput

    Step 3: in the delegate, use the output to distinguish video from audio, and write the corresponding CMSampleBufferRef to the video file

    Step 4: to write the video file, use an AVAssetWriter; video & audio need two AVAssetWriterInputs, which are added to the AVAssetWriter

    Step 5: CMSampleBufferRefs keep arriving and the AVAssetWriter keeps writing until recording stops

  • Implementation

    The initialization from Step 1 is not written out here; if anything is unclear, see my earlier blog posts.
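
    For completeness, a minimal sketch of that Step 1 setup (the property names self.session and self.previewLayer are assumptions):

    //Capture session
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetHigh;
    
    //Preview layer filling the host view
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    self.previewLayer.frame = self.view.bounds;
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.previewLayer];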

    Step 2: create the two AVCaptureDeviceInputs and the two outputs, and set the delegate on each output

    //Video device & input
    NSError *error = nil;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (error) {
        NSLog(@"Error creating video input from device, reason: %@", error);
        return;
    }
    
    //Add the video input to the session
    if ([self.session canAddInput:self.videoInput]) {
        [self.session addInput:self.videoInput];
    }
    
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.session canAddOutput:self.videoOutput]) {
        [self.session addOutput:self.videoOutput];
    }
    
    //Audio input & output
    AVCaptureDevice *adevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    self.audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:adevice error:&error];
    
    if ([self.session canAddInput:self.audioInput]) {
        [self.session addInput:self.audioInput];
    }
    
    [self.audioOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.session canAddOutput:self.audioOutput]) {
        [self.session addOutput:self.audioOutput];
    }
    
    //Video output
    - (AVCaptureVideoDataOutput *)videoOutput {
        if (!_videoOutput) {
            _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
            _videoOutput.alwaysDiscardsLateVideoFrames = YES;
        }
        return _videoOutput;
    }
    
    //Audio output
    - (AVCaptureAudioDataOutput *)audioOutput {
        if (!_audioOutput) {
            _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        }
        return _audioOutput;
    }
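
    The self.videoQueue handed to both delegates above is not shown; a sketch, assuming a lazily created serial queue (the queue label is only illustrative):

    //Serial queue shared by the video & audio sample-buffer delegates
    - (dispatch_queue_t)videoQueue {
        if (!_videoQueue) {
            _videoQueue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
        }
        return _videoQueue;
    }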

    Step 3: start the session and handle the CMSampleBufferRef in the delegate
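
    Starting the session is a single call; a sketch (startRunning blocks until the session is up, so it is dispatched off the main thread, reusing self.videoQueue here):

    dispatch_async(self.videoQueue, ^{
        [self.session startRunning];
    });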

    #pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate & AVCaptureAudioDataOutputSampleBufferDelegate
    - (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        @autoreleasepool {
            //Video
            if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
                if (!self.manager.outputVideoFormatDescription) {
                    @synchronized(self) {
                        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                        self.manager.outputVideoFormatDescription = formatDescription;
                    }
                } else {
                    @synchronized(self) {
                        if (self.manager.state == StateRecording) {
                            [self.manager appendBuffer:sampleBuffer type:AVMediaTypeVideo];
                        }
                    }
                }
            }
            
            //Audio
            if (connection == [self.audioOutput connectionWithMediaType:AVMediaTypeAudio]) {
                if (!self.manager.outputAudioFormatDescription) {
                    @synchronized(self) {
                        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                        self.manager.outputAudioFormatDescription = formatDescription;
                    }
                }
                @synchronized(self) {
                    if (self.manager.state == StateRecording) {
                        [self.manager appendBuffer:sampleBuffer type:AVMediaTypeAudio];
                    }
                }
            }
        }
    }

    Step 4: the AVAssetWriter and its corresponding inputs

    //Writer initialization
    self.writer = [AVAssetWriter assetWriterWithURL:_videoUrl fileType:AVFileTypeMPEG4 error:nil];
    
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:_videoSettings];
    //expectsMediaDataInRealTime must be YES, because the data arrives from the capture session in real time
    _videoInput.expectsMediaDataInRealTime = YES;
    
    _audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:_audioSettings];
    _audioInput.expectsMediaDataInRealTime = YES;
    
    if ([_writer canAddInput:_videoInput]) {
        [_writer addInput:_videoInput];
    }
    if ([_writer canAddInput:_audioInput]) {
        [_writer addInput:_audioInput];
    }
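
    The _videoSettings / _audioSettings dictionaries are not shown above; a sketch with plausible values (resolution, bit rates and frame rate are illustrative and should be tuned per project, as noted at the end):

    //Video: H.264, 720x1280, with bit-rate & frame-rate hints
    NSDictionary *videoSettings = @{
        AVVideoCodecKey : AVVideoCodecH264,
        AVVideoWidthKey : @720,
        AVVideoHeightKey : @1280,
        AVVideoCompressionPropertiesKey : @{
            AVVideoAverageBitRateKey : @(720 * 1280 * 3),
            AVVideoExpectedSourceFrameRateKey : @30,
        },
    };
    
    //Audio: AAC, mono, 44.1 kHz
    NSDictionary *audioSettings = @{
        AVFormatIDKey : @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey : @1,
        AVSampleRateKey : @44100,
        AVEncoderBitRateKey : @64000,
    };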

    Step 5: the CMSampleBufferRef from Step 3 is written to the video file through the AVAssetWriter

    - (void)appendBuffer:(CMSampleBufferRef)buffer type:(NSString *)mediaType {
        if (buffer == NULL) {
            NSLog(@"empty sampleBuffer");
            return;
        }
        
        @synchronized (self) {
            if (self.state < StateRecording) {
                NSLog(@"not ready yet");
                return;
            }
        }
        
        CFRetain(buffer);
        dispatch_async(self.queue, ^{
            @autoreleasepool {
                @synchronized (self) {
                    if (self.state > StateFinish) {
                        CFRelease(buffer);
                        return;
                    }
                }
                
                //Start the writer session from the first video frame, so the file does not
                //begin with black frames while audio is already playing
                if (!self.canWrite && [mediaType isEqualToString:AVMediaTypeVideo]) {
                    [self.writer startWriting];
                    [self.writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(buffer)];
                    self.canWrite = YES;
                }
                
                //Progress timer on the main thread; scheduledTimer... already adds it to the main run loop
                if (!self.timer) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        self.timer = [NSTimer scheduledTimerWithTimeInterval:TIMER_INTERVAL target:self selector:@selector(updateProgress) userInfo:nil repeats:YES];
                    });
                }
                
                //Write video data
                if ([mediaType isEqualToString:AVMediaTypeVideo]) {
                    if (self.videoInput.readyForMoreMediaData) {
                        BOOL success = [self.videoInput appendSampleBuffer:buffer];
                        if (!success) {
                            @synchronized (self) {
                                [self stop:^{}];
                                [self destroy];
                            }
                        }
                    }
                }
                
                //Write audio data
                if ([mediaType isEqualToString:AVMediaTypeAudio]) {
                    if (self.audioInput.readyForMoreMediaData) {
                        BOOL success = [self.audioInput appendSampleBuffer:buffer];
                        if (!success) {
                            @synchronized (self) {
                                [self stop:^{}];
                                [self destroy];
                            }
                        }
                    }
                }
                CFRelease(buffer);
            }
        });
    }
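
    The stop: and destroy methods referenced above are not shown; a sketch of stop:, assuming the state values seen earlier and the same serial self.queue:

    - (void)stop:(void (^)(void))completion {
        @synchronized (self) {
            if (self.state != StateRecording) {
                return;
            }
            self.state = StateFinish;
        }
        dispatch_async(self.queue, ^{
            [self.videoInput markAsFinished];
            [self.audioInput markAsFinished];
            //Finish the file; it then becomes available at the writer's output URL
            [self.writer finishWritingWithCompletionHandler:^{
                if (completion) {
                    completion();
                }
            }];
        });
    }
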
  • Closing notes:

    1. When configuring the AVAssetWriterInput output settings, design them around your own needs: the bit rate and frame rate affect both the quality and the size of the captured video, so they depend on each project's requirements
    2. If the video orientation is wrong, it can be adjusted in three places (a sketch follows the list):

      1. videoOrientation on the preview layer's connection

      2. videoOrientation on the connection of the AVCaptureOutput

      3. transform on the video AVAssetWriterInput, e.g. a rotation of M_PI / 2
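
    A sketch of options 2 and 3 (the portrait orientation and quarter-turn are just examples):

    //Option 2: set the orientation on the capture output's connection
    AVCaptureConnection *connection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    
    //Option 3: set a transform on the writer's video input
    _videoInput.transform = CGAffineTransformMakeRotation(M_PI / 2);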