Recording a short video on iOS

Time: 2022-11-22
  • Requirement

    Our company's app uses hybrid development, and shooting short videos on the uni-app side is far from ideal. To reproduce a WeChat-style recording experience, work on a native plugin began.

  • Approach

    Step 1: one AVCaptureSession and one AVCaptureVideoPreviewLayer (consider AVPreView as a compatible replacement)

    Step 2: Recording needs both video and audio, so each needs its own AVCaptureDeviceInput, plus the corresponding AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

    Step 3: In the output delegate callbacks, distinguish video from audio and write the corresponding CMSampleBufferRef to the video file

    Step 4: Files are written with AVAssetWriter; video and audio each need their own AVAssetWriterInput, both added to the AVAssetWriter

    Step 5: As CMSampleBufferRefs keep arriving, the AVAssetWriter keeps writing until recording stops

  • Implementation

    I won't walk through the Step 1 initialization here; if you're curious, my earlier blog posts cover it.
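
    For reference, here's a minimal sketch of that setup (the session preset, the device lookup, and the view hosting the preview layer are assumptions; adapt them to your project):

    // Capture session; the preset here is an assumption, pick one that fits your quality needs
    self.session = [[AVCaptureSession alloc] init];
    if ([self.session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        self.session.sessionPreset = AVCaptureSessionPresetHigh;
    }
    
    // Default camera; this device feeds the AVCaptureDeviceInput in Step 2
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    // Preview layer attached to whatever view hosts the camera UI
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.previewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:self.previewLayer];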

    Step 2: two AVCaptureDeviceInputs and two outputs, with each output's sample buffer delegate set

    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (error) {
        NSLog(@"An error occurred while obtaining the videoInput object captured by the device, the reason for the error: %@", error);
        return;
    }
    
    // The device is added to the session
    if ([self.session canAddInput:self.videoInput]) {
        [self.session addInput:self.videoInput];
    }
    
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.session canAddOutput:self.videoOutput]) {
        [self.session addOutput:self.videoOutput];
    }
    
    // audio related
    AVCaptureDevice *adevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    self.audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:adevice error:&error];
    
    if ([self.session canAddInput:self.audioInput]) {
        [self.session addInput:self.audioInput];
    }
    
    [self.audioOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.session canAddOutput:self.audioOutput]) {
        [self.session addOutput:self.audioOutput];
    }
    
    // Video output
    - (AVCaptureVideoDataOutput *)videoOutput {
        if (!_videoOutput) {
            _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
            _videoOutput.alwaysDiscardsLateVideoFrames = YES;
        }
        return _videoOutput;
    }
    
    // Audio output
    - (AVCaptureAudioDataOutput *)audioOutput {
        if (!_audioOutput) {
            _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        }
        return _audioOutput;
    }
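
    Both sample buffer delegates above are set on self.videoQueue, which isn't shown; a minimal sketch, assuming one serial queue shared by both outputs (the queue label is made up):

    // Serial queue for the video & audio sample buffer callbacks
    - (dispatch_queue_t)videoQueue {
        if (!_videoQueue) {
            _videoQueue = dispatch_queue_create("com.demo.videoQueue", DISPATCH_QUEUE_SERIAL);
        }
        return _videoQueue;
    }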

    Step 3: Start the session and handle each CMSampleBufferRef in the delegate callback

    #pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate & AVCaptureAudioDataOutputSampleBufferDelegate
    - (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        @autoreleasepool {
            // video
            if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
                if (!self.manager.outputVideoFormatDescription) {
                    @synchronized(self) {
                        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                        self.manager.outputVideoFormatDescription = formatDescription;
                    }
                } else {
                    @synchronized(self) {
                        if (self.manager.state == StateRecording) {
                            [self.manager appendBuffer:sampleBuffer type:AVMediaTypeVideo];
                        }
                    }
                }
            }
            
            // audio
            if (connection == [self.audioOutput connectionWithMediaType:AVMediaTypeAudio]) {
                if (!self.manager.outputAudioFormatDescription) {
                    @synchronized(self) {
                        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                        self.manager.outputAudioFormatDescription = formatDescription;
                    }
                }
                @synchronized(self) {
                    if (self.manager.state == StateRecording) {
                        [self.manager appendBuffer:sampleBuffer type:AVMediaTypeAudio];
                    }
                }
            }
        }
    }
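
    Starting the session itself isn't shown above; a sketch, assuming you start it once the inputs and outputs are wired up (-startRunning blocks until capture is up, so keep it off the main thread):

    // Start capture off the main thread; -startRunning blocks until the session is running
    dispatch_async(self.videoQueue, ^{
        if (!self.session.isRunning) {
            [self.session startRunning];
        }
    });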

    Step 4: AVAssetWriter and corresponding Input

    // Writer initialization; surface the error instead of passing nil
    NSError *error = nil;
    self.writer = [AVAssetWriter assetWriterWithURL:_videoUrl fileType:AVFileTypeMPEG4 error:&error];
    if (error) {
        NSLog(@"Failed to create AVAssetWriter: %@", error);
        return;
    }
    
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:_videoSettings];
    // expectsMediaDataInRealTime must be YES when the data comes from a live capture session
    _videoInput.expectsMediaDataInRealTime = YES;
    
    _audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:_audioSettings];
    _audioInput.expectsMediaDataInRealTime = YES;
    
    if ([_writer canAddInput:_videoInput]) {
        [_writer addInput:_videoInput];
    }
    if ([_writer canAddInput:_audioInput]) {
        [_writer addInput:_audioInput];
    }
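
    The _videoSettings and _audioSettings dictionaries referenced above aren't shown; a minimal sketch, with the resolution, bit rates, and frame rate as assumptions to tune per project (this is what closing note 1 refers to):

    // Video: H.264 at 720x1280; bit rate & frame rate trade quality against file size
    _videoSettings = @{
        AVVideoCodecKey: AVVideoCodecTypeH264,
        AVVideoWidthKey: @720,
        AVVideoHeightKey: @1280,
        AVVideoCompressionPropertiesKey: @{
            AVVideoAverageBitRateKey: @(2000 * 1024),
            AVVideoExpectedSourceFrameRateKey: @30
        }
    };
    
    // Audio: AAC, mono, 44.1 kHz
    _audioSettings = @{
        AVFormatIDKey: @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey: @1,
        AVSampleRateKey: @44100,
        AVEncoderBitRateKey: @(64 * 1024)
    };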

    Step 5: The CMSampleBufferRef from Step 3 is written to the video file through AVAssetWriter

    - (void)appendBuffer:(CMSampleBufferRef)buffer type:(NSString *)mediaType {
        if (buffer == NULL) {
            NSLog(@"empty sampleBuffer");
            return;
        }
        
        @synchronized (self) {
            if (self.state < StateRecording) {
                NSLog(@"not ready yet");
                return;
            }
        }
        
        // Retain the buffer so it survives the async hop; released in the block below
        CFRetain(buffer);
        dispatch_async(self.queue, ^{
            @autoreleasepool {
                @synchronized (self) {
                    if (self.state > StateFinish) {
                        CFRelease(buffer);
                        return;
                    }
                }
                
                // The first video buffer starts the writer session at that buffer's timestamp
                if (!self.canWrite && [mediaType isEqualToString:AVMediaTypeVideo]) {
                    [self.writer startWriting];
                    [self.writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(buffer)];
                    self.canWrite = YES;
                }
                
                // Lazily start the progress timer on the main run loop
                // (scheduledTimerWithTimeInterval: already adds it to the current run loop)
                if (!self.timer) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        self.timer = [NSTimer scheduledTimerWithTimeInterval:TIMER_INTERVAL target:self selector:@selector(updateProgress) userInfo:nil repeats:YES];
                    });
                }
                
                // write video data
                if ([mediaType isEqualToString:AVMediaTypeVideo]) {
                    if (self.videoInput.readyForMoreMediaData) {
                        BOOL success = [self.videoInput appendSampleBuffer:buffer];
                        if (!success) {
                            @synchronized (self) {
                                [self stop:^{}];
                                [self destroy];
                            }
                        }
                    }
                }
                
                // write audio data
                if ([mediaType isEqualToString:AVMediaTypeAudio]) {
                    if (self.audioInput.readyForMoreMediaData) {
                        BOOL success = [self.audioInput appendSampleBuffer:buffer];
                        if (!success) {
                            @synchronized (self) {
                                [self stop:^{}];
                                [self destroy];
                            }
                        }
                    }
                }
                CFRelease(buffer);
            }
        });
    }
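
    The stop: and destroy methods called above aren't shown; here's a minimal sketch of stop:, assuming a completion block parameter and simplified state handling (how StateFinish fits your state machine is an assumption):

    // Finish the file: mark both inputs done, then let the writer close the MP4
    - (void)stop:(void (^)(void))completion {
        @synchronized (self) {
            self.state = StateFinish; // assumption: StateFinish marks "stopping"
        }
        dispatch_async(self.queue, ^{
            if (self.writer.status == AVAssetWriterStatusWriting) {
                [self.videoInput markAsFinished];
                [self.audioInput markAsFinished];
                [self.writer finishWritingWithCompletionHandler:^{
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self.timer invalidate];
                        self.timer = nil;
                        if (completion) completion();
                    });
                }];
            }
        });
    }
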
  • Closing notes:

    1. Design the video settings on the AVAssetWriterInput around your own needs: the bit rate and frame rate determine the quality and size of the recorded video, and the right values depend on each project's requirements (see the settings sketch under Step 4).
    2. If the recorded video comes out with the wrong orientation, there are three places to adjust it (a sketch follows this list):

      1. Set videoOrientation on the preview layer's connection

      2. Set videoOrientation on the AVCaptureOutput's connection

      3. Set the transform on the video AVAssetWriterInput, e.g. a rotation of M_PI/2
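
    A sketch of those three adjustments (the portrait orientation chosen here is an assumption):

    // 1. The preview layer's connection
    self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    
    // 2. The capture output's connection
    AVCaptureConnection *connection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    
    // 3. The writer input's transform, e.g. rotate by M_PI/2 for portrait footage
    _videoInput.transform = CGAffineTransformMakeRotation(M_PI / 2);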