AVCaptureVideoDataOutput for iOS camera customization

Time: 2022-11-23
  • question

    The leader looked at the photo taken earlier and asked, "Where is that shutter sound coming from?"

    "It's built into the system; the stock Camera app makes the same sound when taking pictures."

    "Is there a way to remove it? It's annoying."

    "Let me try."

  • train of thought

    The road ahead is long and winding; I went searching up and down through Baidu and the SDK documentation

    Settled on AVCaptureVideoDataOutput: convert the CMSampleBufferRef to a UIImage in the delegate method

  • on to the code

    • Session setup is the same as before, so it is not repeated here
    • For the preview-layer setup, see the previous articles [AVCapturePhotoOutput for iOS camera customization] and [Before customizing the camera on iOS]
    • Get the camera, create the device input and add it to the session, then initialize the videoOutput and add it to the session (a sketch of the cameraDevice helper follows this snippet)
    AVCaptureDevice *device = [self cameraDevice];
    if (!device) {
        NSLog(@"There is a problem getting the rear camera");
        return;;
    }
    
    NSError *error = nil;
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (!self.videoInput) {
        NSLog(@"Failed to create the device input: %@", error);
        return;
    }
    
    // Add the device input to the session
    if ([self.captureSession canAddInput:self.videoInput]) {
        [self.captureSession addInput:self.videoInput];
    }
    
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.captureSession canAddOutput:self.videoOutput]) {
        [self.captureSession addOutput:self.videoOutput];
    }
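
    • The cameraDevice helper called above is not shown in the original; here is a minimal sketch, assuming the default back-facing wide-angle camera is what is wanted (only the method name comes from the call above, the body is an assumption)
    - (AVCaptureDevice *)cameraDevice {
        // Assumed implementation: return the default back camera, or nil if unavailable
        return [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
                                                  mediaType:AVMediaTypeVideo
                                                   position:AVCaptureDevicePositionBack];
    }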
    
    // lazy loading
    - (AVCaptureVideoDataOutput *)videoOutput {
        if (!_videoOutput) {
            _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
            // Drop frames that arrive while the delegate is still busy instead of queueing them
            _videoOutput.alwaysDiscardsLateVideoFrames = YES;
            // BGRA matches the CGBitmapContext flags used in bufferToImage:rect: below
            _videoOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                     forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        }
        return _videoOutput;
    }
    
    - (dispatch_queue_t)videoQueue {
        if (!_videoQueue) {
            // Serial queue on which sample buffers are delivered to the delegate
            _videoQueue = dispatch_queue_create("queue", DISPATCH_QUEUE_SERIAL);
        }
        return _videoQueue;
    }
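
    • One step the snippets above leave implicit: the session still has to be started, and -startRunning blocks until the session is running, so it is best called off the main thread. A minimal sketch (the choice of a global queue is mine, not from the original):
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        // -startRunning is synchronous and can take a moment; keep it off the main thread
        [self.captureSession startRunning];
    });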
    • Implement AVCaptureVideoDataOutputSampleBufferDelegate
    - (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        @autoreleasepool {
            if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) { // video connection
                @synchronized (self) {
                    UIImage *image = [self bufferToImage:sampleBuffer rect:self.scanView.scanRect];
                    self.uploadImg = image;
                }
            }
        }
    }
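
    • Since the delegate keeps self.uploadImg updated with the latest frame, a silent "capture" is just reading that property when the user taps the shutter, so no system sound ever fires. A sketch of such a shutter action (takePhoto: and the upload step are hypothetical):
    - (IBAction)takePhoto:(id)sender {
        UIImage *snapshot;
        @synchronized (self) { // pairs with the @synchronized in the delegate callback
            snapshot = self.uploadImg;
        }
        // Hand snapshot to the business layer (upload, preview, and so on)
    }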
    • Convert the CMSampleBufferRef into a UIImage. This method has been adjusted to crop a specific region out of the full frame; tweak the region math yourself to grab the area you need
    - (UIImage *)bufferToImage:(CMSampleBufferRef)sampleBuffer rect:(CGRect)rect {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
    
        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    
        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
    
        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    
        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    
        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        
        // Map the on-screen rect into pixel-buffer coordinates
        CGRect dRect;
        CGSize msize = UIScreen.mainScreen.bounds.size;
        msize.height = msize.height - 150; // the 150 pt subtracted here is UI below the preview in this layout; adjust for yours
        CGFloat x = width * rect.origin.x / msize.width;
        CGFloat y = height * rect.origin.y / msize.height;
        CGFloat w = width * rect.size.width / msize.width;
        CGFloat h = height * rect.size.height / msize.height;
        dRect = CGRectMake(x, y, w, h);
        
        CGImageRef partRef = CGImageCreateWithImageInRect(quartzImage, dRect);
        
        // Create an image object from the Quartz image
        UIImage *image = [UIImage imageWithCGImage:partRef];
    
        // Release the Quartz image
        CGImageRelease(partRef);
        CGImageRelease(quartzImage);
        
        return image;
    }
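
    • One caveat: the crop math above assumes the buffer orientation matches the portrait preview, while the camera delivers landscape-oriented buffers by default. If the crop comes out rotated, forcing the connection to portrait right after adding the output is one fix (a sketch, not in the original):
    AVCaptureConnection *connection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        // Force portrait buffers so width/height line up with the on-screen rect
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }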
  • With the image in hand, call it a day. How the image gets used is up to the business side