iOS: Implementing Audio/Video Capture with AVCaptureSession


AVCaptureSession configures capture behavior and coordinates the flow of data from input devices to capture outputs. To perform real-time audio/video capture, instantiate a capture session and add the appropriate inputs and outputs. The key classes are listed below, followed by a sketch of the recorder class that the later code snippets assume.

  • AVCaptureSession: manages the input and output audio/video streams
  • AVCaptureDevice: the interface to the camera hardware, used to control hardware features such as lens position (front/back camera), exposure, and flash
  • AVCaptureInput: configures an input device and supplies the data coming from it
  • AVCaptureOutput: manages the outgoing audio/video data streams
  • AVCaptureConnection: the connection between an input and an output
  • AVCaptureVideoPreviewLayer: displays what the camera is currently capturing
  • AVAssetWriter: writes media data to a container file
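
All of the snippets below are methods on a single recorder class and read and write a handful of its properties. As a point of reference, here is a minimal sketch of that class extension; the class name AVRecorder is hypothetical, but the property names match the ones used throughout.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical recorder class; the property names mirror the snippets below.
@interface AVRecorder () <AVCaptureVideoDataOutputSampleBufferDelegate,
                          AVCaptureAudioDataOutputSampleBufferDelegate>

@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) dispatch_queue_t videoQueue;
@property (nonatomic, strong) AVCaptureDeviceInput *videoInput;
@property (nonatomic, strong) AVCaptureDeviceInput *audioInput;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;
@property (nonatomic, strong) AVCaptureAudioDataOutput *audioOutput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;

@property (nonatomic, strong) AVAssetWriter *assetWriter;
@property (nonatomic, strong) AVAssetWriterInput *assetWriterVideoInput;
@property (nonatomic, strong) AVAssetWriterInput *assetWriterAudioInput;
@property (nonatomic, copy)   NSDictionary *videoCompressionSettings;
@property (nonatomic, copy)   NSDictionary *audioCompressionSettings;
@property (nonatomic, strong) NSURL *videoURL;
@property (nonatomic, assign) BOOL canWrite;

@end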

Initializing AVCaptureSession

- (AVCaptureSession *)captureSession {
    if (_captureSession == nil){
        _captureSession = [[AVCaptureSession alloc] init];
        // Check the same preset that will be applied; fall back to the default if unsupported.
        if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
            _captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
        }
    }
    return _captureSession;
}

- (dispatch_queue_t)videoQueue {
    if (!_videoQueue) {
        _videoQueue = dispatch_queue_create("VideoCapture", DISPATCH_QUEUE_SERIAL);
    }
    return _videoQueue;
}

Adding the video input

- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
    for (AVCaptureDevice *device in deviceDiscoverySession.devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}

- (void)setupVideoInput {
    AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
    if (!captureDevice) {
        NSLog(@"failed to get a camera device");
        return;
    }
    NSError *error = nil;
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"videoInput error:%@", error);
        return;
    }
    if ([self.captureSession canAddInput:self.videoInput]) {
        [self.captureSession addInput:self.videoInput];
    }
}
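
With the same device-lookup helper, switching between the front and back cameras amounts to swapping the device input inside a beginConfiguration/commitConfiguration pair, so the session keeps running during the change. A minimal sketch (the switchCamera method is my own, not part of the original flow):

- (void)switchCamera {
    AVCaptureDevicePosition newPosition =
        (self.videoInput.device.position == AVCaptureDevicePositionBack)
            ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
    AVCaptureDevice *newDevice = [self getCameraDeviceWithPosition:newPosition];
    if (!newDevice) { return; }

    NSError *error = nil;
    AVCaptureDeviceInput *newInput = [[AVCaptureDeviceInput alloc] initWithDevice:newDevice error:&error];
    if (error) { return; }

    // Swap the inputs atomically so the session keeps running during the change.
    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:self.videoInput];
    if ([self.captureSession canAddInput:newInput]) {
        [self.captureSession addInput:newInput];
        self.videoInput = newInput;
    } else {
        [self.captureSession addInput:self.videoInput]; // roll back
    }
    [self.captureSession commitConfiguration];
}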

Adding the audio input

- (void)setupAudioInput {
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *error = nil;
    self.audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"audioInput error:%@", error);
        return;
    }
    if ([self.captureSession canAddInput:self.audioInput]) {
        [self.captureSession addInput:self.audioInput];
    }
}
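
Note that neither input will deliver data unless the user has granted camera and microphone access, and the app's Info.plist must declare NSCameraUsageDescription and NSMicrophoneUsageDescription. A minimal pre-flight sketch (the method name and its completion block are my own):

- (void)requestCaptureAuthorization:(void (^)(BOOL granted))completion {
    // Ask for camera access first, then microphone access.
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL videoGranted) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL audioGranted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(videoGranted && audioGranted);
            });
        }];
    }];
}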

Adding the video output

- (void)setupVideoOutput {
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.videoOutput.alwaysDiscardsLateVideoFrames = YES;
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.captureSession canAddOutput:self.videoOutput]) {
        [self.captureSession addOutput:self.videoOutput];
    }
}
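
If the frames will be processed or encoded downstream, it is common to also pin the output pixel format through videoSettings before adding the output; otherwise AVFoundation picks a default. A sketch, assuming the bi-planar 4:2:0 format that the hardware encoder handles well:

// Request NV12-style frames; set this before adding the output to the session.
self.videoOutput.videoSettings = @{
    (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
};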

Adding the audio output

- (void)setupAudioOutput {
    self.audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    [self.audioOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.captureSession canAddOutput:self.audioOutput]) {
        [self.captureSession addOutput:self.audioOutput];
    }
}
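
Here the audio output shares videoQueue, which is safe because the queue is serial; callbacks for the two outputs simply interleave on it. Some implementations prefer a dedicated serial queue for audio so a slow video callback cannot delay audio delivery. A sketch of such a getter, assuming a matching audioQueue property (not used elsewhere in this article):

- (dispatch_queue_t)audioQueue {
    if (!_audioQueue) {
        _audioQueue = dispatch_queue_create("AudioCapture", DISPATCH_QUEUE_SERIAL);
    }
    return _audioQueue;
}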

Setting up the video preview

- (void)setupCaptureVideoPreviewLayer:(UIView *)previewView {
    _captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    CALayer *layer = previewView.layer;
    _captureVideoPreviewLayer.frame = previewView.bounds;
    _captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    _captureVideoPreviewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
    [layer insertSublayer:_captureVideoPreviewLayer atIndex:0];
}
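
Because the layer's frame is set only once, it will not follow later layout changes such as rotation or resizing. One common remedy is to resync it during the host view controller's layout pass; a sketch, assuming the host view is exposed as a hypothetical self.previewView:

- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    // Keep the preview layer matched to its host view after rotation or resizing.
    self.captureVideoPreviewLayer.frame = self.previewView.bounds;
}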

Starting and stopping the capture session

- (void)startSession {
    if (![self.captureSession isRunning]) {
        [self.captureSession startRunning];
    }
}

- (void)stopSession {
    if ([self.captureSession isRunning]) {
        [self.captureSession stopRunning];
    }
}
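
startRunning is a blocking call, so Apple recommends invoking it off the main thread. A minimal variant that reuses the capture queue (the choice of videoQueue here is mine):

- (void)startSessionInBackground {
    dispatch_async(self.videoQueue, ^{
        if (![self.captureSession isRunning]) {
            [self.captureSession startRunning];
        }
    });
}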

Initializing AVAssetWriter and saving the audio/video to a movie file

- (void)setUpWriter {
    if (self.videoURL == nil) {
        return;
    }
    NSError *error = nil;
    self.assetWriter = [AVAssetWriter assetWriterWithURL:self.videoURL fileType:AVFileTypeMPEG4 error:&error];
    if (error) {
        NSLog(@"assetWriter error:%@", error);
        return;
    }
    // kScreenWidth/kScreenHeight are assumed screen-size macros (in points);
    // the * 2 below scales them to pixels on a 2x Retina display.
    CGFloat width = kScreenWidth;
    CGFloat height = kScreenHeight;
    NSInteger numPixels = width * height;
    
    // About 12 bits per pixel per second is a reasonable H.264 bitrate budget.
    CGFloat bitsPerPixel = 12.0;
    NSInteger bitsPerSecond = numPixels * bitsPerPixel;
    
    NSDictionary *compressionProperties = @{ AVVideoAverageBitRateKey : @(bitsPerSecond),
                                             AVVideoExpectedSourceFrameRateKey : @(15),
                                             AVVideoMaxKeyFrameIntervalKey : @(15),
                                             AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel };
    self.videoCompressionSettings = @{ AVVideoCodecKey : AVVideoCodecTypeH264,
                                       AVVideoWidthKey : @(width * 2),
                                       AVVideoHeightKey : @(height * 2),
                                       AVVideoScalingModeKey : AVVideoScalingModeResizeAspect,
                                       AVVideoCompressionPropertiesKey : compressionProperties };
    _assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:self.videoCompressionSettings];
    _assetWriterVideoInput.expectsMediaDataInRealTime = YES;
    self.audioCompressionSettings = @{ AVEncoderBitRatePerChannelKey : @(28000),
                                       AVFormatIDKey : @(kAudioFormatMPEG4AAC),
                                       AVNumberOfChannelsKey : @(1),
                                       AVSampleRateKey : @(22050) };

    _assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:self.audioCompressionSettings];
    _assetWriterAudioInput.expectsMediaDataInRealTime = YES;
    
    if ([_assetWriter canAddInput:_assetWriterVideoInput]) {
        [_assetWriter addInput:_assetWriterVideoInput];
    } else {
        NSLog(@"AssetWriter cannot add videoInput");
    }
    
    if ([_assetWriter canAddInput:_assetWriterAudioInput]) {
        [_assetWriter addInput:_assetWriterAudioInput];
    } else {
        NSLog(@"AssetWriter cannot add audioInput");
    }
    _canWrite = NO;
}
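
setUpWriter bails out when videoURL is nil, so the destination URL must be prepared first. One straightforward option, sketched below with a hypothetical helper, is a timestamped file in the temporary directory; any existing file is removed first because AVAssetWriter refuses to overwrite:

- (NSURL *)makeVideoURL {
    NSString *fileName = [NSString stringWithFormat:@"%.0f.mp4",
                          [[NSDate date] timeIntervalSince1970]];
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
    // AVAssetWriter fails on an existing file, so clear any leftover.
    [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    return [NSURL fileURLWithPath:path];
}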

Handling audio and video samples in AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate|AVCaptureAudioDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    @autoreleasepool{
        if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
            @synchronized(self){
                [self appendSampleBuffer:sampleBuffer ofMediaType:AVMediaTypeVideo];
            }
        }
        if (connection == [self.audioOutput connectionWithMediaType:AVMediaTypeAudio]) {
            @synchronized(self) {
                [self appendSampleBuffer:sampleBuffer ofMediaType:AVMediaTypeAudio];
            }
        }
    }
}


- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer ofMediaType:(NSString *)mediaType {
    if (sampleBuffer == NULL) {
        NSLog(@"empty sampleBuffer");
        return;
    }
    @autoreleasepool {
        // Start the writer on the first video frame, so the file's timeline
        // begins at that frame's presentation timestamp.
        if (!self.canWrite && [mediaType isEqualToString:AVMediaTypeVideo]) {
            [self.assetWriter startWriting];
            [self.assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            self.canWrite = YES;
        }
        if ([mediaType isEqualToString:AVMediaTypeVideo]) {
            if (self.assetWriterVideoInput.readyForMoreMediaData){
                BOOL success = [self.assetWriterVideoInput appendSampleBuffer:sampleBuffer];
                if (!success){
                    NSLog(@"assetWriterVideoInput appendSampleBuffer fail");
                    @synchronized (self){
                        [self stopVideoRecorder];
                    }
                }
            }
        }
        if ([mediaType isEqualToString:AVMediaTypeAudio]) {
            if (self.assetWriterAudioInput.readyForMoreMediaData){
                BOOL success = [self.assetWriterAudioInput appendSampleBuffer:sampleBuffer];
                if (!success){
                    NSLog(@"assetWriterAudioInput appendSampleBuffer fail");
                    @synchronized (self){
                        [self stopVideoRecorder];
                    }
                }
            }
        }
    }
}

Stopping video recording

- (void)stopVideoRecorder {
    __weak __typeof(self) weakSelf = self;
    if (_assetWriter && _assetWriter.status == AVAssetWriterStatusWriting) {
        [_assetWriter finishWritingWithCompletionHandler:^{
            weakSelf.canWrite = NO;
            weakSelf.assetWriter = nil;
            weakSelf.assetWriterAudioInput = nil;
            weakSelf.assetWriterVideoInput = nil;
        }];
    }
}
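
Once the completion handler fires, the file at videoURL is finalized and can be consumed, for example by exporting it to the photo library (this step is not part of the recorder above and requires the appropriate photo-library permission):

// Run inside (or after) finishWritingWithCompletionHandler:
NSString *path = self.videoURL.path;
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
    UISaveVideoAtPathToSavedPhotosAlbum(path, nil, NULL, NULL);
}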

