AVFoundation Programming Guide 1 – Using Assets

Time: 2022-5-13

Reprinted from: Technology blog of chenjiang3

AVFoundation Programming Guide

Create an asset object

To create an asset object that represents any resource identified by a URL, use AVURLAsset. The simplest case is creating an asset from a file:

NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];

Options for asset initialization

The second parameter of the AVURLAsset initialization method is an options dictionary. The only key used here is AVURLAssetPreferPreciseDurationAndTimingKey. Its value is a Boolean (wrapped in an NSNumber) that indicates whether the asset should be prepared to report an accurate duration and provide precise random access by time.
Calculating the exact duration of an asset can require significant processing; an estimated duration is cheaper to obtain and is sufficient for playback. Therefore:
·If you only intend to play the asset, pass nil instead of a dictionary, or pass a dictionary containing AVURLAssetPreferPreciseDurationAndTimingKey with a value of @NO.
·If you want to add the asset to a composition, you need precise random access; pass a dictionary with AVURLAssetPreferPreciseDurationAndTimingKey as the key and @YES as the value:
NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES };
AVURLAsset *anAssetToUseInAComposition = [[AVURLAsset alloc] initWithURL:url options:options];

Access the user’s asset

To access the assets in the iPod library and in the Photos albums, you need the URL of the asset.
·To access the iPod library, create an MPMediaQuery object to find the item you want, then get its URL through MPMediaItemPropertyAssetURL (a sketch appears after the Assets Library example below). To learn more about the media library, see the Multimedia Programming Guide.
·To access the Photos albums, use ALAssetsLibrary.
The following example gets the first video in the Saved Photos album:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];


// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
                           
                           // Within the group enumeration block, filter to enumerate just videos.
                           [group setAssetsFilter:[ALAssetsFilter allVideos]];
                           
                           // For this example, we're only interested in the first item.
                           [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:0]
                                                   options:0
                                                usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
                                                    
                                                    // The end of the enumeration is signaled by asset == nil.
                                                    if (alAsset) {
                                                        ALAssetRepresentation *representation = [alAsset defaultRepresentation];
                                                        NSURL *url = [representation url];
                                                        AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
                                                        // Do something interesting with the AV asset.
                                                    }
                                                }];
                       }
                     failureBlock: ^(NSError *error) {
                         // Typically you should handle an error more gracefully than this.
                         NSLog(@"No groups");
                     }];
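
For the iPod library case mentioned in the first bullet above, a minimal sketch (not part of the original guide; it assumes the MediaPlayer framework is linked and simply takes the first item returned by a songs query) might look like this:

// Requires MediaPlayer.framework: #import <MediaPlayer/MediaPlayer.h>
MPMediaQuery *songsQuery = [MPMediaQuery songsQuery];
MPMediaItem *firstItem = [[songsQuery items] firstObject];

// MPMediaItemPropertyAssetURL yields a URL that AVFoundation can open.
NSURL *assetURL = [firstItem valueForProperty:MPMediaItemPropertyAssetURL];
if (assetURL) {
    AVURLAsset *libraryAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    // Do something interesting with the asset.
}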

Prepare to use an asset

Initializing an asset (or track) does not mean that all of its information is immediately available; some values, even the duration, must be calculated (for example, an MP3 file may carry no summary information), and this can take time. Rather than blocking while the values are computed, you should use the AVAsynchronousKeyValueLoading protocol to request the values and receive them later in the completion handler you pass to loadValuesAsynchronouslyForKeys:completionHandler:.
You can test whether a value has been loaded for a property using statusOfValueForKey:error:. When an asset is first loaded, the value of most of its properties is AVKeyValueStatusUnknown. To load the values of one or more properties, call loadValuesAsynchronouslyForKeys:completionHandler:. In the completion handler, take whatever action is appropriate based on each property's status. You should always be prepared for loading to fail, for example because the network is unreachable or the load was cancelled.

NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = @[@"duration"];
 
[anAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
 
    NSError *error = nil;
    AVKeyValueStatus durationStatus = [anAsset statusOfValueForKey:@"duration" error:&error];
    switch (durationStatus) {
        case AVKeyValueStatusLoaded:
            [self updateUserInterfaceForDuration];
            break;
        case AVKeyValueStatusFailed:
            [self reportError:error forAsset:anAsset];
            break;
        case AVKeyValueStatusCancelled:
            // Do whatever is appropriate for cancelation.
            break;
   }
}];

If you want to play an asset, you should load its tracks property.
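
For example, a minimal sketch of loading the tracks key before playback (assuming anAsset is the AVURLAsset created above) might be:

[anAsset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [anAsset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // The tracks are available; it is now safe to set up playback.
    }
    else {
        // Handle the failure appropriately, for example by reporting the error.
        NSLog(@"Failed to load tracks: %@", [error localizedDescription]);
    }
}];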

Get still pictures from video

To get still images, such as thumbnails, from an asset, use an AVAssetImageGenerator object, which you initialize with the asset. Initialization can succeed even if the asset has no visual tracks at that time, so you should first check whether the asset has any visual tracks, for example using tracksWithMediaCharacteristic: (the example below checks with tracksWithMediaType: and AVMediaTypeVideo):

AVAsset *anAsset = <#Get an asset#>;
if ([[anAsset tracksWithMediaType:AVMediaTypeVideo] count] > 0) {
    AVAssetImageGenerator *imageGenerator =
        [AVAssetImageGenerator assetImageGeneratorWithAsset:anAsset];
    // Implementation continues...
}

You can also configure other properties of the image generator. For example, you can specify a maximum size for the generated images, and you can generate a single image or a series of images at specified times. You must keep a strong reference to the generator until it has finished generating the images.
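
As an illustration (not from the original guide), such configuration might look like the following; maximumSize and appliesPreferredTrackTransform are standard AVAssetImageGenerator properties, and the values here are arbitrary:

AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:anAsset];
// Cap the size of the generated images; the aspect ratio is preserved.
imageGenerator.maximumSize = CGSizeMake(640.0, 480.0);
// Apply the video track's preferred transform so the image is correctly oriented.
imageGenerator.appliesPreferredTrackTransform = YES;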

Generate a single picture

Use copyCGImageAtTime:actualTime:error: to generate a single image at a specific time. AVFoundation may not be able to produce an image at exactly the time you request, so you can pass a pointer to a CMTime as the second argument to receive the time at which the image was actually generated.

AVAsset *myAsset = <#An asset#>;
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:myAsset];
 
Float64 durationSeconds = CMTimeGetSeconds([myAsset duration]);
CMTime midpoint = CMTimeMakeWithSeconds(durationSeconds/2.0, 600);
NSError *error;
CMTime actualTime;
 
CGImageRef halfWayImage = [imageGenerator copyCGImageAtTime:midpoint actualTime:&actualTime error:&error];
 
if (halfWayImage != NULL) {
 
    NSString *actualTimeString = (NSString *)CFBridgingRelease(CMTimeCopyDescription(NULL, actualTime));
    NSString *requestedTimeString = (NSString *)CFBridgingRelease(CMTimeCopyDescription(NULL, midpoint));
    NSLog(@"Got halfWayImage: Asked for %@, got %@", requestedTimeString, actualTimeString);
 
    // Do something interesting with the image.
    CGImageRelease(halfWayImage);
}

Generate a series of pictures

To generate a series of images, call generateCGImagesAsynchronouslyForTimes:completionHandler:. The first argument is an array of NSValue objects, each wrapping a CMTime that specifies a time in the video at which you want an image. The second argument is a block that is invoked once for each image generated; its result parameter tells you whether the image was created successfully or whether the operation was cancelled.

In your block implementation, you need to check the result to determine whether the image is successfully generated. In addition, make sure you hold the image generator until the operation of generating images is completed.

AVAsset *myAsset = <#An asset#>;
// Assume: @property (strong) AVAssetImageGenerator *imageGenerator;
self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
 
Float64 durationSeconds = CMTimeGetSeconds([myAsset duration]);
CMTime firstThird = CMTimeMakeWithSeconds(durationSeconds/3.0, 600);
CMTime secondThird = CMTimeMakeWithSeconds(durationSeconds*2.0/3.0, 600);
CMTime end = CMTimeMakeWithSeconds(durationSeconds, 600);
NSArray *times = @[[NSValue valueWithCMTime:kCMTimeZero],
                  [NSValue valueWithCMTime:firstThird], [NSValue valueWithCMTime:secondThird],
                  [NSValue valueWithCMTime:end]];
 
[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times
                completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                                    AVAssetImageGeneratorResult result, NSError *error) {
 
                NSString *requestedTimeString = (NSString *)
                    CFBridgingRelease(CMTimeCopyDescription(NULL, requestedTime));
                NSString *actualTimeString = (NSString *)
                    CFBridgingRelease(CMTimeCopyDescription(NULL, actualTime));
                NSLog(@"Requested: %@; actual %@", requestedTimeString, actualTimeString);
 
                if (result == AVAssetImageGeneratorSucceeded) {
                    // Do something interesting with the image.
                }
 
                if (result == AVAssetImageGeneratorFailed) {
                    NSLog(@"Failed with error: %@", [error localizedDescription]);
                }
                if (result == AVAssetImageGeneratorCancelled) {
                    NSLog(@"Canceled");
                }
  }];

You can cancel image generation by sending the generator a cancelAllCGImageGeneration message.

Trim and transcode a video

You can transcode a video to a different format, and trim it, using an AVAssetExportSession object.
An export session is a controller object that asynchronously produces a new asset. You initialize the session with the asset you want to export and the name of a preset that indicates the characteristics of the output. You then configure the export session; for example, you can specify the output URL and file type, and optionally other settings such as metadata.
You can first check whether a given preset is compatible with the asset using exportPresetsCompatibleWithAsset:.

AVAsset *anAsset = <#Get an asset#>;
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetLowQuality]) {
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
        initWithAsset:anAsset presetName:AVAssetExportPresetLowQuality];
    // Implementation continues.
}

You configure the session's output URL (it must be a file URL). AVAssetExportSession can infer the output file type from the URL's path extension; alternatively, you can set it directly with outputFileType. You can also specify other properties, such as the time range to export and a limit on the output file length. For example:

exportSession.outputURL = <#A file URL#>;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;

CMTime start = CMTimeMakeWithSeconds(1.0, 600);
CMTime duration = CMTimeMakeWithSeconds(3.0, 600);
CMTimeRange range = CMTimeRangeMake(start, duration);
exportSession.timeRange = range;

To create the new file, call exportAsynchronouslyWithCompletionHandler:. The completion handler block is invoked when the export operation finishes; in it, check the session's status to determine whether the export succeeded, as follows:

[exportSession exportAsynchronouslyWithCompletionHandler:^{

    switch ([exportSession status]) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export canceled");
            break;
        default:
            break;
    }
}];

You can cancel an export in progress by sending the session a cancelExport message.
The export will fail if you try to overwrite an existing file, or if the output URL is outside your application's sandbox. It can also fail if:
·a phone call comes in while the export is in progress, or
·your application is in the background and another application starts playback.
In these cases, you should typically inform the user that the export failed and allow them to export again.
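
Because exporting over an existing file fails, one common precaution (a sketch, not part of the original guide) is to delete any leftover file at the output URL before starting the export:

// Remove any previous output at the export URL before exporting.
NSURL *outputURL = exportSession.outputURL;
if ([[NSFileManager defaultManager] fileExistsAtPath:[outputURL path]]) {
    NSError *removeError = nil;
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:&removeError];
}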

References:

https://developer.apple.com/library/prerelease/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/01_UsingAssets.html
