AVFoundation Programming Guide (Part 1)
- To simply play movies, use AVKit.
- On iOS, to record video when you need only minimal control over the format, use UIKit (UIImagePickerController).
- To play audio, use AVAudioPlayer.
- To record audio, use AVAudioRecorder.
Playback
Reading, Writing, and Reencoding Assets
Thumbnails
Editing
Still and Video Media Capture
Concurrent Programming with AVFoundation
- UI-related notifications are posted on the main thread.
- Classes and methods that require you to create or supply a queue deliver their notifications on that queue.
NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
Options for Initializing an Asset
- If you only intend to play the asset, pass nil for the options dictionary, or a dictionary whose key has the value @NO (an NSNumber).
- If you want to add the asset to a composition (AVMutableComposition), you need precise random access, so pass a dictionary in which the key's value is @YES, as follows:
NSURL *url = <#A URL that identifies an audiovisual asset such as a movie file#>;
NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES };
AVURLAsset *anAssetToUseInAComposition = [[AVURLAsset alloc] initWithURL:url options:options];
Accessing the User's Assets
- To access assets in the iPod library, create an MPMediaQuery instance, then get the item's URL using MPMediaItemPropertyAssetURL.
- To access assets managed by the Photos app, use ALAssetsLibrary (now superseded by the Photos framework's PHPhotoLibrary).
Preparing an Asset for Use
AVAsset *anAsset = <#Get an asset#>;
if ([[anAsset tracksWithMediaType:AVMediaTypeVideo] count] > 0) {
    AVAssetImageGenerator *imageGenerator =
        [AVAssetImageGenerator assetImageGeneratorWithAsset:anAsset];
    // Implementation continues…
}
Generating a Single Image
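No listing for the single-image case appears here; a minimal sketch using AVAssetImageGenerator's copyCGImageAtTime:actualTime:error: follows, where anAsset and the choice of the asset's midpoint are assumptions for illustration:

```objc
AVAssetImageGenerator *imageGenerator =
    [AVAssetImageGenerator assetImageGeneratorWithAsset:anAsset];

// Ask for an image halfway through the asset (arbitrary choice).
Float64 durationSeconds = CMTimeGetSeconds([anAsset duration]);
CMTime midpoint = CMTimeMakeWithSeconds(durationSeconds/2.0, 600);

NSError *error;
CMTime actualTime;
CGImageRef halfWayImage = [imageGenerator copyCGImageAtTime:midpoint
                                                 actualTime:&actualTime
                                                      error:&error];
if (halfWayImage != NULL) {
    // Use the image, then release it (it is a copy you own).
    CGImageRelease(halfWayImage);
}
else {
    NSLog(@"Image generation failed: %@", [error localizedDescription]);
}
```

Note that the actual time of the generated image may differ from the requested time; actualTime tells you which frame you got.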
Generating a Sequence of Images
The completion handler receives:
- The generated image
- The time for which you requested the image and the actual time at which it was generated
- An error object describing why generation failed
AVAsset *myAsset = <#An asset#>;

// Assume: @property (strong) AVAssetImageGenerator *imageGenerator;
self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];

Float64 durationSeconds = CMTimeGetSeconds([myAsset duration]);
CMTime firstThird = CMTimeMakeWithSeconds(durationSeconds/3.0, 600);
CMTime secondThird = CMTimeMakeWithSeconds(durationSeconds*2.0/3.0, 600);
CMTime end = CMTimeMakeWithSeconds(durationSeconds, 600);

NSArray *times = @[[NSValue valueWithCMTime:kCMTimeZero],
                   [NSValue valueWithCMTime:firstThird],
                   [NSValue valueWithCMTime:secondThird],
                   [NSValue valueWithCMTime:end]];

[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times
                completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                                    AVAssetImageGeneratorResult result, NSError *error) {

    NSString *requestedTimeString = (NSString *)
        CFBridgingRelease(CMTimeCopyDescription(NULL, requestedTime));
    NSString *actualTimeString = (NSString *)
        CFBridgingRelease(CMTimeCopyDescription(NULL, actualTime));
    NSLog(@"Requested: %@; actual %@", requestedTimeString, actualTimeString);

    if (result == AVAssetImageGeneratorSucceeded) {
        // Do something interesting with the image.
    }
    if (result == AVAssetImageGeneratorFailed) {
        NSLog(@"Failed with error: %@", [error localizedDescription]);
    }
    if (result == AVAssetImageGeneratorCancelled) {
        NSLog(@"Canceled");
    }
}];
Trimming and Transcoding a Movie
AVAsset *anAsset = <#Get an asset#>;
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetLowQuality]) {
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
        initWithAsset:anAsset presetName:AVAssetExportPresetLowQuality];
    // Implementation continues.
}
exportSession.outputURL = <#A file URL#>;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;

CMTime start = CMTimeMakeWithSeconds(1.0, 600);
CMTime duration = CMTimeMakeWithSeconds(3.0, 600);
CMTimeRange range = CMTimeRangeMake(start, duration);
exportSession.timeRange = range;
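The listings above configure the export session but never start it. A sketch of the final step, assuming exportSession is still in scope, using exportAsynchronouslyWithCompletionHandler::

```objc
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // The handler may be invoked on an arbitrary queue.
    switch ([exportSession status]) {
        case AVAssetExportSessionStatusCompleted:
            // The trimmed, transcoded file is now at exportSession.outputURL.
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export canceled");
            break;
        default:
            break;
    }
}];
```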
Playback may be interrupted when, for example:
- A phone call arrives
- Another app starts playback while your app is running in the background

The basic steps for playing an item are:
- Create an AVURLAsset
- Create an AVPlayerItem instance from it
- Associate the item with an AVPlayer instance
- Wait until the item's status property indicates that it's ready to play (typically by using key-value observing to be notified when the status changes)
NSURL *url = [NSURL URLWithString:@"<#Live stream URL#>"];
// You may find a test stream at <http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8>.
self.playerItem = [AVPlayerItem playerItemWithURL:url];
[self.playerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

self.player = [AVPlayer playerWithURL:<#Live stream URL#>];
[self.player addObserver:self forKeyPath:@"status" options:0 context:&PlayerStatusContext];
- (IBAction)play:sender {
    [player play];
}
Changing the Playback Rate
aPlayer.rate = 0.5;
aPlayer.rate = 2.0;
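Rates other than 1.0 aren't guaranteed to work for every item; AVPlayerItem exposes capability properties you can check first. A sketch:

```objc
// Check the current item's capabilities before setting a nonstandard rate.
if (aPlayer.currentItem.canPlayFastForward) {
    aPlayer.rate = 2.0;   // fast forward
}
if (aPlayer.currentItem.canPlayReverse) {
    aPlayer.rate = -1.0;  // play backward
}
```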
Seeking: Repositioning the Playhead
CMTime fiveSecondsIn = CMTimeMake(5, 1);
[player seekToTime:fiveSecondsIn];

CMTime fiveSecondsIn = CMTimeMake(5, 1);
[player seekToTime:fiveSecondsIn toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
// Register with the notification center after creating the player item.
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(playerItemDidReachEnd:)
           name:AVPlayerItemDidPlayToEndTimeNotification
         object:<#The player item#>];

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    [player seekToTime:kCMTimeZero];
}
NSArray *items = <#An array of player items#>;
AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] initWithItems:items];

AVPlayerItem *anItem = <#Get a player item#>;
if ([queuePlayer canInsertItem:anItem afterItem:nil]) {
    [queuePlayer insertItem:anItem afterItem:nil];
}
- A player's rate drops to 0.0 if the user switches to a different app using multitasking.
- If you are playing remote media, a player item's loadedTimeRanges and seekableTimeRanges properties change as more data becomes available. These properties tell you which portions of the player item's timeline are available.
- A player's currentItem changes when a player item is created for an HTTP live stream.
- A player item's tracks property may change while an HTTP live stream is playing. This can happen if the stream offers different encodings for the content; the tracks change if the player switches to a different encoding.
- A player's or player item's status may change if playback fails.
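For example, the loadedTimeRanges changes can be watched with key-value observing; a sketch in the style of the listings in this document (the context pointer and logging are illustrative only):

```objc
static void *LoadedTimeRangesContext = &LoadedTimeRangesContext;

// Register after creating the player item.
[self.playerItem addObserver:self forKeyPath:@"loadedTimeRanges"
                     options:NSKeyValueObservingOptionNew
                     context:LoadedTimeRangesContext];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (context == LoadedTimeRangesContext) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        // loadedTimeRanges is an NSArray of NSValue-wrapped CMTimeRange.
        for (NSValue *value in item.loadedTimeRanges) {
            CMTimeRange range = [value CMTimeRangeValue];
            NSLog(@"Buffered: %.1f to %.1f s", CMTimeGetSeconds(range.start),
                  CMTimeGetSeconds(CMTimeRangeGetEnd(range)));
        }
        return;
    }
    [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
```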
Responding to a Status Change
Tracking Readiness for Visual Display
Tracking Time
// Assume a property: @property (strong) id playerObserver;

Float64 durationSeconds = CMTimeGetSeconds([<#An asset#> duration]);
CMTime firstThird = CMTimeMakeWithSeconds(durationSeconds/3.0, 1);
CMTime secondThird = CMTimeMakeWithSeconds(durationSeconds*2.0/3.0, 1);
NSArray *times = @[[NSValue valueWithCMTime:firstThird], [NSValue valueWithCMTime:secondThird]];

self.playerObserver = [<#A player#> addBoundaryTimeObserverForTimes:times queue:NULL usingBlock:^{

    NSString *timeDescription = (NSString *)
        CFBridgingRelease(CMTimeCopyDescription(NULL, [self.player currentTime]));
    NSLog(@"Passed a boundary at %@", timeDescription);
}];
Reaching the End of an Item
[[NSNotificationCenter defaultCenter] addObserver:<#The observer, typically self#>
                                         selector:@selector(<#The selector name#>)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:<#A player item#>];
- Configure a view to use an AVPlayerLayer layer
- Create an AVPlayer object
- Create an AVPlayerItem object for a file-based asset, and use key-value observing to observe its status
- Respond to the item becoming ready to play by enabling a button
- Play the item, then restore the player's playhead to the beginning
The Player View
To play the visual component of an asset, you need a view containing an AVPlayerLayer layer to which the output of an AVPlayer object can be directed. You can create a simple subclass of UIView to accommodate this:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlayerView : UIView
@property (nonatomic) AVPlayer *player;
@end

@implementation PlayerView

+ (Class)layerClass {
    return [AVPlayerLayer class];
}

- (AVPlayer *)player {
    return [(AVPlayerLayer *)[self layer] player];
}

- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)[self layer] setPlayer:player];
}

@end
A Simple View Controller
Assume you have a simple view controller, declared as follows:
@class PlayerView;

@interface PlayerViewController : UIViewController

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic, weak) IBOutlet PlayerView *playerView;
@property (nonatomic, weak) IBOutlet UIButton *playButton;

- (IBAction)loadAssetFromFile:sender;
- (IBAction)play:sender;
- (void)syncUI;

@end
The syncUI method synchronizes the button’s state with the player’s state:
- (void)syncUI {
    if ((self.player.currentItem != nil) &&
        ([self.player.currentItem status] == AVPlayerItemStatusReadyToPlay)) {
        self.playButton.enabled = YES;
    }
    else {
        self.playButton.enabled = NO;
    }
}
You can invoke syncUI in the view controller’s viewDidLoad method to ensure a consistent user interface when the view is first displayed.
- (void)viewDidLoad {
    [super viewDidLoad];
    [self syncUI];
}
The other properties and methods are described in the remaining sections.
Creating the Asset
You create an asset from a URL using AVURLAsset. (The following example assumes your project contains a suitable video resource.)
- (IBAction)loadAssetFromFile:sender {

    NSURL *fileURL = [[NSBundle mainBundle]
        URLForResource:<#@"VideoFileName"#> withExtension:<#@"extension"#>];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];

    NSString *tracksKey = @"tracks";

    [asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:
     ^{
         // The completion block goes here.
     }];
}
In the completion block, you create an instance of AVPlayerItem for the asset and set it as the player for the player view. As with creating the asset, simply creating the player item does not mean it’s ready to use. To determine when it’s ready to play, you can observe the item’s status property. You should configure this observing before associating the player item instance with the player itself.
You trigger the player item’s preparation to play when you associate it with the player.
// Define this constant for the key-value observation context.
static const NSString *ItemStatusContext;

// Completion handler block.
dispatch_async(dispatch_get_main_queue(),
   ^{
        NSError *error;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];

        if (status == AVKeyValueStatusLoaded) {
            self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
            // Ensure that this is done before the playerItem is associated with the player.
            [self.playerItem addObserver:self forKeyPath:@"status"
                                 options:NSKeyValueObservingOptionInitial
                                 context:&ItemStatusContext];
            [[NSNotificationCenter defaultCenter] addObserver:self
                                                     selector:@selector(playerItemDidReachEnd:)
                                                         name:AVPlayerItemDidPlayToEndTimeNotification
                                                       object:self.playerItem];
            self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
            [self.playerView setPlayer:self.player];
        }
        else {
            // You should deal with the error appropriately.
            NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
        }
    });
Responding to the Player Item’s Status Change
When the player item’s status changes, the view controller receives a key-value observing change notification. AV Foundation does not specify the thread on which the notification is sent. If you want to update the user interface, you must make sure that any relevant code is invoked on the main thread. This example uses dispatch_async to queue a message on the main thread to synchronize the user interface.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {

    if (context == &ItemStatusContext) {
        dispatch_async(dispatch_get_main_queue(),
                       ^{
                           [self syncUI];
                       });
        return;
    }
    [super observeValueForKeyPath:keyPath ofObject:object
                           change:change context:context];
    return;
}
Playing the Item
Playing the item involves sending a play message to the player.
- (IBAction)play:sender {
    [self.player play];
}
The item is played only once. After playback, the player’s head is set to the end of the item, and further invocations of the play method will have no effect. To position the playhead back at the beginning of the item, you can register to receive an AVPlayerItemDidPlayToEndTimeNotification from the item. In the notification’s callback method, invoke seekToTime: with the argument kCMTimeZero.
// Register with the notification center after creating the player item.
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(playerItemDidReachEnd:)
           name:AVPlayerItemDidPlayToEndTimeNotification
         object:[self.player currentItem]];

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    [self.player seekToTime:kCMTimeZero];
}