The MNAVChapters iOS library reads chapter metadata of audiovisual assets. It reads chapters from MPEG-4 and, notably, MP3 files.
Although the ID3v2 standard has specified the chapter frame since 2005, I couldn't find a C or Objective-C library that parses this frame correctly. So, inspired by a post over on the auphonic blog, I started this modest Objective-C implementation.
MNAVChapter represents a chapter within a media file.
title
The title of the chapter.
@property (nonatomic, copy) NSString *title;
url
A URL string of the chapter.
@property (nonatomic, copy) NSString *url;
time
The start time of the chapter.
@property (nonatomic) CMTime time;
duration
The duration of the chapter.
@property (nonatomic) CMTime duration;
artwork
An embedded chapter image.
@property (nonatomic) UIImage *artwork;
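For orientation, here is a minimal sketch (not part of the library) of how a parsed chapter's fields might be consumed. It assumes the chapter class is named MNAVChapter and that the umbrella header is MNAVChapters.h, and it uses CMTimeGetSeconds from CoreMedia to convert the start time and duration for display:

#import <CoreMedia/CoreMedia.h>
#import "MNAVChapters.h"

// Format a single chapter for logging; the function name is hypothetical.
static NSString *MyDescribeChapter(MNAVChapter *chapter) {
  Float64 start = CMTimeGetSeconds(chapter.time);        // start time in seconds
  Float64 duration = CMTimeGetSeconds(chapter.duration); // duration in seconds
  return [NSString stringWithFormat:@"%@ starts at %.2f s and runs %.2f s",
    chapter.title, start, duration];
}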
MNAVChapterReader is the parser that reads chapter marks from timed audiovisual media. It attempts to read chapter information from assets with "org.mp4ra" or "org.id3" metadata formats.
+ chaptersFromAsset:
Make sense of an AVAsset object and, if possible, return an array of chapters.
+ (NSArray *)chaptersFromAsset:(AVAsset *)asset;
Here is an example of reading chapter marks from one of the auphonic demo files:
AVAsset *asset = [self assetWithResource:@"auphonic_chapters_demo" ofType:@"mp3"];
NSArray *chapters = [MNAVChapterReader chaptersFromAsset:asset];
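The assetWithResource:ofType: helper used above isn't shown in this README; here is a hedged sketch of what it might look like, together with a loop over the parsed chapters. The helper body and the loop are assumptions for illustration, not library code:

#import <AVFoundation/AVFoundation.h>

- (AVAsset *)assetWithResource:(NSString *)name ofType:(NSString *)type {
  // Locate the bundled demo file and wrap it in an AVURLAsset; sketch only.
  NSString *path = [[NSBundle bundleForClass:[self class]]
    pathForResource:name ofType:type];
  return [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:path] options:nil];
}

for (MNAVChapter *chapter in chapters) {
  // Log each chapter's title and start time in seconds.
  NSLog(@"%@ at %.2f s", chapter.title, CMTimeGetSeconds(chapter.time));
}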
Add the MNAVChapters Xcode project to your workspace, or create a release build and use the object files from the build directory by running:
$ make
Run the tests from the command line:
$ make test
This repo contains an Xcode workspace with an easy-to-use example app, written in Swift. When you run the app for the first time and tap one of the episodes, be patient: the app has to download the media files, which, depending on your network, might take some time. Once received, the files are copied to "/Library/Caches/" and read from there afterwards.
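To illustrate that flow, here is a sketch under assumptions, not the example app's actual Swift code; the function name, file name handling, and lack of error handling are hypothetical:

#import <AVFoundation/AVFoundation.h>
#import "MNAVChapters.h"

// Copy a downloaded file into Library/Caches and read its chapters from there.
static NSArray *MYCacheAndReadChapters(NSURL *downloadedURL, NSString *fileName) {
  NSFileManager *fm = [NSFileManager defaultManager];
  NSURL *caches = [fm URLForDirectory:NSCachesDirectory
                             inDomain:NSUserDomainMask
                    appropriateForURL:nil
                               create:YES
                                error:NULL];
  NSURL *target = [caches URLByAppendingPathComponent:fileName];
  [fm copyItemAtURL:downloadedURL toURL:target error:NULL];
  AVAsset *asset = [AVURLAsset URLAssetWithURL:target options:nil];
  return [MNAVChapterReader chaptersFromAsset:asset];
}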