In this guide, you will learn how to embed Timed Metadata within a live stream.

What is Timed Metadata?

Timed Metadata is metadata with timestamps. You can insert it into a live stream using the AuroraLive API.
When AuroraLive processes a live stream, the metadata is synchronized with the video frames.
During playback, all viewers of the stream receive the metadata at the same time relative to the stream. The timecode serves as a cue point, which can be used to trigger an action based on the data, such as the following:

  • Updating player statistics for a sports stream.
  • Sending product details for a live shopping stream.
  • Sending questions for a live quiz stream.

Timed Metadata is embedded in the video segments as ID3 tags. As a result, it is also available in the recorded video.
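
The consumption examples later in this guide decode the payload as a UTF-8 JSON string, so a small JSON object is a natural choice for the payload. The fields below are purely illustrative and are not defined by AuroraLive; they sketch what a live shopping stream might embed:

{
  "type": "product",
  "productId": "sku-123",
  "title": "Travel mug",
  "price": "19.99"
}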

Inserting Timed Metadata

You can insert Timed Metadata only into an active stream.

To insert Timed Metadata programmatically, use the PutMetadata endpoint described in the AuroraLive API Reference.
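
The exact request shape is defined in the AuroraLive API Reference; the sketch below only illustrates the general pattern. It assumes a REST-style PutMetadata endpoint addressed by stream ID, bearer-token authentication, and a JSON payload, none of which are confirmed by this guide:

// Minimal TypeScript sketch of calling PutMetadata.
// The URL, header names, and body shape are assumptions for illustration;
// see the AuroraLive API Reference for the authoritative parameters.
async function putTimedMetadata(streamId: string, payload: object): Promise<void> {
  const response = await fetch(
    `https://api.auroralive.example/v1/streams/${streamId}/metadata`, // hypothetical URL
    {
      method: "PUT",
      headers: {
        "Content-Type": "application/json",
        "Authorization": "Bearer <API_TOKEN>", // hypothetical auth scheme
      },
      // The payload is embedded into the stream as an ID3 tag at the point
      // where it is inserted, and delivered to every viewer during playback.
      body: JSON.stringify(payload),
    }
  );
  if (!response.ok) {
    throw new Error(`PutMetadata failed with status ${response.status}`);
  }
}

// Example: push product details during a live shopping stream.
putTimedMetadata("my-stream-id", { type: "product", productId: "sku-123" })
  .catch((err) => console.error(err));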

Consuming Timed Metadata

Use a player that supports timed ID3 metadata to consume Timed Metadata embedded in a live stream.
An event is triggered whenever playback reaches a segment with embedded metadata. You can use this event to initiate functionality within your client application.
This event is triggered for both live and recorded content.

HLS.js (v1.2.0 and up) for Web:

hlsjs.on(Hls.Events.FRAG_PARSING_METADATA, (_event, data) => {
  // hlsjs is an Hls instance already attached to this.player (the <video> element).
  for (let textTrack of this.player.textTracks) {
    // Only the ID3 metadata track created by HLS.js carries Timed Metadata.
    if (textTrack.kind !== 'metadata' || textTrack.label !== 'id3') {
      continue
    }
    // Attach the cue handler only once per track.
    if (textTrack.oncuechange != null) {
      continue
    }
    let lastCueStartTime = -1
    textTrack.oncuechange = (cueEvent) => {
      for (let cue of cueEvent.target.activeCues) {
        // Chrome fires oncuechange twice per cue: the first event still lists the
        // previous cue, the second lists the current one. Compare start times to
        // ignore the repeated message.
        if (cue.startTime <= lastCueStartTime) {
          continue
        }
        lastCueStartTime = cue.startTime
        let jsonMsg = new TextDecoder().decode(cue.value.data)
        console.log("timed metadata: " + jsonMsg)
      }
    }
  }
})

AVPlayer (iOS 8.0+) for iOS:

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"http://127.0.0.1/timed_id3.m3u8"]]; // example URL
[playerItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:nil];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];

// KVO callback on the observing object
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"timedMetadata"]) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        for (AVMetadataItem *metadata in item.timedMetadata) {
            if (metadata.value == nil) {
                NSLog(@"warning: timed metadata is nil.");
                continue;
            }
            NSData *data = (NSData *)metadata.value;
            NSString *jsonString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
            NSLog(@"timed metadata: %@", jsonString);
        }
    }
}

ExoPlayer (v2.10.6 and up) for Android:

// Uses com.google.android.exoplayer2.metadata.Metadata, MetadataOutput,
// and com.google.android.exoplayer2.metadata.id3.PrivFrame.
SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context, renderersFactory, trackSelector, drmSessionManager);
player.addMetadataOutput(new MetadataOutput() {
    @Override
    public void onMetadata(Metadata metadata) {
        for (int i = 0; i < metadata.length(); i++) {
            Metadata.Entry entry = metadata.get(i);
            // Timed Metadata arrives as an ID3 PRIV frame.
            if (entry instanceof PrivFrame) {
                PrivFrame timedMetadata = (PrivFrame) entry;
                Log.d("TimedMetadata", new String(timedMetadata.privateData));
            }
        }
    }
});