This documentation covers tracking in version 2.x of the SDK.
If you are implementing a 1.x version of the SDK, you can download the 1.x Developer Guides here: Download SDKs.
Tracking core playback includes tracking media load, media start, media pause, and media complete. Although not mandatory, tracking buffering and seeking are also core components used for tracking content playback. In your media player API, identify the player events that correspond with the Media SDK tracking calls, and code your event handlers to call tracking APIs, and to populate required and optional variables.
Player Event | Media SDK Tracking Call |
---|---|
Media load | trackSessionStart; for example: trackSessionStart(mediaObject, contextData) |
Media start | trackPlay |
Media pause | trackPause |
Media resume | trackPlay (when playback resumes) |
Media complete | trackComplete |
Media unload | trackSessionEnd |
Seek start | trackEvent(SeekStart) |
Seek complete | trackEvent(SeekComplete) |
Buffer start | trackEvent(BufferStart) |
Buffer complete | trackEvent(BufferComplete) |
Initial tracking setup - Identify when the user triggers the intention of playback (the user clicks play and/or autoplay is on) and create a MediaObject instance using the media information for content name, content ID, content length, and stream type.
MediaObject reference:
Variable Name | Description | Required |
---|---|---|
name | Content name | Yes |
mediaid | Content unique identifier | Yes |
length | Content length | Yes |
streamType | Stream type | Yes |
mediaType | Media type (audio or video content) | Yes |
StreamType constants:
Constant Name | Description |
---|---|
VOD | Stream type for Video on Demand. |
LIVE | Stream type for Live content. |
LINEAR | Stream type for Linear content. |
AOD | Stream type for Audio on Demand. |
AUDIOBOOK | Stream type for Audiobook. |
PODCAST | Stream type for Podcast. |
MediaType constants:
Constant Name | Description |
---|---|
Audio | Media type for Audio streams. |
Video | Media type for Video streams. |
The general format for creating the MediaObject is: MediaHeartbeat.createMediaObject(<MEDIA_NAME>, <MEDIA_ID>, <MEDIA_LENGTH>, <STREAM_TYPE>, <MEDIA_TYPE>);
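For example, a minimal sketch of creating the media object for a VOD video stream (the name, ID, and 300-second length are illustrative values only):
var mediaObject = MediaHeartbeat.createMediaObject(
    "Sample Video",            /* content name (illustrative) */
    "sample-video-id",         /* content unique identifier (illustrative) */
    300,                       /* content length in seconds (illustrative) */
    MediaHeartbeat.StreamType.VOD,
    MediaHeartbeat.MediaType.Video);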
Attach metadata - Optionally attach standard and/or custom metadata objects to the tracking session through context data variables.
Standard metadata - Attaching the standard metadata object to the media object is optional. Instantiate a standard metadata object, populate the desired variables, and set the metadata object on the media object. See the comprehensive list of metadata here: Audio and video parameters.
Custom metadata - Create a variable object for the custom variables and populate it with the data for this content.
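A minimal sketch of both options, using the same illustrative keys and values as the full example at the end of this page (mediaObject is the media object created above):
/* Standard metadata: populate the desired keys and set them on the media object */
var standardVideoMetadata = {};
standardVideoMetadata[MediaHeartbeat.VideoMetadataKeys.EPISODE] = "Sample Episode";
standardVideoMetadata[MediaHeartbeat.VideoMetadataKeys.SHOW] = "Sample Show";
mediaObject.setValue(MediaHeartbeat.MediaObjectKey.StandardVideoMetadata, standardVideoMetadata);
/* Custom metadata: a plain object of context data variables, passed later to trackSessionStart */
var customVideoMetadata = {
    isUserLoggedIn: "false",
    tvStation: "Sample TV station"
};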
Track the intention to start playback - To begin tracking a session, call trackSessionStart on the Media Heartbeat instance.
trackSessionStart tracks the user intention of playback, not the beginning of the playback. This API is used to load the data/metadata and to estimate the time-to-start QoS metric (the time duration between trackSessionStart and trackPlay).
If you are not using custom metadata, simply send an empty object for the data argument in trackSessionStart.
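For example, a minimal sketch, assuming mediaHeartbeat is an already configured MediaHeartbeat instance, showing one call for each case:
/* With custom metadata */
mediaHeartbeat.trackSessionStart(mediaObject, customVideoMetadata);
/* Without custom metadata, send an empty object for the data argument */
mediaHeartbeat.trackSessionStart(mediaObject, {});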
Track the actual start of playback - Identify the event from the media player for the beginning of the playback, where the first frame of the content is rendered on the screen, and call trackPlay.
Track the completion of playback - Identify the event from the media player for the completion of the playback, where the user has watched the content until the end, and call trackComplete.
Track the end of the session - Identify the event from the media player for the unloading/closing of the playback, where the user closes the content and/or the content is completed and has been unloaded, and call trackSessionEnd.
trackSessionEnd marks the end of a tracking session. If the session was successfully watched to completion, where the user watched the content until the end, ensure that trackComplete is called before trackSessionEnd. Any other track* API call is ignored after trackSessionEnd, except for trackSessionStart for a new tracking session.
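A minimal sketch of the required ordering when content finishes, assuming the player raises a single end-of-content event (as the HTML5 example below does with "ended"):
/* The user watched the content to the end */
mediaHeartbeat.trackComplete();
/* Then close the tracking session */
mediaHeartbeat.trackSessionEnd();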
Track all possible pause scenarios - Identify the event from the media player for pause and call trackPause.
Pause scenarios - Identify any scenario in which the player will pause and make sure that trackPause() is properly called in each of them.
Identify the event from the player for play and/or resume from pause and call trackPlay.
This may be the same event source that was used to track the actual start of playback. Ensure that each trackPause() API call is paired with a following trackPlay() API call when the playback resumes.
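A minimal sketch of the pause/resume pairing, assuming the player raises distinct pause and play events (as in the HTML5 example below, where e is the player event):
if (e.type == "pause") {
    mediaHeartbeat.trackPause();
}
if (e.type == "play") {
    /* Playback resumed: pair the earlier trackPause() with trackPlay() */
    mediaHeartbeat.trackPlay();
}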
Listen for playback seeking events from the media player. On seek start event notification, track seeking using the SeekStart event.
On seek complete notification from the media player, track the end of seeking using the SeekComplete event.
Listen for playback buffering events from the media player, and on buffer start event notification, track buffering using the BufferStart event.
On buffer complete notification from the media player, track the end of buffering using the BufferComplete event.
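A minimal sketch of the four event calls, assuming the player exposes seek and buffer start/complete notifications (the event names here match the HTML5 example below and are illustrative):
if (e.type == "seeking")   { mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekStart); }
if (e.type == "seeked")    { mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekComplete); }
if (e.type == "buffering") { mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferStart); }
if (e.type == "buffered")  { mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferComplete); }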
See examples of each step in the following platform-specific topics, and look at the sample players included with your SDKs.
For a simple example of playback tracking, see this use of the JavaScript 2.x SDK in an HTML5 player:
/* Call on media start */
if (e.type == "play") {
// Check for start of media
if (!sessionStarted) {
/* Set media info */
/* MediaHeartbeat.createMediaObject(<MEDIA_NAME>,
<MEDIA_ID>,
<MEDIA_LENGTH>,
<MEDIA_STREAMTYPE>,
<MEDIA_MEDIATYPE>);*/
var mediaInfo = MediaHeartbeat.createMediaObject(
document.getElementsByTagName('video')[0].getAttribute("name"),
document.getElementsByTagName('video')[0].getAttribute("id"),
video.duration,
MediaHeartbeat.StreamType.VOD,
MediaHeartbeat.MediaType.Video);
/* Set custom context data */
var customVideoMetadata = {
isUserLoggedIn: "false",
tvStation: "Sample TV station",
programmer: "Sample programmer"
};
/* Set standard video metadata */
var standardVideoMetadata = {};
standardVideoMetadata[MediaHeartbeat.VideoMetadataKeys.EPISODE] = "Sample Episode";
standardVideoMetadata[MediaHeartbeat.VideoMetadataKeys.SHOW] = "Sample Show";
mediaInfo.setValue(MediaHeartbeat.MediaObjectKey.StandardVideoMetadata,
standardVideoMetadata);
// Start Session
this.mediaHeartbeat.trackSessionStart(mediaInfo, customVideoMetadata);
// Track play
this.mediaHeartbeat.trackPlay();
sessionStarted = true;
} else {
// Track play for resuming playback
this.mediaHeartbeat.trackPlay();
}
};
/* Call on video complete */
if (e.type == "ended") {
console.log("video ended");
this.mediaHeartbeat.trackComplete();
this.mediaHeartbeat.trackSessionEnd();
sessionStarted = false;
};
/* Call on pause */
if (e.type == "pause") {
this.mediaHeartbeat.trackPause();
};
/* Call on scrub start */
if (e.type == "seeking") {
this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekStart);
};
/* Call on scrub stop */
if (e.type == "seeked") {
this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.SeekComplete);
};
/* Call on buffer start */
if (e.type == "buffering") {
this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferStart);
};
/* Call on buffer complete */
if (e.type == "buffered") {
this.mediaHeartbeat.trackEvent(MediaHeartbeat.Event.BufferComplete);
};
For information on validating your legacy implementation, see Legacy Validation.