Track video playback to Adobe Media Analytics through Adobe Launch
28 October 2020 update: updated instructions and code for the Adobe Media Analytics 3.x SDK.
For publishers with a lot of video (or audio) content on their websites, knowing how much of that content is consumed helps them continually produce media that meets their audience's demands.
For publishers that use Adobe Analytics as their measurement platform, video player interactions, like pressing the Play or Pause buttons, can be tracked with Custom Links.
But to track continuous playback, publishers should measure with Adobe Media Analytics. This tracks "heartbeats", where image requests are sent at regular intervals as the video is played. This allows the publisher to report on which parts of a video are most (or least) watched.
In this three-part series, I describe how to use Adobe Experience Platform Launch (or Adobe Launch, as it's more commonly known) to track video playback to Adobe Analytics or Adobe Media Analytics:
- Part one: tracking video playback events using Adobe Analytics' Custom Links.
- Part two (this part): tracking video playback events to Adobe Media Analytics, Adobe's media tracking and reporting solution.
- Part three: tracking Brightcove video playback events to Adobe Media Analytics, instead of Brightcove's own analytics integration.
(Each part is written as a standalone guide, so you don't need to read them one-by-one.)
There are a few things to note:
- These instructions work with HTML5 audio and video only, because they depend on the native events that modern web browsers provide when working with these assets. These instructions will not work for any other kinds of video, particularly those that depend on specific browser plug-ins or extensions.
- These instructions will not work with YouTube video embeds. That is because the embeds use an <IFRAME> instead of a regular <VIDEO> HTML element.
- Although I use video in these instructions, the same method will work for audio as well. This is because the HTML5 media specification works with both kinds of formats, and I am not tracking anything that is video-specific, like picture frame size.
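For context, these instructions rely on the browser's own HTML5 media events. As a rough illustration (not part of the Adobe Launch setup itself), the same four events used in this guide can be observed with plain JavaScript, assuming the page contains a <VIDEO> element:
var video = document.querySelector('video'); // the first <VIDEO> element on the page
['loadeddata', 'play', 'pause', 'ended'].forEach(function (eventType) {
  video.addEventListener(eventType, function (e) {
    // These are the same native events that Adobe Launch's "Media" Event types listen for.
    console.log('Native media event:', e.type);
  });
});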
Adobe Experience Platform Launch, more commonly known as Adobe Launch, is Adobe's tag management solution. To find out more about Adobe Launch, you can read about it in Adobe's documentation.
Recipe to track video playback events to Adobe Media Analytics through Adobe Launch
- Installation of the Adobe Media Analytics (3.x SDK) extension.
- A Rule to associate with video playback.
- Events in the Rule to detect the different kinds of playback events, like "Loaded", "Play", "Pause" and "Ended".
- An Action in the Rule to send the tracking data to Adobe Media Analytics.
In these instructions, I show how to track the "loaded", "play", "pause", and "ended" video playback events.
1. Install the Adobe Media Analytics extension.
The full name of the extension is "Adobe Media Analytics (3.x SDK) for Audio and Video", but most of the time, it's just referred to as "Adobe Media Analytics" (or even by its much older name, "MediaHeartbeat"). In any case, you'll mostly see references to "Media" later when configuring the Action for the Rule.
Adobe Media Analytics extension configuration
Fill in as much or as little information as you want in the extension. This is to track information about your player.
The most important field is the "Variable Name". This needs to be a name that is not used by any other JavaScript variable in your website. Remember the name that is chosen here, because it will be needed when configuring the Action for the Rule later.
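For example, assuming the "Variable Name" was set to ADB (the name used in the Action code later in this post), a quick check in the web browser's console confirms that the extension has loaded and exposed the Media object:
// "ADB" should be whatever was entered in the extension's "Variable Name" field.
if (window.ADB && window.ADB.Media) {
  console.log('Adobe Media Analytics is available.');
} else {
  console.log('Adobe Media Analytics is missing or has not loaded yet.');
}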
You should also have installed the Experience Cloud ID Service extension. Adobe Media Analytics depends on this Service.
2. Create the Rule to associate with video playback.
Only one Rule is really needed to work with all of the video playback events. This will be explained in the next step when creating the Events.
3. Create the Events in the Rule, one for each kind of video playback event.
Adobe Launch's Core extension can detect all of the common media-related event types, and the names of the event types are quite self-explanatory. This saves us the trouble of having to write custom code to detect these events from the web browser.
Reminder: These media-related event types work with HTML5 audio and video only! They will not work with any other kinds of audio/video, particularly those that depend on specific browser plug-ins or extensions, nor with YouTube video embeds.
Adobe Launch's Core extension's "Media" Event types
To start, choose the "Media Play" Event type. Then, set up the rest of the Event Configuration as needed. This is where the CSS selector of the <VIDEO> element in the web page needs to be specified.
Adobe Launch's Core extension's "Media" Event configuration
This is what my "Media Play" Event's configuration looks like:
"Media Play" Event configuration |
I have also updated the configuration's name to reflect the setup. That way, when I look at my Rule's setup, I can instantly see how I have configured my "Media Play" Event without needing to open up the Event later on.
Rule with "Media Play" Event |
The beauty of an Adobe Launch Rule is that you can add more than one Event to it. Then, when any of the Events occurs while the user browses your website, the Rule is triggered immediately.
For our purpose, that means that instead of creating separate Rules for each type of video playback event, all of the playback events can be included under one Rule, like so:
Rule with four "Media" Events
But when it comes to the actual tracking, how will Adobe Launch know which event to track? For example, I wouldn't want a "pause" to be incorrectly tracked as a "play". That will be solved in the next section, when setting up the Action for this Rule.
4. Create the Action to send the tracking data to Adobe Media Analytics
At this point, the Adobe Analytics, Adobe Media Analytics and Experience Cloud ID Service extensions should all be installed in the Adobe Launch property already. Otherwise, the rest of these instructions are moot.
Only one Action is needed, and it contains all of the Custom Code needed to track the video playback to Adobe Media Analytics.
At the time of this writing, there is no other way to send tracking data to Adobe Media Analytics except with Custom Code. Add an Action from the Core extension with the Action Type "Custom Code". Click "Open Editor" and add the following code:
// Initialise Adobe Media Analytics.
// "ADB" is the global variable declared inside the Adobe Media Analytics extension.
var ADB = window["ADB"];
var Media = ADB.Media;
var tracker = Media.getInstance();
// Store a reference to the <VIDEO> / <AUDIO> element in "player" to use with the rest of the script.
var player = this;
// Get a reference to the Media event type.
var eventType = event.nativeEvent.type;
var mediaObject = mediaObject || null;
// Create the mediaObject if this is a new video or it cannot be found.
if (eventType === 'loadeddata' || !mediaObject) {
  // Create the mediaObject from the <VIDEO> / <AUDIO> element's media.
  var mediaName = player.currentSrc,
      mediaId = player.currentSrc,
      mediaLength = player.duration,
      mediaStreamType = Media.StreamType.VOD,
      mediaType = Media.MediaType.Video;
  mediaObject = Media.createMediaObject(mediaName, mediaId, mediaLength, mediaStreamType, mediaType);
}
// Track the mediaObject with the appropriate call for the event type.
switch (eventType) {
  case 'loadeddata':
    tracker.trackSessionStart(mediaObject);
    break;
  case 'play':
    tracker.trackPlay();
    break;
  case 'pause':
    // A "pause" event is sent with the "ended" event, so don't track the pause in that case.
    if (Math.ceil(player.currentTime) < Math.ceil(player.duration)) {
      tracker.trackPause();
    }
    break;
  case 'ended':
    tracker.trackComplete();
    break;
}
Here's what each part does:
// Initialise Adobe Media Analytics.
// "ADB" is the global variable declared inside the Adobe Media Analytics extension.
var ADB = window["ADB"];
var Media = ADB.Media;
var tracker = Media.getInstance();
Recall that when setting up the Adobe Media Analytics extension, the "Variable Name" was set to ADB. This is where that variable is used. It provides access to the Media object, the guts of Adobe Media Analytics that provides important functions needed by the rest of the code.
Then, the tracker object is obtained. This is the object that handles the actual video playback tracking. It is used later in the code.
// Store a reference to the <VIDEO> / <AUDIO> element in "player" to use with the rest of the script.
var player = this;
This line of code is required because of the way JavaScript handles scope: inside the Custom Code Action, this refers to the <VIDEO> / <AUDIO> element that triggered the Event, but inside any nested function, this would no longer point at that element. Saving a reference in player keeps the element accessible throughout the script. (Learn more about scopes from W3Schools.)
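As a generic illustration of the scope issue (not specific to Adobe Media Analytics): once the element has been saved in player, it remains accessible even inside nested functions, where this no longer refers to the element:
// Inside the Custom Code Action, "this" is the <VIDEO> / <AUDIO> element that triggered the Event.
var player = this;
setTimeout(function () {
  // In here, "this" no longer refers to the element,
  // but "player" still does, because it was captured in the enclosing scope.
  console.log('Current playback position:', player.currentTime);
}, 1000);
This is also why player, rather than this, is used inside the older delegate code shown further down.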
// Get a reference to the Media event type
var eventType = event.nativeEvent.type;
This gets the HTML5 media event type from Adobe Launch's event object. The returned event type corresponds to Adobe Launch's "Media" Event types.
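For the four Event types used in this Rule, eventType ends up holding one of the following native event name strings (the names in the comments correspond to the Core extension's "Media" Event types):
var eventType = event.nativeEvent.type;
// "Media Loaded Data" -> 'loadeddata'
// "Media Play"        -> 'play'
// "Media Pause"       -> 'pause'
// "Media Ended"       -> 'ended'
This string is what the switch statement at the end of the Action branches on.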
var mediaDelegate = mediaDelegate || null;
With the older 1.x/2.x ("MediaHeartbeat") SDK, Adobe Media Analytics uses a mediaDelegate to get runtime information during the video playback. (Learn more about delegates from a StackOverflow post.) It requires two functions: getCurrentPlaybackTime and getQoSObject. If either of them is missing, then Adobe Media Analytics throws an error and refuses to run.
There are two situations where the mediaDelegate needs to be created:
- A new video has been loaded in the player, so a correspondingly new mediaDelegate should be created for it to ensure that the right metadata is retrieved for that video.
- Since this Action can be run for any of the "Media" Events, the code checks if this mediaDelegate has been created already (which it should have, but this is a fallback, just in case something had messed up previously). If it hasn't, then it is created.
// Set the fixed variables needed for the QoS object.
var startupTime = new Date().getTime(),
    fps = 30, // HTML5 media do not support frame rate, so this should be set to the media's actual frames per second
    droppedFrames = 0; // HTML5 media do not support dropped frames
"QoS" stands for "Quality of Service". Adobe Media Analytics uses it to be able to report on the quality of the video playback that the user experiences. This could be used for server and bandwidth optimisation.
The QoS object needs four values. Of those four, only the bitrate can be derived from the video itself. Because of HTML5's limitations, there is no information about the playback's frames per second or the number of dropped frames, and there is nothing that can be done about that. So those are set to fixed values.
// Create the delegate for MediaHeartbeat.
mediaDelegate = {
  getCurrentPlaybackTime: function () {
    // Get the current played time.
    return player.currentTime;
  },
  getQoSObject: function () {
    // Get the current playback rate.
    var bitrate = player.playbackRate;
    return MediaHeartbeat.createQoSObject(bitrate, startupTime, fps, droppedFrames);
  }
};
With all of the required information available, the mediaDelegate can be created now.
(A mediaDelegate object is not needed any more with Adobe Media Analytics 3.x. It was required with earlier versions.)
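With the 3.x SDK, the equivalent information is instead reported to the tracker directly, when it is needed. A rough sketch, assuming the 3.x API's createQoEObject(), updateQoEObject() and updatePlayhead() methods (check the Media SDK documentation for the exact signatures); note that none of this is wired into the Action code above:
// 3.x style: no delegate. Report quality-of-experience data and the playhead directly.
var startupTime = new Date().getTime(),
    bitrate = 0,       // HTML5 media do not expose the bitrate
    fps = 30,          // set to the media's actual frames per second, if known
    droppedFrames = 0; // HTML5 media do not expose dropped frames
tracker.updateQoEObject(Media.createQoEObject(bitrate, startupTime, fps, droppedFrames));
// Report the playhead as the media plays, for example from a "timeupdate" listener.
player.addEventListener('timeupdate', function () {
  tracker.updatePlayhead(player.currentTime);
});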
var mediaObject = mediaObject || null;
There are two situations where the mediaObject needs to be created:
- A new video has been loaded in the player, so a correspondingly new mediaObject should be created for it to ensure that the right object is set to represent that video.
- Since this Action can be run for any of the "Media" Events, the code checks if this mediaObject has been created already (which, again, it should have, but this is still made available as a fallback, just in case). If it hasn't, then it is created.
// Create the mediaObject from the <VIDEO> / <AUDIO> element's media.
var mediaName = player.currentSrc,
    mediaId = player.currentSrc,
    mediaLength = player.duration,
    mediaStreamType = Media.StreamType.VOD,
    mediaType = Media.MediaType.Video;
mediaObject = Media.createMediaObject(mediaName, mediaId, mediaLength, mediaStreamType, mediaType);
All of this information can be obtained from the <VIDEO> element itself or from the Media object. Here, both the video's name and ID have been set to the same thing: the URL of that video.
Refer to the Media SDK documentation for a full list of stream types.
Refer to the Media SDK documentation for a full list of media types.
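As a side note, since the same method works for audio (see the notes at the top), the object for an <AUDIO> element would be created with the audio constants instead. A sketch, assuming the 3.x constants Media.MediaType.Audio and Media.StreamType.AOD (audio on demand); check the documentation above for the definitive lists:
// For an <AUDIO> element, use an audio stream type and the audio media type instead.
var audioObject = Media.createMediaObject(
  player.currentSrc,    // name
  player.currentSrc,    // id
  player.duration,      // length, in seconds
  Media.StreamType.AOD, // audio on demand
  Media.MediaType.Audio
);
The rest of the Action would stay the same.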
MediaHeartbeat.getInstance(mediaDelegate).then(function (instance) {...}).catch(function (err) {...});
When tracking video playback, MediaHeartbeat needs to get the current time and Quality of Service from the video via the mediaDelegate. But such metadata can only be obtained while the video is playing.
So during tracking, MediaHeartbeat needs to:
- Request metadata about the current time and Quality of Service from the video.
- Wait for the metadata to be sent back to it.
- Continue with tracking.
In JavaScript parlance, getInstance() returns a "promise" that the video metadata will be made available. When that promise is fulfilled (i.e. the video metadata is available), the then() part runs to perform the actual tracking. (Learn more about "promises" from JavaScript.info.)
If an error occurs while trying to get that instance, then the error from MediaHeartbeat is logged to Adobe Launch's log (accessible through your web browser's console or the Adobe Experience Cloud Debugger add-on for your web browser).
Adobe Media Analytics 3.x doesn't need this promise-based setup or explicit instructions for waiting for metadata: Media.getInstance() returns the tracker directly, so you can continue straight to the actual video playback tracking.
// Track the mediaObject.
switch (eventType) {
  case 'loadeddata':
    tracker.trackSessionStart(mediaObject);
    break;
  case 'play':
    tracker.trackPlay();
    break;
  case 'pause':
    // A "pause" event is sent with the "ended" event, so don't track the pause in that case.
    if (Math.ceil(player.currentTime) < Math.ceil(player.duration)) {
      tracker.trackPause();
    }
    break;
  case 'ended':
    tracker.trackComplete();
    break;
}
For each "Media" Event type, the appropriate Adobe Media Analytics tracking function is called.
The tricky part is with the "pause" event. Most web browsers report a "pause" when the video has finished playing. If this were left alone, an extra "pause" would always be tracked, even though the user didn't actually press "Pause" at the end of the video.
So I check if the current time has reached the video's duration. If it has, I assume that the user has really finished watching the video, and I ignore the extra "pause" event emitted by the web browser.
Notice that Adobe Media Analytics' trackSessionEnd() is not used here. This is a judgement call that I made. A playback session lasts for as long as the video is available to the user. Therefore, the session ends when the video is unavailable, which is normally when the user navigates away from the current page. Unfortunately, Adobe Launch does not have a native Event type for this kind of condition, so I have not configured anything like it here.
trackSessionEnd() could have been called with the "ended" event. But then, if the user were to re-play the video, that re-play would not be tracked by Adobe Media Analytics, because its session would already have ended.
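If you did want to end the session when the user leaves the page, one possible approach (only a sketch, and not part of my setup) is to keep a reference to the tracker, for example on the window object under a name of your own choosing, and call trackSessionEnd() when the page is being hidden:
// In the Action's Custom Code, keep a reference to the tracker for later use.
// ("_mediaTracker" is just an illustrative name; it is not part of the code above.)
window._mediaTracker = tracker;
// Elsewhere, end the Media Analytics session when the user leaves the page.
window.addEventListener('pagehide', function () {
  if (window._mediaTracker) {
    window._mediaTracker.trackSessionEnd();
  }
});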
Putting all of the above together, the Rule should look like this, with four Events and one Action:
Complete Rule configuration
No Conditions need to be configured in the Rule for the purpose of tracking video playback events.
Testing the Rule
After you save the Rule, add it to a library, and build that library in Adobe Launch, you can see this Rule in action when you play, pause or finish watching an HTML5 video in your website.
The best way to validate the setup is by looking for image requests to Adobe Media Analytics in your web browser's "Network" console. Refer to Adobe Media Analytics' documentation for validating the image requests:
Test 1: Standard Playback
Test 2: Media Interruption
And that is how video playback events can be tracked to Adobe Media Analytics through Adobe Launch.
This setup works if you want to track HTML5 videos that you add to your web page using a regular <VIDEO> element. But if you use another provider for serving your videos, like Brightcove, then you will need to modify some of the code above, particularly to get the video's metadata. Tracking Brightcove video playback to Adobe Media Analytics will be covered in the next post.