Friday, October 30, 2009

Segmenting Compound Metrics

I recently posted an article about segmenting metrics, including compound metrics. In this post we will see how looking at the components of compound metrics can lead to greater business insights.

A typical compound metric is Page Consumption (page views divided by visits). This is a proxy for user interest. Another compound metric is Repeat Use (visits divided by unique users). This is a proxy for interest over time. There are many compound metrics, some common and some exotic. They are expressed as rates, percentages, ratios, rankings, indicators, yields, averages and all sorts of things in hopes of inferring user behavior. Web Analysts delight in creating calculated measures.
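The arithmetic behind these two compound metrics is simple division over raw counts. A minimal sketch, using made-up numbers purely for illustration:

```python
# Deriving compound metrics from raw counts (illustrative numbers only).
page_views = 120_000
visits = 30_000
unique_users = 12_000

page_consumption = page_views / visits   # PV/Visit: proxy for user interest
repeat_use = visits / unique_users       # Visits/User: proxy for interest over time

print(f"Page Consumption: {page_consumption:.1f} PV/Visit")
print(f"Repeat Use: {repeat_use:.1f} visits/user")
```

The point is that every compound metric hides at least two underlying numbers, which is exactly what makes segmenting them worthwhile.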

For our example we will use Page Consumption. The approach can be applied to any compound metric, however. We will look at the action items that may follow from looking at the calculated PV/Visit and then the Pages and Visits separately.

Typically, if you request a Page Consumption report from your Analyst, you will get some numbers and a chart that may look like the one below. This example compares the PV/Visit for six subject areas on a media site that makes its money from the volume of page views.

A PV/Visit Report

“Home Repair” and “Decorating – Men” are your clear content winners. They drive a lot of pages for a visit relative to Music and Gardening. One action might be to buy some more keywords for the lagging areas or put more links to them from other areas of the site. You can spend fewer resources on “Home Repair” and “Decorating – Men”. Those topics are doing great.

However, there is more you need to know before you act. Also request the information above broken out by its components. The second chart presents the same data with the metric components side by side. It provides a more complete picture and suggests a very different set of to-dos.

A segmented PV/Visit Report

For example, it’s apparent the men’s decorating section has low volume but a really high level of consumption (the first chart shows it has almost the same consumption as “Home Repair”). Try to drive more visits there. A few more visits will go a long way. “Decorating – Men” is not doing as well as it first appeared.

The music section has a relatively low volume of consumption but the highest visit level. There could be many reasons for this. Is there a usability problem? Does the content suck? Is the site built in such a way that one can spend hours listening to content but all on one page? Whatever the reason, if you just drive more traffic to Music, your reward in increased page views will be limited.

Home Repair looks great in the first graph, but the second graph shows an issue with the drivers. It has the second lowest visit level. You might consider spending resources to drive more visits here. New visits are likely to drive your volume at a higher rate than visits to the other site sections.

As you can see, the second graph can lead to very different action items; actions not so easily understood from the top graph alone. This is a really simple example to illustrate the power of segmentation. In the real world, even this segmentation would not be the end of the investigative road. For example, one might look to see whether a larger audience of men in need of decorating even exists.


Sunday, October 11, 2009

Comparative Metrics for Media Sites

If you have been in marketing for any length of time, you know a given number presented in isolation means nothing. For example, the answer could be “42”. What does that mean? Is it good or bad? It has meaning only when compared to something. Making that comparison is the first step in answering the question “What should I do?” For media web sites there are some standard comparisons that can be made. This article discusses various metric “baselines”.

In this article we will look at “internal” comparisons, comparing your site’s performance to itself. This is distinct from an “external” comparison, comparing your site’s performance to your competitor’s sites or industry “norms”.

Comparing Site Performance to Itself

An internal comparison looks at the performance of an element on your site where the numbers create their own baseline. These are “relative” comparisons used to identify an increase or decrease and by how much. It answers the basic question “Is it better now?” There are basically three ways to compare the metrics: trended, segmented, and contribution.

A trended metric is a time-based measure; it looks at the number over time. In January it was 42, in February it is 43. This is a typical marketing measure for media sites. It shows you how things are performing over some useful duration. In effect, the history of the measure becomes the relative benchmark for the current value.

Some forms of testing are an example of simple trend tracking. In this case one compares the value before and after a change is made; the point in time of the change becomes the baseline for the comparison.

Trending is often used in the context of segmented and contribution comparisons (described below). In fact, trending is one of the most useful measures of success you have.

This is an example of a trended report from Omniture showing performance for a subject area:

Web Analytics Trended Report

This type of comparison allows you to monitor overall performance, then drill down to see what is affecting the higher level numbers. It is a vertical segmentation. Changes in higher level tracking are investigated by segmenting the contributing metrics at a more granular level to see what changed. Typically, if a trend changes, you investigate the reasons for the change by segmenting to find the “contributors”.

The basic segmentation levels are:

  1. Site
  2. Page Group
  3. Page
  4. Link Group (Module)
  5. Link

For example, if you are looking at PV/Visit for the site overall you would investigate any changes by looking at the same metric for various page groups. These are sub-sets (segmentations) of the site. Examples of page groups are subject, page type (form, article, news, etc.), site area, tool, and application. You can then go further and look at the PV/Visit of individual pages within the page group and then the links on any given page of interest.
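The drill-down described above can be sketched as a small script. The section names and counts here are hypothetical, chosen only to show the site-level number alongside its page-group breakdown:

```python
# Hypothetical drill-down: PV/Visit at the site level, then by page group.
page_groups = {
    "Home Repair":      {"page_views": 40_000, "visits": 5_000},
    "Decorating - Men": {"page_views": 8_000,  "visits": 1_000},
    "Music":            {"page_views": 18_000, "visits": 9_000},
}

# Site-level compound metric (the top of the vertical segmentation).
site_pv = sum(g["page_views"] for g in page_groups.values())
site_visits = sum(g["visits"] for g in page_groups.values())
print(f"Site PV/Visit: {site_pv / site_visits:.1f}")

# One level down: the same metric per page group.
for name, g in page_groups.items():
    print(f"{name}: {g['page_views'] / g['visits']:.1f} PV/Visit")
```

The same pattern repeats at each level: pages within a group, then links within a page.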

Note that these are all page segmentations rather than audience segmentations. In addition to tracking what pages changed, you can also segment by audience to determine if the audience or its behavior has changed.

These types of segmentation will tell you what components are driving changes on your site. In general, the more granular the level of tracking, the more the information tends to be tactical and actionable. Looking at the trends of your segments can provide insight into the near future of your site.

There is another type of segmentation that applies to compound metrics. These are rates, percentages, and the like, such as Page Consumption (PV/Visit), Repeat Use (Visits/User) or Visit Duration (Time Spent/Visit). In this case, when the metric changes, look at the components of the metric itself to see what changed. For example, for Page Consumption, did the Page Views or the Visits change, or both? Again, this can be looked at for various levels of the site to understand what is driving the change.
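A quick sketch of that component check, with invented month-over-month numbers: the compound metric falls, and comparing each component period over period shows which one moved.

```python
# Did Page Consumption change because of Page Views, Visits, or both?
# Numbers are illustrative, not real data.
jan = {"page_views": 100_000, "visits": 25_000}
feb = {"page_views": 100_000, "visits": 31_250}

def page_consumption(period):
    return period["page_views"] / period["visits"]

print(f"Jan PV/Visit: {page_consumption(jan):.2f}")
print(f"Feb PV/Visit: {page_consumption(feb):.2f}")

# Compare each component period over period.
for component in ("page_views", "visits"):
    change = (feb[component] - jan[component]) / jan[component]
    print(f"{component}: {change:+.0%}")
# In this made-up case the drop in PV/Visit is driven entirely by a rise
# in visits, not a fall in page views -- a very different story to act on.
```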

Contribution compares one part of your site to another similar thing on your site. It is a horizontal comparison which allows you to compare pages, page sets, and applications at the same level. It provides insight into relative value. Is your News section providing more business value than your Sports area?

You need to make sure you are comparing similar levels and similar things. For example, you can compare different subject areas but not the contribution of a subject such as “Flu” to your site search application. In other words, make sure you are comparing apples to apples or fruit to fruit, not apples to plywood.

Contribution can be an extremely helpful way to understand performance. Did one segment improve at the expense of another or was there a net gain? For example, you launch a new application on your site. You see that it quickly contributes 10% to your site’s overall page views. Great! But wait. You see that your site did not grow overall; it stayed the same. This is actually quite common. You have shifted some of your existing audience to your new application from somewhere else on your site. By looking at the contribution of the other applications you can see where the audience for your new application came from and then determine whether you have shifted your users to higher or lower value pages.
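The launch scenario above can be sketched as a contribution calculation. The section names and page-view counts are hypothetical; the totals are deliberately flat so the new application's share must have come from somewhere else on the site:

```python
# Contribution: each application's share of total page views, before and
# after a hypothetical new-app launch (illustrative numbers).
before = {"News": 50_000, "Sports": 30_000, "Recipes": 20_000}
after  = {"News": 50_000, "Sports": 20_000, "Recipes": 20_000, "New App": 10_000}

def contribution(sections):
    total = sum(sections.values())
    return {name: pv / total for name, pv in sections.items()}

for name, share in contribution(after).items():
    prior = contribution(before).get(name, 0.0)
    print(f"{name}: {share:.0%} (was {prior:.0%})")
# Sports dropped from 30% to 20% while the site total stayed flat --
# the likely source of the new application's audience.
```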

As with the segmentations noted in the previous section, you can trend contribution over time. By comparing the trends for similar things, you can very quickly see the contribution and interdependence of your site’s elements.

Contribution can also help you know where to invest resources and attention. For example, you may have a topic such as “Gardening” that generates very high page consumption. However, it contributes only a very small portion of your site’s page views compared to other topics such as “Decorating” or “Recipes”. Once identified, you can then look at the growth potential of Gardening and decide whether or not to spend dollars promoting it, or whether the current Gardening dollars are better spent improving your high value Recipe section.

This is an example of a trended comparison report using Omniture:

Web Analytics Contribution Report


Sunday, August 9, 2009

Monitoring Reports and their Attributes

Reports can be generalized into different types. For example, there are testing reports, ad-hoc reports, predictive reports, dashboards, etc. They have different purposes. The most common type of reporting is the monitoring report.

Monitoring reports are intended to do just that: monitor. They contain the metrics for your site or product that you look at all the time. This differentiates them from reports that are meant to answer specific short term questions such as testing reports or ad-hoc reporting.

Monitoring reports are not necessarily actionable on their own. Changes in the numbers are often investigated further to determine the cause of the change and the appropriate action needed.

The monitoring report contains the metric drivers for the business and detail about the components of those drivers. For example, if a key metric is repeat use, the report would also provide the visit and visitor components of the repeat use metric. It would also provide the tracking level below, to show what contributed to the higher-level figure. For repeat use for the site overall, this lower level of detail might be repeat use by site section, application, or other functional area.

The report is often presented in a trended view and/or in comparison to other similar site elements. In the first case, you are comparing the thing being tracked to itself over time. In the latter case, you are looking at relative value. These become your benchmarks by which you evaluate your tracking.

The report, by nature of its purpose, is provided on a regular schedule. This can be monthly, weekly, daily or whatever the business needs. If you don’t need the information on a regular basis, then it does not fall into this category of reporting.

In order to do apples-to-apples comparisons the report needs to be consistent over time. It should track the same metrics in August as it does in January. Because the reports are consistent, they can often be automated. In fact, the site itself should be built specifically to track these metrics consistently.

Note that this is a report and not a dashboard. A dashboard is meant to contain only the top-level KPIs and would not include the level of detail one would expect in a monitoring report.

When asking your analyst for reporting, keep in mind what kind of report you are requesting. This will help both you and your report provider to better understand the goals of the request.
