In an earlier post, I discussed the problem of analytics as an afterthought for customer-facing systems. At the end of that post, I quickly outlined an approach that integrates analytics right into the application development process. Here I'll expand on that approach.
We launch customer-facing systems because we want to alter people's behavior in some way. We want users to buy more, serve themselves, and find us online, among many other things.
I use these desired behaviors to anchor the analytics requirements gathering process. Here are the steps:
- List the behaviors you want to measure
- Describe the event(s) that indicate users have behaved in a certain way
- Describe the metric that will tell you about the event(s)
- Define the tags that will be implemented to capture the metrics
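The four steps above can be sketched as a simple worksheet row. This is just an illustrative data model -- the class and field names are my own, not from any analytics product:

```python
# A minimal sketch of one row of the analytics requirements worksheet:
# a desired behavior, the event that signals it, the metrics derived
# from it, and the tag implemented to capture them.
from dataclasses import dataclass


@dataclass
class AnalyticsRequirement:
    behavior: str       # what we want users to do
    event: str          # the observable action signaling the behavior
    metrics: list[str]  # what we want to know whenever the event fires
    tag: str            # the tag implemented to capture the metrics


reorder = AnalyticsRequirement(
    behavior="Place an order from history",
    event="User submits an order based on a previous order",
    metrics=["Percentage of orders submitted from history"],
    tag="orders.submit.reorder",
)
```

Filling in one of these per behavior, during requirements, gives the development team an unambiguous record of what must be instrumented.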
Behaviors: Effective Web and mobile analytics start with identifying the behaviors you want to encourage (and discourage) and putting mechanisms in place to measure their occurrence. You can do this very early in the application development process -- as early as requirements. As you're gathering requirements (however you do that), start asking the question, "What user behavior makes this a successful system?" Make that list.
Events: During application design, precisely define the event(s) that indicate a particular behavior has occurred. For each behavior ask the question, "What event will occur in this system that tells me this behavior happened?" Look at your flows and screen mockups. You're looking for things like button clicks (e.g., on checkout, order, or "share with a friend" buttons).
Metrics/KPIs: Translate events into meaningful data about those events. For each event ask, "Is there something that I'd like to know about this event whenever it happens?" For instance, you might want to know if the person executing this event visited your site from a specific campaign. If the event is clicking a "submit order" button, you might want to know if a special offer was accepted during the visit. This will allow you to measure the effectiveness of the offer against order conversions that didn't include the offer.
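The "submit order" example might look like this in code. The `track()` helper and its arguments are hypothetical -- real analytics SDKs differ, but most accept an event name plus a dictionary of contextual attributes:

```python
# Hypothetical track() helper; 'captured' stands in for the analytics
# backend that would normally receive these events.
captured = []


def track(event: str, **context) -> None:
    """Record an event together with the context we care about."""
    captured.append({"event": event, **context})


# Fired from the "submit order" handler, carrying campaign and offer data:
track("order_submitted", campaign="spring_email", offer_accepted=True)
track("order_submitted", campaign=None, offer_accepted=False)

# Later: compare conversions that included the offer with those that didn't.
with_offer = [e for e in captured if e["offer_accepted"]]
without_offer = [e for e in captured if not e["offer_accepted"]]
```

Because the context rides along with every event, the offer-versus-no-offer comparison falls out of the data instead of requiring a separate instrumentation effort.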
Tags: Develop meaningful data tags that describe the metrics you wish to capture, and place them in a useful hierarchy. Putting thought into the tags and hierarchy matters because a poorly designed tagging scheme can significantly increase the effort required to add metrics later. Properly designed tags sit in a logical hierarchy that can scale to meet future needs.
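One reason a hierarchy pays off: you can roll metrics up at any level without re-instrumenting. A sketch, using invented dot-delimited tag names:

```python
# Illustrative dot-delimited tag hierarchy; the tag names are invented.
def rollup(tags: list[str], prefix: str) -> int:
    """Count events at or below one point in the tag hierarchy."""
    return sum(1 for t in tags if t == prefix or t.startswith(prefix + "."))


fired = [
    "orders.submit.favorite",
    "orders.submit.reorder",
    "orders.submit.reorder",
    "orders.cart.add",
]

all_submits = rollup(fired, "orders.submit")       # every submit, however initiated
reorders = rollup(fired, "orders.submit.reorder")  # just the reorders
```

Adding a new submit path later (say, `orders.submit.voice`) automatically shows up in the `orders.submit` rollup -- no existing reports need to change.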
As an example, here is a table that I've used to help structure the analytics requirements process. In this case, I was working with an e-commerce system that allowed users to create "favorite orders" so that they could easily order them again. They also could "reorder" -- they could select an old order and place it again. Both were time-saving measures. We subsequently learned through metrics that reorder (going back to an order automatically stored by the system as part of the customer history) was much more popular than favorite order (returning to an order explicitly tagged by the customer as one of their favorites).
| Desired Behavior | Event | Metric/KPI |
| --- | --- | --- |
| Place a favorite order. | User submits an order that was based on a favorite. | Number and percentage of orders submitted as favorites<br>Percentage of orders for each user submitted as favorites<br>Percentage of favorite orders changed prior to submitting order<br>Conversion rate of favorite-based shopping carts to submitted orders |
| Place an order from history. | User submits an order that was based on a previous order. | Number and percentage of orders submitted based on an earlier order<br>Percentage of orders for each user submitted based on an earlier order<br>Percentage of history orders changed prior to submitting order<br>Conversion rate of history-based shopping carts to submitted orders |
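Once the events are captured, metrics like those in the table are straightforward to derive downstream. A sketch with invented order records and field names:

```python
# Deriving two of the table's metrics from raw order events.
# The records and field names are invented for illustration.
orders = [
    {"source": "favorite", "changed_before_submit": False},
    {"source": "reorder",  "changed_before_submit": True},
    {"source": "reorder",  "changed_before_submit": False},
    {"source": "new",      "changed_before_submit": False},
]

reorders = [o for o in orders if o["source"] == "reorder"]

# Percentage of all orders submitted from history.
pct_reorder = 100 * len(reorders) / len(orders)

# Percentage of history orders changed prior to submitting.
pct_changed = 100 * sum(o["changed_before_submit"] for o in reorders) / len(reorders)
```

It was exactly this kind of comparison -- reorder volume versus favorite-order volume -- that told us which time-saving feature customers actually preferred.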
There is more complexity here than can be fleshed out in a brief article, but if you're just getting started with analytics, work on defining your metrics as part of requirements and design. Analytics doesn't have to wait until the very end -- or worse, until after launch, in a subsequent release.