A key tenet of UX design is that every element should incite a specific and desired behavior. What actions do most business intelligence dashboards incite? None. Sure you can drill down, but what good does that do? It certainly doesn’t help the user recognize or solve a problem.
Ahh, the ubiquitous dashboard, heralded as the great soothsayer of modern business. But is it really? Pretty much every business intelligence dashboard I’ve seen has failed to serve its most basic and primary function: to identify problems and help correct them. How many times have you seen a dashboard that told you where a problem was or what to do about it? Probably none. Most dashboards show little more than static information that tells you only that it’s working. A good dashboard should do two things: identify trends and highlight exceptions. But a great dashboard should also incite appropriate actions that solve a problem.
Data To Action Continuum
The point of a dashboard is to help you identify an anomaly and take appropriate actions. Therefore, a dashboard needs to provide more than just data. It has to provide actionable insights. There’s a linear continuum that ranges from data to action.
Data is simply a number, and it sits at the least actionable end of the continuum. Data alone does not incite any action. For instance, your dashboard might show you the number 12, but what does that tell you to do? At this level, success depends entirely on the individual, and highly variable, analytical skills of your users.
Information is data with context, such as relative scale. This helps a little more, but it still cannot incite a specific behavior that will improve your site, and it does not reduce the dependency on user skill to determine what actions to take. This is the extent of most dashboard displays. For instance, a dashboard might show that your conversion rate is tracking at 12 percent in step 3 of your seven-step conversion funnel. That doesn't tell you what to do to change it. Moreover, it doesn't even tell you whether it's a problem; maybe 12 percent is actually acceptable. But how would you know?
Insight is produced by adding information that helps identify whether something is tracking normally. This might include forecasts based on current trends, industry averages, and so on. It gets you closer to an actionable state, but it still relies on the aforementioned highly variable individual skills. For instance, if your dashboard displays user abandonment rates along your conversion flow, how do you know whether those rates are a problem? Consider adding comparison lines that represent your goal and the industry average. Then you can easily tell whether your conversion rates are exceptional or merely acceptable. A dashboard could even compare a previous design against a new one to determine which is more effective.
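The comparison-line idea above can be sketched in a few lines of Python. All of the rates, goals, and industry averages here are hypothetical numbers invented for illustration:

```python
# The "insight" level: compare observed funnel conversion rates
# against a goal line and an industry average.
# All numbers are hypothetical stand-ins.

funnel_rates = {"landing": 0.62, "signup": 0.31, "checkout": 0.12}
goal = {"landing": 0.60, "signup": 0.35, "checkout": 0.15}
industry_avg = {"landing": 0.55, "signup": 0.30, "checkout": 0.14}

def classify(step, rate):
    """Label a step relative to the goal and the industry average."""
    if rate >= goal[step]:
        return "exceptional"
    if rate >= industry_avg[step]:
        return "acceptable"
    return "needs attention"

for step, rate in funnel_rates.items():
    print(step, classify(step, rate))
```

With reference lines encoded this way, the dashboard can label each step instead of leaving the judgment to whoever happens to be looking at the chart.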
Action is the real goal of any data analysis: identify a problem and effect a solution. Charts and graphs provide static information but don't suggest corrective actions. This end of the continuum requires more design work, but it ultimately produces a dashboard that incites appropriate user behaviors. Part of that effort is analyzing potential problems and identifying the actions that actually correct them.

Example: We helped design a dashboard for pharma sales reps. The original design was pages and pages of numbers – data. It wasn't helping the reps sell any better, mostly because sales reps are better at selling than at analyzing reams of data. Each rep's goal was to meet a sales quota in order to earn a bonus. So we tracked projected sales against desired outcomes and highlighted when a rep was not on track to make the bonus. We also identified a formula that could predict sales performance from customer visits; if a rep was tracking too low, the dashboard suggested specific customers to visit. This design replaced pages of data that required hours of analysis with a single page that took only seconds to surface anomalies and corrective actions. It remains the industry standard for the pharmaceutical sales domain.
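The pacing logic behind that dashboard can be sketched roughly as follows. The linear visits-to-sales relationship, the dollar figures, and the function names are all hypothetical stand-ins, not the actual formula the system used:

```python
import math

# Hedged sketch of "projected sales vs. quota" pacing logic.
# The visits-to-sales model and all numbers are hypothetical.

def projected_sales(visits_per_week, weeks_left, sales_to_date,
                    sales_per_visit=1500.0):
    """Project end-of-period sales from the current visit pace."""
    return sales_to_date + visits_per_week * weeks_left * sales_per_visit

def visits_needed(quota, weeks_left, sales_to_date, sales_per_visit=1500.0):
    """Visits per week required to close the gap to quota."""
    gap = max(0.0, quota - sales_to_date)
    return math.ceil(gap / (sales_per_visit * weeks_left))

quota, weeks_left = 100_000, 8
sales_to_date, visits_per_week = 55_000, 3

if projected_sales(visits_per_week, weeks_left, sales_to_date) < quota:
    needed = visits_needed(quota, weeks_left, sales_to_date)
    print(f"Off pace: increase visits to {needed} per week")
```

The point isn't the arithmetic; it's that the dashboard translates a projection into a concrete next step (visit more customers) instead of just displaying the projection.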
Here’s a common type of dashboard graph depicting sales performance:
Looking at this dashboard, ask how it answers the following basic questions:
- What’s the problem?
- What’s the cause of the problem?
- What can be done about it?
- What action does this suggest?
Typically, dashboards are merely a manifestation of existing data. The thinking goes: we have the data, so let's display it, with little regard for what anyone will do after seeing it. All you end up with is a visualization of existing data.

A better approach is to start by identifying the desired outcome, such as conversion funnel performance objectives, and then work backward to determine what data you need to visualize. Since trends and exceptions are the key indicators that incite appropriate actions, the system needs to know what counts as an anomalous trend or data point. A common way to provide this is to define thresholds: when a data point crosses a threshold, it is deemed exceptional. There is usually a logical reason behind each threshold, and you can usually tie that reason to a potential corrective action. The more successful approach, then, is to:
- Determine what goals you are trying to achieve.
- Identify what indicators you want to see regarding those goals.
- Define the thresholds that dictate an exception.
- Determine what actions you would take to correct the exception.
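The four steps above can be sketched as a small rules table that a dashboard evaluates. The goals, indicator names, thresholds, and actions here are hypothetical examples:

```python
# Goal -> indicator -> threshold -> action, expressed as data.
# All names, thresholds, and actions are hypothetical examples.

alerts_config = [
    {
        "goal": "keep checkout conversion above 15%",
        "indicator": "checkout_conversion_rate",
        "threshold": 0.15,  # below this is an exception
        "action": "A/B test the checkout form against the previous design",
    },
    {
        "goal": "limit step-3 abandonment",
        "indicator": "step3_abandonment_rate",
        "threshold": 0.40,  # above this is an exception
        "direction": "above",
        "action": "review step-3 error logs and field-level drop-off",
    },
]

def exceptions(metrics, config):
    """Return the corrective action for each metric crossing its threshold."""
    actions = []
    for rule in config:
        value = metrics[rule["indicator"]]
        crossed = (value > rule["threshold"]
                   if rule.get("direction") == "above"
                   else value < rule["threshold"])
        if crossed:
            actions.append(rule["action"])
    return actions

metrics = {"checkout_conversion_rate": 0.12, "step3_abandonment_rate": 0.35}
print(exceptions(metrics, alerts_config))
```

Because each threshold carries its corrective action with it, the dashboard surfaces what to do next, not just which number went out of bounds.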