
Special Interview With Stephen Few, Dashboard and Data Visualization Expert

by Alexander 'Sandy' Chiang, Research Director, Dashboard Insight
Thursday, July 14, 2011

To kick off July’s focus on data visualization, and as part of the new direction Dashboard Insight is taking, Alexander ‘Sandy’ Chiang, Research Director at Dashboard Insight, sits down for a candid one-on-one interview with dashboard and data visualization expert Stephen Few. Stephen is the founder of Perceptual Edge and the author of three bestselling books on dashboard design and data visualization best practices: Information Dashboard Design, Show Me the Numbers, and Now You See It. In this interview, Stephen addresses some of the challenges and poor practices proliferating in the space.

Alexander Chiang (AC): Thanks, Stephen, for taking the time to speak with me today. There are many challenges and questions that arise in the process of dashboard development, and your expertise will help shed light on them.

Stephen Few (SF): Thanks Alexander. I appreciate this opportunity to chat.

AC: Dashboards are typically designed poorly and data visualization principles are often ignored. What do you think are the major causes of this problem?

SF: People do what they know and they only know what they’ve experienced. Go to almost any website that features examples of dashboards and you’ll see all of the bad practices that people emulate. This is true not only of dashboard vendor sites, but of independent sites such as www.DashboardInsight.com as well.

If you look for sources of information about viable dashboard design best practices—not pseudo best practices that vendors often publish, which aren’t based on actual research but simply advocate what their products do—you’ll find few examples. Even on this site, few of the papers are based on real evidence of what works. The proportion of poorly designed dashboards in the world vs. those that are well designed reflects the proportion of bad vs. good examples that people find when they look for help. This is shameful.

AC: I agree, Steve. Early in June, I stated that my mandate was to promote dashboard design and data visualization best practices on Dashboard Insight going forward. Hopefully, with experts like you and through the new Dashboard Insight, we can end this vicious cycle through education.

SF: I’m encouraged by your new direction at Dashboard Insight, which is why I gladly agreed to do this interview, but turned down invitations in the past.

AC: Gauges are used more often than they should be. Why do you think people are so enamored with these inefficient visualizations?

SF: This interest in gauges exists partly for the reason described above: the appetite for silly gauges is stimulated by ubiquitous examples. There’s more to it than this, however.

First, people love metaphors. Dashboard vendors didn’t begin developing gauges that look like automobile dashboards because they work well for computer-based performance monitoring displays. They did it because they took the dashboard metaphor too far. The metaphor is only useful because of a conceptual similarity between automobile dashboards, which are used to quickly monitor what you need to know when driving a car, and computer-based information dashboards, which are used to monitor some aspect of the organization. Beyond this similarity, the metaphor is both useless and dangerous. What works for driving a car is not what works best for doing your job. Most vendors never bothered to question this. Instead, they just followed the herd. Rather than developing products that work effectively, they spent their time coming up with more and flashier gauges.

Second, gauges are cute and fun. They have needles that bounce back and forth, they have light and shadow effects that make them look like the sun is shining on them, and they have bright colors that shout, “Look at me.” Are our organizations well managed by playing with things that are cute and fun? Don’t get me wrong, I like cute and fun as much as the next guy, but I chose my wife for reasons that go beyond this—she provides what I need in a partner. We should choose dashboards that give us what we really need. Dashboards should always display data as clearly, accurately, meaningfully, usefully, and efficiently as possible. I’ve seen few gauges that do this. Most of them use up a great deal of space to say very little, and what they do say they say poorly.

AC: So it sounds like there are gauges that effectively communicate data, albeit a rarity. Do you have an example?

SF: The purpose of a gauge is to display a single measure of something that’s currently going on. To do so effectively, it also includes context in the form of one or more comparisons (for example, compared to a target or the same measure at some point in the past, such as on this day last year). Also typical of gauges is a visual indication of qualitative state, which is usually handled using traffic light colors (red for poor, etc.).  These attributes describe what gauges attempt to do.

Because dashboards must display a great deal of information in a limited amount of space (that is, a single screen), all forms of display, including gauges, must use space efficiently. Circular gauges, by their very nature, waste space, so I’ve never seen a circular gauge that works well on a dashboard. This leaves us with gauges that are linear in design. Of the linear gauges that I’ve seen, few display a measure of what’s going on with sufficient context or provide a visual indication of the performance state that is rich in information, efficient in its use of space, and designed for quick and easy comprehension. This is because few were actually designed by people who understand the best practices of data visualization that have developed from research into visual perception and cognition. One exception is the gauge that I invented a few years ago called a “bullet graph.” Here’s a simple example:

Here it is again, with the parts labeled.

Here’s a simple example that combines several bullet graphs along with Edward Tufte’s sparklines to convey a great deal of information in a small amount of space. Imagine that this is only a small section of a much larger sales dashboard, which features the top sales performance metrics.

Since their invention, bullet graphs have been adopted by many dashboard products, though to varying degrees of effectiveness. Bullet graphs can be designed in a number of ways to serve particular needs, but this isn’t the place to go on about them. The easiest way to learn about them is to read the Bullet Graph Design Specification that’s available on my website at www.PerceptualEdge.com.
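[Editor's note: For readers who want to experiment, below is a minimal sketch of a single bullet graph drawn with Python and matplotlib. The measure, target, qualitative ranges, and labels are invented for illustration; they do not come from the interview, and the Bullet Graph Design Specification mentioned above remains the authoritative reference for the design rules.]

    import matplotlib.pyplot as plt

    # Hypothetical numbers, for illustration only.
    measure = 270              # current value of the featured measure (e.g., YTD revenue in $K)
    target = 250               # comparative marker, such as a target
    ranges = [150, 225, 300]   # upper bounds of qualitative bands: poor / satisfactory / good
    band_greys = ["0.6", "0.75", "0.9"]   # darker grey = worse, lighter grey = better

    fig, ax = plt.subplots(figsize=(5, 1.2))

    # Draw the qualitative background bands widest-first so each narrower,
    # darker band sits on top of the lighter band behind it.
    for upper, grey in zip(reversed(ranges), reversed(band_greys)):
        ax.barh(0, upper, height=0.8, color=grey)

    # The featured measure: a thin, dark bar that stands out against the bands.
    ax.barh(0, measure, height=0.25, color="0.15")

    # The comparative measure (target): a short line perpendicular to the bar.
    ax.plot([target, target], [-0.35, 0.35], color="0.15", linewidth=2)

    ax.set_xlim(0, ranges[-1])
    ax.set_yticks([])
    ax.set_xlabel("Revenue YTD ($K), hypothetical data")
    plt.tight_layout()
    plt.show()

The greyscale bands in this sketch vary only in intensity rather than hue, which keeps the qualitative states readable without relying on traffic-light colors.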

AC: Animations appear in dashboards more often than they should. Why do you think people like animations (the stylistic ones) on their dashboards?

SF: By stylistic animations, I assume you’re referring to things moving on the screen gratuitously, where motion fails to communicate anything useful. Like gauges, animations of this type are fun. They make dashboards feel a bit like video games. The most common example of this, which I’ve seen over and over, involves needles in gauges that bounce back and forth when values change until they finally settle into place to point to the new value. The developer of a popular dashboard product, who is also the author of a book about dashboards, once excitedly demonstrated for me the bouncing needle feature of his software. He was proud of this achievement and was confused when I responded with dismay. “Why did you program it to work like this?” I asked. He responded, “Because that’s how real gauges work.” I then explained that the bouncing needle wasn’t designed into old spring-loaded mechanical gauges as a feature, but was due to an unwanted and unavoidable limitation of the technology. It simply takes a while for a spring to settle down when the position of the needle changes dramatically. To him, as a software developer, the bouncing needle was a great achievement. To me, as an information designer, it was a behavior that undermined the ability to read the value while waiting for the needle to settle into place. Many talented software developers have wasted their abilities writing code to do things that actually undermine the effectiveness of dashboards. Unfortunately, much more effort has been focused on this than on building means of information display that actually work.

AC: In my experience, animations, and gauges for that matter, are often used for marketing purposes rather than practical purposes, to draw attention to a banner advertisement or to add ‘wow’ factor to a vendor’s software offering. Unfortunately, these ‘features’ often drive purchasing decisions.  Given that these visualizations are detrimental to communicating data, what do you think about categorizing dashboards which use flashy animations as ‘Advertorial Dashboards’ to distinguish them from functional implementations?

SF: Let me respond to your question by asking another: “Does a dashboard that does nothing but advertise a product’s features that don’t actually work for performance monitoring deserve a name?” In my opinion, the only name it deserves is “dysfunctional.” Vendors should try to “wow” potential customers with dashboards that actually do the job. By doing so, the people who purchase their software won’t eventually learn to loathe them, but will become loyal customers instead.

AC: In one of your posts, you mention how filtering and other forms of interactivity do not belong in dashboards. What would you call interactive dashboards?

SF: Before I answer this question, I should clearly define what I mean by a dashboard. In an article that I wrote back in 2004 entitled “Dashboard Confusion,” I introduced a definition of the term dashboard, which I later repeated in my book Information Dashboard Design. My hope was to reduce the confusion that existed at the time and still exists today about what a dashboard actually is. We can’t discuss dashboards and how they ought to be designed without first agreeing on a definition. Here’s mine:

A dashboard is a visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance.

What differentiates a dashboard from other forms of information delivery is the fact that it is used to rapidly “monitor” what’s going on. If you do something that requires ongoing situation awareness to do it well, a dashboard can provide the information you need in a way that you can assimilate quickly. Dashboards don’t eliminate the need for other forms of information delivery, such as standard operational reports. Dashboards also differ from displays that present multiple charts simultaneously to support data exploration and analysis, which must be designed somewhat differently. Dashboards support rapid performance monitoring, and can only do so effectively if they are designed in specific ways to work with human eyes and brains.

To monitor performance, there is a specific set of data that you need to be kept aware of to do your job, just as airline pilots must scan a particular set of displays in a particular order frequently throughout a flight. To monitor what’s going on efficiently, you don’t want the display to change from day to day or moment to moment; it should look exactly the same every time you look at it, except for the fact that the information changes.

Interactivity, such as filtering, adding variables, changing the chart type, linked highlighting (a.k.a., brushing and linking), and so on, is incredibly useful, but for purposes other than monitoring. For instance, these and other interactions are fundamental to data exploration and analysis. The ability to select the data that you want to see is also useful for online reports, where the purpose is to look things up. The only interactions that ought to exist in a dashboard, however, are those that allow you to access additional information about what’s going on when something in the dashboard requires a response. For example, if you see in the dashboard that sales of a particular product have dropped significantly and have recently been trending downwards, you probably won’t find all the information that you need to understand why this is happening on the dashboard itself, but it should be incredibly easy to get to it from the dashboard. The ability to click on something in the dashboard that demands attention to access additional details that are needed to formulate a response ought to be available, but the dashboard itself shouldn’t change to get to it. The dashboard should always look the same, so familiar that you can assimilate the story that it’s telling you almost at a glance. If you’re forced to go from screen to screen or to select various sets of data to get the information that you need, you’ll waste a lot of time and still never manage to stitch together all of those fragments of information to construct the overview of what’s going on that’s needed to maintain situation awareness.

The way information is displayed and the interactions that are enabled should always suit the nature of the task that you’re doing. Few interactions beyond clicking to access more details when they’re needed fit the task of performance monitoring.

AC: In a way, you’re saying the dashboard is a portal to data exploration. That is, when analysis is required, the dashboard serves as a gateway to a data analysis environment, possibly another type of dashboard. Maybe the term Analytical Dashboard should be introduced to help distinguish between the two?

SF: A dashboard can make you aware of something that prompts you to perform data analysis, so in that sense it can function as a portal to data analysis, but it is only one of many potential portals. Your boss asking you what’s causing sales to soar in a particular region is another portal. An operational report can also serve as one. Many sources raise questions that can only be answered through data analysis.

I reserve the term “dashboard” for monitoring displays. When I referred to “analytical dashboards” in my book Information Dashboard Design, I wasn’t saying that the dashboards themselves were used for data analysis, but that they were used by analysts to monitor the kinds of information that might reveal the need for data analysis. Data analysts, just like others who rely on information to do their jobs, can use dashboards to keep themselves informed, in this case of analytical opportunities. Ideally, a dashboard that’s used by a data analyst should be designed to make the transition from monitoring to analysis as seamless as possible, but the data exploration and analysis that they’re prompted to pursue by the dashboard are not done using the dashboard itself. It might be supported by the same software that was used to create the dashboard, but displays, interfaces, and data interactions that support data exploration and analysis are different in many ways from those that support data monitoring.

AC: What kinds of skills are required to be a good dashboard designer outside of learning about data visualization and dashboard design practices?

SF: Here are a few more useful qualifications that come to mind immediately, in no particular order:

  1. Good communication skills. Fundamentally, dashboards are a means of communication, so good communicators have an edge as dashboard designers. Those skills should be applied as a focused effort to communicate information as clearly as possible.
  2. A good understanding of the business/data domain. For example, to build a good sales dashboard, you must understand the sales process, your organization’s sales objectives, and sales data.
  3. Empathy. It’s important that a dashboard designer understand what the users of the dashboard do, not just objectively, but subjectively as well by getting into their heads and understanding their mental models and concerns.
  4. Integrity. You must have the courage to stay true to what really works, even when people naively ask for the opposite.

  5. Commitment to outcomes, not just deliverables. You should see it as your job to produce the best final result in actual practice, not merely to deliver something that people find pleasing and acceptable at first glance.

AC: Hopefully, the current and next generations of business intelligence professionals and educators will take your advice to heart, and we will see these lessons put into practice. Thanks, Stephen, for your candor. I look forward to including more articles based on your research and best practices on Dashboard Insight.

SF: Thanks, Alexander, for the chance to express my thoughts. I’ll be keeping my eyes on Dashboard Insight. As long as you respect your readers by putting their needs first, you’ll have my support.
