
When Data Visualization Works — And When It Doesn't

Thursday, March 28, 2013

Below is an excerpt from the Harvard Business Review article by Jim Stikeleather. Stikeleather serves as Executive Strategist, Innovation for Dell Services.

For information to provide valuable insights, it must be interpretable, relevant, and novel. With so much unstructured data today, it is critical that the data being analyzed generate interpretable information. Collecting lots of data without the associated metadata — such as what it is, and where, when, how, and by whom it was collected — reduces the opportunity to play with, interpret, and gain insights from the data. The information must also be relevant to the people who are looking to gain insights, and to the purpose for which it is being examined. Finally, it must be original, or shed new light on an area. If the information fails any one of these criteria, then no visualization can make it valuable. That means that only a tiny slice of the data we can bring to life visually will actually be worth the effort.

Once we've narrowed the universe of data down to those that satisfy these three requirements, we must also understand the legitimate reasons to construct data visualizations, and recognize what factors affect the quality of data visualizations. There are three broad reasons for visualizing data:

  • Confirmation: If we already have a set of assumptions about how the system we are interested in — for example, a market, customers, or competitors — operates, visualizations can help us check those assumptions. They can also enable us to observe whether the underlying system has deviated from the model we had and assess the risk of the actions we are about to undertake based upon those assumptions. You see this approach in some enterprise dashboards.
  • Education: There are two forms of education that visualization offers. One is simply reporting: here is how we measure the underlying system of interest, and here are the values of those measures in some comparative form — for instance, over time, or against other systems or models. The other is to develop intuition and new insights into the behavior of a known system as it evolves and changes over time, so that humans can get an experiential feel for the system in an extremely compressed time frame. You often see this model in the "gamification" of training and development.
  • Exploration: When we have large sets of data about a system we are interested in, and the goal is to provide optimal human-machine interaction (HMI) with that data to tease out relationships, processes, models, and so on, we can use visualization to help build a model that allows us to predict and better manage the system. The practice of using visual discovery in lieu of statistics is called exploratory data analysis (EDA), and too few businesses make use of it.

The full article is available from Harvard Business Review.


