Visualizing the Change
May 02, 2026
Written by Ed Cook and Roxanne Brown
You have seen this slide. A spreadsheet, captured as a screenshot and pasted whole into PowerPoint. Forty rows, twelve columns, font size that punishes all but the most eagle-eyed. The presenter reads the numbers aloud. The executives do not follow along. They can’t. No one in that room has the time or the cognitive capacity to process a forty-row table in real time. What they do instead is something faster and potentially quite error-prone. They pattern-match.
This is confirmation bias in practice. The executive who believes the change is on track scans the spreadsheet and finds the numbers that support that belief. The executive who suspects the project is struggling finds those numbers instead. Both are drawing conclusions from an incomplete scan. These executives are not lazy or careless. They are doing what every human brain does when confronted with more information than it can process: it takes shortcuts. Cognitive science has documented this for decades. Under time pressure, people default to confirming what they already believe. The spreadsheet on the slide is data, not information. It does not help with the decision-making process because it has increased the cognitive load on those trying to make sense of it.
Data visualization is not about making your presentation look pretty. It is a psychological intervention. A poorly designed visual gives the brain permission to cherry-pick. A well-designed visual tells a fuller story, one that increases the information (not just the data) available to the executive who genuinely wants to make a good decision.
This is a specific form of the Narrative Fallacy we explored in The Numbers Do Not Speak for Themselves. There, we argued that the human tendency to construct a satisfying story from incomplete evidence is the central risk of data interpretation. A poorly designed visual promotes (or even accelerates) the Narrative Fallacy.
The Clutter: Cognitive Load and Chartjunk
Edward Tufte, the pioneer of data visualization and author of The Visual Display of Quantitative Information, built his career on the single principle that every drop of ink on the page should present information. (He worked in the 1980s, before we became so screen-focused.) He coined the term "chartjunk" for the decorative elements that consume cognitive bandwidth without contributing meaning. Clip art, three-dimensional bar effects, gradient fills, and background images are all “chartjunk.” If you are presenting to a peanut company, do not put cartoon peanuts on the slide. Every element that does not serve the data is working against you.
This is the "minimum set" principle from the Choose the Data phase, applied to design. In that phase, you eliminated metrics that would not drive a decision, keeping only the data that was useful rather than merely interesting. The same discipline applies to your slides. Every visual element that does not drive understanding is clutter. Clutter is cognitive load. And cognitive load is the enemy of the emotional response you need to generate a decision.
Consider the training completion example from our previous post, The Emotion of Analytics. Imagine the 47% completion rate presented on a slide: a single bold number in red, a clip-art graduation cap in the corner, a downward arrow, and a three-row table of regional breakdowns in 9-point font. The executive glances at the slide and sees the red number. One executive panics. Another dismisses it. Both reactions are instant, and both are based on a single data point stripped of context. The graduation cap contributed nothing. The arrow confirmed a mood. The table was unreadable at that font size. This is chartjunk.
Now strip it. Remove the clip art. Remove the decorative arrow. Replace the table with a single, clean line chart showing training completion rates by region over time, with the pilot benchmark of 91% drawn as a horizontal reference line. The 47% is no longer a number to react to. It is a pattern to understand. The executive can see where completion is falling, when the decline began, and how far the current rate sits from the benchmark that the pilot achieved. The same data. The cognitive load is halved. The insight is doubled.
One specific form of chartjunk deserves a brief mention because it is so common: the pie chart. A pie chart asks the brain to judge angles and compare the areas of circular segments, which it does poorly, and placing multiple pies of different sizes side by side makes the comparison worse, because perceived area scales with the square of the radius. A horizontal bar chart displaying the same proportional data is read instantly and compared accurately. When in doubt, use the bar chart.
From Know to Feel Through Design
In The Emotion of Analytics, we argued that a presentation must move leaders through Know > Feel > Do. This post is a deep dive into the mechanics of Know, via the visual design that makes data clear enough to feel something about.
Here is the insight that connects these two posts: a well-designed visualization does not just reduce cognitive load. It generates the Feel. When data is presented so clearly that the implications are inescapable, the emotional response happens on its own. A leader does not need to be told that 47% completion three weeks before go-live is a problem. If the chart makes the gap between 47% and the 91% benchmark visible at a glance, the leader can see it, and seeing it produces the urgency that reading a number in a table never could.
The visualizations are more than illustrations. They are wayfinding elements. Consider the signage system in a well-designed airport. When you land at an unfamiliar terminal, you do not need a map legend or a detailed floor plan. You follow the signs. Symbols, colors, and spatial cues guide you from gate to baggage claim to ground transportation without requiring you to stop and decode anything. The signs do not tell you the full story of the airport's layout. They tell you the one thing you need to know right now: which way to go.
Data visualizations should work the same way. They should guide the leader's eye directly to the insight that demands a decision.
Three Visuals, Often Unused, That Drive Decisions
The three visualizations below are not exotic. They exist in every spreadsheet application and every LLM-assisted analysis tool. Yet they are dramatically underused in change management presentations, where the defaults remain tables, simple bar charts, and the occasional pie chart.
We will use the same dataset throughout (the training completion rates from our example) to show how each visualization reveals something the others cannot.
Heat Maps: Bypassing the Analytical Brain
A table of training completion rates by region and department requires sequential reading. The executive must scan each cell, compare it to the cells around it, hold the comparisons in working memory, and build a picture of where the change is succeeding and where it is not. That is a significant cognitive task for a room that also has a dozen other agenda items to get through.
A heat map removes that burden. By applying conditional formatting, using a color scale from green through yellow to red, the table transforms from something you read into something you see. The executive does not choose where to look. The color “chooses” for them. A block of green in the Southeast confirms what is working. A cluster of red cells in the Midwest operations group pulls the eye like a warning light.
Now look at what happens to the 47% in this format. In a table, 47% is one number among dozens. In a heat map, it is a red cell sitting in a row of other red and yellow cells. The cluster tells a story that the single number could not: an entire group is falling behind. The comfortable phrase "things are going fine overall" cannot survive a heat map that shows three departments burning red while the average sits at a passable 72%.
There is something even more important that is revealed with this visualization. A heat map that shows a struggling department surfaces the same structural insight that network analysis reveals in a different context. It can show a group that may be disconnected, uninvited, or not yet reached by the change management effort. A leader who sees that red cluster and responds with curiosity rather than blame is making the kind of invitation that creates the conditions for Trust and Belonging, two of the 10 Dimensions of Joy at Work. The heat map made the problem visible. The leader's response to it determines whether the moment becomes merely corrective or genuinely connective.
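The heat map's logic can be sketched in a few lines. This is an illustrative example, not a prescription: the thresholds (80 and 60) and the region and department names are hypothetical, and in practice the same effect comes from a spreadsheet's built-in conditional formatting.

```python
# A minimal sketch of heat-map logic: map each completion rate to a
# traffic-light bucket so the eye, not the reader, finds the cluster.
# Thresholds and the rates below are illustrative assumptions.

def color_bucket(rate):
    """Return a traffic-light bucket for a completion rate (0-100)."""
    if rate >= 80:
        return "green"
    if rate >= 60:
        return "yellow"
    return "red"

# Hypothetical completion rates by (region, department).
rates = {
    ("Southeast", "Sales"): 91,
    ("Southeast", "Ops"): 88,
    ("Midwest", "Sales"): 62,
    ("Midwest", "Ops"): 47,
    ("Midwest", "Support"): 51,
}

heat_map = {cell: color_bucket(r) for cell, r in rates.items()}

# A cluster of red cells in one region tells a story a single number cannot.
red_cells = [cell for cell, color in heat_map.items() if color == "red"]
```

Notice that the code never ranks or sorts anything; the color scale alone makes the Midwest cluster impossible to miss, which is exactly the point of the visualization.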
Box-and-Whisker Charts: Shattering the Illusion of the Average
The average can be a false certainty. A company-wide training completion rate of 72% sounds comfortable. It suggests the change is broadly on track with room for improvement. But 72% is an average, and averages conceal as much as they reveal. What if three regions are at 91% or above, and two are at 47% and 38%? The average is mathematically true, but the story it tells is false. What we need is a fuller view of the entire distribution of scores.
A box-and-whisker chart shows the full distribution of the data. The box represents the middle 50% of the values, called the interquartile range. The line inside the box is the median, the middle data point. The whiskers extend to show the range, and any points beyond are outliers. For a leader who has not encountered this chart since a statistics course (that perhaps they have tried to forget), a brief reminder of this structure can help them follow and feel confident. The box is where most of the data lives. The whiskers show how far the extremes stretch. The outliers are the departments that are dramatically different from the rest.
Return to the training completion data. When that data was a single average, the conversation was "Are we on track?" Put it in a box-and-whisker chart, and the conversation changes. The box sits between 65% and 85%. The median is 74%. That looks reasonable. But the lower whisker stretches to 38%, and two outlier dots sit at 34% and 29%. Those dots are departments. Those departments are full of people. The question is no longer "are we on track?" It is "what is happening in those groups?" The box-and-whisker chart invites a question that the average alone would miss.
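The statistics behind the box-and-whisker chart are standard and available in any spreadsheet or analysis tool. The sketch below uses Python's standard library to compute the five-number summary with the common 1.5 × IQR outlier rule; the department rates are a hypothetical dataset chosen to echo the shape described above, not the authors' actual figures.

```python
import statistics

def box_and_whisker_summary(values):
    """Five-number summary plus outliers, using the 1.5 * IQR rule."""
    q1, median, q3 = statistics.quantiles(values, n=4)  # quartiles
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr  # outlier fences
    outliers = [v for v in values if v < lo or v > hi]
    inside = [v for v in values if lo <= v <= hi]
    return {
        "q1": q1, "median": median, "q3": q3,
        "whisker_low": min(inside), "whisker_high": max(inside),
        "outliers": outliers,
    }

# Hypothetical department completion rates: the mean looks comfortable,
# the distribution does not.
rates = [29, 34, 65, 68, 72, 74, 76, 80, 85, 88, 91]
summary = box_and_whisker_summary(rates)
```

Running this on the sample data puts the box at 65 to 85 with a median of 74 and flags 29 and 34 as outliers: the two departments the average would have hidden.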
Combo Charts: Providing the Context That Prevents Bad Decisions
A score without context is dangerous. Suppose the heat map revealed a bright red cell in one region showing a training completion rate of 31%. The instinct is alarm. But how many people are in that region? If the answer is four, consisting of a small pilot team that started late, the number is real, but the urgency is misplaced. If the answer is 140 operations staff, the urgency is entirely justified.
A combo chart using a dual axis solves this. Display the training completion rate as a bar for each region. Overlay the number of people in each region as a line on a secondary axis. Now the executive sees both dimensions simultaneously. A low bar with a high line is a genuine crisis: many people are not completing the training. A low bar with a low line is a data point to monitor but not to panic over. The visual does the contextual work that a table does not.
Consider the cumulative effect of all three visualizations applied to the same dataset. The heat map showed where the problem lives. The box-and-whisker chart showed how wide the spread is relative to the average. The combo chart showed how much weight each data point carries. No single visualization told the whole story. Together, they gave the executive something a pasted spreadsheet never could: a clear, honest, multi-dimensional picture of the change. The kind of picture that enables a well-intentioned leader to make a well-informed decision.
Wayfinding to the Decision
A well-designed airport does not hand you a forty-page PDF of terminal layouts when you step off the plane. It gives you symbols, colors, and directional cues that guide you to exactly where you need to be. You process them without stopping to think. That is what your data visualizations should do for the leaders in the room. Every chart, every color choice, every design decision should answer one question: which way should we go?
By presenting unambiguous visual information, you are not just reporting. You are coaching the leader. You are replacing the conditions that enable passive observation with conditions that enable active choice. This is the Know > Feel > Do architecture from The Emotion of Analytics, made operational through design. Before change can happen, the leader must change first. A leader who sees the data clearly, who can see both the pattern and the weight of the evidence, is a leader who has been given the opportunity to lead. Most will take it.
This is the final skill in the Present the Data phase. You have the architecture: Know > Feel > Do. You now have the visual instruments: heat maps that surface the pattern, box-and-whisker charts that reveal the spread, and combo charts that provide the context. What remains is the decision itself. Across this entire series, the data has been chosen, collected, analyzed, and presented. Everything in the Data-Driven Change Management process has been building toward this: the moment when the data, synthesized into a visual story, meets the values of the people who have the authority to act. It is time to make the Change Decision.
When the visualizations are clear enough that a leader can see where people are struggling, and when that leader responds with curiosity rather than blame, something happens beyond the change itself. The conditions for Trust, Belonging, and Growth begin to take shape. That is how Joy at Work grows.