
That report on the value of design is biased. And, the execs know it.

It's critical to find data, leverage it, and examine your own bias. Doing so makes you a better advocate for the value of design, because you can anticipate how executives might react when you present data like this.


You may have seen this image before. Maybe you referenced it!

In 2013, the dmi:Design Value Index was published as a report to prove the value of design. The report showed that "over a 10-year stretch, design-led companies maintained significant stock market advantage and outperformed the S&P by an extraordinary 228%."

When this report came out, I was excited. I breathed a huge sigh of relief. Finally! Someone has some data that will help me advocate for design. This report helped me get the initial traction and buy-in to build and scale a design org several times.

But my excitement for the report waned pretty quickly. Why? Because, as I learned, it's biased.

The report was commissioned by the Design Management Institute (dmi), an organization whose mission is to advocate the economic, social, and cultural importance of design. A mission I believe in! This report has great information. Data was collected, insights were formed, and a report was created.

And this report kept me stuck favoring the information I wanted to see.

It was so stressful and deflating to continuously advocate for the value of design, research, content strategy, etc. For years, I was seeking something like this report to relieve my stress. This report did that, but that relief was temporary.

While this report was everything I hoped for, I learned how important it was to apply some critical judgment to reports like these. I learned how important it was to examine how my own preferences might be clouding the value I placed on data I agreed with. Why? Because the executives I was trying to convince told me they thought the report was biased.

While I was initially upset that they were still skeptical of the value of design, I realized my own confirmation bias had gotten the best of me. When I saw this report, I ran with it as fact. It was the data I had been looking for, the data I hoped would finally bring relief. I didn't do any basic due diligence on how others might perceive it.

So here I am ten years later, revisiting the DMI Design Value Index to look at any bias in the report. Why? To remind you how important it is to check your own biases.

At a quick glance, yes, the report holds up! But when I examine the data a little more closely, removing Apple from the index drastically reduces its performance.

And if I evaluate the performance of several companies that wouldn't fit the dmi criteria (not as a weighted index), they performed just as well as, if not better than, the companies in the design index.
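To make that check concrete, here's a minimal sketch of the arithmetic behind an equal-weighted comparison. The tickers and return figures below are made-up placeholders, not the dmi data; the point is only to show how a single outsized performer (like Apple) can carry an entire basket past a benchmark.

```python
# Minimal sketch: how one outsized performer can carry an equal-weighted basket.
# The tickers and return figures are illustrative placeholders, NOT the dmi data.

ten_year_returns = {
    "AAPL": 9.50,   # +950% over the stretch (hypothetical)
    "NKE": 1.80,    # +180%
    "SBUX": 2.10,   # +210%
    "TGT": 0.60,    # +60%
    "WHR": 0.40,    # +40%
}
benchmark_return = 1.00  # S&P 500 up 100% over the same stretch (hypothetical)

def basket_return(returns, exclude=frozenset()):
    """Equal-weighted basket return: the simple average of member returns."""
    kept = [r for ticker, r in returns.items() if ticker not in exclude]
    return sum(kept) / len(kept)

with_apple = basket_return(ten_year_returns)
without_apple = basket_return(ten_year_returns, exclude={"AAPL"})

print(f"Basket vs. S&P (with AAPL):    {with_apple - benchmark_return:+.0%}")
print(f"Basket vs. S&P (without AAPL): {without_apple - benchmark_return:+.0%}")
```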

Here are a few returns:

I learned that it was critical to find data, leverage it, and examine my own bias. Once I started doing the last bit, I became a much better advocate for the value of design because I could anticipate how executives might react when I presented data like this.
