
Four "fits" to make Design crucial at your company

When I spend time with design teams that are repeatedly creating new value, they make bets on where, how, and with whom Design can successfully bring the most value to customers, the company, teams, and individuals.

At CDO School, one of the fundamental steps in our Playbook is making bets on where to position Design to win and with whom, not as a zero-sum game, but in how design can successfully bring the most value to customers, the company, teams, and individuals.

When you lead Design people, teams, and the organization, the most significant shift is knowing that most of your colleagues don't think about design as a craft or practice. They think about it as a group of people with specific skills and capabilities with a budget… essentially a big pile of "features". Yes, they want to know how these "features" help customers, but they really want to know how those "features" also help them.

If you've ever read "Crossing the Chasm" by Geoffrey Moore, one of the basic theories that Moore shares in the book is that there is a gap between early adopters of a product and the larger, more skeptical mainstream market. When you lead Design, it's basically the same thing.

The chasm theory bell curve; source: Diffusion Research Institute

Innovate with Sidekicks and Sell to Champions

There will be early adopters of Design, but most colleagues are skeptical. It's the job of the executive leader and their team to place bets on which partners are early adopters and intentionally build out business cases and arguments with them to win over the skeptics. We call those early adopters Sidekicks.

We separate skeptics into two categories: Champions and Challengers. Champions are part of the early majority of adopters. They're influential leaders who want evidence that something works before scaling it. But once they're convinced, hot damn, that's exciting stuff, because they'll advocate for Design with not just words but time and money too.

In my experience, design leaders spend most of their time and energy trying to convince Challengers that Design is real, necessary, and good. Time and time again, I see this approach fail. Why? Challengers aren't early adopters. They show little demonstrable evidence that they want to change. Instead, Challengers are late adopters and tend to fall in line once Champions start getting the benefits of Design. Design Systems, for example, are perfect for Challengers. They get value, customers get value, and design leaders can prioritize bigger bets for change in other areas. Win-win.

This work is all Sales, Positioning, Marketing, Relationship Building, Communication, making good arguments, etc. If you want to become a Director or VP or CDO, this stuff is part of the gig, and it's completely learnable. And you can do it without giving up your moral values or ethics. In fact, learning it and practicing it before becoming the exec is the best way to keep your morals and ethics.

TLDR; When you lead Design, Design is the product. Great leaders make bets on where to position Design to win and with whom, not as a zero-sum game, but in how design can successfully bring the most value to customers, the company, teams, and individuals.

Why Good Design Isn't Enough

In my time creating and leading in-house teams, the honest truth is success has been mixed. After working with other product and design leaders, I’ve noticed they’ve had similar results. A mixed bag of wins and losses, but predominantly, outcomes that didn’t move the needle much either way.

For a long time, I’ve been examining the difference between design teams that have become crucial to their company’s success and teams that are thought of as “nice-to-have” but not necessary. There are countless factors at play (business models, industries, markets, regions, public vs. private, stage of company, etc.). It would be difficult for any economist or academic to pull these factors together in a way that shows causal reasons. Still, I see something happening that separates these teams.

While this is entirely anecdotal, there is a pattern I see:

It’s a pattern I’ve thought about for a long time, and it’s taken me almost as long to explain what separates these teams. Here’s a framework I’ve developed to explain how design teams calculate (and recalculate) how and where design fits within the company. If you’re a design org leader, this framework may significantly change how you lead your team.

When Design gets stuck

Imagine you're playing a video game where you keep falling into the same trap over and over again. Every time you try to get out, you fall back in, and it feels like you're never going to make progress. That's what a "doom loop" is like.

The term "doom loop" was made popular by Jim Collins in his book "Good to Great." In it, he talks about how some companies get stuck in a doom loop. As Collins writes, a doom loop happens when people make decisions to fix problems fast, but those solutions don’t address the real issues, so the problems keep coming back again and again. A doom loop is this cycle where every fix leads to more problems, and it feels like you can't break out of it.

In my experience, the teams that stagnate are stuck in doom loops. They have likely made some progress in delivering value early on, but that progress has stalled. As a result, they’re under more and more pressure to provide more value, they’re faced with more skepticism about their judgment, and worse, they turn inwards and question their ability to get the team unstuck.

In my experience, I see three ways design teams consistently try to solve this problem:

These are all critical to raising the maturity level of design at a company, but none address the elephant in the room: effectively changing the minds of those with the power and influence to make business decisions.

Repeatedly delivering value requires more than just doing "better" design

When design teams are stuck in this cycle, the usual response I see is to push for more adherence to the design process. But if this were the best solution, why do designs created by that process fail to resonate with customers and deliver better outcomes to the company?

The flaw in the "use the design process" approach is that it can trap experienced teams in a doom loop. Teams that rely heavily on standard processes to get unstuck unwittingly limit their ability to create something new. And the longer they stay in this cycle, the more they lose credibility with their cross-functional partners and leaders.

The cycle typically goes like this:

In terms of a growth curve, it looks similar to this:

Don’t get me wrong, I love a good process, but processes are overly relied upon as universal remedies. If you have strong thoughts about SAFe or Lean, you know what I mean. I’ve also witnessed many instances where teams used a design process well, but in the end, it didn’t translate to change. Perhaps you’ve observed this, too.

I believe a fresh perspective is required to break free from this loop.

It’s more than having more resources

Another common approach is to increase the number of people on the team. But simply boosting headcount doesn't translate to delivering new value; it often just helps deliver value that’s already known.

The problem with the "more designers" mantra is its vagueness. If I ask ten designers from a team about their definition of great design, I get ten different answers. Such variance muddles the perception of design for leadership. Adding more designers could be costly and risky without clear value addition. Teams that are delivering more value are not doing so by multiplying voices. They harmonize the voices they have.

It’s more than Design Systems and DesignOps

This is a bit nit-picky, I’ll admit. I love me a good Design System, and DesignOps teams are freaking magic. While both are vital components inside some design organizations, they aren't cure-alls for every company. Design Systems and DesignOps teams help Design Orgs multiply the value of design, but they don’t inherently create value on their own.

Again, I firmly believe in both, and the people who do this work are incredibly important, but neither gets teams out of doom loops.

Introducing the Four Fits for Design Org Value

Another concept Jim Collins introduced was “the flywheel effect.” This concept describes how success is achieved when a consistent effort is placed in a series of small, connected actions that accumulate over time. When I spend time with design teams that are repeatedly creating new value, they are doing this exact thing.

These design teams are making bets on where to position Design to win and with whom, not as a zero-sum game, but in how design can successfully bring the most value to customers, the company, teams, and individuals. It’s a flywheel approach to running a design team.

The goal of this flywheel is to help the leaders who run design organizations consistently examine how the Design Org fits within the Company across four critical dimensions.

Here’s a visualization of the Four Fits of Design Org Value:

Model ↔ Practice Fit

A Design Team is a solution to a company problem, and that problem is often based on the business model and strategy. If you know the concept of product-market fit, this is how we examine how the design practice fits the business model and strategy.

Practice ↔ Partnerships Fit

The second fit is how Design Practices are built to fit with cross-functional and stakeholder partnerships. It’s not enough that we are advocates for customers. Our work has to help our partners as well. The degree to which we support our partners varies depending on the focus of our practice.

Partnerships ↔ Positioning Fit

In Marketing, Positioning refers to the place a brand occupies in the customers' minds and how it is distinguished from the competitors' products. Using this same concept, positioning refers to the place that Design occupies in the minds of our peers, colleagues, and stakeholders and how design is distinguished from other skills/teams at the company. Here, we continuously examine how the power dynamics work inside our companies so we can move peers and stakeholders from skeptics to adopters.

Positioning ↔ Model Fit

Lastly, the business model and strategy influence how design should be positioned inside the company. Here, we want to validate our positioning by gathering evidence that it’s working.


The four fits are essential to maturing Design, thus elevating Design's business relevance and value. These fits are interconnected; changing one affects the others. As the fits evolve, a holistic review and adjustment of your playbook is necessary. I will dedicate a post to each of these fits and how you can use our playbook to mature these four fits.

There are three extremely important points I want to hammer home throughout these posts:

  1. You need to find four fits to grow the value of a design org inside your company.
  2. Each of these fits influences the others, so you can’t think about them in isolation.
  3. The fits are constantly evolving/changing/breaking, so you have to revisit and potentially change them all. This is the purpose of the CDO School playbook.


I will show you examples of the framework through my failures and successes. The series will roughly go in this order:


Finding Business Model - Design practice fit

The road to Design Maturity doesn't start with Design.

In the introduction to this series, I emphasized that better Design is not the sole factor for figuring out where and how design plays a vital part in a company's success.

There are actually five essential elements required to examine the first of these fits, the Model-Practice Fit:

  1. Understanding the business model.
  2. Understanding the practice (people, process, operations, budget, etc.)
  3. Establishing early hypotheses about the fit of the design practice and the business model.
  4. Understanding the practical implications of achieving Model-Practice Fit in reality, not just in theory.
  5. Identifying strong indicators of finding Model-Practice Fit.

The wrong way to do it

In 2011, when I joined Electronic Arts (EA) to lead a new team of designers, program managers, and front-end developers, I was also a part of the leadership team for a newly formed organization at EA called the World Wide Customer Experience team. After assembling my team, I recall thinking:

"I can't wait to demonstrate how we can build exceptional products and services because that's what they asked us to do."

Although this mindset may seem reasonable if you have been in a similar position, it is flawed. I assumed that our team was the solution to their problems, essentially putting the cart before the horse.

What I should have done instead is focus on understanding the challenges and needs of my colleagues as they worked towards improving the experience for EA's customers. I made the common mistake of assuming that my colleagues wanted us to decide for them.

This is why I struggle with the term "Design-Led Strategy." While I understand that it refers to a practice and even a philosophical approach, my colleagues often interpreted it as "Everyone Follows the Design Team." Although this interpretation may make sense to some, language matters, and the way we word things impacts our perception and understanding.


A better way to find the Fit between the Business Model and the Design Practice

During my second stint at EA, I joined the IT organization. Our task was to develop a completely new catalog of products and services to support Employee Experience. This experience was akin to running a startup within the company.

Fortunately, I had learned some valuable lessons along the way. Instead of focusing solely on what we would be creating, we shifted our focus to understanding the business model of EA and the IT organization.

We examined four key elements:

  1. Who: We identified several teams eager to collaborate with us. Understanding the power dynamics and influence that each team had within the company was crucial. We initially focused on conducting experiments that would maximize value without causing significant damage if the experiments went wrong.
  2. Budget: Each new product had different funding sources. If the budget came from IT, we used a traditional Product Development approach. If the budget came from outside our team, and another department paid for the work, we used a more traditional agency or consulting model. Why? Because we considered this the first phase of gaining trust with that team before trying to convince them that a fully formed Product Development approach would be the best way.
  3. Motivations: We assessed whether the teams were trying to assist a specific audience that held significant value and importance to the company.
  4. Problem: We evaluated whether the problem could be easily addressed from a technical standpoint.

Using these criteria, we discovered an ideal testing ground to determine the fit between the business model and the design practice.

One example of this fit was observed when we partnered with the VP of Talent Acquisition:

Based on these definitions, the team began considering potential solutions. Bill Bendrot, Jesse, and a few others collaborated to create a tool called Interview Sidekick. It marked the initial step towards developing a larger platform that would eventually replace EA's intranet. Despite being the same product with the same underlying architecture, Interview Sidekick was a simple web app that:

  1. Helped candidates understand what to expect during the interview phase, aiding in a smoother process and allowing candidates to focus on the content.
  2. Established personalized connections with candidates, with all communications at each step of the process in one place.
  3. Boosted candidate engagement, fostering a connection to EA's culture and providing insights into working at EA (answering the question "why EA?").
  4. Differentiated EA's interview process from that of other companies, giving EA a competitive advantage in attracting talent.

Although the app seemed simple, teams within EA had never experienced this type of product before. It helped the Talent team solve business problems and allowed us to adapt our practice to accommodate budgetary constraints, resource limitations, and cost drivers. Even today, some 8 years later, Interview Sidekick remains the standard way new candidates interview at EA.

This brings us to the concept of practice fit. We identified four main elements that were crucial to define:

These hypotheses play a crucial role and will be explored further in future posts of this series as they inform and deeply impact the other components of the framework: Partnerships and Positioning.

The Realities of Finding Business Model - Design Practice Fit

The reality is that finding Model-Practice Fit is rarely straightforward.

It often takes multiple iterations to identify a fit that works. The process begins with understanding the business model and strategy, not by prioritizing the business aspect, but rather as a reference for the signals that those with a business-first mindset are seeking. This iterative cycle involves creating an initial version of your practice, identifying who benefits from it, and then refining both the model and the practice. It is an ongoing process that becomes easier to comprehend over time.

The same applied to our work on Interview Sidekick. We initially laid out our hypotheses, but as we delved deeper, we discovered that our practice was delivering value to various aspects of the business. We achieved significant wins in enhancing IT's reputation, staying on budget, increasing top candidate closure rates, and improving NPS.

Upon realizing these outcomes, we refined the category, the target audience, the problem, and the motivations based on our actual observations. The second version looked like this:

We saw these teams as early-adopters/testers. We could introduce new ways of working, digital solutions, and have discretion over the approach. When these tests went well, they became case studies to scale our approach to other teams.

The refinement did not stop there. About a year later, we experienced another major shift, which I will discuss in future posts.

Model-Practice Fit is Not Binary

The iterative cycle of Model-Practice Fit leads us to another important point:

Model-Practice Fit is not a binary concept, nor is it determined at a single point in time. A more accurate way to perceive Model-Practice Fit is as a spectrum ranging from weak to strong.

Thinking about Model-Practice Fit as a binary concept implies that the business model and design practice remain static. However, in the real world, change is constant. Therefore, we should consider changes to the business model as the focal point for altering our practice.

In my career, I’ve seen three primary ways in which the business model changes:

  1. Senior Leadership turnover
  2. Regulatory pressure
  3. Overall business evolution

The first two changes happen quickly, without much notice, and we can only respond when they happen. The third one, though, is a longer process that Design Leaders can play a more proactive role in influencing, though they are never the sole party responsible for the change.

Signals of Model-Practice Fit

If Model-Practice Fit is not binary, how can we determine if we have achieved it? While there has been some discussion on this topic, most of it does not reflect reality.

In almost every case, it is necessary to combine qualitative and quantitative measurements with your own intuition to understand the strength of Model-Practice Fit. Relying on just one of these areas is akin to trying to perceive the full picture of an object with only a one-dimensional view.

So, how do we combine qualitative, quantitative, and intuitive indicators to gauge the strength of Model-Practice Fit?

1. Qualitative:

Qualitative indicators are typically the starting point for measuring Model-Practice Fit, as they are easy to implement and require minimal customer feedback and data.

To gain a qualitative understanding, I prefer using Net Promoter Score (NPS). If you are genuinely solving the audience's problem, they should be willing to recommend your product to others.

The main downside of qualitative information (compared to quantitative data) is that it is more likely to generate false positives, so it should be interpreted with caution.
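To make the mechanics concrete (this sketch is mine, not from the original text): the standard NPS calculation buckets 0–10 survey responses into promoters (9–10), detractors (0–6), and passives (7–8), and the score is the percentage of promoters minus the percentage of detractors, yielding a value from -100 to +100. A minimal Python sketch:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Passives (7-8) count toward the total but toward neither bucket,
    which is why the score ranges from -100 to +100.
    """
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# e.g. nps([10, 9, 9, 8, 7, 6, 3]) -> 3 promoters, 2 detractors of 7 -> 14
```

Note that a small internal audience makes the score jumpy: with only a handful of respondents, one person shifting buckets moves the score by double digits, which is one reason to treat it as a directional signal rather than a precise measure.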

2. Quantitative:

Our initial quantitative measures for understanding Model-Practice Fit were Budget Adherence and Utilization.

You’ll notice here that we’re not talking about Product-Market Fit. That’s work that was done within the Product Team. Model-Practice Fit is about ensuring that the resources, processes, and skills within the design team fit the business needs.
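As a rough illustration (the exact formulas here are my assumptions; the article doesn't define them), both measures reduce to simple ratios: budget adherence compares actual spend to the planned budget, and utilization compares hours spent on funded project work to the team's available hours:

```python
def budget_adherence(actual_spend, planned_budget):
    """Percent of the planned budget actually consumed.

    100 means exactly on budget; above 100 signals overspend.
    """
    return round(100 * actual_spend / planned_budget, 1)

def utilization(project_hours, available_hours):
    """Percent of the team's available hours spent on funded project work."""
    return round(100 * project_hours / available_hours, 1)

# e.g. budget_adherence(95_000, 100_000) -> 95.0
# e.g. utilization(1_200, 1_600) -> 75.0
```

Tracked over time, either ratio drifting sharply in one direction is a prompt to revisit the fit, not a verdict on its own.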

3. Intuition:

Intuition is a gut feeling that is difficult to articulate. It is challenging to intuitively understand whether Model-Practice Fit has been achieved unless you have experienced situations where it was absent and situations where it was present.

Here’s a vibe that time and time again has shown me that Model-Practice Fit is strong:

When you have strong Model-Practice Fit, it feels like stakeholders are pulling you forward into their decision-making meetings rather than you pushing yourself into them.

The Key, Practical Points to Help You Find Model-Practice Fit

If you haven’t noticed yet, many of the things we teach here are structured activities to gain new insights into how you can learn to find your four fits.

The key questions to have answers for are:


Finding Design Practice - Partnerships Fit

Instead of thinking about partnerships as a separation of skills and responsibilities, think about them in a similar way that customers adopt new products and services. Some partners are early adopters, while there are others who adopt late – often begrudgingly.

In our previous lesson about Model-Practice Fit, we explored the importance of aligning design teams with the business models and strategies they work within. However, even with great Model-Practice Fit, the level of maturity in the partnership plays an outsized role in how quality is defined.

In my experience (and hearing from others), every design leader will be faced with situations where cross-functional partners operate at different levels of maturity, prioritize operations over innovations, or make decisions based on gut feel rather than business 101.

This brings us to our second fit: Design Practice-Partnerships Fit.

There are five essential elements required to examine Design Practice-Partnerships Fit:

  1. Understanding who your partners really are (beyond the org chart)
  2. Understanding your partners' perceptions of success
  3. Understanding the practical implications of achieving Practice-Partnerships Fit
  4. Identifying strong indicators of having Practice-Partnerships Fit
  5. Recognizing that Partnerships run on a spectrum, and the need for design maturity will be different with different partnerships

The wrong way to do it

When I first moved into Senior Management (my first time at EA), I was hired to lead a team of designers, researchers, front-end developers, and program managers. We were responsible for creating in-house enterprise software as it related to Customer Support. We worked alongside a brand new engineering team as well, and that partnership was solid from the start.

I leveraged all the skills I had learned at Apple, and began to bring these into the design practice at EA. We had what I thought was a solid design practice: strong processes, talented designers, and a clear vision for improving the user experience.

However, we kept running into resistance from our partners in HR, IT, and frankly, my boss.

I thought we were doing our job well by advocating for users, testing, validating, and creating polished designs. We did this because this is what I believed I was asked to do. After all, my boss even said, “EA needs what you did at Apple.”

What I failed to understand is that I was no longer surrounded by the same types of partnerships, skills, and beliefs that I had at Apple. The team I was leading was five steps ahead of what my new partners at EA had experienced before. As a result, some partners were confused or hostile, while others were energized and excited.

I was treating design as a binary, one-size-fits-all approach and could not see where this approach was working and where it wasn’t.

I had fallen into the trap of thinking that our only job was to create "good design," rather than understand that another job was to help our partners succeed. I had fallen into the trap of believing my way was the right way to succeed, and that failure led to predictable results: delayed projects, missed opportunities, arguments, compromised designs, and frustration all around.

A better way to find Practice-Partnerships Fit

Later in my career, during my second time at EA, I took a radically different approach.

Instead of thinking about partnerships as a separation of skills and responsibilities, we started thinking about them as relationships we needed to nurture and strengthen. We started thinking about them in a similar way that customers adopt new products and services. There are some who are early adopters (they follow the new trends), while there are others who adopt late – often begrudgingly.

This time, we examined four key elements:

  1. The Range of Success Metrics: We started by understanding what success looked like for each functional team. For Engineering, it was velocity, technical debt reduction, maintainable code, and reasonable sprint commitments. Game Studios were looking for feature delivery, consistent brand experience, and new competitive advantages. For HR, there was a strong focus on improving employee communications and perception. And for Customer Support, it was always cost savings, reduced ticket volume, and improved satisfaction.
  2. Perception of Value: As they say, perception is in the eye of the beholder. Despite many of the teams we worked with having stated goals and metrics, quite often the senior leader of those teams made decisions that were personal. The senior HR leader, for example, had “owned” EA’s intranet for years. No amount of data showing that their tools were disliked and weren’t being used convinced that leader that they were making poor decisions.
  3. Flexible Engagement Models: I have never seen, nor been successful with, a single engagement model for how partners engage with the design team. IMO, the conversations on whether to work like an agency or be embedded leave out two important factors: money and maturity. In the past, I’ve worked at several companies like EA where the design team worked on projects that came out of my boss’s budget as well as projects that came out of other departments. How a project or initiative is paid for greatly defines the engagement model. Many times, I’ve worked with Senior Leaders who needed help from the design team, but whose organizations did not need world-class results in order to meet their goals. In this sense, it’s important to meet the appropriate level of maturity as it relates to the context.
  4. Value Exchange: We clearly defined what each partner would get from the relationship and what they needed to contribute in order to get that value. Our focus was to clearly articulate the reciprocal flow of expertise, resources, and benefits between teams… write them down, and communicate them frequently throughout the project. By highlighting this give-and-take between teams, and recognizing both tangible deliverables and intangible benefits, it helped us assess and communicate the pros and cons of each proposed change.

One example of this fit was observed when we partnered with the VP of Customer Experience:

Team Success “Metrics”

In working with the VP of CX, we identified several specific challenges that influenced how we structured our partnership:

Perception of Success

What made this partnership particularly interesting was the importance of perception. While the stated goal was reducing bill-back costs to game studios, we quickly learned that the VP of CX needed to be seen as the hero of the story. This meant:

Engagement Model

Based on our understanding of both the metrics and perceptions of success, we customized our engagement model. Since the work was funded from their budget, we operated like an internal agency:

This model worked because it:

Value Exchange

With the engagement model established, we created explicit agreements about value exchange:

From the Design team:

From the CX team:

This clear value exchange helped both teams understand their responsibilities and prevented common partnership pitfalls like scope creep or unclear decision-making.

The Reality of Finding Practice-Partnerships Fit

Just like Model-Practice Fit, finding Practice-Partnerships Fit is an iterative process. Our work with EA's Customer Experience team provides a perfect example of this reality.

Initially, we thought our primary partnership would be just with the CX leadership team. After all, they were the ones with the budget and the stated goal of reducing costs. However, as we dug deeper, we discovered a complex web of necessary partnerships:

Each of these partnerships required a different approach because each team had different success metrics, perceptions, and ways of working:

With Game Studios, we:

With Support Agents, we:

With IT Infrastructure, we:

With Analytics Teams, we:

Design Practice – Partnership Fit is Not Binary

Finding Design Practice - Partnerships Fit

Instead of thinking about partnerships as a separation of skills and responsibilities, think about them the way customers adopt new products and services. Some partners are early adopters, while others adopt late – often begrudgingly.

In our previous lesson about Model-Practice Fit, we explored the importance of aligning design teams with the business models and strategies they work within. However, even with great Model-Practice Fit, the maturity of each partnership plays an outsized role in how quality gets defined.

In my experience (and from what I hear from others), every design leader will face situations where cross-functional partners operate at different levels of maturity, prioritize operations over innovation, or make decisions based on gut feel rather than business fundamentals.

This brings us to our second fit: Design Practice-Partnerships Fit.

There are five essential elements required to examine Design Practice-Partnerships Fit:

  1. Understanding who your partners really are (beyond the org chart)
  2. Understanding your partners' perceptions of success
  3. Understanding the practical implications of achieving Practice-Partnerships Fit
  4. Identifying strong indicators of having Practice-Partnerships Fit
  5. Recognizing that partnerships exist on a spectrum, and that the level of design maturity required differs from partnership to partnership

The wrong way to do it

When I first moved into senior management (my first time at EA), I was hired to lead a team of designers, researchers, front-end developers, and program managers. We were responsible for creating in-house enterprise software for Customer Support. We worked alongside a brand-new engineering team as well, and that partnership was solid from the start.

I leveraged the skills I had learned at Apple and began bringing them into the design practice at EA. We had what I thought was a solid design practice: strong processes, talented designers, and a clear vision for improving the user experience.

However, we kept running into resistance from our partners in HR, IT, and frankly, my boss.

I thought we were doing our job well by advocating for users, testing, validating, and creating polished designs. We did this because it was what I believed I had been asked to do. After all, my boss even said, “EA needs what you did at Apple.”

What I failed to understand is that I was no longer surrounded by the same types of partnerships, skills, and beliefs that I had at Apple. The team I was leading was five steps ahead of what my new partners at EA had experienced before. As a result, some partners were confused or hostile, while others were energized and excited.

I was treating design as a binary, one-size-fits-all practice, and I could not see where that approach was working and where it wasn’t.

I had fallen into the trap of thinking that our only job was to create "good design," rather than understand that another job was to help our partners succeed. I had fallen into the trap of believing my way was the right way to succeed, and that failure led to predictable results: delayed projects, missed opportunities, arguments, compromised designs, and frustration all around.

A better way to find Practice-Partnerships Fit

Later in my career, during my second time at EA, I took a radically different approach.

Instead of thinking about partnerships as a separation of skills and responsibilities, we started thinking about them as relationships we needed to nurture and strengthen. We started thinking about them the way customers adopt new products and services. Some are early adopters (they follow new trends), while others adopt late – often begrudgingly.

This time, we examined four key elements:

  1. The Range of Success Metrics: We started by understanding what success looked like for each functional team. For Engineering, it was velocity, technical-debt reduction, maintainable code, and reasonable sprint commitments. Game studios were looking for feature delivery, a consistent brand experience, and new competitive advantages. For HR, there was a strong focus on improving employee communications and perception. And for Customer Support, it was always cost savings, reduced ticket volume, and improved satisfaction.
  2. Perception of Value: As they say, perception is in the eye of the beholder. Despite many of the teams we worked with having stated goals and metrics, quite often the senior leader of those teams made decisions that were personal. The senior HR leader, for example, had “owned” EA’s intranet for years. No amount of data showing that their tools were disliked and weren’t being used convinced that leader that they were making poor decisions.
  3. Flexible Engagement Models: I have never seen, nor been successful with, a single engagement model for how partners engage with the design team. In my opinion, the debate over whether to work like an agency or be embedded leaves out two important factors: money and maturity. In the past, I’ve worked at several companies, EA among them, where the design team worked on projects funded from my boss’s budget as well as projects funded by other departments. How a project or initiative is paid for greatly shapes the engagement model. Many times, I’ve worked with senior leaders who needed help from the design team, but whose organizations did not need world-class results to meet their goals. In this sense, it’s important to meet the appropriate level of maturity for the context.
  4. Value Exchange: We clearly defined what each partner would get from the relationship and what they needed to contribute in order to get that value. Our focus was to clearly articulate the reciprocal flow of expertise, resources, and benefits between teams… write them down, and communicate them frequently throughout the project. By highlighting this give-and-take between teams, and recognizing both tangible deliverables and intangible benefits, it helped us assess and communicate the pros and cons of each proposed change.

One example of this fit was observed when we partnered with the VP of Customer Experience:

Team Success “Metrics”

In working with the VP of CX, we identified several specific challenges that influenced how we structured our partnership:

Perception of Success

What made this partnership particularly interesting was the importance of perception. While the stated goal was reducing bill-back costs to game studios, we quickly learned that the VP of CX needed to be seen as the hero of the story. This meant:

Engagement Model

Based on our understanding of both the metrics and perceptions of success, we customized our engagement model. Since the work was funded from their budget, we operated like an internal agency:

This model worked because it:

Value Exchange

With the engagement model established, we created explicit agreements about value exchange:

From the Design team:

From the CX team:

This clear value exchange helped both teams understand their responsibilities and prevented common partnership pitfalls like scope creep or unclear decision-making.

The Reality of Finding Practice-Partnerships Fit

Just like Model-Practice Fit, finding Practice-Partnerships Fit is an iterative process. Our work with EA's Customer Experience team provides a perfect example of this reality.

Initially, we thought our primary partnership would be just with the CX leadership team. After all, they were the ones with the budget and the stated goal of reducing costs. However, as we dug deeper, we discovered a complex web of necessary partnerships:

Each of these partnerships required a different approach because each team had different success metrics, perceptions, and ways of working:

With Game Studios, we:

With Support Agents, we:

With IT Infrastructure, we:

With Analytics Teams, we:

Design Practice – Partnership Fit is Not Binary

Like Model-Practice Fit, Practice-Partnerships Fit exists on a spectrum. In our CX partnership work, we observed three primary partnership states that evolved over time:

1. Transactional

Initially, many game studios viewed CX purely as a cost center:

2. Collaborative

As trust built, partnerships evolved:

3. Integrated

Eventually, with some studios, we achieved:

Signals of Practice-Partnerships Fit

How do you know if you have strong Practice-Partnerships Fit? In our CX transformation, we looked for these key indicators:

Qualitative Signals

Quantitative Signals

Intuitive Signals
The strongest signal was when game studios started presenting their support strategies to their leadership without needing the CX team in the room – not because they were excluding CX, but because they truly understood and believed in the support approach. They had become genuine advocates for player-centric support, not just cost management.

Key Practical Points to Help You Find Practice-Partnerships Fit

To build strong Practice-Partnerships Fit, focus on these key questions:

  1. How does your design practice make your partners' jobs easier? (For CX, we made it easier for studios to manage support costs while maintaining quality)
  2. What are your partners' primary measures of success, and how are you helping them achieve those? (For studios: launch success, player satisfaction, cost management)
  3. Where are the friction points in your current partnerships, and what changes in your practice could reduce that friction?
  4. How can you adjust your practice to meet partners at their current level of design maturity?
  5. What value exchange model makes sense for each partnership?

Here at CDO School, we help you develop these partnerships through several key frameworks, step-by-step quick guides, and our course, Activating Change.


Looking Ahead

In our next article, we'll examine Partnerships-Positioning Fit, exploring how these partnerships influence the way Design positions itself within the organization. In other words, how Design decides which advantages that only it can create are truly needed at the company.

Remember: The most successful design practices aren't those with the most talented designers or the most refined processes – they're the ones that understand how to create and nurture partnerships that drive mutual success.
