
10 Steps to Creating a Data-Driven Design Culture

At Qminder, we pride ourselves on having a data-driven, systematic approach to design. But it wasn’t always like this.

Three months ago, we didn’t have data to visualize our product usage, we didn’t do user interviews or usability testing, nor did we have design objectives or research know-how.

Sound familiar? Then this article is for you.

Today, we’re going to discuss how we started to build a data-driven design culture and what we have achieved within three months.

What is data-driven design?

Data-driven design is an approach in which decisions are made based on both quantitative and qualitative data, leaving personal (gut) feelings or opinions aside.

For example, let’s say the team wants to build a mobile app, but user interviews tell us that 85% of the user base has to use desktop computers due to company policies.

In that case, it doesn’t make sense to go with a mobile app; instead, we should focus on improving the existing desktop application.

If we didn’t have that data, we might go with our personal opinions and start building a mobile app that costs the company both time and money (and ends up with only a few people using the product).

Why do we want to be data-driven?

In short: products designed with users in mind are more likely to succeed.

Doing research at an early stage helps confirm the viability of products and saves the company a lot of time and money. It doesn’t matter how much a designer, software developer, or product manager loves the product.

It only matters what users think.

To design user-focused products, we need to understand our users and the problem we’re trying to solve.

Collecting both quantitative and qualitative data about user behavior helps to answer critical business questions, such as:

  • Who are our users?
  • Why are they using our product?
  • What is the value of our product?

Data-driven design 101: where to start

1. Do a design audit

Start by doing research. Good news: you’re on the right track, since you’re already reading this article!

But in all seriousness, audits help you understand where your company stands in terms of design maturity. We started by asking questions about the data:

  1. Do we know how many people visited this page during the past month?
  2. Do we know how many people are using our product via desktop or mobile phone?
  3. Do we know how many people signed up last week?

In our case, most of the answers were no. So we collected everything we wanted to find out and needed to start tracking into one document.

Secondly, as a new member of the team, I tried to understand what resources we had, what the main procedures were, who was responsible for what, and who could help me with specific tasks.

This included talking to different stakeholders and team members.

For instance, I found out that even though some tracking events were available, we did not visualize them, nor did everyone in the company have access to the data. That prompted us to find a tool where we could keep the data and share it with the whole team.

Moreover, talking to team members helped me learn about the product and understand our users better.

In short, you can start by analyzing the following:

  • Data: Start asking questions about your product’s usage. Collect everything you want to know (including existing and missing data) into one document, and use it as a backlog.

  • People and resources: Talk with people to learn as much as you can about the product, users, and how work is being done.

  • Procedures and tooling: Find out which design methods and tools are being used and how information is being shared.

💡 Tip: When doing an audit, you can also try out the Nielsen Norman Group’s UX Maturity Quiz. It helps estimate your organization’s UX maturity level.


Stages of UX maturity, image courtesy of nngroup.com

2. Data-driven culture starts at the top

One of the mistakes I have seen in past attempts to build a data-driven design culture is a lack of alignment on goals between teams and upper management.

It’s important to set the expectations and agree with your managers about what, why, and how you plan to build. Top management can set an example and propagate the data-driven mindset downwards.

When I joined Qminder, I had several conversations with our CEO and CPO to agree on our goals.

Here’s how we reinforce a data-driven mindset on a day-to-day basis:

  • The CEO shares data insights during the company-wide all-hands meeting.

  • Team leads share monthly reports (based on data) during the all-hands meeting as well as in written form in chat.

We challenge each other (leaders and individuals alike) during weekly product sync meetings by asking, “Do we know how many users…”, “Do we know why…”, and more.

3. Set specific goals

After joining Qminder, I asked myself immediately: “Where do we want to be at the end of 2021?”

One of our company-wide objectives was to build a feedback loop between us and our clients, which meant establishing a data-driven design process. We split this into smaller, more specific goals:

  1. We measure the impact of what we create (for example, when we release a new feature, we have methods to measure whether the feature is performing well or not).

  2. We make decisions based on the data we have collected (this means we won’t start working on a new project just because we like the idea; instead, the idea is backed with data).

  3. We collect quantitative data, where the results can be presented in numbers (i.e., “how many, how often, or how much”), and qualitative data, where the results take the form of observations, interviews, feedback, and feelings and answer “why”.

Again, agree on these goals with your manager or team lead or even with the CEO, depending on the size and structure of your company.

4. Start collecting quantitative data

Product analytics

We need data to get a bird’s-eye view of what people are doing and understand how the product works.

We had already created a backlog of questions that needed answers. With the help of the product and development teams, missing tracking events were added to the code. The next step was to find a data analytics tool that would allow everyone in the company to explore patterns in the data, find answers, and share their findings.
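To make the idea of a “tracking event” concrete, here is a minimal sketch of what one could look like on the backend. The `track_event` helper and the event and property names are hypothetical illustrations, not Qminder’s actual implementation; most analytics tools accept a similar payload of an event name plus properties.

```python
# Minimal sketch of server-side event tracking (hypothetical helper and names).
from datetime import datetime, timezone
import json

def track_event(user_id: str, event: str, properties: dict) -> None:
    """Record a single analytics event; in practice this would be sent to
    an analytics tool or data warehouse rather than printed."""
    payload = {
        "user_id": user_id,
        "event": event,
        "properties": properties,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    print(json.dumps(payload))  # stand-in for an API call or queue write

# Example: answering "how many people signed up last week?" requires an
# event like this to exist in the first place.
track_event(
    user_id="user_123",
    event="signed_up",
    properties={"plan": "trial", "platform": "desktop"},
)
```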

There are many tools that can help you with that, depending on your needs and budget: Google Analytics, Mixpanel, Looker, etc. We chose Looker because it suited our needs best at the time, and it still does.

We started building dashboards one step at a time to see how users interact with our product. It’s easy to get overwhelmed by all the possibilities, especially when starting out. For instance, we took one feature at a time (say, Data Insights) and broke it into smaller views (Team Performance and Visitor History).

(Speaking of which, our Team Performance has just gotten a visual and functional upgrade. Go check it out!)

You don’t need everything at once. Go slow and prioritize.

We also created a separate Slack channel called #analytics, where people can ask questions, raise topics, and share updates. Communication is key.

Surveys and questionnaires

Surveys and questionnaires are easy to use and low-cost ways to produce a combination of quantitative and qualitative data.

At Qminder, we measure CSAT (customer satisfaction score) to understand how our users perceive the product — i.e., what they say and how they feel.

The question that CSAT asks is simple: “How satisfied are you with Qminder?”, with the answer being a single number on a scale from 1 to 5.

As a follow-up question, we ask users to tell us more about why they chose a certain score. Since January of this year, we have increased our CSAT from 69 to 85, which is another indicator of how our users are feeling.

(For more information on CSAT and other satisfaction metrics, read our in-depth guide.)
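The article does not spell out how the score itself is calculated, but the common convention (consistent with values like 69 and 85 coming from a 1-to-5 question) is to report CSAT as the percentage of respondents who answer 4 or 5. Here is a minimal sketch, assuming that convention:

```python
# A common way to compute CSAT from 1-5 responses: the share of respondents
# who answered 4 or 5, expressed as a percentage. (This is the standard
# convention; the exact formula Qminder uses is not stated in the article.)

def csat(responses: list[int]) -> float:
    satisfied = sum(1 for r in responses if r >= 4)
    return round(100 * satisfied / len(responses), 1)

# Example: 7 satisfied answers out of 10 responses -> CSAT of 70.0
print(csat([5, 4, 3, 5, 2, 4, 5, 1, 4, 5]))  # 70.0
```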

In addition, we use on-page surveys to gather feedback on new or existing features.

For example, we’ve been using Hotjar on-page surveys to ask users how they would rate their experience with a new feature on a scale from 1 to 5 and if there’s anything they were missing on the page.

We use this data to measure features’ performance as well as to improve the product.

Visual design and aesthetics are often hard to measure and are therefore treated as subjective matters. One interesting way to measure a design is to look at how it makes users feel and what words they associate with it.

At the beginning of this year, we conducted an email survey where we asked our users to pick up to five words out of 26 that best describe how they feel about Qminder.

(There are 118 words in the full list of the product reaction words originally included in the Microsoft Desirability Toolkit.)

Then, we explained the term “branding” and asked users to rate the branding of Qminder on a scale from 1 (I hate it) to 7 (I love it). This survey aimed to measure and understand how users feel about the brand, visual design, and aesthetics.

Our goal was to collect data before redesigning the product so that we could repeat the survey after the major update(s) and compare the results.
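As a rough illustration of how such before-and-after comparisons can be made, the sketch below counts how often each reaction word was picked in two survey rounds and prints the change in share. The words and responses are invented for illustration; the real survey used the 26-word subset mentioned above.

```python
# Minimal sketch of comparing product-reaction-word surveys before and after
# a redesign. The words and responses here are made up for illustration.
from collections import Counter

def word_shares(picks: list[list[str]]) -> dict[str, float]:
    """Share of respondents who picked each word (each picked up to 5 words)."""
    counts = Counter(word for respondent in picks for word in respondent)
    return {word: round(100 * n / len(picks), 1) for word, n in counts.items()}

before = [["reliable", "dated", "busy"], ["reliable", "slow"], ["useful", "dated"]]
after = [["reliable", "clean", "useful"], ["clean", "fast"], ["reliable", "useful"]]

b, a = word_shares(before), word_shares(after)
for word in sorted(set(b) | set(a)):
    print(f"{word}: {b.get(word, 0)}% -> {a.get(word, 0)}%")
```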

5. Start collecting qualitative data

Once you have gathered insights on what people are doing, it’s time to take a deep dive into your users’ behavior, problems, and needs.

I highly recommend combining quantitative data with qualitative methods, such as interviews, surveys, usability tests, observations, etc. Quantitative data gives us an understanding of what is happening, but we also need to figure out why it is happening.

Three months ago, I was working at Qminder as the only designer (and without a researcher), and that’s where my biggest struggles started.

One of my goals was to stop making guesses and instead make the improvements I knew our users needed.

What helped me the most back then was this:

  • Writing down the most important questions I needed answers to (the ones that answered “why”).

  • Getting the most out of Hotjar. I started watching screen recordings, which capture user behavior in detail: mouse hovers, clicks, and scrolls.

This helped me understand what users actually do.

  • I discovered a new user flow.
  • I found out that users couldn’t find a search option and spent over half a minute searching manually instead.
  • I understood how often a specific user group used a feature that we didn’t know they even needed.

I started experimenting with on-page surveys. For example, I began asking users, “What’s the one thing we should change on this page?” and “What other information would you like to see on this page?”

In addition to getting feedback about the product, we discovered a few bugs while doing those surveys.

Moreover, I ended up with even more questions that we did not know we should have been asking (you don’t know what you don’t know). These questions were valuable input for the next step: user interviews.

Finding participants, preparing, scheduling, and analyzing interviews most probably deserves another blog post, so stay tuned.

In short, quantitative and qualitative data complement each other well.

Quantitative data sets a great foundation for understanding user behavior and helps identify UX trouble spots.

Qualitative data reveals users’ needs by adding a narrative to the numbers.

6. Analyze the data

So you have the data? Great. But we’re far from over.

Now you have to do something with this data.

Transforming raw data into valuable information, making it presentable, and finding a conclusion from that can often be challenging.

Interpreting quantitative data (yes, quantitative, not qualitative) in a rational and objective way is one of the biggest obstacles I have encountered when analyzing the data.

Depending on your resources, you may not have a dedicated role for that in the team. If you don’t, no worries — we don’t, either.

You’ll just have to figure it out by yourself. (It’s not that scary, I promise.)

That’s where mixing and combining different methods comes in handy.

For example, when working on one of the core features at Qminder, we found ourselves with too many ideas and no clear way to prioritize them. We also had loads of data collected from different sources, such as user interviews, support tickets, surveys, screen recordings, and product analytics.

After working on that data, organizing it, and making it presentable, we were able to use a modified ICE framework to prioritize our list.

Analyzing the data helped us understand the impact of specific features and boosted our confidence. The framework is not bulletproof, but it helps summarize research and gives a clear overview of what we should work on next.
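For readers unfamiliar with ICE, here is a minimal sketch of the standard version of the framework: each idea gets Impact, Confidence, and Ease ratings (commonly 1-10), and the product of the three is used to rank the backlog. Qminder used a modified variant, so the exact scoring may differ, and the idea names below are purely illustrative.

```python
# Minimal sketch of standard ICE prioritization (Qminder used a modified
# variant, so treat the exact scoring as illustrative).
from dataclasses import dataclass

@dataclass
class Idea:
    name: str
    impact: int      # how much the change would move the needle (1-10)
    confidence: int  # how sure we are, based on the collected data (1-10)
    ease: int        # how easy it would be to build (1-10)

    @property
    def ice_score(self) -> int:
        return self.impact * self.confidence * self.ease

ideas = [
    Idea("Improve in-page search", impact=8, confidence=7, ease=6),
    Idea("Redesign settings page", impact=5, confidence=4, ease=3),
    Idea("Add export to CSV", impact=6, confidence=8, ease=9),
]

# Rank ideas from highest to lowest ICE score.
for idea in sorted(ideas, key=lambda i: i.ice_score, reverse=True):
    print(f"{idea.name}: {idea.ice_score}")
```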

7. Hire a product researcher

You cannot, and should not, do everything alone. I’m lucky to have a really supportive team that trusts me 🤗

Right after joining the company, I recommended hiring a product researcher. The team heard my reasoning on the value of user research, and we posted the job ad soon after.

Like hiring for any other new role, you should first clarify the responsibilities and expectations for the future team member.

Our goal was to find someone passionate about answering the most challenging questions in design by helping our team understand users’ behavior, problems, and needs.

The primary responsibility for the role was to research multiple aspects of products and experiences through user interviews, usability tests, surveys, and observations.

Fast forward three months and many candidates later, we found the person we feel can do all of that.

8. Show your work

Showing how design and research impact business goals is important for learning and improvement, as well as for sharing knowledge and increasing transparency across the company.

The definition of data-driven design is not always consistent across the organization. Therefore, it is vital for a designer or researcher to increase the visibility of design by sharing information and opening up conversations.

There are many different ways to showcase your work and keep the team in the loop. Here’s a quick rundown of what we do at Qminder:

  1. Weekly syncs between product, development, design, and research.

  2. A separate Slack channel called #design-showcase, where the design and research teams share their work: feature updates, UI visuals, new design system components, research reports, upcoming plans, work in progress, etc. This channel is also an opportunity for people from other teams to ask questions and share their perspectives.

  3. Presentations at company-wide meetings. I think that’s one of the most important ones — you can include everyone and provide context by covering why, how, and what you’re working on.

  4. Sharing know-how and articles about UX, research, and data-driven design. Once again, we have a specific channel for that — #read.

9. Review your goals

Goal-setting is an ongoing process, and you need to constantly review your goals if you want to succeed.

I keep an eye on my goals weekly, if not daily. That doesn’t mean everything we do must support our objective (establishing a data-driven design process).

But if something doesn’t, that’s an excellent prompt to ask why we are doing it.

Besides keeping an eye on the goals myself, I try to involve the whole team to increase transparency and to share what we’re working on and why.

At most of our company-wide all-hands meetings, I start by reviewing the goals we set at the beginning of the year and what we have achieved, and then identify opportunities for improvement.

10. Form a habit of a data-driven mindset

They say it takes 21 days to form a habit and 90 days to turn it into a lifestyle.

Brace yourself for the long haul because data-driven design is a big commitment.

You might complete all the steps at once, but the key to success is to keep the momentum going. Being systematic and forming a habit of a data-driven mindset is what prevents the organization from falling back.

I have seen many companies start using new tools to either conduct user research or collect data on their products’ usage. But after a month or so, these tools end up collecting dust.

What makes a difference and minimizes the risk of wasting effort and resources is understanding what the company wants to achieve, setting goals, and agreeing on those goals at both the team and company level.

Make sure that the whole team is on the same page and that you have each other’s backs.

Don’t worry if something is not going as swimmingly as you hoped. At Qminder, we have a lot to improve and many challenges ahead, but we are also on the right track to becoming a fully user-centered company where data-driven design is a norm.
