In our last article, we discussed how to validate an idea with user interviews before investing time and money into development. So, what does the process of conducting user interviews actually look like? In this post, we'll walk you through a time when we used our validation process and user interviews to test an idea.
Like many ideas, ours came as a result of a problem we were experiencing at Remedy.
Last year, we began to grow rapidly as an organization. Our technology team grew from 15 engineers to 20, 30, 40, and now 50 engineers. When our technology organization was small, leadership could easily understand the productivity and pain points of each team and every individual engineer.
As we scaled up, it became increasingly difficult to measure team effectiveness and support individual engineers in their growth. We relied on subjective feedback and lacked objective measures to help us understand the health of our engineering team.
After a few casual conversations with friends at other engineering organizations, we confirmed that we were not alone in this problem.
We formed a hypothesis based on our experience and some logical assumptions:
- Problem: Engineering leaders often have a difficult time identifying struggling team members, ineffective practices, and broken processes.
- Background: There are objective measures that can help engineering leaders make sense of confusing problems.
- Solution: If provided with objective measures that have actionable recommendations, engineering leaders and teams could be empowered to solve productivity problems faster and support individual engineers to help them grow professionally.
These metrics would serve as a baseline that could help managers track progress and understand how changes, such as incorporating new tools or altering team composition, affect productivity.
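To make "objective measures" concrete: one common example is pull-request cycle time, the elapsed time from opening a PR to merging it. The snippet below is our own hypothetical sketch (the data shape and function name are assumptions, not part of any specific tool):

```python
from datetime import datetime

# Hypothetical PR records with opened/merged timestamps
# (in practice these would come from a version-control API).
prs = [
    {"opened": datetime(2021, 10, 1, 9), "merged": datetime(2021, 10, 2, 15)},  # 30h
    {"opened": datetime(2021, 10, 3, 10), "merged": datetime(2021, 10, 3, 18)},  # 8h
]

def avg_cycle_time_hours(prs):
    """Average time from PR opened to merged, in hours."""
    total_seconds = sum((p["merged"] - p["opened"]).total_seconds() for p in prs)
    return total_seconds / len(prs) / 3600

print(avg_cycle_time_hours(prs))  # 19.0
```

Tracked over time, a metric like this gives managers the kind of baseline described above: a way to see whether a new tool or a change in team composition moves the needle.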
The first step in market validation is to define the size of the market. If you plan to raise venture funding, you need to make sure you are serving a large enough market to ensure your company can generate venture returns.
We performed a basic top-down market analysis to get a sense of the potential. Our idea sat at the intersection of several market segments capable of supporting a multi-billion dollar company:
- Team collaboration software
- Project management software
- Engineering tools
We performed a competitor analysis in 2 steps:
- We listed the preliminary set of issues we wanted to solve and how we would address them.
- We ran search queries reflecting common development pain points.
Over the span of a week, we found tools that could help address engineering efficiency issues. Once we determined the right keywords and query terms, our list of competitors snowballed. Some user interview subjects even identified competitors during our conversations: “Oh, this sounds like _____.”
We reviewed each solution individually to see whether and how it addressed the issues we sought to fix. Our list of potential competitors included companies that provided 1 of 3 different services:
- Code analytics: Haystack, Waydev, Code Climate, GitClear, BlueOptima
- Coaching for software teams: Pluralsight, Botany
- Visualizing data across multiple development platforms: Screenful, LinearB
Demand analysis is a critical step that will help you validate market appetite. Are people searching for a solution like the one you want to build? Are users finding your potential competitors?
We found that there were solutions on the market, but our target users did not know about or utilize them. By comparing historical data with the current stats, we estimated:
- The demand for a solution
- The stability of our competitors' positions in the industry
- The cost of growth marketing
We decided to analyze key metrics of the most popular providers and market leaders. In October 2021, traffic to the selected domains totaled 3.6M visits (+3.33% compared to the previous month), and the estimated combined cost for those domains to rank for their organic keywords (an important input for growth marketing) was $570K.
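For reference, the month-over-month figure above implies roughly 3.48M visits in the prior month. A minimal sketch of the back-of-the-envelope arithmetic:

```python
current_visits = 3_600_000   # October 2021 traffic to the selected domains
mom_growth = 0.0333          # +3.33% vs. the previous month

# Back out the previous month's traffic from the growth rate.
previous_visits = current_visits / (1 + mom_growth)
print(round(previous_visits))  # ≈ 3,483,983
```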
To understand domain and market dynamics, we studied each domain's share of visits and its traffic relative to the market's overall traffic trends. This helped us better understand how competitors allocate marketing budgets and resources.
Among the highlighted market players, there were only a few stable companies who maintained a steady flow of traffic that didn’t dramatically fluctuate from month to month. They owned the largest piece of the market share while the rest of the market was fragmented among a group of similar competitors with no significant traffic growth or decrease.
We outlined our target user persona by describing the person’s characteristics, desires, and motivations:
- Background: Our target user is a technology leader in an organization of 15+ engineers with multiple teams. This engineering leader is analytical and curious.
- Concerns: She is concerned about efficiency and wants to pinpoint productivity gaps to better manage her budget.
- Desires: She wants help identifying struggling engineers, underperforming teams, and broken processes. She is looking for insights beyond 1:1s and periodic check-ins with managers and team members.
To refine our persona, we conducted user interviews with a small number of key tech leaders at large enterprises.
We realized that engineering leaders at large, enterprise-level organizations could benefit from this tool, but they had a whole host of problems, questions, and desires that we couldn't address. Eventually, we would want to serve these users, but we decided to narrow our focus for our initial product so that we did not spread our resources and attention too thin.
We adjusted our user persona to be an engineering leader at an organization with 15-200 engineers with 3-25 teams.
We built an initial target list that matched our target user persona, looking for titles like CTO, VP/Director/Manager of Engineering at tech startups, agencies, and small enterprises.
We started by sourcing subjects from our first degree connections. Once the list was compiled, we created an outreach template that we could send out over email and LinkedIn.
We mobilized our networks by posting to social media and Slack communities, asking for recommendations of technology leaders who fit our target user persona.
We used the first set of UX interviews to source additional subjects by asking for recommendations. When interviewees recommended a new contact, we asked for an introduction in a follow-up email after the interview. We include an example of this in the Interview Follow-Up Section below.
This approach served 2 purposes:
- Increasing our UX user interview pool
- Expanding our reach beyond our direct network
We prepared the following user interview template consisting of an introduction, questions, wrap-up remarks, and final ask:
I am conducting user interviews to learn how different engineering organizations are structured and how they measure their teams. Please provide candid responses and speak for as long as you would like on each topic. I won't stop you.
- What's the size of your engineering organization?
- How is your organization structured?
- What is the division between employees, contractors, and outside development partners?
- How do you measure your engineering team’s effectiveness?
- What information do your engineering teams get about their own effectiveness?
- How do you evaluate individual engineers?
- How do you get a complete picture of engineering effectiveness across different teams?
- What tools do you currently use for managing your engineering team?
- What metrics/indications do you believe measure team effectiveness?
- What are the blind spots in your engineering management process?
Thank you for your time and candor. I am conducting these interviews because these are the questions we are asking ourselves internally. As our engineering team has grown, we have realized that it is difficult to measure team effectiveness and evaluate individual engineers.
We rely on subjective feedback but don't have any objective measures. We are considering building a product to understand engineering teams better, reinforce positive code practices, and provide frequent feedback to engineers and teams.
Is there anyone else you know that I should speak to about this topic?
Assuming we move forward with building this tool, would you be interested in participating in a beta?
Don't forget to follow up with your interviewees. It's an important step when conducting user interviews.
Your follow-up is an opportunity to thank subjects for their time and acknowledge their role in helping with your research. It is also a great moment to source new interview subjects.
If a participant is willing to reach out to people in their network, I always include a few sentences that they can copy and paste. These sentences cover 5 main elements:
- Who I am
- What I am doing
- Why I am doing it
- Whom I am looking to speak with
- The ask
Results and Learnings
Over the course of several months, we conducted 40 UX research interviews with technology and engineering leaders at organizations ranging from early-stage startups to small enterprises. In each conversation, we explored:
- How they subjectively and objectively measure engineering effectiveness
- What tools they use in their processes
- How they think about managing their growing technology organizations
Our research and user interviews led us to the conclusion that although the problem we hypothesized exists, it wasn't an urgent problem that tech leaders were actively looking to solve.
We learned from our secondary research that there are many good solutions on the market, and our subjects were familiar with many. Yet even the engineering leaders who identified the lack of objective engineering measures as an issue for them were not utilizing these existing tools.
From this, we realized that our subjects weren't doing anything to substitute the process that a tool like ours would solve. This was perhaps the most important negative indicator: if users aren’t finding another way to solve a pain point, then it’s probably not that big of a pain.
We concluded that this is a problem that people identify, discuss, and maybe even complain about, but they do not consider it important enough to solve.
We ultimately chose not to pursue this project. We spent a few dozen hours conducting user research interviews, as well as preparing interview questions and templates.
As you’ve seen, we know the struggle of dedicating so much time only to turn up empty-handed. It's hard to come to terms with this outcome, but we like to think it's better to learn the lesson early and avoid losing several months and tens of thousands of dollars building a solution that some people want but no one needs.