
An in-depth look into how our usability testing shaped our product (and the visuals & scripts we used to do it).

Less than three months ago, I quit my executive-level job at a small, VC-funded IoT startup to join a local software company with a unique background and a dire need for a UX overhaul. With over 2.5 years invested in a product on a small team of 15, the decision to leave felt less like quitting a job and more like leaving a family. But, in order to grow in my career as a UX designer and entrepreneur, the decision wasn’t really a decision at all.

Now, after spending three months in the trenches of a redesign, I wanted to provide some insight into how usability testing has shaped the direction of our product, as well as share the exact tests I’ve been using along the way.

For this guide, I’ll be focusing on one page, which helps users find, view, and manage physical products in a spreadsheet-based view. Our product is used by a variety of Fortune 50 retailers to manage their vendors’ products sold online and in-store. Here’s a sneak peek at the final design for the page:

Phase 1: User Interviews

Luckily, our product has existing users ranging from small businesses to large brands like Kohler and Bosch, so, working alongside our customer success team, I scheduled as many 1-on-1 interviews as possible during a seven-day period.

Tools we used for phase 1:

  • Google docs, forms, and sheets
  • Confluence
  • Slack
  • Cell phone

Note: for this phase of the process, I only wanted to speak with users 1-on-1 to dive into each user’s individual needs, goals, pain points, frustrations, etc., and to avoid any groupthink.

Before conducting any interviews, I needed to:

  • Write, edit, and test the interview script and questions to ensure that the conversation flowed and to gauge the approximate duration of each interview
  • Develop a process to synthesize, document, and share interview findings

Interview Introduction:

We developed an end-to-end interview process that was tested internally and tweaked based on feedback. For the interviews, I would be leading the discussion, and Tim, my fellow UX designer, would be listening in and recording the answers in a Google form to ensure that I could focus on engaging with the interviewee.

Note: having a second person on the call is key, as the lead interviewer is able to solely focus on engaging the user and guiding the conversation up and down rabbit holes in search of golden nuggets of information.

The interviews kicked off with small talk, a high-level overview of why we’re conducting these interviews (learn from users in an objective manner), and then jumped into a scripted introduction, which is as follows:

First, I want to thank you for your time and for giving me the opportunity to understand how you interact with our products. To give a little context, I’m a User Experience Designer that’s recently joined Edgenet with one goal: completely rebuild the product from start to finish so that you’re able to effectively (and enjoyably) do your job.

This conversation is purely exploratory, and none of my questions are focused on testing you. I really just want to understand your role and how you use our product today. Throughout our chat, I encourage you to be as open and blunt as possible, as this information will play a crucial role in how we improve the product.

The conversation should last no longer than 45 minutes, and if there’s ever a question that does not make sense, please let me know and I’ll further clarify.

Before we jump in, do you have any questions for me?

Interview Questions:

After reading through the introduction, the question portion of the interview lasted anywhere from 30 to 60 minutes and provided a wealth of insight into each user’s who, what, where, when, and, most importantly, why.

All in all, we asked 32 questions, which were bucketed into 5 groups, each kicked off with a short introduction:

  • Profiling: 6 basic questions to help develop personas later (Ex: “What is your role and how would you describe that to someone who is unfamiliar with {company name}?”)
  • Problem: 7 questions focusing on the primary problem they’re trying to solve using our product (Ex: “What are you trying to accomplish when using our product?”). Questions are intentionally vague, as most golden nuggets are surfaced during follow-up questions.
  • Product Usage: 6 questions to help understand how they currently use the product at a granular level (Ex: “What makes you feel as if a product is complete?”). This was my favorite question because it was so vague and typically resulted in multiple unplanned follow-up questions.
  • Situational: 5 questions that introduced a situation and asked the interviewee to walk me through how they would go about completing a task. The most enlightening set of questions, as the user is reciting how they use our product by memory (Ex: “If I gave you a list of 20 products right now and asked you to add them to your account, how would you go about doing that?”).
  • Solutions: 8 questions focusing on likes, dislikes, and wishes for the product. (Ex: “Forget about what is and what is not possible. If I gave you a magic wand and you could wave it to solve any problem with our product, what would it be?”). At this point, I knew what they were going to say, but this section really helped reiterate and further clarify pain points.

Interview Summaries:

After the 45–60 minute interview was complete, we gave an overview of next steps, said our goodbyes, and immediately followed up with a thank-you e-mail, so they had a direct line of communication in case they had additional feedback to share in the future.

Then, Tim and I would debrief on a call to document a summary in a Confluence wiki page that included 7–10 key points and 3–5 takeaways, which was shared with the full product and CSM team in Slack.

After 20 interviews in a week, the feedback began to plateau, so we stopped scheduling calls and started focusing on the next phase: discovery and ideation.

Phase 2: Ideation (Solo White-boarding)

At the ideation stage of the process, I’m a huge fan of our team working solo to rapidly develop creative ways to solve our users’ problems. The reason? It keeps a group of designers and PMs from sitting in a room with a blank slate “collaborating.” By starting solo, we’re able to timebox ourselves to think individually, then come together with tangible progress to critique, merge, and build upon.

Unfortunately, I forgot to take a picture of the whiteboard, so you’re just going to have to trust me that this happened. On to phase 3!

Tools we used for phase 2:

  • Whiteboard
  • Markers (not permanent ones, of course)

Phase 3: A/B In-Person Testing (Wireframes)

After developing a general direction for the product, Tim and I moved away from the whiteboard to develop a formal workflow for in-person testing, which was happening on-site at Lowe’s HQ two weeks after completing 1-on-1 interviews. Luckily, over 90 users were visiting Lowe’s for on-boarding training, and the UX team was invited to join and conduct usability tests between sessions.

Tools we used for phase 3:

  1. Balsamiq
  2. Lookback.io
  3. InVision
  4. Google Docs

Usability Test Overview:

We developed a story-based A/B usability test using Balsamiq wireframes, which would test key ideas generated during Phase 2. We made 2 variations of the story:

Story A: Experience revolves around “task list”, similar to Trello, which helps users understand what they must do in order to progress the completeness of their account

Story B: Experience revolves around a dashboard, which provides visualized insights and helps users understand their data & ways to improve their account

Our primary goal for usability testing was to validate the new experience and design of the products page, but we had additional goals:

  • Test our hypothesis on whether the dashboard provided value to users, as it would be a large technical mountain to climb if not valuable
  • Introduce new product features including collaborative workflow, bulk editing, task lists, etc.
  • Test the on-boarding and setup flow to ensure we’re requiring the right type of information at the right time
  • Build relationships with users and invite them into future testing

Prior to catching our flights to Charlotte, NC, we built out the clickable prototypes in InVision, wrote the scripts and questions, tested internally, and tweaked the experience based on feedback. A few samples of the wireframes are:

Regardless of story, the general process was the same. We set up the Lookback app to record the conversation (to relay to our team in Nashville), started with small talk, and then kicked things off with the introduction script:

I want to start by thanking you for your time and for giving us the opportunity to better understand how you interact with our products. To give a little context, I’m a User Experience designer that has recently joined Edgenet with one goal: completely rebuild the product from start to finish so that you’re able to effectively (and enjoyably) do your job.

Over the last three weeks, Tim and I have been conducting in-depth interviews with current customers to understand their roles, business needs, and experience with Edgenet. We’ve received invaluable feedback that’s been taken to heart and utilized to develop what you’re about to experience.

Today’s exercise helps us understand how you interact with our product. To do so, we’ve developed wireframes, which are low-resolution designs that you’ll be clicking through to accomplish a goal. The exercise is broken up into 3 main sections, and in each section, we’ll walk you through screens and ask a variety of questions to help us understand how you interpret content and expect experiences to unfold.

Before we start, I want to emphasize that this exercise is purely exploratory, and there are no right or wrong answers. We really just want to understand how you interact with our product, in the most objective way possible.

Throughout each section, we’ll be recording you, as well as your computer’s screen to help relay this information to our team in Nashville.

Before we jump in, do you have any questions for me?

Then, after answering any questions that the interviewee had, I gave a brief overview of the story, ultimately setting the stage for the rest of the exercise:

Today, you’re the proud owner of LionHeart: the premier supplier of spud wrenches in North America. Over the last year, your company has experienced significant growth, and the large, prestigious retailer, MalMart, has taken notice. Congrats!

MalMart has invited you to become a part of their local vendor program, which tests new products in different markets around the country. Your 20 SKUs have been selected to start testing in the Nashville market in less than 2 months.

If all goes well, MalMart plans to expand distribution and roll out your product throughout the country, so it’s extremely important for you, and your company of 30 employees, to knock this out of the park.

Once the introduction and story scripts were read, I jumped into the usability test, which was bucketed into three primary groups and included a short introduction script and a set of questions (usually 3–5 per screen):

  • Onboarding: 6 screens ranging from creating team to selecting primary retailer (Ex: “Now that a retailer is selected, how do you expect this to impact your experience throughout the product?”)
  • Adding Products: 5 screens focusing on bulk uploading products into the system (Ex: “What do you expect to see after clicking the ‘add products’ task in your list?”)
  • Managing Products: this section was one of the primary reasons for the test, as this would be the first page we would develop for the new platform. The section consisted of only 2 screens but was critical in seeing how users interpret the completely revamped UX/UI. (Ex: “What does the data quality score mean to you, and how are you currently using this internally?”). We developed 6 variations of the screen to test:

After two straight days of in-person testing, we received the feedback we needed, boarded a plane back to Nashville, and began reconstructing the wireframes to transition into Phase 4 (HiFi usability testing), which will be outlined in the next post.

Key Takeaways:

After 3–4 weeks of 1-on-1 interviews, rapid iteration, and in-person testing, a variety of key takeaways surfaced from these stages in the process:

  • Proper Preparation for Wireframe Testing Is Key: When using lines for text & boxes with “x” for images, users can sometimes struggle to look past the skin of the design. So, it’s incredibly important to set the stage prior to conducting a test, as well as to carefully plan, test, and iterate on your tests (so many tests, I know) prior to sitting down with users. Teammates throughout the company (especially those not on the product or dev team) can play an instrumental role in preparing for these usability tests, since wireframes are foreign to them, too.
  • Invest in Your Users: One of my favorite parts of interviews and usability testing is that I’m able to put faces to names and build relationships with users. Rather than just jumping into the meat of the test, make sure that you’re spending time getting to know users on a personal level. Not only is this a nice thing to do, but you’re also able to call on them for future testing because you treated them like a human being, not a test subject.
  • Enjoy the Process: Preparing and conducting interviews and usability testing is extremely time-consuming when done right. The time spent writing, testing, designing, and practicing can get exhausting, so it’s important to have fun with the process. When I conduct story-based usability testing, I’ll create scripts with fictional companies, characters, etc. to get a laugh out of the users (and myself). For example, we went so far as to create a logo for the fictional company LionHeart when we flew to Lowe’s for on-site testing.

If this post was helpful, and you’re interested in more articles about ways to improve your product’s UX, I’d love for you to give this a share or follow me. Your feedback and support are much appreciated!

Also, I’m always interested in collaborating on a project, so I’d love to hear from you. You can reach me at: ducharme.kyle@gmail.com 🤔
