Rapid usability testing: Developing a process
Ginkgo Bioworks is a biotech company that also builds in-house software to support its scientists.
But much of that software was shipping without user testing,
so many software issues went undiscovered...
The Challenge
The Product Design / UX function at Ginkgo Bioworks was only a few years old when I joined, and the team was still growing. It wasn't yet large enough to cover all the software teams & products, so I was brought in to build a program to ensure that issues were discovered and could be fixed early, across 8+ software & content teams and around 20 products.
The Solution
I developed and iterated on a process of week-long usability testing sprints, rotating through the different software teams throughout the year. The process I developed was efficient, engaging, & highly collaborative.
The Impacts
Each research sprint uncovered 7 to 40 issues per round, and teams addressed 2-59% of the issues and opportunities uncovered. Smaller issues were typically fixed within a sprint or two, while larger issues & findings informed or initiated larger research and design efforts.
Some teams would even implement small fixes before the end of the testing week!
Program details
Timeline
Overlapping research sprints, rotating through the different scrum teams:

Planning phase:
- Kickoff meeting with cross-functional team
- Temporary Slack channel for project communication
- Work closely with cross-functional team to decide research goals, test design, & target users.
Execution phase:
- Facilitate user sessions, while team members attend as observers & note-takers.
- Upload session recordings to Slack channel for members who could not attend.
End of week:
- Facilitate group synthesis session using team members' notes --> pull out themes, prioritize pain points discovered
- Write research report, including outcomes from synthesis session, as well as any quant metrics gathered (e.g. task success/failure rates and usability scores)
- Hand off assets to cross-functional team: research report, user session recordings, everyone's notes, synthesis Mural --> all housed in our Research Repository, which I also set up & owned.

Excerpts from synthesis session Mural
Example research report excerpts



Praise about the program
I ran over 55 research sprints like this at Ginkgo, engaging with around 200 users per year.
People from every role on the scrum teams have given me consistently positive feedback about the process and the value they've gotten from it.
I love providing this type of value to the teams, and seeing how much people appreciate it!
Here are a few examples:

Thought process
Here's a summary of my reasoning behind different aspects of the program:
| About the method | Why |
|---|---|
| One week running sessions for a scrum team, while planning for the next week's testing | This felt like the shortest amount of time in which I could reasonably run a round of research. |
| Qualitative usability testing | Qualitative usability testing is great for identifying usability issues and needs only ~5 users, which was doable within a single week. |
| Collaborate with the scrum team members to have them attend sessions and take notes | This gets team members directly witnessing users using their software (which has proven benefits) and saves time because I don't have to rewatch sessions to take notes before the synthesis session. |
| Provide a note-taking template | This helps scrum team members know what to look for and take notes on during sessions. |
| Facilitate group synthesis / analysis session at the end of the week | Beneficial to have the whole team involved in reviewing the findings and building consensus, while also having more hands on deck to analyze the week's findings. |
Iterating over time
As I continued to run these research sprints, I reflected on the effectiveness of the process and made small changes over time, based on issues I noticed or feedback I solicited from the scrum teams.
Some examples:
- "Webinar" style Zoom, so lots of scrum team members could observe without the participant feeling self-conscious about the number of people (or who) was observing. This also prevented engineers from jumping in to "help" users overcome issues during the session.
- Quantitative metrics (SUS, SEQ), to provide data around severity of user struggle, which could be compared between studies.
- Concrete prioritization exercise in the synthesis session, so pain points could be more easily compared across usability testing rounds, to help the teams prioritize as time went on.
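For context on the quantitative side: SUS is scored from ten 1-5 Likert-scale responses using a standard formula (odd-numbered items are positively worded, even-numbered items negatively worded). A minimal sketch of that standard scoring, not the actual tooling used in the program:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 responses.

    Odd-numbered items (positive wording) contribute (response - 1);
    even-numbered items (negative wording) contribute (5 - response).
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A participant who "strongly agrees" with every positive item and
# "strongly disagrees" with every negative item scores a perfect 100:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

Because every round produces a score on the same 0-100 scale, teams can compare severity across studies and rounds.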
