User Validation Sessions
A large part of my work as a UX consultant is user validation. Getting feedback from customers at every stage of the process ensures we're creating the right solutions for our users. I have experience facilitating both in-person and remote user testing, guiding scrum teams through validation setup, and teaching courses on testing best practices.
Pre-Testing Setup
Before testing can begin, my job is to understand the product we plan to test and the specific tasks we'd like users to complete. This usually involves meetings with the product team, marketing managers, and SMEs familiar with the application to nail these down. Once an agreed-upon testing protocol is in place, I prepare and test the hardware or virtual access to the software that each user will need on the day of the test.
As often as I can, I like to get our users in the room or meet them where they are to test our products. For our specialized products, that first requires reaching out to our network of SMEs to gather a group of potential testers who have prior professional experience with the type of software we're creating. For those who can't complete their session in person, we establish a remote connection that lets us view the user's screen as they complete tasks.
The Testing Experience
Once a user is ready for a test, the logistics depend on the format. If the session is remote, I send the user instructions beforehand on how to log into any remote desktops they may need. If it's in person, I test the software before the user arrives to ensure it's in working order. After greeting the user, explaining the purpose of the study, and gaining consent to test, the session can begin.
I will then ask a series of icebreaker questions to better understand the user's professional background and experience in the field. This helps us understand the type of feedback we're getting and contextualize the comments made. I will then have the user begin the predetermined tasks while a notetaker, often an SME, records observations of the user. These notes are later discussed in debriefing sessions and compared with the notes from all of the other sessions conducted.
After the session, we ask the user about their favorite and least favorite parts of the app and what they had hoped to see. These questions often surface potential new features and weed out the ones that missed the mark. After notes are compared and changes are agreed upon, work on an improved version of the app can begin.