Confessions of a user testing CEO

Let me start by saying: by confessions, I do not mean a reluctant admission of guilt. Rather, what I’m talking about is an honest assessment of where our industry is, a discussion of the challenges we’re facing, and our bid to solve them.

Our mission at TryMyUI is to help companies understand their customers’ usage behavior on their website, web app, mobile app, or any interface. We strive to deliver insights that give you, our customers, the “user’s view” so that you can build more usable platforms – and as a result, create compelling and valuable user experiences for your target market.

The primary data we use to do this consists of video and audio recordings of users following a series of tasks through a designated flow.

However, my growing sense of the limitations of this kind of data (our industry’s primary product) has spurred me to write this post.


Read more: What can user testing do?



The problem: “What to test” vs “How to test”

User testing data is extremely valuable – or rather, it contains extremely valuable insights. TryMyUI and most of our competitors equip you with different suites of tools to collect, analyze, and synthesize this data and extract those insights. However, the plain truth is, user testing is only ever going to give you insights into the parts of your product you actually test.

This may seem like a blatantly obvious thing to say, but it’s reflective of a real problem with the industry.


All the tools in the usability testing space are what I will call “How to test” tools. Our customers come to us with a user flow in hand and our tool answers the question, “How do we test it?” The troubling part here is that the way researchers determine “What to test” is not as data-driven as it should be.

Typically, researchers default to testing the newest prototype on hand, or the latest build before its release, or a flow that is scheduled for a rework. This leads to a host of questions like:

“Are the flows being tested really the flows that are most problematic for my users?” or

“Are there flows I should be testing right now that I am not?”

Not knowing what to test can allow glaring pain points to fly under the radar, and resources to be misallocated towards testing things that are performing well (or at least better than serious problem areas). It can skew a company’s perception of its own problems and be a roadblock to achieving good UX.

Knowing “What to test” is absolutely essential before you try to figure out “How to test” it. I confess that, until now, user testing companies have not done enough to help customers answer that critical first question.


The solution (Part 1): UX Big Data

The gap between “What to test” and “How to test,” and its adverse effect on our customers, frustrated my team and me. We spent many sessions trying to find a way to bridge it.

We realized that the current sources of constant, comprehensive user data (i.e. Google Analytics and tools like it) are specced to meet the needs of marketers and growth-hackers. The kind of data these tools provide is not the behavioral data UXers need; as a result, any UX insights they hold are obscure, if not entirely inaccessible.

However, a constant stream of data on user behavior would be rich with insights and, if harnessed properly, could bring an exponentially more robust, data-driven approach to identifying, categorizing, and understanding areas of user frustration that should be tested.


Internalizing this inspired us to build a tool that could capture all of that user behavior data, focused on sniffing out instances of user frustration and delivering UX insights – a tool that not only answered the “What to test” question, but also showed why those things needed to be tested.

TryMyUI Stream, named for the constant stream of data it taps into, captures every single user session on your website. If you get 100,000 website visits a day, it gives you 100,000 user videos you can watch.

This is, without embellishment, our foray into UX Big Data. You can watch everything, across every page and every flow of every user. You can be certain that your dataset contains any issue experienced by any user, in any given day, week, month, or year.


The solution (Part 2): Finding user frustration with A.I.

The obvious question that arises at this point can be politely paraphrased as: “How the hell am I going to watch 100,000 videos?!”

We are aware that user researchers already expend considerable effort extracting insights from video data – and if anything, having 100,000 videos makes the “What to test” question harder to answer, not easier.

This is why TryMyUI Stream is equipped with a self-learning A.I. engine fine-tuned to find instances of user frustration in these videos. After analyzing thousands of user videos over the last year, we isolated the four common user frustration patterns below (with a rough detection sketch after the list):


User Frustration Markers

1. Rage Click

Rage Clicking is the rapid, repeated clicking that occurs when a clickable element is slow, broken, or unresponsive, or when a non-clickable element gives the appearance of being clickable. A user trying to interact with the element is bamboozled by the lack of response, and keeps clicking it with increasing frequency to see if something will finally happen.

2. Scrandom (Random scrolling)

Scrandom is fast, random scrolling through content. It usually means the information scent is weak, and users are scrolling rapidly to filter out irrelevant content and find the right information.

3. Wild Mouse

When something is taking longer than you thought it would, or a page doesn’t seem to have finished loading, you may subconsciously zig-zag your mouse around the screen as you wait – that’s Wild Mouse. This behavior indicates irritation, impatience, and anxiety.

4. Backtracking

Backtracking is like making a U-turn: somewhere along the way, you took a route that you shouldn’t have, so you have to head back the way you came. Backtracking often implies that a navigation step was fruitless: the visited page did not lead the user forward towards their ultimate goal on the site.
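
To make these markers concrete, here is a minimal sketch of how two of them might be detected from a raw stream of session events. The event shape, the thresholds, and the function names are all illustrative assumptions of mine, not our production implementation – as described below, the Stream engine learns these patterns from data rather than applying fixed rules like these.

```typescript
// Hypothetical session event shape, for illustration only.
interface SessionEvent {
  type: "click" | "pageview";
  timestamp: number; // ms since session start
  x?: number;        // pointer position, for clicks
  y?: number;
  url?: string;      // page URL, for pageviews
}

// Rage Click: several clicks on roughly the same spot within a short window.
// The thresholds here are illustrative guesses, not tuned production values.
function hasRageClick(
  events: SessionEvent[],
  minClicks = 4,
  windowMs = 2000,
  radiusPx = 30,
): boolean {
  const clicks = events.filter((e) => e.type === "click");
  for (let i = 0; i + minClicks <= clicks.length; i++) {
    const burst = clicks.slice(i, i + minClicks);
    const fast =
      burst[burst.length - 1].timestamp - burst[0].timestamp <= windowMs;
    const close = burst.every(
      (c) =>
        Math.hypot(
          (c.x ?? 0) - (burst[0].x ?? 0),
          (c.y ?? 0) - (burst[0].y ?? 0),
        ) <= radiusPx,
    );
    if (fast && close) return true;
  }
  return false;
}

// Backtracking: an A -> B -> A pageview sequence, i.e. an immediate U-turn.
function hasBacktracking(events: SessionEvent[]): boolean {
  const pages = events.filter((e) => e.type === "pageview").map((e) => e.url);
  return pages.some(
    (url, i) => i >= 2 && url === pages[i - 2] && url !== pages[i - 1],
  );
}
```

Fixed thresholds like these inevitably produce false positives, which is exactly why we let the engine refine the definitions per site instead of hard-coding them.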


The A.I. engine we built automatically flags sessions that contain one or more of these user frustration markers. Moreover, as the constant stream of data keeps coming in, the A.I. engine learns and refines its definitions of these frustration markers as they apply to your users.

This way, without discarding or overlooking any data, you can see how many sessions out of the total contain each user frustration, and where.
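
Conceptually, that tally is a simple aggregation over the flagged sessions. A sketch, again with hypothetical names continuing the example above:

```typescript
// Hypothetical output of the detection step: which markers each session hit.
type Marker = "rageClick" | "scrandom" | "wildMouse" | "backtracking";

interface SessionFlags {
  sessionId: string;
  markers: Marker[];
}

// Count how many sessions contain each frustration marker at least once.
function tallyMarkers(sessions: SessionFlags[]): Map<Marker, number> {
  const counts = new Map<Marker, number>();
  for (const s of sessions) {
    for (const m of new Set(s.markers)) {
      counts.set(m, (counts.get(m) ?? 0) + 1);
    }
  }
  return counts;
}
```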


Then, you can jump in and watch the sessions of these frustrated users to see exactly what frustrated them and, as a result, form a strong, data-driven hypothesis on “What to test.”


My hope for the future of user testing

One of my UX idols, Jared Spool, said: “If you say design is subjective, I will say, you have not spent enough time measuring design.” I hope that more user researchers embrace the need to bridge the gap between “What to test” and “How to test,” because knowing what to measure is just as critical as the tools to measure it.

TryMyUI Stream is in its infancy, but it has already delivered objective (occasionally uncomfortable) insights to us about our own product. We are working on growing it and visualizing more data to help seamlessly bridge the gap, and I would love for the UX community to join the beta at www.trymyui.com/stream and help us along the way.





By Ritvij Gautam

Ritvij Gautam is the Co-Founder and CEO of TryMyUI. He holds Bachelor's Degrees in Physics and Philosophy from Claremont McKenna College.