Any service provider, whether in the real world or on the web, has to be interested in and understand their user base so they can better determine how to evolve the services they’re providing. Specifically, the following questions are primary in understanding users:
- Demographics: Who are they?
- Intentions: Why did they visit my website?
- References: How did they get to my website?
- Actions: What did they do while on my site?
- Impressions: Did they like the site? Was it a positive experience for them?
These primary questions help evolve our offerings by helping us understand to whom we should be providing what services. Ideally, we could get this information in an unobtrusive manner from everyone who comes to the site, but sadly that’s not possible. So let’s consider the information of interest, in order of how easy it is to obtain.
Luckily, on the web it’s pretty simple to determine how a user arrived at our website, because the browser usually sends the user’s previous location in the referrer field (the HTTP Referer header). And if the user comes to our site via a search engine, we often get the terms they searched for as well.
Google helps again by providing Google Analytics for free, giving us a nice tabular breakdown of where users come from. The one thing we won’t immediately get is why someone typed our website name directly into the browser or into the Google search field, but that’s usually a small percentage of visitors.
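To make this concrete, here is a minimal sketch of pulling the referring site and search terms out of a referrer URL. It assumes the search engine passes the query in a `q` parameter, as Google historically did; this is an assumption, not a guarantee for every engine.

```python
# Sketch: extract the referring host and any search terms from a referrer URL.
from urllib.parse import urlparse, parse_qs

def parse_referrer(referrer: str):
    """Return (referring host, search terms or None) for a referrer URL."""
    parts = urlparse(referrer)
    query = parse_qs(parts.query)
    # Assumption: the search engine puts the query in a "q" parameter.
    terms = query.get("q", [None])[0]
    return parts.netloc, terms

host, terms = parse_referrer("https://www.google.com/search?q=usability+testing")
# host is "www.google.com"; terms is "usability testing"
```

A referrer with no query string (say, a link from a blog post) simply yields `None` for the search terms.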
User actions are generally captured as clicks or form submissions, which in turn produce server requests that are normally logged. Hosting providers often provide log analysis for free, and if not, there are several log analysis products, such as LogAnalyzer and WeblogExpert, that do the trick. Alternatively, you can use Google Analytics, which requires only that you embed a small snippet of code in each web page.
These tools generally provide aggregate analyses, such as which pages or paths are most popular, where most users come from, and where pages are abandoned, presented graphically or in tabular form. In short, finding out what users do on our site can be accomplished with minimal cost and effort using existing tools.
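The core of what these log analyzers do is simple aggregation over request logs. A minimal sketch, assuming Apache-style Common Log Format lines (real logs need more robust parsing):

```python
# Sketch: count how often each page was requested, the basic aggregation
# behind "most popular pages" reports in log-analysis tools.
from collections import Counter

def page_counts(log_lines):
    """Tally requests per URL path from access-log lines."""
    counts = Counter()
    for line in log_lines:
        try:
            request = line.split('"')[1]   # e.g. 'GET /pricing HTTP/1.1'
            path = request.split()[1]
        except IndexError:
            continue                       # skip malformed lines
        counts[path] += 1
    return counts
```

Feeding it three log lines where `/home` appears twice would rank `/home` first via `page_counts(lines).most_common()`.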
Now, for the service provider who wants to have an even more detailed understanding of user interaction with the website, there are tools that capture the user’s interaction with the interface, including scrolling, mouse movements, mouse clicks, and so forth. These tools essentially provide an interface interaction log:
1) They can track interface interactions that don’t result in server requests, for example, whether the user scrolled the window to see what is “below the fold.”
2) If a page contains two links that point to the same URL, they can reveal which one is more effective. For example, if an article summary includes a picture, they can show whether most people click the article title or the picture.
Typically, these tools present the information graphically as a heat map, showing which parts of the screen are clicked most often. Examples of such tools include Clicktale, Clickheat, and CrazyEgg.
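Under the hood, a click heat map is just recorded click coordinates binned into a coarse grid. A minimal sketch of that aggregation (the grid size is an arbitrary choice here):

```python
# Sketch: bin click coordinates into grid cells; the per-cell counts are
# what a heat-map tool renders as color intensity.
from collections import Counter

def heatmap(clicks, cell=100):
    """Count clicks per (cell x cell)-pixel grid square."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell, y // cell)] += 1
    return grid

grid = heatmap([(10, 20), (90, 30), (150, 40)])
# two clicks land in cell (0, 0), one in cell (1, 0)
```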
Now we get to the harder part. Why did the user come to the website? What did they intend to do there? Most websites target a particular user group, so it’s useful to understand whether a user who wasn’t interested in the website left because they didn’t fit the target profile, or because the site failed to get its message across.
One way to extract user intentions is to understand how the user got there, by reviewing the reference information. However, probably the simplest way to determine intention is to ask the user, for example, by presenting a small survey when the user navigates away from the site. Again, online survey tools help in this regard, but the practice is annoying to many users, and most will not fill in the surveys.
The good news is that some of this information can be derived from the reference information. If a user arrives from a blog post that discusses the website in a specific context, we can get a pretty good idea of why they visited, but deriving this information is a bit less automatic.
Most of the time, when we create a website we have some idea of who will be using it the most, but it’s immensely instructive to understand who really uses the website. We can infer some of the demographics by looking at visit patterns – for example, if users are mostly coming during late nights, they’re more likely to be folks without professional daytime commitments, so perhaps students or underemployed people.
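That kind of inference starts with a simple tally of visits by hour of day. A minimal sketch, assuming log timestamps in the common `day/month/year:hour:minute:second` format (the format string is an assumption about your logs):

```python
# Sketch: tally visits by hour of day from log timestamps, to spot
# patterns such as a predominantly late-night audience.
from collections import Counter
from datetime import datetime

def visits_by_hour(timestamps, fmt="%d/%b/%Y:%H:%M:%S"):
    """Return a Counter mapping hour-of-day (0-23) to visit count."""
    hours = Counter()
    for ts in timestamps:
        hours[datetime.strptime(ts, fmt).hour] += 1
    return hours

hours = visits_by_hour(["03/Feb/2014:23:15:02", "03/Feb/2014:23:40:10",
                        "04/Feb/2014:09:05:00"])
# two visits in the 23:00 hour, one in the 09:00 hour
```

A skew toward the 22:00-03:00 hours would be the "late nights" signal described above, though time zones complicate the picture in practice.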
But in general, short of asking the user, it’s not easy to determine user demographics. Fortunately, there are many online services such as QuestionPro, Crowdscience, SurveyMonkey, etc. that make it easy to quickly survey website users.
While actions can be explicitly measured, user impressions are necessarily more difficult to derive. How can you find out how the user reacts to the website design, or colors, or message? Again, the most straightforward way is probably to ask them. Since impressions are intrinsically “soft,” they are not easily captured with structured surveys, which tend to “lead the witness”; asking the user to respond in free form is the easiest way of doing this.
This can be performed either through a moderated session, or simply by asking the user to articulate their impressions as they are interacting with the website. We at TryMyUI provide the tools to capture user impressions via unmoderated remote user testing through narrated video, that is, video of the user’s screen and actions while they are narrating their impressions, thoughts, and emotions.
A different way of capturing impressions is to set up A/B studies, in which different users are presented with different versions of the website or mobile app, and we see which one performs better. Famously, this is Google’s preferred way of making changes to its site design. This approach has the benefit of being purely data-driven, but it is both time- and resource-intensive, and it requires many users to pass through before it yields meaningful information. Additionally, because it discards any interest in why users prefer one path over another, it does not promote deeper understanding.
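The mechanics of an A/B study reduce to two pieces: assigning each visitor to a variant consistently, and comparing conversion rates. A minimal sketch, where the function names and the hash-based assignment are illustrative, not any particular tool’s API:

```python
# Sketch: deterministic A/B variant assignment by hashing a stable user ID,
# so the same visitor always sees the same variant, plus a conversion rate.
import hashlib

def assign_variant(user_id: str, variants=("A", "B")):
    """Map a user ID to one of the variants, stably across visits."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def conversion_rate(conversions: int, visitors: int) -> float:
    """Fraction of visitors who completed the target action."""
    return conversions / visitors if visitors else 0.0

# e.g. 25 conversions out of 500 visitors is a 5% conversion rate
```

Hashing the user ID (rather than assigning randomly on each visit) is what keeps a returning visitor in the same bucket; in practice you would also apply a significance test before declaring a winner.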
Usage information is essential in helping improve websites and web services. We categorize this feedback in terms of who users are (demographics), what they do on the site (actions), what they wanted to do (intentions), and what they thought and how they felt about the website (impressions). Actions are explicit and readily captured through server logs or interface interaction logs; intentions and impressions are tougher to capture because they’re, well, fuzzier. At some point, the best indicator is to just ask users what they wanted to do and how they felt about it.