Is it good that users spend a fairly long time on the homepage? You do not know whether it is because they read everything there thoroughly or because they struggle to decide what to do next. Users might abandon your website at checkout for a completely different reason than you think, so you might spend resources on simplifying the checkout process when the real problem is misleadingly worded delivery options. Marketers, analysts and UX teams can make reasonable assumptions to interpret the data, but these are still just assumptions, and they might be wrong.
No expert can get into the heads of your users to see what exactly they are thinking. And making wrong assumptions is often expensive.
What you need is insight into how actual users use your website to achieve their goals: how they perceive it, how exactly they behave, what surprises them and so on. This is especially important if the process of creating the website was only partly user-centred, or not user-centred at all. Behavioural data should not be treated as a replacement for analytics; it should supplement it and help you test hypotheses about why the usage statistics are the way they are.
User Testing is easier than you think
Nothing helps you understand users’ behaviour better than actually observing them using your website. However, a common reaction when I recommend that a client do some user testing is unease or scepticism, mostly due to the assumption that it is time-consuming and expensive. It might well be, but you do not necessarily need to rent a lab with eye-tracking equipment, find a professional facilitator and recruit a lot of potential users (or, more realistically, contract an agency to do it all for you).
When you are doing user testing not to measure usability (so you do not need accurate statistics for things like success rates or average task times) and not to benchmark, but to understand how users use and perceive your website, you do not need many participants. In fact, as few as five can provide many good insights; ideally you should run several rounds of testing (each time some changes are made). Even observing fewer users than that lets you look at your website from a different angle and avoid a common problem in design teams: being unable to see obvious issues with a website because they have spent too much time looking at it and can no longer step aside and see it as a typical user would.
You could simply invite some people (who are part of your target audience) to your office, ask them to carry out some specific tasks on your website while thinking aloud, and observe them. Do they carry out the tasks differently than you expected? Do they get frustrated? You are likely to make many interesting observations which, combined with insights from analytics, can inspire successful design or content changes.
If your time and budget are really limited, and the website is for the mass market rather than a narrow target audience, you could ask some strangers in a coffee shop to participate in a short testing session in return for a coffee and cake (this is called guerrilla user testing). As long as you focus less on what they say (“I don’t like this ugly green background”) and more on what they do (press the wrong button, use your search functionality in an unexpected way), you can get good insights into how real users use your website.
You can do user testing even without participants being physically present: there are now remote moderated testing tools, for example Validately and UserTesting Pro Version. These services help you recruit participants who meet your criteria and let you observe them interacting with your website in real time: you see the participants’ screens as they use your website, hear their comments and can ask questions when needed. A disadvantage of this approach compared to in-person testing is that you miss the body language. A participant might not say anything, but raised eyebrows would indicate that they are surprised; to compensate, you would need to ask questions about their experience at the end of the session.
There are also plenty of unmoderated testing tools, which simply give you videos of users interacting with your product as they carry out tasks while thinking aloud. They are useful too, but you do not get an opportunity to clarify unexpected behaviour, so moderated (live) sessions are preferable.
The key is not to aim at statistics, but to observe and understand what is happening behind the numbers you have.
Another useful method for understanding users’ behaviour is tracking their actions within a page. Services such as MouseFlow, CrazyEgg, ClickTale and others track mouse movements, clicks and scrolling, and provide usable visualisations of this data, including heatmaps. These tools help you understand how far users scroll, which parts of the page get the most attention, whether users look at the parts of the website you want them to look at, and so on. Google In-Page Analytics provides very basic heatmaps as well, but they are very limited compared to the aforementioned tools.
This method is not a substitute for user testing, since you still need to make assumptions about why users behave in a particular way; however, those assumptions will be far more informed, and you will have a much better understanding of what users actually do on each page. When the reasons for a particular behaviour are unclear, you can always follow up with user testing, and you will know exactly which tasks and workflows to test.
The more you understand how your website is actually being used, the better decisions you can make to improve it. There is no single magic method for making informed decisions; the best decisions usually come from combining an understanding of what exactly is happening with an understanding of why it is happening. It is easy to make assumptions or rely on expert opinions, but there is always a risk of getting it wrong and hurting your conversion rate and user experience.