![A group of mobile phone wireframe mock ups, the main screen showing "UX"](https://a-us.storyblok.com/f/1010806/2048x1365/53b4b1f95f/ux-metrics_what-is-ux.png/m/)
While it is indisputable that UX is a pivotal element in the success of a brand’s digital offerings, without a comprehensive understanding of the impact that UX can yield, you may struggle to ensure its efficacy. Moreover, securing stakeholder buy-in and a sufficient budget to drive a significant impact on your brand’s success could also become challenging. That is why tracking and measuring your UX becomes essential. It helps UX designers and researchers gain insights into user behaviour and optimise the user experience through process simplification, usability and design enhancements, content improvements, and interactive element refinements.
In this article, we are sharing 10 UX metrics that you should consider measuring in 2023.
Average Time on Task
It measures the average time taken by users to complete a specific task within a digital product. In general, a shorter completion time signifies a better user experience. If the Average Time on Task is excessively long or deviates significantly from expectations, it signals usability issues or interface confusion that should be addressed.
Average Time on Task = Total Time on Task / No. of Users
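The formula above can be sketched in a few lines of Python. The task timings below are hypothetical values standing in for data you would collect from a usability test or analytics tool:

```python
# Hypothetical task completion times (in seconds) from five participants.
task_times = [42.0, 55.5, 38.2, 61.0, 47.3]

# Average Time on Task = Total Time on Task / No. of Users
average_time_on_task = sum(task_times) / len(task_times)
print(round(average_time_on_task, 1))  # 48.8
```

Comparing this average against an expected benchmark (or against earlier test rounds) is what turns the raw number into a usability signal.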
Average Time on Page
It measures how effectively a page communicates its information by tracking user engagement with the content or features on that page. A longer Average Time on Page generally indicates active engagement and a positive user experience with valuable, relevant content.
Average Time on Page = Total Time on Page / (No. of Page Views – No. of Exits)
![Four ipad wireframe mock ups, showing types of charts](https://a-us.storyblok.com/f/1010806/1536x1024/9753161947/ux-metrics_how-to-build-a-framework-1536x1024.png/m/)
Average Session Length
It tracks the amount of time users spend engaging with a digital product. The goal is to evaluate user engagement: a longer Average Session Length indicates higher engagement.
Average Session Length = Total Session Length / No. of Sessions
Engagement Rate
It assesses the level of user interaction and involvement with a digital product. Instead of focusing on the frequency of visitors leaving a website or mobile app, it offers valuable insights into the frequency of visitors staying on and engaging with the digital product.
Engagement Rate = No. of Engaged Sessions / Total No. of Sessions x 100
Abandonment Rate
It is the percentage of users who leave a particular task or process incomplete, such as cart, form, or task abandonments. A high Abandonment Rate can indicate user confusion or frustration due to reasons such as unclear instructions or a need for more (or even less) information before completing a process or task.
Abandonment Rate = No. of Abandoned Processes / No. of Started Processes x 100
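As a quick sketch, here is the Abandonment Rate computed for a hypothetical checkout funnel (the counts are illustrative, standing in for figures you would pull from your analytics):

```python
# Hypothetical checkout-funnel counts from an analytics export.
started = 1200    # users who began the checkout process
completed = 804   # users who finished it
abandoned = started - completed

# Abandonment Rate = No. of Abandoned Processes / No. of Started Processes x 100
abandonment_rate = abandoned / started * 100
print(f"{abandonment_rate:.1f}%")  # 33.0%
```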
Net Promoter Score (NPS)
It measures the satisfaction level of users with your product by asking them a single question: “How likely are you to recommend this product on a scale of 0 to 10?”
Users who rate your product with a score of 0 to 6 are labelled detractors.
Users who give a score of 7 or 8 are considered passive or neutral users.
Users who give a score of 9 or 10 are referred to as promoters.
NPS = Percentage of Promoters – Percentage of Detractors
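The classification and formula above can be sketched as follows, using a hypothetical set of survey responses:

```python
# Hypothetical 0-10 responses to the NPS question.
responses = [9, 10, 8, 6, 10, 3, 7, 9, 10, 5]
total = len(responses)

promoters = sum(1 for r in responses if r >= 9)   # scores 9-10
detractors = sum(1 for r in responses if r <= 6)  # scores 0-6
# Scores of 7-8 (passives) count toward the total but neither bucket.

# NPS = Percentage of Promoters - Percentage of Detractors
nps = (promoters / total - detractors / total) * 100
print(round(nps))  # 20
```

Note that NPS ranges from −100 (all detractors) to +100 (all promoters), so a negative score is possible and meaningful.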
Customer Satisfaction Score (CSAT)
It measures the satisfaction level of users with a specific interaction, feature, or overall experience of a digital product. It differs from NPS in that it gauges satisfaction with a particular, recent interaction rather than overall loyalty. It is typically collected through surveys or feedback mechanisms where users rate their satisfaction on a scale. However, to obtain a comprehensive understanding of the user experience, CSAT should be combined with qualitative feedback, usability testing, and metrics like NPS.
CSAT = No. of Positive Responses / Total No. of Responses x 100
![Three emoji faces in a row, unhappy, neutral and happy](https://a-us.storyblok.com/f/1010806/1536x1024/31375d4926/ux-metrics_nps-1536x1024.png/m/)
Customer Effort Score (CES)
CES evaluates the ease of a customer’s interaction with a digital product.
CES is commonly collected through surveys or feedback mechanisms that prompt customers to rate their experience based on the level of effort needed to resolve issues, complete tasks, or achieve goals. The rating scale typically ranges from “Very Easy” to “Very Difficult” or similar options.
For example, if a customer had a poor experience, they would likely select a small number such as 1, whereas the other end of the scale represents a highly satisfactory interaction.
CES = Total Ratings / No. of Responses
Usability Metric for User Experience (UMUX)
It evaluates the user experience of a digital product, assessing the perceived usability and user satisfaction based on participants’ feedback, with higher scores indicating a better user experience.
The UMUX is a four-statement questionnaire:
This website’s/product’s/tool’s/software’s/prototype’s capabilities meet my requirements.
Using this website/ product/ tool/ software/ prototype is a frustrating experience.
This website/ product/ tool/ software/ prototype is easy to use.
I have to spend too much time correcting things with this website/ product/ tool/ software/ prototype.
Users are required to rate each statement from 1 to 7 with 1 being strongly disagree and 7 being strongly agree. Here is how to calculate a single UMUX score:
For each odd-numbered question (1,3), subtract 1 from the participant’s response.
For each even-numbered question (2,4), subtract the participant’s response from 7.
Add up the scores for each participant, divide the total by 24, and multiply by 100 to obtain a score from 0 to 100.
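The scoring steps above can be sketched for a single participant as follows. The ratings are hypothetical, and the final multiplication by 100 reflects the standard UMUX convention of reporting the score on a 0–100 scale:

```python
# Hypothetical 1-7 ratings for the four UMUX statements, in questionnaire
# order. Items 1 and 3 are positively worded; items 2 and 4 negatively.
ratings = [6, 2, 7, 3]

adjusted = [
    ratings[0] - 1,   # item 1 (odd, positive): response - 1
    7 - ratings[1],   # item 2 (even, negative): 7 - response
    ratings[2] - 1,   # item 3 (odd, positive): response - 1
    7 - ratings[3],   # item 4 (even, negative): 7 - response
]

# Divide by the maximum possible total (24) and scale to 0-100.
umux = sum(adjusted) / 24 * 100
print(round(umux, 1))  # 83.3
```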
System Usability Scale (SUS)
It assesses the perceived usability of a digital product and consists of 10 standardised Likert-scale questions:
I think that I would like to use this system frequently.
I found the system unnecessarily complex.
I thought the system was easy to use.
I think that I would need the support of a technical person to be able to use this system.
I found the various functions in this system were well integrated.
I thought there was too much inconsistency in this system.
I would imagine that most people would learn to use this system very quickly.
I found the system very cumbersome to use.
I felt very confident using the system.
I needed to learn a lot of things before I could get going with this system.
Participants are required to rate their agreement or disagreement from 1 to 5, with 1 representing strongly disagree and 5 representing strongly agree. Responses are scored and combined to calculate a single SUS score, ranging from 0 to 100:
For each odd-numbered question (1,3,5,7,9), subtract 1 from the participant’s response.
For each even-numbered question (2,4,6,8,10), subtract the participant’s response from 5.
Add up the scores for each participant and multiply the total by 2.5.
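The SUS scoring steps above can be sketched for one participant using hypothetical ratings:

```python
# Hypothetical 1-5 ratings for the ten SUS statements, in questionnaire
# order. Odd-numbered items are positively worded, even-numbered negatively.
ratings = [4, 2, 5, 1, 4, 2, 5, 1, 4, 2]

score = 0
for i, r in enumerate(ratings, start=1):
    if i % 2 == 1:
        score += r - 1   # odd-numbered item: response - 1
    else:
        score += 5 - r   # even-numbered item: 5 - response

# Multiplying by 2.5 maps the 0-40 raw total onto a 0-100 scale.
sus = score * 2.5
print(sus)  # 85.0
```

Averaging the per-participant scores gives the overall SUS score for the product; 68 is commonly cited as the average benchmark.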
Design a UX Measurement Framework
There is no one-size-fits-all approach to measuring user experience. To improve UX, you must choose metrics that align with your business goals.
The good news is that you don’t have to create a UX measurement framework from scratch. There are many frameworks to choose from that cater to different business goals. For example, the AARRR framework by Dave McClure focuses on business metrics to drive growth, while Google’s HEART framework focuses on experience metrics.
Tips for UX Metric Measurement
As you begin building your UX measurement framework, here are a few tips to keep you on track:
Use both quantitative and qualitative metrics to understand the full UX picture.
Ensure that the metrics you track are suitable for the design changes you make. Remember that UX metrics should remain adaptable and may require adjustments over time.
Make sure that your metric analysis is relevant to the UX touchpoint you want to improve.
Leverage tools such as Google Analytics 4 (GA4) and NPS calculators to make tracking and measurement more efficient.
Don’t Get Caught Up Trying to Find the Perfect UX Metrics
To identify the most appropriate UX metrics to track and measure, it’s important to start with your goals, rather than the metrics themselves. This is because there are no “perfect” metrics – only metrics that are relevant to your specific objectives. Establish a consistent process by using existing frameworks or creating a customised one to ensure the metrics reflect your organisation’s unique needs and priorities.
Adrenalin is a leading digital product and technology agency for Australia’s top brands and organisations. Stay informed about the latest digital product trends, strategies, and tactics by subscribing to the Adrenalin newsletter below.