Luke Hay
Making the Most of Analytics in UX Research
In this episode of Brave UX, Luke Hay shares his thoughts on the role of web analytics in UX design, why we can't only rely on remote research methods and how to avoid bias in quantitative data.
Highlights include:
- Why do we often seem to avoid analytics in UX?
- What is the role of analytics in making UX decisions?
- How do you handle weird and wonderful participant situations?
Who is Luke Hay?
Luke is a Senior UX Researcher at Clearleft in the UK, where he is part of the design research team. He also consults privately.
As someone with a keen appreciation for analytics in UX research, Luke provides highly regarded training on the subject. He’s even written a book about it, “Researching UX: Analytics”, which was published in 2017.
Luke’s mission is to help UXers to make a significant and measurable impact through their work.
Transcript
- Brendan Jarvis:
- Hello and welcome to another episode of Brave UX. I'm Brendan Jarvis, Managing Founder of The Space InBetween, and it's my job to help you to put the pieces of the product puzzle together. I do that by interviewing world-class experts in user research and product management, who share with us their learnings, stories and advice. My guest today is Luke Hay. Luke is a consultant, trainer, speaker and author from the seaside town of Brighton in the UK, where he has been making digital experiences better since the late nineties. Originally working in website optimization and marketing, Luke developed a keen appreciation for measurement, metrics and results. In 2011, he made the move into UX design and user research. He currently divides his time between his role as user research director at Fresh Egg and his consulting and training practice. A self-confessed analytics geek, Luke published his first book in 2017, titled Researching UX: Analytics, a deep dive into working with quantitative data to improve the user experience.
- He has also been published in UX Collective on Medium and on SitePoint. Luke has generously given his time to UX Brighton and to UX Camp Brighton, an annual one-day unconference, which he has helped to co-organize for the past eight years. He has been described as a first-class trainer and an extremely knowledgeable person on the subject of using analytics to inform UX design decisions. Luke's mission is to help UXers to make significant and measurable impacts through their work, and it's great to be able to welcome him to the show today. Luke, welcome to the show.
- Luke Hay:
- Oh, thanks for having me, Brendan.
- Brendan Jarvis:
- Yeah, it's great to have you here, and I've been looking forward to this conversation all week. I think the blending of the worlds of analytics and UX, while it is something that's sometimes spoken about, is a subject that probably gets a little less attention than it deserves. So I'm really excited to have you tell us a little bit more about that today. But before we get into some of the details around that, I was doing a bit of research preparing for today's conversation, and looking back through your LinkedIn profile it seems that you are largely self-taught and that you got into the industry in a technical admin role straight after high school. What was it that made you dive straight into this industry?
- Luke Hay:
- Good question. I wish I had a really clever answer for that one, but it was luck, to be honest. I'd finished high school and I felt that I'd sort of had enough of school for a while. I'd been doing it for a few years and didn't want to jump straight into university. So I originally planned to have a bit of a year out, but I hadn't really planned what to do in that year out. I got to a point where I needed some money, so I was just looking for jobs and, funnily enough, I saw an advert in the local paper. That's how long ago it was; that's where people looked for jobs. It was for a web company, and I sort of vaguely knew what a web company was, but this was back in '98. I didn't have the internet in my home at that point; not that many people did.
- So it was a real eye-opener. I'd worked with computers at school, but I'd moved on from that. And yeah, I just went in for an interview and got the job at a very, very junior level. It was more sort of office junior really, but at this web company. And then it sort of went from there, and I learned a lot, because the company grew from about seven of us, maybe, when I joined, to probably around 20 to 25 by the end. So a bit of growth over a couple of years, and obviously the industry changed massively during that time as well.
- Brendan Jarvis:
- So you've moved out of that web management, digital marketing focus into UX and research in particular about nine years ago. What was it that again made you make a shift from what you were doing into what you are doing now?
- Luke Hay:
- Yeah, I think it was probably a more gradual shift than it would appear on my LinkedIn or on my CV. It was really the case that, over time, I was working primarily in roles that touched on a lot of areas. They had one job title but actually covered a range of different roles, so there were bits and pieces of dipping my toes into user research. But as with everyone, really, who works in this particular industry, the passion came from wanting to understand how people interacted with websites, thinking about the people behind them. I'm definitely not a developer at all, and I'm not artistic enough to be a designer, so it's sort of what you do in between that. And there are two sides to it: obviously the more [inaudible] side, the sort of things that UXers will perhaps be more familiar with, which I'm very happy to spend a lot of time doing, but also marrying that with the more analytical side, which I was doing from that first job that we spoke about. It was very, very basic analytics back then; part of my job was to see how many hits a website had had, and that was pretty much the extent of it. But that was where it started on the analytics side.
- Brendan Jarvis:
- And people used to have those little hit counters on their home page.
- Luke Hay:
- Exactly, that kind of thing. We were just slightly more sophisticated than that back then. Yeah,
- Brendan Jarvis:
- I think it's interesting talking to people with various different backgrounds and just how a lot of the times there's a lot of value that people bring from their various experiences and other aspects of this broader digital field that we work in. And it's interesting to see how that influences where they end up going with it, particularly with their UX and research practices. Now just before we get into a deep dive into UX and research and analytics and all of those good things, I wanted to spend a little bit of time just understanding a little bit more about your involvement with UX Brighton and UX Camp Brighton, because that's something that you've been doing for quite some time and it appears to be something that you value quite a lot. Can you tell us a little bit about what those two things are and what it is that you've been doing?
- Luke Hay:
- Yeah, so I think the general interest in the digital community here in Brighton came about because I used to work for a company called Wired Sussex, who are a membership organization for, basically, web agencies in and around Brighton. We used to have a jobs board on the website, and we essentially helped them develop their businesses. So I was very much sort of in the center of everything that was going on in terms of digital. Then, when I moved more into UX, it made sense, because I had contacts in those sorts of areas. So a guy called Danny Hope, who runs UX Brighton, I've known him for a long time, and we've worked together on various aspects. I've run workshops for UX Brighton, both giving them myself and also sort of managing and facilitating those for other people, as well as helping out a bit with their conferences, organizing volunteers and that kind of thing. So that's really my involvement with UX Brighton. UX Camp Brighton was founded by a friend of mine, Patrick Sampson, and he essentially needed help to build the event from what it was. It's really interesting, actually; I don't know if you're familiar with the way unconferences work?
- Brendan Jarvis:
- Somewhat, but some of our viewers might not be. So give us a rundown.
- Luke Hay:
- So it's an unconference, and basically the difference between a conference and an unconference is that in a conference you have your speakers all organized and lined up, whereas in an unconference the attendees actually give the talks or run the sessions. So what you have at the start of the day is everyone who comes along gets a little card, and they write their name on the card and what they want to talk about. That can be a sort of presentation, it can be an open discussion, it can be some sort of workshop. And then all of these cards get put on a grid. So there's a grid of times during the day, and multiple rooms normally too, and in each session you'll have five or six different rooms with five or six different talks being given by the attendees, and obviously the other attendees go along to those. It took a lot of persuading, actually, for Patrick to get me to come along to the first event, because I was really nervous about presenting, which is a bit funny looking back. But at the time I was really nervous; even though it was a very small, friendly environment, I was still a bit unsure. I'd still rather have just come along to watch. But he talked me into it, and I gave a small presentation there, and then the next year he needed a hand pulling everything together. So myself and some of the other volunteers got together and worked on organizing the event, which takes a little bit of organizing, and then, yeah, it went from there really.
- Brendan Jarvis:
- Yeah, I mean it's interesting that you say that you were nervous and you kind of got dragged along, and someone sort of encouraged you to step into the spotlight and do that. That's something that, I mean, obviously I find myself running this interview series, and I think just in the intro there I got a little bit tongue-tied. Even though this is pre-recorded and it's pretty low pressure, you know, you do feel that anxiety. What would you say to people in the UX community that have been considering contributing in the way that you have been, and getting up there, putting themselves out there and sharing their perspectives? Do you have any words of wisdom or encouragement that you'd like to share?
- Luke Hay:
- Yeah, I can only really speak from my point of view, because obviously everyone's very different; there are people who are more extroverted, who absolutely love doing these things. But for me it was always just sort of starting small, kind of almost forcing yourself to do it, but starting in a friendly environment. So, say, UX Camp Brighton, for anyone who's perhaps a bit more local than you: definitely worth looking into once we're out of lockdown, well worth people coming along to. It's a very friendly, supportive environment. We have our session rooms, and some of the smaller ones you can maybe only fit about 15 people in anyway, so you're not having to give a talk to a huge audience. So I think that definitely helps. And then really, once you've done one or two of those kinds of things, everything else just gets a little bit easier.
- And ideally you can step up a bit, speak at a small conference and then a bigger conference, and that kind of thing. For me, I still get nervous. I spoke at Brighton SEO last year, and I think there were about eight or nine hundred people in the room, and for me that's quite a lot. So yeah, I still get nervous about those things, but I think it's about starting small and growing with that. The other thing is just practice, really. I read somewhere, and I can't remember the exact details, but there was someone recommending that you should practice something in full seven times before you actually do it. Unfortunately I can't remember the exact theory or study behind that, but I try and do that. So I'll literally sit with my laptop in my kitchen, or wherever, and just kind of run through it until I know it more or less off by heart, more like a script, and then when you actually get there it can hopefully just come back to you, rather than you grasping around for things.
- Brendan Jarvis:
- So everyone that's listening, this is take number five of this interview, so we've got a couple more to go, Luke and I. Yeah, I think it's important for people who are listening and wanting to perhaps contribute more to the community: globally, we're quite an approachable bunch. I think everybody's really supportive, and I think it's really important for people to realize that their harshest critic, the thing that often stops them the most, actually lives between their own two ears. So just get out there, do contribute and put yourself out there. You mentioned SEO. Now, that's a topic I think is a good segue for us to move into this conversation around analytics and UX. And while SEO isn't necessarily analytics, you've described UX as SEO's best friend in the past. What was it that you meant by that?
- Luke Hay:
- Well, technically, I guess that wasn't me describing it in that way, although that was the talk I was in; I was doing that with an SEO friend of mine. So it's funny, I have a sort of jokey relationship with SEOs, in as much as I pretend I don't value what they do and that I find their work boring, but actually it's a very necessary thing. It's important for us as UXers that people are able to get to our websites. So although I don't want to be involved with SEO at all, it is important that those things happen. And I think over the past few years, probably several years now, Google's algorithm has really been based around trying to reward good user experience. So in the old days you used to cram loads of text on the page, loads of keywords, loads of links, that kind of thing.
- Whereas Google obviously saw people doing that, and now they're trying to make an algorithm that will effectively reward you for having well-written text and well-laid-out pages and that kind of thing, the kind of things that obviously we look for as UXers. And I think that was the point of the talk, which was the one I gave at Brighton SEO with my friend. She came at it from an SEO's point of view, as to why they needed to spend more time talking to UXers, and then I was there effectively given the job of selling UX to 800 or so SEOs, who were perhaps as skeptical about UX as I was about SEO. But as I say, the point is that Google's algorithm is based around UX, so I think SEOs need to be looking at that side of things. And certainly at Fresh Egg, the background of the agency was primarily in SEO originally. So from my point of view, it's really good to work with the guys there; they're very forward-thinking, and they're all up for doing usability testing, even testing the search results pages and seeing people's reactions to those. So I think there is definitely more of a crossover over time, where it's not just about chucking loads of keywords on a page; it's actually about thinking of the end user and understanding what they make of the experience.
- Brendan Jarvis:
- So Luke, in 2017 you published a book called Researching UX: Analytics, and no one, in my experience, writes a book just for fun. What was it that made you write this book, and who is it for?
- Luke Hay:
- So I hadn't really intended to write a book. Like perhaps a lot of people, it was something very, very far in the back of my mind that one day it might be a nice thing to do, but I had no intention of doing it at all. But SitePoint, who are the publishers, got in touch with me, I think again just because I'd had experience with analytics and UX; I'd written some blog posts, given a few talks, that kind of thing. I guess they found me through that, and they got in touch and said, we're writing a series of books, the Researching UX series; there are various other books in that series too. They wanted one on analytics, and they came to me and asked did I want to do it. They told me that I wasn't gonna get rich writing a book, and they were right [laugh].
- Yeah, exactly. But they were very upfront about it and said it wasn't gonna be a huge income, but it paid enough to make it worth my while to spend the time, since it does take a lot of time to write. And I think from my point of view, it was one of those where I was a bit unsure: do you want to do this or not? It's gonna be a load of effort, it's probably gonna be a nightmare, but at the end of it you'll have a book. So I went down that route. I thought, it's not like this opportunity's gonna come up every day, and as I've mentioned before, with my freelance work I do have, effectively, a spare day a week. So I could sacrifice that to some extent and spend it writing. And yeah, it was actually not quite as bad as I thought it'd be in terms of the effort involved.
- There was a lot of effort, a lot of back and forth, but ultimately it wasn't too bad. With the writing, I got some tips from other people who'd written books, and I think the best one was to focus on word count as a way of getting toward the end. Not that the word count for the book overall is that important, but they had a rough guideline of how many words it should roughly be. The magic number, I think, was 2,000 words a day. So I knew that if I could get 2,000 words down a day that I was happy with, then that was good progress and I was definitely moving towards that finish point. I think it helped to have an end goal in mind, because I'm a runner as well, as a hobby. I do a fair bit of running, and for me, when I set out on my runs, it's always about how far I'm going and counting down towards the end. To some extent the process with the book was very similar: actually putting the time and effort in, but ultimately, if I could close the laptop at the end of the day and think I'd done, say, 2,000 or so words, then I thought that was fairly good and I was one step closer to finishing.
- Brendan Jarvis:
- Yeah, 2,000 words sounds like a solid effort. Tell me, who is it really written for? What audience are you trying to reach with this book?
- Luke Hay:
- So I think it's really for anyone who works in UX who doesn't know a huge amount about analytics. I mean, it is a bit of a beginner's book, in that it presumes no prior knowledge of analytics. But in my experience you have analysts who really like data but maybe don't like people so much, and you have UXers who love spending time with people, who like user interviews and all of that kind of thing, but really don't want to spend time looking at the numbers. And hopefully what the book does is something very different to the "how to use analytics" side of things, all the sort of tutorials for Google Analytics, because it's not really focusing on how you find these numbers and statistics and that kind of thing. It's all about how those numbers can be translated into something that's useful for us as UXers. So I think the main premise is really to take an analytics-first approach to work. And I'm in no way, and I really want to stress this, in no way saying that people should just look at their analytics and then make assumptions based on that. But they're often a good…
- Brendan Jarvis:
- Leave a comment if you agree or disagree, is that what you're saying? We'll have a good, robust discussion in the comments section.
- Luke Hay:
- Yeah, I'm hoping and assuming that no UXer is gonna come on here and say, oh, you don't need to do any proper research, you can just look at the numbers; I'm hoping that's the case. And that's certainly not the point the book's trying to make. But the point is that you can get a really good understanding of what's currently going on on the website, where people might be dropping off certain journeys. There's a chapter in the book about using analytics as user research, so you can actually find out a lot about your users from that. Within Google Analytics they've got demographic reports that will give you a broad understanding of the ages, genders, that kind of thing, of people using your site, and obviously the countries they come from, all of those kinds of things. And that, as I say, is the starting point, because everything else on top of that is the fun stuff.
- So say older people are dropping out at this particular point of your journey; well, then you can go and do what we all kind of want to be doing already, which is researching why that's happening, perhaps doing some usability testing with older users, or whatever that might be. So yeah, it's using analytics as that starting point, but also looping it back in at the other end by actually measuring the results of the work. Because I think that's something that UXers struggle with generally: how do you actually prove and show the value of your work? And it's not always as easy as just looking at the numbers, but certainly that is one form of measurement that you can perhaps add.
- Brendan Jarvis:
- I think we may have scared a few people with that. You mean we actually have to check that what we are doing is working. That's a radical idea.
- Luke Hay:
- We can always prove that one way or another, whether it's more qualitative or more quantitative. But yeah, definitely good to have that option.
- Brendan Jarvis:
- So you gave a couple of examples there of ways in which analytics can be used. What are some of the most common problems that you see UXers encountering, that having a better grasp of analytics can help them to solve? Does anything come to mind?
- Luke Hay:
- Yeah, I mean, I think the first thing is that shortcut to getting started with something. Say tomorrow I got in touch with you and said I wanted you to improve my website. You wouldn't really know much about that site and who's using it. I could perhaps give you some anecdotal information, that kind of thing, but within half an hour or so of looking through the analytics you can find out how many people visit it, which page is the most popular, what the conversion rate is, what countries they come from. There's a whole bunch of information that really gets you started, so that's why it's a great starting point. It's also good for checking and validating assumptions to some extent. So if, for example, you've got a five-step form, and you just have a look at that form without looking at the analytics, and you think, oh, this third step seems really clunky to me, I'm not sure whether that's gonna be working for users; then you can look at the analytics, and if there's a big drop-off at that point, that's backing up further what you're thinking. So again, for me, it should just work alongside more [inaudible] methods.
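Luke's five-step-form example boils down to a simple funnel calculation on the count of sessions reaching each step. A minimal sketch of that check, where the step names and numbers are entirely hypothetical rather than from any real analytics account:

```python
# Hypothetical sessions reaching each step of a five-step form
funnel = [
    ("step 1", 1200),
    ("step 2", 980),
    ("step 3", 420),  # the "clunky" third step
    ("step 4", 390),
    ("step 5", 350),
]

# Percentage of sessions lost between each consecutive pair of steps
drop_offs = [
    (name, round((entered - remained) / entered * 100))
    for (name, entered), (_, remained) in zip(funnel, funnel[1:])
]
print(drop_offs)
# The large loss between steps 2 and 3 is the analytics evidence that
# backs up the hunch about the third step being clunky.
```

A big outlier in `drop_offs` is the "what" from the analytics; usability testing on that step would then supply the "why".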
- Brendan Jarvis:
- So the two working together to be able to really deeply understand what is going wrong and why that might be happening.
- Luke Hay:
- Exactly. That is the thing that people put out all the time, and it's entirely true, although there's a little bit more nuance to it: essentially, the analytics tell you what's happening, and the more qualitative forms of research will hopefully be able to tell you why that's happening.
- Brendan Jarvis:
- Yeah. So you mentioned, and we were stereotyping earlier, how some UXers might not be that comfortable with the numbers, and the book was really written to help them become more comfortable with that and with using it in their practice. In your experience, and in conversations that you've had with other UXers, what are some of the fears or objections that people have around integrating analytics more into their practice and their research?
- Luke Hay:
- Yeah, I suppose the couple that spring to mind: first of all, the fact that lots of people, and I include myself here, aren't very numerical. I didn't do very well in my maths exams at school and that kind of thing, so that's not really where I'm coming at it from. But I think people do have, I dunno, a fear of numbers; they don't feel comfortable working with data in that way. Which, again, for slightly less technical people shouldn't be a barrier, because if I can do it, other people can. The whole point about these kinds of tools, particularly sophisticated tools like Google Analytics, is that you don't need to do any of the maths yourself, effectively. You're not really there to work out complicated statistics and things; it's more just looking at what's in the tool and getting what you want from the tool.
- And I think that's part of the problem: a lot of people come to analytics, and if you come at it without a view of what you want to do, or without that idea in mind, then it is just a wall of numbers, you know; you're just looking around blankly. It's just hundreds of different reports, and it's just confusing. So I think if people come at it more with a goal in mind, of "I want to find out about this particular part of the journey" or "I want to find out which the most popular pages are", whatever it might be, if you've got a bit of a goal in mind it then becomes easier to interpret the numbers. So I think that's definitely one thing that puts people off. And also just getting into the tool itself, whether it's Google Analytics or Adobe Analytics or Heap or any of the other tools that are out there, because people don't particularly like using them, and they are quite complicated to use. They're the kind of tools that you do need to be using quite regularly, otherwise you forget where things are and you forget how things work. So that can be a barrier in itself: every time you open it up you're shown all these different charts, all these different numbers, and it can be a bit overwhelming. So hopefully the book helps to demystify that somewhat.
- Brendan Jarvis:
- Yeah, I think I remember you writing that there are something like 500 default report views, which can be a bit overwhelming if you don't have a good starting point. I think a key thing you mentioned there, Luke, was having the right question before you delve into it; you need to know what it is that you want out of it, otherwise you'll probably get lost in the sea of data. Now, we were speaking off air there about AB testing, and I understand that this is something that's part of your practice at work now. Can you explain how you are using AB testing in your UX practice? What sort of problems or things are you trying to learn through that?
- Luke Hay:
- Yeah, so when I started, my job role was as a sort of conversion strategist. Basically it was focusing purely on how to make websites convert better, which to some UX purists is probably quite a dirty way of looking at things, because I think we all like to think, oh, it's all just about having the best user experience. Of course it is. But realistically, if you're working in a commercial world, you do need to see some kind of results, and that really is what the CRO side of things, the conversion rate optimization, is about. And that's really testing all sorts of things. I suppose it's often things at the business end, so things around the checkout and changing things there, whether it's the classic cliché of just changing the color of the call-to-action button or, ideally, something a bit more nuanced and complex than that.
- But essentially it's about changing the design; we suggest new designs pretty frequently, but this is actually testing them. To go back to the very basics, AB testing is when people come to the website and you show half of them one design and half of them another design, and you see which half converts better, in this case. And as I say, the way that works at Fresh Egg is a little bit more involved than that, because, myself coming from more of a UX background, again it's combining these different techniques. So we'll use the AB testing to get the more quantitative, measurable results, but we'll also do qualitative testing alongside that. So we might do user testing to understand: yes, it might be converting better, but how does it make users feel? Is it causing them issues? Is it frustrating the people who aren't actually converting? That's particularly true for longer-term conversions.
- I mean, we work with one of the largest distance learning providers in the world, and a lot of their challenge is comprehension: making sure that people understand what their offering is, and understand things like how much it costs compared to other providers, that sort of thing. So it's not the kind of site, like an e-commerce site, where people come in and convert straight away. It might be a six-month process of them researching everything and giving it careful consideration. So the AB testing there gives us part of the story, but we do really need to do that longer-term research as well, to properly understand the impact things are having.
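The 50/50 split Luke describes has to stay consistent for a returning visitor, which in practice is usually done by hashing a user identifier rather than flipping a coin on every page view. A minimal sketch of that idea, where the experiment name and the 0.5 threshold are illustrative assumptions, not anything from Fresh Egg's actual setup:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "checkout-test") -> str:
    """Deterministically bucket a visitor into design A or design B.

    Salting the hash with the experiment name means the same visitor can
    land in different buckets across different experiments, while always
    seeing the same design within one experiment.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Treat the hash as a fraction in [0, 1): below 0.5 is A, the rest B
    fraction = int(digest, 16) / 16 ** len(digest)
    return "A" if fraction < 0.5 else "B"

# A visitor's assignment is stable across repeat visits
assert assign_variant("visitor-42") == assign_variant("visitor-42")
```

Because a good hash is roughly uniform, the population splits close to 50/50 over enough visitors, which is what makes the two halves comparable.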
- Brendan Jarvis:
- Yeah, it's interesting, this area of AB testing, multivariate testing and data science. I was having a conversation with Phil Gordon from Spotify last week, and he is in charge of data scientists as well as the UX researchers, and these worlds are definitely coming closer and closer together. And having spoken to a few other people who have been touring, particularly in the Valley, the big tech startups and big tech businesses, sometimes they're using exclusively quantitative testing techniques to refine the UX and the UI. I suppose we can come back to that maybe near the close of the show, but this notion of where UX might go, and what the roles of qualitative and quantitative techniques might be, I think is a topic for the next decade for us to really consider. Now, just before we move on to another conversation: what are some of the common mistakes, or things that people should bear in mind or try to avoid, when they're delving into their analytics and using what they see to make decisions? Are there any things that you think people should know or be wary of?
- Luke Hay:
- Yeah, absolutely. I mean, I think first and foremost you need to make sure that the data you're looking at is accurate. It's very easy to set up Google Analytics incorrectly, and it's easy to have your data skewed, whether it's by bot traffic or who knows what else might be happening. I mean, at the moment lots of people are working from home; can you be sure that you are filtering your employees out, for example? It may be that you employ a lot of people, and if they're on the site doing things, that could be skewing the data. So it's really considering: is your data technically set up correctly? Is it technically sound? Is there anything external that's gonna be changing it and causing issues? So I think the first thing would be to make sure the data's sound. The second thing is obviously to report in a sensible way.
- So again, as I've mentioned before, it's about coming into analytics with an idea of what you want to find in mind, and then looking at that, rather than just grabbing anything that you think looks interesting. But I think the main issue for me, in doing both of those steps, is the sort of bias in reporting. As UXers we are very familiar with bias in qualitative research: we're very aware that we shouldn't be leading the users we're talking to, that we shouldn't be asking them, "Well, why do you think this website's so great?", those kinds of things. We need to be careful in our language, and there's nuance there. But it's the same with numbers, because people make the mistake of saying, well, the numbers don't lie, but they really do. So one thing I do in my training is I split my course into two groups, and I get them to look at exactly the same data.
- The same couple of months' time period in Google Analytics, for a certain account, and I get one of them to be really pessimistic and one of them to be really optimistic, and then I get them to report back to me. The idea is that one group will say, "Oh well, this figure's down and this figure's looking bad and actually we're getting fewer visits from this group," and then the other group will report back and say, "Oh, we've had a brilliant couple of months, this is up, that's up." Because things like conversion rate, for example: it's a useful figure in itself, but if all of a sudden you're getting twice as many visits to the website, then that's gonna skew the conversion rate. So there's a lot more to the numbers than it seems; you need all the context behind them.
- So people do bring their own bias, and I've seen it all the time. I see it, not necessarily from us, but potentially from some clients. They'll launch a new website, for example, and there'll be all of these good things and all of these bad things, and they'll purely just look at the good things. So I think that's definitely something to avoid. And as with bias on the qualitative side of things, that's quite challenging, because the whole point really is that a lot of bias is almost a subconscious thing that you're doing. So I think you need to try and check yourself as much as possible. Ideally you need someone else to give their input, so it's not just you looking at it, and you need to report consistently as well. If you are reporting one thing one month, then don't go and report on something else the next month because those numbers look a bit better. So yeah, it's about being consistent.
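Luke's point about conversion rate being skewed by traffic can be made concrete with a little arithmetic. The figures below are invented for illustration (they are not from the episode), but they show how the same two months can be framed as a drop or a win depending on which number you lead with:

```python
def conversion_rate(orders: int, visits: int) -> float:
    """Percentage of visits that resulted in an order."""
    return orders / visits * 100

# Month 1: steady traffic.
month1 = conversion_rate(orders=200, visits=10_000)  # 2.0%

# Month 2: a campaign doubles visits; orders rise too, just not as fast.
month2 = conversion_rate(orders=300, visits=20_000)  # 1.5%

# The "pessimistic" group reports: conversion rate fell by a quarter.
# The "optimistic" group reports: orders are up 50%.
# Both statements are true; neither alone tells the whole story.
print(f"Month 1: {month1:.1f}%  Month 2: {month2:.1f}%")
```

Neither framing is a lie; only the pair of numbers together, with the traffic context, is the full picture.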
- Brendan Jarvis:
- Yeah, it brings to mind that saying: lies, damned lies and statistics [affirmative]. And it sounds like what you're saying is that, unintentionally or intentionally, people seem to have a somewhat flexible interpretation of what the truth of the data might be.
- Luke Hay:
- Yes, oh absolutely. I mean, we're seeing it definitely at the moment with the current situation: the daily reporting of how many cases there are of coronavirus and that kind of thing. Depending on who you talk to, it's either a positive or a negative, depending on how you look at it. Obviously the number of cases is often down to the number of tests that are being done as well. So we're having a load more cases than we were in, say, February or March, but that's partly at least because we are doing a lot more testing, so you're not really comparing like for like there. And yeah, obviously politicians, it goes without saying, are the worst for this kind of thing. They'll report on the one good figure even if it's surrounded by bad figures.
- Brendan Jarvis:
- So if someone's presenting data to you, what are some of the questions that you can ask yourself, or ask them, that would help you better understand whether what you are looking at is a fair enough reflection of reality, or where the bias might be in what is being presented?
- Luke Hay:
- I think definitely the first thing would be a considered comparison. To give an example, we're just into December now. If I had an e-commerce site that sold gifts and I was comparing this December to, say, November, or even further back to October or September, then obviously you'd expect the figures to be massively up. It's not really a fair comparison, because the seasonal trade means that for my particular site it's always gonna be high in December. So if someone was showing me the stats of "here's how we've done versus November", I'd be asking them, "Well, actually, how did we do compared to last December, or even the December before?" So comparing like for like; I think that's quite an important one there. And also, to some extent, it's about trying to work out why they might be biased. Again, if it's someone reporting back to you who you've paid to do a job and they're trying to show you that it's been really successful, then you might wanna be a little bit more suspicious and dig a bit more, and certainly ask for those comparisons. Ask for benchmarking as well: if you can compare your conversion rate to other similar websites' conversion rates, as an example, then that's worth doing. And just make sure you're getting all the numbers. They might be telling you about your conversion rate, but actually you need to look at things like revenue and average revenue per visitor, those kinds of things. So really getting the whole data, rather than just those little slices that they want you to see.
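Luke's like-for-like point can also be sketched numerically. The figures here are hypothetical, invented purely for illustration: a seasonal site compared month-on-month looks spectacular, while the fairer year-on-year comparison tells a much more modest story.

```python
# Hypothetical monthly order counts for a seasonal gifts site.
orders = {
    ("2019", "Dec"): 9_500,
    ("2020", "Nov"): 4_000,
    ("2020", "Dec"): 10_450,
}

def pct_change(current: float, baseline: float) -> float:
    """Percentage change from a baseline figure."""
    return (current - baseline) / baseline * 100

# Month-on-month looks spectacular, but it's mostly seasonality.
mom = pct_change(orders[("2020", "Dec")], orders[("2020", "Nov")])

# Like for like (this December vs last December) is the fairer comparison.
yoy = pct_change(orders[("2020", "Dec")], orders[("2019", "Dec")])

print(f"vs last month: +{mom:.0f}%")     # +161%
print(f"vs last December: +{yoy:.0f}%")  # +10%
```

The same December can be sold as "+161%" or "+10%"; the second number is the one that says anything about how the site is actually doing.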
- Brendan Jarvis:
- Yeah, it also sounds like there's an understanding that you need to have of the incentives that sit behind that other person, and what the story is that they're trying to tell you. You mentioned, for example, if you've hired an agency to report on something, then considering how they might not be incentivized to tell you bad news, and therefore you might need to ask some deep questions of what it is that you're seeing. I think that's an interesting point that you've made there.
- Luke Hay:
- And I think the other thing to mention there is about looking beyond things that the analytics themselves can necessarily show. For example, if I massively improved the conversion rate of a website by making it a load easier to buy things, and had all sorts of offers where people could get things cheaper, then obviously reducing prices is gonna be a hit on revenue. But there are also things like returns, a big, big thing at the moment, particularly as a lot more people are ordering things online and getting things delivered. Quite often you need to spend a fair bit of money to get free delivery, which is great, and it looks like people are spending lots of money. But if they're then sending half the clothes back because your website doesn't do a very good job of describing them in detail, or showing the sizing or the colors off, or whatever it might be, then that's not gonna be any good to you as a business. So you need to be thinking about more than just what's happening there; you need to think about what's happening longer term too.
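The returns point lends itself to a similar back-of-the-envelope sketch. Again, the numbers are made up for illustration: a site that "wins" on the revenue an analytics dashboard reports can still lose once refunds are counted.

```python
def net_revenue(gross_revenue: float, return_rate: float) -> float:
    """Revenue left after refunds, given the fraction of sales returned."""
    return gross_revenue * (1 - return_rate)

# Site A: lower gross revenue, but detailed product pages keep returns low.
site_a = net_revenue(gross_revenue=80_000, return_rate=0.05)   # ~76,000

# Site B: higher gross revenue, but poor sizing/colour info drives returns up.
site_b = net_revenue(gross_revenue=100_000, return_rate=0.35)  # ~65,000

# The dashboard shows B "winning" on gross revenue;
# once the returns land, the business sees A win.
print(f"Site A nets {site_a:,.0f}, Site B nets {site_b:,.0f}")
```

Return rates typically live in an order-management or warehouse system rather than in web analytics, which is exactly the "beyond the analytics" data Luke is describing.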
- Brendan Jarvis:
- Is there another scenario, sometimes, where the truth is just too dangerous for people to report on?
- Luke Hay:
- Good question. I don't know. I think, again, as long as there are reasons for it, that's the key thing. If I had to deliver bad news because something had happened, if the numbers were down for whatever reason, then I suppose the best thing you could do is not lie about it, because ultimately it's almost certainly gonna get found out anyway. It's more about actually doing the work to find out why that's happened. If I came to you and said, "Your conversion rate's down," and that was it, then you're not gonna be too happy. But if I say it's down and I give some valid reasons for it, and I give some next steps and some actions, even if we don't necessarily know the answer, even if it's a case of saying, "Well, it appears to be down on mobile, for example, so what we're gonna do is some usability testing on mobile; we're gonna check that, make sure we're running through things and finding a solution." So I don't think there's ever really an excuse for lying about the stats or hiding the stats. But I think it's always good, if there is bad news, to have at least a plan of action for what you're gonna do to improve it.
- Brendan Jarvis:
- Yeah, I think that is such an important point as well. I mean, our superpower is basically the ability to find out why things are problems, and what you're saying is that the data, even if it's not good news, is telling you that there is a problem. Coming with a plan as to how you might find out why the problem exists and how to solve it is really the thing that's gonna make the most difference to the organization.
- Luke Hay:
- And I think, without spinning things, it is about how you position things. If you've just launched something and everything's tanked, then that's a bit different. But if you're coming to a new client and you are seeing that actually the industry average for your conversion rate is this high and you are here, then that's bad news for you at the moment, but it does mean there's all this potential: with a little bit of work, hopefully you can be up there and you can be making quite a lot more money. So that's definitely one way to look at it. And we do that with the more qualitative side of things as well. If we are doing usability testing and we're finding all these problems, that's good, because we've found the problems and then we can work towards fixing them, which is better than living in ignorance and having an underperforming website or app or whatever it might be.
- Brendan Jarvis:
- A hundred percent. Now, since Covid descended upon us all earlier this year, it's reshaped the way a lot of us are running research, and a lot more studies are running remotely, particularly usability tests. How do you feel about the move towards more remote research?
- Luke Hay:
- Yeah, I mean, a couple of years ago we had a couple of big remote research projects and they were quite challenging. The reason for doing those remotely was because in one case we were looking for people in both England and in Scotland, which isn't a million miles away, but there's obviously a bit of travel involved for a few user tests. So we decided to run those remotely and they worked to a point. But there is definitely a challenge with remote testing compared to being in the same room as someone: it's harder to build that rapport, it's harder to get those nonverbal signals. You can sort of get a limited amount of that, but not quite the same. It's a little bit harder to put people at ease. And also, particularly at the time we were doing that, there were a few technical problems. It doesn't take a huge amount.
- Someone just having a slightly bad wifi connection, for example, can really throw things out, because even though you can see them, they're slightly delayed, and it makes things very stilted, very hard to actually have a natural conversation. So I was a bit skeptical about the remote side of things, but then obviously the current situation occurred, and I think we've done about 50-odd usability tests at Fresh Egg since it happened. All of those have obviously been remote, and pretty much all of them have worked really well. I think there are a couple of reasons for that. People are getting more used to using things like Zoom; your average person is using Zoom to talk to their families, to talk to their friends, to do quizzes, all of these things. So from their point of view they're much more comfortable, not only using the technology, but also with the way the technology feels and the way people act on there.
- So I think that's really improved things and opened things up. It also means, from a practical point of view, people have kind of got it installed already, so it's a bit less of a hoop to jump through. And there are of course really big benefits to doing the remote side of things from an accessibility point of view: people who have trouble traveling, or where the lab is upstairs and has accessibility problems, people who just don't like going out or traveling. Normally the voices of all of those people wouldn't be heard in the lab-based user testing that goes on, so that opens things up. And of course geographically as well, as we were talking about before we started, there's the question of where the usability labs are, and most of the ones local-ish to us are probably in London. So there was perhaps a bias towards London with some of our testing, whereas now it really doesn't matter whether they're in London or Aberdeen or wherever it might be.
- So I think that's definitely helped. There are some elements that are missing from doing it this way, but I think it's certainly become easier for people to take part. And also, as user researchers, I think we've probably upskilled a bit just by having that experience, because when you start out it takes a few usability tests to really get going with it, and I think it's the same with the change in the way they're facilitated: the remote side of things takes a slightly different skill set, and just purely by doing quite a lot of them you learn more and get more comfortable with it.
- Brendan Jarvis:
- Yeah, I mean, there's the accessibility that you mentioned, more and more people having Zoom and other platforms installed, the actual access that it provides for people who might be disabled to participate, and the ability to get a more representative group of users because you're not restricted by geography. And this is a big question for me to be posing, because of course we at The Space InBetween do have a lab: is there any reason to return to moderated in-person testing?
- Luke Hay:
- Yeah, I think so. My plan, depending on obviously how everything pans out, would definitely not be to completely stop doing that. I think there's so much more you can get; as I say, it's that bit easier to build the rapport, but also from a physical point of view, particularly if you're trying to test across different devices, for example, that's a bit of a challenge at the moment. You can get a user to join on the phone, you can get 'em to join on a laptop, but it's not that easy to get them switching between the two if you want to test across both platforms. Also, on the mobile side of things, even though Zoom is relatively user friendly, it can still be quite challenging for people to share their screens on their phones, and with slightly different phone makes and models, from our point of view we're sort of saying, "Well, on an iPhone it should say this, and it should say that on an Android."
- So that's a little bit more complicated. That technology, it's not the same as being able to lean over to someone and say, "Oh yeah, just do that thing there," or whatever, or just set it up for them and let them go with it. So there's definitely the device side of things from a more practical point of view. And yeah, having that chat in person, particularly if it's more user interviews, I think is nice; you can't replace that human contact of being in the same room. So yeah, the plan moving forward, as far as I'm concerned, would be to still use labs, but for the right projects, and use remote for other projects.
- Brendan Jarvis:
- So one of the other things that you're really passionate about, Luke, is training, and you've got some fantastic reviews about the training that you deliver. Now, obviously you were delivering those in person, and with Covid that's become almost impossible, and I believe you're doing about 10 sessions a year, so almost one a month. How have you had to adapt your approach to training to this new environment?
- Luke Hay:
- Yeah, I mean, again, it's similar to the remote testing really, in as much as you have to think about things a bit differently, you have to consider various things, and then you have to just start doing it and kind of improve each time. The first training session I gave in that way still got fairly good feedback, so I don't think it was a disaster, but you sort of hone it as you go. And I think there are different things. It's definitely about making sure people are involved a bit more, making things interactive; I try and make my training interactive anyway, because no one likes just being sat there listening to people. But it's more about making sure, if nothing else, that they're still there. Because particularly when you first start doing it, it's really unnerving if you can't see anyone or hear anyone. You literally just feel like you're talking to yourself, and you always sort of think, in the back of your mind, when I ask this next question, it might be that my internet connection went down 10 minutes ago and no one's been listening.
- So yeah, it's just good to get people even just saying that they're there, but more so to do interactive things, and I think that's where tools come into play where you can actually do sort of virtual post-it notes and that kind of thing: Jamboard and various other tools, even just things like Google Docs in general, and some spreadsheets for some of it. Whatever the tool is, just something to get people engaging with it is good. And I think it's about making sure that you are open to questions and answers, and realizing that some things will take more time and some things will take less time. That's the tricky bit. I mean, that's always been the tricky bit, in a way, with training: getting the timing right. And doing things online, strangely, sometimes takes longer, sometimes not as long. So that's a little bit of trial and error, I think.
- Brendan Jarvis:
- Yeah. So turning our attention to usability testing again, I believe you've run tests with some quite colorful groups of society. In your experience, what is the funniest thing that you've seen or observed when you've been running a usability test?
- Luke Hay:
- So I guess the funniest one would be the guy who was in bed when he was doing the testing. That was a mobile test for a university website, and, I suppose I should say up front, this was a remote one, not an in-person one. So I had him on the laptop, the call came up, he answered, and it was like he'd just answered the phone and wasn't expecting it. He was like, "Oh, hello." You could just sort of see the bed behind him, and then it went a bit quiet and you could kind of see he'd put the phone down and was putting his socks on and things. It was all a bit odd, that one.
- Brendan Jarvis:
- So how did you handle that? I mean, for researchers that are maybe new or haven't yet run many studies, they might not have encountered some of these weirder situations. As an experienced moderator, in these cases, what are some of the stories that you tell yourself, or techniques that you have to employ, when stuff starts going sideways?
- Luke Hay:
- You just have to go with it, I think. I mean, it's always gonna happen; the more of these kinds of sessions you have, the more chance you're gonna get a few "characters", as we like to call them, popping up here and there. But in that case it was just a thing of saying, "Well, you know, is it okay to do this now? Do you wanna do it later? It's fine to do it now." So you just kind of carry on and ignore it to some extent, try and give 'em a bit of time to get everything together, but you just carry on through it. And actually, overall, the feedback was as good and it was as valid a session as it would've been normally. And arguably we try and recreate a more natural environment in these ways; he might normally look at that website in bed, so yeah, why not?
- Brendan Jarvis:
- It's a bad habit that we have, isn't it? Waking up and checking our phones.
- Luke Hay:
- Exactly. Yeah, exactly.
- Brendan Jarvis:
- So common. Probably not that healthy, but quite common.
- Luke Hay:
- Yeah, absolutely. I mean, I suppose the other one, which was more of a challenge in its own way, was an in-person one. It was working on a website for pregnant women and recent mothers, and we had the genius idea of getting people into the office and actually using the site, which was aimed at giving help and advice and support to people, from those looking to get pregnant through to those with toddlers. So it was a bit of a range, and obviously most of them brought children with them, some of them multiple children. So that was quite chaotic, because you'd have the mum doing the testing on the phone, with me sat next to her, with a baby in one arm and a kid running around screaming in the background, that kind of thing. So it was a bit like, okay, yeah.
- Brendan Jarvis:
- Probably quite reflective of reality.
- Luke Hay:
- That was the main thing; it was an intense afternoon, and perhaps put me off having kids. But it was still a good environment in that way. It was the next best thing to going to visit them at home and seeing them using it there.
- Brendan Jarvis:
- Yeah. Can I just ask: was it all males planning that study, who were surprised by the outcome?
- Luke Hay:
- [laugh] No, funnily enough, it was a bit of a mix, and we weren't necessarily that surprised. We had planned for it; we got sweets out and that kind of thing for the kids, but that obviously doesn't necessarily help. So there's always one or two odd things like that going on. But again, it's just super interesting, and that's why we do the job really: to have those kinds of experiences. It beats going in to do a very boring nine-to-five looking at spreadsheets all day when you can have things like that.
- Brendan Jarvis:
- A hundred percent. And what role do non-design or non-UX stakeholders play in the testing that you've done? Do you ever have them observe what's going on?
- Luke Hay:
- Yeah, I mean, from my point of view it's really key to get those people involved, and I think we're in an easy position in that UX is quite an easy sell in terms of getting people involved. As I mentioned before, at Fresh Egg we have an SEO team, for example. I always tell them they have really, really boring jobs and that what they do is really dull, and I'm not sure they'd agree with that, but I think they definitely would agree that getting involved in the usability testing side of things, and user interviews and that kind of thing, is a really nice bit of variety in their role, and it's really key, of course, to what they're trying to achieve as well. So getting them involved, either as observers or, hopefully, a little bit more than that, actively inputting into the research one way or another, is really key.
- And particularly from our clients' perspective, because they don't really care whether it's an SEO recommendation or a UX recommendation; they just want a recommendation to help them with their challenge. So the more we can come together and give them something that works from all aspects, the better. Because, with the best will in the world, I can say from a UX perspective, "Do this and that," but if doing that would actually have an adverse effect on their search ranking, then that's something that needs to be considered. So by working closer as a team, we can do that. And then from a stakeholder or client point of view, again, it's really good to get them involved. It's good for them to see things firsthand, because there's a massive difference between reading about an issue and actually seeing a user sounding frustrated.
- I mean, we've probably all put together little clips of users shouting and swearing and ranting at their phones to show stakeholders, and we all know how effective those are. But it's also about what happens between the sessions. Particularly for lab-based testing, we'd normally have the client sat in the observation room, and in between each test we'd pop in, ask what they thought about it, and they can give us insight. That's particularly valuable if it's a website or product that we're not experts on. Recently we were doing testing on a very complex financial services pensions system, and it was quite a challenge for me when users would talk about wanting to increase this or that figure by 10%, and all of that sort of jargon; it didn't mean much to me. So having the stakeholder or client on site who I could go and ask what it actually meant, to help me make sense of it and be able to talk the same language as the user, that was really useful.
- Brendan Jarvis:
- Yeah, it's this notion, I suppose, of removing the abstraction that exists if you're just looking at the numbers. If you have people together, you're able to have those conversations where you can observe something happening that's real, and then interpret and see from that what the underlying issue might be. I think this is a good point to come back to that notion of where UX might go, which we spoke about briefly earlier in this conversation. It seems, from the conversations that I'm having at least, that data and big data and data science are playing more and more of a role in UX. So thinking about the trends and the things that you've observed to date, and how you see things in the near future, maybe project that out: what is it that you think UX and UX research will look like in 2030?
- Luke Hay:
- Yeah, it's a good question. I mean, I don't know that it will necessarily be that different. My thinking is that you can have the best algorithms, you can have the most sophisticated multivariate testing, that kind of thing, but it's gonna be really hard to properly replace the nuances of human behavior. I dunno how a robot would cope with a user being in bed when they did the user testing. It's those kinds of things that I don't think there's a replacement for. So although there'll definitely be elements of that, and as you said yourself, we're already seeing elements of that, there's definitely a more human side needed too, of course. And then I think there's the other thing about thinking longer term, and about dark patterns and that kind of thing. You can have an algorithm that really drives conversions up for a website, perhaps by doing all sorts of multivariate testing.
- But that doesn't mean it's a good experience. That doesn't mean that person's going to come back. It doesn't mean that person's not gonna get annoyed after the purchase and leave a really terrible review, which matters over the long term; we all know the importance of social proof, and people will just stop using companies because they've got such a bad reputation. So it's those kinds of things that are gonna be really difficult, or almost impossible, for machines to understand: how important the reputation of a company is, and how what's happening on the website might be affecting that. So I think we'll definitely still need humans to understand humans, effectively until robots start selling to each other and that kind of thing and we get completely put out of the picture. I can't see things changing massively. I mean, the only changes will be on a smaller level, as I mentioned before, perhaps having more remote testing, that kind of thing.
- Definitely the field is opening up, in terms of there probably won't be such a thing, perhaps, as a dedicated user researcher. I think it will be more a part of people's jobs. There will be people who perhaps specialize in it a bit more, but I think it will definitely be more evenly spread out, and other people working in the industry will perform that role too. So I think things will change slightly, but the basis of research has been the same for a long time, and it will probably be the same for a long time in terms of the methodology and how it all works.
- Brendan Jarvis:
- And for us to be able to bring that humanity and, perhaps not a topic for today, that ethical lens that we can look at design decisions with. So tell me, are you up for playing a game?
- Luke Hay:
- Yes.
- Brendan Jarvis:
- Yes. A bit of hesitation there. I promise it'll be painless.
- Luke Hay:
- Hesitant, but let's give it a go.
- Brendan Jarvis:
- So I play this game every interview; it's called What Comes to Mind. When I say a word, which I'll say soon, a word will pop into your mind and you just tell me what that is [affirmative]. If you feel like explaining it, you can [laugh]. I've had people swear, I've had people give long explanations of why they thought of things. It's totally fine, whatever you feel like doing. Yeah. First word is bias.
- Luke Hay:
- So do you want a single word answer or do you want a
- Brendan Jarvis:
- Single word? Feel free to explain whatever comes to mind.
- Luke Hay:
- Oh, it's a good one to start with. I mean, the first thing that popped into my head was "cognitive", just because of cognitive bias, but on its own that doesn't really mean anything. I think the next thing would be, I suppose, that it's a very interesting subject, in as much as a lot of bias is unconscious. So for me, if I had to say one word, I'd say "interesting", because the subject of bias, and the psychology behind it, is something I'm definitely interested in and would like to learn more about, as much as anything else.
- Brendan Jarvis:
- Cool, cool, cool. The second word is analytics.
- Luke Hay:
- Necessary. I think that's probably what I'd go for there. Yeah, not as interesting, but necessary. Useful. Both those kinds of words spring to mind.
- Brendan Jarvis:
- And our final word today is politics.
- Luke Hay:
- Depressing.
- Brendan Jarvis:
- [laugh].
- Luke Hay:
- Without getting into a political chat, I'm in the UK, and at the moment I think depressing probably sums it up quite nicely.
- Brendan Jarvis:
- Well, let's not finish on a depressing note, no. Luke, what keeps you working in, and contributing so generously to, this field of UX and research?
- Luke Hay:
- I think it's the genuine interest side of things. We touched on a few points today, and there's nothing I enjoy more, in a work context, than going and doing these kinds of user interviews and finding out more about people and their lives, how they do things and their thoughts on things. So that, from the actually-doing-the-work side of things, is what keeps you going. I think it's also good to actually make a difference. I mentioned before there have been some projects where the impact can be measured quantitatively, particularly if it's been a long project with qual and quant and everything involved, and at the end of it you can actually say, "We did all this, we can see the users are happy, and your sign-ups have gone up by this amount." So that's good. And I think, finally, it's the people really. As you said yourself, you've spoken to a lot of interesting people through this. People who work in UX are generally, by and large, nice people. They're friendly people, they're interesting people, they've got a lot to say. And I think that's definitely something, on the events side of things, organizing conferences and things, that keeps me doing that really. So yeah, it's a nice way to make friends in the industry as well.
- Brendan Jarvis:
- Yeah, there are some really great people. Thanks Luke. It's been really great having you on the show and today's conversation has been really valuable. Thanks for generously sharing your experiences and your knowledge with us. It's been so good to have you here.
- Luke Hay:
- Yeah, well thanks for having me.
- Brendan Jarvis:
- I'm sure what we covered today will be useful for people trying to create better experiences for users. What's the best way that anyone watching this can connect with you?
- Luke Hay:
- Yes, so it depends on their social network of choice, but my website is lukehay.co.uk, so you can find out a bit more about me there and get in touch directly from there. LinkedIn is Luke Hay, obviously, and I'm @hayluk on Twitter, that's H-A-Y-L-U-K. So yeah, follow me on Twitter if you want sort of bizarre rants about football with the occasional bit of other stuff thrown in there too.
- Brendan Jarvis:
- Who's your favorite football team?
- Luke Hay:
- Oh, Brighton, obviously.
- Brendan Jarvis:
- I love it. Yeah, there we go. Hey, look, everyone who's tuned in, it's been great to have you here as well. Everything that Luke and I have covered today will be in the show notes, including detailed chapters so you can get quickly to the questions and the topics that we've covered. We'll also be linking through to all of the ways that you can contact Luke and keep in contact with him and see where his career takes him. If you've enjoyed the show, don't forget to like the video and leave us a comment. And if you wanna see more of these conversations, subscribe to the channel, because they'll be coming out roughly every two weeks. And until next time, keep being brave, everyone.