David Travis
Where Does Great Design Live?
In this episode of Brave UX, David Travis shares some insights from his stellar 30-year career in UX research, his thoughts on the future of the field, and the essential skills needed to be a great user researcher.
Highlights include:
- Where does great design live?
- What is the future of UX research?
- What big mistake do UX researchers often make?
Who is David Travis, PhD?
David Travis is a researcher, strategist, and educator with over 30 years of experience in the fields of human factors, usability, and user experience.
He holds a PhD in Experimental Psychology from the University of Cambridge and is the Founder and Managing Director of Userfocus, a leading London-based UX research consultancy.
David has provided his expert services to organisations such as American Express, Microsoft, The UK Government, Skype, eBay, Yahoo! and the World Health Organisation.
Transcript
- Brendan Jarvis:
- Hello and welcome to another episode of Brave UX. I'm Brendan Jarvis, the Managing Founder of The Space InBetween, and it's my job today to help you unpack the pieces of the product puzzle. I do that by interviewing world-class UX researchers and product management professionals. My guest today is Dr. David Travis. David is a London-based UX research and strategy consultant. He's also a coach, educator and author, and he has over 30 years of experience helping organizations to design the right things the right way. David holds a PhD in experimental psychology from the University of Cambridge and he's also a chartered psychologist. David is also an Associate Fellow of the British Psychological Society and is a member of the User Experience Professionals Association. He started working in the field of UX research in 1989, originally for British Telecom. He established Userfocus, a UX consulting and training company, in 2003.
- And David and his team have since provided services to clients such as American Express, Microsoft, the UK government, Orange, Skype, eBay, Yahoo, Whirlpool, and the World Health Organization, to name a few. He is the author of three books: Effective Color Displays, published in 1991, E-Commerce Usability, published in 2002, and his most recent, Think Like a UX Researcher, which I have a copy of here, which was published in 2019 and co-authored with his colleague Dr. Philip Hodgson. David is a prolific educator who is on a mission to create more user experience professionals. So far, on top of his consulting work, he has delivered over 250 live training seminars and educated over 53,000 people through his highly rated UX courses on Udemy, where he is one of the highest-rated instructors. David also runs the popular UX Tea Break YouTube channel, where he provides insightful five-minute answers to specific UX questions. The conclusion I drew, unsurprisingly, when I was preparing this introduction is that David knows a thing or two about UX research. You can reach David by email and find him on Twitter at @userfocus. David, welcome to the show.
- David Travis:
- Thank you for having me Brendan.
- Brendan Jarvis:
- You are most welcome. It's great to have you here. And before we get into some big topics, I wanted to discuss your brief but successful acting career. I understand that you starred in a movie alongside Ray Winstone and Sting.
- David Travis:
- Unfortunately it didn't go anywhere, my acting career. So I was, I think, how old was I, about 18 at the time, 18 or 19. And I had a friend who was at Sussex University. I went down to Brighton for the weekend to see him and we were in a boozer, I think having a pint at lunchtime. And a guy came up to us who we hadn't met before and said, what are you two lads doing this afternoon? And he said, do you wanna be in a movie? So, being young and naive, we said, yeah, okay, what's that about? And it turns out he was scouting for extras, and for some reason he felt that me and my friend would make good extras for Quadrophenia, which is what they were filming. So we got a haircut, they kind of did us up in stuff, and we also got selected for a special scene where me, as a mod, I throw a rocker over the beach wall in Brighton. And the only advantage of that is that on the video case, and actually on the DVD case, that's one of the two or three stills that they show from the movie. So when I say to people I was in a film, I can actually point to the video and there's a photograph of me, which is unusual for a scene which I think took place in about five seconds and then it was over. But sadly Ray Winstone and Sting, I lost contact with them, and their career trajectory has been a bit different to mine.
- Brendan Jarvis:
- If only guerrilla usability testing was as exciting as that.
- David Travis:
- Yeah.
- Brendan Jarvis:
- So look, obviously you parked your acting career and you turned to psychology and eventually went through to earn a PhD in psychology and you're still registered as a chartered psychologist. What was it about psychology and experimental psychology specifically?
- David Travis:
- Yeah, so it's really to do with a curiosity about people and trying to understand people, which was why I did a first degree in psychology, and the fact that people's behavior could be understood and could be predicted at some level, and that there were experimental ways of investigating that. It combined an interest in, I think, literature, with trying to understand character, with an interest in science and the experimental method, and putting those two together you kind of end up with experimental psychology. And it's just a fascinating field, and it's a field that just keeps on giving. And I still describe myself as a psychologist, even though my professional society, the BPS, I don't think they really represent the field of user experience or user research in any valuable way. I mean, they've ended up in a place which is really predominantly to do either with health psychology or with business measurement of things like personality and character that you can use to decide what career someone should go into.
- And for reasons I don't understand, they've ignored this enormous field of user research. And I bet you in the UK there are more psychologists doing user research than there are doing many of the other fields in psychology. Yet it's kind of not represented as a discipline. So every year when my dues come up for renewal, I kind of think, is it worth paying again for another year? Cause I don't really feel... I'm a psychologist, but I don't feel my professional society represents me. But I still continue to pay, because I kind of like to be able to say I'm a psychologist. I feel it provides that kind of elevator pitch for people to understand what it is I do. If you say you're a user researcher to most people that don't work in the field, they don't get it. If you say you're a psychologist, [laugh] "are you psychoanalyzing me?" is the first thing that you normally get, and then they completely get the wrong end of the stick. So actually probably saying I'm a psychologist is even worse than saying I'm a user researcher, because when I say I'm a psychologist they think I'm completely different to what I actually am. So who knows? I think one of the struggles I've had in my whole life is being able to describe to people that aren't in the field what it is that I do. And I don't think I've found a useful way of being able to do that really.
- Brendan Jarvis:
- So after you finished your PhD, you went on to do some work as a fellow. Was that purely for academic interest? Tell me about that.
- David Travis:
- What I was trying to do then was kind of stake out a career for myself in academia, but what happened at that time was Margaret Thatcher was prime minister and she was cutting lots of jobs in academia, and it was clear that to get a job in academia I needed to be shit hot, and I wasn't shit hot at being an academic, I was mediocre to good. So I was kind of thinking, what am I gonna do next? And while I was in my office at NYU, I had a phone call from somebody from BT, and they headhunted me for a job that they were looking to fill at BT's research labs in Ipswich. It's the only time in my life I've ever been headhunted. And they said they'd come across the book I'd written, which was Effective Color Displays.
- And they decided, we've got these enormous network displays that we have at BT. It's like a NASA control room, and they've got these massive monitors, well not a monitor, it's a projection screen with lots of colored lines on which indicate whether or not the network's failing in certain areas of the UK. And they were worried that people who were colorblind would struggle to understand the color coding that they'd used on these displays. And they said, we need someone with your background to come in and help us work that out. So I thought, well, this is the time to make a jump. And then what happened when I went to BT was it was self-evident, probably from day one, that the problems that they had with their systems had nothing to do with color and everything to do with usability. They were kind of digging in the wrong place.
- That wasn't where people were gonna make mistakes; it was with the interaction design of the system. Now to me that was a brand new field. I never knew anything about usability at the time, but fortunately I was in a group called the Human Factors Group at BT Labs. At the time it was the biggest human factors team in Europe. I think they had about 40 or 45 people working there. With hindsight, it was the most fantastic place to work, because all of the things that turned out to be important issues over the next 20 years or so, things to do with video on demand, things to do with home automation, there were projects working on those at BT. The only problem that they had was there was no network infrastructure for it. So they were trying to promote ISDN, which was a kind of broad-ish band telephone connection that people could use, probably equivalent to about 3G I reckon at best.
- But they thought that would be the infrastructure over which they would be able to deliver these services. And then of course the internet happened, and suddenly all of those things that the teams had been researching became products that other big international companies were working on. And as is the nature with big companies, like Xerox PARC in the past, they come up with all the great ideas, but for one reason or another the organization decides not to exploit them. And so what happens is other organizations take those ideas and then apply them. And that's a long-winded way of saying that what that actually caused me to do was transition from the academic area of interest that I had, which was color vision and studying it experimentally, and instead move over to this new field which I'd discovered. I kind of didn't look back, cause I thought this was so much more interesting, because the problem with academia is you're so narrowly focused. Like color vision, not just vision, but color vision I was doing. And that's so narrow that you get bored really, you know, you wanna do something that's more interesting.
- Brendan Jarvis:
- Yeah, I had a conversation last week with Dr. Elizabeth Allen and she actually also studied color as part of her PhD over at the University of Chicago, and she shared similar sentiments about being so narrow in academia, and that was part of the reason why she moved into UX research. So it's interesting to hear those parallels. And one of the things that I've heard you, David, talk about in one of your videos, and I'm gonna quote you here, is that great design doesn't live inside designers. And at face value, that doesn't sound like a thing to say in design circles. So if it doesn't live inside designers, where does it live?
- David Travis:
- So, well, there's a myth, and I think Jony Ive is an example of that myth made large, that you have designers who just, for some reason, have fantastic predictive abilities about what people want from the world. And I'm sure that maybe there are a few people around for whom that is the case. I don't actually think Jonathan Ive is one of them; probably Steve Jobs was. They've just got their finger on the pulse and I dunno how they do it, but they do. But the vast majority of us, and I mean like 99%, are pretty clueless when it comes to knowing what makes good design. And the only way we can discover what makes good design, if you are in the majority like I am, is to go out and look at the people who are gonna be using your system, and that's where good design lives.
- It lives in your users' behavior. So understanding what it is users are trying to do, understanding what their needs are, understanding what their models are when they're using a new workflow that you've designed, discovering the problems that they have when they're looking at a screen that a designer in your team has put together, and refining the system based on the comments and the feedback that you get. That's where good design lives. Good design lives in the space where you end up creating things that make people's lives easier, making people's lives better, making the world a better place. And that doesn't live in the head of a single designer.
- Brendan Jarvis:
- I mean, now we have this field that may have been called human factors initially, and then it was usability, and it's recently moved into being called user experience as the umbrella term. And as part of that field we now have this new role that exists in organizations called a UX researcher. And it sounds like, from what you were describing, that there's a certain mindset or mindsets that someone in that role, who needs to try and find that good design within users, needs to adopt. What is that mindset, or what are those various thought processes, that someone in that role really needs to grasp to be effective?
- David Travis:
- It's so simple. People really overthink this issue, and in my experience good UX researchers have got something in common, and the thing that they have in common is they have a way of getting their users to let them into their lives. So they have the ability to be able to spend time with people, and those people can be open with them about what it is they're trying to achieve. And it's actually an interesting distinction with classical experimental psychology: those UX researchers don't treat people as kind of lab rats that are gonna take part in an experiment. Instead they go out and really try and understand the lives that they're leading, the problems that they have. And when they do that, the people that they're working with, the users that they're working with, open up and they discover the real needs that people have.
- I think the biggest mistake that UX researchers make is that they spend time in front of their computers rather than spending time in front of their users. So I speak to some UX researchers who might not have seen a user for the last three or four months. You're not doing your job if you're not seeing users. So you should be seeing users every sprint if you're working on an agile team, or every couple of weeks if you're not, and you should be going out and speaking with users. That could be going out and doing some pop-up research and testing out a very specific feature of your product. It could be going out and doing field research in people's homes. With coronavirus, it could be screen sharing with someone as they do a meaningful activity that you're trying to understand. It could be running a usability test, but it'll be something where you can see a real person at the other end. Not automated research, like a survey tool for example, but real research where you are actually speaking with people to understand their lives better. If you're not doing that, then I don't think you're a UX researcher, you are something else.
- Brendan Jarvis:
- And so you've recently written and launched this book last year, Think Like a UX Researcher, and I get the suspicion that nobody writes a book, let alone three books, just for fun. Why was it that you felt that the world needed this book?
- David Travis:
- I'm a great consumer of books, so on that bookshelf behind me there's loads of UX-related books, and I've probably read cover to cover 10% of them, but I keep buying them. But I open them up and I start them, and you know, you kind of have to get over the basic stuff, and then it moves on to the more interesting stuff, and then you kind of get bored and you give up. And I wanted to put together a book that overcame that problem, a book that you could dip into. So I spoke with my colleague Philip Hodgson, who I've worked with probably since about when I started my career in user experience. And we were thinking of ways of doing that, and we felt short, related articles, each of which gave people something valuable that they could take away, was the approach to take.
- So we put together a book that, I've not seen anything like it in user experience before, because you don't need to read it cover to cover. If you're working on a project at the moment to do with usability testing, you can open it up and you can read a handful of ideas to make your usability test work better. If you're doing field research, if you're doing remote research, you can find something in there to read and get an idea, without having to read the whole book cover to cover. That's the first answer. The second answer is that at the time Philip was coming to the end of his career, and I'm probably coming pretty much to the end of mine as well, and the book that I'd written before this was E-Commerce Usability. That term sounds so dated, doesn't it, really?
- Brendan Jarvis:
- Yeah, but that was in 2002, and I was thinking about that, because I built my first website in 1998 and 2002 was still very early days for e-commerce. So to be writing a book on e-commerce usability in 2002, that was still pretty much at the forefront of things.
- David Travis:
- Yeah, well, although I like that book and I'm proud of it, and I think it's still got loads of value, because what that book does, it is a book you need to read cover to cover really, but it describes a process for doing user experience. I didn't want a book with e-commerce in the title to be my kind of valediction from the field. I wanted to make sure I had something that was more up to date. I didn't even wanna call that book E-Commerce Usability originally; it was the publishers, they said put e-commerce in the title and you'll sell shedloads more copies, and not being proud, I decided to follow their advice. But what I wanted was to put together a book that I felt was more contemporary, because the whole field has exploded since then.
- Like you've said, there wasn't really much happening then. And I wanted something which described the way we were feeling about the field now and the stuff that we'd discovered that would add value. So the second reason was to put together something that acted more like a legacy. And I know that sounds a bit, I don't know, what does that sound like? Sounds a bit overblown. But I think that's something that people that work in the field of UX should do more of, and that they don't do, which is leave a legacy. That doesn't need to be writing a book that gets published. It could be writing a case study about a project that you've done to help people on the team understand why you worked the way that you did. It could be mentoring people within your organization to teach 'em about the importance of user experience. It could be giving a talk at a conference. But all of those things are what I think, when you become more established in the field of user experience, you have to do. You owe it to the field. Rather than keeping that stuff close, you need to share more. So that was the second goal of it as well, really.
- Brendan Jarvis:
- Yeah, and I think you've really done an excellent job with it. I mean, when I was reading through this book, and I still refer to it regularly, I think the thing that I appreciated the most about it was that at the end of the short chapters you leave people with questions. You're not trying to give them all of the answers. And it's that, I suppose, engaging of your own mind to consider how you apply what you've just read to your own practice that
- David Travis:
- I'm glad about that. That's my favorite part of the book. I really enjoyed putting those together. They were the hardest things to write as well. They were the things that Philip and I spent more time on than actually writing the stuff itself. Because what we wanted was to create questions where there wasn't a clear yes or no answer. I mean, nearly all of those questions you could answer in a number of different ways. What we wanted to do was create things that made people really think and have an internal debate with themselves about the answer to the question, or indeed, you know, you could take any of those questions and run a workshop with your team to talk about the way you would approach it, the possible answer that you would come up with. And that's how I think you grow as someone that works in the field.
- It's too easy, I think, in any field to quickly fall into a routine way of doing things. You always do it like this: every time you run a usability test, you go through step A, step B, step C. And a lot of the time you need some kind of procedure about the way you do things, so I wouldn't say that that's necessarily a bad thing. But it's also appropriate, I think, every now and again to question the way you do research and say, well, is that really the right way of doing it? Is there another way of doing it? Is there another method that I could have used that might have provided more value, so that I can get this insight into users, so that I can do better design? Because with design living inside users rather than inside designers' heads, the only way you get at it is by exploring these different methods and trying them out.
- Brendan Jarvis:
- And it really is an excellent resource for the seasoned practitioner and also for people that are getting into the field. And that's something that I wanna talk about with you a bit later on, the explosion of new talent that has entered UX. Before we do that though, I wanted to bring you back to the talk that you gave at Northern UX in 2014, a talk where you said that we didn't have a problem with the quantity of research that was going on, but we did have a problem with the quality of research. Do you still believe that?
- David Travis:
- Well, if anything it's got worse. It's not got better, it's got worse, and it's gonna get even worse because of the pandemic.
- It's got worse because people and organizations mistake numerical measures of usability or user experience for insight. So they think that by putting numbers on things, they've automatically got more insight into it. Now I wanna be careful, because I like numbers, and one of the things I did early on in my career was try and quantify certain things like usability. So the ISO definition of usability: effectiveness, efficiency and satisfaction. You can measure those, you can provide numbers, and that's a good thing, but it's only a good thing if it's done in collaboration with also collecting qualitative insights as well. Because most of the things you learn from users, they're not numerical, they're not things you put numbers on; they're to do with behaviors, things that you've learned. Now the problem with a lot of modern user research is that it's becoming increasingly automated. So people are trying to collect data from users.
- So old school, it would be running a survey: they're trying to collect that data and then they're analyzing the data without seeing users. So no users are being seen, but data's being collected. And in the new way of doing it, what people do instead is remote unmoderated usability tests. So they'll recruit a panel of users using some established service, they'll send those users a series of tasks, users might record themselves, and then they'll upload the videos to the cloud and you can watch those videos afterwards. And it seems like the perfect solution. But in fact nobody watches the videos afterwards. They just look at the statistics: did someone complete the task or not? And you've got a very rudimentary knowledge of your users if you only look at the numbers.
- So I think an analogy, I might have used it in the book but I can't remember, cause I've certainly used it in other places, is to do with football results. So you could imagine, at one extreme, we could go to watch a live football game, or soccer if you are watching this in the United States. We're watching a live game and we can see all the highs and lows of the game. We're part of the crowd, we see what's going on in detail. There's that experience that we have of understanding the game, versus looking at the scores in the Sunday newspaper, where we see that our team lost two-one. So we've got a number there, but that number is a very poor representation of what the experience was about. Now it's still useful, because we know whether or not our team's gonna get promoted or relegated at the end of the season, but on its own it's not enough for us to decide, well, what do we need to do with our team to change it?
- The number doesn't tell us how to change, it just tells us that things are going one way or the other. What we want is to know, well, how do we make things better? You get that insight about making things better by actually sitting with and watching your users. Even with small samples, it works. You don't need the huge samples which you do need for quantitative data. Ironically, you can do it with much smaller samples. That's why I mentioned earlier on, it's so simple, and people so overthink it. All you need to do is spend a bit of time with your users, sit with 'em, and you'll get these insights. You'll get a ton more insights than you would do if instead you'd run a thousand people through a survey or some kind of summative usability test. If that's the only data that you get, you've got to collect this more qualitative data as well.
- And the reason I think it's getting worse with the pandemic is because clearly, in the middle of a pandemic, it's gonna be harder for you to get face to face with users, and it's very tempting to say, well, we still need to do user research, let's do some remote unmoderated tests instead. At least we're keeping things ticking over. And I don't think that that's the right approach. I think that you should be doing some remote moderated usability tests, but you should also be doing some remote screen sharing with users to try and understand the way they do whatever the meaningful activity is that you are designing for. So not using your product, but doing the thing that they're trying to do. At least you can observe the digital component of it by screen sharing with them and watching them over Zoom. But I suspect I'm fighting against an incoming tide, to be honest. I do think that there's only more quantitative data and quantitative research to come, and at the expense of qualitative research. Now I wouldn't mind if it was as well as, but as a user researcher you can only do so much. And if there's pressure from your organization to do research where you just generate numbers, that means you do less research where you generate insights, and you need to fight against that as a user researcher, because your team deserve better than that.
- Brendan Jarvis:
- Where does this demand for quantitative data, and this sort of lack of trust in qualitative insight, come from?
- David Travis:
- I think there are two places. One is it could come from senior managers who thrive on numbers. So a few years back I was working for Orange, the telecommunications company in the UK, and I was working as an internal consultant. I was covering for somebody who was on maternity leave for about a year, and while I was there I was kind of running their human factors team. And while I was there we were looking at different ways that we could engage the organization in the research that we did. And we did all of the obvious things, like asking people along to usability tests, producing highlights videos and sending them out, and all of the obvious things you'd think about doing. But despite all of this, nobody was actually reading the research reports that we put together. So we thought, well, how could we engage people with the data?
- So instead we looked to see how short a report we could make. So instead of a 40 or 50 page report, what would we cut out if we dropped it to 20 pages, and what would we drop out if we cut it to 10 pages? And eventually we thought, well, what would a one-page report look like? So we put together a kind of dashboard. I mean, nowadays we're kind of suffering from dashboard overload, but in those days it was relatively novel to have a dashboard summarizing the results of a usability test. And although we had some qualitative comments on that dashboard, predominantly it was about these measures of usability that I've mentioned, and showing how Orange's online shop compared with competitor online shops. Now what happened was that whereas previously we got very little senior management engagement, suddenly the usability stuff that we were doing was being discussed at board level.
- How come we're worse than Vodafone on this particular task? What do we need to do differently? Because once you've got numbers, you can start measuring things against them, and that's one of the benefits of it. And I think that's a good thing about quantitative research, because it can influence people. The problem comes when you then only deliver that kind of research. And another place where it comes from is the team itself, because they don't understand user research, they don't understand the way user research works. They might be familiar with usability testing, but the chances are what they'll do is they'll say to you, here's two alternative designs that we could have for our homepage, find out which one users prefer. Now if you're given that problem phrased as a preference problem, then you clearly need very large samples of users, and you would go out and do some kind of survey, I guess, to find out which one they prefer.
- But you've let the team define the research that you've done. Instead, you should say to them, that's not my job. No, I don't do preference. What I do is behavior. I can tell you which one they're better with. I can tell you which one they're more successful at achieving their tasks with. If you want me to do that, I'm gonna do a usability test instead. And it won't be numbers that you'll get. What you'll get is problems with both systems. And I'll be able to tell you which one's better than the other, but it's not gonna be just a numerical comparison. And I think it's the team that don't understand what user research does. So part of your job is to educate them on what user research is about, make them realize it's about these insights rather than just about numbers to characterize the results of a design.
- Brendan Jarvis:
- Usability is really easily tied back to looking at behavior, and you can say that about a field study. But one of the things that we also do as researchers is we interview our users to try and understand a bit more about how they see the world. And that's very similar to certain techniques that you might see in qualitative research within a market research company. And I suppose I'm trying to understand, looking back on your experience of seeing this evolve over 30-plus years, have you seen this confusion since the beginning, or has this been a relatively new thing?
- David Travis:
- Right. So I think maybe what's behind this, and it's kind of related to this issue, is the fact that, as I've said, teams don't understand user research, but they do understand market research. We're all familiar with market research, we're all familiar with political polls, we're all familiar with surveys that we'll receive in the post. And as a consequence, your team tends to think that that's what you do. And in my experience, in my career, there's always been a tension between market research and user research. But I think the bit of the jigsaw that people don't often get is that market research, bless them, I think their job is about positioning. So if you've got a product, what they'll do is they'll work out how they sell that product to users. So they'll go out and they'll find out a little bit about what users need to do, and they'll kind of build a Venn diagram: this is users' needs, this is the product, this is where they overlap.
- When we market the product, we're gonna focus on that sliver, that overlap. Because if this team had done the research properly, they would've met all of those needs, but they've not done that, so let's just focus on that sliver. Instead, what user researchers do is they're a bit more bullish and they say, well no, you're only meeting a little sliver of the users' needs. Instead, let's change the product. Let's pivot the product and focus instead on this full set of needs. Rather than try and convince people that this fixes something that they might otherwise not be able to do, let's look at the bigger picture. Now, market researchers tell me, oh well, we do that as well. But they don't do that. They almost exclusively focus on positioning. They don't really do that generative research that a good user researcher would do.
- And I don't think it's the fault of user researchers that that misconception exists. I think, you know, you do need to educate your team to help them appreciate that it's broader than market research, or different; actually it's not broader, it's just different to market research. But the problem doesn't lie with user researchers, other than when you become complicit in doing what your team tells you to do. So you do a survey when it's not the right thing to do; you know you should go out and see users, but you do a survey instead, so then you are complicit. It's your mistake. But generally it's more to do, I think, with teams not really getting a grip on what it's about.
- Brendan Jarvis:
- Yeah. So we're in this COVID-19 world unfortunately, and we may be here for a while. It sounded like you were quite concerned about the impact that COVID might have on the practice of the face-to-face aspect of user research. And you mentioned the proliferation of unmoderated studies that are being run. And I don't wanna paint too much of a dystopian view here, but do you see that UX, and also I suppose UI to a degree, will become entirely automated in the future, driven by data, run by quantitative insights, basically just an AI that's continuously A/B testing different versions of a certain layout? Is that where things are heading?
- David Travis:
- I think in some organizations it probably is, but we've got a secret tool in our armory as user researchers, which is that if all of your competitors are doing that kind of research and none of them are doing qualitative research and really, really allowing users to let you into their lives, then you can immediately trounce the competition by going back and doing the right kind of qualitative research, where they do let you into their lives. So I think that there will be a swing, in the same way that in visual design there was a movement away from skeuomorphic user interfaces to flat design. The pendulum swung in one direction, and then people realized, well actually this is a bit crap, cuz people don't realize that that's a button, they think it's just a colored box on the screen. So then the pendulum came back a little bit again.
- And I think the same thing's gonna happen with these kind of more automated user research tools. What will happen is people will realize that these aren't providing the value that they thought they were providing. But I do think it's gonna happen. I think that is where the field's going. And for any user researcher, my advice to them would be to make sure that they know how to do automated research, because there's gonna be a lot of jobs and a lot of expectation that you're able to do that. So add it as a string to your bow, but don't pivot entirely and say, I now do automated research and collect quantitative measures. It's one thing that you do, but you'll be much more powerful if you do these more blended methods. So if you do qualitative and quantitative measures, then you're really gonna get insight that your team can use, much more so than if you just do one or the other.
- I mean, one of the things I'll often worry about when I talk about this is that I come across as someone who only wants to do qualitative research, and that's not the case. It's just that I feel that no one's batting for qualitative research, everyone's batting for quantitative research, and I'm trying to bring everyone back. It's like, well no, it's not just about that. So I also like quantitative research, but when it's in combination with qualitative research. So qualitative research gives us an insight, and then we can use quantitative research to try and find out, well, how representative is that across our audience? Or quantitative measures like Google Analytics will give us an insight about something not working properly on a particular page or part of the system, the website not being used, but it won't tell us why. And that's when qualitative comes in. Similarly with A/B testing: we can find out that one design performs better than another, but we can't find out why. And that's where, again, the qualitative side comes in, and then you get a general principle that you can apply when you start doing design later on. So you get the benefits that way. So it's important to do both, rather than think of them as kind of exclusives, or one or the other. You can do both, but
- Brendan Jarvis:
- Yeah.
- David Travis:
- I tend to chat more loudly about qualitative cause it's the one that's missing out, I think.
- Brendan Jarvis:
- Yeah, you talk about triangulation and the need to use multiple methods in order to really home in on what the need or what the problem is, and I think that's really valid and insightful. Just keeping on the theme of COVID, it seems to me the thing that's occupying a lot of people's minds at the moment. And I know you're on a mission to create more user experience professionals, and you're doing that through your Udemy courses, and I think in your introduction I mentioned over 50,000 students have gone through them. That trend is only going to accelerate as the world continues to digitize, and at a more rapid pace thanks to this virus that we're all living with. The demand for UX skills is gonna increase as a result of that, and we're gonna end up, I would assume, with a more diverse range of people entering the field. You've spoken about some of the positives that are inherent in diversity and having more diversity in UX research. What is it that you feel has been lacking in our field in terms of diversity at the moment?
- David Travis:
- There's two answers to your question; I think there's two questions in there. One question is around what's the base level of knowledge that everyone should have, no matter what discipline they come from. And then there's another question to do with diversity on teams. And let me deal with the second one first, diversity on product teams. So in my experience, most people on product teams are young, they're most often white, they're most often middle class, they're predominantly male. Those people are just not representative of your audience as a whole. And the reason that matters on a product team isn't some kind of politically correct "we need to represent the real world here" thing. It's to do with your product. So the team gets groupthink about the way the product's used and the way the product works, and what they need is diversity on that team, to have different views, so that people can say, well hold on, I don't get that, that doesn't resonate with me.
- What does anybody else here think? Is that the way it should be done? And at least they'll raise a question, which then says, well, maybe we should do some user research on that instead. So there's this kind of diversity at the product team level, because that stops that kind of groupthink and going down a particular channel. But there's also, in terms of people coming to the field of user experience from lots of different disciplines, which is a great thing, because again we get that diversity of ideas and we get new ideas for new ways of doing things. But what everybody should have is a base level of knowledge. So there's gotta be a certain level of understanding, a certain level of truths or facts or values that people agree on about the way research should be done, for example, or about what good design means, for example. There needs to be a consensus on that.
- Now in my experience, one of the best tools we've got is an international standard, ISO 9241. Part 210 defines a series of rules, if you like, or values or principles of good design. And the reason that set of principles is important is because they've got international consensus. It's not like one guru saying, these are my heuristics for good design. I don't mean to pick on Jakob Nielsen, I'm sure there's other people as well, but it's not just one person's view; it's broadly accepted that these things matter. And that particular standard has been adopted in the UK by the BCS, who are an organization that examines people in the IT industry, and they've created a curriculum around it. If you do that curriculum, you get a foundation-level knowledge in user experience. And at least one of my courses on Udemy covers that curriculum. And I think that everyone that works in the field should have that base level of understanding, that base level of knowledge. And then on top of that, you know, you build experience and you bring in new tools and ways of doing things, but fundamentally you agree on the basic principles. And so there's kind of two very different answers to that question about diversity.
- Brendan Jarvis:
- So when we've got this tidal wave of potential new people joining the field, and they go through some sort of certification like that, or at least gain some understanding of those principles as laid out, what would you advise them afterwards, when they're starting their first job? What sort of things do they need to do in their first few weeks?
- David Travis:
- So I guess it's more to do with making sure you've got a good working relationship with your team, and that you can set in place certain procedures to make sure that your team are engaged in the user research that you do. So a common mistake I see when I coach user researchers on teams is they believe that it's their job to go out and understand users. It's their role, that's their job, to understand users. And then they synthesize that knowledge and they present it back to the team and they give them the answers. So somebody on the team will say, how should I design this widget so that users can use it? And then the user researcher goes off, uses his or her knowledge to synthesize, and comes back with a definitive answer. And if you follow that approach, you're on a fool's errand.
- Because the one thing that you'll know, if you've ever run a usability test, is that people often do things differently to the way you expect, even when you've had a lot of experience working with users. So what you need to educate your team in is the knowledge that it's not your job to understand users. It's your job to help them understand users. You are not gonna do that learning for them. They need to do the learning. And they do that learning by coming out with you on field visits, by observing usability tests, by watching highlights videos that you've presented, by working collaboratively with you to identify the different user groups or the personas, if you follow that particular approach. But the goal that you've got as a user researcher should be not to go in as the expert. And I know that's difficult, because you think, I should be the expert, I should know all of the answers.
- Instead, you need to go in more as a facilitator, to help the team understand how they need to understand users better. And obviously you're gonna lead the user research, you're going to plan it in collaboration with them, but you are going to do a chunk of the planning. It might only be you that observes all of the user research sessions, and the people on your team may only see one or two of those sessions. But the idea is that they're involved and engaged in it, and they realize that your job is to help them get to see users. You're more like a fixer, in the sense that you're able to get them to see users, rather than someone that's delivering all of the research for them. They need to be observing it too.
- Brendan Jarvis:
- It's almost a harder job. The conversations that I have with internal researchers are often around the difficulty that they have convincing people to attend sessions, to read their reports, like you mentioned you were experiencing at Orange. It seems to be an uphill battle, and you continually need to think about ways in which you can build that culture. And one of the things that I wanted to ask you about was Agile and the role that Agile has had in the way that it's impacted user research. A lot of organizations are adopting it, even banks, traditional organizations, and presumably they wanna do that cause they want to manage change more effectively. But in my observation, in practice it seems to be more about doing more things faster and with less thought, and not necessarily about doing more effective work. Has Agile been a good thing for UX research and users?
- David Travis:
- Right. It's been a good thing for UX research. Absolutely. No question. No question whatsoever. You can still do crap projects with Agile, you can do crap projects with Waterfall. But let's say that we live in a parallel universe where Agile had never happened and we were still using waterfall-based design. It wasn't as if that was nirvana for people that did UX research. It was still rubbish, and in some ways it was even worse, because you were expected to do all of your research up front and have it all nailed down so that the requirements could be written, so that then the software could be delivered. And if you got the chance to do a usability test, it would be done at the very end of development, when it was too late to change anything anyway. So that wasn't working either.
- So UX research will only work in a design process that's iterative, and if you don't have an iterative development process, then UX research will always struggle. What Agile provides you with is the opportunity to do that iterative research, so that you can continually test out your assumptions. And so, for example, you might run a usability test with five users in one sprint and you've discovered something, but it's a bit weird or a bit off the wall and you're not really sure if it's true or not, but you can make a change to the design and in the next sprint you can test it again. And so you're continually checking your work. I mean, it's not science, but it's that kind of scientific method: you're making changes and then you're checking afterwards, did this actually make things better or did it make it worse?
- Now you can do iterative design without using Agile, and I know that there are problems that many organizations struggle with when they're adopting Agile, often because they're not actually doing Agile, they're doing their own version of Agile. But I can't see any reason to criticize Agile for not supporting UX research; it's the only method I know of where it does. And the other thing is, it's the only game in town at the moment. I mean, what else are you gonna do? Are you gonna refuse to work for a team because they're doing Agile? I don't do Agile, I do it differently to that. Well, you won't get a job, so you've got to fit in with it. There's no question that you've gotta fit in with agile teams, but look on the positives. The positives are that it's iterative design.
- What you need to do is make sure that the stuff that you are working on is considered an important part of the backlog, as are the features and functions that the team is working on. So it's your job as the UX researcher to make sure that your stuff gets on the backlog and that you're not just treated as a second-class citizen, that you are part of that process too. And if that happens, then it can work well with Agile. I mean, that said, I've worked in some organizations, like GDS, that have adopted Agile, and UX research works brilliantly within that organization, it's part of the process, everybody values it, and using Agile with UX works perfectly. And I've worked with other organizations, including some banks, that do Agile and UX research struggles, because more often than not the people doing UX research feel that they're always trying to run to catch up. They're kind of never able to be in front of what's going on. So it depends on how Agile's applied within the organization. I've seen it work well and I've seen it work badly, but I don't think the problem is necessarily with Agile. I think it's the best method we've got at the moment to make UX research happen on teams.
- Brendan Jarvis:
- Yeah, one of the objections that I often hear is that research slows things down. And particularly, you spoke about the rockstar designer when we first started talking, there is still that attitude out there in some, maybe more UI-focused, design and product teams, that we already know what our users want, or, if we get it wrong, no big deal, we'll figure that out when it gets to production and we'll just adjust it once it's live. What sort of other objections, or similar objections, have you seen to the role of user research in an agile product environment?
- David Travis:
- Well, no, that's too negative a way of thinking about it, Brendan. I think you can always come up with reasons to not do UX research, usually coming down to money or coming down to time and so on. But instead, what I would say is, if you work as a UX researcher on a team that holds those values, you gotta find yourself another team. You're not gonna make any impact on the world if you are stuck in a team or an organization that doesn't value the work that you do, so just move on. Now for every team that doesn't value UX research, there's another team somewhere else that does value UX research. So yeah, why fight against it? My suggestion is move on. Now, I've been fortunate, because I've worked as a consultant, and that means organizations only come to me when they have a need. So they realize that, bloody hell, yeah, we'd better do this stuff because things aren't working properly. But look, I know people who work in organizations that do have teams with that mentality, and my advice is, move on. Why try and convince a Republican to become a Democrat, or vice versa? It's
- Brendan Jarvis:
- Way too short for that.
- David Travis:
- Yeah, so just move on. Instead, it's much easier, I find, to preach to the converted than it is to try and convert people. That said, so long as you've got an organization or a team that are relatively open, they may be a bit cynical, but if they're relatively open, then in some ways you can have a massive impact. So if you think of it in terms of UX maturity: at the top level, you've got an organization for whom UX is part of their process. They're always doing UX research, they use it to decide what to build next, and so on. Now that's a great type of organization to work for as a UX researcher, because your life's easy, but on the downside it's hard for you to have a big impact, because they're doing things really well. So it is difficult for you to make the product 50% better.
- You can make it 5% better perhaps, and that's a good thing, that's a valid way to spend your career, but it's hard for you to have a big impact. Compare that with working at an organization that's at the bottom of the UX maturity pyramid, or ladder, or whatever. If they're at the bottom of the ladder, but assuming that they're still allowing you to do some kind of UX work, even if it's just usability testing, you can have a massive impact, because the chances are they've never done that kind of research in the past. So you are not gonna make a 5% difference, you're gonna make a hundred percent difference to the product. So although it's more frustrating to work in that environment, you can have a bigger impact. Now, of course, at the bottommost level, where they're not interested in user research or UX at all, that's where I'm saying get out, find another rung on the ladder to work at. Because at the bottommost level you're not gonna be able to do anything, cuz they're not gonna give you budget to go and see users, and even if you went to see users on your own time, they would probably ignore the results, because they know what users want anyway. So try and find a different team within your organization, or a different organization, that's at least one rung above that, so that you can have the impact. And you definitely can have an impact as a UX researcher.
- Brendan Jarvis:
- You need to be able to get some runs on the board in order to build that belief and have an even greater impact. And what you're saying sounds like you don't even have the chance to step up to the crease, so there's no point even playing that game. Yeah, yeah. Hey, look, you've been doing this for a while now, and I don't wanna keep bringing you back to that, but 31 years, it's been an amazing stretch and you've seen a lot. What has it been that you've enjoyed the most in that time about being a UX researcher?
- David Travis:
- I think it's coaching and training other UX researchers. Cause when I started in the field, there really weren't that many people doing, well, what was called human factors or usability back then. Basically I knew everybody in the UK that worked in the field; there'd be a small number of people that were doing that kind of work. And there's been an absolute explosion in people that work in the field these days. And as a consequence, a lot of the people that come into the field don't have that kind of base level of knowledge that I was talking about earlier. And what I've found most satisfying is helping those people, kind of seeing the scales fall away from before their eyes, and get the insights and be able to start on a career. Because I feel that that's the way, if I'm gonna leave a legacy of any kind, the way I'm gonna do that. It's not gonna be with a book, it's not gonna be with a [inaudible] course. It's gonna be by having a tribe of people that I've managed to convince that this is the right way to do it, who then go out into the world and make those things happen. So that's the way you get leverage, you know, you get other people to apply the ideas. So for me, what I find most satisfying is coaching people and seeing them thrive and seeing them being able to make that difference in the world.
- Brendan Jarvis:
- Are you up for playing a quick game?
- David Travis:
- Fire away.
- Brendan Jarvis:
- Excellent. So it's called, what's the first word that comes to mind? So I'm gonna say a word and you're gonna give me your first word that comes to mind. Shall we do it?
- David Travis:
- Okay. So it sounds like a game a bad psychologist would play. So I hope you're not trying to read my mind.
- Brendan Jarvis:
- Well, having zero qualifications in psychology, it probably would be quite like that bad psychologist game. So the first word is UX research.
- David Travis:
- What comes into my mind when I think of UX research, well actually it would be user research. And the reason for that is because I struggle with the difference between those terms. So I tend to use UX research and user research interchangeably; I'm sure I've done that a few times during our conversation today as well. I tend to come down on the side of UX research rather than user research, because I think user research sounds too small, it sounds too focused, because often the research that we do in UX research is not just about users, it's also about the business, it's also about understanding stakeholders and so on. And although user researchers do that as well, user researcher sounds like you only do research with users, whereas as a UX researcher, you do research on anything that affects the user experience. So I would say user research is the first thing that comes into my mind when you say UX research.
- Brendan Jarvis:
- Excellent. Product management.
- David Travis:
- Agile will be the first thing that comes into my mind when I think of product management.
- Brendan Jarvis:
- And the last one is opinions
- David Travis:
- Misunderstood, I think, is the word that comes to mind. So let me qualify that. What I mean by misunderstood is people misunderstand the importance of opinions. So of course opinions matter. Of course we want people to like the products that we design, but it's not the most important value that we should have. At least equally, and I would say more, important is that people can do the things with our products that they want to do. If you go back, I don't know, 20 years, 30 years, then maybe a lot more was to do with opinion. So if I went into an electronics store 30 years ago to buy a VCR, what mattered was how the VCR looked, and whether it was attractive or not. Those were the things that mattered to me. The fact that I then got it home and couldn't change the clock, so it was continually flashing 12 o'clock for the rest of its life in my living room, well, it was too late by that time, because I'd actually bought the product.
- But with digital products it's different. People are using these things every day, and if they don't like it, then they'll go somewhere else. The costs of leaving are much smaller with digital products than with physical products. So as well as thinking about whether or not people like it, focusing on their opinions, we need to focus on their behavior. We need to find out, well, what makes 'em successful? How long do they take? What roadblocks do they encounter when they try and do common tasks? So I think it's a common mistake that some people, maybe people that aren't in the field, but also some people that are in the field, think it's about soliciting opinions, when really that's just one small part of the puzzle.
- Brendan Jarvis:
- When thinking about the future in the next 30 years, what is your greatest hope for this field that we both work in?
- David Travis:
- That the pandemic gets resolved, let's say, within a year's time, and that people can get back to doing face to face UX research with the real users.
- Brendan Jarvis:
- So important. Well, David, look, I've really, really enjoyed today's conversation. Thank you for being so generous with sharing your experience and your insights. You've shared so many great stories today and I'm sure it's gonna help the people that are watching this be better UX researchers and build better products
- David Travis:
- No worries, Brendan. I'm glad it was useful. I hope everybody watching enjoyed it.
- Brendan Jarvis:
- Wonderful. And thanks everyone for tuning in. Everything we've covered today will be in the show notes, including where to find David, David's YouTube channel, his Udemy courses, a link to where you can find Think Like a UX Researcher, and all of the other resources that we've mentioned. If you enjoy the show, please remember to subscribe and comment on the video. And until next time, keep being brave.