Laura Klein
Maximising the Value of Lean UX
In this episode of Brave UX, Laura Klein reflects on the many important lessons she's learned from over 20 years of working in UX and product management at Silicon Valley tech startups.
Highlights include:
- Why do organisations still seem to resist UX research?
- What needs to be in place to maximise the value of UX?
- How did Eric Ries (author of The Lean Startup) shape your thinking?
- Do teams have to be happy to make great products?
- What do UX researchers need to understand about product managers?
Who is Laura Klein?
Laura is a UX Manager at Indeed.com and the Principal of Users Know. She is also one of the most well-known personalities in the worlds of UX and product. During her career she has worked as a UX designer, product leader and engineer.
Laura is the author of “UX for Lean Startups” and “Build Better Products”, the co-host of the “What Is Wrong With UX” podcast and the host of the "What is Wrong with Hiring" podcast.
Her books and content have had a profound impact on the way that product people approach creating value for users and their organisations.
Transcript
- Brendan Jarvis:
- Hello and welcome to another episode of Brave UX. I'm Brendan Jarvis, Managing Founder of The Space InBetween, and it's my job to help you to put the pieces of the product puzzle together. I do that by unpacking the learning stories and expert advice of world-class UX design and product management professionals. My guest today is Laura Klein, one of the most well-known personalities in the worlds of product and UX, drawing on her 20-plus years in the West Coast technology scene. As the Principal of Users Know, Laura helps companies to build better products through world-class advisory, training and coaching in product management, user research and UX design. She shares a particular skill and passion for helping startups as they search for product-market fit. In fact, it was largely her time at IMVU, a company co-founded by the author of The Lean Startup, Eric Ries, that inspired her to condense what she'd learnt into a book that shows startups how to use UX to get to fit faster.
- It's called UX For Lean Startups, and it's been widely praised for its witty, punchy practicality. If you've read it, it's easy to see why it's fast become a classic. It's just so very helpful. Laura is also the author of Build Better Products, an excellent field guide for product people who want to unlock their team's potential to make smarter and more successful design decisions, especially those working in larger and more established businesses. Both of her books have had a profound impact on the way that product people all over the world go about creating value for users and their organizations. Laura has spoken at many conferences and events, including the Lean Startup Conference and Mind the Product. She's the co-host of What Is Wrong with UX, with Kate Rutter, a podcast where, and I'm quoting now, two old ladies yell at each other about how to make products suck slightly less. While she might describe herself as an angry old lady, I prefer wise provocateur, and it's my pleasure to be speaking with her on Brave UX today. Laura, welcome to the show.
- Laura Klein:
- Thanks for having me. I should have you write all of my bios, I tell you. That was awesome. Thanks.
- Brendan Jarvis:
- It's great having you here today. And one of the things that really stood out to me when I was preparing for today's conversation was looking at your bio and your work history, and you've done a lot of things in your career. You've been a researcher, an engineer, a UX designer, and a product manager, as well as a consultant. And it got me wondering, do you get bored easily?
- Laura Klein:
- [laugh]? Yes, that's my typical line, is that I just get bored easily, and I do. And also, with the exception of engineering on that list, and to be very clear, I was always a front-end engineer and I was never really that good at it, I will say that the UX design, the product, the strategy, all of that stuff, when you have done it for long enough and you kind of get to the high enough levels, it's really, really similar. We think about the same things. We are all trying to just find the right product to build for the users and for the company and hopefully for the world. And those questions become very similar to one another. And I honestly have trouble with folks who split them up too much and say, oh no, I only care about this one little area of things. Like, okay, but we all need to figure all of this stuff out together. So [laugh] we're all kind of asking really similar questions at a certain point. And I mean, obviously people specialize in one of those areas, and that's fantastic and we need those folks, but we all also need to understand what all of those different specialties do and how they contribute to making good products. And so yeah, I haven't found them that different when you really get into the decisions that you're making.
- Brendan Jarvis:
- And I picked that up from the forewords and the openings of your books. You're very clear in there that while UX for Lean Startups might suggest that it's just for UXers, it's really not. It was a book for entrepreneurs, or anyone that was interested in getting to product-market fit faster. And it's interesting also to look at other leaders in the field. Say, for example, Marty Cagan in his book Inspired, he also includes techniques there for understanding users as a core part of being a product manager. So really these lines aren't as clearly drawn as they would appear to be in some organizations.
- Laura Klein:
- If you're a product person and you're not interviewing users and understanding users and finding ways to get in touch with users, I think you're a very poor product person. And if you're a designer and you don't understand how your company makes money, and you don't understand things like who's using the product and how they're using it, I think that you're gonna be much less effective at most types of actual user experience design. And I think that if you're making any kind of strategy decisions and you don't care about who your users are, you're gonna make terrible decisions, or if you don't understand what the engineering implications of things are, you're gonna be garbage at it. So it's not that we all have to be perfect at it. It's not that we all have to know everything. This is not gonna turn into designers must code, cuz I think that's nonsense. But I think we need to understand each other better. And I think that there's more overlap than people sometimes believe there is.
- Brendan Jarvis:
- [affirmative]. And that's interesting, and maybe this is a bit of a can of worms to open, I suppose it depends on who you speak with, but you mentioned there that it might be ridiculous to expect designers to code, yet there seems to be an expectation out there for everybody else in the product team, outside of the user researcher, to do user research. Why does it go that way and not the other way?
- Laura Klein:
- I think that there is a very strong distinction between "do user research" and "find ways to understand your user". Those are two slightly different things. The best user researchers that I know are extremely experienced in lots and lots and lots of different specialized methodologies for really understanding different things, either qualitative or quantitative and sometimes both, and how to put those together, and devising studies. And they're very well informed about all kinds of things, and they're also very good at helping other people get the information that they need. So one of the reasons why we often say, you know, you need to be out talking to users, is that I think that just waiting on that report from a user researcher is not a great strategy. Having an expert come in and help you figure out how to answer the questions that you need answered, and helping you set up an ethical and well-designed study that will actually answer the questions that you want answered, and helping you to refine that, yeah, you should absolutely do that.
- And being there, and even learning to moderate, I think a lot of people could do that. So there's a range of things. I think one of the reasons that people often say, yeah, product managers need to be out there doing it themselves, is, I mean, frankly, we have a shortage of great user researchers. And often if you do bring 'em in, then there is the tendency to just turn over the whole thing to them, and then have them turn over the slide deck, which then everybody ignores. And that's just not a good strategy. [laugh], I would actually rather have people doing their own bad research than that. But what I really want is, get experts in to help you figure out how to do the right thing, and then have them help you do it, so that everybody on the team learns. I like engineers watching real user research. I don't necessarily want them planning the study, unless that's a thing they're an expert in too, but I sure want them watching actual people. I want them hearing directly from the user what we're doing to them. [laugh], I mean, what are we inflicting on people? [laugh], take a look.
- Brendan Jarvis:
- [laugh], yeah, let's get real. Let's get it in front of some real people. Yeah, it is a really important part of the product process, and opening up those opportunities for the rest of the product team is important too. You spoke just a second ago about the reason why we have this push for people other than a specific user researcher to get involved in that research, which is because of a lack of supply of qualified researchers, and that we do need this work. Is that a result of people awakening to the things that you and some of your contemporaries have been going on about for quite a long time, about the value of involving the user in the product design process, or something else?
- Laura Klein:
- I mean, I wanna take total credit for it. I think that's probably, I'm gonna say that's slightly unrealistic. Okay, no, it's wildly unrealistic, obviously. But I will say this. In the 20 or so years that I have been doing some form of user experience design, I have been going out and talking to teams and companies, sometimes as a consultant, sometimes in house. I will say that where I am in Silicon Valley, and the kinds of teams that I talk to, which tend to be more technical, I get much less pushback now than I did 10 years ago, especially 15 years ago, about we need to talk to users. I remember in the early aughts, when you'd go out and talk to people, you always had to start with, this is what user research is, and this is why we're gonna do it, before we start drawing pictures for you.
- And they're like, well [laugh], I mean, not a hundred percent of the time. And some of the companies were like, yeah, we know, that's why we hired you. But there was more of that, and now there is less of that. And I'm not gonna say that there is none of that, because there absolutely is some of that. And I run into it every single time I give a talk to any user researchers, and the first question that gets asked is, hey, how do I get people to take me seriously and actually listen to the things that I say? And it just makes me super sad, because I know that user researchers, we may have gotten to the point where now user researchers get hired and then ignored, whereas before they didn't know the need for them and it was just engineers building stuff. And I mean, this is all in the nineties and early aughts, like I said.
- So it's getting better. I think people understand the need for user research. I think there are obviously wonderful practitioners out there who are doing it, who get it. There are obviously lots of great product people who get the need for it, and who can actually do it themselves, or who know when to bring in an expert. And I think that's not the majority of people yet. I don't know if it will ever be the majority of people. Why do we ignore this? I have my own hypotheses about that [laugh]. Especially in the startup world, many people approach things, what I call idea first, where I have an idea, I have a great idea, it's gonna be jobs for pets, and that's the thing that they wanna build. And I actually own the domain name jobsforpets.com, because why wouldn't I? It's a great idea.
- We're gonna get jobs for pets, and then I've got this idea, and then if I've got some person coming in and going, well, not pets, pets make terrible accountants, then I mean, I maybe don't wanna hear that, because I wanna build the thing that I wanna build. And we get a lot of that, and then we get a lot of founders being idolized for having that brilliant vision up front and never swerving from it. And user research just gets in the way. So there's a lot of that. And then there's a lot of, we don't really know how to incorporate them into the product development process. And a lot of people have actually had bad experiences trying to do user research. But I went out and I asked people what they wanted, and they said they wanted jobs for pets, and then I built it, and then nobody used it. And all the user researchers who are listening to this just got extremely angry, cuz they're like, it's not about just asking people what they want. And so anyway, all of those idea-first reasons, all of the reasons for we don't have enough, and the ones that we do have, we don't listen to enough. And there is this push to like, well, you should do your own user research. I mean, to a certain extent, yeah, that's better than not [laugh], better than doing nothing.
- Brendan Jarvis:
- Yeah, it certainly is. I mean, it sounds like you feel that things are trending in the right direction, but there's still a bit of a way to go. And part of that you'd also touched on earlier, which was the time it takes, and the fact that the reports, and we all laugh about this, but the reports don't get read. And to me that seems like we have quite a clear understanding of the parameters around the product delivery cadence. There's ceremonies, there's rituals, there's the two-week sprints, there's the release cycle that you're going to production with, whatever it may be, daily, weekly, monthly, yearly. But when it comes to discovery, it's almost like that part is somewhat forgotten. There's not enough structure around it to make the research usable and integrate it well with delivery. What have you observed in your experience working and consulting with these organizations?
- Laura Klein:
- The funny thing is that I'm currently writing a course on designing and researching for agile teams. And so I've done a tremendous amount of research about this just recently. And also, if you're interested in how user research can fit in better to agile teams, and everything you mentioned there was a symptom of agile [laugh], you should look at the work of Teresa Torres, who talks about continuous discovery, and that's all really good. And Cindy Alvarez writes a lot about, well, she doesn't write as much anymore, but she did write a great book on customer development, and like that, how to figure out product-market fit and how you use user research and customer interactions to do that sort of thing. So there are a lot of people who actually are thinking about this and talking about this, so I don't wanna say I have all the answers. But what I will say is that the problem is that most people who are doing things like sprints and scrums and retros and the whole thing are only doing it for their engineers. Or, if they're doing it and including their designers and researchers, they're adding a sprint zero, because of course every research and design project can be time-boxed to two weeks [laugh].
- I mean, there's no concept that maybe a diary study might go a little bit longer than that, or might be shorter than that. No, it's exactly two weeks, that's what it is, that's what I heard, that's how long we need to timebox it, and we'll know that ahead of time. All of that is absolutely nonsense. And if you cut that into a clip and just show that, we will have words [laugh]. Lockdown won't last forever, really.
- Brendan Jarvis:
- [laugh] I may very well do that. But I promise, I promise I won't.
- Laura Klein:
- It's really important to understand that you need to approach discovery, which is [laugh] a funny word that covers a lot of different topics. I mean, there's discovery for, is this the right product? There's discovery for usability stuff. There's, what should I build next? Is this the right thing to build next? Is this the right way to build it? There's all of these different parts of user research, and we've all kind of shoved them into this concept of discovery, because that's what it was called in waterfall. And we did it all up front for a brand new feature, or sorry, for a brand new version of the product, right? Oh, we're gonna redo the whole product, so we're gonna go and we're gonna do six weeks of discovery. And that's how that project's gonna be run, and everything has fallen out from there. And that's not how you work on an agile team.
- On an agile team, you need to be doing continuous discovery, where you're constantly talking to people and constantly running experiments and constantly getting feedback on things, and you're making it very easy to get that quick feedback. Most people don't work on products from scratch. Most of us work on existing products and just improve them incrementally all the time. And that is a very different kind of research and discovery in many cases, especially on a specific feature, than when we're building a whole new thing that's gonna be wildly innovative and completely different and completely from scratch. And so you approach those very differently. And so what we need to do is we need to get out of this discovery period and figure out ways to integrate research constantly into the design and development process. And that might mean involving engineers earlier. It might mean involving engineers in things like figuring out the metrics for things.
- And it might involve things like designing little pieces of things and experimenting with them and getting them in front of people, and then getting feedback, and then building something else. Or it might involve us all taking a little time and really doing some in-depth research with people to understand their needs, because why would we have the engineers working on building something if we don't know that's the right thing? So these are all different modes, and we call them all user research or discovery or whatever. And we just have to stop. We don't have to, we can keep doing it, but I think it's not helpful.
- Brendan Jarvis:
- Yeah. Yeah, I mean, I suppose what it seems to me that it's all getting at is a sort of search for certainty. People want to know that they're gonna have an output at the end of whatever the period is that you've allocated for the work, regardless of whether or not that's realistic to expect. But it also seems that it comes back down to the product team understanding what are the right questions to be asking, and then how do you go about answering them, so you can get some reality into the situation in terms of timing. And they're very different, I think you touched on this earlier, very different types of research. I mean, running a usability test on an existing product or on a potential new feature is very different to, like you mentioned, going and doing a diary study and really searching for a new problem to solve. Those are wildly different skills.
- Laura Klein:
- And people do a lot of, and I mean, I used to do this a million years ago, and I kind of hate that I used to do it, but a lot of people do prototype testing and then try to extrapolate from that whether somebody will use the product. And that's not what prototype testing does. A prototype test will not tell you whether somebody will use the product. The prototype test will, generally speaking, tell you if somebody could use the product in the way that you have asked them to try to use the product in the prototype test [laugh]. Now, there are other
- Brendan Jarvis:
- Definitions, it's an abstraction.
- Laura Klein:
- Yeah. And there are other definitions of prototype that I don't wanna get into, but I'm talking about the traditional, I'm gonna show you a collection of screens that you're gonna click or swipe through. And I'm like, okay, great. Yeah, no, that's fantastic. That's a great usability test. And it'll absolutely tell you if people can do the thing that you are asking them to do in the test, and it will tell you literally nothing else, other than maybe, oh, this is wildly confusing, or, I don't even understand what I'm supposed to do. But it won't tell you [affirmative] that they're gonna do it. So that's a really important thing to understand. You need a different type of experimentation and interview technique and study setup to do that kind of thing. And people don't always get that, and they don't like doing it, because it's so much easier, it's more fun, to just build a prototype and then show it to people, and then everybody goes, ooh, look, it's so pretty. And then they build the wrong thing.
- Brendan Jarvis:
- Yeah, I mean, you've talked about this in one of your talks that I was researching for our conversation today. I think one of the techniques that you spoke at length about was the fake door, which is one way that you can test demand for a feature before you actually commit resources to prototyping it out. There are different techniques, I suppose, outside of the norm of what people think of as user research, to understand usability but also desirability of whatever it is that you're creating.
- Laura Klein:
- Yeah, I mean, these days I prefer concierge tests, just so you know. The fake door only works if you have a product and it's got a front end and you've got enough traffic. I think concierge works really well for that as well. And it works with a much smaller group of people, and you're less likely to irritate people, and you actually get more detail, but it's also harder. So [laugh],
- Brendan Jarvis:
- For people that are listening that don't know what a concierge test is, could you just give us a bit of a description of how that works, what it is?
- Laura Klein:
- Sure thing. Yeah, you have to do stuff by hand. That's it. That's the whole thing. You offer a service, and you can offer it through whatever medium you wanna offer it through. But if it's, we're gonna get people jobs, or, let's just use jobs for pets, that's my favorite one, I have a zillion examples for that. We're gonna get jobs for pets, and we have this new system where we're gonna set up pet profiles so that people can browse pet profiles and find the right border collie to herd their sheep. And actually, once you start using this analogy, it is remarkable how many people have pets that actually have jobs [laugh]. So we're gonna do this. And so the first step isn't necessarily to set it all up and automate it and have this great onboarding system so that border collies can come in and register themselves. That's probably not the first step. The first step is to see if you can actually find people who want to hire border collies, and find people who want to rent out their border collies, whose border collies have that skill, and see if you can put them together yourself, like a border collie matchmaker. And if you can do that, and if there's a lot of demand for that, then you can start to automate it. And if there isn't, then putting a website up is probably not gonna create a lot of demand for that.
- Brendan Jarvis:
- Yeah, yeah. It's such a wise way to approach things, and I believe Airbnb was one of the case studies for doing that. They were actually going around and taking photos of their hosts' houses, because they realized that the quality of the photos that the hosts were posting was terrible, and that they could actually increase conversion by going out and putting up better photos themselves. So
- Laura Klein:
- Their hypothesis was, they were like, oh, these are terrible photos. We believe it to be true that if we had better photos, we would have better conversion. Now there are a bunch of things that you have to validate. You have to say, A, is that true? And B, is that cost-effective? And C, is that a thing that we can get people to do with more guidance, or is that a thing that we have to do for them, or is that a service that we could offer them? And so there's a whole bunch of steps there. And yeah, the first step, they were like, can we just come take better pictures of your house [laugh]? Is that okay? And then we're just gonna see if you rent it more. And then they did. And so then that was a service they offered.
- Brendan Jarvis:
- Anyone that looks at your body of work, whether it's your podcast or your posts or the books that you've written, and I mentioned this in your introduction, can see you have a real passion and also a skill for helping startups to solve these kinds of early-stage questions or problems that they're facing, with the least possible wastage. What has it been in your career that has made that such a focus, or one of the focuses, of your energy? You've put a lot of energy into helping people solve this problem. What is it about this problem?
- Laura Klein:
- That's a great question. I'll be honest, I think we screwed a bunch of stuff up in the nineties, especially that whole lack-of-a-business-model thing. That was a real problem, a corner we painted ourselves into. Didn't mean to, sorry about that. But I worked with startups then, and I sort of saw people struggling to figure this stuff out. And the thing that made me a little sad, I will say this, is that over the 20, 25 years, whatever it is, I don't know, 8 million decades that I've been doing this, I saw a lot of people make the same mistakes over and over and over. I mean, not necessarily the same ones we were making in the nineties, although some of those are coming back in style [laugh]. It's not like none of this social media stuff happened on bulletin boards [laugh], just letting people know. So a lot of this stuff, they're just making the same mistakes, and it always kind of made me sad, cuz I'm like, I want people to make new and more interesting mistakes. And so I try to help them get over the obvious stuff and not just build a bunch of crap that nobody uses or nobody wants, cuz it's just such a waste of time and intelligence. And we could have flying cars. I don't actually want flying cars, but, I don't know, renewable power, I want better batteries. I don't know, somebody build me some important stuff [laugh].
- Brendan Jarvis:
- And I wonder, from your time working at IMVU with Eric Ries, it seemed to me that was a bit of a pivotal point in your thinking, or at least your desire shortly thereafter to express that thinking and help solve this problem, or these repeatable problems, for people with the book. What was it about that time working with Eric at IMVU that shaped your thinking on UX and product?
- Laura Klein:
- That's when I started to use metrics. So I had been an engineer, like I said, front end, not very good, back in the nineties. And then I had switched to UX design, and I'd always done some research, and I did more research and all that stuff. So I did that in the early two thousands. And I had done that for a while and got a really good grounding from the place I was working in really solid, what we call interaction design, but also a real grounding in user research. And we all did the user research ourselves. So doing the user research, doing the interaction design, doing the prototyping, doing the iteration, understanding users, interviewing stakeholders, the whole user-centered design process. And I loved it, and it was fantastic, and we built a lot of cool stuff.
- And then I got to IMVU because I wanted to do some programming again, cuz I started to miss it, for some reason that I honestly cannot even imagine why that was. But it was. And I kind of missed programming, and I wanted to do a little bit of both. And at IMVU I got to do both. I got to do all three, I got to do research and design, and also I got to ship code, which was fun. And they had this really strong culture of testing and experimenting, and I got to learn about A/B testing and multivariate testing and looking at metrics and understanding how experiments work and how they don't work. And I got to see a lot of, well, we screwed a lot of things up, and we looked at things like the conversion funnel, and why doing this thing here makes this conversion funnel better, and what makes it worse.
- And coming up with better hypotheses about this, and better ways to test those hypotheses. And I actually got to do that with a big enough sample size, and it changed the way that I think about design, because I had always kind of approached design as, if we understand the user's problem, then we can come up with a great solution, and then as long as they can use it, it'll work. And what we saw was that we don't always come up with the right solution the first time, and sometimes there are little pieces of it that are better or worse than others, and sometimes we can tweak something. And that was the other thing, is that I really got into this idea that there really were things that you could tweak that would make the experience just much, much, much better for users. And sometimes the best things we could do were the smallest things, and other times we actually had to dig in and understand, what's this giant thing that we have to do, and let's break it down into pieces and let's get it out to people.
- And it just wasn't a way of designing that I had done before, and it was great. I strongly recommend it to anybody who still wants to just design and then not worry about it. I also got to work directly with a lot of really unbelievably smart, fantastic engineers. Do not tell them I said that. I will never live that down [laugh]. I'm still really good friends with a bunch of 'em, though. I'm actually married to one of them [laugh], and especially don't tell him. But yeah, I got to work directly with these unbelievably smart engineers who could help think through the different experiments, and they cared really deeply about the users, and a lot of them went and talked to users themselves. And it was just such a good experience, working sort of much more closely with the engineering department, and not looking at deliverables that I chuck over the wall to people in a deck that I know they're gonna ignore half of. I designed a lot of really nice stuff, some of which never made it out into the world, and that made me sad.
- Brendan Jarvis:
- Yeah, that's always a deflating experience. And it's interesting, you mentioned the culture that, when you arrived at IMVU, was really supportive of getting design in front of the user. You've even mentioned there that engineers really cared about the users, went in and spoke with users, were really involved in that process. And that really gets me wondering, what is it, outside of the practicalities that I feel like our community has quite a good handle on, or at least they have some great resources, like the books that you and others have written, as to how we should be approaching the process of discovery and delivery. What are the cultural or structural, or potentially both, elements that need to be in place to really enable that to deliver value for the organization?
- Laura Klein:
- I think on the engineering side of it specifically, you have to have a culture where, I'm gonna use the most overused jargon right now, we're looking at outcomes over outputs. [laugh], I apologize for that, but it's actually a great way of expressing it. Where engineers are actually held to account, I don't wanna say held to account, cuz I don't wanna blame people for it, but where they care about the thing that they're making, and more specifically, they care about the way that the person on the other end experiences it. The engineers have to care, and they have to not be punished for caring. There are lots of teams where engineers are punished for doing anything other than just crossing tickets off the list. And if all you're doing is trying to tear through Jira tickets as fast as you possibly can, you can't say, I need to take the time to really understand this user story.
- I need to really understand the user. I think this is maybe the wrong way to put it, but engineers are generally speaking bright people who are good at problem solving. Some of them are great with people, I'm not gonna stereotype all of them as being hermits. They're not all like me. Some of them are actually good with people [laugh] and some of them aren't, and that's fine too. But overall their job is often problem solving. And so exposing them to the actual problem that they are solving often has wonderful results for the outcome, and they can often help figure out what is the right way to solve this for that person. But they have to be exposed to those people. They can't just have those people in their heads, cuz then they'll just start building. If it's all just tickets, then it's all just abstract, and they don't know what it means on the other end. And they can't make decisions like, oh, if I make this decision, it'll be a better experience for the user. And if I don't, I mean, what are they gonna do? They're gonna make decisions based on what's easier for them. Cuz I mean, whom amongst us [laugh], if all we care about is pulling tickets off.
- Brendan Jarvis:
- Yeah, I mean it's that real hard-nosed delivery culture, I suppose. It sounds like what you're saying is that we need to lift people out of an abstract understanding of the user and actually make that a bit more tangible and real for them. So if you're in a product team or a product organization, or even an organization that also builds products, maybe one of the things that you can consider, if you're listening to this, is actually trying to find ways to expose or involve your engineering team and the wider team, including your stakeholders, in this process of putting the work in front of users and seeing what happens.
- Laura Klein:
- Yeah, it's remarkable to me how few people know how the results of their actions affect the end user, whether that's engineers or marketing people. I mean, sales people do, cuz they're actually out talking to people, but designers, product people, they don't see the end result. And if you don't have that end result, then you don't have the feedback of, oh, that decision I made was a good one, or that decision I made was a nightmare [laugh]. And so if you don't have that, then you can't make better decisions in the future. You just put things out and you're like, yeah, no, it went great. And I believe that to be true because I was judged on whether it shipped, not on whether it made someone happy. Right.
- Brendan Jarvis:
- [affirmative]. Yeah, it's tricky though as well. I mean, I was running a usability test the other week, and in the debrief in between one of the sessions, one of the stakeholders literally said there were no issues with this form. And it's almost like, well, there definitely were [laugh].
- Laura Klein:
- Wow, strong belief. Yeah, were you watching the same sessions that I was?
- Brendan Jarvis:
- [laugh] It was like we were in parallel universes, was it not? It was quite eye-opening, but it was also a good reminder, I think, that you can still do the doing and the practices, and you can open the process up, and you definitely should be doing that, putting things in front of users and involving stakeholders. But in some cases you're gonna have a bit of a longer and harder challenge on your hands, because you really do have to, and I don't wanna use the word convince, because it's very difficult to convince people to believe things that they don't already believe, but you do have to find another way sometimes of shifting those mindsets of, no, everything's fine, we don't really need to be doing this, no problem to see here, let's just use quant methods or do something on production and see how it goes, instead of doing this kind of work.
- Laura Klein:
- If quant methods would convince them, I'm more than happy to use quant methods. Part of it is that we do sometimes need to be a little bit flexible. And one of my favorite questions, when somebody's like, well, this is absolutely how things are and this is how I believe they will be, one of my favorite questions is, well, what evidence would you have to see to make you change that belief? [affirmative], right?
- Brendan Jarvis:
- I love it.
- Laura Klein:
- What would I have to show you to make you think that maybe that wasn't the correct thing? And if you can get that in writing. So here's the problem: there are people who just won't change their minds. And this is a thing that I have run into, and like I said, I'm not terribly good with people. I'm very bad at convincing people of things, which seems like a terrible trait for somebody who's in consulting, but I just picked my [laugh] clients very well. I pick the ones who want to change. [laugh] But absolutely there's something
- Brendan Jarvis:
- In there though,
- Laura Klein:
- Right? Yeah, no, there absolutely is, right? You can sometimes run into people, and I don't think there are a lot of them or a big percentage of people, but there are absolutely people who you can show the actual usability test, and they can watch the whole thing, and you don't understand if they were watching the same thing that you were, because that person was crying and they're like, no, all good, looked good, everything worked the way I wanted it to. And you're like, oh, are they supposed to be weeping on the ground? Okay, I didn't realize that that was one of the things that we were going for. I actually saw somebody weep in a usability test. It was very uncomfortable. No, it was terrible [laugh]. Not my product, I did not build it. So that's a real issue. I think it's not as common as you'd think. Those times really stick out for me. But way more often what I've run into is people starting to do this and saying, wow, we learned so much even from just this one thing. Do they always learn exactly the right thing? No, but I mean, they keep doing it and they keep learning and they keep getting better.
- Brendan Jarvis:
- Yeah, well that's the thing, isn't it? I mean, no one starts out as an expert. You've actually gotta put the time in. When we're talking about these people that refuse to accept what is, I won't say reality, maybe I'll be so bold as to say the objective truth, if they're unwilling to, then you really do need to move on and find other people that you might be able to get some support from. And this comes back to what I've observed as an unwillingness, sometimes a resistance, sometimes a straight-out denial, to be wrong. Whether it's our education system, or whether it's the way that we incentivize people in organizations, or a combination of many things, people don't like to be wrong. It takes a very brave person to admit that they may not be right about something. And this to me has been one of the central focuses of your work, which is the validation of assumptions. I wanted to talk to you about this because the way in which people express their assumptions is quite important to the way in which they interrogate them, and therefore how they interpret their results. How do you go about helping teams to frame their assumptions and then go about validating or invalidating them?
- Laura Klein:
- My goal is helping people understand and predict what little invisible things they're just assuming to be true, and helping them figure out if they're right or wrong. And also helping them figure out, if they're wrong about it, what's gonna happen. Because those are the two really important things. You don't need to validate or invalidate every single assumption. Some of your assumptions might be wrong, and that might be fine. We have to make certain assumptions all the time, and some of them are going to be incorrect. And for most of us, life will go on and it won't be a major thing. The trick is, though, to figure out which ones are likely to be very, very, very bad if they're wrong. It's not that hard to come up with experiments to validate or invalidate some of the assumptions. Some of them are literally impossible, [laugh] I've found.
- And you actually have to build a fairly big thing. There are certain assumptions that we make that are very easily validated with some simple user research. I believe it to be true that all of my users will have internet access all the time when they are using this product. Well, that's actually pretty easy, it turns out, to validate or invalidate. You just have to figure out what the context of use is likely to be, and then go and follow people around and understand what their life is actually like. And you can pretty quickly start to see, like, oh yes, that's true, except when they're on the subway, or yes, that's true, except when there's a power outage. And those might be things where you're like, that's actually fine, this is not a thing that is meant to be used on the subway. Or, this is meant to be used in secure government facilities.
- So we have a slightly different risk assessment about that particular assumption. Other ones are much, much harder. Like I said, the one like, will they use this thing in the way I predict? That's a tough one to actually validate or invalidate. All you can really do is de-risk that, right? You can mitigate that risk. But I don't think you ever really know until it's out there in the world, because there are so many things that affect it. There's marketing, there's sales, there's the environment into which you launch it. I mean, a thing that might have been true six months ago might no longer be true. I think the thing that we learned from 2020 [laugh], well, we learned a lot of things, but one of the things that we learned was never assume anything. I mean, everything could change all of a sudden and we just gotta roll with it.
- So sometimes the worst case does happen, and there was something that you didn't plan for, but maybe you plan for it next time. I think for me the most important part of all of this, because you're right, it is very hard to convince people that they're wrong about some of these assumptions, because nobody likes to be wrong. It's even hard to get people to admit that they were wrong even when it was completely obvious. Which is why I like the hypothesis tracker, which I have in my book Build Better Products, where you actually write down, it's not that complicated, it's a spreadsheet, you actually write down your prediction, and write down what you expect to happen and by when, and then you have to go back and check it. And then I also like to write down, why did I believe that thing?
- That's actually a really important thing. I think X is gonna happen by Y date, and here's what I believe, and here's all the research that went into helping determine that. And then what I can do is I can go back and go, oh, that was very, very wrong. And what I've noticed about all of my very, very wrong predictions is that they all came from the same source of research, which was pulled outta my butt or whatever it was [laugh]. It was absolutely no research whatsoever. So that's a really important thing to have written down. It also forces you to confront the fact that that's actually what you believed previously, and you don't believe it anymore, because it's real easy to forget what you believed in the past. The other really important thing is postmortems on failures. So when the thing that you believed to be true, your stakeholder who was like, everything's fine, okay, great, fantastic [laugh], when you ship the product and it inevitably fails for all of the reasons that you pointed out it was going to [laugh], having a postmortem, a blameless postmortem, unfortunately in that case, that says, why didn't we see the things we should have seen? Why didn't we do the things that we needed to do to mitigate that risk? And it's not, well, it's Bob's fault because he just didn't see reality. What could the system have done better to catch that problem, besides fire Bob, which might also be necessary?
- Brendan Jarvis:
- I don't know. Yeah, it's that discipline to actually take the time to reflect on what the outcome has been, whether it's positive or negative. I mean, we can't just assume, and I think 2020, you touched on this, you can't assume that every year is gonna be better than the one before. You do have to roll with the punches and you do have to look back. You definitely have to look back.
- Laura Klein:
- What could I have done differently? What could we collectively have done differently to have made this less of a shit show? Yeah, yeah. [affirmative],
- Brendan Jarvis:
- [laugh] a hundred percent. The identification and the prioritization of these assumptions, and this is a closed question, so this isn't a user research interview, but is this the heart of an effective user research program: actually knowing what your assumptions are and then knowing which order you're gonna investigate them in?
- Laura Klein:
- No, I don't know. I wouldn't necessarily call it the heart of user research. I would call it sort of the heart of making any kind of product decision: realizing that everything is an assumption, and that we should just be writing down our predictions and trying to get better at making predictions. And if we continue to be bad at them, figuring out ways we could be better, probably through user research. And so, I'm gonna be more general and vague: finding the right questions to ask is a huge part of good user research, because once you have the right questions to ask, I think finding the right methodology to answer them is easier. It's not easy, you still need experts, but it is easier. But finding, what is the thing I wanna know, and why do I wanna know it, and how would my behavior change if I had an answer to this? Because if the answer is it wouldn't, then why bother?
- Brendan Jarvis:
- Yeah. And also, if it doesn't add value to the business and the users, why would you bother trying to investigate it?
- Laura Klein:
- Yeah, I mean, if you're just doing it for your own voyeurism, I don't know, sure, whatever [laugh]. But yeah, we're not in the business just cuz we love learning about people. Although some people are, and that's fine. No, it's, how am I gonna behave differently, and why do I think that that way would be better? What decision am I trying to make that this will help me make? That's hard. Those are hard questions to answer. I don't think anybody gets them right all the time.
- Brendan Jarvis:
- I mean, thinking about the teams that you've consulted to and the organizations that you've worked in as well, maybe as an employee, what do the senior business stakeholders outside of the immediate product team, or even the product organization, really need to know about their product teams that's gonna help them to create better products?
- Laura Klein:
- Details matter for strategy. And this bothers me a lot. I've worked with a lot of teams that think that strategy is saying things like, add AI to it, and I won't work on teams like that anymore, because it's just wildly frustrating. I need them to be giving product teams metrics to hit, and reasonable ones. And they need to set it up in such a way that the teams have the autonomy and the ability and the personnel to do those things. And they need to understand that strategy is about saying, we need to improve this KPI, or we need to improve that KPI, and it's not, we need to add AI to it. And if it is, we need to add AI to it, I need to know what the hell that means to you and why you would wanna do it.
- Brendan Jarvis:
- It sounds like you want them to articulate the outcome, but there's also some degree of detail needed. And this is the fine line. I mean, I'm not sure what the answer is here, this is why I'm asking you the questions and I'm not the expert in this, but is there a role in between the articulation of that outcome and the translation into what is actually tangible for a product team to put into practice?
- Laura Klein:
- It's really hard. I actually can't tell you how to get a strategy person to do that. I think most people aren't strategy people, frankly. And I think that a lot of people think having an MBA makes you one, and that's different. Some of the people with MBAs are good strategy people. I don't know that there's even a correlation there. Some of the people who are engineers are great strategy people. I think that's why you end up with, I hate to pull this out, but there's one Steve Jobs and a lot of people who aren't [laugh]. I think it's really, really hard. And I hate sort of holding him up as the paragon, because he wasn't great to work with, as is my understanding, in a lot of ways. I don't know, I didn't work with him. But I think that's really tough, and I think that it's tricky, because then the question is, are you getting too micromanaging? Is that what's needed? I don't know.
- Brendan Jarvis:
- Well, you talked about happy teams in your Mind the Product talk that I was looking at on YouTube from 2016, and you mentioned Steve Jobs there. That is quite a good example, actually, of the point that you were making, that by and large the team should be happy, but there are obviously exceptions to that rule. I mean, a lot of people working under Steve Jobs weren't maybe that happy in their work life, at least not with their relationship with him as their boss, but they were happy with what they were putting out into the world and the change that they saw that they were making. So
- Laura Klein:
- Look, I have a black belt in a martial art. That wasn't always fun. I got hit a lot. [laugh] I didn't like the part where I got hit. I liked the part where I got to hit people, I'll be perfectly honest, but I didn't like the part where I got hit. But I really liked the part where I got the black belt, and I felt a tremendous amount of accomplishment and felt really good about myself and what I could do. Yeah, so I'm not saying we should hit people, I just wanna be very clear. I do not want to be on record as saying we should hit people. I want to be on record as saying that making something great is not always pleasant. Achieving things is not always easy or super fun, but that doesn't mean that we can't be a happy team and collaborative and excited about what we're building.
- Brendan Jarvis:
- Let's shift gears. I wanna try something new out on the show today, which is some rapid fire questions, and you're gonna give rapid fire answers.
- Laura Klein:
- Oh, good luck.
- Brendan Jarvis:
- We'll see how this goes.
- Laura Klein:
- Oh, you want short answers? Yeah, let's do that. Yeah, that's a hundred percent gonna happen. Every answer will just be, it depends. Next!
- Brendan Jarvis:
- [laugh], that's probably a safe bet. This could be a very short segment, to anyone that's listening. So the general gist of the questions that I'm gonna ask you is advice for the product team, and in particular specific roles within the product team. So I'm just gonna go ahead, I'm gonna give you the first question, and we're gonna see where it goes. Okay. All right. So the first question is: you are a product manager and you've started at a new company. What's the first thing you should do?
- Laura Klein:
- Talk to absolutely everybody who is on your team, and on other teams, and every stakeholder, and make sure that you understand what you are getting into and why things have been done the way they have been done in the past. And don't just come in and start changing everything because that's how they did it at your last place. And also talk to users. Listen to users. That's two pieces of advice.
- Brendan Jarvis:
- [laugh]. That's all good. It's all good advice.
- Laura Klein:
- Yeah, but wait, users, sorry, sorry. Users are stakeholders, so you have to talk to them too. That's it. [laugh]
- Brendan Jarvis:
- [laugh] Very good. That was very concise. I liked it. [laugh] If you were a user researcher and you'd just started at a new company, what would be the first thing that you should do in that case?
- Laura Klein:
- Well, the first thing I'd do is look at whatever user research has been done in the past, if any. I mean, that's the very first thing. Also, talking to all of your coworkers and understanding what they're trying to understand, and what they don't understand, and what they need your help with. But just looking at what's been done up to this point, and trying to figure out why and what was learned, so that you don't just start repeating the same thing that has been done before. That's one good thing to do.
- Brendan Jarvis:
- What's the one thing you wish UX researchers understood about product managers?
- Laura Klein:
- Good product managers do care about outcomes very much, but may tend to be more focused on business outcomes, and don't always understand that good business outcomes should come from good user outcomes. That may be a thing that I want product managers to understand, though.
- Brendan Jarvis:
- [laugh]. So in the interest of fairness, I'm gonna give you the ability to ask the same question the other way around. What's the one thing you wish product managers understood about UX researchers?
- Laura Klein:
- I wish product managers understood that UX researchers have a tremendous amount of information about the user that would help them make the decisions that they are probably struggling to make and can't seem to come up with the right answer to. And that if they would just involve the user researchers in those big decisions, those big decisions might get real easy. They don't have to sit around in meetings with other product managers and just debate them for hours and hours at a time, when they could just go ask a user researcher, who would be able to provide tons of data that would actually inform that discussion and make the decision very easy to make. User researchers make your life easier.
- Brendan Jarvis:
- Hey, so if a senior stakeholder's fallen in love with a solution to a problem that hasn't been properly validated, what should the product manager do?
- Laura Klein:
- Cry [laugh]. Oh, sorry. After crying, drink. After crying and drinking, probably something better than that. So I have an exercise that I walk people through that sometimes works. I'm not gonna tell you it always works, cuz nothing always works, but it works on some people. Somebody comes to you, anybody, but especially executives, and says, we gotta make this thing, it's this feature, we gotta build this feature, it's fantastic, it's gonna be great. Okay, here are the questions that you need to ask. Wow, that sounds like an amazing feature. What exactly are you hoping that that will do for us, metrics-wise? What metrics do you think that will improve? And they say, oh, well, I don't know. That's a thing right there. You help them figure it out. Okay, well, if you dunno what metrics it will improve, why do you wanna build it?
- If they do give you a metric, you can say, oh, fantastic. Okay, so what you really care about then is acquisition. Great. What about these other 15 things that we already have in the backlog that address other acquisition problems? Where do you think it falls in priority amongst those things? And what can we do, we as a team, to help figure out what the best option is for fixing that metric that you seem to care so much about? Because what you need to do is expose the fact that they are falling in love with a feature for no good reason. And you need to make them come up with a reason, and then you need to show them the other things that you could do that would help them reach their goal more easily.
- Brendan Jarvis:
- Just bringing us down to the close of the show, what is the one thing that you wish you could teach Every single person who's invested their time or is currently investing their time and energy into creating products?
- Laura Klein:
- That ideas are cheap and generally pointless, and that what matters is impact.
- Brendan Jarvis:
- Are you up for playing a game?
- Laura Klein:
- [laugh] Okay, this feels very end-of-WarGames. Is it gonna be tic-tac-toe? I swear I'm not a robot.
- Brendan Jarvis:
- Well, I've never had such a long pause. I did have someone ask me a question the other week as to what the game was about, which I thought was very, very wise as well. Well, I can put your mind at ease, it's nothing to worry about. It's called, what's the first word that comes to mind?
- Laura Klein:
- Oh God, this is not gonna go well. [laugh] Are you sure? Lemme ask, are you sure you wanna play this game? It's fine by me. [laugh]
- Brendan Jarvis:
- Hey, it's an R18 show, so whatever happens, happens. It's okay.
- Laura Klein:
- Let's do it.
- Brendan Jarvis:
- So I'm gonna say a word. I've got three words. I'm gonna say them one after another, and wait for your answer, of course, in between. And that'll be the end of the game. Okay, you ready? Yes. All right. First word: time estimates.
- Laura Klein:
- [laugh]. No,
- Brendan Jarvis:
- [laugh]. [laugh]. The second word is validation.
- Laura Klein:
- Yes.
- Brendan Jarvis:
- [laugh]. And the third word is surveys.
- Laura Klein:
- Fuck. [laugh] That's it. That's it. That's the whole thing. Yeah. Yeah, that actually worked pretty well in my head. The time estimates, thank God. Time estimates, Gantt charts. If you'd said Gantt charts, I would've just hung up on you.
- Both speakers:
- [laugh].
- Brendan Jarvis:
- I suspected as much, so I didn't put that one in the final list.
- Laura Klein:
- That was smart.
- Brendan Jarvis:
- Yeah.
- Both speakers:
- [laugh].
- Brendan Jarvis:
- Okay. So thinking about the next few years, what lies ahead? I know humans are terrible predictors of the future [laugh], but I'm gonna ask you to do it anyway. Well, actually, it's really less of a prediction and more of a hope. What is your greatest hope for the people who are making digital products?
- Laura Klein:
- Oh, generally, not for me, cuz I mean, for me the answer is always, there be dragons. I literally never have any idea what I'll be doing in six months. What is my greatest hope? I do hope that we get better at predicting things that are going to hurt people, and avoiding them, sorry [laugh]. I do hope that the people who work at companies get more organized about pushing back on corporate decisions that they don't like. I'm seeing that a lot, and I'm actually really excited about it. I do hope that we do something to return to this idea that technology actually can make people's lives better, because I think it can, in a lot of ways. I would love to get some of the unbelievable amounts of money out of tech, which sounds kind of bizarre, but I think the whole economy has gotten really screwed up by billion dollar B rounds, and it doesn't make any sense.
- The economy's just a nightmare, and it ends up producing a bunch of shit that never makes any money and never makes anybody happy and just ruins a bunch of lives. And I was kind of hoping that with SoftBank maybe having some troubles, that would happen a little less. And I don't know, maybe it will, maybe it won't. But yeah, I'd like to see more of a return to making the stuff that people like enough to pay us for [laugh], or things that make money ethically. That'd be cool. I dunno that that's really a return to it. I think that might be entirely a forward movement thing. But anyway, that's my hope. It's probably completely delusional, but it's my hope.
- Brendan Jarvis:
- Probably need some assistance from some regulators as well. But I mean, it's not...
- Laura Klein:
- Oh yeah. Yeah. I don't think that's just gonna happen. I don't think any of that's just gonna naturally happen, because why would it? It hasn't so far.
- Brendan Jarvis:
- [affirmative]. Hey, look, those are some really great things to leave everybody with to think about. Laura, it's really been a great conversation with you today, and it's been an absolute pleasure having you on the show. I wanted to say thank you for all the time and energy you've put into so generously sharing your insights and your passion and your experiences with product people around the world over the years.
- Laura Klein:
- Well, thanks so much for having me. This was super fun. And yeah, always happy to chat.
- Brendan Jarvis:
- Yeah, look, we'd love to do another round some time. We've got so much more material to cover. Trust me, I hardly made my way through half of it, so many great questions I'd love to ask you. But for the people that are wanting to know more about you, or to connect with you, what is the best way for them to do that?
- Laura Klein:
- I have a website, that's usersknow.com, and that's know, K-N-O-W, not users "no". But usersknow.com, and you can always find me there. I am also @LauraKlein on Twitter. And of course the podcast that I co-host with Kate Rutter, as you mentioned, is What Is Wrong with UX, and you can find it wherever you find podcasts.
- Brendan Jarvis:
- Wonderful. Thanks Laura. We'll be linking to those in the show notes so everybody will be able to find them really easily. And to everyone that's tuned in, it's been great having you here. Everything that we've covered today, as I mentioned, will be in the show notes: all the great resources, Laura's books, podcasts, everything will be in there. If you've enjoyed the show and you want to hear more of these great conversations with world class experts like Laura, leave us a comment if you're watching this on YouTube, or subscribe if you're listening via a podcast platform, and we'll keep them coming. And until next time, everybody, keep being brave.