Jane Davis
Equipping Product Teams to Run Effective UX Research
In this episode of Brave UX, Jane Davis brings epic levels of insight to this conversation about the evolution of UX research, and how it can enable product teams to build better products.
Highlights include:
- What can grizzly bears teach you about effective leadership?
- How do UX researchers get product managers to listen to them?
- What is the identity crisis facing UX Research?
- How do we ensure product teams are doing valid research?
- How do you overcome nervousness in user interviews?
Who is Jane Davis?
Jane Davis is the Director of UX Research and UX Writing at Zoom, where she helps the company to make strategic decisions about the direction of its business and its products.
Before joining Zoom, Jane was the Head of UX Research and Content Design at Zapier, a 100% remote-work company whose mission is to automate workflows, freeing humans from the boring and tedious parts of their jobs.
Jane also invested several years at Dropbox, originally starting out as a UX researcher before becoming the Design Research Manager for Growth.
Transcript
- Brendan Jarvis:
- Hello and welcome to another episode of Brave UX. I'm Brendan Jarvis, Managing Founder of The Space InBetween, the home of New Zealand's only world-class UX lab, enabling brave teams to de-risk product design and equally brave leaders to shape and scale design culture. Here on Brave UX, it's my job to help you to put the pieces of the product puzzle together. I do that by unpacking the stories, learnings, and expert advice of world-class UX, design, and product management professionals. My guest today is Jane Davis. Jane is the director of UX research and UX writing at Zoom. You may have heard of this company before. It's been instrumental in helping the world to keep connected during the COVID-19 pandemic. At Zoom, Jane is helping the company to make strategic decisions, informed by UX research, about the direction of the business and of its products.
- Before joining Zoom, Jane was the head of UX research and content design at Zapier, a 100% remote-work company whose mission is to automate workflows, freeing humans from the boring and tedious parts of their jobs. Jane also invested several years at Dropbox, originally starting as an individual UX researcher before becoming the design research manager for growth. In that role, Jane created a working model that enabled her team to focus on high-level strategic challenges while still ensuring that individual product teams were able to run effective research. Described as clear, concise, and stage-ready by some, and fun, smart, and talented by others, I'd say we're in for a great conversation. Jane, welcome to the show.
- Jane Davis:
- Thanks so much for having me.
- Brendan Jarvis:
- It is great to have you here. It was a real laugh, actually, researching for this talk. You have some really interesting, insightful, and hilarious talks on YouTube that I was able to go through. And on that serious note that we're starting on, I'd like to wind the clock back just a little for the people that are listening. Could you please tell us about the summer you spent after your senior year of high school? I understand it was an exciting time.
- Jane Davis:
- Yes, it's wonderful that you found that out about me. So the summer after I graduated from high school, I actually spent just over six weeks canoeing in the Arctic Circle in the Northwest Territories, up in Canada. I did that through a YMCA camp called Camp Widjiwagan. I grew up in Minnesota, so this is pretty much how every Minnesotan spends their summers. On the trip, it was six of us, and we'd been canoeing for well over a month in the Arctic Circle, and we were canoeing on the Arctic Ocean at the time. And one night, well, I say night, the sun never set, which was another interesting part of being up there. But we actually encountered a couple of grizzly bears that started expressing some interest in our camp and what was going on with us.
- And they just kind of kept wandering closer and closer, and we realized this probably might not be the best location for our camp tonight. So as the grizzly bears continued to come closer and closer, we packed up our canoes, and we were just kind of casually like, all right, well, let's not panic, but it's grizzly bears, so let's not dawdle. We got in our canoes and we kept paddling as the bears were just kind of coming closer and closer. We paddled for another couple of hours, and we were a little shaken up and a little bit unnerved, because you normally just want grizzly bears to go away and leave you alone. But we were like, all right, well, we'll set up camp. We'll set up one of our two tents and take turns keeping watch so that we can get some rest, because we've put in a long day of paddling on the Arctic Ocean.
- So we got out of our canoes and we got up onto the tundra, and we looked over, and there was another grizzly bear. And this grizzly bear, instead of casually kind of sauntering towards us, got up on its hind legs and looked over at us, and then it got back down and it charged [laugh]. And so the leader of our trip, who was at the time 23 years old, she was the oldest one of us by five years, she just screamed, get in the canoes. There probably was an expletive or two. So we get in the canoes. The canoes are facing inland, so we are back paddling to get back out onto the ocean. And they're not big breakers, but it's the ocean, so there are waves. So we were having to try and back paddle offshore over these [laugh] incoming waves while there's a grizzly bear charging us.
- And we are just giving it everything we've got. And the last thing I see, as we are finally getting our canoe back onto the ocean after the grizzly bear has charged us, is just a set of grizzly paws at the very edge of the ocean, about 10 feet from the bow of our canoe. And it just stopped. And we were like, oh, that's good, because they're good swimmers [laugh]. So if it had wanted to, we'd have absolutely been lunch. After that we realized that the land is probably not our best location. We were pretty shaken up. We had a couple of folks who were absolutely going through some fairly basic trauma reactions, so we decided we were going to call for help. The problem is, this was 1999. There's not a ton of cell coverage up there even now; we didn't have anything. We had an emergency position-indicating radio beacon, which calls the National Guard, and it was like, you only pull that for polar bears.
- And then we had a line-of-sight radio. So we floated on the Arctic Ocean for a solid 12 hours until a plane finally passed overhead. And once the plane passed overhead, we made contact. We told them everything we possibly could in that brief amount of time, and then we just waited, because the plane isn't turning around to come help us. They've got stuff to do. And we waited and we floated, and we waited and we floated. And 36 hours after our initial encounter with the grizzly bears, the first set of grizzly bears, we finally heard the sounds of a helicopter. So 36
- Brendan Jarvis:
- Hours? 36 hours?
- Jane Davis:
- We were floating on the Arctic Ocean for almost a full day, just not knowing if anyone knew we were in need of help, or if we were going to be like, yes, we're gonna fight grizzly bears with canoe paddles cuz we've gotta get going, we're on a schedule. But this helicopter touched down, and I will never forget, this member of the Royal Canadian Mounted Police named Rob got out of the helicopter in full Mountie dress, including the Stetson. And we were just like, excellent. Well, that's a way to get rescued. Yeah. So yeah, the summer after I graduated from high school, I was charged by grizzly bears while canoeing in the Arctic Circle and then rescued by the Mounties. And then I went to college.
- Brendan Jarvis:
- [laugh]. And what did that teach you about yourself?
- Jane Davis:
- It taught me that the only way to really get through those situations is to lean on a group when you need to. It was really interesting. We actually took turns breaking down. During those 24 hours when we were floating on the ocean and did not know if anyone was coming, at any given time only one of us was really processing, and the rest of the group was just holding strong. And that was really interesting to see. I didn't realize it was happening at the time, but we went in order, where one person would just absolutely break down and everybody else just came closer together for them. And I think it really taught me some things about bringing shotguns on your camping trips, but it also taught me the power of being able to let go and trust in a group of people. And also the way to respond in crisis situations, which is: you find the edge of your strength and you go to it, and then you trust the group when you reach the edge. And you don't push past that. There's no pushing through a moment when you are actively convinced you're gonna die. You just let go and you have to trust the group. So that was definitely the big thing that I have taken with me.
- Brendan Jarvis:
- So, UX research isn't really life and death. No. But have there been situations where you've had to, as a contributor or as a manager, do what you've just described and trust the group and put your faith in others to get through a challenging time? Can you recall anything?
- Jane Davis:
- Oh, absolutely. There's actually a direct line between that experience and my management philosophy and why I got into leadership, which is, in my mind, the reason to become a leader is to solve problems that are too big for any one person to solve. So I try to live that as much as possible in how I approach building and running teams, which is: I don't need to know what's going on in minute detail with any of my reports, but I trust them to come to me when something is going on where they need support. Otherwise, I got to a point in my management career, this was at Dropbox, where I realized I was trying to review every research plan, and I was trying to critique every research findings deck and give these minute copy edits and things like that. And that's just the worst use of everyone's time.
- It's not a good use of my time as a leader. It's not a good use of my reports' time. It's not a good use of anyone's time. Dropbox went through a reorg, and when I stepped into a new role running the research team for the growth organization, it was really just this opportunity to be like, what does the team need from a leader, and how do I show up? So a lot of it was: I'm gonna let go of reviewing every single plan. I'm going to let go of looking at every findings report in detail. I am going to trust the people and the process, and I am going to make clear to them that I am here to catch them when they need me as well. So there's that aspect of it. There's also, I mean, when the pandemic started, that's probably the most recent example of a time when we all had to trust the group, where people were really struggling to try and make themselves stay productive in the midst of this huge amount of uncertainty.
- And it was really interesting watching each individual member of the team kind of need that same space and need that same safety and the opportunity to break down a little bit or step back a little bit. And I think if I hadn't gone through that experience in the Arctic, I wouldn't have been as aware of what was happening and how to put a name to it, and also how to encourage it. I think I would've tried to say, oh, let's all power through as a group, and everyone contributes at their level, and all the things that we all said as managers at the beginning of the pandemic. But realistically, I think the biggest thing that it taught me was like, no, people are gonna break, and you have to let them break, and you have to provide the support, and you have to say it's okay to hit zero.
- Brendan Jarvis:
- [affirmative]. Yeah. Google has this concept of psychological safety at work, and it's possibly not something that started at Google. And I can't really think of a better example than what you've just given of enabling your team to really feel that in the midst of what has been completely unprecedented in the last hundred years or so, at least the sort of pandemic situation that we've found ourselves in.
- Jane Davis:
- Yep, exactly. I mean, I think it's understanding that the way you create that is by making it truly real for someone [laugh], to not
- Brendan Jarvis:
- Just some nice words
- Jane Davis:
- Essentially just be like, no, no, you can just be broken and not show up. And that has to be okay if we're serious about creating safety for people.
- Brendan Jarvis:
- Not like, that's fine, but you still need to have your report done on Friday. Okay. Yeah,
- Jane Davis:
- Exactly. I can think of nothing that is less conducive to creating psychological safety than the idea that we should still go through review cycles at the beginning of a worldwide trauma [laugh]. Everyone was like, how do we deal with performance reviews? And I was like, well, the simple answer is, I think we just don't.
- Brendan Jarvis:
- Time and a place, people, time and a place [laugh].
- Jane Davis:
- Exactly. That felt like a very strange discussion to be having as management teams during that time, where it was, I truly don't know how to explain to you all that the performance review should consist of: you're a human in the world. Well done. Yeah.
- Brendan Jarvis:
- Yeah. I could kind of understand, though, maybe this is a huge assumption, what might have been going on for some people that were in that position still wanting to charge on with business as usual, because for them that might have been some form of psychological safety, the head-in-the-sand mentality: I'm just gonna pretend that this isn't happening, and we are just gonna try and keep things as normal as possible as the world burns around me.
- Jane Davis:
- Oh, absolutely. I think there's an aspect of that. I also think there are people, and depending on where I am with everything that's going on in my life, this is also me sometimes, people who find work genuinely restorative, where either you've got a project going on that's really exciting, or just where you are mentally, something is really working for you at work. And I never want to discount work as a very effective coping mechanism in those situations. And I don't think that that's unhealthy in all situations. I think it's unhealthy if that's your only coping mechanism, or if you are like, oh, I'm doing this because I don't wanna process things. But if you're like, I'm doing this because I genuinely like it and get joy from it and it's some small slice of normalcy, like, no judgment there.
- Brendan Jarvis:
- [laugh]. Yeah, yeah. Hey Jane, you weren't always in UX either, were you? And you've had a relatively, well, not even relatively, it's an absolutely impressive career in the past eight or nine years. How did you get started? How did this all come to be?
- Jane Davis:
- I like to say I wound up as a researcher by accident several times over. So I originally got into UX because I wanted to go back to being a librarian. I worked as a library assistant doing cataloging and preservation at the University of Chicago Law Library, and it was just a really wonderful and fun job that I enjoyed tremendously. But it was also in Chicago, which is very cold. And one winter my pipes froze and my furnace broke, and I just turned to the person I was dating at the time and I was like, I'm moving to California, you can come if you want. I had done event planning in college and was always interested in getting back into it. So I moved out to California to be an event planner at the University of California, Berkeley. And after a few years of doing that, I realized there are aspects of the job I love, but there are aspects that are just exhausting: a lot of nights and weekends, just a lot of hours depending on the season.
- And so I decided to go back to school to get my degree in library and information science, because I was like, oh, well, I loved that job. It'd be really nice to get to do that for the rest of my life. So I applied to schools, and I wound up going to the University of Michigan. The thing about the University of Michigan's School of Information is that they have multiple specialties within their iSchool. So there's library and information science, but there's also, as it turns out, human-computer interaction. A fun thing about me was I did not really know about design or UX as a career, despite living in the Bay Area for five years before I went to grad school.
- Brendan Jarvis:
- That was because you were too busy doing arts and theater, wasn't it?
- Jane Davis:
- At some point we will diverge into my history of building giant things that light up and shoot fire and stuff like that, maybe. But yes, I was a little busy doing other things. And so I got to grad school and I started taking classes, and obviously there's a lot of overlap between library and information science and human-computer interaction, thinking about how you organize information in service of human needs, which is kind of the common thread between all of the careers that I've followed. And so I took more and more classes in HCI. I actually wound up doing an internship as a UX designer; that was at JSTOR. They asked me to stay on as a contractor for that year and even after I graduated. But I knew I wanted to move back to the Bay Area, so I kind of accidentally wound up in UX. And then after I graduated, I was applying to jobs, and a friend referred me to BitTorrent, and they had an open role for an interaction designer and another one for a UX researcher.
- Brendan Jarvis:
- It was like the fork in the road.
- Jane Davis:
- And at the time there wasn't quite as much specialization, there weren't as many dedicated research roles, so I hadn't really considered the option. And then I interviewed with them, and when they were making the offer they were like, okay, so which job do you want? I was like, I wouldn't have to do pixel-perfect mocks if I were a researcher, because I was never gonna be a phenomenal interaction or UX designer. But research really came naturally and was a lot of what I loved about UX design in the first place. And so when I was given the option, I thought, oh, is there a way to just tell people what to do and then not have to execute on it? Fantastic, I'd like that one, please. So really just that deep understanding of the problem space, and exploring that and kind of coming up with direction, but not being so opinionated on the final designs, that hit a sweet spot for me in terms of how my brain works and what I was really well suited to.
- Brendan Jarvis:
- Now, BitTorrent, that is a fairly interesting place, at least in my mind, to work in your first research role. I mean, it's a curious company, and I don't wanna pass any judgment here, but I think most listeners will probably be familiar with the age of torrents, where you were able to procure certain pieces of software, music, and video through, I suppose it was, peer-to-peer sharing. What was it like? What do you remember from your first couple of months, or the first day on the job? Does anything come to mind that could contextualize that for people, and what that experience was like for you?
- Jane Davis:
- Yes, absolutely. I think the jokey line I always say is imagine that your day at work consisted almost exclusively of people telling you illegal things,
- Brendan Jarvis:
- [laugh],
- Jane Davis:
- And then that you were supposed to turn that into useful product feedback [laugh]. I think it was really interesting, and one of the things that really stood out to me was that people were not shy about sharing their full experience and about saying those things. I mean, they'd signed, we had a mutual NDA, but if we get subpoenaed, that's not helping you [laugh]
- Brendan Jarvis:
- Like, is anyone from the FBI listening to this episode?
- Jane Davis:
- [laugh]. I mean, it was really interesting, because people were so open, and they were so excited to help make something that they felt deeply connected to and that served this really important purpose for them, that they shared so much. It was my first real experience with research, and I was like, I would've expected to have to really pull this out and really work hard to get this information, and instead it just flowed out. It really shaped my research approach in terms of how I treat individual interviews now. I think a lot of early-career researchers, myself included when I was there, really think, oh, I'm gonna have to work really hard. And there are always participants where you have to work a little bit hard to get them warmed up. But I went in thinking, wow, I am going to have to do so much more work to get the information I need, and instead it was constantly dropped in my lap, and I was like, I should just sit back and let people talk.
- Brendan Jarvis:
- [laugh]. Is this from your users or from your colleagues, or both?
- Jane Davis:
- Users, yeah. Colleagues, colleagues was interesting. It was a very small research team. I was managing one person, and then we brought on an intern, and that was like, we peaked at three of us. The company was about 150 people when I was working there. And there were a number of different product teams, and it was really interesting helping the product teams understand how to incorporate research into their process, especially earlier on. That was the big emphasis, cuz it was very much all usability testing and surveys when I got there. So it was trying to get teams like, hey, we could save a lot of engineering time if we just didn't build this right away. And maybe
- Brendan Jarvis:
- If we just built the right thing, that might be useful.
- Jane Davis:
- What if we just talk to people about some ideas first? And so it was getting product teams, getting research, just a little bit further forward in the process, so that we weren't building everything first, constantly coding and then doing research. A lot of it was just sheer time: I just spent all my time talking to people, either users or product teams. And that's been my philosophy ever since: just be incredibly generous with my time, because the most important thing I do is help people understand what I do and how it can help them, and understand how I fit into their needs and their world. And I had a lot of impact, and I think a lot of the way I was able to do that, in terms of getting research to a better place in the product development process, was that I wasn't shy about constantly talking to people and being like, absolutely, I have more time for this, I have more time for this, I have more time for this. The most important thing I do is meet with product teams and help them understand research, and we can push off everything else as long as it means that we are moving research to a better place and helping product work faster.
- Brendan Jarvis:
- I think this is a really interesting moment to slightly segue into something that you've recently written, which was an article where you suggested that the field of UX research is facing a bit of an identity crisis. What is that crisis and why have you been thinking about this lately?
- Jane Davis:
- So we've been talking about this identity crisis using the word democratization, which, as a UX writer, I'd say is maybe not the best way we could be talking about it, but it's the way the conversation has already been happening, so it's the word I'm going to use here as well. A lot of the discussion in UX research over the past several years has been about the tension, or the perceived tension, between enabling anyone in a company who wants to conduct their own UX research, and preserving the rigor and prestige, almost, of UX research as a function. And I think these get talked about as though they're mutually exclusive: do we democratize, or do we try and reserve space for UX research and protect what we do? I started getting frustrated by that conversation, because a lot of what I was doing at Zapier was literally just both those things. [laugh]
I just don't see them as at all exclusive. I think that they are both tools that we can use to accomplish a goal. I have this long rant about tools, higher-order tools, and goals, and the way that we always wind up talking about things. We sit in the higher-order tool space when we really need to get up here and talk about goals. Democratization of research isn't an end goal, it's a tool; but having high-contact, high-expertise internal UX research teams is also a tool. It's a way to achieve a goal. And so the field has been talking about these two things as though they're ultimately goals, but they're not. It's all about what we get from doing them. What do you get when you democratize research? Well, you get a more rapid product development process. You ideally get product teams that are closer to their users, that don't need a moderator to understand what users are trying to accomplish, and that can move quickly and then have a better internalized sense for what people need and what they do out in the world as humans.
And at the same time, an in-house research team is also a tool. What do you get from that? Well, you get people who can go out, deeply understand, and create foundational insights that you can apply to your company's strategy and direction for years and years to come. You get people who can work across product teams to define how a company should think about the topics that are most important to them. These are both incredibly useful things, but they're the means to an end. The end is providing what people need in the world to get their work done.
- Brendan Jarvis:
- And it sounds like you're drawing a distinction between the product teams and the internal UX research team in terms of the nature and the scope of the research activities that they're undertaking. What is the difference, what should product teams be doing in your view, and what should an internal UX research team be doing in terms of methods and questions that they're trying to answer or problems they're trying to solve?
- Jane Davis:
- Absolutely. And I mean, every product team is different. I have met product teams with PMs or designers or engineers where I was like, oh, you can just go do your own foundational work, you're fine. You don't need me to run that diary study; you should run that diary study. That's the rare product team, but they exist, and I never want to get prescriptive about what methods a product team should use. But the framework I always apply is: how much organizational or product context do you need? If it's high organizational or product context and relatively low expertise, meaning you don't need to know how to use specialized research software, you don't need to be trained in formal synthesis methodologies, you don't need to have deep training of any kind, then a product team should conduct their own research. Last week we hopped on some concept calls, and that's one where the product team absolutely can conduct them themselves after they've had just a little bit of training in how to talk to humans in the world effectively. Because there's nothing special I do about being like, so let's talk about this one design screen.
I can teach anyone to do that. So it's really: if you don't need a huge amount of research expertise, but you do need to understand the product, the product strategy, the roadmap and the goals and the type of information you're trying to get, that's when a product team should really be conducting their own work. When you're getting rapid feedback, when you're doing iterative designs and you're trying to make changes even in between sessions, there's no particular reason you need a researcher for that. If you have someone who can give you basic training, or if you've read Just Enough Research by Erika Hall, you're pretty well equipped. And then there are these big questions that need both organizational context, so understanding what the company strategy is and the long-term vision and the roadmap and the big open questions and risks and opportunities, having that deep-rooted understanding, and also being able to say, okay, this is this big, fuzzy, complex, ambiguous area, here's how we're going to break it apart and actually go understand it. That's really where the in-house research team comes in: we have ambiguity about product vision, strategy, or direction. That's the wonderful sweet spot where you're like, oh, we need somebody who knows our company, but also somebody who's very good at dealing with ambiguity, complexity, and understanding a problem space.
- Brendan Jarvis:
- So you've both got that higher level of subject matter expertise in research and some of the more advanced methods and practices, and you've also got that deep context of what's actually going on in the company. How does the external UX research agency or practice fit into this new and emerging world of UX research?
- Jane Davis:
- Oh, they're critical. They're absolutely critical, in my view. I think my ideal research setup is very much: product teams are conducting their own rapid concept and usability tests; in-house research teams are conducting company strategy and objective-related research; and then external research teams or firms are handling all of the things that are complex and fuzzy but that don't require high organizational expertise. So if you've got a big problem like, oh, I want to understand something about the market or the landscape that's not specific to the company, if you're like, I just want to know how people accomplish their work in this area, external researchers, contractors, freelancers, and agencies are great for that, because they don't need to know what your OKRs are. They don't need to know your five-year company strategy or company vision. They just need to know how to decide what methods to apply and when, and how to appropriately scope, define, and execute on a project. So yeah, they absolutely have a role to play. I think the only place there's no real need is low context, low expertise. At that point you're like, I don't know, I could just make up an opinion. That seems fine. [laugh],
- Brendan Jarvis:
- We all know that opinions are fairly dangerous things, especially when you hold onto them too tightly, and there's plenty of people and organizations that do that. Now, you sort of touched on this idea that the product team should be running their own rapid concept and usability testing on the product. And that's not something that I disagree with, so don't think that I'm trying to grind a specific axe here. But what I have been wondering lately, with this conversation that's going on in our research community about that distinction between who should be doing what types of research, is that I can't help but think that usability is being demoted, after so many decades of some of the founding people in our industry trying to establish usability as a consideration for organizations and the software and the experiences that they're putting out in the world. And where I'm going with this is that we still have so many terrible user experiences out there, not because the company has necessarily misunderstood the core needs or problems that they're trying to solve for their users, but just because the design is bad, and there are some very basic usability principles that are not being applied effectively.
- What are your thoughts on that and what, if any, is the risk in removing the UX research expertise from having some oversight or involvement in the way that usability testing and subsequent improvements are carried out in the product team?
- Jane Davis:
- Absolutely. So I think that's a great point to bring up, because for all the companies that say, oh, we absolutely do usability testing, okay, but if you're usability testing the wrong interaction, or a bad one, there are plenty of ways to usability test something and for it to still be absolutely unusable. And for the thing I think you're specifically calling out, I actually don't know that research involvement in the usability testing is the solution there. If the goal is to make sure that we have a set of baseline principles and standards, then what we really want is for UX research to be involved much earlier on. So it's having that voice in shaping the roadmap and helping shape design principles, and really having a voice at the product team strategy level rather than the execution level. And that to me is absolutely not antithetical to the operating models that I've been putting into place, which is UX research very much involved in the let's define our principles, let's set our roadmap, let's identify our big open questions and risks. And the more that we can be involved there and set our heuristics and set design principles, and push on things when they don't meet those heuristics and design principles.
- That's really where I see the most important role for research to play: before we get to usability testing anything, saying what are the metrics we're trying to drive? What are the things our users are trying to accomplish, and what are the ways that we are going to deliver on those things our users need? So I don't think there's any substitute for a good set of principles, and I think they have to be very much actionable. So they have to be example driven. They have to be something that the team can apply with the researcher not in the room, and they have to be developed with the team altogether. So we can say, okay, these are the principles for this project, this is how we measure them, this is how we know we've succeeded in delivering what we're supposed to. I love heuristics. I think that heuristics are wildly underused in the field of UX research. I think that's
- Brendan Jarvis:
- For people that don't know what heuristics are, could you give us some context and explain how you might apply heuristics in your practice?
- Jane Davis:
- Yeah, absolutely. So heuristics are effectively a set of, I usually refer to them as characteristics. So a great example: Abby Covert probably has the greatest set of information architecture heuristics ever produced [laugh]. But it's things like, is it clear? Is it concise? Is it human? So they are characteristics of an experience that you evaluate against. And when you have heuristics, you have to say, okay, is it clear? And then, how are we defining clear? So clear means this, it looks like this, it doesn't look like this. So example, counter-example. In many ways they get used fairly interchangeably with design principles these days, which I think is fine, whatever gets the job done, honestly [laugh], I'm not gonna get attached to the language. But yeah, they're a way of saying these are the criteria we care about in this design, and this is how we will know we have met them.
- And I think they get a bad rap because it feels subjective. You give this list of things and you're like, oh, are we meeting these checkboxes? And people will be like, well, it's clear to me. And it's like, so we need an agreed upon, shared definition. We need, again, those examples and counter-examples, and we need ways of saying yes, this did, or no, this didn't meet this, and why. And once teams are empowered to do that, I think if people were able to get more buy-in for heuristics as a mechanism for evaluating designs and ideas, it would save us all a lot of time. There are plenty of usability tests that get done that could have been a heuristic evaluation, but it just doesn't have the same cachet as five people told me a thing.
- Brendan Jarvis:
- That's so true. I've also observed a strong will to quantify usability in terms of summative testing, but people mixing the methods, so wanting summative outcomes from formative methods, and that's a have your cake and eat it too situation. So I don't know if that's something that you've encountered, but I feel that's also something that's quite dangerous for our profession, at least, to give false confidence off the back of small samples by trying to mix methods in ways in which they weren't designed to be used.
- Jane Davis:
- Absolutely. I think one of the great disservices we do to usability testing is treating it as though there is a quantitative component. I mean, apart from the diminishing returns after X number of participants. And I think we've all seen the, well,
- Brendan Jarvis:
- That lovely graph, that
- Jane Davis:
- Lovely graph. And also there are situations when you want more than five, and there are situations when five is all you need. We can argue over the exact n, but the problem is that it becomes an argument over the n. And so people really fixate on this idea of, oh, if we usability test with 30 people, that's better than if we usability test with six. And it really is difficult to explain that neither of those things is quantitatively valid [laugh]. Neither one of these is statistically significant, and they never will be, because that's not what usability testing is meant to accomplish. Usability testing is a tool to find problems with your designs or with your product. It is not a tool to understand every single individual's every problem with your product. And so people get really caught up on, well, is it really valid if we only test with six people? Yes, as long as we're seeing trends, as long as we're identifying themes and problems to be solved, absolutely. Would we maybe find a couple more if we tested with 10 or 15 or 20? Sure. But will this design, where we caught a bunch of usability issues with five people, be better than what we were gonna put out into the world otherwise? And I
- Brendan Jarvis:
- Think that's it. I think that's the key point, and that's one of the responses that I always give, exactly what you've just said there about doing nothing versus evaluating with five people. If that's all we have time and budget for, the value inherent in doing that is infinitely more than the value of just praying that whatever we are putting out is gonna work for people.
- Jane Davis:
- And this is something that I emphasize especially with, I have had the great privilege in my career to manage a lot of researchers who want the perfect. It is a joy to get to manage those people, because they are always striving and they are always wanting things to be the best they possibly can for users. But nothing burns out a researcher faster than presenting the perfect over and over again and the product team saying, no, we don't have time for the perfect. And so the thing that I always emphasize with researchers, especially when they're like, but we should really be doing this deep dive and this three month study, and instead we're budgeting two weeks to talk to six people about something foundational. And I'm like, okay, but will we make a better decision? It doesn't have to be the perfect decision, it just has to be better. And so that's what I really try and drive with my teams: don't be perfect, be better. You don't have to be right, you just have to be less wrong.
- Brendan Jarvis:
- Yeah, that's also a huge point, and I think that's a sign of maturity as well in the researcher, if they can wrap their head around that. I think a lot of us come from very idealistic backgrounds, and we carry those values through into how we practice when we first get into the field, and then we run up against organizational culture, other people's agendas, the practicalities of time and money, to which all organizations are beholden. And just to realize that you don't have to be perfect, and that better is good enough, is actually incredibly liberating for people in their practice.
- Jane Davis:
- I think the perfect state, to me the absolute perfect state, is somebody who constantly drives towards the perfect while recognizing that it is unattainable, but never stops aiming for it. That's such a sweet spot, and it's so hard to find, and it's so hard to maintain that balance without burning out. And so I think it really is about that maturity, and about saying, I am operating creatively under constraint, and every single improvement I drive gets us one step closer to the perfect. Maybe we'll never get there, but man, I'm just going to hold myself responsible for knowing what that is and for helping the team get a little bit closer every time.
- Brendan Jarvis:
- Yeah, huge. So let's talk about maturity then. Let's talk about the field of UXR as it evolves, and as you've described, Jane, with this sort of move for the internal research team to tackle those more high level problems and to step away, and again, I'm putting words in your mouth, so correct me if I'm painting this in the wrong light, but to step away from the on-the-ground usability testing, the evaluative methods. How does someone who is straight out of a UX bootcamp, or straight out of a master's program, who hasn't necessarily got any experience in generative, explorative research, how do you help to get them started? What do you give them in your teams? Where do you get people started?
- Jane Davis:
- Absolutely. And I want to say that I literally just last week, as the head of UX research for Zoom, sat down and ran my own concept tests, this 15 minute concept test with users. I love it. Nobody is beyond doing evaluative work, and I never want people to believe that there is something inherently more noble about generative or foundational work. Absolutely not. The kind of work that is noble is the kind of work that has impact for users. All that matters at the end of the day is, how do you help a company make better decisions? And so when I am bringing in junior researchers, when I am mentoring researchers, the thing that is so much harder to teach than generative methodologies is how to get product teams to listen to you. And so [laugh] literally,
- Brendan Jarvis:
- There's a few product managers that listen to this podcast, so I'm sure there will be some ears pricking up at this particular point.
- Jane Davis:
- If we had hours and hours, I would sing you the songs of all my favorite PMs, because I do pride myself on being very much a product manager's researcher. And really, the way to do that is to understand what the product team is trying to accomplish and to bring them information that will make them better at that. I think a lot of researchers lose sight of that, and they're like, oh, I'm just gonna bring you the answer to this question. It's like, well, that's worthless. You need to bring them the answer and what the answer means for them. So it's not, what workflows do people engage in? It's, how can we support those workflows? What are we best positioned to support? I often say the job of research is to have an opinion, and I very much mean that. We should have opinions on the findings that we are bringing back, and we should help the product teams understand why we formed those opinions and why we believe those opinions are what will make us successful.
- I was gonna say, to that point, when someone is earlier in their career, it's not really about getting experience with any given methodology. It's really about having that lens, and helping the product team craft that narrative and understand how to use those findings. That, to me, is the real skill that needs to be built. And a real sign of maturity in a researcher is, can you work effectively with product teams to help them understand any findings? Like the concept test we did: the product team sat in on the sessions and they were like, oh yeah, this is definitely what we saw. And I was like, so here's the thing. That's what you thought you were going to see, so that's what you saw, but let's talk about what we actually saw. And I showed them, here's the list of the actual options people selected, here's what we heard about why they selected each option, and here's how we should apply that information.
- If we want to design to support discoverability, we should do this. If we want to design to support ongoing organization and usage, we should do this. So really, really crafting a story that the product team can understand and use to make better decisions. That can absolutely happen with evaluative research. And so for people earlier in their career, where it is harder to get those, oh yeah, I did this six month diary study and I was just heads down forever and now I've done every different qualitative methodology... no, no, it's fine. Just run a week long usability test, but understand how to get the team to think about it and apply the results. There's no perfect recipe for research impact, but at the same time there's no one methodology that's going to make your research more impactful. So I always encourage junior researchers not to worry about what methods they're trying to learn, but instead to worry about, how do you identify the right question that the product team needs answered? And then how do you answer it for them in a way that helps them make better decisions and that has real impact?
- Brendan Jarvis:
- Yeah, huge point. You've touched on so many interesting things in that story there. One that came to mind was this notion of participation in research, both from the researcher's perspective but also from the product team's. It seems, at least at face value, that if we can open up the practice and get people involved, as in coming to observe sessions if they're not running them themselves, that's a great thing. But you mentioned the danger in that: they can actually walk away drawing conclusions based off biases that they held, because they were so close to it before they even got in the room and watched what was going on. I don't have any answers here. This is a show where I ask you the questions, so I suppose I'd better ask a question. My question is, knowing that that is something that happens, how do we ensure that the research being done by product teams, or with product teams' involvement, is actually valid and is actually going to have a meaningful impact for users, if we know that is a risk?
- Jane Davis:
- Yeah, absolutely. So I try to counter this in a couple of ways. I've got a few strategies. The first one is, I actually just make everybody state their opinion before we start. So before we test any concepts, talk to any people, any of that, I say, which design do you prefer, or which concept, or what do you think we should do? And I have the whole team do that together, so that it's out there, so that everyone's opinion has been stated. And everyone has to be very clear: I have an opinion, I am not neutral, I am putting this stake in the ground. And then I write it down [laugh].
- Brendan Jarvis:
- Do people get uncomfortable actually having to put that out there?
- Jane Davis:
- Sometimes. It really depends on the level of discussion, I think. The times I've seen it get a little bit uncomfortable are when people genuinely believe they're neutral. That's a very uncomfortable moment, when you think of yourself as being neutral and then you come to terms with the fact that you are not a rational actor, [laugh] you are not just an opinionless blob. You have opinions, even if they are loosely held, and ideally they are loosely held, but even if it's just this small germ of one, you've got an opinion. And so we will state the opinions and get them out there and have that conversation and say, great, tell me why you think that. Let's be clear about what you are going to be listening for, because I want people to be super aware of it. And then, if we have time, I also like to ask them, what would you have to see to change your mind?
- Brendan Jarvis:
- Oh, that's a lovely obligating question. Yeah, I
- Jane Davis:
- Love it. And it makes people really think about how strongly or loosely held their opinion is. They're like, oh, I don't know if there is something. And I'm like, okay, if there's nothing, then we need to talk about what we're doing here with this research.
- Brendan Jarvis:
- [laugh]. Yeah. Cuz if I could show you evidence that your opinion is incorrect, and you're not willing to listen to that, then we've got a bigger problem to address.
- Jane Davis:
- Right, exactly. I have, every once in a while, encountered that. And it leads to a really fruitful discussion about, wow, either we are going to be blocked on progress because we have two people in the room who both will not be convinced, there's just nothing that would change their minds. But it also makes people confront that uncomfortable moment where they're like, oh, wow, yeah, we need to talk about that more. Most of the time people are like, oh, I'd need to see people fail to find the button. And I'd be like, yeah, that's reasonable. That would be a problem. So it's not all big, deep moral and ethical discoveries. Some of it is just like, yeah, I'd need to see this fail a bunch of times. But once we have that conversation, I like to make sure, especially with evaluative stuff, that the team has set success metrics.
- How many people? What do we need to see happen to make this decision? Do we need to see everybody fail? Do we need to see one person fail? What is our tolerance for how this performs? So being very clear in saying, okay, we need to see everybody go through this flow without any issues before we'd be comfortable putting it into the world, or, we need to see at least a few people go through this flow, and that'll be better than the zero people that can do it today. So having a conversation so that it's clear. And then the simplest thing I do is, anytime it's concept testing, especially between multiple concepts, I literally just keep a scorecard [laugh]. I just have a spreadsheet, and I'm like, they preferred this option or this option. And in the concept test we just ran, the perception of the team was, oh wow, they overwhelmingly preferred option two. And I was like, so, fun story: seven people preferred option two, five people actually preferred option three. And they were like, wow, really? And I was like, yes, it was nearly half and half. But depending on how many and which interviews you sat in on, and what you were already thinking about for the concept test, you were pretty well primed to be like, option two, that's the ticket. So just raising small provocations, being like, hey, let's do a reality check.
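[Editor's note: the scorecard Jane describes is just a running tally of stated preferences, which guards against the selective memory she's warning about. As a minimal sketch, the option labels and the 7-to-5 split are taken from her example, while the rest is invented for illustration, it could be as simple as:

```python
from collections import Counter

# Hypothetical scorecard entries: one row per participant,
# recording which concept they said they preferred.
preferences = [
    "option 2", "option 2", "option 3", "option 2",
    "option 3", "option 2", "option 3", "option 2",
    "option 2", "option 3", "option 2", "option 3",
]  # 7 for option 2, 5 for option 3, as in the episode

tally = Counter(preferences)
for option, count in tally.most_common():
    share = count / len(preferences)
    print(f"{option}: {count} ({share:.0%})")
```

Run against these invented entries, the tally makes the "nearly half and half" split explicit rather than leaving it to whichever sessions the team happened to sit in on.]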
- Brendan Jarvis:
- So Jane, here's a small provocation for you then. Is this example that you've just given a good reason why UX researchers should always be helping product teams evaluate designs?
- Jane Davis:
- So I think there are a lot of ways to help product teams evaluate designs. I think the reason I've just given is a reason that organizations need robust research operations teams in strategic roles. And that is like, this is my favorite drum to beat right now
- Brendan Jarvis:
- Let's go there, we're going there, let's beat
- Jane Davis:
- This drum, let's go there. This is my favorite. So one of the most underutilized functions in research to date is research operations. We historically have treated research operations as like, oh, thanks for getting us all those participants. Please send them their gift cards. That is the absolute wrong way to think about research operations. That is the worst use of those skills.
- Brendan Jarvis:
- Although, let's be honest, nobody likes recruitment. Oh
- Jane Davis:
- My God, no, no. Let's be real. We're very grateful when that happens. But in my current role, and in my last role at Zapier, research operations has nothing to do with recruiting participants. Researchers are on the hook for that themselves, and will be as long as I am in charge of things. And the reason why is, there are a couple of reasons. One, it's a lot faster in the long run for researchers to recruit their own participants. We know what we're looking for. I don't have to try and explain, oh, here are all the screener questions you should ask, or here's the exact right model of a participant. It's like, no, I'm gonna write my own screener, I'm gonna do my own phone screens, so that not only do I find the exact right fit, I also form that connection with them in advance.
- I am actually better able to connect with them during the research session because I have had that background chat. And in a world where we are resource and headcount constrained, and I have yet to encounter a world where we're not, why use headcount on someone just to do those things, when instead I could have them setting up internal insights programs? This is what I mean about research operations as a strategic function: it's setting up the infrastructure to make every concept test feel like there's a researcher in the room, even if there's not a researcher in the room. Because the teams have the right frameworks, they have the right training, they have the right resources and the right skills that have been taught to them by research operations to just do that themselves. They can be the researcher in the room. The solution in my mind isn't to have a researcher in every single concept test.
- It's to make every PM into a researcher, and every designer into a researcher, and to have them understand how we would do it and how we think, and to be able to apply that themselves. And the way to do that is to have research operations really focused on these big strategic things. How do you train an entire company in being researchers? How do you teach that mindset? How do you make previous research findable, discoverable and applicable, not just discoverable, right? Because, again, the thing that research does is have opinions. So how do we teach product teams to form their own opinions based on these wells of insights that we're creating? At Zapier, actually, before I left, we created a role called Insights Program Manager. And the role of that person is to synthesize all of the existing information about key topics, and then to help the company apply it, to help the company understand how to think about it and form a point of view and develop ideas based on it.
- And if you wanna talk about whether it's a better use of my headcount to have someone doing that versus writing screeners and trying to read my mind, I will tell you: every time, I want my little research librarian more than I want someone to do the incredibly complex, many-moving-parts work of recruiting, which is ultimately faster for me to do myself because I know exactly what I want. [affirmative]. Yeah. And all of this is said with a lot of love for people who spend all of their time recruiting participants, because I know what a pain in the ass we are [laugh]. Like, we
- Brendan Jarvis:
- Love you, we love you people, we
- Jane Davis:
- Do love you so much. And that's the other reason: I want my researchers, I want us, to stay close to it. I want us to continue being humble and understanding how much work goes into our craft and into our process from start to finish. I don't want us to get precious about, oh, I only do this one part of the research process. It's all of the research process, from the moment you are working with the product team to say, that's not a good question, all the way through recruiting your people, screening them, conducting the research, synthesizing and socializing it. Everything in there should be the researcher's domain.
- Brendan Jarvis:
- Jane, are you up for some scenario based questions?
- Jane Davis:
- Absolutely.
- Brendan Jarvis:
- Okay, excellent. Let's do it. Number one: you're a junior UX researcher and you've been given an opportunity to plan your first study. You know that having a good set of research questions is important for focusing your efforts and delivering useful insights. What does a good research question look like?
- Jane Davis:
- So first and foremost, a good research question is actionable. It is something that we will be able to make decisions based on the answer to. It ties directly back to the goals of the people asking the question. And it is something that does not include an answer already within it. So an example, maybe the worst research question I've ever been given, comes from my favorite PM of all time, John Adams. And his research question, at the very outset, way back in the day at Dropbox, was something like, how much should we build this thing? Basically he was like, how much should we build this new product that we definitely should build? And I was like, John, where are you getting the idea that this product is needed from? He's like, well, I mean, it just seems like... And I was like, okay, cool [laugh].
- He didn't really have an answer. And so after we talked for an hour, we finally got to the question where he was like, I think what I would really like to understand is how people work with their clients. And I was like, okay, that's a very different research question [laugh] than how much should we build this definite product idea that we are going to build. And so I think a lot of it really is going back as far down the assumption chain as you can. A truly good research question does not contain any assumptions. It contains a wide open space that does not have a solution inherent in it. And the way that I always get to those is really by asking, well, how do we know we wanna do that? Okay, well, how do we know we need that?
- How do we know it's clients? And eventually you get down to the actual deep root of the question. My favorite answer is always, oh, we've seen some interesting logging data, or, I've seen something directional in the market. There is a signal, but we don't understand why that signal is happening or what to do about it. So all my best research questions have come from a moment where there is a signal, and we've gotten down to, hey, we see this, why and how is it happening? That to me is the ultimate best research question. I do still get a lot of, how much should we definitely do this thing that we're gonna do, though?
- Brendan Jarvis:
- And reframing it like you talked about there is actually really useful, because you can change it from, how much should we build this thing, to, should we actually invest further resources into building this thing, or exploring this thing, depending on what the question is that you come up with. I was just gonna say, you also touched on a really interesting way of interrogating with your PM, or whoever's coming to you with the need for research. Instead of asking them, why do you think this or why do you think that, which can be quite a confronting way of interrogating someone's thinking behind an idea, you framed it as, how do we know? And I think for people listening, that's really subtle, and you probably didn't even realize you were doing it, that's probably just your training and years of experience, but it's a really subtle yet effective way of being on the same side as someone else when you're trying to understand the problem that they're trying to solve.
- Jane Davis:
- That actually comes directly from coaching training. So when you are coaching people, whether mentors or team members or things like that, asking why is absolutely out of the question. You're just like, nope, don't ask why. It creates a confrontational moment. And so a lot of it is from that training, where it's like, I know we all believe that the researcher's job is to ask why over and over again, but I actually find how to be a more fruitful avenue of entry.
- Brendan Jarvis:
- A hundred percent. Are you ready for another scenario?
- Jane Davis:
- Yes.
- Brendan Jarvis:
- Okay, here we go. You're running a series of user interviews. You've prepared lots of open ended what and how questions. Unfortunately, in your first session you were a bit too rigid, literally reading each question to the participant, and you never really felt you got a good sense of their behavior. How can you frame your interview questions for a better and more natural outcome?
- Jane Davis:
- This is one of my favorite, favorite things. So a thing I have done in the past with team members, and I'm sure will do in the future, that makes them very uncomfortable, is I will go through a project with them and say, don't write a script. And coming from your boss, that's not really a suggestion. Sometimes it's like, no, I want you to not use a script for this one, if I'm seeing someone struggling with that. And so I will say, we're not gonna use a script. Instead, I want you to write down the five things we're gonna walk out of this interview knowing. I want you to just have those in a bulleted list at the top of a page. I want you to have those questions in front of you, and that's it. That's the only thing you can have in this interview.
- Five bullet points, what we're going to know at the end of this interview, and you can do 'em in any order. You can be awkward, and we're gonna practice. We're gonna practice finding those five things, and we're gonna practice being awkward about them. We're gonna practice saying, wow, I don't really have a good segue for this, so I'm just going to jump in and switch us onto a new topic. We're gonna practice all those little tricks that researchers do to keep things flowing, and we're not gonna be afraid of awkwardness. So that is my number one thing: five bullet points, what we're gonna know, no other script. It's actually what I personally do when I run interviews. I do not write a script, because if you've watched a conference talk from me, it's pretty easy to tell when I have a talk track and when I don't, because when I have a talk track, I am much, much worse at giving the talk. It's because I read, and reading is not nearly as compelling as when I am just speaking to an area I feel strongly about. And so I stopped writing research scripts pretty early on. Just five things I'm gonna learn from every person, come hell or high water, I'm walking outta this interview with this information.
- Brendan Jarvis:
- I love it. It's another great example of don't let perfect get in the way of better. Yep. Jane, I'm gonna bring us down to the close. This has been a wonderful conversation. I have one final question for you. You've been spending a lot of time talking and thinking about the future form of UX research teams and how they can help organizations to make user centered decisions. What does that future look like to you?
- Jane Davis:
- I think, to me, it looks like a deep investment in research operations, and really understanding research operations as a strategic function, giving research operations equal space with research in terms of how we design and build programs. It looks like creating non-traditional research roles, like insights program managers, so that we can build off of and leverage existing work, and help companies really apply what we've already done, and help scale our efforts without recreating every single study or having to be everywhere at once. It really looks like aligning ourselves to the organization's goals, and working to unpack those and do the research that informs them. And it also looks like, as I mentioned, turning everyone who is interested in research in any way, shape or form in an organization into a researcher, and giving them the ability to think the way we do and to apply our methods, and to do that in ways that are going to produce better, if not perfect, outcomes.
- Brendan Jarvis:
- You heard it, ladies and gentlemen. The one thing that I'll definitely be taking away from today is a reminder not to let perfect get in the way of better. Jane. Thank you. What a fun conversation. This has been absolutely packed, full of great stories and huge insights for people to take away. Thank you for so generously sharing those with us today.
- Jane Davis:
- Oh, thank you so much. This has been just an absolute joy. I had a great time. Thank you for having me.
- Brendan Jarvis:
- You are very welcome. Jane. For people that are interested in yourself and your career and the way that you think about UX and UX research, what's the best way for them to connect with you?
- Jane Davis:
- I think LinkedIn is kind of the only place where I'm at all, even a little bit active, so that's probably the best option. But I do have a website that I am trying to update every once in a while, which is janendavis.com. So you can always check and see if I've gotten around to a second blog post
- Brendan Jarvis:
- [laugh]. I'll definitely post links to your LinkedIn and also to your website, Jane, in the show notes. Thanks again, and to everyone that has tuned in, it's been great having you here to listen as well. Everything that we've covered will be in the show notes, including where you can find Jane and any of the resources that we've mentioned. If you enjoyed the show and you want to hear more great conversations like this with world class leaders in UX, design and product, don't forget to leave us a review and subscribe. And until next time, keep being brave.