Tammy Everts
How Web Performance Shapes UX
In this brand new episode of Brave UX, Tammy Everts unpacks why web performance is a fundamental UX issue ⚡, how slow websites affect user perceptions and business outcomes 📉, and why designers need to take ownership of performance alongside developers 💪.
Highlights include:
- How did you come to own uber.com?
- What does performance have to do with UX?
- Why aren’t designers more involved in performance?
- How do you identify the best executive champion for performance?
- Is measuring the business impact of performance as easy as it sounds?
Who is Tammy Everts?
Tammy is the Chief Experience Officer at SpeedCurve—a platform that enables organisations to unlock the full potential of their web performance ⏩.
In her role, Tammy champions the connection between site speed, user experience, and business success, working closely with customers to deepen their understanding of how people use their websites 🔍.
Tammy's career journey also includes senior UX roles at SOASTA and Radware, and over two decades of pioneering research involving EEG headsets, facial action coding, and advanced machine learning 🦾.
She is the author of “Time is Money: The Business Value of Web Performance” 📘 and a sought-after speaker, having shared her expertise at prominent events like Chrome Dev Summit, Smashing Conference, and beyond tellerrand.
Tammy also co-chairs the annual performance.now() conference in Amsterdam and co-curates WPO Stats, a valuable resource of web performance case studies 💡.
Transcript
- Tammy Everts:
- The thing about Gen X is most of us, if we've been working on the web in any capacity for our adult lives, we didn't start doing this until we were in our mid-twenties, and there was nobody to teach us. So we're…
- Brendan Jarvis:
- Hello and welcome to another episode of Brave UX. I'm Brendan Jarvis, managing founder of The Space InBetween, the behavior-based UX research partner for enterprise leaders who want an independent perspective to align hearts and minds. You can find out more about me and what we do at thespaceinbetween.co.nz.
- Here on Brave UX though it's my job to help you to keep on top of the latest thinking and important issues affecting our field of design. I do that by unpacking the stories, learnings, and expert advice of a diverse range of world-class leaders.
- My guest today is Tammy Everts. Tammy is the Chief Experience Officer at SpeedCurve, a platform that enables organisations to unlock the full potential of their web performance. In her role, Tammy champions the connection between site speed, user experience, and business success, working closely with customers to deepen their understanding of how people use their websites.
- Tammy's career journey also includes senior UX roles at SOASTA and Radware, and over two decades of pioneering research involving EEG headsets, facial action coding, and advanced machine learning.
- She's the author of "Time is Money: The Business Value of Web Performance", and a sought-after speaker, having shared her expertise at prominent events like Chrome Dev Summit, Smashing Conference, and beyond tellerrand.
- Tammy also co-chairs the annual performance.now() conference in Amsterdam and co-curates WPO Stats, a valuable resource of web performance case studies. And now she's here with me for this conversation on Brave UX. Tammy, a very warm welcome to the show.
- Tammy Everts:
- Thank you. What a nice introduction. Wow!
- Brendan Jarvis:
- I thought it was pretty 'wow' too.
- Tammy Everts:
- I really enjoyed that.
- Brendan Jarvis:
- In terms of the things that you've done, I was very much looking forward to speaking with you today. Tammy, I want to start with something that's perhaps a little out there, but it's on your LinkedIn profile and you'll probably see where this is going soon enough. I believe you once owned the domain name uber.com. What's the story there?
- Tammy Everts:
- So basically this gives away my age. In the mid-nineties, when we were still in the early days of the internet, I had been publishing a zine, believe it or not, along with my husband and a friend of ours from university. We had been publishing the zine on paper, and it was called Uber because we were university students and we were like, oh, it's very clever, we're going to be über about a whole bunch of topics. And then the internet came along and we were like, oh, we should put this online. Yeah, let's take our zine and put it online. So we bought the domain. We were the first people to ever register uber.com, and we published our zine under that, and you can't find it anywhere because it predates the Wayback Machine. I know, I've looked, it's kind of a bummer. We did that for a year or so, and then I went to graduate school and our friend moved away and we sort of let the zine die. And then somebody offered to buy the domain from us. It wasn't Uber the company, it was another company, and I don't even remember what they do, and we did not get rich. We made tens of hundreds of dollars from selling uber.com, but it's a good story. So yeah,
- Brendan Jarvis:
- It is a good story. It's one of those fascinating insights into the early stages of the web and just how you can stumble across things like that and how later in retrospect, they end up being really, really interesting stories.
- Tammy Everts:
- Well, ours didn't even have photos.
- Brendan Jarvis:
- Oh, didn't it?
- Tammy Everts:
- No, it started as black text on a grey background, as much of the web was. And I can remember when we introduced background colour, it was light yellow, and when we introduced images, and then also being able to centre text. All of these things kind of happened over the life of our little online zine. It's pretty funny. There's probably a performance tie-in that I can make to this,
- Brendan Jarvis:
- But I was going to say, it sounds like it would've been quite a highly performing website back then if it was just text and no images.
- Tammy Everts:
- Yeah, the problem was our 14.4 modems were probably the biggest liability at the time.
- Brendan Jarvis:
- Yeah, 100%. Speaking of performance and things that are fast and slow, I noticed on your website that you had a quote, and I'm just going to quote your website now, which I believe was written by you. It says: "I believe in a faster web and slower living." And I was curious about this. What does slower living look like to you?
- Tammy Everts:
- I have given this a lot of thought over the past few years. I used to live in Vancouver, my husband and I and our then-young children. About 12 years ago, we made the choice to leave Vancouver and move to a little ski town in the Rockies. And it's not an original story; it's the classic thing of having young kids and making a quality-of-life move so that you can slow things down. But over the course of the time that we've lived there, and I'm not perfect at this by any stretch, it's a goal to actually try to live very intentionally, whether it's how I'm cooking and eating, or the things that I make and use, or the types of activities that I like to do. I like to make things with my hands. I think that the connection between a faster web and slower living is that by living slowly and intentionally, you can actually feel that time is stretching out for you. A faster web is a way of taking this thing that's kind of unnatural, because using technology on a daily basis is not something that human beings are really designed to do, and speeding it up so that it can at least give us something close to the feeling that we have when we do all of our more natural things: that feeling of flow and satisfaction.
- Brendan Jarvis:
- One of the things that I think fits into the category here of slower living is an activity that you and your family like to do, and that's hiking and camping out in the Rockies. And I also understand that, as far as the wildlife in British Columbia goes, it can be pretty wild at times. Have you had any close encounters, or any sort of surprising stories that you can share?
- Tammy Everts:
- Yeah, I do actually. We do a lot of hiking in the wintertime, and a lot of snowshoeing. We live in a ski town, so that gets us a lot of mockery, because snowshoeing... sorry, "slowshoeing", I just made that up. Snowshoeing is a point of mockery when everybody really likes to go downhill really, really fast and you're like, I like to strap things on my feet and go really slow. People are like, why do you do that? But a story that I have doesn't actually even take place in the wilderness. The town that we live in only has about 11,000 people, and in the fall and winter, when bears get hungry in the woods and they're not quite ready to hibernate, they come into town and they try to take food from people's gardens, from their fruit trees, from the garbage.
- And I'm a baker; I do a lot of home baking and I make a lot of Christmas treats. One treat that I like to make is chocolate peppermint bark. For anybody who doesn't know what that is: you take really good chocolate, you melt it, you add various things to it, and you put it in trays to cool down in layers, and it helps to chill the trays to speed the cooling process along. We have a porch that is very, very handy in the wintertime for putting baked goods outside if you want them to cool quickly, and I actually have a table outdoors for exactly that purpose. So one night I was making peppermint bark, this was in early December, and I had done all three layers. It had taken about two hours, not hands-on time, but total time out on the porch, really.
- It had taken about two hours to get to the point where it was finally done. I'd got that last little sprinkling of peppermint candy on top and it was ready to go. And I went outside to fetch it off the porch, on its big cookie sheet, and it wasn't there. I looked around and literally saw the tail end of an enormous black bear that had clearly come up on my porch and eaten the bark, and I didn't even hear the pan hit the ground. It was super stealth mode, and there was not a bit of peppermint bark to be seen. The bear had been sitting on my porch, probably eating the peppermint bark, and then it sort of just ambled off the porch and hung out down on the street for a little bit, probably waiting for me to put out more peppermint bark. And the same bear came back several more times, pretty much every night at the same time, because they're very intelligent, and it was obviously thinking, that was really good, I would like to come back and have more of that. So I felt really bad, because you're not supposed to leave bear attractants out. I really thought I was monitoring my situation and that I was on top of it, but clearly the bear just got in there faster.
- Brendan Jarvis:
- Have you had to institute any security measures now for the table?
- Tammy Everts:
- Yeah, I don't put baked goods out there anymore. The security measure is that I cool things indoors now. So yeah, I've learned my lesson.
- Brendan Jarvis:
- Hey, taking a completely different direction now and thinking about your start in the world of UX, roundabout, say, 20 years ago: I understand that you rigged out your first usability testing lab with what were then state-of-the-art Power Mac 9500s. And since then, as I mentioned in your intro, you've gone on to do some really fascinating UX research using quite advanced technology, such as those EEG headsets, facial action coding, and then there was a mention there of Google's machine learning system too. Now, at face value, having a look at your educational background, you've got a master's in publishing and a degree in English literature. This seems like a really interesting direction to go, from what you studied into what you've now done. How does it seem to you?
- Tammy Everts:
- So going way, way back to when I was in high school... the first thing is, I'm Gen X. And the thing about Gen X is most of us, if we've been working on the web or working online in any capacity for our adult lives, we didn't start doing this until we were in our mid-twenties, and there was nobody to teach us. So it's kind of a nice thing, because most of us are pretty scrappy and pretty good at figuring things out, but it also means we have huge gaps in our knowledge. And so sometimes we need younger people, who have actually had proper, rigorous training, to explain some things to us and fill us in. But going way back to when I was in high school, I was in grade 11 or 12, and we were given aptitude tests, where you fill out a whole bunch of things.
- The results that I got, my top three... I can't remember what the third one was, but two of them were journalist and astronaut. And I thought that was kind of a weird combination, but I liked it. I almost studied journalism, and then thought, that's really very specific, and I'm not sure that I want to be a journalist exactly, but I know I like reading, so I'm just going to do an English degree. And so I did an English degree and I really enjoyed it and had a great time. People kept asking me what I was going to do with it and I said, I don't know, I just want to read some books, leave me alone. And I knew I didn't want to teach...
- Brendan Jarvis:
- Or publish a zine?
- Tammy Everts:
- Or publish a zine, exactly. So I did a lot of writing in university, and I worked on my university newspaper. I ended up being editor of the university newspaper, so I knew that I liked explaining things to people, and I just liked the act of writing and communicating. I briefly worked at a publishing company as an editor, between getting my English degree and going back to school to do my master's, and I also interned at a print magazine and did a few things like that. And what I learned about myself, working in print in a few different formats, is that as much as I love writing, and I really love explaining things to people and sharing information, I find the format of print publishing very restrictive. And I mean, I read a lot of books and I read a lot of magazines.
- It's not that I don't like those media; it's the idea that you take six months to a year to write something, or even longer. The lifecycle of a book before it goes to press is so long. And even a newspaper or a magazine, same thing: they're months out sometimes in their planning. I just really liked the immediacy of being able to publish online. So when the internet came along, magically, right after I'd done my undergrad degree, I just knew that I was very fascinated by it. And when I did my master's, my master's was focused, at that time, on print publishing. And I told the director of the programme that I wanted to do my thesis on online publishing, at the time called "new media", which we don't call it anymore for obvious reasons. He felt very strongly that I was chasing the wrong horse in this race, that it was a fad, and that I was making a huge mistake.
- And he very, very, very strongly tried to encourage me to stay with print publishing. But at the end of the day, I got my way, and I was able to do my thesis on online publishing and new media, and I'm really happy that I did. Following that, I worked for a web agency that I eventually became a partner in. It was basically two guys, one was a developer, one was a designer, and they needed a content person. So I became the content person; we didn't have a name for it, that was just it. And eventually, in the late nineties, I started to teach myself about usability. I was kind of an early follower of Nielsen Norman Group, Jakob Nielsen and all those teachings, and that was the foundation of what got me started in thinking about usability. So when I was a partner in this agency, which is still around, though I'm not involved anymore, but it's a very healthy agency, I felt very strongly that we needed to do in-house usability testing.
- We should stop just building prototypes and websites for people, and actually put them through their paces. So that was really fun, figuring out how do you run a usability lab, how do you do a usability test? And the one thing we never tested for was performance, because that was just an invisible vector of user experience from our perspective. We were showing people prototypes or wireframes, everything was on local machines, everything was very, very, very fast. And we would build sites and we would pass them along to our clients and say, there's your website, go maintain it. Or we would maintain it for them. But performance wasn't even on anyone's radar at the time.
- Brendan Jarvis:
- And people may be wondering who are listening or watching this, what on earth does performance have to do with user experience? And that's also my question, what does performance have to do with user experience?
- Tammy Everts:
- I mean, I want to say everything because that's a very dramatic sounding statement, but it's not everything. But I would argue that if you are not taking performance into consideration whenever you are creating an app, a website, you are missing a fundamental pillar of the user experience. So design isn't everything. It's not enough to have a beautiful design if people can't engage with it. A webpage isn't a printed page that you look at and maybe click a button. It's an experience. And as human beings, we need our experiences to feel intuitive and to feel seamless because that's just how our brains are wired.
- Brendan Jarvis:
- It makes a lot of sense. And it is, as you mentioned earlier, kind of the forgotten or invisible part of the experience. But having run many, many usability tests myself, it's very evident during a test when things like slow loading times cause people to completely lose their attention; it drifts elsewhere. Just last week I had someone pick up a mobile phone while they were waiting for something to load. And of course that detracted from the time they were spending with the experience, and added, I suppose, to the friction between what's going on on the screen and the life that they're trying to lead and the task that they're trying to execute through the software. And I wanted to pick up now on something that's tied into the current economic conditions that we find ourselves in and relates to performance.
- And this is the theme at the moment of needing to do more with less. I watched your talk at last year's performance.now() conference, where you were empathising with people about this, and you also layered in the fact that there's a rapidly expanding landscape of performance tools and technology that people need to get their heads around. At a time like this, when teams have been downsized and people are being asked to do more with less, some are even being asked to work smarter, not harder, and I know you've got a particular feeling about that phrase. For people who are keenly interested in performance from a UX perspective, or even just more generally, or who are already in charge of that particular dimension of experience, what should they be thinking about or focusing their efforts on, in terms of the problems facing their organisations right now?
- Tammy Everts:
- The most important thing is to not give up and think, well, it can't be done. Coming up with some plan, even if it's a bare-bones plan, that just keeps you engaged with the idea of performance is something that matters. One of the things that I talked about in that talk at performance.now() is this massive landscape of tools, and even of metrics that we track. For your viewers who are less familiar with performance, there are a lot of different page rendering metrics, like Start Render and Time to First Byte, and all kinds of esoteric metrics that I won't even start to go into right now, unless you want me to. Maybe I'll let you drive that part of the conversation.
- Brendan Jarvis:
- Take us where you like.
- Tammy Everts:
- So the first thing that I would do is just try to home in on: what is the metric that matters for my particular site? And probably the best metric or metrics for you to get behind are the ones where you can see that if you make a change to that metric, if you improve it, or I guess conversely make it worse, it's going to have an impact on your users and on your business. With a lot of the performance tools out there, SpeedCurve is one of them, you can look at the intersection between metrics. For example, Start Render, which is when content, ideally meaningful content, is starting to render in the browser: we know that we can correlate that metric to conversion rate or to bounce rate pretty effectively. Not necessarily for every page, or for every site I should say, but for a lot of sites. And this is the kind of thing you can validate for your own site, just to make sure: is Start Render a metric that's meaningful for my pages and my users? So even if you can just start with one metric that you track... our industry likes to use the phrase "north star" a lot. If you want to make that your north star... though you're in another hemisphere, what would you have? Would it be
- Brendan Jarvis:
- The southern star?
- Tammy Everts:
- So, some compass point that you can use. All of your actions are around tracking that metric, noticing changes in that metric, and then ideally figuring out what you need to do to optimise for it. And there are so many different optimisations available out there, because webpages are so extraordinarily complex these days. But figure out: what are the optimisations that are actually going to make a difference for this particular metric? Don't waste your time optimising a lot of things that don't really matter. Ideally, our pages would be really, really lean and really fast, and there wouldn't be any unnecessary third parties or JavaScript on them, but that's not really the world we live in. So it comes down to getting those first and most important pixels to the user's screen: what do you need to do to do that?
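Tammy's suggestion here, pick one candidate metric and validate that it actually moves with a business outcome on your own site, can be sketched as a simple correlation over per-session RUM data. This is a minimal illustration with invented numbers; real tools (SpeedCurve among them) do this analysis far more robustly:

```typescript
// Hypothetical RUM samples: per-session Start Render time (ms) and
// whether the session converted (1) or not (0). All numbers invented.
const startRenderMs = [800, 1200, 1900, 2500, 3400, 4200, 5100, 6000];
const converted     = [1,   1,    1,    1,    0,    1,    0,    0  ];

// Pearson correlation coefficient between two equal-length series.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const mean = (a: number[]) => a.reduce((s, v) => s + v, 0) / a.length;
  const mx = mean(xs), my = mean(ys);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    dx  += (xs[i] - mx) ** 2;
    dy  += (ys[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

const r = pearson(startRenderMs, converted);
// A clearly negative r suggests slower Start Render goes with fewer
// conversions, i.e. this metric is worth watching for this site.
console.log(r.toFixed(2)); // prints -0.75 for the sample data above
```

A strongly negative coefficient between Start Render and conversion is the kind of evidence that justifies adopting the metric as your "north star"; a near-zero one says to go validate a different metric instead.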
- Brendan Jarvis:
- One of the approaches for focusing performance efforts that I've heard you speak about before is called the critical rendering path, and thinking about it seemed particularly user-centred to me. When you get to explain it, I think this will make sense to people. What is the idea behind the critical rendering path?
- Tammy Everts:
- It's directly tied into what I was just saying. The critical rendering path is the path that the browser takes to get the most meaningful content up on the screen. I feel like I'm going to explain this really badly because I'm a little bit jet-lagged right now, but in simplest terms, that's how I would put it. Your critical rendering path would be: what is the minimum set of important page resources, that is, your HTML, your JavaScript, your CSS, your images, your fonts, that the browser needs to render in order for the user to see something meaningful on their screen?
- Brendan Jarvis:
- Is it somewhat borrowed from project management when they talk about the critical path of a project, other stuff can be going on, but if you don't execute this stuff well, then everything gets pushed out. Is it similar to that in the way people think about it?
- Tammy Everts:
- No one's ever actually expressed it that way before, but I feel like, yeah, you're right, it is a lot like that. Even if you're looking at performance tests, one of the outputs of a performance test is something called a waterfall chart. The first time I saw one, as somebody coming from a building-websites and project management background, I thought, oh, it looks like a Gantt chart. It is a type of Gantt chart: instead of showing tasks, it's showing the resources on the page. And you can see the interdependencies: if your browser can only stream eight resources at a time, you see what those eight resources are, and you see how the next eight can't start to be downloaded and rendered until those first eight are pretty much done. So you can actually see what all the interdependencies are on the page. Even if you're not a developer, if you want to understand how pages are rendered and what some of the issues are, learning how to interpret a waterfall chart is a pretty handy little skill to add to your toolbox. Somebody showed me how to do it, and it is pretty easy once you get it. You just go and look at some: if you can read a Gantt chart, you can figure out a waterfall chart.
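The Gantt-chart comparison is more than an analogy: a waterfall is a dependency graph, and the critical rendering path is the longest prerequisite chain among the render-blocking resources. A toy sketch of that idea, with all resource names, timings, and dependencies invented for illustration:

```typescript
// Toy model of a waterfall: each resource has a download duration (ms)
// and a list of resources that must finish before it can start
// (e.g. CSS discovered from the HTML, a font discovered from the CSS).
type Resource = { name: string; durationMs: number; needs: string[] };

const page: Resource[] = [
  { name: "html",        durationMs: 200, needs: [] },
  { name: "app.css",     durationMs: 150, needs: ["html"] },
  { name: "hero.jpg",    durationMs: 400, needs: ["html"] },
  { name: "brand.woff2", durationMs: 120, needs: ["app.css"] },
];

// Earliest finish time of a resource = longest chain of prerequisites
// leading to it, plus its own duration (classic critical-path recursion).
function finishTime(name: string, res: Resource[]): number {
  const r = res.find(x => x.name === name)!;
  const ready = Math.max(0, ...r.needs.map(n => finishTime(n, res)));
  return ready + r.durationMs;
}

// The critical rendering path is the chain that finishes last among the
// resources needed for first meaningful render (assumed list below).
const renderBlocking = ["html", "app.css", "brand.woff2"];
const startRenderEstimate =
  Math.max(...renderBlocking.map(n => finishTime(n, page)));
console.log(startRenderEstimate); // 200 + 150 + 120 = 470
```

In a real waterfall the browser's connection limits and resource discovery order complicate this, but the shape of the question is the same: which chain of resources gates first render, and how do you shorten it?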
- Brendan Jarvis:
- You mentioned engineers and developers; there's a bit of a grey zone here, for me at least, between who's responsible for performance, or who should be. So when you think about something like the critical rendering path, is this something that the design organisation should be taking the lead on, or is this something that more naturally sits with engineering or product management? Where do you see this concept of the critical rendering path most naturally finding a home?
- Tammy Everts:
- It's a really good question. Ten years ago I would've said it completely falls into the camp of developers and engineers, but that's only because they were the people at the time who were using the tools that were making the critical rendering path apparent. What's changed over the years... at SpeedCurve I wear a lot of different hats, because we're a small company, and I talk to a lot of customers, from really big web properties, enterprise organisations that use us, down to much smaller properties, more like mom-and-pop shops. And what's interesting about the types of people that I speak to, and I've been at SpeedCurve for about seven and a half years now, is that in the early days I was mostly talking to developers and engineers. Now I talk to SEO people, I talk to product people, I talk to marketing people.
- So I would say that, in my view, performance belongs to everybody. If something that you do touches a page, whether you're the person who uploads images to a CMS or the person who introduces third-party tracking widgets and things like that to the page, whoever you are, you should know what the performance impact is of the thing that you're adding to the page. If you touch a page, then you should care about the performance of the page. And the reason why people don't care about the performance of pages isn't because they actually don't care; it's because they don't know, they're just not currently aware. So I think that the bigger task is just building awareness throughout organisations. But really, really healthy companies do exist. Chances are, if you look at a really fast website, like an Etsy or a Pinterest or something like that, these are people with very healthy performance cultures, and they try to involve people throughout the organisation and make performance really interdisciplinary throughout the company.
- Brendan Jarvis:
- You may have just answered the next question; however, there may be more to it. You mentioned Etsy, and I understand that you've spoken about Lara Hogan before, the former engineering lead at Etsy and also previously VP of Engineering at Kickstarter. And Lara has previously said, and I'll quote her now: "The largest hurdle to creating and maintaining stellar site performance is the culture of your organisation." You touched on culture there, saying that the focus should be on creating an interdisciplinary approach to performance, and basically not having it siloed within any one particular department. What else does a great performance culture look like?
- Tammy Everts:
- Oh, that is a really big question. So, to try to unpack it a little bit: Lara is probably the person who coined the term "performance culture". Even though she doesn't really work in performance anymore, she does very high-level kind of management consulting and things like that, our industry still owes her a debt of gratitude for introducing this as a concept. She and her team at Etsy have always been pioneers in sharing how they do what they do. As far as how you actually create performance culture, there's not one way to do it. I've been at We Love Speed, which is a performance conference that happens in France every year, and I've been talking with different people about how they do performance culture in their organisations, and definitely there's no one right or wrong way to do it. But it often starts with one person being the champion: that person on the ground who is willing to be the point person for various teams to come to. They're acting as an ambassador, a very educated ambassador and teacher, and it has to be somebody who really likes talking with people and is really good at talking to people throughout the org, using, whoever their audience is at the time, language that resonates with that audience.
- And then the next thing that person needs to do is get buy-in from somebody much higher up in the org. That's your high-level champion, who is going to make sure that you get the resources that you need, and that it becomes a priority for everyone in the org to care about this. You really do need that kind of stamp of approval; otherwise you're not going to get the resources you need, you're not going to get the investment in tools, and your efforts are going to get deprioritised every time there's a push to release new features and things like that. Performance is like writing a blog post for your own blog: it's like, oh yeah, I'll get to all that later on, and nobody ever does. It's sort of a shoemaker's children situation. But those are two of the big elements to start with.
- Brendan Jarvis:
- So thinking about the champion, the executive champion or the senior stakeholder, to get in behind the performance efforts that you're leading: how do you go about identifying who's the prime candidate for fulfilling that champion role?
- Tammy Everts:
- One thing that can be really successful is using tools that already exist out there to tell a story about performance using your own site's data. Maybe you don't have a real user monitoring tool installed, because there's a bit of a learning curve and an investment curve to using those tools. And real user monitoring tools, while they are really, really effective for correlating your performance metrics with your business metrics and doing that kind of deep dive, you're probably not going to be able to use those right out of the gate. But there are publicly available synthetic testing tools that let you do really fun things, like benchmark your own homepage against your competitors' homepages. If your audience just Googles "synthetic web performance testing tools", you'll find these various tools out there. You can do it in SpeedCurve as well, I have to say.
- But there are other tools as well that do this, and you can create really compelling visuals. There's a visual output of synthetic testing that we call a filmstrip view, where you can actually see a frame-by-frame rendering of your page; it can even be broken down to tenths of a second. So you can really slow down how rendering happens and see a whole timeline: when your images come in, when your text comes in, and when other bits and pieces of the page come in. With synthetic testing tools, you can do this for your own page, but you can also do it for your competitors' pages, and the synthetic tools will stack-rank your page against your competitors' pages. The ability to do this has existed in the tools for quite some time, but it's probably still the most effective way to get people in your organisation to care.
- If you create a stacked set of filmstrips benchmarking your site against your competitors, and there are five sites and you are number four or number three or even number two, and you show that to people higher up in your org. People in corner offices are very, very competitive, and they don't like to see their site being second or third or fourth in anything. So that can be very, very motivating. You can kind of shop these kinds of visuals around really casually and say, oh hey, I just did some tests and it looks like we're number three and our competitors are up here. That's just a good way to open the door and see who's interested in wanting to continue those conversations.
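The stack ranking Tammy describes can be sketched in a few lines. This is a minimal illustration, not a real tool's output; all the site names and start-render timings below are invented:

```python
# Hypothetical stack ranking like the one a synthetic benchmarking tool
# produces: rank five homepages by start-render time (lower is better).
# All sites and numbers are fabricated for illustration.

results = {
    "oursite.example": 3.4,        # start render time in seconds
    "competitor-a.example": 1.9,
    "competitor-b.example": 2.8,
    "competitor-c.example": 4.1,
    "competitor-d.example": 5.0,
}

ranking = sorted(results.items(), key=lambda item: item[1])
for rank, (site, seconds) in enumerate(ranking, start=1):
    print(f"{rank}. {site}: {seconds:.1f}s")
```

Seeing your own site land third in a list like this is exactly the kind of visual that, as Tammy notes, gets competitive executives asking questions.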
- Brendan Jarvis:
- That's really interesting. And you mentioned those people in the corner offices, and we often talk about these people on the show, and a lot of our emphasis, particularly within the design community more broadly, has been on how do we build better relationships with them? How do we, I suppose, get more of us into those corner offices, those sorts of aspirations. And I'm going to grossly generalise now as well about people in corner offices. You mentioned that they are competitive. One of the other things that I suspect is that they are also political actors, in order to rise to that level of the organisation. So once you've keyed in on someone who's giving you all the signals that they are willing to get behind this, what's in it for them? How do you position the conversation around performance? Maybe not overtly, but how do you even tacitly communicate the benefit to them, in their career and their organisation, particularly within a large enterprise, of getting in behind performance?
- Tammy Everts:
- Money always talks. Money, profit, is very compelling. With real user monitoring tools, let's say you've got somebody's interest with the competitive benchmarking, and maybe they're interested enough that you say, well, we can try out a RUM tool, RUM being short for real user monitoring. We can try out a RUM tool and even do, for example, some AB testing where we serve 90% or 95% of our users an optimised version of our page, but we serve 5% of our users a throttled page. You can then see a correlation chart, as a result of that AB test, that shows that conversions are lower for the throttled cohort of users, or bounce rate is higher for the throttled cohort. And from that you can extrapolate: you see what throttling does, so what do you think optimising will do?
- You can take the results of these correlations where you see that, well, if conversions dropped by 2% when our page rendered one second slower, we can guess that perhaps conversions will increase by 2% if we make the same page a second faster. That's kind of a gross overgeneralisation, but it's using that kind of language and that kind of thinking that can at least get people thinking about the fact that there's money being left on the table, if you can't quite get people to the point where they want to look at their own data or start futzing around with their own site and their own monitoring. At this point, there are so many case studies. When you introduced me, you talked about WPOstats.com. It's a repository of case studies going back several years: all kinds of companies, e-commerce companies, media companies, travel companies that have done this work. They have done the AB tests, and they are sharing their own data and their own findings. And we don't allow any case studies to appear on WPO Stats unless they prove that the company actually made some kind of business or UX impact by making pages faster.
- It's not enough to make pages faster. You can do that and still not have a business impact. You need to show that there was a business impact. There are a lot of really good, compelling case studies in there, and probably something that resonates with your industry. That then might also give you a little bit of a framework for doing your own research: how to set up your own AB test and how to do your own extrapolation.
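The throttled-cohort comparison Tammy describes comes down to simple arithmetic. In this sketch the cohort sizes and conversion counts are invented purely to show the calculation:

```python
# A minimal sketch of reading out the A/B test Tammy describes: most users get
# the normal page, a small cohort gets an artificially throttled one, and you
# compare conversion rates. All numbers are hypothetical.

def conversion_rate(conversions: int, sessions: int) -> float:
    return conversions / sessions

control_rate = conversion_rate(conversions=4_750, sessions=95_000)   # normal page
throttled_rate = conversion_rate(conversions=150, sessions=5_000)    # throttled page

relative_drop = (control_rate - throttled_rate) / control_rate
print(f"control conversion rate:   {control_rate:.1%}")
print(f"throttled conversion rate: {throttled_rate:.1%}")
print(f"relative drop under throttling: {relative_drop:.0%}")
```

The "what do you think optimising will do?" pitch is the mirror image of the relative drop this computes.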
- Brendan Jarvis:
- I'm not sure if it's a WPO Stats case study, but you've given examples before of how small changes in the web performance of an experience can lead to quite impressive improvements in business metrics. And one of the case studies you've talked about is Fanatics, which as I understand it was a retail site of some description, and they made some basic modifications to the site along the critical rendering path, including things like compressing some images and deferring the loading of a sprite, as well as optimising some CSS and JavaScript. And the story goes that this led to a two-second improvement in the metric they were measuring, and as a result it doubled their mobile conversions. And that sounds awesome. Is that measurement of business impact as straightforward as it sounds in a case study?
- Tammy Everts:
- Yeah, I mean, those kinds of results are really typical of what I do see a lot of the time. Performance is kind of about two things. One is making changes to your code base or to your design that make your page faster, and the other is doing clever things and tracking performance over time to make sure you don't regress after you've done those great initial things. And my colleague Andy Davies at SpeedCurve, he's an amazing performance consultant and he's been doing this for a long time. He refers to things like, well, just looking at: are your images optimised? Is your hero image actually optimised? Have you deferred the JavaScript that needs to be deferred? He calls it the dull, boring stuff, but often looking at the boring stuff can yield results, especially in the early days. And interestingly, that puts me in mind of another way to have a performance culture.
- I was talking to somebody a few years ago at a conference, and she said that she's a performance consultant, and when she goes into a new client, she just looks for that easy, boring stuff that you can readily identify. If you know what you're doing, you can find those unoptimised resources and fix them, make things a couple of seconds faster right away, and suddenly you're a hero, and suddenly that opens up a lot of doors for you, because people believe in what you've done and they believe in you as the person who did it. The Fanatics case study is a good example of that, where the team that did those optimisations saw an opportunity for some really easy wins. These fixes did not take a day. And being able to track over time, and those were the only changes to the page: they didn't change the design in any other way, or the content. It was just behind-the-scenes optimisations to the page. If you looked at it as a flat image, it looked the same for all intents and purposes. And that's how it was a lot easier to see, like, okay, well clearly the gain in conversions was because you did this thing. It's not because a bunch of other things on the page changed as well.
- Brendan Jarvis:
- Yeah, and I suppose, getting at the heart of what I was curious about: when it comes to putting your head above the parapet and going, hey, if we do this thing, then it's going to lead to this great result, and then when you actually do that thing and attribute your actions to a meaningful movement in a business metric, it lends itself to criticism if that attribution is not concrete. And so I was curious about how it is that you can be confident that the business metric you are trying to tie the performance improvement to is watertight. Is there a golden rule here, or some best practices that people can follow, so they don't leave themselves open to jabs from people that may want to detract from their efforts?
- Tammy Everts:
- That's a really good point, because that is a recurring question that comes up. So there are two things, and I'm going to say them out loud right now because otherwise I won't remember the second thing by the time I finish talking about the first one that I want to address. One is the performance plateau, which I'll get to later on. And the first is really getting back to the idea of doing AB tests, because an AB test is really your most solid way of demonstrating performance impact. So do an AB test, or a multivariate test, where you really are looking at the same page being delivered to people over the same period of time. That heads off some of the common objections that people can have: oh, maybe this wasn't a performance win, maybe this was because we made some other change to our copy, or we made some changes to the offer on the page, or we did an ad campaign or something like that.
- So the best way to just get rid of all of those objections is to do side-by-side testing of the same page, with the only variable being that the page was optimised for a big chunk of people. As a funny note to that, years ago at another company, we did that for a customer, a very large US retailer, very, very large, and they were only slowing down 1% of their traffic. They were in this proof-of-concept phase. This is probably 2012, so still kind of the early days of performance, and they really needed to convince their bosses even higher up the food chain that performance was something they needed to care about. So they only slowed down 1% of their traffic, which was still a lot of traffic for this particular site, and the experiment was supposed to last for 12 weeks, and they cut it short after two weeks because they realised how much money they were losing.
- So that was very gratifying, and we never wrote it up into a case study because they didn't want to do that. Anytime you see a performance case study, or any case study really, I hope people have an appreciation of how much effort it took to get that case study across the finish line and published, especially if there's a big brand behind it. Somebody had to work really hard to do that. So AB testing is your best way of convincing people. If you can't do an AB test and you are looking at somewhat more hand-wavy before-and-after type scenarios, one thing that can come up for people is: well, we made the page faster. We fixed a bunch of things and the page did get measurably faster, but nothing changed as far as bounce rate or conversion rate or anything like that.
- And that was a bit of a head-scratcher for a while, until we actually started looking at more data. So, looking at correlation charts. A correlation chart is really hard to describe, and I tend to start drawing lines through the air, which is so helpful for people, but it's really a chart that shows cohorts of users based on how fast their experience of using your site was. Those are the bars that go across the graph, and then the line that goes over the bars is whatever business metric you're tracking, so it could be conversion rate, for example. So what has been discovered, and I don't want to say that I discovered it, I think I might have been one of the people who noticed it really early on, I've just been doing this for a long time, so I don't want to accidentally take credit for anything, is that if a page goes from being three seconds to two seconds on a significant metric like start render, suddenly conversion rates will improve, or bounce rate will improve.
- But maybe when we go from seven seconds to five seconds, nothing happens, because there's a long plateau on that metric line where just nothing really changes. So the performance plateau is just that: it's a stretch where you can move and improve metrics across the plateau, but you're not going to see a change to any of your business metrics. You kind of have to get into that sweet spot. I don't want anybody to take this as gospel, because it really varies from site to site; the performance plateau really does vary quite a bit for different websites. But typically, where we see change happening in terms of a performance metric is if you can take your numbers from five seconds to three seconds, or four seconds to two seconds, or three seconds to one second. Things like that are where you're going to see a lot of improvement to your business. Going from nine seconds to five seconds, even though that's very, very dramatic and isn't unhelpful, you're not going to see dramatic business gains.
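The data behind the correlation chart Tammy is gesturing at can be sketched directly: bucket sessions into one-second cohorts by start-render time, then compute each cohort's conversion rate. The session data here is fabricated for illustration:

```python
# Sketch of the data behind a correlation chart: one-second start-render
# cohorts across the x-axis, conversion rate as the overlaid line.
# The (start_render_seconds, converted) pairs are made up.

from collections import defaultdict

sessions = [
    (1.2, True), (1.8, True), (2.1, True), (2.4, False), (2.9, True),
    (3.5, False), (4.2, False), (4.9, False), (5.1, False), (6.8, False),
]

cohorts = defaultdict(lambda: [0, 0])  # bucket -> [conversions, sessions]
for seconds, converted in sessions:
    bucket = int(seconds)              # 1s-wide cohorts: 1.x s, 2.x s, ...
    cohorts[bucket][1] += 1
    cohorts[bucket][0] += int(converted)

for bucket in sorted(cohorts):
    conv, total = cohorts[bucket]
    print(f"{bucket}-{bucket + 1}s start render: "
          f"{conv}/{total} converted ({conv / total:.0%})")
```

With real RUM data, the flat run of near-identical conversion rates across the slower buckets is the plateau she describes.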
- Brendan Jarvis:
- It's like entering the Goldilocks zone here; it's that sweet spot that you need to get to. So I was curious, can you predict what the sweet spot is in terms of the plateau, or can you only see it based on historical data?
- Tammy Everts:
- What you can see, and what you can sort of extrapolate, is that if you notice that your cohort of users with a start render time of two seconds has a noticeably better conversion rate, for example 10% for the two-second group and 5% for the four-second group, you can guess that if you were to move more of those users from the four-second cohort to the two-second cohort, that would probably improve whatever business KPI, key performance indicator, you're looking at. But that's just extrapolation. There are people who try to do kind of predictive analytics around: if you make X change to a performance metric, you'll experience Y result in your business metric. And I think those are interesting to do, and it's good to maybe set it as a goal or a hypothesis, we believe or we hope that making this change to our performance metric will help our business. But to say that it's actually going to happen is disingenuous. I personally wouldn't feel comfortable doing that, because if you disappoint people, they're going to remember that. They're going to remember that you said the business metric could improve and it didn't. So
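The extrapolation Tammy outlines, using her 10% vs 5% example, looks like this in code. The cohort size and moved fraction are invented, and, as she stresses, the result is an estimate, not a promise:

```python
# Hedged extrapolation sketch from the example cohort rates above: estimate
# the extra conversions if some users in the slow cohort could be given the
# fast experience. All figures are hypothetical.

fast_cohort_rate = 0.10       # conversion rate, 2s start-render cohort
slow_cohort_rate = 0.05       # conversion rate, 4s start-render cohort
slow_cohort_size = 40_000     # users currently in the slow cohort
moved_fraction = 0.5          # suppose half of them get the faster experience

moved_users = slow_cohort_size * moved_fraction
estimated_extra_conversions = moved_users * (fast_cohort_rate - slow_cohort_rate)
print(f"estimated extra conversions: {estimated_extra_conversions:,.0f}")
```

Framing the output as "we hope to see on the order of this many extra conversions" keeps it an experiment rather than a commitment, which is exactly the caveating discussed next.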
- Brendan Jarvis:
- You need to caveat the hell out of it to make sure that it's not a promise as such. It's just a prediction.
- Tammy Everts:
- A good piece of advice for life in general, but it applies to this in particular: anytime you're engaged in an activity where you feel vulnerable, frame it to yourself and other people as an experiment, and that makes you feel a lot less vulnerable. So it's like, oh, I'm just going to try this, this is an experiment. So if you want to get into the predictive analytics game, but you also want to kind of cover your bases, just say, well, we're going to experiment with the idea that making this improvement to our performance metric might have this impact on our business metric. And if you call it all an experiment, you might get away with it,
- Brendan Jarvis:
- You might not get fired. Tammy, I want to come back to something that you said earlier, where you were describing, I think, performance culture and how everyone who touches a page needs to care about performance. And you mentioned a number of different departments. I think you talked about marketing, people that are uploading content, whoever's got that responsibility. Engineering's obviously one. But I noticed that design was absent from your description there. Now, that may have just been that you were thinking of a few examples, or it may be symptomatic of a broader problem. Why I'm asking about this is because design is always looking for a tangible way to demonstrate its value and often falls short of ideas as to how to do that. It's sometimes not that easy to tie the efforts of a design organisation or a UX organisation to an outcome that makes sense for a business. Is it symptomatic of a bigger problem that design wasn't top of mind there when you were describing who should care about performance?
- Tammy Everts:
- Yeah, and I guess part of the reason why design didn't come up for me is that I do a lot of customer calls with SpeedCurve users, and designers are strangely absent from those calls, and I don't put the blame on the designers. I sort of wonder if designers have just been left out of the process because their involvement often comes so early in the game. Design is kind of just treated as a thing that happens at the beginning of a project, and maybe their hands-on involvement doesn't stay throughout the lifecycle of a page. One area where I really feel the absence of designers is campaign landing pages, for example. So the holidays are coming up, and a tonne of new campaign landing pages are going to appear. And what's really interesting about how those are created, and how that process is managed, as opposed to an entire legacy website that's been around for a long time, is that you have teams of engineers, developers, everybody who's responsible for this legacy website, and they have, hopefully, a lot of standards and processes and code freezes that they do.
- But leading up to Black Friday and everything like that, and happening in their own silo, are campaign landing pages that are done to support ad campaigns, magazine campaigns, just all the money that gets spent funnelling people to holiday gift guides and things like that. And what's interesting about those pages is that I test their performance, and often they're very, very non-performant, because, and I'm just theorising here, the people who create those pages are not really invited to be part of the whole performance and usability conversation. They're kind of treated as one-off. Designers just come in and build a page; it's maybe done through marketing, so it sits underneath the umbrella of marketing, and it's not subjected to the same rigour, which is incomprehensible to me, because this is the page that you're putting all your ad money behind. You would think that of all the pages on your website, even more than your homepage. A homepage doesn't actually matter that much in terms of conversions; people don't go to most sites through the homepage. They come through search and get parachuted into various other parts of the site. A campaign landing page should be the most scrutinised page of your site during the holidays, and often it's not. It's an afterthought, from both a usability and a performance perspective.
- Brendan Jarvis:
- Following on from this, you've previously said, and I'll quote you again now: “If you don't consider time to be a crucial usability factor, you are missing out on an extremely huge aspect of the user experience.” Well, that sums it up. It seems like designers, the world of design, the organisations of design, are very concerned about user experience. It's kind of the thing that we live and die by; that's why we exist, to create great user experiences. Yet, from what it sounds like in your experience, we are almost entirely absent from that conversation around the performance part of the user experience. What can or should we do about that?
- Tammy Everts:
- I think it's kind of one of those take-the-bull-by-the-horns situations. I think I'm just a little bit stubborn, and I don't like not being part of conversations when I know that they're happening. So if you're not being invited to the conversation, I think you have to kind of push your way in the door. And that might start with familiarising yourself with some of the tools that are used, so that you can start to monitor your own creations. You can use synthetic tools to just test the performance of your pages. A synthetic performance testing tool, for your audience that might not know what this is, allows you to enter a URL into the tool and fine-tune the simulated environment. So you can tell the tool what browser you want to use, what region you want to run the simulation in, what connection type you want to emulate, and a lot of other variables.
- And the synthetic tool will give you an emulated idea of how that page might perform for a real user. It's not going to be as good as a real user monitoring tool, but just by showing that you have the facility to use these tools and a willingness to have these conversations, you can open the door. And maybe the reason you're not invited to the conversations is because the conversations actually, legitimately, aren't happening. I wish they were happening at every company; they're not. And you can be the person to facilitate that, to just get it started, or maybe just by showing an interest, get in there and get yourself added to the group of people in your company. As I said, I speak to a lot of developers and engineers, and I don't think that they're intentionally trying to exclude people. I think it goes back to what we were talking about earlier, where everybody's just really busy, juggling a lot of things, trying to do more with less, and sometimes it's easier just to do something yourself than it is to involve other people.
- And it's hard sometimes to see the ultimate reward of taking that extra time upfront to involve other people. If I can just give a quick aside: SpeedCurve was started by Mark Zeman, who is an amazing graphic designer. He was the creative director of a web agency, and he built SpeedCurve because he actually wanted a tool he could use that was going to monitor and test the pages that he was building, but also have really compelling visualisations to demonstrate the impact of performance for less technical folks. So given that there are tools like ours out there that were created by designers, they should be used by designers as well. I would really hope that if people aren't already using tools to do synthetic testing, they give it a shot. It's not as hard as it sounds.
- Brendan Jarvis:
- No, and I've had a look at SpeedCurve and some of SpeedCurve's competitors, and it really seems like it's quite a mature set of products now that are available to people. So you should definitely check them out. Some of the studies that you've done have been pretty high tech. You've been involved in something that is called facial action coding, and I had not heard of this before I was looking at the material for today. That, as I understand it, is where you look at people's micro-expressions while they're experiencing something. And there was one study, I believe, where you were doing an AB test of a throttled experience, where one cohort was looking at the fully performant experience and the other was looking at an intentionally slowed-down version. What did that study, or studies like that, teach you about people?
- Tammy Everts:
- Yeah, so these studies are really fun. I should state that I directed the research, but we outsourced it to companies that do this type of technical research. So they're the ones who actually have all the real research skills and the really cool gear, but I got to be part of it, and it was really fun and very interesting. So facial action coding research is really interesting. And as a user experience person, just the fact that this capability exists is so fascinating, because you don't even need to bring people into a lab to do this. It's done through people's webcams. So you can use this tech to recruit a lot of people, as many as you would like to pay for, depending on the scope of your research, and the webcam records their facial expressions while they're engaged in whatever usability task you want them to do.
- So it could be performance related, it could be something else, and it's tracking all of these very subtle little micro-expressions, just the way that our faces move. And I had not believed that such a thing could be possible. I guess I tended to think that everybody just has the Seven Dwarfs emotions. You're just, like, Sneezy and Grumpy, and I can't remember the rest. Sassy? I don't know. But there are a lot of more nuanced expressions that are capturable, and we can take all that data and synthesise it and actually say, well, typically people experienced frustration or confusion, or all of these different kinds of more nuanced emotions, when they were doing different parts of whatever task you asked them to do. And so you can trend this and sort of see, oh, okay, well, this part seemed to be most confusing, or this part seemed to be most frustrating, or people seemed actually pretty happy and relaxed when they were doing this thing. So for that particular study, what we were experimenting with was getting people's reactions to different image rendering formats. So baseline image rendering, which is the top-to-bottom rendering, and then progressive, which is the layers. I'm telling you things you already know, I'm sure.
- Brendan Jarvis:
- It's all good.
- Tammy Everts:
- It's all good. And what was interesting about this is, I think most of us are kind of on team progressive image. It's just industry standard, for a number of reasons, because it sort of just feels better. But when we did this particular study with the facial action coding, we found that when people are watching an image render progressively, their faces are working a little bit. It's very subtle, but they're actually trying, on some level, to parse what they're seeing and compose the image; it's a little more cognitive load for them to do that. Whereas with a baseline image, which just renders top to bottom, people's faces were more relaxed; they weren't working so hard. And that's not to say you should switch over to using baseline images. I'm still on team progressive JPEG, but it was just an interesting little revelation that just because something seems like it's making us happier doesn't mean it's actually the better practice. That was the facial action coding study.
- Brendan Jarvis:
- And this is tied into waiting, right? And you've shared some pretty fantastic statistics before about waiting, and one of them is that we perceive wait times to be 15% slower than they actually are, and afterwards people report them as being 35% longer than they actually were. And related to this, there was another study that you were involved in, where you again looked at user behaviour while slowing down an experience to various degrees, and you had a really interesting insight about the longer-term impact that forcing people to experience longer wait times on websites had on their perceptions of that experience. What was that?
- Tammy Everts:
- If it's the one that I think you're talking about, it was an EEG study. We put EEG headsets on our participants, we gave them mobile devices, and we asked them to engage in a few different flows through the sites. So a couple of them were travel sites, so we had 'em try to book some travel, and a couple of them were e-commerce sites, so we had them fill a shopping cart. And we had divided our participants into two cohorts: the people who were getting the fast experience, and the people who were getting an artificially throttled experience. And none of them knew they were part of a performance study. They didn't know that we were looking at site speed or anything like that. They just thought this was a straight-up usability study, where we were just going to see how a site worked for them and whether they were able to complete tasks.
- What we found were a couple of things. One was, not surprisingly, we anticipated this: people's frustration levels spiked during certain portions of the various transactions on the slow site. We found that they spiked during the cart-filling phase and during the checkout phase. So that wasn't a big surprise, but the spikes were pretty significant, up to 26% or 28%. I can't recall the exact number. But then we gave people exit interviews, as one does after any kind of usability study, and we asked them for their impressions of the sites. We took all of the adjectives, the descriptor words that people used, for the fast cohort and the slow cohort, and we fed each set of words into a word cloud generator. And I love word clouds. They're so fun; they've kind of gone away.
- They were really fun for a while. And what was interesting was that the word clouds for the two cohorts were very different. Same set of sites, but very different words. Even though nobody knew they were part of a performance study, for the people in the slow cohort, the word "slow" is the biggest word in the word cloud. But the other thing that's really interesting is that the word cloud for the slow group was much bigger than the word cloud for the fast group. The people who had experienced the slower sites wanted to talk more, and they wanted to complain more. And the things they were complaining about, other than slowness, were things like: they said the design was boring and tacky, they thought the site wasn't very usable, they complained about the navigation, they complained about the content. Design, content, everything. Whereas the people who experienced the faster sites had a few little complaints, but a much smaller word cloud, and they were out the door a lot faster. So the fact that the slow site affected all these other things that were not part of the performance itself, that was a really eye-opening moment for me: that slow performance could actually affect the entire perception of the site and the brand.
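The word-cloud comparison boils down to word frequencies per cohort. In this sketch both adjective lists are invented; only the pattern, that the slow cohort says more and "slow" dominates, reflects the finding described above:

```python
# Sketch of the exit-interview comparison: tally descriptor words per cohort,
# the way a word cloud generator weights them. The word lists are fabricated.

from collections import Counter

slow_cohort_words = ["slow", "slow", "slow", "slow", "boring", "tacky",
                     "confusing", "cluttered", "frustrating"]
fast_cohort_words = ["clean", "fine", "simple"]

slow_counts = Counter(slow_cohort_words)
fast_counts = Counter(fast_cohort_words)

print("slow cohort:", slow_counts.most_common(3))
print("fast cohort:", fast_counts.most_common(3))
print("extra words from the slow cohort:",
      len(slow_cohort_words) - len(fast_cohort_words))
```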
- Brendan Jarvis:
- And if there was ever a reason for designers to draw a line between the work we do and the eventual outcome when it's put in the hands of a user, that would be it, right? That all of it is for naught, or at least most of it becomes significantly impacted, if that performance isn't at a level where people can, I suppose, blissfully be unaware that they're experiencing frustration because of slow loading times. And you talked about this earlier on, way back when we first started this conversation: how our technology isn't really designed to work the way in which we as humans work best. You talked about flow state. What impact or insight, or perhaps it's a north star or a goal, should we be aspiring to in terms of how that experience should feel for people, so that it doesn't lead to these negative perceptions that cloud the rest of their experience?
- Tammy Everts:
- Oh, there's definitely a lot to unpack in answering this, so stop me anytime if I'm bopping around too much. It goes back probably, I don't know, 200 or 300 years, to early studies in persistence of vision. Very early studies where they lit something on fire, like an oily rag or something like that, tied it to a wheel, spun it around, and realised that roughly 100 milliseconds per revolution of the wheel was enough to make that one bit of fire look like a perfect circle. And that's rough; everybody's different. Some people have more visual acuity than others, and so that number might be a little bit lower. But generally speaking, anything that happens in under a hundred milliseconds feels instantaneous to us. So that's the north star. Anything slower than that, and, well, Jakob Nielsen has done some really interesting research at the Nielsen Norman Group, and even revisited that research, and his findings basically mirror this: under a hundred milliseconds feels seamless, and at anything above 10 seconds your brain is context shifting.
- You've already kind of checked out. And then there are just various degrees of attention loss that happen in between a hundred milliseconds and 10 seconds, and those vary from individual to individual. What's interesting about his research, as I said, is that he did it more than once. I think the first time he did it was in, I want to say, 2000, and then maybe again in 2010, roughly, and the findings were the same. And that doesn't come as a big surprise, because when you look at the history of the study of human-computer interaction, going back to the 1960s and the early days of computers, if you look up early studies in human-computer interaction, one of the findings was that even in early computer use, any delay of more than two seconds made it very challenging for people to come back on task.
- So the numbers might change a little bit. I think we tend to be optimists, strangely. I know a lot of people think that we live in pessimistic times, but I think we do tend to be optimists and think, well, I'll get used to it. I'll just get accustomed to the fact that things are slow and computers are laggy and web pages are glitchy and things move around on my phone screen and I have to chase them with my fingers and hit the wrong button and go back and all those things, and you can kind of get used to it. Obviously we have to, because many of the things that we need to do in our day-to-day lives require us to, but that doesn't mean we're actually used to it. Our flow state is actually integral not just to our ability to be productive, but to our mental health.
- We need to engage in as many flow state activities as possible. Modern life doesn't really allow us to do that. We engage in a lot of really glitchy, non-flow things, like sitting in traffic and waiting for elevators and all the rest. I don't need to list them for you; we live them every day. Thousands of years ago, the things that we would do would be much more seamless. We would go and herd the cattle, and then we would milk the cattle, and then we would come into the house and we would make a meal for our families. All the things we did just had an organic flow to them, and they don't anymore. As a little aside, I would love it if everyone who watches this podcast takes away some very important things about the web and usability and performance testing and all those things, but I think it would mean a lot to me if I felt like people were going to take away the idea that they actually need to take flow time for themselves outside of tech, and do something that gives you that flow state of wellbeing, of feeling just the right amount of challenge, and of losing your sense of time.
- You'll be a happier person if you do this, ideally every day, for anywhere from half an hour to an hour.
- Brendan Jarvis:
- You used that word alongside flow state, seamless, and that's a word that in user experience circles, and even marketing circles, and possibly even in engineering circles, gets thrown around a lot. But it sounds like there's actually a really meaningful, measurable point that we should be aiming for, that hundred milliseconds, that would make that possible for people, and it wouldn't just be this random buzzword that people throw in to describe the experience that they're trying to create.
- Tammy Everts:
- For me, the research that I'm interested in is the why of that. Buzzwords can be annoying, but not because of the fact that they exist, because I think, if we explore them, they exist for a reason. What's annoying about them is how we get kind of anaesthetised to them and we stop thinking about the meaning behind them, and it becomes very sort of Orwellian. The word loses meaning, or it takes on an opposite meaning, or something like that. So we need to just be revisiting our buzzwords and making sure that we understand why we use them and why they matter.
- Brendan Jarvis:
- Tammy, my last question for you is, and I'll quote you one last time, you've said previously, "Time is life. Time is all we have. Treat yours as the precious thing that it is. Remember that for other people, time is a precious thing. Take that into account in all the things you do." Now, "time is life" is a really powerful statement, right? You're not mincing words there. As user experience professionals, we create things that people often give small, but sometimes large, portions of their time to. How are we best to take that into account when we are designing?
- Tammy Everts:
- That is a very big question that no one's ever asked me before. I mean, the first thing that comes to mind is how tech can really manipulate us into wasting a lot of time. The obvious one is endless scroll, where we're getting meaningless content that we're sort of addicted to. So tech that's designed to steal our time away from us. And I do feel it is that, because our brains are much simpler, well, they're complex, but they're simple, and we are very easily manipulated in some ways. And to feel that your time is being taken from you by manipulative technology feels like a violation. I take it really personally, and I've been subject to it many times, so I'm definitely not above having this happen to me, even as somebody who knows that it's happening. I feel like the best tech experience is the one that's fast.
- Ideally, we should be on tech less. Get in, do the thing you need to do, get out. We shouldn't be using tech for the sake of tech. We should be using tech to make the rest of our lives better, so the less time we spend on it, the better. I tell people at SpeedCurve, we don't want you to spend an hour using our tools every day. We want to make it possible for you to get in, get what you need from our tools in minutes, and then go on and do the things you need to do with that information, so that you can then be done with your job and go and enjoy your life. Those aren't just words; those are deeply felt beliefs. So yeah, less time wasted on tech, more efficient tech, so that we can all step away from our computers and enjoy the world.
- Brendan Jarvis:
- That sounds like a great goal, Tammy. This has been a really insightful conversation into the world of web performance and how it relates to UX. Thank you for sharing your stories and your insights with me today.
- Tammy Everts:
- Thank you for having me. You've asked such good questions. I've really enjoyed this.
- Brendan Jarvis:
- Oh, me too. It's been a pleasure. Tammy, if people want to catch up with you and follow all your contributions to the world of web performance and UX or connect with you personally, what's the best way for them to do that?
- Tammy Everts:
- I am on Mastodon. If you search for Tammy Everts, there's a web perf server. We like to say we shorten web performance to web perf in our world. So there's a web perf server. You can find me there. I'm just @Tammy.
- Brendan Jarvis:
- Great.
- Tammy Everts:
- There aren't a lot of Tammys in web performance, so I just took it. And you can find me on LinkedIn as Tammy Everts, and then on the other social media platform as Tammy Everts, so you can find me there as well.
- Brendan Jarvis:
- Thank you, Tammy. And to everyone who's tuned in, it's been great having you here as well. Everything we've covered will be in the show notes, including where you can find Tammy and all of the things that we've spoken about.
- If you've enjoyed the show and you want to hear more great conversations like this with world-class leaders in UX research, product management, design, and now web performance, don't forget to leave a review. Subscribe to the show also, so it turns up every two weeks. And perhaps just tell one other person about these conversations if you feel that they would get value from them.
- If you want to reach out to me, you can find me on LinkedIn, just search for Brendan Jarvis, or there's a link to my profile on LinkedIn at the bottom of the show notes. And lastly, you could head on over to thespaceinbetween.co.nz. That's thespaceinbetween.co.nz. And until next time, keep being brave.