Laura Kalbag
Accessible, Inclusive, and Ethical Design
In this episode of Brave UX, Laura Kalbag calls on businesses using people’s data for profit to examine the ethics of that model, and talks practical inclusive design and digital accessibility.
Highlights include:
- How are accessibility and inclusive design different?
- What does the inaccessible state of the web say about technologists?
- What’s wrong with companies profiting from our behavioural data?
- Why have you gone to great lengths to live up to your values?
- What can we do to help our organisations to make more ethical decisions?
Who is Laura Kalbag?
Laura is a designer and developer, as well as the Co-Founder of the Small Technology Foundation, a two-person (and one-husky) not-for-profit that strives for a more ethical, more private, and more just technology industry.
From 2016 until December 2021, the Small Technology Foundation made and supported Better, a digital privacy tool for Safari - across Apple’s operating systems.
Laura is also a passionate proponent for creating a web that is inclusive and accessible. In 2017 she published her first book, “Accessibility for Everyone”, through A Book Apart.
The book is a guide to the accessibility landscape, helping people to understand disability and impairment challenges, get a handle on important laws and guidelines, and learn how to plan for, evaluate, and test accessible design.
Transcript
- Brendan Jarvis:
- Hello and welcome to another episode of Brave UX. I'm Brendan Jarvis, Managing Founder of The Space InBetween, the home of New Zealand's only specialist evaluative UX research practice and world-class UX lab, enabling brave teams across the globe to de-risk product design and equally brave leaders to shape and scale design culture. Here on Brave UX though, it's my job to help you to put the pieces of the product puzzle together. I do that by unpacking the stories, learnings, and expert advice of world-class UX, design and product management professionals. My guest today is Laura Kalbag. Laura is a designer and developer as well as the Co-Founder of the Small Technology Foundation, a two-person and one-husky not-for-profit that strives for a more ethical, more private and more just technology industry, one that improves human welfare over corporate profits. Now if those sound like fighting words, you'd be right.
- You may even go so far as to say that Laura is somewhat of a design and technology activist. From 2016 until December 2021, the Small Technology Foundation also made and supported Better, a digital privacy tool for Safari across Apple's operating systems. Laura is also a passionate believer in designers and technologists creating a web that is inclusive and accessible. She published her first book, Accessibility for Everyone, through A Book Apart in 2017. The book is a guide to the accessibility landscape, helping people to understand the disability and impairment challenges that people have, get a handle on the important laws and guidelines that are out there, and to learn how to plan for, evaluate and test accessible design. Laura is a generous sharer of her perspectives and her knowledge, traveling all over the world (when that was a thing) to give talks at conferences and events like FF Conf in Brighton in the UK, Webstock in Wellington, in my home country of New Zealand, and From Business to Buttons in Stockholm in Sweden. And now it's my pleasure to welcome Laura here to speak with me on Brave UX today. Laura, welcome to the show.
- Laura Kalbag:
- Thank you so much for having me. That was such a wonderful introduction. I love the way that you said first book, like I've got another one in me at some point.
- Brendan Jarvis:
- I'm sure you do. I'm sure you do. I'm sure you've got plenty more, lots more to share. Now, Laura, I understand that you are British, but you've made the move recently to Ireland, and I thought that was interesting because that's the equivalent, in my part of the world, of someone from Australia moving to New Zealand. It usually goes the other way round, right? It's not something that usually happens that way.
- Laura Kalbag:
- It is fairly unusual, I'm assuming.
- Brendan Jarvis:
- Yeah. And look, I'm assuming it wasn't the weather; one doesn't go from the UK to Ireland expecting better weather. It's usually the UK to Spain. So what was it that prompted you to leave home, leave the UK and head over to Ireland?
- Laura Kalbag:
- Well, actually, between Ireland and the UK we'd moved to Sweden as well. We spent two years in Sweden and then came over to Ireland. So really it was prompted by the UK government. We were working on things to protect people's privacy, and the UK government, when the Conservative government started to look like they were going to get in, they were talking about passing laws that would ban encryption and other things like that, things that are actually becoming familiar across the world. It's not unique to the UK in any way whatsoever. And so we thought, well, let's look at somewhere that might be more accommodating to a small organization that wants to work on these projects. We're only teeny tiny. We don't have the funds to be able to fight legal battles and things like that. It's not our area of expertise at all.
- And we were also looking at the prospect of Brexit. And my partner, Aral Balkan, who also works at the Small Technology Foundation, actually has French citizenship, and there was the prospect of it being difficult to then live in the UK and work in the UK. We thought, hey, well, let's look at what our other options are. And we spent two years in Sweden, which is an amazing place. Malmö, where we were, is in the very south, very near to Denmark. And it was wonderful there, but it was very bureaucratically difficult to set up business and organization stuff. And also we kind of learned the grass is always greener. We hear so much about how Sweden is this wonderful progressive country that will make it easy for you to work on the things that you care about. But honestly it's very difficult, particularly as a non-citizen, to get those same rights and to be able to do those same things.
- And we found it a bit of a struggle. And so we thought, well, let's look at where else. And then looking at Brexit potentially happening, we had to think, well, actually, as a UK citizen, where do I have the rights to live where I don't have to worry about getting a job that gets me a visa or something like that. And so we settled on Ireland, and I have to say I love Ireland. It's a really friendly country. It has a lot of what I love about British culture, two cultures growing up alongside each other, for worse and for better. So it feels familiar to me, but it's also got a lot that the UK doesn't have, and it's not too far away from my family as well.
- Brendan Jarvis:
- Yeah, I wouldn't imagine it would be much more than an hour's flight from Ireland back to the UK, if that.
- Laura Kalbag:
- Yeah, it's not even that. And I'm not one for getting planes all the time. I don't want to do that much travel, but it means that if I need to get there, I can. And that's important.
- Brendan Jarvis:
- Hey, just briefly coming back to Sweden, you mentioned the difference between what you may have expected it would've been like and what it was actually like to try and get established there. What were the sort of things that you were struggling to get done that would've been easier for Swedish citizens to actually do?
- Laura Kalbag:
- So it was things like, in order to get a business, you need to have a bank account. In order to have a bank account, suddenly, oh, you need all of this paperwork, you need to have all of these forms of ID. I think you need to get a personal number, which is a fairly unique concept to Sweden, in that it's a number where you can be looked up and someone can find out pretty much anything about you if they have access to your number. And it's not a private system, so, especially if you are interested in privacy, it's kind of a scary area to go into. You're like, oh, do I want to commit this much to this country where I'm kind of letting anyone look up my home address? So it wasn't necessarily that we didn't want to get those numbers, because I think if we had to, we would've been willing to, but it just became so difficult to do anything, to even get those numbers in the first place. And I think in contrast, in Ireland it's been very easy to get registered to be part of the health system, to be able to set up an organization. It seems to be a system that's not necessarily as technologically advanced or as huge as Sweden's, but it works out, and there's always people willing to help you if you get stuck as well.
- Brendan Jarvis:
- Yes, I have to say the Irish do have a very good reputation on the world stage for being very hospitable and very happy to help people; that's certainly been my experience meeting Irish people over here in New Zealand as well. Now, if anybody's listened to anything that you've talked about in the past, watched any of your previous talks or listened to podcasts, they would've heard you speak about ethical tech, accessibility, inclusivity, these quite important and large topics that our industry is currently grappling with. And you are also quite clearly willing to make personal sacrifices, to change your life in a way that, and I'm projecting here, so jump in if I'm putting words in your mouth, fits better with your ethical stance on certain issues. And I'm talking here about moving country twice so that you can live somewhere where the laws may be more aligned with your personal values. Why is it though, if you think back about where your interest in all of these big and important issues began, why did you choose to focus on these issues and go to such great lengths to stand by them and do your best to live up to your values?
- Laura Kalbag:
- I think going to those lengths is, to some degree, a little bit of naivety. Especially when you first start learning about the world around you and you hear about these wonderful things in other places, you think, oh, well, I'll go there and it'll be easier and it'll be better. And it's naive [laugh], it's not really like that. And it is very difficult moving. It's very difficult leaving your culture. It's very difficult leaving what you're used to. We don't realize how much these systems support us. But I think that basically I care a lot. I care about injustice and I care about doing the right thing. And I don't know that I'm always doing the right thing, but I really want to try. And that isn't limited to any one thing that I do. And that's why I end up touching lots of these different areas. So you might think, well, what's accessibility and inclusivity got to do with privacy?
- Well, I'm going to find the connection because I think it's rights. It's rights. I hate the idea that the things that I build could possibly infringe on people's rights, that they could harm them. And so I want to find out as much as I can about how to prevent that harm, and with that knowledge, I want to help other people understand how to do those things as well, because I don't want to keep what I learned to myself. And I think that's most important to me. And it's what I loved when I got into the web. It's why I got into the web in the first place; what I loved about it was this idea of sharing information, and that you don't have to be incredibly educated, you don't have to study for a long time and pay a lot of money in order to get knowledge. And that's what I loved about it. So I was like, well, these people are sharing all this free information on their blogs about how I can learn how to code websites, that's so generous of them. I want to be a part of a community that does that.
- Brendan Jarvis:
- Obviously, the origins of the web. And I have a similar story in connection to why I started doing what I was doing, making, designing, and building websites when I was, I don't know, 10 or something, sort of going back to 1995. And I did that for many years. And it was the same thing that attracted me to the web, just the possibility and the openness of the resource that it was. And of course that's changed over time. To a degree, I mean, there's still a lot of great content out there, I must say, but as you will know probably better than I do, there are some platforms that have been developed that operate in a very closed way, which is maybe not the original intention of the founders of this technology. But if you think about your own experience and your own story, what was it that you either saw or that you personally lived through that made you so passionate about ethics and human rights? Is there anything that comes to mind as a moment where you realized that this was actually an incredibly important area for you to dedicate your time and energy to?
- Laura Kalbag:
- I think I was just frustrated at seeing other people suffer unnecessarily. So what I always say about my interest in accessibility is that I grew up with my brother, who is disabled, he has cerebral palsy, and I never really thought about the impact that had on me until I was writing a book about accessibility. And people started asking me all the time, well, why do you care about accessibility? A lot of people who work in accessibility are disabled themselves, and I'm not, and a lot of people would think that was unusual. I'm also not older; a lot of people get into accessibility when they get older and they find that they are sort of needing glasses or needing other assistive technology as they age, and that wasn't me. And then I realized, well, of course it's because of my brother. And him being disabled is not the first thing I think about with him.
- He's been my brother since I was three years old, so I can't remember a life without him. And my entire life, as his big sister, we have been making accommodations for his needs, because that is part of living with somebody you care about, and you do it for everyone, whether they're disabled or not. You make accommodations for everybody's needs and what everybody likes and enjoys. And so when I learned about accessibility, it seemed perfectly logical to me. I was like, oh, I'm just making some accommodations with what I'm building. And that has seemed to have affected me in many ways. It's really sort of seeped into the way I see the world. And yeah, I also say that sometimes I think it is a big sister personality type. I'm the oldest of five children. I sort of have grown up wanting to look after the people around me. And I think that that can be a positive aspect, but you also have to be very careful that you don't slip into thinking that you know better for other people than they know for themselves. It's true of being a big sister, and it is true of being a designer, being out there in the world, that it is very easy to think, well, just because I know how to build this, I know better than the people who are going to use what I build.
- Brendan Jarvis:
- That is such a key point. Your brother's story that you just shared there, having cerebral palsy, is one example of disability, and there are many, many others. I was having a conversation, I think it was two weeks ago, with Trip O'Dell, and Trip is dyslexic, or has dyslexia, and he's a neurodiverse person. And there are some people that would look at neurodiversity and consider that to be a disability as well. I don't believe that Trip is one of them. He frames his neurodiversity as different. But I suppose what I'm getting to here is a question somewhere in here, which is that there's a broad spectrum of disability, both physical and cognitive, and of how that impacts people and their experience of the world. I was curious about your views as to, given the broad range of disabilities that people may have, how do you know as a designer or a developer, a practitioner, where to start? How do you find the right place to begin in terms of applying inclusive design and applying accessibility in your practice? Where do you begin?
- Laura Kalbag:
- I think the first thing that I had to do when I started out learning about accessibility was give up the idea of control. I kind of come from a print background, in that I started out studying graphic design. I very quickly moved into interactive design, but I started with that fixed idea of design as creating a picture of something, and as you build and create it, the further you get away from the picture. And a lot of people get quite anxious about that, because that's getting further away from their initial creation. And we can be the same no matter what artifacts we build and no matter the format of what we're designing. But I think the first thing was thinking, well, this isn't for me. This is for me to build a system that is flexible, to accommodate somebody else's needs, or even just their wants.
- It doesn't necessarily even have to be what they require. It doesn't necessarily have to be the only way they can access the information that I want to deliver to them. It might just be that they prefer it in one way over another. Interestingly, I'm a person that likes reading text over watching videos, and so often something that is built with a transcript is actually far preferable to me. But it is most important, obviously, that this is for accommodating disabled people's needs, because they don't necessarily have a choice of which they prefer. They can only access one versus another. And so the first thing I would say is to understand and learn what the different formats are that you can provide. I think that that's probably the easiest thing you can do. Most of the time we're trying to deliver some form of content to somebody else in whatever format we have, and how can we make that as flexible and as accommodating as we can to anybody's needs? Because if you start trying to think about, oh, how am I going to make this work for this person with these needs and this person with these needs and this person with these needs, and you start building a million different versions of your design so that they can pick this bit and pick that bit, and work out or detect what they need, that's going to be impossible to maintain. So we just have to build things that are flexible and accommodating.
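To make the idea of flexible formats a little more concrete, here is a minimal markup sketch. The file names and structure are hypothetical, not something Laura describes in the episode; it simply shows one piece of content offered as video, captions, and text so people can use whichever form works for them.

```html
<!-- Hypothetical example: one piece of content offered in several formats -->
<video controls>
  <source src="episode.mp4" type="video/mp4">
  <!-- Captions support people who are deaf or hard of hearing,
       and anyone watching without sound -->
  <track kind="captions" src="episode.en.vtt" srclang="en" label="English">
  <!-- Fallback for browsers that can't play the embedded video -->
  <a href="episode.mp4">Download the video</a>
</video>

<!-- A full text transcript serves people who can't use the video at all,
     and people who simply prefer reading over watching -->
<section id="transcript">
  <h2>Transcript</h2>
  <p>…</p>
</section>
```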
- Brendan Jarvis:
- And thinking about the language and the labels that I mentioned in your intro. So we have accessibility and we have inclusive design. Now, those are different things, and you've got a really great way of describing that difference. How does accessibility differ from inclusive design?
- Laura Kalbag:
- So I go back and forth on this a little bit, but the main way that I would put it is: say you have a shop that is raised above the road, and so you think, I've got to have steps up to my shop. So I will put rails down the sides of the steps so that people who need rails can use those rails for support. And I'll put a ramp up the side, to maybe a back entrance of my shop, because the steps going up to the front door are too steep. And then perhaps I'll put stripes on the steps so that people who can't see the steps so easily can differentiate between them. And this would be making your shop accessible. People don't necessarily have equal access to the shop, they might end up going in through the back door rather than the front door, but it is accessible. People have access. Whereas the way that I think about inclusive design is, well, you build your shop and you dig away the foundations and you put that shop on the ground so that you don't need steps to get to it. You do away with that, and you consider everyone from the beginning. It's easier for everyone to access your shop if you don't have it raised in the first place.
- Brendan Jarvis:
- And one of the things that I spoke about with Sheri Byrne-Haber was this notion of often needing a business case to justify an investment in accessibility. And needless to say, Sheri feels very strongly against that way of thinking, against treating accessibility as something that we build into our products only if we have time and budget for it, which is clearly not the right way to be approaching things. We should be designing for all people, in my view. But from your analogy, it sounds like it's much easier, and potentially from an investment point of view, because unfortunately businesses do look at the money that they spend through that lens, it's easier to do this if we start with inclusivity in mind rather than having to dig out the piles from our existing shop and put it on the ground.
- Laura Kalbag:
- Exactly that. It is easier to consider it from the beginning. It doesn't mean that if what you're working on isn't accessible yet, you just give up until you start on a new project; that's not at all what I mean. But it is generally easier if you think about it from the very beginning, because you're not spending money on things that you then have to undo and redo. And I also hate the idea of using accessibility as a line item. I like to put it in these terms: when you're thinking about the web, think about performance. Nobody says, oh, performance is a line item, we are only going to make the website fast if you pay this much more money. So why would you do that with accessibility? It's exactly the same. It's providing accessibility to people.
- Brendan Jarvis:
- Why is it that we do that? Why do people do that? Why do organizations often look at accessibility as a cost, a line item?
- Laura Kalbag:
- Because they don't think of themselves as requiring accessibility. They think of themselves as requiring speed. But if you are not disabled, it is very easy to consider other people as just that, other, and so somehow less important. And also because disabled people are often made pretty invisible by the societies that we live in, people deprioritize them and think that they somehow aren't going to be customers, that they won't buy as much, or they might have different interests or different lives. But whether we're disabled or non-disabled, we all have the potential to become disabled tomorrow. We have the potential that we may become non-disabled tomorrow. It all depends how we identify, but I think that we just need to stop thinking about disabled people's needs as being somehow different from our own, because they're not.
- Brendan Jarvis:
- How many people in the UK identify as being disabled now?
- Laura Kalbag:
- I think it is about one in five. So yeah, I think it is about 20%, and I think that globally it is probably around 20% as well. It's a huge number of people, and a lot of those people also identify as having multiple disabilities, so multiple things that present some sort of barrier to day-to-day living and work. Now, of course, there are different ways of considering disability, and sometimes different disabled people will identify and consider disability in different ways. And I think the social model is one of the models that is probably most popular at the moment. And that is the idea that disabled people are disabled by the world around them, a world that does not accommodate their needs. And I think that that is a particularly good way of thinking about it if you are designing for disabled people, too, not only because you are taking on their perspective as your way of designing, but also because that actually puts the onus on you to make sure that you accommodate their needs.
- Brendan Jarvis:
- Look, if we can send people to the moon in the 1960s, we surely can make our websites and our digital products accessible. So thinking about that social model you've just raised and then thinking about this web as a space in particular the focus of this conversation, just how accessible is the web currently?
- Laura Kalbag:
- It's very interesting, because by default the web is actually fairly accessible, and browsers do a pretty good job of trying to make as much content as you throw at them accessible. But then we tend to build tricky, complex interfaces on top of it that become less accessible, because we're not considering the accessibility when we're building them. And there are different issues, and there are some quite easy fixes that we can do to make our sites accessible. And one of the things that I always say is that actually the biggest barrier is caring about it in the first place, knowing that it's a problem in the first place. Because once you're in the mindset of looking at everything you build with, how can I make this accessible?, it's pretty easy to pick up on the tips and tricks that will make probably 99% of the difference. And there are always specialist cases for that extra 1%, those interfaces you've built that are unusual and are a bit harder to make accessible, something that's very new and interactive perhaps. But 99% of things, I'm sure, can be easily accommodated by some fairly simple bit of attention.
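For readers wondering what that "fairly simple bit of attention" can look like in practice, here is a brief sketch of some widely used accessible-markup basics. It is an illustrative example rather than a list Laura gives in the episode, and the names and content in it are hypothetical.

```html
<!-- Declare the document language so screen readers choose the right voice -->
<html lang="en">
  <body>
    <!-- Describe meaningful images; give purely decorative ones an empty alt -->
    <img src="team-photo.jpg" alt="Two co-founders and their husky on a beach">
    <img src="decorative-divider.png" alt="">

    <!-- Tie every form control to a visible label -->
    <label for="email">Email address</label>
    <input id="email" name="email" type="email" autocomplete="email">

    <!-- Use a real button rather than a clickable div, so it is keyboard
         focusable and announced correctly by assistive technology -->
    <button type="submit">Subscribe</button>
  </body>
</html>
```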
- Brendan Jarvis:
- So if the technologies that underpin the web are designed to be accessible, yet the current implementation, and I believe I saw a study cited that found something like 90% or so of the top 1 million websites in the world have homepages that are not accessible, as in have very, very poor accessibility, what does that say about the state of design and development education currently? That we are so far below what would be acceptable, to the degree that almost all of the web is inaccessible?
- Laura Kalbag:
- It is because we don't care enough. And it's not specifically about the education, because I think that you can study at all kinds of wonderful universities and learn absolutely nothing about accessibility in technology. Even if you are taking a technology or design related course, you don't necessarily get taught about that. I don't think many people do. I actually think the place where we need to foster these interests is in the community, but also in business. But the problem is that business, especially in technology, is not focused on the people who are using what we are building. We have so much more focus on how we can monetize what we're building as quickly as possible and build these vast systems that make a huge amount of money. We're not thinking about making great experiences for people. We are thinking about funneling them into a way that can make us more money. And I think that that is the greatest problem, that so much of our industry is now completely formed around that. If you look at the language that we use, talking about engagement and impressions all the time; if we look at the way that we design things now and how much of it is built around just studying data that we've collected from people and not interacting with humans at all, it's kind of sad, because that's not the web that I felt like I was getting into when I started out over a decade ago.
- Brendan Jarvis:
- Laura, I just want to come back to the word language that you mentioned a little earlier there, relating to how we start to frame the outcomes from the experiences that we're creating, so words like engagement. I do want to come to big tech and sort of capitalism and some of the things that surround the decisions that we make as designers and developers, particularly in the private sector. But if we could just come back to that word language: the language around accessibility and disability can sometimes be, and I know for myself, I'll speak personally here, it has been a slight barrier for me in terms of understanding and engaging in the space in a way where I felt like I wasn't saying things that were inappropriate, and I possibly have been, so I'm happy to be called out on anything as well. But this language barrier, I feel, is something that can inhibit people who are interested in this space and want to do the right thing from participating fully in it. So from your understanding of the space, what are some of the ways that people can talk about this, the words that they should use, maybe some words that they shouldn't use, that will enable them to fully participate in this conversation?
- Laura Kalbag:
- And I think that especially now, when people are talking about their identities more than ever before, and we're more aware of it because we have access to a global web where we hear from all different types of people and all different types of identities, I think that this is not dissimilar from when we are talking about things like race and people being trans, and all of these things that so many of us are learning about, especially if we don't identify with those names ourselves. An example is that in my book I refer to disability and I talk about people with disabilities. I even have a note in my book as to why I talk about people with disabilities rather than disabled people. That's now outdated, depending on who you talk to. Some people prefer to have people-first language, which is saying people with disabilities or people with a particular impairment.
- But now a lot of people will prefer to be called disabled people because of that social model, the idea that they are disabled by the world around them. And so I think part of it is being comfortable with letting your language evolve and understanding that most people aren't going to hold it against you as long as your language is trying to be inclusive. And if someone corrects you, take on what they're saying and don't get overly defensive about it. It's the same with any identity. Don't be defensive of something that is not your identity; take it on from the people who actually know for themselves. And I think, I mean, it's difficult. I don't necessarily want to give a list of terms that you shouldn't use because they're derogatory; I think most of us are probably familiar with those. I think a good example to understand, though, is the deaf community, and the fact that many people, maybe even most people, in the deaf community would not consider their deafness to be a disability.
- And so you should perhaps say deaf people and people who are hard of hearing. If you are talking about accessibility needs, you might talk about deaf people and disabled people, and you wouldn't necessarily lump all of them into one box, because they don't want that. And a lot of it is just trying to be sensitive and listening to disabled people, to people from communities that are not your own, at all times trying to pay attention to what they call themselves, and taking that on. You do also have to be cautious, because there's a lot in disabled culture of taking on derogatory terms and using them for yourself. And so I would perhaps give an example here of the word spastic, which I would never use to describe anybody, and which is often a term used to describe people with cerebral palsy.
- In fact, the main charity in the UK for people with cerebral palsy, now called Scope, used to be called The Spastics Society. It is considered a derogatory term now, but many people who have cerebral palsy will refer to themselves as being spastic. So you have to be very careful that you are not picking up on terminology that is being used as a way of turning it on its head and claiming it back. You do still have to be cautious. And I'm really not saying that in order to put anybody off from trying to learn the right terminology to use. But for example, my brother Sam would hate it if anybody called him that, and he would not refer to himself as that either. So not everybody is comfortable with the same terminology. He refers to himself as somebody who is disabled and somebody who has cerebral palsy. So I think the key thing is just to listen.
- Brendan Jarvis:
- You know, you said so many things there that I felt were hugely valuable. One was just being comfortable with the language evolving, and clearly that is what is happening at a rapid clip at the moment, particularly when it pertains to identity. And I think that open mindset, not trying to insert yourself into that picture, not taking personally any negative reaction that you may get from something that you say, and just seeking to try and understand, is such a key thing that you've raised there. Laura, I want to just come to some conversation around technology more broadly now. And I'm just going to paraphrase something that you've said in one of your previous talks, which was that technology facilitates and enables people, including those from disadvantaged groups, to have access to social, civil and labor infrastructure. Yet you've argued at the same time, in other conversations, that technology is used to exploit us.
- So is this the situation, this sort of tension between having access to technology that enables us to participate in these areas of life, and that same technology, from your way of looking at it, having some degree of exploitation baked into the way it works, particularly when it relates to data and privacy? And I don't doubt that there is. Can we have our cake and eat it too, though? Can we fully participate in this new world using that technology and not expect that the technology will be seeking to serve itself in some way?
- Laura Kalbag:
- That's the world that I want to live in. Absolutely. And I also don't believe in blaming the victim. I think that far too often, especially when it comes to privacy and protecting yourself online, people will blame the victim. They say, well, you use Facebook, so what do you expect? You've heard enough about Facebook in the news, about what they do with your data, about how they discriminate against people on a racial basis with their advertising. There are so many news articles; you can read so much about how Facebook is not a force for good and is very harmful in many ways. But you can't blame people for using Facebook if that's the only way they have to communicate with their child's school, with their older relatives who live on the other side of the world. If they don't know of an easier way to do that, if there is no easy alternative, we can't blame people for that. And that is why I give the talks that I do and want to spread information about the way in which this technology is built. Because I think that a lot of the onus is on us, the creators, to not build technology that exploits and harms people, so that people can enjoy the experience that they're supposed to be enjoying and it doesn't come with a catch.
- Brendan Jarvis:
- So what would be an example of a social platform, and it could be a real one or a hypothetical one, that you believe would enable similar things to what Facebook does, yet wouldn't cross that line that you have of exploiting people? I suppose what I'm trying to understand with this question is what that line is for you, and what it would look like if it wasn't crossed in a social network.
- Laura Kalbag:
- I think the thing that we have to look at first is the funding model. Now, I would say business model, but there are plenty of different models in the world, so it is really more about funding, because we can build things that are supposedly for social good, but the way in which they're funded makes them exploitative. And so really the most fundamental thing is: how is this organization making money out of its technology? And it's a phrase that people have used for decades at this point, that if you are not paying, you are the product that's being sold. And a lot of the time that's used in reference to advertising. And when the advertising was in newspapers, we saw that as fairly harmless. Ads could be annoying, ads could make you want to buy things that you didn't necessarily want to buy.
- It's kind of capitalism, but those ads didn't know anything about you; they just knew that you had access to this content. So they might be able to guess a bit of your political views based on the newspaper you read. They might be able to guess a stereotype of your gender based on a magazine that you read. But that's not the same as the technology we have today, which can have access to all of this information about us in order to better sell things to us, and potentially worse as well. And one of the problems with the security of the web and the way in which we build our systems is that any of this information that is collected about us isn't just used by the people that own that technology; governments where that company is based may also have access to it.
- And this is why the Edward Snowden revelations, which were one of the things that really triggered us in the work we were doing, were so horrifying: the US government didn't need to build a system to surveil as many people in the world as it could. The social media platforms had already done that for them, and they just had to ask for access. And I didn't feel like I had the means of changing what the NSA were doing. I didn't have the means of changing what a government was doing in the name of anti-terrorism, but I did have the means to go, well, I'm working in companies on technology that is similar to this, how can I make sure I'm not partaking in that?
- Brendan Jarvis:
- And you've mentioned the funding model being one of the ways that you can look at an organization, where it's getting its money from, and I suppose potentially also the model under which it's incorporated, as to what outcomes it might be seeking. But given that most of the organizations, at least in the West, are built upon a mixed model of capitalism, what would need to change for us to see behavior and enterprise that isn't driven by returning value to shareholders?
- Laura Kalbag:
- I think one of the biggest things is finding new funding models. I don't necessarily have the answer to every option. One of the things that we discussed at the Small Technology Foundation is the idea of funding technology that is for the common good. So if you bear in mind that we use this web infrastructure, as you mentioned earlier, in our everyday lives, we use it to access work, we use it to access utilities, we use it to access government services, and yet we seem to be paying for it with our data. These could be funded by taxes; these could be funded from the collective purse. Now, there is trickiness in that, because of course if it's a system that's owned and controlled by your government, you have a whole other set of problems a lot of the time.
- And so then we can look into other funding models. I mean, partly, we don't need to be having such a huge turnover of funds in the technology that we build in order to support the technology at all. It's just that the systems we've built are so much about growth and scale, building these huge systems to accommodate as many people in the world as possible, so you can collect as much data as you can and make as much money from that as you can. A lot of the time, the technology that we need is something that I can use to send a photo to my family back in the UK, and it doesn't need to be this massive system. It just needs to be this one small thing that does this small task. And so a lot of it is about decentralizing these systems, making these systems smaller, allowing us to communicate, and building systems that wouldn't necessarily need as much funding and could then perhaps be affordable to individuals, so that they could own and control all of their own information.
- Brendan Jarvis:
- Thinking about the individuals that may create those systems, and I understand you are working on some technology yourself at the Small Technology Foundation, what would need to be true for them? Would they have to push their own self-interest, their own economic interests, to the side in order to actually create technology that is underpinned by the principles that you've been talking about?
- Laura Kalbag:
- Well, I think the thing is we live in capitalist societies. We can't help that fact. We need to be able to afford to pay rent and go to the supermarket and buy our groceries and feed the dog. We need money to exist. So I don't think that people should be expected to live in poverty in order to build things for social good. I think that we should be able to have funding models that allow people to live reasonably. I don't think you're doing much right if you're becoming a billionaire; that might be considered an extreme opinion to some people, but I think that not many people who are billionaires got there doing wonderful things for the world around them. I don't think we need that much money, but I think that it should be well within our reach to have a comfortable existence and be able to build things for social good. And I do believe we can get there as well. I absolutely believe we can get there.
- Brendan Jarvis:
- Coming back to what you were speaking about earlier, which was the ability for the American government to effectively just ask for access to these platforms that already existed, and therefore, I suppose, be able to reach pretty much anyone anywhere and learn a lot about them. You in the past have referenced Shoshana Zuboff's book, The Age of Surveillance Capitalism, and I'm just going to quote something that she says in there, which is: surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are fabricated into prediction products that anticipate what you will do now, soon and later. That's the end of the quote. You've also talked about this idea that if we are not paying for it, we are the product. So what is wrong with this picture then? If we are getting free access to this technology that enables us to participate in the daily activities of modern life, connect with our friends, whatever it may be, what are you asserting is wrong with those organizations that own that platform or those platforms profiting from our usage through our data?
- Laura Kalbag:
- It is not just about how they profit from that information, but how they shape our behavior through their interfaces in order to profit from it. And that's one of the things that Shoshana Zuboff was talking about in that quote, that they can also use what they've got in order to predict and encourage you to put more money into whatever they want you to put money into. So an example some people give is the concept of self-driving cars. If you're not funding a self-driving car yourself and it's being funded by another organization, that organization might decide to drive you straight to your house, or they might decide to drive you to your house via McDonald's, because McDonald's have given them some money in order to do that, in order to encourage you to go and eat McDonald's. And it kind of seems like a ridiculous example, but if we look at how we use social media, and I think a lot of people can identify with this, you can feel your behavior being shaped and influenced by the things that you are reading on social media.
- I think Instagram is a particularly good example of this. You're looking at images, and of course the infinite scroll, designed to keep you engaged, will keep you scrolling through things. And you are looking at all of these images, and these images are shaping not just your self-worth but your attitude and ideas about the world around you. And you keep on going, you keep on watching. And in there, of course, are interspersed ads that are based on what you've been looking at, in order to encourage you to buy products that may be aspirational. And so I think we can feel how we are being shaped by these things. And if you have a detox, or whatever they call it, to get away from social media, and again, victim blaming, I'm not necessarily into that, but if we do get away, we do start to notice the pull that these platforms have on us. And it's not just the social pull of, I'm not talking to my friends. It's this addictive quality. And so this is the problem with all of these companies having these funding models that rely upon extracting our data: they will do all kinds of things to us in order to extract that data about us and in order to manipulate us to spend more money for them.
- Brendan Jarvis:
- An awareness of the intentional design behind things like, as you mentioned, the infinite scroll to keep us engaged, the behavioral ads that we are targeted with, all of these things have kind of come to light, in particular since Cambridge Analytica and the big scandal there with the, was it the 2020 election or the one before? I can't remember now. I try to blank that period of time [laugh] from my memory, and we've been in a pandemic for the last two years, so my timeline's a little hazy. But these things have come to light. There have been a number of movies that have been released, documentaries, to address or try and raise awareness of this in our culture. And you've sort of said that you're not comfortable with the victim blaming, and I get that, I understand that. What responsibility do we, as people that work in this space, people that are probably more educated than most, more privileged than most, and have a better understanding of this technology than most, what responsibility do we have ourselves in the ways in which we are engaging with the technology that we may be making, but also that we may be consuming, or consumed by, depending on how you look at it?
- Laura Kalbag:
- And that's the tricky thing, that we as people who work in the industry seem to have this fantastic ability to separate ourselves as the people working in the industry from the people who are the consumers. And it baffles me; it's almost like dissociating, our personalities are so different. And I think part of it is that we allow ourselves to go along with it. In the industry itself, it is very easy; we frame how we are building things and what we are building them for differently from how it might be perceived by someone who's using it. So an example would be, you mentioned earlier that I used to work on a tracker blocker, and one of the things I used to do all the time is read privacy policies to get an understanding of what these trackers are doing.
- A lot of the time they're completely unclear, and sometimes they're kind of useful and interesting. One of the things that they always tell us is, we're tracking you in order to improve your experience of our website. And you're like, oh, you're doing something so generous for me, the person; I just want to have a great experience on the website. But really you are doing that in order to make a good experience for the person who's using the website so that you can make money from that person's experience. And that is why it's being done. So we have ways of telling ourselves that we are somehow being benevolent to the user when we are not. But it's too difficult to take ourselves out of it and actually acknowledge the responsibility that we have for the things that we're building. And so I think the first step that we have to take is to get ourselves out of this group mindset of, well, what I'm doing is fine.
- Everyone else is doing it, and that makes it fine. And try to unpick what we are doing for the right reasons and what we're doing for the wrong reasons. Of course, one of the biggest problems with that is that most of us don't have any power whatsoever. We're not necessarily going to change these big tech companies that have funding models based around collecting and monetizing people's information, because we have no way of changing the funding model. We don't get to talk to the managers two levels above us; how on earth are we going to right this ship? And so one of the things that we have to realize is that sometimes we don't have the power to make a change; if we are in that job, we are contributing to the harms and we are not going to be able to make a difference to it.
- Because what happens when you do? You look at Google: people trying to speak up about what Google has been doing, in terms of the way that they want academic papers written on their behalf to be worded; people trying to unionize; all of these things that some workers at Google have been trying to do in order to push Google towards a more ethical or more rights-respecting space. And the people that are in those spaces end up getting frozen out. They end up getting fired. They end up getting called unprofessional and otherwise discriminated against. And so it is not an easy space, but I think that, depending on where you are and what your role is and what you are able to have an effect on, there are many small ways that we can actually effect change. We just have to acknowledge our own responsibility, recognize the power we actually have, and use it.
- Brendan Jarvis:
- This is actually something I wanted to go into with you in more detail, which is this bravery, and again, I'm sort of imposing my own words here, but this sort of bravery of these people to take these actions within these big organizations, with risk to their careers, albeit they're coming from a position where they may enjoy more economic luxuries than workers in other industries. They're still risking a lot. They're putting themselves out there. Now, I wanted to ask you about this because, as far as I could tell, you have always worked in your own organization. You have cut your own shape on the dance floor, so to speak. You haven't been, I suppose, shackled by the constraints that people who work in these other, much larger organizations that make these products may be. So I wanted to ask you about that, because you're obviously quite a strong voice for encouraging people to take these actions. But when it comes to your own risk profile, as a result of the choices you've made to do the work that you do and the way that you do it, do you see that as a sort of strength or a weakness, or do you frame the way in which you are contributing your voice to encourage this action in a different way that maybe I'm not touching on here? How do you look at it?
- Laura Kalbag:
- Well, your question comes at quite an interesting time for me, because I had been independent, always running my own organization, since I started out in tech. And the reason I started out that way was nothing more than that I couldn't find a job locally that appealed to me, because I was very headstrong and thought I knew everything coming out of university. And I thought, well, I'd rather freelance than work at an agency whose values I don't believe in. But actually, for the last seven months, I've been working [laugh] for a startup, as a contractor, because the way that we want to fund and build alternative technology is not yet financially viable, particularly living in the environment we're living in today, where living costs are so high.
- Brendan Jarvis:
- And getting higher, too. Inflation's going through the roof in most economies.
- Laura Kalbag:
- And I'm in my mid-thirties. I don't have any dependents except for a dog, and I don't have a mortgage. I couldn't afford a mortgage. And so I don't really have that sense of security. I don't have any savings or any wealth like that. It was a decision to do things independently and not necessarily have some of the comforts and security that others had. That was a choice. But there came a point where I realized that we needed to keep funding our work somehow. And so I decided the way in which I would do that is find a company in technology where my skills would be of value, and where I wasn't working for something that was against my principles, and that's incredibly important to me. So making sure that I was working for a company where I was reasonably convinced that I could trust that they weren't building something to exploit people.
- So I ended up working for Stately, which is a great startup where they're building tools for designers and developers to be able to collaborate, using state machines for state management. So it's not an exploitative system. There is nothing in their design to collect people's data and exploit them. So I'm perfectly happy making money that way, and at least that way I can help fund what we are building the rest of the time. And so I can see both sides of it very much, that it is incredibly difficult. And this is one of the reasons why, when I speak to people at events, they say to me afterwards, well, I feel powerless to make a change, but I can't quit my job, I have kids, I have a mortgage, I need this money. And it's not easy for everyone to just go and pick up another job. It's just not how the industry works. Some people, very few people, could have whatever job they wanted, but they're rare. And honestly, I was kind of lucky that Stately would even want me, given that I'm critical of the tech industry. And I think that that actually shows a sign of people that care about what they're doing, that they're willing to take someone on who will be critical of what they're doing if I believe that it deserves it.
- Brendan Jarvis:
- What was that conversation like? Can you take us back there? So you're sitting down, you're having this conversation; tell us about that. How did it go?
- Laura Kalbag:
- Oh, well, I was very lucky, because I put the feelers out to see if there was anyone in the local community that was interested in someone with my skills. And David, the founder, approached me, so he was already well aware of the kind of person that I am, and fortunately considered that to be an asset. So I'm lucky, but that is not the same for everyone who speaks out. And I know that, and I've seen that in the industry, that it is not easy, because we are an industry where we will very quickly shut out the people who don't conform and who don't fit what makes us comfortable. I don't necessarily think that's unique to our industry; it's the only industry I've had experience of. And so sometimes we do have to find less obvious ways, and I don't want to say underhand, but I don't think it's underhand if you are doing so for the right reasons, but maybe you are keeping some different reasons in your head behind the decisions that you are making than you're necessarily telling your boss. It's the same thing that I've said about accessibility in the past: just because your boss doesn't tell you that accessibility is a priority doesn't stop you from doing things to make what you're building accessible. It's not like most of us have someone micromanaging every single step of what we design every day. And so we have the ability to make quite a few decisions for ourselves, so we have to try to take those opportunities.
- Brendan Jarvis:
- Yeah, I believe we do. I mentioned Sheri Byrne-Haber earlier in the conversation, and I also mentioned, when we were speaking about her, the business case that she sometimes gets asked to present for accessibility. Now, she obviously rails against having to do that, but she said to me in our conversation that she will do it if she has to, even though she disagrees with it, alongside the other things that she'll do, obviously, to agitate for more accessible experiences. Because it's for the greater good as far as she defines it, she will do the things she's got to do in order to achieve the longer-term objective that she's working for. I wanted to ask you about the personal costs. You sort of spoke about the economic luxuries you may have forsaken, having chosen this path, and some of the decisions you've made recently with where you've gone to work in recent months, but what other costs has this come at? Have you experienced any pushback from some of these organizations that you've put in the spotlight? Have you lost friends, won friends? What other things have you experienced around taking such a strong stance on ethics and technology?
- Laura Kalbag:
- Well, sometimes it can be hard to tell, because being critical of technology can also be indistinguishable from being a woman on the internet. So it's often difficult to tell how much of the experience is down to what I'm saying or down to the person who's saying it. I've had employees from Google heckle me during my talks; I had to tell this person to be quiet and save the response for the end of the talk and just let me get through it, because nobody else was going to tell this person to sit down. I've had people who work at Facebook come up after my talks and say, very patronizingly, oh, well Laura, that was a wonderful rant you just gave there, in order to try to diminish what I've said. And yeah, I'm sure there are friends that I don't have. I try to not always criticize the individual if I don't necessarily believe that their intentions are bad.
- I tend to focus on criticizing the companies, the people who are in positions of power. I think that a lot of people in tech have more privilege than they acknowledge, and they are able to make some degree of choice, particularly people who are considered experts in the community. I think that those people could probably get jobs anywhere they wanted, and they do make a choice if they want to work at a place like Facebook or Google, where the harms that they cause are very well known and very well documented. So I think those people do make choices, but I don't necessarily see a value in picking fights with them, because they have to deal with that for themselves. And they're not going to be my friend, because they already know my feelings about the work that they do. So I don't necessarily feel that I've lost out so much in that sense. But I have made friends, and I've learned about people doing fascinating work. A lot of people working in free and open communities are very interesting. There's a lot of very toxic behavior in those communities as well, because they are largely populated by the kinds of people who can afford to work on projects like that in their spare time, which not everyone can do, particularly people from marginalized groups. But I find the people who are working on these things despite everything, just because they care about it, are some of the most interesting people to meet.
- Brendan Jarvis:
- And some of those people probably work in these large organizations as well. You've sort of touched on the economic and the social cost of saying things that may disagree with the narrative that the organization wants to tell. And I suppose that kind of constraint is imposed in any employee-employer relationship; there's an imbalance of power there, no matter how great the remuneration, between capital and labor. Now, I'm not professing to be any big expert on either, but it is evident in my experience as an employer and, in the past, having been an employee. I wanted to ask you about Big Tech in particular though, in terms of the organizations. Are there any that you can think of, or any examples of initiatives or actions that Big Tech has taken in recent years, that you feel are serving humanity more than they might be extracting from us in terms of our data?
- Laura Kalbag:
- I honestly can't,
- Brendan Jarvis:
- And that's the end of the interview. We're done. You hit it. No,
- Laura Kalbag:
- I can't think of any example where the good outweighs the bad. I can think of a lot of examples where the good is there in order to distract from the bad. And I think that's one of the problems with Big Tech: if you think about the people running these companies, they don't have a problem. Everything is working fine for them. They just have an issue with how it appears to other people. I think that tech is an interesting example of this, because we get given so many tools for free in order to build things for these platforms. We get given these wonderful developer and designer tools out of the kindness of their hearts, except that it's not; it's in exchange for something. And that something is that you give them loyalty, and you think, oh, well, they can't be that bad because they're doing these things for free. And so maybe it is because I have a cynical view of the world, which doesn't really tie in with the fact that I was talking about my naivety earlier. But I don't think it's cynical. I think it's actually realistic that these companies don't see anything wrong with doing this, and the projects they're doing that are supposedly for good are a net bad, because what is the good for?
- Brendan Jarvis:
- You've certainly given me a few things to think about, which is always the sign of a good interview as far as I'm concerned. I don't have any answers as such on this particular issue. But I was curious to talk to you about something that I believe you and Aral have come up with through the Small Technology Foundation, which is the Ethical Design Manifesto. So we've spent some of our conversation now talking about some of the behaviors that are driven through the systems, and the organizations that operate on those systems, that might not be serving humanity to the greatest degree that they otherwise could if those systems were different. And you've come up with this manifesto, I believe, to try to address that, or give some sort of framing for people to help them as they're shaping an organization, if they have the luxury and the power to do so. Now, you've used an analogy of an apple in the past to describe this manifesto, and maybe it's a little unfair of me to ask you to do that unprepared today. But I was curious if you could tell us about it. Can you tell the story of the ethical apple? I think it's such a wonderful and powerful analogy.
- Laura Kalbag:
- So if you have two apples and they look exactly the same, this could be two types of technology that you are using, and this is how a lot of the technology we use today is; it's exactly what we were just talking about in terms of the things that look good. So you might have an apple that looks delicious on the outside. This is all of the experience, the human experience design. And this is a lot of what we've been taught to focus on, particularly in UX over the last decade, this idea of providing a wonderful experience. So this is the outside of the apple, the lovely shiny apple that looks beautiful; you want to bite into it. And then when you get into the flesh of the apple, that's the human effort. So these are the things that actually allow people to get done the things that they want to do.
- So it's helping them complete a task, helping them access a service, helping them do something, whether it is for the public good, for their own good, or just for fun. That's what the human effort bit is. And in the center is the human rights. And so if the center of your apple is rotten, it might look really beautiful on the outside, but if it doesn't respect your human rights, if it doesn't respect your privacy, if it doesn't care about whether you are disabled, if it doesn't respect any of your rights, if it exploits your rights in any way, you don't wanna bite into that apple. It might look absolutely beautiful on the outside, but the core is rotten. And so that is why, initially, the Ethical Design Manifesto was based on the idea of a pyramid, kind of like Maslow's hierarchy of needs. The idea that the point is the human rights, and then you expand out into the human effort and then the human experience. But I like the idea of an apple, because it captures the idea that the experience, the thing that a lot of us end up working on, can be used as a superficial way of covering up what's going on at the very core.
- Brendan Jarvis:
- It's not really something that I would normally do, but I couldn't help but think about the biblical story around the apple that was eaten, and then some bad things happened. I won't go into depth here; I'll expose how woefully inadequate my attention was when I was supposed to be paying attention to this in primary school. But I do recall there's quite a key figure in that story, the snake. And I'm not assuming that you actually know this story, so forgive me if I'm saying things that make no sense.
- Laura Kalbag:
- Oh, it's alright. I went to a Christian secondary school.
- Brendan Jarvis:
- Yeah, yeah. I was Catholic. So who in this picture is the snake who's encouraging us to eat the bad apples?
- Laura Kalbag:
- Well, it's the people who benefit from us eating those bad apples. And so it's the people who are going to make money from it, and that's what it all comes down to. A lot of these companies aren't starting out going, I'm going to build some technology that's going to exploit some human rights. They're saying, I wanna make money, I wanna make a lot of money, I wanna make a lot of money fast. And oh, oops, if we accidentally do a few little naughty things along the way that might be considered questionable, well, at the end of the day, I have a lovely house, I have a rocket that can take me into space. And it's not like very many of us are actually paying attention to the CEOs. It's not like we are going, oh, well, Jeff Bezos and Mark Zuckerberg tell me I should be using this, and I think they are wonderful pillars of society, and I'm going to do what they say.
- I don't think we are as naive as that, but we are convinced by the flesh of the apple. We're convinced by the bit that says, oh, but you can do this with what we've built. You can access this, you can communicate with that, you can broadcast to all of these different people and make friends and communities, all with this wonderful apple, and look how shiny and beautiful it is. Look what a wonderful experience we've presented for you. You're going to enjoy being on it all day, every day. And so we think, oh, I'll have a bite of this apple. I don't necessarily know about the rotten core yet. I haven't necessarily experienced repercussions or fallout from it, and will I ever experience it?
- Brendan Jarvis:
- Yeah. Well, if you find yourself in the evenings, when you're supposed to be going to sleep, endlessly scrolling through a newsfeed, you've probably found that core. It's one of the behaviors that I still have and I need to put down. Actually, it's a terrible thing for your own health, to not just go to bed.
- Laura Kalbag:
- It is a terrible thing. But it's also not your fault, because it has been designed that way; you are doing exactly what it was designed to make you do.
- Brendan Jarvis:
- And I know that. And that's the thing that annoys me the most.
- Laura Kalbag:
- Oh, it frustrates me too. I do exactly the same thing. I spend all my time trying to correct the habits. Just because I tell people about these projects and I know how bad they are, it doesn't mean that I don't use things, because what's the alternative? And that's what's so important: these technologies have become so ingrained in our social existence, our work lives, even being able to access things like government services and just the most basic of services, that we often have to go through these routes. Google Analytics is on pretty much every website; try escaping that. No one has the time to do that. And that's why it's so important that we find alternatives and that we don't blame people, because what are you saying to someone? You have to use these things or you're not allowed to participate in the same existence as everybody else.
- And that's why it's so important that we actually work on building alternatives. That's one of the reasons why Aral and I care about trying to build things rather than just criticizing the problem. I think it is brilliant that there are people who are criticizing, particularly in fields where they're able to do so and back it up and document it, and journalists and academics who can do those things. But we do also need people who are going to give us another option and who are going to build alternatives. And I'm not saying that what we are building will be the one true way, the one true alternative, the only alternative; that's nonsense. I think there are other people working on similar initiatives, but I think that it is important that we are presented with options.
- Brendan Jarvis:
- So if the people listening aren't in a position of power, maybe not yet or maybe not ever, where they're able to build something afresh, to build something new, what actions can individual contributors or leaders of product and design take to help their organizations make more ethical decisions when it comes to the experiences and products that they are already working on?
- Laura Kalbag:
- So I have so many different levels to this. I think about this a lot, because I get this question a lot, and I kind of came up with the idea that it depends on your character and your position a lot of the time. So you can be a person who advocates for better alternatives, somebody who educates themselves, because they have the time to do so perhaps, and encourages people to make better decisions. You could be someone who just asks questions, who asks really good questions in order to create better solutions. You're a person who will ask an innocent question about, well, why are we using that tool? How is it funded? Why don't we use this other tool that might be funded in a different way, that uses data in a different way? Or you might be a person who's just difficult, who says, no, actually, I don't want this to happen on my watch.
- Maybe you have the power to be able to stop someone who's perhaps lower down than you, or a peer of yours, from doing this; almost like a guardian, preventing other people from doing things. I think that one of the most basic things you can do, if you are someone who feels like you have no power, is to be a supporter: find the projects that you think might be doing something right and support them. And I don't even necessarily mean financially. Yeah, of course these projects often need financial support, and a lot of them have sort of patronages and things like that, but also vocal support, sharing their work, saying that you think they're doing a good job. It's often hard work, working sort of uphill in these conditions. And to have people who just say, oh, hey, I love what you're doing, I think it's great, that actually means a lot. And so really, all the way down to that teeny tiny thing, you can make some kind of a difference.
- Brendan Jarvis:
- Laura, it's been such a provocative and substantial discussion today. I've really enjoyed getting into these things with you. Thank you for so generously and so passionately sharing your experiences and your insights with me today.
- Laura Kalbag:
- Thank you so much for having me. I really enjoyed it and you've asked so many wonderful questions. I hope I made sense.
- Brendan Jarvis:
- Oh, absolutely. And you're very welcome, Laura. If people are interested to connect with you, to follow this conversation a little further, to find your book, to find out what you are up to at the Small Technology Foundation, what's the best way for them to do that?
- Laura Kalbag:
- You can find me everywhere at Laura Kalbag. I'm the only one in the world, so I'm very easy to find. And our website is Small-Tech.org.
- Brendan Jarvis:
- Perfect. Thanks, Laura. And to everyone that's tuned in, it's been great having you listen or watch, depending on how you're consuming today's conversation. Everything that we've covered will be in the show notes, including where you can find Laura, her book, Accessibility for Everyone, the Small Technology Foundation, and anything else from today that I feel is important to put in there. If you've enjoyed the show and you wanna hear more conversations like this with world-class leaders in design, UX and product management, don't forget to leave a review and subscribe to the podcast. And if you feel like this conversation would be worth the ears of someone in your circle, then pass it along to them if they would get something from it. If you wanna reach out to me, you can find my profile on LinkedIn. At the bottom of the show notes you'll also find a link to my website, which is thespaceinbetween.co.nz. That's thespaceinbetween.co.nz. And until next time, keep being brave.