Design for safety

A transcript of Episode 270 of UX Podcast. James Royal-Lawson and Per Axbom are joined by Eva PenzeyMoog to discuss designing for safety: understanding how technological products will be used for harm, working to prevent that harm, and supporting the most vulnerable users.

This transcript has been machine generated and checked by Tristan Schaaf.

Transcript


Robot Voice
UX Podcast episode 270

[Music]

James Royal-Lawson
I’m James.

Per Axbom
And I’m Per.

James Royal-Lawson
And this is UX Podcast, balancing business, technology, people and society every other Friday for over a decade, with listeners in 200 countries and territories, from Syria to Saint Kitts and Nevis.

Per Axbom
Eva PenzeyMoog is a user experience and safety designer and founder of ‘The Inclusive Safety Project’. And before joining the tech field, she worked in the nonprofit space and volunteered as a domestic violence educator and rape crisis counsellor.

James Royal-Lawson
Earlier this month, August 2021, Eva released her book Design for Safety through A Book Apart. And we’re pleased to have her on the show to teach us more about this problem space, and how inclusive safety can be incorporated into our design work.

[Music]

Per Axbom
So I’m thinking we should just start out with what made you write the book?

Eva PenzeyMoog
Well, I wanted to reach more people than I could with my conference talk, Designing Against Domestic Violence, which, you know, I enjoyed doing and was giving at lots of conferences. But I was thinking, if I actually want people to know about this stuff, I need to find a way to get it to more people quickly. So a book was kind of the best way that I could think of to do that.

Per Axbom
And I think at this point it’s also good to pin down the topic of the book, because when you see the word abuse, that can mean a lot of things.

Eva PenzeyMoog
Yeah, yeah. So when I talk about safety, the title is Design for Safety, I’m really talking about interpersonal safety, especially in the context of domestic violence. But there are also some other contexts, like child abuse, elder abuse, even things like roommates or employees. Sort of any situation where there’s an interpersonal relationship and one person can enact power over the other is a place where there’s a possibility for abuse and safety issues. And that’s specifically the part of it that I’m tackling in the book.

Per Axbom
Do you remember one of the first examples of tech abuse that you came across?

Eva PenzeyMoog
Yeah. So the very first time that I started putting this together in my head was actually with my first client at the consultancy where I work, 8th Light. It was a client that was essentially working on an app to manage the relationship, and different tasks, between people who live in big apartment buildings and the people who operate the building. And one of the things was managing your guest list, your list of approved guests, with the person who works at the front desk. So that, you know, your friend comes, they’re on the list, they can just take the elevator up to your unit, and you don’t have to go down to get them.

James Royal-Lawson
Right, because it’s like a building with a reception service or porter service or the…

Eva PenzeyMoog
Exactly.

James Royal-Lawson
And then they’re let into the building. Right. Okay.

Eva PenzeyMoog
Yeah, yeah. But I was thinking about a story that I remembered from a training I did when I was a rape crisis counsellor, which is also sort of what led to me doing domestic violence education. So abusers are almost like creative, dastardly people, you know. I’m not saying that in a positive way, but it is a thing: they’re very creative and very good at finding ways in and finding new forms of abuse. And I remembered a story about an abuser who had disguised himself as a food delivery person. So he had the bag full of food and was like, you know, this person ordered this food, but she didn’t give me her unit number, she was very explicit that she wants me to come right to her unit. And the front door person told him the unit number and let him go up. And this was one of those situations where it wasn’t the tech that was really performing the abuse, but it was enabling it in some ways.

And I was thinking, why isn’t this situation accounted for in the software that the front door person is using? They have the list of accepted guests, but what about a list of people who actually should not be let up, and who should, if you see them, trigger an alert to the tenant and possibly the police, if that’s what they want? In that moment I was like, man, we’re really falling short. We could be doing so much more, and the technology side of it could be really powerful, because maybe the front door person doesn’t need to understand all this stuff. Not everyone needs to be a domestic violence expert. But if there was something in that software that alerted them to the fact that this is a thing that happens, so that they could be aware of it and then help keep that person safe in that moment, wouldn’t that be amazing? And that’s kind of when the light bulb went off: there’s so much that could be done better in tech when it comes to interpersonal harm.
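[A minimal sketch of what the feature Eva describes could look like in front-desk software. All names and types below are illustrative assumptions, not code from any real building-management product.]

```typescript
// Editorial sketch: a do-not-admit list alongside the approved-guest
// list, so the software carries the expertise the front-desk worker
// may not have. Everything here is hypothetical.

interface Tenant {
  id: string;
  approvedGuests: string[];
  doNotAdmit: string[];          // people who must never be let up
  alertTenantOnAttempt: boolean; // tenant opts in to being notified
}

type AdmitDecision =
  | { admit: true }
  | { admit: false; alertTenant: boolean; reason: string };

function checkVisitor(tenant: Tenant, visitorName: string): AdmitDecision {
  if (tenant.doNotAdmit.includes(visitorName)) {
    // Flag the attempt and explain why, without requiring the
    // front-desk worker to be a domestic violence expert.
    return {
      admit: false,
      alertTenant: tenant.alertTenantOnAttempt,
      reason: "Visitor is on this tenant's do-not-admit list. Do not share the unit number.",
    };
  }
  if (tenant.approvedGuests.includes(visitorName)) {
    return { admit: true };
  }
  // Unknown visitors (including "delivery" claims) default to checking
  // with the tenant rather than revealing the unit number.
  return { admit: false, alertTenant: true, reason: "Not on the approved list" };
}
```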

Per Axbom
And that’s kind of the key, isn’t it? Because I think what a lot of people think is that, well, anything can be misused and abused. But when it comes to the things that you’re talking about, there’s actually a lot of potential for doing better and designing safety into the very technologies that we use.

Eva PenzeyMoog
Yeah, exactly. It’s something that I think people just don’t know about. And that was always the feedback that I got when I was doing my conference talk: people would be like, I just had no idea. It’s not necessarily that people know about this and are choosing to ignore it, at least at this stage. It’s more that people just literally don’t understand that this is a thing. And once they understand it, they’re more motivated to try to work against it. So yeah, there’s a lot that we can be doing, if we just take the time to consider the abuse cases.

James Royal-Lawson
That makes me want to ask more about the harmful assumptions we make. I mean, you mention in the book that we make harmful assumptions, but what are the kind of harmful assumptions that we, I suppose, fall into making?

Eva PenzeyMoog
Yeah, so I think the main one is just the assumption that our users are sort of good people who aren’t trying to use our product for any type of harm. That they’re not someone who is abusing their wife, or stalking their boyfriend, or, you know, invasively surveilling their elderly mother, all these different things. We just kind of assume that our users are good people, and that we just have to enable the actual task, whatever it is, setting up the device or using the software, and that that’s the biggest problem we have to overcome. And we just don’t think too much about their lives and their interpersonal relationships, because we haven’t ever been taught to. So I think the big assumption is that people aren’t trying to find ways to use our products for harm, which is an assumption that’s definitely not true.

James Royal-Lawson
I guess it’s a safety mechanism for us as designers as well, to not delve into that area. Because it probably doesn’t feel great at first if you jump into the harmful assumptions.

Eva PenzeyMoog
Yes, yeah. Exactly, James, that is really spot on. Especially the domestic violence side is really hard. It’s really hard to think about, and to talk about, and to learn about. So it makes sense that people don’t want to go to work, to their design job, and think about this stuff, because it is really dark and really tough. Especially since, statistically, it’s very likely that a good chunk of our users are going through this. But that also means it’s statistically likely that we have gone through it, or someone on our team has gone through it. So that’s another sort of barrier: if you have personal experience with this you might understand it better, but that might actually just make it harder to talk about, because now you’re bringing up this really intense trauma in your workplace. And that’s always hard. So there are a lot of reasons why I think we just don’t talk about this in the workplace, but ultimately that ends up enabling the abusers and letting them do what they want.

James Royal-Lawson
Yeah. And there’s actually a multi-layered aspect to that. There was a statistic, wasn’t there, 30% or something I think you mentioned, about people who have suffered from some kind of domestic violence. Which means, if you’re a team of 10, depending on the makeup of that team, you’re going to be talking about a couple of people in your team who have suffered from it. And yeah, that’s great, you’ve got experience and awareness in the team. But opening that up and revealing it, or discussing it with your team, isn’t going to be something that a lot of people are gonna jump at the chance to do.

Eva PenzeyMoog
Yeah, exactly right. I don’t know if either of you got to the part in the book about the reliance on diversity to fix these problems. Obviously, diversity in tech is a huge problem, and it is a huge part of the solution, but I feel there are some limitations. And the one that I worry about is the assumption that, you know, a Black person on the team is going to be expected, or pressured, to get into the history of trauma that they’ve experienced because of their identities. And the same goes for a domestic violence survivor or rape survivor, whatever the trauma is. We can’t just expect that those people are going to get into some of the worst moments of their lives at, you know, the 11 o’clock design brainstorm meeting. We obviously do need diverse people, for many reasons, but I think sometimes people don’t get deeper into what that really means. And if the expectation is that those people are just there to share their trauma in order to help you with your product, then that’s not great.

Per Axbom
Oh, these are such good points, really. And you make good points about how we include people: do they get reimbursed, how do we have empathy for people, all the things that are part of the research stages of understanding what problem space we’re really dealing with. Can you give us some advice on how we get started doing the right thing, even in the research phase?

Eva PenzeyMoog
Yeah, so, the research phase. Again, I think it’s about dealing with that assumption that people aren’t going to try to weaponize our product for something nefarious. So: looking for articles or scholarly content, I use Google Scholar a lot for that, to see if there’s existing content out there about ways that similar products or similar features have been misused. Some of them are really easy. If your product is going to have anything with location data, there’s obviously a lot of stuff out there about how that gets weaponized for stalking. Sometimes it’s a little tougher to find existing material, and a lot of times things are really emergent and haven’t been reported on yet. Which is why there’s a separate phase for brainstorming those novel abuse cases, for things that you haven’t found in your existing research. Yeah.

Per Axbom
It was interesting what you said about the abusers being creative. It’s alarming to think about the lengths they go to, dressing up, like in the example you gave. I just have to share my experience of reading that story, I think it’s early on in the book, where someone is stalked. I think it’s the story of Eric and Rob. Even the person being stalked is extremely tech savvy and can’t figure it out; he keeps turning off all these GPS-enabled things. But in the end it turns out to be, I think, the car or something like that. And when I read it, I was like, that could happen to me, and I’m tech savvy. Just reading that, I started looking at my phone: what else could it be? It really creeped me out.

James Royal-Lawson
I read that and thought of similar things we’ve discussed. I mean, we test so many things and try so many things, and then years later you kind of go, oh god, why is that doing that? And you dig down and you realise that you tested a service, like, six years ago, set something up, and forgot about it. And then suddenly it got triggered and was doing things. Yeah. Very…

Eva PenzeyMoog
Yeah. I mean, I don’t want to say it’s good that you feel that way, because it is scary. But I think it’s good for even people who work in tech and are very tech literate to realise: oh, actually, there are so many different ways that this could happen, and it can be so sneaky. The car thing is so new, which I think is why it’s especially scary, and a lot of people just don’t realise that it can happen. But there are documented cases of it happening. And aside from stalking, there’s one case, in Australia I think, of a woman whose ex-husband was doing things like unlocking the car doors remotely when the kids were in the car, and then using that as a reason that she was an unfit mother in their custody battle. So there are all these other things with cars that are really creepy.

But I’m glad you bring it up, because one of the big arguments against my entire premise, which is that technologists need to be thinking about this and thinking about how to prevent and mitigate the harm, is that people will say: well, it’s the user’s responsibility to understand the products they’re using. Of course these can be used against you, you need to know that, and then it’s sort of your job to learn how to identify that it’s happening and regain control, whatever it is. Which, I’m always like, that assumes so much tech literacy that most people just don’t have.

But I think at this point it’s the case that even people who are very tech literate can still fall victim to these things. It doesn’t really matter, because no one can take the time to actually understand every single thing about every single piece of tech that they’re using, especially when some of this stuff isn’t going to be documented for a few years, by, you know, either academics or journalists who are reporting on it. So yeah, I think it’s good to hear that, and hopefully it helps some of those people who think we should just be educating users more to realise that actually, it can happen to anyone, even the most tech savvy people. Therefore we need to be preventing it at the source.

James Royal-Lawson
For me, this ties in as well with consent. We’ve talked about asking for informed consent, and regularly re-asking consent, for these kinds of things. But at the same time, I think one of the problems with consent is related to what we’re talking about now: the complexity of what you’re agreeing to. To expect all your users to be able to understand everything about every bit of consent they give… we can’t make that presumption at all, ever, really.

Eva PenzeyMoog
Right? Yeah, that’s a good point. And I think that’s where it comes into play that it’s really important to identify: okay, what are the parts of the product where we really need to prioritise the user’s understanding? Such as, with this car interface, people can essentially have access to the location of the car at any time. That’s a huge one. Stalking can be so dangerous; it can lead to all sorts of terrible things, murder being the big one, which happens all the time. Here in the US, three women every day are murdered by a current or former intimate partner. And that statistic is not true in other places in the world; it has a lot to do, I think, with our very easy access to guns here. But yeah, letting people keep their location secret is so important.

So that’s where I think it’s just taking the time to think: yeah, we can’t actually get our user to understand every piece of every feature and every little thing, but we can prioritise. We can think about what’s going to be the most dangerous, and then do something to say: hey, did you know that this user, whatever the username is, is viewing your location? And they can be like, ‘yeah, of course’, and dismiss that little notification, or they can be like, ‘oh no, I didn’t know that, and now I can make more informed decisions about my safety’. And that can be literally life saving.
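[A minimal sketch of that kind of notification, assuming a hypothetical location-sharing service. The event shape and the notifyOwner pipeline below are invented for illustration, not from any real product.]

```typescript
// Editorial sketch: surface location access to the person being
// located, instead of sharing it silently. All names are hypothetical.

interface LocationViewEvent {
  ownerId: string;  // the person whose location was viewed
  viewerId: string; // the account that viewed it
  viewedAt: Date;
}

// Called by the (hypothetical) location service on every access.
function onLocationViewed(event: LocationViewEvent): void {
  notifyOwner(event.ownerId, {
    title: `${event.viewerId} viewed your location`,
    body: "Review who can see your location, or revoke access.",
    action: "review-location-sharing",
  });
}

// Stand-in for whatever notification pipeline the product uses.
function notifyOwner(
  ownerId: string,
  n: { title: string; body: string; action: string }
): void {
  console.log(`[notify ${ownerId}] ${n.title}: ${n.body}`);
}
```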

Per Axbom
Yeah, exactly. That made me think of Google Nest announcing that you would soon be able to use the Google Assistant on the Nest device, before they had even told anyone that it had a microphone. It wasn’t even in the documentation. It was like: oh, this thing has a microphone, okay. So even if you’d read the manual, it wasn’t there. And of course they weren’t using it, they say, but anyone can hack a microphone on a tech device.

Eva PenzeyMoog
Yeah, that’s really creepy.

James Royal-Lawson
So that’s really interesting: you’ve got hardware in something, but because it’s not in use, it’s not a feature, so you don’t need to talk about it, so you don’t need to get consent to use it, because it’s not being used… Oh god, that’s such a…

Eva PenzeyMoog
Yeah, see, this is where it gets so… you know, obviously I really believe that individuals and teams doing this work is a really powerful thing and can make a huge difference. But this is an example where it’s like, man, our laws are so far behind. That shouldn’t be legal; that just shouldn’t even be a thing that can happen. And our politicians need to catch up.

James Royal-Lawson
But then we’ve also got all this talk over the years about how Apple, for example, has been crippling devices. They reduce the CPU speed, you know, they make them slower, and they say it’s because your battery is getting older, they’re making sure your device keeps running. So you’ve got companies providing hardware who are messing with your hardware. Google with the Nest: it doesn’t have a microphone, now it does have a microphone. All these situations where we’ve got people playing with devices which you have maybe learned to understand. You’ve got a physical object, you’ve got an understanding of that object, and suddenly it changes. So your understanding of what it’s capable of, and what your risk is, instantly changes as…

Eva PenzeyMoog
Right? I mean, it feels like an impossible task to ask people to just keep up with all that, including somehow foreseeing that there’s hardware that was literally never mentioned and that you couldn’t see. Like, how on earth could someone intuit that that was there? That’s impossible.

Per Axbom
And speaking of regulation, how will the legislators become aware enough to understand what they have to do to legislate and create policies and laws around these companies?

Eva PenzeyMoog
Yeah, good question. Big question. I mean, gosh, I don’t know much about Swedish politics. I know here in the US there’s a big problem with some of the elderly people in Congress literally not understanding. They asked Mark Zuckerberg: well, how do you make money if this product is free? And he was like: well, we sell ads. People don’t even have that level of understanding. I am hopeful with people like Alexandria Ocasio-Cortez in the mix now, who are younger, who are tech savvy, who understand these things. And they did actually just introduce a really good law about algorithmic bias, and it’s working its way through some committees right now. It’s actually a really good law that activists are really happy with.

So I think there is hope. But to actually answer your question, I feel the thing that has to come first is that public opinion has to turn against these companies. And that is happening, which is very hopeful. Usually it’s activists working on things, academics working on things, people who work in the actual industry starting to understand the harms, and then that goes out into the wider world, and average people start to say: oh, actually, I don’t like this, this is harmful. And that’s when politicians start to take notice. So I think that is starting to happen in tech, which is very, very nice to see.

James Royal-Lawson
How we deal with and assist the victims, or survivors I think you call them in the book, of this kind of abuse… are they part of how we can increase awareness and help change the world and change the policies around it? Can we… I don’t want to say leverage survivors, that sounds absolutely awful. But…

Eva PenzeyMoog
No, yeah, definitely. And Per kind of mentioned this earlier, but there’s the idea of paying people for their lived experiences, because they are the experts in the topic. They’ve literally lived through it and had the experience, and that is very valuable information that the average person who hasn’t gone through it just doesn’t have. So centering survivors, as well as working with the experts, people who work in shelters, lawyers who work with survivors on different things, those sorts of people in the support space. Those two groups are so essential to work with as we craft this. The survivors of any sort of abuse or harm should be at the centre of the work to actually fix it. If we’re not fixing it for them, who are we fixing it for? So they always need to be at the centre, for sure.

Per Axbom
And you have a whole chapter on interviewing vulnerable users, and I think there are some really, really good tips there. One of my favourites being to actually see the value in having therapy, and getting that ahead of time.

Eva PenzeyMoog
Yeah, I could not pass up the chance to hype therapy, for those who have access to it. I think it’s really important, just in general as a human being on this planet, but especially if you’re going to be taking on some of these tougher issues. It is really tough to sit down and read about domestic violence and these different things for a few hours; you come across so many horrible things. And having a therapist who can help you work through that is definitely key, especially as I think about this idea that changing tech for the better is a marathon, not a sprint. It’s years, if not decades, and we need to be able to persevere through that. I think prioritising your mental health, and figuring out how to make it a long-term thing, is really important. Especially because the Mark Zuckerbergs of the world are counting on us to give up after a little bit and not put in a sustained effort.

Per Axbom
That’s such a good point. And towards the end of the book, you use the seat belt as an example of how you have to actually let it take time.

Eva PenzeyMoog
Yeah, I’m very obsessed with the history of the seat belt. It’s such a good way to look at a paradigm shift, how it happens, and the different phases it has to go through. With the seatbelt, it was 32 years between seatbelts being introduced and meaningful laws and actual change: the creation of a branch of our government that actually manages things like car safety and road safety. And then they passed a law that said: hey, car manufacturers, you have to have seatbelts as standard. They can’t be an add-on feature that you charge a few thousand extra dollars for; you can’t put your profits over user safety. Which is really cool.

But that was 32 years, and it started in the 50s and 60s. So I think we’re able to move a little faster now. People are able to get educated and understand the harms of the internet and of tech much more quickly than they were able to understand the harms of what car manufacturers were doing. But it’s essentially the same thing: all of the same things that were happening with the car makers in the 60s are happening with tech right now. And it did change. Now they’re really regulated; you can’t have a car without a seatbelt.

James Royal-Lawson
And you’re not allowed to drive without wearing the belt. I know when I was a child it wasn’t a legal requirement to put a belt on. I still have memories of sitting in the middle between the two front seats, so I could see better and be close to my parents and everything, which is crazy when I think about it now, being in such a dangerous position in the car as I grew up. But I remember when the law changed and we all had to be belted up. You’re right, it takes so much time. I mean, the mechanisms and processes we have in place in many countries are still not designed and equipped for the kind of technology, and the pace of technological change, that we’re living with.

Eva PenzeyMoog
Yeah, right, exactly. And this is kind of what I was saying earlier: laws are always the cap. They’re always the last thing. We have to work really hard, we have to fight really hard. Politicians don’t just give us the safe, right things without a tonne of pressure being put on them. Maybe that’s changing with the Alexandria Ocasio-Cortezes of the world coming into power; things like that are very hopeful. But for the most part, most politicians, until all of their constituents are demanding something, are not just going to look at tech and go: oh, this is causing a lot of harm, we should probably regulate it. Especially when there are lobbying powers, and companies basically have the same rights as humans in the US when it comes to donating to political parties and campaigns. There are so many different problems to tackle. But yeah, anyway, the seatbelt thing is very instructive, and I’m very obsessed with that story.

Per Axbom
And it’s a Swedish invention. I just had to get that in.

Eva PenzeyMoog
It is, yeah. I was gonna bring that up.

James Royal-Lawson
I was waiting! Because I know he always mentions the Swedes when we talk about safety belts; it always comes up. I was thinking: he hasn’t mentioned it yet, when’s it coming?

Eva PenzeyMoog
Yes, we have the Swedes to thank for that.

Per Axbom
I have one… oh, you go ahead, James.

James Royal-Lawson
Oh, I have a final question as well. But we can both have our final questions if we’re quick enough. So: I am in a team, and we’ve not really considered safety in our product before. How do I get started? What’s my first step, to get the team together?

Eva PenzeyMoog
Can I ask, do you have a leadership position on the team? Are you in charge?

James Royal-Lawson
In this fictional team we’re thinking of, no, I’m not. I’m a member. Okay.

Eva PenzeyMoog
Well, I wanted to ask because I’m trying to be very intentional that it’s the leaders in tech who are ultimately responsible for this. I’m trying not to say Facebook; I’m trying to say the people who run Facebook, because those are the people who are ultimately accountable. And I think any sort of team director or team leader should be leading out on this, instead of waiting for their team to come to them and make the case. But recognising that most of us are not in that position of power, my advice for getting started is to first get an ally on the team. So start building that power a little bit behind the scenes. Maybe that sounds creepy or unnecessary, but I think it’s really hard to go it alone, and it helps to have at least one other person who’s like: yeah, this is really important.

And then have a plan for when you bring it up, so that they’ll immediately back you and be like: yes, I agree with James, that is extremely important. That’s going to be very powerful, because being isolated can be really, really tough. So start with that. And then I have the process in the book, five steps that people can overlay on their design process. Just take a look at that and go: yeah, let’s do some research, let’s set aside two hours for a brainstorm. There are very specific, discrete activities you can do. So it’s about finding the time for that, and making the case with your leader, your stakeholder, whoever it is who approves it, that this is something we really need to do, and that it’s not going to take that much time and money. It’s ultimately pretty quick to do this work.

Per Axbom
That’s really good advice. I actually just wanted to end on this: you’ve been doing this for a while now. What signs of hope are you seeing for positive change?

Eva PenzeyMoog
Yeah, I’m seeing a lot of signs of hope, actually. The law that I mentioned earlier about biased algorithms, which is in Congress right now, is just so exciting to see, because it marks that public opinion is shifting. And the other thing that gives me a lot of hope is that I think nearly everyone in tech sees at least some forms of harm, even if it’s not the one that I’m focused on. They understand the issues with misinformation, or online harassment, whatever it is. People are starting to realise. We’ve gone through this shift from ‘Facebook is the best, and Twitter is amazing, and what an amazing thing the internet is’ to ‘wow, there are so many problems, this is doing so much harm’. And seeing the average person who doesn’t work in tech start to go: oh wow, this is all really messed up, and we need to do something about it. That gives me a lot of hope.

Per Axbom
Yeah. The awareness is key. And I think the answer to my previous question, on how to get the legislators to take notice and understand, is that you put your book in their hands, obviously.

Eva PenzeyMoog
Yeah, that would also be good.

Per Axbom
Thank you so much for this. It’s been really exciting.

Eva PenzeyMoog
Yeah, thanks. This was a really fun conversation. Thanks for having me.

[Music]

Per Axbom
So, talking about default passwords, which is also something that Eva has an example of in her book. Especially, I think, when it comes to Ring cameras, and people hacking into Ring cameras because the owners didn’t change the default password that came with the device.

James Royal-Lawson
Ring cameras, this is the doorbell with a camera.

Per Axbom
This is really prevalent, I think, in the US. It seems a lot of people are talking about it on Twitter. People have these door cameras that are motion sensitive, so not just when people ring the bell, but when people simply pass by outside, it actually films them. And this is a huge problem in itself, in that people actually save those recordings, upload them to the internet, and have Ring watch parties.

James Royal-Lawson
Full of funny clips of people coming to the door. Yeah, that’s happening. Yeah. Okay.

Per Axbom
So whatever we talk about, there’s something. Yeah, that is actually Eva’s point: everything is being abused. But my point here was that people were also using default passwords to hack into other people’s devices and download their video material. And there was a law passed in California, I think it’s two years ago now, that actually prohibits companies from shipping devices with these default passwords, especially when it comes to Wi-Fi routers, but I think it propagated to other devices as well. Because previously, people could just get into other people’s Wi-Fi networks, because they’d just kept the default password, which is…

James Royal-Lawson
Admin admin, you know, that’s what it’s gonna be, right?

Per Axbom
It’s the same everywhere. Or you can just do a search online and find the default password for a device. Yeah. The interesting thing about that law being passed in California, of course, is this: why would they make one device for everywhere else that has a default password, and one device for California? So they just changed all devices to not have a default password. And that, of course, propagates even further throughout the world, as there are so many things that actually come out of California. So just that law being passed in one local place actually affected most of the world.
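[The Californian law Per refers to is, we believe, SB-327, which took effect in January 2020. Below is a minimal sketch of the two compliance patterns such a law allows; the function names are invented for illustration.]

```typescript
// Editorial sketch: either each unit ships with its own unique
// password, or the device is locked until the owner sets a credential.
// Names here are hypothetical, not from any real device firmware.

import { randomBytes } from "crypto";

// Pattern 1: generate a unique default password at provisioning time,
// printed on each unit's label instead of "admin/admin" everywhere.
function provisionUnitPassword(): string {
  return randomBytes(12).toString("base64url"); // 16 unguessable chars
}

// Pattern 2: ship the device locked, refusing all functionality until
// the owner has replaced the factory credential.
function deviceIsUsable(ownerHasSetPassword: boolean): boolean {
  return ownerHasSetPassword;
}

console.log("Label password for this unit:", provisionUnitPassword());
```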

James Royal-Lawson
Yeah, and now I think there might be different laws. But previously that was the case with, I believe, social media and so on. There, services were limited to use by people older than 13. And that wasn’t because of any law in a European country, it was because of a law in California that said you had to be 13 to use the services, and they just didn’t bother adjusting or internationalising the services for every single place in the world, every jurisdiction in the world. So it became kind of an urban myth that you have to be 13 to use a service. Yeah, you do, but…

Per Axbom
Yeah. But it has nothing to do with the local law.

James Royal-Lawson
It’s not the law exactly; it’s decisions made for the service based on the California law. And I think that’s a really interesting and complex issue. Not that the rest of it isn’t complex, but this whole thing about internationalisation. Eva mentioned a few American examples, and laws and so on, and there are all these laws and situations. What your context is in one country… what is a privacy nightmare in one place is a cultural norm somewhere else. I think in Eva’s book she has name and address as an example. Revealing name and address can be a real privacy issue for Americans, because it allows you to do certain things, and the same is true in many countries. Whereas here in Sweden, a real name and address isn’t really a big issue, because you can find it out pretty easily; it’s not that hidden. It’s still personal information, but the culture around your name and address isn’t quite the same as it is in other places.

Per Axbom
Exactly. Another example would be dating apps, because dating apps often start in the US, and then they start being used in lots of countries around the world. And I know that in certain cases these dating apps have been used by authorities to track down people who are homosexual, because being homosexual is outlawed in 70 countries around the world. Sometimes it leads to jail, sometimes it just leads to harassment, and sometimes to murder.

James Royal-Lawson
Yeah.

Exactly. And cultural norms, norms in the family or household, can vary too. An example here would be that children might walk to school by themselves, or take themselves to school, whereas in the UK nowadays that wouldn’t happen; you’d be taken by your parents. So some technology might be accepted to help you in one context. What might seem like an alarming thing, that you can track something or do something or respond to something, might well be accepted in a certain country but not in exactly the same situation somewhere else. And this is me not judging things as good or bad based on country. I mean, it’s not okay that domestic violence happens in a certain country because that’s just how the culture is; that’s not what I’m saying. It’s more that I’m trying to lift the international aspect that we’re always…

Per Axbom
We have to acknowledge that. You’re talking about assumptions, and what we talked about early on in the interview with Eva was the assumptions we have. We always assume that we know the ones we are designing for, that we know how they are and what their families look like. But we really don’t, because we don’t know where and how it’s going to be used.

James Royal-Lawson
I think that’s probably what I’m getting at: this whole thing about how we take that first step to get going with all this. Maybe a relatively easy exercise is to say: okay, what if our product is used in country X? To make you start thinking beyond your culture, beyond your country. Because we’ve seen time and time again how the Americanisation of the internet we’re using has caused many, many problems in countries that are less American, or whatever that phrase means. Like your example now with homosexuality: in some countries these products can be weaponized more easily, by governments, by people, in different ways. So if you start thinking beyond your culture, that maybe is a healthy starting point. Or a potential…

Per Axbom
Definitely. Yeah. We have a recommended listening episode that you put into our show notes, James: it’s Oblivious Design, episode 166.

James Royal-Lawson
Yeah, this is a chat from about four years ago, in which, at least, I started off the episode by interviewing you, because you had launched… what do you call it, a web service?

Per Axbom
People call it an app, even though it’s just a website.

James Royal-Lawson
An app called Dick Pic Locator, which is exactly what it sounds like. But that was an ethical exercise on your part, an ethical awareness initiative.

Per Axbom
It really was, exactly. I mean, those tools are a dime a dozen on the internet. My contribution, as you said, really was the name, so people understood what it could be used for.

James Royal-Lawson
And so we had a discussion around what we coined, or named, oblivious design: how there are so many unintentional uses of the things that we create. And I think that conversation and that episode fit in really nicely with the topic of Eva’s book. Presuming that your product can do harm is a similar idea to all the different uses that may come in the future which you were oblivious to, because of the narrow set of scenarios you started off with.

Per Axbom
Exactly. So, you have been assembling quite a number of volunteers, James, and they have been a fantastic help. We have several teams now. What are they?

James Royal-Lawson
Yes, we have one team that helps us with checking through the transcripts. We have another team that helps us with publishing those completed transcripts. And now we have a third team, which is listening to the episodes in advance, going through and picking out all the references they can find, and finding useful links according to what we’ve said. And that team I would really love to grow, because it does help us; the time we get from listeners is effectively worth so much money to us. It helps reduce the cost of doing this podcast and keeps it ticking.

Per Axbom
And I think a lot of people don’t take the time to visit the show notes. But I know that if you do, you’ll find something especially interesting: there are lots of links to read further, and sometimes videos to watch, and sometimes some of my mind maps to look at. I mean, there’s so much content beyond what we’re talking about on the show; there’s more to dive in on if you take the time to visit the show notes as well.

James Royal-Lawson
These shows are always a starting point for discussions, inspiration and learning, and we’d love your help to keep doing it. So email us, or get in touch with us, it doesn’t matter how you do it, although we recommend you don’t use smoke signals; we’ve found those difficult to get across from Australia. But: hey@uxpodcast.com / hej@uxpodcast.com, H E Y or H E J… H A Y I should say… no, hold on… which one do we use, Per? I’ve made a mess of that.

Per Axbom
Both of them.

James Royal-Lawson
H E J and H E Y.

Per Axbom
Yeah. Swedish or English spelling, as we say. Okay, remember to keep moving.

James Royal-Lawson
See you on the other side.

[Music]

Per Axbom
James

James Royal-Lawson
Per

Per Axbom
I don’t trust stairs.

James Royal-Lawson
You don’t trust stairs?

Per Axbom
Yeah, they’re always up to something.

James Royal-Lawson
Oh… Apart from the stairs to my cellar.


This is a transcript of a conversation between James Royal-Lawson, Per Axbom and Eva PenzeyMoog recorded in August 2021 and published as episode 270 of UX Podcast.