Ascend UX

Deceptive Design

Episode Summary

Have you ever felt frustration attempting to unsubscribe from a product or service? Or perhaps confusion when managing data privacy settings? Those experiences were likely designed to mislead you through so-called "dark patterns." Our special guest Jennifer Li has researched this topic and discusses different types of deceptive design and their negative impact on the user experience. We also debate the UX designer's responsibility to themselves, their employer, and society when business and user goals collide.

Episode Notes

Helpful Links 

 

More Ascend UX Content 

 

Contact Us

 

Credits

 

Copyright 2020 Ascend UX. All rights reserved.

Episode Transcription

Episode 6: Deceptive Design

Introduction

Ayan: [00:00:00] I had one actually yesterday, that happened to me. We were trying to unsubscribe from a service that we no longer use, and we went and finally found the unsubscribe button. And rather than just making it very fluid, we were told, "Are you sure you want to unsubscribe from this service?"

Yes. We want to unsubscribe. 

And then they showed us how much money we saved by using that service, so there's this financial anxiety saying, " Oh wow, I'm going to spend more money if I don't maintain this service." 

So, we continued, and we said, yes, we still want to unsubscribe. And then they showed us all of the services that we'll no longer be able to access. So, we kind of had to like go through a tour of the platform before we were able to finally unsubscribe.

Evan: [00:00:40] Oh, wow... Did you unsubscribe?

Ayan: [00:00:43] I think so. Yes.

Evan: [00:01:03] Howdy friends! This is the Ascend UX podcast, a show about the experience of user experience design. I'm Evan Sunwall.

Ayan: [00:01:10] And I'm Ayan Bihi.

 

Experience with "Dark Patterns"

Evan: [00:01:12] So Ayan, we spoke about this topic with a mutual acquaintance—dark patterns, manipulative design, designers creating experiences that seem to mislead—and this person had some very strong opinions about a recent experience they had, didn't they? (laughs) 

Ayan: [00:01:27] Yeah, for sure, especially when I said the word "dark patterns", he was like, "Wait, I have something for you. Let me show you the screens."

Evan: [00:01:33] Yeah, they went on to describe a situation where they thought they had unsubscribed from a service, and some time elapsed and then they realized they hadn't unsubscribed at all because of the copy that was in the experience, and they were—rightfully—a little livid about it.

They went on to contact the service to describe and explain the situation, and the company admitted no fault for what they had created. They said, "We don't agree. We don't agree this is misleading."... that's pretty frustrating!

Ayan: [00:01:59] So, so frustrating. Just the time it takes to have to go back and redo something that you feel like you've been tricked into.

How about you, Evan? Have you had any experiences with dark patterns recently or just in general?

Evan: [00:02:12] One of the things I experience with buying stuff online—You put in your email and then you get signed up for their newsletter. And I'm like, "No! Get that stuff out of there. I don't want that stuff!" And then, every time I gotta engage with the unsubscribe process I have to go into vigilant mode. I have to find it first, because sometimes they'll style it subtly—maybe the contrast of the text will be a little off, and it'll look washed out, or it'll be phrased strangely. Then read the webpage—how are they framing the unsubscribe process? "Click on this to unsubscribe" or " Click on this to maintain contact". I have to strain to make sure that I am understanding what I am doing correctly because I so frequently get it wrong... and I hate it.

Ayan: [00:02:54] Me too, actually. I really don't like newsletters if I can be honest, so I always exercise my right to unsubscribe. But the worst is when you unsubscribe and then that newsletter comes back and you're like, "Wait, did I unsubscribe, or was that just a figment of my imagination?"

Evan: [00:03:09] Yeah! Yeah. And then you blame yourself. Like, "I didn't do it right! Oh no!" And then you go through it again. And yeah, it's a bad feeling.

We've talked about design as manipulation in the sense that we want to help people achieve goals and use a product successfully, but there's an underhanded approach that crosses an ethical threshold: manipulative design that doesn't keep the user's interests at heart, creating a negative experience and negative emotions in your user base.

 

Jennifer’s Intro

We have a special guest, Jennifer Li, that's going to help us talk about this topic. 

Jennifer Li recently graduated from the University of Washington, Seattle with a Bachelor of Science in Information Technology and a concentration in Human-Computer Interaction. Jennifer developed her product design skills interning in a variety of environments, such as startups, design consulting agencies, and larger companies such as Indeed. She landed a job post-graduation and will soon be a Technology Advisory Consultant at Ernst & Young, where she'll use her passion for problem-solving and advocating for customers.

Jennifer, welcome to the show!

Jennifer: [00:04:12] Yeah, thank you so much, Ayan and Evan! I'm very excited to start this conversation with you guys.

Evan: [00:04:17] For the audience, let's just talk about what is a design pattern? 

Jennifer: [00:04:21] So design patterns are repeatable solutions used for recurring design problems. They act as a common language among designers, and guide users to achieve their goals while interacting with different platforms. Design patterns are very valuable and efficient, as they're created to provide seamless and enjoyable interactions.

Evan: [00:04:42] These are kind of shortcuts. So when a user encounters a novel situation, they can use existing prior knowledge around that experience to navigate it more successfully. Is that right?

Jennifer: [00:04:54] Exactly.

Evan: [00:04:54] So what are dark patterns?

Jennifer: [00:04:57] Dark patterns are created to manipulate user interactions and mislead users into doing something they don't want to do. The term "dark patterns" was actually coined by a UX practitioner named Harry Brignull in 2010. And Brignull defined dark patterns as instances where designers use their knowledge of human behavior, such as psychology or cognitive science, and the desires of the end users to implement deceptive functionality that is not in users' best interests.

Ayan: [00:05:28] Why do you guys think that dark patterns are being consistently used within our practice as designers? 

Jennifer: [00:05:33] So dark patterns are usually created in the interest of business goals and short-term gains like the number of subscribers or the amount of revenue. Our tech industry is very fast paced. There is definitely a pressure on the business and product teams to meet or exceed their business goals in a very short amount of time. 

Evan: [00:05:53] Let's talk about some of the different types, what are the most common dark patterns that are out there these days?

Jennifer: [00:05:59] Going back to Brignull, he actually defined 11 types of dark patterns, but all the dark patterns fundamentally use five strategies. They are nagging, obstruction, sneaking, interface interference, and forced action. 

 

Pattern 1: Nagging

Evan: [00:06:15] What about nagging? Let's describe nagging.

Jennifer: [00:06:17] Nagging is defined as the redirection of expected functionality that persists beyond one or more interactions. In plainer terms, it means the app or website keeps pushing, or nagging, users to do something until they actually do it.

So for example, two years ago Instagram had this function where they asked users if they wanted to turn on notifications, and instead of giving a "No" option, there were only two options: "Not Now" and "Yes." And when you click "Not Now," Instagram just keeps sending you that popup. Users can't really say, "No, I don't want to receive notifications."

Ayan: [00:06:53] I actually had one in regard to nagging yesterday; it happened to me. We were trying to unsubscribe from a service that we no longer use, and we went and finally found the unsubscribe button. And rather than just making it very fluid, we were told, "Are you sure you want to unsubscribe from this service?"

Yes. We want to unsubscribe. 

And then they showed us how much money we saved by using that service, so there's this financial anxiety saying, " Oh wow, I'm going to spend more money if I don't maintain this service." 

So we continued and we said, yes, we still want to unsubscribe. And then they showed us all of the services that we'll no longer be able to access. So, we kind of had to like go through a tour of the platform before we were able to finally unsubscribe.

Evan: [00:07:34] Oh, wow... Did you unsubscribe?

Ayan: [00:07:37] I think so. Yes. We'll see shortly; I'll check my bank account in a couple of days.

Evan: [00:07:44] Exactly what we're talking about right now. Like, "I think I did. I think?"

 

Pattern 2: Obstruction

So, what about the next one, Jennifer?

Jennifer: [00:07:50] The next one is obstruction. Obstruction is when a process is made more difficult than it needs to be with the intent of dissuading certain actions. While I was doing research on dark patterns, I was trying to delete my Amazon account. The process was a perfect example of obstruction.

First, I spent about 10 minutes going to the account settings and searching the page for how to delete my account. And I couldn't find a way to delete it! So, I had to Google it myself: "How do I delete my Amazon account?"

Finally, I went to their help center—which you have to scroll all the way down the page to find, in a very tiny font—and you click "Help." And even under Help there's "Do you need more help?" And then in there, there's a form that finally asks you, "So what is your problem?"

So, I'm like, "I really want to delete my account and clear my data." After taking all those routes to get to that point, instead of giving you the option to delete your account, they set you up to email a customer representative.

So there is no direct option to say "delete your account." You have to talk to someone and wait for them to persuade you with all the reasons why you shouldn't delete your account. And Brignull actually categorizes this specific dark pattern as a Roach Motel: it's very easy for users to get into something, but difficult to get out.

Ayan: [00:09:08] So Jennifer, how long did it take you in total from the moment that you decided to do it 'til you got that email to contact the customer service representative?

Jennifer: [00:09:16] Too long. It was probably 25 to 30 minutes.

Ayan: [00:09:19] I'm wondering how many people just say, "Forget it, I don't care anymore." 'Cause 25 to 30 minutes is a lot of time, so for people to persist, to actually fulfill that goal of deleting their account, I'm sure it's like 1%.

That's not really creating a positive user experience where you feel that you want to continue using it. Rather, you feel imprisoned by the structures that are created to make you stay.

Evan: [00:09:41] There is a hidden cost here that many people discount, or don't think about, or dismiss with, "Ah, whatever." But there's word of mouth, the general perception of your company: "I don't have a good feeling about them." That's a lot of negative emotion around that service. You're not necessarily going to get a referral.

Ayan: [00:09:54] But do businesses or companies have the best intentions in mind? For example, could they be saying, "We have all this information of the user and we offer a great service. We don't want it to be easy for them to unsubscribe. We want it to be difficult because there's a lot for them to lose."

Evan: [00:10:10] I think there's a role in, "Hey, if you're going to unsubscribe or delete the service, there's a couple of things you should know." Either "This is what we're gonna do with your data, this thing's going to be discontinued. There's some benefit to this, and we want to make it clear to you that there are some consequences to doing this and we don't want you to do it accidentally."

However, how much friction—how much physical load, cognitive load—are you baking into that experience, where it becomes that obstructionist dark pattern that we talked about before, where it's fairly obvious that I've gone through four hoops now, and you're still not letting me unsubscribe versus maybe one hoop communicating potential consequences or making sure—just confirming—you really want me to enact a permanent change or something that could be destructive.

 

Pattern 3: Sneaking

So Jennifer, how about the next one?

Jennifer: [00:10:56] The third strategy is called sneaking. Sneaking is when apps tend to hide, disguise, or delay information that is relevant or valuable to users.

Evan: [00:11:07] We had an experience where we were using a new tool and some of the terms of service were not clear. They were on a separate page that I had not clicked on. And it kind of would have changed my thoughts about how we would have signed up for it and how we would have used it.

It was there. I could have clicked on it, but it was so deemphasized, it was stylized in such a manner that I didn't see it. And that was important information to make a final decision about signing up for it.

Jennifer: [00:11:33] Another example is user agreements. I don't know if you guys read user agreements, but I really don't because they're like 50 pages long and it's very hard to read every single sentence.

Ayan: [00:11:45] I try to, but in the end, you just feel overwhelmed and I just give up. 

Evan: [00:11:50] It's been a pretty rare case where maybe I tried to look for how my data would be used and then I couldn't find it. It seems like it's written for my lawyer...

Jennifer: [00:12:00] As users it's hard for us to read everything and we just want to filter through to the thing that can actually reach our goal. Companies can take advantage of that to sneak information that is relevant to us, and that we don't see.

 

Pattern 4: Interface Interference

Evan: [00:12:14] How about the next one, Jennifer?

Jennifer: [00:12:15] The next strategy is called interface interference. This refers to manipulation of the user interface that privileges certain actions over others. Going back to our subscription examples, there are sometimes comparisons like, "Hey, annually you save this much money, versus monthly, where you're not saving much money."

Evan: [00:12:36] Another thing that I've seen around that is sometimes in games if there's a micro-transaction where you can purchase something, they'll mark something as "This is the best deal! This is the one you should get!" And it's treated differently. It's styled differently. It looks enticing. It's actually not, it's not the best deal. It's just trying to tap into a shortcut of like, "Oh, I want to get the best deal. I want something that is financially good for me."

 

Pattern 5: Forced Action

Ayan: [00:12:57] So those are the five Jennifer, or is there one...? 

Jennifer: [00:12:59] There's one more! The one that is most frustrating, in my opinion, out of the five is called forced action. It requires the user to perform a certain action to access certain functionality. For example, I came across this game I was super excited to play, and I was entering my email and creating an account password—your regular account-creation design patterns—and at the very, very bottom of the popup, it said "Subscribe to our weekly newsletter," and it was already preselected. I am an inbox-zero person, so I don't want to receive their newsletters, so I unchecked it. And when I tried to create the account, they didn't let me; they actually highlighted the newsletter subscription box in red, as if to say, "You have to subscribe to our newsletter in order to create this account." And that was very frustrating.

Evan: [00:13:48] That's another technique: cognitive lock-in. You've invested; you have some sunk costs in the experience. "Oh yeah, one last thing near the very end. We're going to bundle in a forced action of signing up or doing something else." And you're like, "I've come this far. I've got everything, and I've configured this thing," or "I put it into my shopping cart. I don't want to have to redo this again. (sighs) Okay. Alright. I'll... let's do it."

Ayan: [00:14:11] Exactly because you say, "Well, I came this far, like I did all of that. This is nothing in comparison, so, okay." 

Evan: [00:14:16] It's... yeah. It's not a coincidence! It's not a coincidence! Someone thought about that. It is intentional in many cases, and unfortunate.

 

Should These Patterns be Used?

Should UX designers use manipulative design patterns, dark patterns?

Jennifer: [00:14:30] Wow. That's the hard question. 

Evan: [00:14:33] Let's take an example: you have a prominent button on your webpage, and you want people to click it because it's basically the signup process, or maybe it's the purchasing button. Maybe you'll make it big, you'll give it a really saturated color and make it bright green or blue or something. You'll position it where people have come to expect a signup—probably in the top right of the page—using a design pattern.

That's manipulation. That's subtly using human perception to get more signups, more purchases, and all that. Manipulation as a base term is not bad—it is part and parcel of being an effective designer.

Jennifer: [00:15:07] If you understand the consequences, you understand what you're getting the users into, it could be a valid use of manipulative design. For example, a lot of fitness apps or habit-forming apps use manipulative design to hook users on the product so that they can form a healthier habit. I think that is a case where these design patterns are used for good.

Evan: [00:15:30] But what about the situation where you throw up several dialogs, or where the microcopy in a particular part of the process doesn't speak accurately to what is happening? And what actually happens hasn't been clearly communicated before they clicked on it? Now you've gone somewhere else.

Ayan: [00:15:50] I guess it's based on how far are we being manipulated? 

Jennifer: [00:15:53] I think UX designers with different ethical values might have different answers to this question, but from my personal perspective, I think a UX designer should try to avoid using dark patterns. My reasoning is that with the power of design, we carry a great amount of social responsibility. It is our responsibility to advocate for users and to try to create a safe and healthy environment for our users instead of abusing this power.

Ayan: [00:16:24] I also agree that it really depends on the designer. It depends on what their values are, and whether they're fully conscious of the effects of what they're designing and how that work can be used.

I think we really should reflect beyond just the next sprint or just releasing this product, and think about how it will affect the users and—most importantly—whose hands we're putting these tools into. And whether using manipulative design is actually the best method of getting done what needs to be done.

 

Designer Ethics

Evan: [00:16:51] If ethics were easy, we'd all be saints; it would be commonplace, but it's not. You're an influencer, a persuader, but you rarely ever actually control the direction of the ship. And you do have to accept that loss of control, but it doesn't mean you're completely out of control. You have other choices you can make too, and that may mean leaving, getting off the ship. It would be a terrible situation to be confronted with an obstinate business owner, someone who wants to embark on this path and start using these tools to retain users or to maintain revenue.

Jennifer: [00:17:24] That's a great point. And in addition, sometimes designers are just the implementers and don't really have the power to control the strategy or the business needs. What can we do as designers when we are asked to create manipulative design patterns?

Evan: [00:17:42] One thing I encourage all designers to do is some introspection about your own values. You need to know the things you're willing to do and feel good about as a professional, and the things you're not. It sounds really basic, but you'd be surprised at the number of people who don't really reflect on where they want to put their professional energy. How do they feel about the social impact on their community, on themselves? What are your values? What do you want to put out into the world?

It's not the entirety of this problem, but that may set you on a course to pursue different types of opportunities or industries, or prepare you for moments where you are confronted by a boss saying, "I want you to make this harder."

Another thing you may want to consider is the role of qualitative research and testing: "Hey look, can we get some feedback? I'll design it. We'll create a concept... can we get some feedback from our user community about this experience? Let's have someone delete the service." What will come out of that?

If you've designed a misleading approach that is disrespectful to the user's needs and goals, it will probably come up pretty quickly, even through a couple of sessions, in people's feedback and the erosion of trust. You could use something like that to persuade your colleagues: "Look, this seems to be a very negative experience and people are very soured by it. Should we continue to do this?"

Ayan: [00:19:09] It also can provide a company with an opportunity to better understand what's not working in their product so they can improve it. If I'm there to unsubscribe, that means I'm not satisfied with the service or product I'm paying for. So rather than forcing the person to stay, that could be a really great opportunity for the company to say, "Hey, we're sorry to see you leave. Do you mind if we ask you a few questions to help us understand why you're leaving?" And perhaps, rather than being pushed into a corner where they feel imprisoned, the user might think, "Oh wait, they actually care about my opinion." They feel that they can participate and offer their time and opinions to improve that service.

Evan: [00:19:45] That's a brilliant point. This is an opportunity to learn how to improve your product and service to serve your customers. It may not be a pleasant conversation, it may be a hard one to negotiate or make happen, but it's certainly going to give you richer insights into improving your offerings and being a more successful company than trapping and misleading them would.

Jennifer: [00:20:05] I think that's a great point. We're talking about the short-term gains versus long-term effect. So, if companies can take this as an opportunity to reflect on themselves and improve their product, I think long-term it has huge benefits for their brand reputation or their user base. They might be able to explore other user-based options too.

Evan: [00:20:27] Ultimately, it's feedback. It's feedback that helps everyone grow, not just companies, not just businesses and apps, but people too. And by using these techniques, you are creating a filter. You're creating a blocker to honest feedback that could really help inform your product strategy, the next direction, things to change about how your company operates. And that's a real unfortunate thing.

 

Legality

Ayan: [00:20:51] When I'm going through all these hoops and all of these moments of frustration and wasted time, the one thing that comes to my mind is, "Is this legal, what I'm being forced to do? Are there laws out there to protect me or others who are going through this?"

So, I know that in May 2018 in Europe, where I'm a resident, the General Data Protection Regulation came into force, stating that any company providing its services or products in Europe had to comply with the new restrictions. It's been over two years now, and it still seems there are issues where we as citizens in Europe are not fully protected. The regulation is there, but I'm just curious how strong it is and how easy it is for us in Europe, or even you guys in America, to be protected against these dark patterns.

Jennifer: [00:21:41] I think, from my research, there currently aren't really laws that prohibit the usage of dark patterns, but there are definitely a couple of laws created to protect user privacy and restrict companies from using dark patterns. One of them is what you mentioned earlier, the GDPR. It requires users' informed consent for each usage of their personal information. And that tackles a couple of the strategies we talked about earlier, like misdirection or sneaking.

Another law, proposed in the UK in April 2019, restricts the operation of social networking services when they're used by minors; it prohibits using nudges to draw users into options that have low privacy settings. There's one more law I've found that is related to this topic: the DETOUR Act, which stands for Deceptive Experiences To Online Users Reduction. It would make it illegal for companies with more than a hundred million monthly active users to use dark patterns when seeking consent to use personal information. So, I definitely think we're taking steps to understand how data privacy works on the internet and to protect users when they share their personal information.

Ayan: [00:22:59] My question then is: who is going to regulate and apply these? Because—I don't know if you guys remember when Mark Zuckerberg testified before the Senate—the senators were not fully conscious or aware of technology, so they were asking very simple questions, like "What is the internet?" And those are the people who are supposed to be protecting us. So, whose role will it be to actually make sure that these laws are put in place and used to protect us?

Evan: [00:23:27] I think I see on the horizon the creation of regulatory bodies around some of the most egregious forms of dark pattern implementation, particularly around user data or mental health. 

I do see a legal obligation emerging. It may become the responsibility of the organization, but it's definitely going to be in the designer's hands to comply, because if you don't, your company could be fined or even put out of business as more countries start to understand the insidious nature of some of these patterns.

That's a distinctly different world than we find today, which is, "Design whatever you want. Do whatever you like; there's no consequence." In some decades' time, that will not be the case, and you'll need to learn and understand the rules around these laws.

Ayan: [00:24:17] Fun fact, actually: in France, there is the National Commission of Informatics and Liberty, which was created in 1978 to protect citizens and their privacy. So people who have any issues can contact the Commission and report any instances where they feel their privacy has been abused.

So I agree with you, Evan. I think that will become more and more common. Currently, with design and also with tech, it's like the wild, wild West. We've had a free-for-all, and now we're saying, "Oh wait, this is actually quite powerful. Maybe we should put some laws and regulations in place to manage this." So yeah, let's see what the future brings.

Evan: [00:24:57] I already see the students getting excited about their law class in design school. Whoa boy, they're going to be so excited. (laughs)

Ayan: [00:25:06] That's actually a really great idea, but I think, as you're saying, Evan, we're in this space of where things are just being defined. And I think it's the same thing for the General Data Protection Regulation. And I know in California, they're also considering applying similar laws there as well. So, I think it's just going to take time, but...

Jennifer: [00:25:24] Well, the good news is we had to take a course specifically on ethics; it was part of our required education, so we definitely have conversations like that. In our minds, at least, we are trained to think about "Why do you do this?" or "What are some consequences?" before actually jumping into designs. So—good news—people are getting more educated about it and becoming more aware of it.

Ayan: [00:25:47] We spoke earlier about accountability on the designer's behalf, but I think there also should be accountability on the humans who are interacting with these tools as well—to question the tools and to also keep in mind that the tools that we have integrated in our everyday life might not always have our best interests in mind.

Jennifer: [00:26:06] It's funny. I was talking to my roommate and I asked her, "So do you know what a dark pattern is? Or have you heard of the filter bubble?", which is another concept related to dark patterns. And she was like, "What are you talking about?"

And that's surprising. It makes you think about how little people outside of the design or tech industry know about internet or product manipulation, design patterns, and things like that.

So, it is definitely also users' responsibility to be educated, to be informed, to read about dark patterns. And when they see dark patterns, speak up and let the companies know the consequences, and that users are not that easy to trick or mislead.

 

Jennifer's Exit

Evan: [00:26:48] Well, Jennifer, this was a very educational discussion on a tough topic that carries a lot of responsibility. Thank you so much for coming on today and talking about this with us.

Ayan: [00:26:57] Thank you, Jennifer. It was great to talk to you and learn more about these patterns that exist and are very prevalent in our field as designers.

Jennifer: [00:27:05] Thank you guys so much for this amazing conversation today. I'd love to continue this conversation with anyone who is a designer and expand my professional network. You can reach me at my LinkedIn profile—just search Jennifer Li—and we'll get connected.

 

The Term "Dark" Patterns

Evan: [00:27:36] So I have a bit of an issue. We've been talking a lot about dark patterns through this really interesting discussion. And I think in this day and age, with increasing awareness in the tech industry, we've heard concerns about terminology framed around black and white, dark and light. It bothers me a little bit. I feel a tinge of something and I can't put my finger on it. And I'm not suggesting Harry Brignull had any ill intent in calling these unethical behaviors "dark patterns," but it just bothers me a little bit. What about you?

Ayan: [00:28:07] I'm with you a bit on that one, Evan. It kind of irks me as well, because as I was thinking about it, I asked, "Okay, so what is the opposite of dark patterns? What is the antonym of dark patterns?" It's just patterns. We're not saying, "Oh, we should be putting light patterns or good patterns into this." We're clearly going to the other end of the spectrum, saying "dark," and leaning on the connotation of dark as a negative thing.

For me, there's also the question of how we describe this, not just to designers but also to the public, whom we want to become more conscious of these things happening in their daily interactions with technology. And I find that the word "dark" is too obscure. It doesn't give enough of a description for them to fully understand.

I like what we used, "manipulative patterns," because it's more emotional: "Oh wait, I don't want to be manipulated." I think it makes it easier to understand what those actions are, because "dark" is not precise enough. I think we can go deeper and use a different word that better describes what is happening.

Evan: [00:29:08] We're talking about negative, unethical, detrimental behavior. We could come up with a different term that communicates the same idea. We're not going to change the world or heal it by using a different term for an esoteric design concept. But tiny steps like this, or even having dialogue with your colleagues and peers, can help a little bit. Maybe in time there'll be a dawning awareness of how other people feel about the professional terms we use, and maybe we'll find a better way.

Ayan: [00:29:43] We can make a survey, if you want. As fellow designers...

 

Closing

Evan: [00:29:49] That's going to do it for this episode of the Ascend UX Podcast. We'd love your questions and feedback. Please send them to ascendux@pros.com. Also, rate, follow, subscribe, or leave comments on Apple Podcasts, Spotify, or whichever service you're using to listen to us. We definitely appreciate it.

Ayan: [00:30:08] Thank you! Until next time, guys...