WEBVTT

00:00.000 --> 00:23.460
I'm going to introduce the next speaker. [inaudible]

00:23.460 --> 00:39.680
designing for trust and safety [inaudible]

00:39.680 --> 00:51.840
Take it away. Let's try this one. Yeah, I'm excited to be here, and

00:51.840 --> 00:55.160
I'm going to try another one. So we are going to talk about designing for trust and safety,

00:55.160 --> 01:01.200
but specifically how designing for privacy and security [inaudible], and

01:01.200 --> 01:13.320
[inaudible] communication and research and design. We are a

01:13.320 --> 01:19.260
[inaudible], you know.

01:19.260 --> 01:37.120
We also do a lot of research, as well, into security design, a lot of the stuff we do,

01:37.120 --> 01:45.120
and if you want to collaborate, yeah, we're always open for that. And so what I'm going to talk about,

01:45.120 --> 01:51.360
[inaudible], is this important layer of design, so how a product

01:51.360 --> 01:56.640
is designed deeply matters for the user. [inaudible]

01:56.640 --> 02:02.840
so we think of design and communication as a translation layer. It can take what the policy says, let's

02:02.840 --> 02:07.840
say, the software, the product, the technology, how it's built, and

02:08.080 --> 02:12.840
translate it for the user experience. It's something that users need to understand,

02:12.840 --> 02:17.800
to develop a mental model around how that product functions. But design can also be harmful:

02:17.800 --> 02:23.200
it can suppress content, it can choose other content to elevate. It's

02:23.200 --> 02:29.040
designed, much like technology; it might nudge you, it can mislead you. And that's one of the

02:29.040 --> 02:35.840
things I also want to get into, how design, the harmful design patterns,

02:36.080 --> 02:44.080
[inaudible] design patterns, which are closely related, in fact.

02:45.960 --> 02:57.360
[inaudible]

03:01.060 --> 03:09.320
I'm going to get to trust as a user's confidence, and it's also how much they can believe the platform will protect them.

03:09.320 --> 03:15.320
Safety is very expansive. It can be about reducing exposure, escalation, and the impact of harm.

03:15.320 --> 03:22.320
And so design is where this layer gets articulated. And again, design can articulate the two of these, right?

03:22.320 --> 03:29.320
Trust can be built, or broken, within the design layer. Design can mislead you.

03:29.320 --> 03:47.320
[inaudible] But I can't overstate the importance of design.

03:47.320 --> 03:52.320
So then, I wanted to talk about how that basically happens.

03:52.320 --> 03:55.320
So, harmful design patterns, often called dark patterns,

03:55.320 --> 03:57.320
or deceptive design patterns.

03:57.320 --> 03:59.320
Dark patterns, as we understand them,

03:59.320 --> 04:02.320
potentially mislead users without their knowledge,

04:02.320 --> 04:05.320
or manipulate how consumers make decisions, and they normally look good.

04:05.320 --> 04:07.320
How many of you have tried this?

04:07.320 --> 04:09.320
Let's say you subscribe, we've got our product.

04:09.320 --> 04:11.320
We've got ourselves a subscription-based model.

04:11.320 --> 04:13.320
We've got the idea of it.

04:13.320 --> 04:14.320
Yes.

04:14.320 --> 04:15.320
How do you get rid of it?

04:15.320 --> 04:18.320
[inaudible]

04:18.320 --> 04:21.320
So, how easy is it to find the cancel option at the moment?

04:21.320 --> 04:23.320
Very difficult.

04:23.320 --> 04:25.320
A clear example to the point.

04:25.320 --> 04:29.320
But other examples also include things like endless scroll,

04:29.320 --> 04:34.320
or trying to find a security tool or setting that's not easy to get to.

04:34.320 --> 04:40.320
[inaudible]

04:40.320 --> 04:43.320
it's like someone's trying to contact you, but not yet.

04:43.320 --> 04:51.320
[inaudible]

04:51.320 --> 04:54.320
So, these are all different patterns.

04:55.320 --> 05:00.320
There are, like, hundreds of them, placing the best interests of the company first.

05:00.320 --> 05:02.320
[inaudible]

05:02.320 --> 05:05.320
So, here's the final one. [inaudible]

05:47.320 --> 05:49.320
Well, we started looking at

05:50.320 --> 05:53.320
recommender systems. Under the Digital Services Act,

05:53.320 --> 05:58.320
algorithmic recommender systems, like, say, the algorithmic timeline

05:58.320 --> 05:59.320
that's based on profiling, on Instagram,

05:59.320 --> 06:02.320
have to have a non-profiling option.

06:02.320 --> 06:07.320
Most importantly, you should be able to view a chronological timeline.

06:07.320 --> 06:11.320
But it's hard to find that chronological timeline.

06:11.320 --> 06:14.320
Actually, [inaudible]

06:14.320 --> 06:17.320
our complaint was outlined in here.

06:17.320 --> 06:19.320
And so, it's algorithmic by default.

06:21.320 --> 06:25.320
It's very hard to find the option to go over to the chronological one.

06:25.320 --> 06:27.320
And so, we found, and notice that

06:27.320 --> 06:29.320
I highlighted a couple of articles within the DSA,

06:29.320 --> 06:31.320
because they also outline a whole bunch of patterns.

06:31.320 --> 06:36.320
So, the burying of the chronological timeline is a dark pattern.

06:36.320 --> 06:42.320
And then the ability to actually set it as the default, [inaudible].

06:42.320 --> 06:45.320
Also, by the way, [inaudible].

06:46.320 --> 06:50.320
Burying it is a dark pattern. It hasn't been the default for some period.

06:50.320 --> 06:54.320
Right, it's the chronological feed. It's the chronological timeline.

06:54.320 --> 06:55.320
It's not the default.

06:55.320 --> 06:58.320
It's a part of our research.

06:58.320 --> 07:02.320
These are screenshots showing you

07:02.320 --> 07:04.320
how to get to the chronological timeline.

07:04.320 --> 07:07.320
So, that's the case with that.

07:07.320 --> 07:09.320
So, I just hope that this is visible.

07:09.320 --> 07:13.320
It's not easy to follow. It's not easy to find.

07:14.320 --> 07:18.320
Like, you see all the multiple steps. Right?

07:18.320 --> 07:23.320
These steps, this burying, [inaudible], if you know where to find it.

07:23.320 --> 07:27.320
Buried under this very setting, you know that you can call this a dark pattern.

07:27.320 --> 07:30.320
And again, by the way, this is in the DSA.

07:30.320 --> 07:32.320
Dark patterns can be treated

07:32.320 --> 07:34.320
as mere UX or usability issues.

07:34.320 --> 07:37.320
But you want to think of them as safety issues.

07:37.320 --> 07:45.320
Like, having a confusing privacy flow, or, you know, tools that exist but are out of reach in a space.

07:45.320 --> 07:51.320
[inaudible] Right?

07:51.320 --> 07:56.320
These are not just usability glitches, but safety ones as well.

07:56.320 --> 07:57.320
So,

07:57.320 --> 08:03.320
this is where, like, if these are difficult to get to, it is the design, which is part of the design flow.

08:03.320 --> 08:08.320
Right? [inaudible]

08:09.320 --> 08:10.320
So,

08:10.320 --> 08:16.320
I want to show that as an example, just to sort of say, design can be sort of weaponized.

08:16.320 --> 08:19.320
That's actually it. And in the research we do,

08:19.320 --> 08:23.320
[inaudible]

08:23.320 --> 08:45.320
[inaudible]

08:45.320 --> 09:02.320
[inaudible]

09:02.320 --> 09:09.320
[inaudible] and that behavior is really

09:09.320 --> 09:16.320
the ability to find those tool settings, but also the ability to use them, too.

09:16.320 --> 09:23.320
And the trusted takeaway we've been seeing: when you see privacy and security designed well, safety improves.

09:23.320 --> 09:25.320
And it matters for folks who are targeted,

09:25.320 --> 09:29.320
if you have much more granular settings for fine tuning than if you don't.

09:29.320 --> 09:32.320
You can make it easier for folks to report

09:32.320 --> 09:37.320
and escalate abuse to content moderation systems and trust and safety teams.

09:37.320 --> 09:42.320
With more transparent systems, users can see and understand what the platform is doing.

09:42.320 --> 09:47.320
All of this really relates to the area of safety for the consumer.

09:47.320 --> 09:53.320
So we're able to see privacy and security come together to create an area where we can improve trust and safety.

09:53.320 --> 09:56.320
So this relates to the work that we're currently doing.

09:57.320 --> 10:01.320
[inaudible]

10:01.320 --> 10:06.320
We can help each other to build out the definition of what we mean by safety by design.

10:06.320 --> 10:09.320
You can approach it with a methodology. That's interesting.

10:09.320 --> 10:12.320
We're going to do some of those things.

10:12.320 --> 10:18.320
[inaudible] security by design, doing security in the beginning,

10:18.320 --> 10:21.320
and also thinking about it all throughout the product.

10:21.320 --> 10:23.320
I'm going to send a link to our slides.

10:23.320 --> 10:30.320
[inaudible] Thank you.

10:30.320 --> 10:37.320
[inaudible] User-centric safety, from all the way to the beginning.

10:37.320 --> 10:43.320
[inaudible]

10:47.320 --> 10:57.320
Yeah. So, some of the things we've highlighted. [inaudible]

10:57.320 --> 11:49.320
[inaudible]

11:49.320 --> 11:52.380
[inaudible]

11:54.560 --> 11:58.560
Here are some of the things that we researched yesterday.

11:58.560 --> 12:01.600
[inaudible] kind of like Facebook, for example, where you can see what content

12:04.600 --> 12:08.720
[inaudible], there's a lot of people who want some

12:08.720 --> 12:13.880
moderation, in my opinion, but then across all different types of social products, how

12:13.880 --> 12:18.920
does that work, for example.

12:18.920 --> 12:22.800
So part of what we started to pull out

12:22.800 --> 12:24.680
was a project, [inaudible].

12:24.680 --> 12:29.160
It's a group of elements, [inaudible], right?

12:29.160 --> 12:36.560
Papers, articles, and methodologies that had existed, that have been published over the past 20 years.

12:36.560 --> 12:39.520
So one of the useful things, and how it will be shown in a project,

12:39.520 --> 12:47.200
[inaudible] we were looking at, asking [inaudible],

12:47.200 --> 12:52.400
[inaudible] about 30 different countries.

12:52.400 --> 12:58.000
One thing that came out with this idea of social products is that those reporting flows have been linear.

12:58.000 --> 13:00.920
So, right now, you file your report,

13:00.920 --> 13:03.000
you wait for a response.

13:03.000 --> 13:05.960
Why couldn't it be more of a circular system?

13:05.960 --> 13:14.640
You draft a report, [inaudible], and then another part of the process does something at the end.

13:14.640 --> 13:19.040
Most of the time, you don't know if someone

13:19.040 --> 13:22.800
is working to progress the report, and you don't know if anything will be done about it.

13:22.800 --> 13:25.880
[inaudible]

13:35.240 --> 13:42.960
And thinking of things like consent, more consent in the process of reporting,

13:42.960 --> 13:48.880
so, like, ensuring that [inaudible].

13:48.880 --> 13:51.760
We'll talk about it in a minute.

13:51.760 --> 13:53.320
Oh, is it not working?

13:53.320 --> 13:56.800
I don't know, I don't know, I don't know.

13:56.800 --> 13:58.400
I don't know.

13:58.400 --> 13:59.120
I know, you don't know.

13:59.120 --> 14:02.560
And that, oh, it hasn't been recording this whole time.

14:05.760 --> 14:08.160
I guess you can't have a technology conference

14:08.160 --> 14:11.760
without a technology issue.

14:11.760 --> 14:15.320
Dang, that's okay, it's fine.

14:15.320 --> 14:18.080
Anyway, I'll just keep going.

14:18.080 --> 14:22.000
Anywho, you all look a little low energy

14:22.000 --> 14:24.200
because it's such an uplifting topic

14:24.200 --> 14:28.640
that I'm talking about, or is it because it's hot in here?

14:28.640 --> 14:30.600
Maybe it's both.

14:30.600 --> 14:34.040
Anywho, I guess, jumping back into this.

14:34.040 --> 14:35.640
For designers and for companies,

14:35.640 --> 14:37.240
one of the also big recommendations

14:37.240 --> 14:39.760
is can you take trauma-informed design principles?

14:39.760 --> 14:40.880
If you're wondering what those are,

14:40.880 --> 14:44.400
we are going to discuss them on the next slide,

14:44.400 --> 14:47.160
conducting threat modeling during development phases.

14:47.160 --> 14:48.600
This is something from our research.

14:48.600 --> 14:50.720
We also saw that a lot of trust and safety teams

14:50.720 --> 14:52.680
either didn't know what that was,

14:52.680 --> 14:54.160
or were not actively conducting it.

14:54.160 --> 14:57.480
They were sort of designing for a handful of personas

14:57.480 --> 15:01.280
versus sort of taking a much more expansive view.

15:01.280 --> 15:03.440
And a big one, prioritizing user safety

15:03.440 --> 15:05.560
over engagement metrics.

15:05.560 --> 15:08.160
So again, really thinking about that some folks might be

15:08.160 --> 15:10.280
posting content that they don't want a lot of people

15:10.280 --> 15:12.400
to look at, and that's OK, and that's their choice,

15:12.400 --> 15:14.840
and that's also a safety choice.

15:14.840 --> 15:17.280
And including marginalized communities

15:17.280 --> 15:21.880
in the design process, one of the other elements

15:21.880 --> 15:25.160
of this research is Chayn's trauma-informed design principles.

15:25.160 --> 15:28.520
We're a big fan of them; they're a UK-based NGO.

15:28.520 --> 15:32.440
And so their trauma-informed design principles

15:32.440 --> 15:35.040
are foundational to everything they do in build.

15:35.040 --> 15:37.040
They also build software.

15:37.040 --> 15:41.800
And so for them,

15:41.800 --> 15:43.320
they're building around the experiences

15:43.320 --> 15:45.320
of survivors of gender-based violence.

15:45.320 --> 15:48.040
That also includes folks facing intimate partner

15:48.040 --> 15:50.720
violence, for example.

15:50.720 --> 15:54.600
For them, and this is all read from their principles.

15:54.600 --> 15:56.040
So this is what they've written.

15:56.040 --> 15:58.160
This is not my writing.

15:58.160 --> 15:59.880
So they write quote.

15:59.880 --> 16:01.880
Survivors are not just users.

16:01.880 --> 16:03.720
They are people navigating trauma.

16:03.720 --> 16:05.480
They're people in survival mode.

16:05.520 --> 16:07.800
So keep us in mind when you're designing,

16:07.800 --> 16:09.800
when you're designing that service.

16:09.800 --> 16:12.440
If you're not designing with survivors in mind,

16:12.440 --> 16:14.360
you could be triggering trauma impacts

16:14.360 --> 16:16.320
in all sorts of areas.

16:16.320 --> 16:17.840
So what does this mean in practice?

16:17.840 --> 16:20.600
So safety, quick exit buttons,

16:20.600 --> 16:23.040
avoid overloading survivors,

16:23.040 --> 16:26.440
simple designs to ensure emotional safety

16:26.440 --> 16:27.880
and lessen fatigue.

16:27.880 --> 16:31.640
Fewer clicks, meaning like do not bury info.

16:31.640 --> 16:33.600
Privacy is non-negotiable.

16:33.600 --> 16:37.480
So data minimization, the ability to delete

16:37.480 --> 16:39.200
an account and data at any time,

16:39.200 --> 16:40.920
and that has to be prioritized

16:40.920 --> 16:43.120
from the outside of building, right?

16:43.120 --> 16:45.920
So again, easy deletion, accountability.

16:45.920 --> 16:48.800
So explaining our decisions, transparency,

16:48.800 --> 16:51.840
open and publicly, trust is built this way.

16:51.840 --> 16:55.640
Constantly measuring the impact when you release something,

16:55.640 --> 16:57.240
so that you know whether it's working

16:57.240 --> 16:58.720
and how, and having mechanisms for that.

16:58.720 --> 17:01.080
And I think within that, they also go further

17:01.080 --> 17:03.440
and describe impact for, again,

17:03.440 --> 17:06.200
very specific impacted groups, not just every user,

17:06.200 --> 17:08.440
but working also with the most vulnerable

17:08.440 --> 17:10.920
or the most marginalized.

17:10.920 --> 17:13.680
And I also want to sort of share some work

17:13.680 --> 17:16.240
from this other group called PEN America.

17:16.240 --> 17:17.880
Again, they work a lot with journalists.

17:17.880 --> 17:19.920
And so there's three reports we looked at.

17:19.920 --> 17:22.160
One was called No Excuse for Abuse.

19:22.160 --> 19:24.320
And within No Excuse for Abuse,

17:24.320 --> 17:27.120
there's some of the specific things they found

17:27.120 --> 17:31.680
where, so, they're trying to empower users rather

17:31.680 --> 17:34.200
than just punish abusers.

17:34.200 --> 17:35.960
Some of the specific design suggestions

17:35.960 --> 17:39.040
they found from interviewing human rights experts,

17:39.040 --> 17:40.480
designers, technologists, and journalists,

17:40.480 --> 17:43.880
are things like a shield and dashboard system.

17:43.880 --> 17:46.440
So a way to quarantine a piece of content

17:46.440 --> 17:48.560
for safety review, safety mode,

17:48.560 --> 17:50.800
so like a one-click thing where you can change

17:50.800 --> 17:52.560
all of your settings quickly if you're facing,

17:52.560 --> 17:54.800
let's say, like, dogpiling, a lot of people

17:54.800 --> 17:57.720
are jumping on your account.

17:57.720 --> 17:59.880
Rapid response teams have trusted allies

17:59.880 --> 18:02.480
to help that person facing harassment or harm,

18:02.480 --> 18:05.400
manage that abuse, and automatic documentation tools

18:05.400 --> 18:06.840
for legal evidence.

18:06.840 --> 18:09.560
So if someone has sent you like a series of threats,

18:09.560 --> 18:11.360
being able to quickly document that

18:11.360 --> 18:15.160
from the platform itself, transparency, penalty systems

18:15.160 --> 18:17.120
with escalating consequences.

18:17.120 --> 18:18.400
One of the things that their research

18:18.400 --> 18:21.200
and our research has found is that on the other side,

18:21.200 --> 18:24.760
if you are a person engaging in abuse,

18:24.760 --> 18:27.120
but more generally, no user actually knows

18:27.120 --> 18:30.160
like where they sit, how many strikes are against them

18:30.160 --> 18:33.200
on a platform, when there are strikes against you.

18:33.200 --> 18:35.760
And so you can be, like, locked out of your account,

18:35.760 --> 18:38.920
let's say, for a day or a week,

18:38.920 --> 18:41.920
and you might not know why, and that could be from like,

18:41.920 --> 18:43.800
historical strikes that have been against you.

18:43.800 --> 18:46.920
So this does, this is something that's a big issue

18:46.920 --> 18:50.440
on all sides of people engaging online,

18:50.440 --> 18:52.640
but yeah, this transparent penalty system.

18:52.640 --> 18:54.400
And again, trauma-informed reporting

18:54.400 --> 18:56.320
for coordinated attacks,

18:56.320 --> 18:58.960
made easy, easy to report.

19:01.200 --> 19:02.200
But yeah.

19:02.200 --> 19:06.840
And then one of the other things we also looked at

19:06.840 --> 19:11.840
was this tracker by the National Democratic Institute.

19:12.560 --> 19:16.000
So this is a tracker, which is a structured data set

19:16.000 --> 19:18.480
of interventions, studies, and recommendations related

19:18.480 --> 19:19.840
to online gender based violence

19:19.840 --> 19:23.880
of abuse, disinformation, and other forms of online harm.

19:23.880 --> 19:26.800
When we were developing the safety by design framework,

19:26.800 --> 19:29.160
which we're launching this March,

19:29.160 --> 19:32.520
the NDI landscape tracker really helped us identify

19:32.520 --> 19:34.160
reoccurring themes and interventions

19:34.160 --> 19:36.520
by systematically reviewing the interventions

19:36.520 --> 19:37.240
in the tracker.

19:37.240 --> 19:39.840
So one of the things we're publishing is a large data set,

19:39.840 --> 19:42.440
which you will also be able to access, which is,

19:42.440 --> 19:45.240
pretty much every provocation and proposal

19:45.240 --> 19:47.440
of a safety intervention and policy

19:47.440 --> 19:51.560
we could find published through papers online.

19:51.560 --> 19:53.120
So again, these are the things we're talking about

19:53.120 --> 19:55.800
for folks, like what folks have historically requested.

19:55.800 --> 19:58.440
And we're trying to tag as best we can,

19:58.440 --> 20:00.360
what's been implemented.

20:00.360 --> 20:02.120
Is it perfect?

20:02.120 --> 20:06.400
No. Is what we're doing now incomplete? Probably.

20:06.400 --> 20:08.120
But it's built off some of this hard work

20:08.120 --> 20:10.640
that NDI has done in the past.

20:12.280 --> 20:13.520
One of the things we're able to see

20:13.520 --> 20:16.360
is what we're calling common categories of action.

20:16.360 --> 20:19.800
And those are examples of platform transparency,

20:19.800 --> 20:21.000
reporting mechanisms.

20:21.000 --> 20:22.560
So the data set we're putting together

20:22.560 --> 20:25.560
and that NDI is put together goes into much more depth

20:25.560 --> 20:28.800
around an example of a reporting mechanism.

20:29.920 --> 20:32.640
Legislative and policy recommendations,

20:32.640 --> 20:36.000
training and capacity building examples,

20:36.000 --> 20:38.000
research and visibility of harm patterns,

20:38.000 --> 20:42.040
including hyperspecific patterns of harm

20:42.040 --> 20:45.200
and harassment, and localized or context-sensitive

20:45.200 --> 20:46.560
protective measures.

20:47.840 --> 20:50.720
We in our data set,

20:50.800 --> 20:53.440
view these reoccurring themes that can be translated

20:53.440 --> 20:55.760
into design and interventions to reference

20:55.760 --> 20:59.480
and integrate into findings and recommendations.

20:59.480 --> 21:03.800
So this tracker in particular aggregates actual documented

21:03.800 --> 21:05.400
interventions and research findings

21:05.400 --> 21:08.080
from multiple organizations and context.

21:08.080 --> 21:09.640
This is an incredibly large spreadsheet

21:09.640 --> 21:10.960
that we're also linking to.

21:10.960 --> 21:13.040
So if you'd like access to this,

21:13.040 --> 21:15.160
let me know. If it is publicly accessible,

21:15.160 --> 21:17.080
I can share it out later.

21:17.080 --> 21:22.080
Okay, best practices, I'm almost done.

21:22.440 --> 21:25.600
So in this last section,

21:25.600 --> 21:29.040
I want to shift our focus to also what can actually be done.

21:29.040 --> 21:30.600
Deriving from what we've discussed so far,

21:30.600 --> 21:33.600
from harmful design patterns to safety by design.

21:33.600 --> 21:36.280
And the frameworks and examples all point,

21:36.280 --> 21:40.520
we think, to this simple and, in heavy quotes, conclusion.

21:40.520 --> 21:43.560
So safety cannot only be tackled by one team

21:43.560 --> 21:46.680
or one discipline; it requires multi-stakeholder

21:46.680 --> 21:49.360
responsibility and action across a product.

21:49.360 --> 21:52.600
So it can't just be isolated to one trust and safety team,

21:52.600 --> 21:54.600
multiple teams or the entire product

21:54.600 --> 21:57.600
have to also be dedicated to centering safety.

21:58.800 --> 22:00.520
And I think this is true if you're trying

22:00.520 --> 22:02.440
to build a security culture as well.

22:02.440 --> 22:05.840
All the teams have to be dedicated to this.

22:08.520 --> 22:09.840
There you go.

22:09.840 --> 22:14.840
So what does this sort of multi-stakeholder responsibility mean?

22:14.840 --> 22:17.600
Design must be usable, accessible, and legible.

22:17.600 --> 22:18.920
It has to be easy to find.

22:18.920 --> 22:22.320
Again, building on some of the points that Elio and Ania

22:22.320 --> 22:24.080
outlined in their talk.

22:24.080 --> 22:26.360
Design and technology have to be trauma informed.

22:26.360 --> 22:28.280
That does mean thinking about how much information

22:28.280 --> 22:29.800
you're giving people. Is it something

22:29.800 --> 22:31.760
that, in a moment of panic,

22:31.760 --> 22:33.480
people can understand?

22:33.480 --> 22:34.320
Can they access?

22:34.320 --> 22:35.320
Can they sift through?

22:35.320 --> 22:36.960
Is it something that will be triggering to them?

22:36.960 --> 22:38.520
Is it something that, again,

22:38.520 --> 22:42.480
they have the cognitive bandwidth to engage with?

22:42.480 --> 22:45.840
If it's multiple lines of small text, the answer is no.

22:45.840 --> 22:48.800
That's very hard to read for people in moments

22:48.800 --> 22:51.320
of high panic or high stress, right?

22:53.520 --> 22:55.720
A big thing, which I'm sure all of you do in this room,

22:55.720 --> 22:57.960
but big tech does not.

22:57.960 --> 23:00.080
Centering opt-out over opt-in.

23:00.080 --> 23:04.320
That means making sure all the privacy settings are turned on.

23:04.320 --> 23:06.640
Versus having to go and find them

23:06.640 --> 23:10.800
and make your account more and more private, right?

23:10.800 --> 23:15.000
This ensures security and safety are already

23:15.000 --> 23:17.040
set for the safest user experience

23:17.040 --> 23:19.120
and folks can go toggle as they feel comfortable

23:19.120 --> 23:22.040
for how public they would like to be or how exposed,

23:22.040 --> 23:24.760
they'd like their data to be.

23:24.760 --> 23:28.560
Design should take into account the emotional state of the user.

23:28.560 --> 23:31.240
So a big one, as we've been saying earlier,

23:31.240 --> 23:34.240
design must repeatedly center the most marginalized

23:34.240 --> 23:37.360
or as some folks say the decentered user.

23:37.360 --> 23:39.120
Design should actively threat model

23:39.120 --> 23:43.000
and understand and prepare for all these nuanced types of harm.

23:43.000 --> 23:47.840
Design is not neutral, so every default, every flow,

23:47.840 --> 23:50.320
every hidden setting encodes assumptions

23:50.320 --> 23:52.880
about power, risk, and responsibility.

23:52.880 --> 23:55.480
These are things we have to actively tackle

23:55.480 --> 23:59.720
in a design practice and also within product design.

23:59.720 --> 24:03.840
So I just want to end on this quote that we love.

24:03.840 --> 24:06.240
We use it a lot. This is from Afsaneh Rigot,

24:06.240 --> 24:09.480
the creator of the decentered user concept, from this paper

24:09.480 --> 24:12.120
she wrote called Design From the Margins.

24:12.120 --> 24:15.400
So she says, we expect architects to design buildings

24:15.400 --> 24:19.040
and bridges to withstand gale-force winds and heavy loads.

24:19.040 --> 24:22.560
So too should we expect companies whose products are the vehicle

24:22.560 --> 24:25.640
for free expression and access to information

24:25.640 --> 24:27.920
to design sensitive, resilient technologies

24:27.920 --> 24:30.520
for the benefit of all, whether they face government

24:30.520 --> 24:33.760
pressure and social stigma or not.

24:34.400 --> 24:35.560
Thank you.

24:35.560 --> 24:47.040
If you have any questions or want access to any of the materials

24:47.040 --> 24:48.440
I've shared today, please email me.

24:48.440 --> 24:51.880
I would happily share them out with all of you.

24:51.880 --> 24:53.840
So thank you.

24:53.840 --> 24:55.320
Let's keep that up for a second.

24:55.320 --> 24:57.480
I think we have two more minutes, but...

24:57.480 --> 24:59.000
Oh, shit, I'm sorry.

24:59.000 --> 25:04.000
But I think that Afsaneh Rigot quote is a simple one,

25:04.000 --> 25:07.800
one of the best ones, yeah.

25:07.800 --> 25:09.560
Yeah, I can hear you.

25:09.560 --> 25:10.560
Yeah.

25:10.560 --> 25:28.560
So that sort of relates to like, when you're at the same boat.

25:29.560 --> 25:32.280
Sorry, that's my friend asking a question.

25:32.280 --> 25:35.680
My friend was asking, how can design take into account the emotional

25:35.680 --> 25:37.440
state of the user? How would you know?

25:37.440 --> 25:39.760
So when you're thinking about some of these broader principles

25:39.760 --> 25:44.160
you're talking about like legibility or accessibility, right?

25:44.160 --> 25:45.960
I think those go hand in hand with this.

25:45.960 --> 25:49.120
Like is the text small, or is it easy to read?

25:49.120 --> 25:54.120
Is it using like jargon, or is it using plain language?

25:54.120 --> 25:58.360
Is it, is it a lot of text someone has to sift through?

25:58.360 --> 26:02.560
So making something that has a larger font, is using more plain language,

26:02.560 --> 26:05.400
is easier to find, easier to surface.

26:05.400 --> 26:06.760
Like those are already things that

26:06.760 --> 26:09.440
take into account the emotional state of a user.

26:09.440 --> 26:11.360
Generally, if you're designing, let's say,

26:11.360 --> 26:14.720
for accessibility and you're designing for a wide audience,

26:14.720 --> 26:17.640
you're probably already centering

26:17.640 --> 26:20.320
the emotional state: you're making it easier

26:20.320 --> 26:22.640
to find and surface that information

26:22.640 --> 26:26.720
and you're really taking into account how people

26:26.720 --> 26:30.280
are understanding or could misunderstand what it is you're outlining.

26:30.280 --> 26:34.640
But a big thing is making sure you're not giving them a lot of small text.

26:34.640 --> 26:38.600
Even when they get to a page, let's say of resources, right?

26:38.600 --> 26:43.480
So even if they can find your resource, making sure that it's not like a wall

26:43.480 --> 26:46.240
of multiple paragraphs of stuff they need to do.

26:46.240 --> 26:49.320
So if you are building, let's say, like an anti-doxing tool

26:49.320 --> 26:51.760
and you're trying to help them remove information,

26:51.760 --> 26:54.640
it may be telling them, hey, it's going to take this many minutes

26:54.640 --> 26:58.360
to go through this and you're really trying to space out some of the information.

26:58.360 --> 27:05.640
You're also making sure it's this balance, right?

27:05.640 --> 27:07.120
Ideally, hopefully, you're hiring a copywriter.

27:07.120 --> 27:11.640
But you're really trying to make sure the information is bite-sized

27:11.640 --> 27:13.200
as well as relevant.

27:13.200 --> 27:15.800
So it's like kind of cutting out extraneous information.

27:15.800 --> 27:30.800
Oh wait, sorry.

27:30.800 --> 27:31.800
Oh yeah.

27:31.800 --> 27:32.800
Oh, we did get certain.

27:32.800 --> 27:33.800
It's okay.

27:33.800 --> 27:34.800
Yeah.

27:34.800 --> 27:44.800
So sometimes not though.

27:44.800 --> 27:45.800
Yeah.

27:45.800 --> 27:48.800
How do you see design?

27:48.800 --> 27:50.800
Safety by design.

27:50.800 --> 27:51.800
Yeah.

27:51.800 --> 27:52.800
That's a great question.

27:52.800 --> 27:56.160
So I mean, one of the reasons I will never say, oh, yeah.

27:56.160 --> 27:59.800
So this person has a question of, you know, whether harmful design patterns

27:59.800 --> 28:03.800
are intentional by platforms or companies.

28:03.800 --> 28:05.800
And how do we see safety by design being adopted?

28:05.800 --> 28:09.800
I will never say in a recorded session that they're intentional

28:10.800 --> 28:14.800
because I have to work on cases where it's impossible to prove intentionality.

28:14.800 --> 28:19.800
As an FYI, but we can have a solid conversation about that.

28:19.800 --> 28:23.800
But one of the things we believe at Convocation is that when you design

28:23.800 --> 28:28.800
for a world that incentivizes safety, security, and privacy,

28:28.800 --> 28:33.800
you would actually mitigate a lot of dark patterns or harmful design patterns

28:33.800 --> 28:37.800
because you're creating a world that centers the user, their agency

28:37.800 --> 28:38.800
and effectively their choice.

28:38.800 --> 28:43.800
So you're making it easier for them to really go through and make decisions.

28:43.800 --> 28:48.800
If you are in a world where you're trying to center privacy and user agency,

28:48.800 --> 28:52.800
you are thinking about how to, let's say, ethically share that content

28:52.800 --> 28:55.800
and not nudge a user towards something more harmful.

28:55.800 --> 29:00.800
Generally harmful design patterns are prioritizing a company and its interests

29:00.800 --> 29:03.800
over consumers and generally their safety.

29:03.800 --> 29:07.800
I like making our data as public as possible.

29:07.800 --> 29:10.800
So I do think that these go hand in hand.

29:10.800 --> 29:14.800
It does mean a lot of things would have to be, like, radically redesigned.

29:14.800 --> 29:17.800
So if someone wants to unsubscribe, that'd be a very easy flow.

29:17.800 --> 29:20.800
There's probably a lot of reasons they want to unsubscribe.

29:20.800 --> 29:24.800
But it might be like a longer ripple effect of safety by design.

29:24.800 --> 29:28.800
We don't mention that explicitly in our paper, the unsubscribe example.

29:28.800 --> 29:33.800
But we do mention how easy some of these safety features have to be to find.

29:33.800 --> 29:37.800
And so that would be like pro user or user-centered settings,

29:37.800 --> 29:39.800
as perhaps a way to think about it.

29:39.800 --> 29:42.800
So they do go, I think, hand in hand, or at least they do,

29:42.800 --> 29:47.800
with how we think about approaching design and technology policy.

29:47.800 --> 29:49.800
Do we have time for one more question?

29:49.800 --> 29:51.800
Do you have one more?

29:51.800 --> 29:52.800
Oh, yeah.

29:52.800 --> 29:57.800
So a lot from making the companies the course that we've been doing behind us

29:57.800 --> 30:00.800
and the designers, you're not to buy organizations.

30:00.800 --> 30:03.800
Most of the things that you would be up to,

30:03.800 --> 30:05.800
you wouldn't have an opportunity to decide

30:05.800 --> 30:07.800
if it works some reason or not for another.

30:07.800 --> 30:08.800
Yeah, thank you for that.

30:08.800 --> 30:12.800
So the question was, I guess, outside of trying to force companies with law

30:12.800 --> 30:14.800
or legislation to implement some of these principles

30:14.800 --> 30:18.800
are there other ways to encourage them or have them implement them?

30:18.800 --> 30:21.800
I mean, I would say, and this might sound naive,

30:21.800 --> 30:25.800
but I, again, have worked at Wikimedia on the Trust and Safety team

30:25.800 --> 30:29.800
and that I think is a wonderful project where we center

30:29.800 --> 30:31.800
or try to center as best we could,

30:31.800 --> 30:34.800
the requests and wishes of users.

30:34.800 --> 30:38.800
I do think when you treat users as real people

30:38.800 --> 30:42.800
who make your product what it is and you're trying to center

30:42.800 --> 30:46.800
and you're working with them and trying to center what they want

30:46.800 --> 30:49.800
and how they want to sort of exist and navigate in a project

30:49.800 --> 30:52.800
including their safety, you are taking them seriously

30:52.800 --> 30:54.800
and what you do is you end up building trust with them.

30:54.800 --> 30:58.800
And I think trust is an impossible thing to purchase.

30:58.800 --> 31:02.800
It's a very difficult thing to build and it's a very easy thing to lose,

31:02.800 --> 31:06.800
but it is what keeps consumers and users returning.

31:06.800 --> 31:09.800
It's what builds a community into a real community.

31:09.800 --> 31:11.800
So I know I prefaced this as naive,

31:11.800 --> 31:15.800
I think that that's a reason to design and to build that into product design

31:15.800 --> 31:19.800
because I think that that's an important thing for user retention.

31:20.800 --> 31:25.800
But I don't work at a big tech company so I'm sure they have responses

31:25.800 --> 31:27.800
as to how that's naive.

31:27.800 --> 31:29.800
But I don't think that it is.

31:29.800 --> 31:30.800
Yeah.

31:30.800 --> 31:31.800
You can tell.

