Katie is Chief of Staff at Spectrum Labs. Spectrum Labs is a technology company, platform, and community whose mission is to make the Internet a safer and more valuable place for all. She works closely with the engineering team to build products that enable customers to reach their trust & safety goals.
00:00:01:28 - 00:00:25:00
Narrator
At a crossroads of uncertainty and opportunity, how do you navigate forward? This podcast focuses on making smart choices in a rapidly changing world. We investigate the challenges of being at a crossroads and finding the opportunities that arise out of disruption. Listen in on future-forward conversations with the brightest luminaries, movers and shakers. Let's navigate forward together and create what's next.
00:00:25:02 - 00:00:52:08
Lisa Thee
Hello, everybody, and welcome to the Navigating Forward podcast. My name is Lisa Thee and I'll be your host today. Co-hosting with me is Dylan Kinsella. Dylan is also one of the brethren of Launch Consulting, and we're so excited for him to bring his perspective to the conversation. One of my favorite hobbies in life is collecting experts, people that know all the things about all the things I don't know much about, and one of my favorite people in this space is Katie Zigelman.
00:00:52:10 - 00:01:09:23
Lisa Thee
Katie is the chief of staff at Spectrum Labs, where she partners with online platforms to help them build safe, inclusive and engaging communities. She really brings the positivity about cleaning up the Internet to bear. And we're thrilled to have you here today, Katie. Thank you for joining us.
00:01:09:26 - 00:01:13:04
Katie Zigelman
Awesome. Thank you so much for having me.
00:01:13:06 - 00:01:14:24
Lisa Thee
Dylan, do you want to give your... Yeah.
00:01:14:26 - 00:01:15:16
Katie Zigelman
Yeah.
00:01:15:18 - 00:01:36:25
Dylan Kinsella
Hey, first, yes, my first thought was: fantastic. Thank you for having me. Lisa, you've done such a great job with this podcast that I'm excited to be a part of this one, especially with Katie here. I've looked at Spectrum, I'm excited to see what they're doing, and it's great to join a podcast or any initiative when somebody is so professional and doing something so good for so many people.
00:01:36:25 - 00:01:43:07
Dylan Kinsella
So I'm excited to be a fly on the wall and hopefully help drive some of this conversation forward. But just fantastically thrilled to be here.
00:01:43:07 - 00:01:58:05
Lisa Thee
And yeah, we're stoked you're here too. My favorite podcast out there right now is Armchair Expert with Dax Shepard, where Dax and Monica come on together and work together and she fact-checks him. So I would like to encourage you to fact-check me because I've...
00:01:58:05 - 00:02:03:17
Dylan Kinsella
I've got a tab open on my other monitor, and I will be fact-checking. Okay.
00:02:03:20 - 00:02:14:14
Lisa Thee
Fantastic. So, Katie, can you tell us a little bit about your background and where you're from and if you think that inspires any of the work you're doing today?
00:02:14:17 - 00:02:38:16
Katie Zigelman
Sure. So I was actually born and raised in California, in San Diego, but then I bopped all over the world. I ended up going to school in Boston and then I lived in New York for a while. And New York was always the dream for me. And it may seem silly, but when I was growing up and thinking about what I wanted to be when I grew up, I loved Lois Lane from Superman.
00:02:38:16 - 00:03:07:19
Katie Zigelman
I loved what she did. I loved what she stood for. I loved that she got to wear a pencil skirt to work. Very silly things, but I was convinced that I wanted to grow up, move to New York, wear a pencil skirt and do something in the journalism field. I was just so in love with newspapers, and a huge piece of that is that I think access to information is so incredibly important to democracy, especially having accurate information available to people.
00:03:07:21 - 00:03:31:12
Katie Zigelman
And so that was a lot of where my career started. I actually worked first for The Boston Globe and then for The New York Times and then continued in tech, working actually more on the vendor side of things, which is ultimately what brought me to Spectrum. But there's been a lot of bopping around. But that concept of technology and information being accessible for all has actually been a fairly common theme.
00:03:31:12 - 00:03:53:14
Lisa Thee
Yeah, you were really ahead of your time, I must admit. Back in 2017, when I started listening to a lot of folks in the trust and safety domain talk about misinformation campaigns and how challenging those were going to be as we move forward in our technology journey as a society, I don't think I really gave it the gravity that it deserved.
00:03:53:16 - 00:04:09:17
Lisa Thee
So you were definitely seeing far down the field much earlier than I was on why this is so important and how it's going to be an issue that we have to contend with over time. How about you, Dylan? Are you spreading any misinformation campaigns right now?
00:04:09:19 - 00:04:28:23
Dylan Kinsella
I hope not. But just given the vast number of misinformation campaigns and the amount of information on the Internet, I'm sure I've spread something false once or twice unknowingly. So I apologize to my diehard followers that I may have led astray. But I try my best to validate, fact-check and look for resources.
00:04:28:23 - 00:04:40:13
Lisa Thee
And I definitely have gotten a nudge from a friend or two every once in a while, like, hey, did you check the source on that thing you just sent? I think we all are doing our best, but it is a bit of a challenge at times.
00:04:40:15 - 00:04:54:15
Katie Zigelman
Absolutely. And actually, we recently heard, you know, about the big difference between misinformation and disinformation. Misinformation happens very unknowingly; it happens to all of us accidentally. But disinformation is really where things get bad.
00:04:54:22 - 00:04:56:29
Dylan Kinsella
And is that what the difference is, the intent?
00:04:57:01 - 00:04:59:04
Katie Zigelman
Yeah, that's the intent behind it.
00:04:59:06 - 00:05:02:19
Lisa Thee
So there's the coercion element to it.
00:05:02:19 - 00:05:03:27
Katie Zigelman
Exactly.
00:05:04:04 - 00:05:19:26
Lisa Thee
Yeah. So that makes a lot of sense. So Katie, for those people that are interested in learning more about your area of expertise and what Spectrum Labs does in trust and safety, can you give us an overview of what you do, maybe in a way that your grandmother might understand?
00:05:19:28 - 00:05:51:12
Katie Zigelman
Absolutely. So at Spectrum Labs, we work with online platforms, especially in the gaming space, dating, social media and marketplaces, and we work with them to help them recognize and remove toxic behavior within their user generated content. So that could be chat messages, comments, posts, usernames, voice chats, livestreams, you name it. And really, at the end of the day, we're their partner where they create their own community guidelines.
00:05:51:12 - 00:06:13:04
Katie Zigelman
They say what is and is not going to be allowed within their platform. And then we show up and we help them find where people might not be following those guidelines correctly, so that they can then take that information and make sure that they're keeping their community very safe, very engaged and very inclusive. So really, we help them grow their user base.
00:06:13:04 - 00:06:20:22
Katie Zigelman
We help them keep those users around and help those users be positive members of the overall community.
00:06:20:25 - 00:06:28:02
Lisa Thee
Very cool. How do you guys use data and technology to accomplish that ambitious goal of keeping communities healthy online?
00:06:28:04 - 00:06:54:23
Katie Zigelman
Yeah. So how we actually do the identification of these different behaviors is through what we call contextual AI. So we're actually using data to train AI models that can detect 40-plus different behaviors across multiple languages. So think about behaviors like hate speech, child grooming, sexual harassment, especially some of those more complex behaviors like radicalization.
00:06:54:25 - 00:07:07:09
Katie Zigelman
Those are things that require really strong data that has positive examples and negative examples, so you can really build that super sophisticated AI to be able to find those.
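To make the "positive and negative examples" point concrete, here is a toy sketch of training a single-behavior text classifier from labeled examples. This is only an illustration of the labeled-data idea, not Spectrum Labs' contextual AI; the example messages, labels and the scikit-learn pipeline are assumptions for demonstration.

```python
# Toy sketch: a behavior detector trained from positive and negative examples.
# NOT Spectrum Labs' model, just the general labeled-data idea.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled data: 1 = exhibits the behavior, 0 = benign counter-example.
messages = [
    "you people don't belong here",        # positive example
    "great game everyone, well played",    # negative example
    "go back to where you came from",      # positive example
    "where are you traveling this year?",  # negative example with similar words
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# Probability that a new message exhibits the behavior.
print(model.predict_proba(["nobody wants your kind around"])[0][1])
```

The point Katie makes is that the hard part is the data: complex behaviors need many genuine positive examples and near-miss negative ones before a model like this becomes useful.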
00:07:07:11 - 00:07:12:01
Lisa Thee
And does the AI actually remove the content or are humans involved in the process?
00:07:12:03 - 00:07:31:09
Katie Zigelman
Yeah. So the AI surfaces that and then we have workflow tools that allow people to choose to automate actions so they can say, when we see this type of content, we want to educate the user that that's not allowed. When we see this type of content, we might actually want to ban this user if it's a zero tolerance type behavior.
00:07:31:11 - 00:07:56:21
Katie Zigelman
So the AI does the surfacing, and then we have the webhook tools and those automations for people to actually take action. And that can either happen automatically or it can be surfaced to a human moderator to review. And what that means is that we can help people automate away kind of the worst of the worst, or the black and white cases that don't need as much review.
00:07:56:21 - 00:08:03:16
Katie Zigelman
And we can leave what needs to be reviewed for humans to do what humans do best, which is really to think.
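As a rough illustration of the "AI surfaces it, workflow rules act on it" flow described above, here is a minimal sketch. The behavior names, confidence thresholds and action labels are hypothetical assumptions, not Spectrum Labs' actual automation or webhook configuration.

```python
# Minimal sketch of routing an AI detection to an automated action or a human.
# Behaviors, thresholds and actions are hypothetical, not Spectrum Labs' config.
from dataclasses import dataclass

@dataclass
class Detection:
    behavior: str      # e.g. "hate_speech", "child_grooming"
    confidence: float  # model score between 0 and 1

ZERO_TOLERANCE = {"child_grooming", "terrorism"}  # platform-defined, per its guidelines

def route(d: Detection) -> str:
    if d.behavior in ZERO_TOLERANCE and d.confidence > 0.9:
        return "ban_user"        # black-and-white case: automate the action
    if d.confidence > 0.8:
        return "educate_user"    # automated reminder of the community guidelines
    if d.confidence > 0.5:
        return "human_review"    # gray area: leave the judgment call to a moderator
    return "no_action"

print(route(Detection("hate_speech", 0.65)))  # -> human_review
```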
00:08:03:16 - 00:08:33:15
Lisa Thee
Sure. So toxic behavior isn't all one thing, right? Some things are blatantly illegal and need to have action taken immediately in order to reduce the risk of penalties to the company and legal consequences, things like child sexual abuse material or livestreamed terror events, where other things are not favorable and will likely get you user churn, because people don't want to be on a platform that's hostile, but aren't as time sensitive.
00:08:33:15 - 00:08:41:25
Lisa Thee
And in that way, there can be more engagement from their trust and safety team to make sure that they're making thoughtful decisions, right?
00:08:41:27 - 00:09:02:25
Katie Zigelman
Absolutely. And I also like to think about it as communication and expectations. So much in life comes down to those two things. And if you think about it from a platform perspective, they've created these community guidelines, and those community guidelines are them communicating to their users what's going to be allowed and what's not. And at the same time, the users are agreeing to those.
00:09:02:27 - 00:09:26:10
Katie Zigelman
So they're saying, yeah, this is the behavior that I'm going to follow. And so when a user then starts to see other community members not following those guidelines, there's a gap. There's a gap between what they're being told is expected and what they're actually seeing. And that's where there tends to be friction, especially as you think about brand reputation and brand image for some of these platforms.
00:09:26:13 - 00:09:40:19
Katie Zigelman
It's really when they've set the guidelines and they aren't enforcing them consistently that people start to ask some questions and start to feel less safe, like they don't know what to expect.
00:09:40:21 - 00:09:52:14
Lisa Thee
Yeah, I think we can all relate in the analog world to the expectations of keeping your voice quiet in the library versus maybe the expectations of how loud it's going to be at a rock concert, right?
00:09:52:17 - 00:10:16:25
Katie Zigelman
Exactly. And that's something, you know, in the physical space, we have these cues that let us know what's socially acceptable, and in the digital space, it's been a little bit harder to create them. And you do see them. And at Spectrum, we're definitely not the arbiters of truth, so we're not here to say every platform should have the exact same guidelines of what's acceptable for their situation.
00:10:16:28 - 00:10:45:11
Katie Zigelman
But we are here to say, whatever you have set, let's help you enforce it consistently. And one of the examples I give is, if you go to a nightclub on a Friday night, you're going to have very different socially acceptable behaviors than if you go to an art museum on a Sunday afternoon. And the same can be true on the Internet: how you act on LinkedIn, where it's a much more professional environment, is very different than how you act on, say, Tumblr when you're looking at different images or whatnot.
00:10:45:13 - 00:11:06:01
Lisa Thee
So how does one go from being inspired by Lois Lane, pencil skirts and journalism into a fruitful career as a product manager of a fast-growing startup, and ultimately elevating your career to chief of staff of the startup over the course of this time? How did this all happen?
00:11:06:01 - 00:11:31:00
Katie Zigelman
Yeah, you know, a huge piece of it has been that I have created an incredible network of humans. I love how you say you like to collect experts; I like to collect mentors and incredible teachers. And I have been very lucky. You know, I went to a school that offered a co-op program as part of college, and that really kickstarted my career.
00:11:31:00 - 00:11:56:12
Katie Zigelman
Even before I graduated, I was able to get real-life experience working as a full-time employee, getting paid, learning how to even just handle some of the professional situations, like how you carry yourself in meetings, all those little things that you don't know until you actually experience them. And so I started my career, like I mentioned, on the publisher side, and then I moved to a startup that was working with publishers.
00:11:56:12 - 00:12:20:22
Katie Zigelman
So it was actually a data management platform called Krux, K-R-U-X, and we worked with some of the largest publishers and some of the largest marketers to help them collect data and make sense of it. And then that company got acquired by Salesforce. I had a couple of job changes within that as well. And the co-founders of Spectrum were also employees at Krux.
00:12:20:24 - 00:12:53:02
Katie Zigelman
So a lot of my career has been following some very influential people in my life who have taught me a lot of what I know. And then it's been a lot of just trusting that I'll figure it out. I've been asked to do a lot of roles that I've never done before. You know, if you look at some of my past job titles, even when I first went to the vendor side, I was in client training and education, and then went from that into more operations, and then from that into more product, and then from that into program management.
00:12:53:04 - 00:13:14:26
Katie Zigelman
I used to say I thought that the strangest part of my career was going to be going from The New York Times to Cosmo, but I somehow then ended up as a technical program manager for an infrastructure engineering team at Salesforce. And I learned a lot of big words and a lot about infrastructure, a lot about Agile process.
00:13:14:26 - 00:13:42:03
Katie Zigelman
And all of those things. And then from there, you know, it's also been really interesting going between really big companies like Hearst Digital Media and Salesforce and a very small company like Spectrum. I believe I was full-time employee number six. And when you start something that early, you wear a lot of different hats. You support the customers, you help build the product.
00:13:42:03 - 00:14:04:08
Katie Zigelman
And I've always loved that within Spectrum, I've kind of sat between the product or the engineering team and the go-to-market team. And so I've really been able to learn from our customers and from the market what they need and how we actually help them build the tools that they need, and then work very closely with the engineering team to deliver on that.
00:14:04:11 - 00:14:07:11
Katie Zigelman
And so... I'm sorry, go ahead.
00:14:07:14 - 00:14:28:26
Dylan Kinsella
I'm sure that's fascinating. Yeah, I think it's really cool having gone from such a large company like Salesforce, where it ends up being a little bit of a machine, right? You know, a fantastic company; we use Salesforce at Launch and I'm really happy with the tools. But then going to a smaller company like Spectrum, where your day, who you are, your personality ends up being a part of the product and the platform that you're building out.
00:14:28:26 - 00:14:51:05
Dylan Kinsella
And I think that's really interesting and quite a treat to be able to see both ends of the spectrum: to go from such a large company where you're not just a cog, right, you're obviously adding value, you're contributing to this massive platform, but then taking what you've learned there and your history and going to help start a company like Spectrum. It must be a fascinating experience, a very great and enviable one.
00:14:51:08 - 00:14:58:10
Katie Zigelman
Oh, thank you. And I have to say, was that pun intended? There was definitely a spectrum mentioned in there.
00:14:58:13 - 00:15:04:10
Dylan Kinsella
If it was, I think I just speak in puns half the time. So whether it's intended or not, you know.
00:15:04:13 - 00:15:29:03
Lisa Thee
It's okay, it's part of your charm. Go on. And I really love how you spoke to the opportunities to co-op early in your career. I was realizing as you were talking through that, that I had a similar experience. I started co-oping in the automotive industry in high school and through college, and by the time I graduated, I had six or seven different experiences at different companies.
00:15:29:05 - 00:15:52:25
Lisa Thee
And that really allowed me to learn as much about myself, about what I liked as much as what I didn't like. And I think that's such a good life hack for people early in their career: you get almost as much benefit from learning what you don't prefer to do in an internship-like scenario, where you only have to do it for a few months, versus having it be your full-time job for a few years before you can move on to the next thing, right?
00:15:52:27 - 00:16:15:08
Katie Zigelman
Absolutely. And I think it's also, you know, there are so many careers out there that you're not necessarily exposed to as a child or even as someone in college. You know, unless you're going to get a degree in accounting because you're going to be an accountant, there are so many other careers out there that don't have a direct line to, you know, I study this in school.
00:16:15:08 - 00:16:55:21
Katie Zigelman
I know I'll then do this. Especially as you think about the trust and safety industry, there's no college degree for it, and there really historically haven't been a lot of industry organizations that are supporting this industry. And now, you know, we have things like the Trust and Safety Professional Association, and I know at Spectrum, a sister organization to Spectrum is the Oasis Consortium, where we're trying to create some industry standards and bring people together, because this is an industry that is a little under-resourced and it's a little newer historically. Kind of going back to the Lois Lane journalism in me.
00:16:55:23 - 00:17:17:20
Katie Zigelman
But historically, if you wanted to publish information in some way, you had to go through an editor, you had to get it published in some very reputable news source and whatnot. And now fast forward to the Internet, where almost every platform has user generated content: you can all of a sudden just start spewing your opinions, and your reach is a lot wider now.
00:17:17:20 - 00:17:40:19
Katie Zigelman
And so it's a newer industry for people, to actually have to worry about things like community guidelines and moderation and enforcement. And so one of the things I'm very passionate about is just how do we support those individuals who are in this industry and give them what they need to create their own business cases and best practices and learn from each other?
00:17:40:19 - 00:18:01:17
Lisa Thee
Yeah, that's what I've noticed in the industry as well, as I've worked through trust and safety over time: I typically see a deep bench of technologists that are willing to focus in this area and bring their talents to bear. I typically see a deep bench of policy people in the legal teams that are really passionate about doing something to benefit society.
00:18:01:20 - 00:18:27:07
Lisa Thee
What we don't often see is an understanding from the executive team and the executive sponsors of the gravity of the impact that safety can have if it's not done properly. It almost feels like the early days of the cybersecurity movement or the privacy movement, when everybody is like, yeah, yeah, we all know we should, but that doesn't mean it gets above the line yet, right?
00:18:27:09 - 00:18:46:13
Katie Zigelman
Yeah, there are so many parallels between cybersecurity and trust and safety. And I think one of the most obvious ones is that in both industries there's a clear definition of bad, right? If you have a data breach, from a security perspective, that's terrible. If you're on the front page of The New York Times because of a bad breach... I know I keep coming back to The New York Times.
00:18:46:13 - 00:19:10:29
Katie Zigelman
I just really love them. But, you know, if you have some big press break because you have been found to have toxic content on your platform, that's clearly bad. But what's good, and even what's not good enough? And that's a lot of where I think the Oasis standards that are coming out are really, really powerful, because it gives people almost a rubric to score themselves against.
00:19:11:01 - 00:19:22:02
Katie Zigelman
It gives people a North Star to work towards, and it gives people a tool to align around and to just be able to sleep at night without the weight of the world on their shoulders.
00:19:22:05 - 00:19:36:10
Lisa Thee
Absolutely. And I was honored to be asked to contribute in a small way to the white paper that the Oasis Consortium just put out. Do you mind sharing with people where they can find that if they're interested in learning more about what that North Star could look like?
00:19:36:13 - 00:20:01:25
Katie Zigelman
Yeah, absolutely. So that one can be found on the Spectrum website, which is spectrumlabsai.com, and this is a white paper that's all about collaborating and how every role within a company can actually contribute to the initiative of user safety. And that's something my team makes fun of me for all of the time, because I overuse the phrase "teamwork makes the dream work," but I really do think it's true.
00:20:01:25 - 00:20:21:27
Katie Zigelman
And one of the things that I've found career-wise is that where I'm happiest is when I'm able to help bring teams together, whether it is engineers working in tandem with the trust and safety leaders or whoever it is, to really produce something amazing. And there's no single person, there's no single team that's going to be able to solve that.
00:20:21:27 - 00:20:24:14
Katie Zigelman
It really is about that collaboration.
00:20:24:17 - 00:20:54:23
Lisa Thee
And that's how the startup you all came from got acquired by Salesforce, right? So it benefits business as well. So Katie, you have had a meandering career, starting in journalism, going into tech and now going into a social impact startup. Can you help us understand your why? What has motivated you to put your best foot forward in this space, which frankly sometimes can be a little bit depressing?
00:20:54:26 - 00:21:18:09
Katie Zigelman
Yeah, I think a huge piece of it for me today is my team and our mission and really what we're trying to do, even if there's a rough day. It's startup life, it's a roller coaster, right? You have really high highs and really low lows and a few bumps and dips in between. But right now it really is.
00:21:18:09 - 00:21:35:15
Katie Zigelman
It's our mission, our mission of trying to make the Internet a safer and more valuable place for all. I think it's so incredibly important. We all use the Internet, we're all connected through it. There are so many powerful things that the Internet can bring to us, and there are so many ways that people can abuse that, but they are the minority.
00:21:35:17 - 00:21:56:25
Katie Zigelman
And that's something, you know, just remembering what we're doing and why we're doing it. And then, you know, my network, my team, the people, they make it all worth it. You know, you spend more time with your coworkers than you do with anyone else. And I'm very blessed with the ones that I have now and have gotten to work with for almost six years.
00:21:56:27 - 00:22:07:07
Lisa Thee
I love that. What emerging trends are you seeing in your field that you think we should all be paying more attention to?
00:22:07:09 - 00:22:37:17
Katie Zigelman
Yeah, top of mind for us right now is voice. You know, we're seeing, especially with Clubhouse coming out, more and more people communicating via voice as user generated content in addition to text. And it's a very different platform, it's a very different content type, it has a whole bunch of challenges behind it, but it's a really important one for people to crack, and to crack now.
00:22:37:17 - 00:23:06:25
Katie Zigelman
Because there's another huge topic we talk about, you know: safety by design. It's a lot easier to design something upfront with the appropriate guardrails. Or, I know, Lisa, an example I've seen you use in the past is: put the seatbelt in the car right away. You know, we're trying to put the seatbelt on the Internet, and it's easier to just build it in first than it is to go back and recall all the cars and then put it in. And so that's one that is a really tough challenge, but it's something that the industry has to tackle and has to tackle very quickly.
00:23:06:26 - 00:23:28:28
Dylan Kinsella
The speech tech is interesting because, you know, we have the power right now to do speech-to-text. But the sentiment behind what is being said, I think, is the key. And that's, you know, starting to be solved; some of the big tech companies are working out solutions. But being able to understand the sentiment and the intent behind what somebody is saying is really powerful and important.
00:23:29:00 - 00:23:33:06
Dylan Kinsella
And I imagine, especially in your industry of safety, it's...
00:23:33:08 - 00:23:53:21
Katie Zigelman
Yeah, we found, I think there's a stat that I've seen, that about 80% of all content gets lost in that transcription. And that's why Spectrum has actually taken a very different approach, in that we're not using the transcription piece at all. We're doing the analysis directly on the audio file, which allows us to keep that integrity of the data.
00:23:53:23 - 00:24:14:19
Katie Zigelman
So think about transcription sometimes as a game of telephone. And especially, you know, as you're dealing with AI and you're dealing with models, garbage in means garbage out. You need to maintain that integrity of the data throughout the process, and that way you can really capture the context and deliver a much more accurate result.
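A schematic contrast of the two voice pipelines being discussed, transcribe-then-classify versus analyzing the audio signal directly. The transcription and scoring functions below are placeholder stubs and the mel-spectrogram feature choice is an assumption for illustration; none of this is Spectrum Labs' actual audio pipeline.

```python
# Schematic contrast only: the "game of telephone" pipeline vs. direct audio analysis.
import numpy as np
import librosa

def transcribe(audio, sr):            # placeholder ASR step; tone and emphasis are lost here
    return "approximate words only"

def score_text(text):                 # placeholder text model
    return 0.0

def score_audio_features(features):   # placeholder audio model
    return 0.0

def classify_via_transcription(audio, sr):
    return score_text(transcribe(audio, sr))              # model only sees the words

def classify_direct_on_audio(audio, sr):
    mel = librosa.feature.melspectrogram(y=audio, sr=sr)  # keep the acoustic signal itself
    return score_audio_features(mel)

audio = np.zeros(16000)  # one second of silence as a stand-in clip
print(classify_via_transcription(audio, 16000), classify_direct_on_audio(audio, 16000))
```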
00:24:14:21 - 00:24:36:00
Lisa Thee
That's fascinating. How does that apply to live streaming? Because I know one of the trickier domains for companies to get a handle on is user generated content that's live-streamed in real time. How do you balance the economics of making sure that that video is aligned with community standards?
00:24:36:02 - 00:25:09:02
Katie Zigelman
Yeah, not a bad one. And that's something where, you know, as you're thinking about safety for voice and why it is such a complex issue, similar to text, it's so much more than just moderation and it's so much more than just catching the bad actors. But a part of voice is knowing when to record data and doing that in a way that's still compliant with privacy policies and is going to make your users still feel very safe.
00:25:09:02 - 00:25:29:28
Katie Zigelman
And so in the live stream example, a huge piece of it is: do you record every single piece of a live stream? Do you, like in the broadcast world where there's typically a delay, start doing that? And that's where I think, as an industry, it is still very early and people are trying to answer a lot of those questions.
00:25:30:01 - 00:25:55:02
Katie Zigelman
And I think up until this point the technology hasn't really been there to support it, especially in that real-time setting. You know, if you have to go from voice to transcribed text, then through models, then to a determination, that takes time. And that's another huge part of why we are not going down that route and instead actually doing that analysis without that second step of the transcription.
00:25:55:04 - 00:26:19:19
Katie Zigelman
But it is a challenge to understand when do we start recording and how do we want to respond when certain things are coming up. So one way that we've seen platforms handle it is they'll actually have moderators on staff who know the signals that they're looking for and when they need to join the live stream, so that they can then potentially actually interact with users.
00:26:19:21 - 00:26:40:04
Katie Zigelman
Because I think another thing that can be a misconception in trust and safety is that it's all about just kicking out the bad actors. Well, the reality is that bad actors are users, too. Users are how platforms make money. No one wants to actually get rid of users; they want to get rid of the behaviors. And so what are things that can be done to educate?
00:26:40:06 - 00:27:03:15
Dylan Kinsella
If I can ask a quick question: earlier you were talking about identifying people getting radicalized, right, and seeing the rise of certain trends and bad actors maybe coming to be. At Spectrum, do you monitor the content, or are you able to, like, monitor an individual, right? So is it each content piece that you're analyzing as a one-off, like, oh, that was a bad comment, flag it?
00:27:03:20 - 00:27:29:11
Dylan Kinsella
Or are you starting to have the ability to, like, oh, let's identify this person, or we're seeing this person is now starting to go down this path of radicalization? Or is there any way to intervene with that person, say, hey, listen, like, you know, we're seeing these trends? And I ask the question because we were talking about the Target case study earlier from 2012, which was, you know, being able to identify buying trends and identify that someone, you know, a woman, is pregnant just through what she was buying.
00:27:29:11 - 00:27:36:15
Dylan Kinsella
And I'm curious, on the social media or the tactical level, are you seeing that same kind of pattern emerge?
00:27:36:18 - 00:28:01:27
Katie Zigelman
Yeah. So that's another thing. When we first started the company, we wanted to apply our experience with big data and AI and make sure that we were using that in the most useful way. And so as we do our analysis, we're not just looking at a single piece of data, we're actually looking at all of the metadata associated with it, which does allow us to track conversations over time and to track a user stream over time.
00:28:02:00 - 00:28:27:26
Katie Zigelman
So you think about a behavior like sexual harassment: how someone responds to a sexual advance is literally how sexual harassment is defined, in many ways. And so the difference between a perfectly consensual adult conversation on a dating app and sexual harassment hinges on being able to see that conversation and see that response. Same with something like child grooming.
00:28:27:28 - 00:28:52:06
Katie Zigelman
If all that you're doing is looking at individual messages, by the time that an individual message would indicate that kind of behavior is happening, it's too late. And so you have to be tracking it over time and seeing the different phases and seeing the relationship that's getting built. And so that's actually how our models are designed: we take that metadata into consideration.
00:28:52:06 - 00:28:59:03
Katie Zigelman
And then we can see the probability as the conversation is continuing or as a user stream is continuing.
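To make the "track the conversation over time, not single messages" idea concrete, here is a toy sketch of a rolling conversation score. The per-message risk numbers, the sliding window, and the way the probability is combined are invented for illustration; they are not Spectrum Labs' grooming or harassment models.

```python
# Toy sketch: score a conversation as it unfolds rather than each message alone.
from dataclasses import dataclass, field

@dataclass
class Message:
    user_id: str
    text: str
    risk: float  # assumed per-message model score between 0 and 1

@dataclass
class ConversationTracker:
    history: list = field(default_factory=list)

    def add(self, msg: Message) -> float:
        self.history.append(msg)
        recent = self.history[-10:]                        # sliding window of recent context
        trend = sum(m.risk for m in recent) / len(recent)  # escalation across the window
        peak = max(m.risk for m in recent)                 # the single worst message so far
        return 0.5 * trend + 0.5 * peak                    # toy combined probability

tracker = ConversationTracker()
for m in [Message("a", "hi, how old are you?", 0.3),
          Message("a", "this is just between us, ok?", 0.6),
          Message("a", "don't tell your parents we talk", 0.9)]:
    print(round(tracker.add(m), 2))  # the probability rises as the pattern builds
```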
00:28:59:05 - 00:29:22:16
Lisa Thee
It's actually one of the biggest keys to being able to intervene early, before the trauma has started, right? Context is everything. I mean, there's a huge difference between somebody saying, "I just tripped in front of this girl I like, I want to kill myself," and "My life is worthless, I want to kill myself." They both have the same keywords, right?
00:29:22:16 - 00:29:24:09
Lisa Thee
But very different.
00:29:24:11 - 00:29:25:16
Dylan Kinsella
Context. Yes.
00:29:25:22 - 00:29:31:28
Lisa Thee
So with that in mind, how do you guys deal with context outside of your native language?
00:29:32:01 - 00:29:53:14
Katie Zigelman
Yeah. So we actually have a patent-pending approach, from a technology perspective, for how we handle international languages. And it has to do with how the text data gets converted into a numerical representation as part of the analysis process. And I won't bore you with the details, but there's more info on the website for anyone who's interested.
00:29:53:16 - 00:30:14:16
Katie Zigelman
But we start with that, and what we found is that that gets us to a place of good. And then we work with native speakers and we work with our customers to tune over time and make it even more precise. And that's a huge piece. You know, going back to the fact that we're not the arbiters of truth, we're not here to put a stick in the sand and say this is exactly what hate speech looks like.
00:30:14:16 - 00:30:30:26
Katie Zigelman
And this is always what hate speech looks like. How it might look on some platforms is going to manifest itself differently than on others. And so we partner very closely with our customers to make sure that we're tuning to meet their guidelines and their communities and their needs.
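As a general illustration of "convert text into a numerical representation, then tune per customer," here is a sketch using the open-source sentence-transformers library. This is explicitly not Spectrum Labs' patent-pending approach, which is not public; the model name, seed examples, and threshold are assumptions for demonstration.

```python
# Illustration only: a multilingual numeric representation plus a tunable threshold.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Hypothetical seed examples of one behavior, in more than one language.
seed_examples = ["you don't belong here", "tu ne mérites pas d'être ici"]
seed_vectors = model.encode(seed_examples)

def score(message: str) -> float:
    v = model.encode([message])[0]
    sims = seed_vectors @ v / (np.linalg.norm(seed_vectors, axis=1) * np.linalg.norm(v))
    return float(sims.max())  # similarity to the closest known example of the behavior

# A per-customer threshold that native speakers and the platform tune over time.
THRESHOLD = 0.7  # hypothetical value
print(score("vete de aquí, no perteneces") > THRESHOLD)
```

Here the tuning Katie describes would amount to adjusting the seed examples and the threshold for each platform's own guidelines and languages.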
00:30:30:28 - 00:30:59:21
Lisa Thee
That's super interesting, because I think one of the challenges that I've seen most in this space is that machine learning is all dependent on labeled data, and language drifts over time. You get different colloquialisms depending on your geographic location. What human trafficking looks like online in Cambodia is different than what human trafficking looks like online in Canton, Ohio.
00:30:59:24 - 00:31:20:03
Lisa Thee
Right. And so having that context is super critical to being successful. And all the companies that are looking to play in this domain, that are looking at tackling their communities, are almost exclusively multinational companies that have to deal with all the languages, right? So how do you scale that?
00:31:20:06 - 00:31:42:19
Katie Zigelman
Yeah, and that's really one of the benefits of working with a third-party partner like Spectrum. We are cross-industry. We're gathering data, both labeled and unlabeled, across all different industries. We're also working with a lot of experts. And, you know, we were talking about the radicalization model. We actually work with the Center on Terrorism, Extremism, and Counterterrorism at the Middlebury Institute.
00:31:42:22 - 00:32:08:07
Katie Zigelman
And so as we've improved that model, we've done it with their guidance of how that looks now and how that manifests itself online, because they're researching it every day. And that's something, you know, as a single platform, especially a medium-sized company, you're not going to have the resources to be able to understand exactly, to find enough positive examples of, what all of these different behaviors look like.
00:32:08:09 - 00:32:27:19
Katie Zigelman
Whereas when you work with a partner like us, we have the benefit of the cross industry knowledge, the expert knowledge, and that's all contributing to the training data, which our customers then also contribute to. And everyone gets better because of it. And so it's one of those, you know, rising tide raises all ships.
00:32:27:22 - 00:32:34:27
Lisa Thee
Can you talk a little bit about some of your customers and some of the accomplishments you've been most proud of in working with them?
00:32:34:29 - 00:33:10:26
Katie Zigelman
Sure. So one of the customers we're working with is The Meet Group, which is actually a live streaming platform, so very relevant for this conversation. And they're working with us and, after implementing our solution, actually saw a 15% reduction in manual moderation efforts. And why that matters so much is, you know, there's always going to be a human in the loop as we're talking about trust and safety, but reducing the noise of what the moderators have to look at creates insane efficiencies that then also allow the moderators to focus on the things that really matter.
00:33:10:28 - 00:33:33:08
Katie Zigelman
And so that's one that we're really proud of. We're also working with a couple of other companies, some of which are under NDA, so I can't talk about them as much, but Pinterest, Riot Games, Wildlife Studios, these are all customers that we're working with and seeing some really, really great results with and helping them improve their communities.
00:33:33:11 - 00:33:51:01
Dylan Kinsella
So we talk about, you know, entrepreneurship in this podcast. Lisa, she's a serial entrepreneur. You know, if you read her website and her background, she's started, you know, many companies, essentially taking initiatives from Intel and other startups and growing them into products and platforms. And I have a background in that, as do you.
00:33:51:03 - 00:34:11:16
Dylan Kinsella
And I'm wondering, so in the startup space, when you're taking a product to market or you're launching a new thing, your concern is: will people use it? Like, can I build it and will someone use it? And maybe you raise some money, you've got half a million dollars from friends and family or a bridge round, and your goal is to, like, build the platform so that someone can log in and use it, right?
00:34:11:18 - 00:34:36:21
Dylan Kinsella
And I would say in that context, you're not focused maybe on will people abuse it; you're really concerned about will people use it. So does Spectrum kind of help fill that gap or that void, where a startup might be able to leverage your technology? Do you have a support system to moderate content, what's on their platform, when they're still at the startup phase or they're trying to build out their platform and get users?
00:34:36:23 - 00:34:38:01
Dylan Kinsella
What's your take there?
00:34:38:03 - 00:35:09:06
Katie Zigelman
Yeah, absolutely. And I love that question, because it speaks to something that we talk about a lot, which is safety by design. You know, if the only thing that you're thinking about when it comes to trust and safety is moderation, it's too late. The analogy I use is, if you have a leaky roof and there's water coming into your house and you just put a bucket under it, that's like hiring a team of human moderators to go clean up a problem without actually addressing, hey, why is it leaking, what's actually at the root of this?
00:35:09:08 - 00:35:34:23
Katie Zigelman
And so, so much of trust and safety is also about: what does the user onboarding experience look like? Who are the people that each user can interact with, and under what criteria? What are the social norms that you're setting, starting with your community guidelines and then with the different cues that you actually build into your product? And so there's so much more that goes into it than just, oh, someone said a bad word, let's kick them out.
00:35:34:25 - 00:35:53:27
Katie Zigelman
Right? And that is a huge part of where Spectrum comes into play. And that's also why we care so much about this community and providing trust and safety best practices in addition to the technology itself. I truly believe nobody wants to build a platform that a child is going to be groomed on.
00:35:54:00 - 00:35:55:07
Dylan Kinsella
No, not at all.
00:35:55:09 - 00:36:18:16
Katie Zigelman
But they don't always know what they need to do, what guardrails need to be in place to prevent something like that. And that's why I think the role of a trust and safety individual within a company is so incredibly important because they're the expert and I think their number one job should really be education. How do we bring other people on board and make it a collaborative effort to keep user safety as a top priority?
00:36:18:16 - 00:36:41:00
Lisa Thee
Yeah, Katie, I love what you're saying there. I think, even to build on that more, that's the value of having a third-party partner that can be surgically focused on the best technology and product experience, to make it as lightweight as possible for companies to accomplish their goals of having safety by design without having to pour so many resources into that infrastructure.
00:36:41:00 - 00:36:52:17
Lisa Thee
Because, you know, your primary care physician is not going to do surgery on you. We know when things get complicated, you need a specialist. And that's a place where a company like Spectrum plays, right?
00:36:52:20 - 00:36:56:01
Katie Zigelman
Absolutely. I love that analogy.
00:36:56:03 - 00:36:58:28
Lisa Thee
Well, thank you. I think we got that. That's great.
00:36:59:00 - 00:36:59:26
Dylan Kinsella
Yes.
00:36:59:28 - 00:37:09:07
Lisa Thee
So thank you so much for your time today, Katie. We've learned so much from talking with you. Do you mind sharing with us where people can find you and keep tabs on what you're working on?
00:37:09:10 - 00:37:22:06
Katie Zigelman
Absolutely. Yeah, spectrumlabsai.com is the best place to see what's going on at Spectrum. And then we're on Twitter and LinkedIn. I'm also on LinkedIn, of course, Katie Zigelman, and you can find me there.
00:37:22:08 - 00:37:24:27
Lisa Thee
Wonderful. Thank you so much for your time today.
00:37:24:29 - 00:37:25:14
Katie Zigelman
Thank you.
00:37:25:17 - 00:37:37:29
Dylan Kinsella
Thank you so much. I learned a ton, and it was just such a pleasure joining you and Lisa on this call and this podcast. So thank you so much. We appreciate this. Awesome.
00:37:38:01 - 00:37:49:02
Narrator
Hey, everyone. Thanks for listening to the Navigating Forward podcast. We'd love to hear from you. At a crossroads of uncertainty and opportunity, how do you navigate forward? We'll see you next time.