Dr. Hany Farid is a Professor of Computer Science at the University of California, Berkeley. He has long been a forward-facing thought leader in the area of video manipulation. He plays an important role advising global governments and technology companies on how to stay in front of trends such as deepfakes and other forms of manipulated media. In a world of misleading or fake news, Dr. Farid is on the front lines of helping us know what is real and what is fake.
00;00;00;00 - 00;00;24;26
Narrator
At a crossroads of uncertainty and opportunity, how do you navigate forward? This podcast focuses on making smart choices in a rapidly changing world. We investigate the challenges of being at a crossroads and finding the opportunities that arise out of disruption. Listen in on future-forward conversations with the brightest luminaries, movers and shakers. Let's navigate forward together and create what's next.
00;00;24;28 - 00;01;09;02
Lisa Thee
Hello everyone! Welcome to the Navigating Forward podcast. My name is Lisa Thee and I'll be your host today. One of my favorite hobbies in life is collecting experts, and I'm so excited to share one of my favorites with you today. Today we have the honor of speaking with Dr. Hany Farid. Dr. Farid is a professor of computer science at UC Berkeley, has a long tenure as a forward-facing thought leader in the area of video manipulation, and advises global governments and technology companies on how to stay ahead of current trends, including deepfakes and how to identify when someone has manipulated what we're viewing.
00;01;09;02 - 00;01;29;17
Lisa Thee
So in a world of fake news, Dr. Farid is on the front lines of helping us know what's real and what's fake. So we're so excited to welcome you here today. Thank you for joining us. Thank you so much, Lisa. It's great to be here and great to see you as well. So for some of our listeners that don't know as much about your work, do you mind telling us a little bit about your background? Where are you from, and how do you think that's shaped what you do today?
00;01;29;17 - 00;01;47;23
Dr. Hany Farid
Yeah, it's a great question. So I'm an immigrant. My family was born and raised in Egypt, in and around Cairo. I was born in Germany, where my father was in graduate school. My brother and I were born there.
00;01;47;23 - 00;02;06;08
Dr. Hany Farid
We were very young when we immigrated to this country, when I was 3 or 4 years old, so pretty young. So I'm an immigrant, but I consider myself very much an American, despite the fact that I wasn't born here. I grew up in upstate New York. My father was a research chemist, in fact, at Eastman Kodak.
00;02;06;08 - 00;02;35;06
Dr. Hany Farid
So he was thinking a lot about traditional photography, as opposed to the digital photography that we'll talk about. My mother was a teacher. I guess in some ways I was destined, or predetermined, to become a professor: the intersection of being a scientist and a teacher. And I spent most of my life, in fact, up until two years ago, on the East Coast, between upstate New York, Philadelphia, and Boston, and then 20 years at Dartmouth College, up at the border of Vermont and New Hampshire.
00;02;35;08 - 00;02;58;27
Dr. Hany Farid
And now I'm here in the Bay Area, Northern California, you know, enjoying the not-20-below temperatures that come to New Hampshire in December. I can very much relate to that. I grew up in the metro Detroit area and moved to California in the early 2000s, and I really enjoy visiting my snow versus living amongst it, is maybe the right way to say it.
00;02;58;29 - 00;03;21;08
Lisa Thee
So, for somebody who doesn't understand the area of digital forensics, can you give us an overview of what that entails that maybe a grandparent might understand? Good. So the field is really, you know, what we think about day to day. The problem is: if I hand you an image or a video or an audio recording, how do I know if it's been manipulated from the time of recording?
00;03;21;10 - 00;03;54;09
Dr. Hany Farid
That's it. Right. And there's a lot of reasons why we would want to care about that: the use of fake imagery and misinformation to disrupt democratic elections, to sow civil unrest, to create nonconsensual pornography, to lie, cheat, and steal, to defraud people. And what we have been seeing really unfold over the last 20 years, as we've entered into the digital revolution and the internet revolution, is more and more weaponization of visual imagery, because more and more of us have devices that record digital imagery.
00;03;54;09 - 00;04;32;15
Dr. Hany Farid
More and more of us have the tools that can now manipulate it. And of course, we all have the ability to distribute that information on the internet and create all kinds of havoc. So we focus primarily on authentication: how do you determine if visual imagery is authentic? And somewhat unrelated, but in the same sphere, if you will: as you know, I spent a lot of time thinking about how we protect children online from child sexual abuse material, and how you take those images and videos at the scale of the internet, find the ones that are, you know, particularly harmful and illegal and dangerous, and remove them from the internet. And that's another branch of what we do in the space of digital forensics.
00;04;32;15 - 00;04;49;04
Lisa Thee
Yeah. And that's the area where we got a chance to get to know each other. So do you mind talking a little bit about what your work looked like ten years ago and what it looks like today? Because you've been thinking about this for a long time.
00;04;49;04 - 00;05;12;06
Dr. Hany Farid
Well before it was in the public sphere of awareness. Yeah, it's an interesting question. I've been thinking about this field of digital forensics for 20 years, and honestly, way before I should have been. When I started thinking about this, film, analog photography, still dominated the landscape. Eastman Kodak was relevant. But you could see the trends. You could see the internet coming.
00;05;12;06 - 00;05;33;13
Dr. Hany Farid
You could see the digital trends. You could see that the tools were getting more sophisticated. And so ten years ago, it was a niche field. There was a handful of us in the world thinking about this, worrying about these problems, worrying about lying and misinformation and fraud and evidence. And it was quiet. I sort of long for those days, to be honest with you.
00;05;33;13 - 00;05;54;15
Dr. Hany Farid
It was, you know, a couple of times a year something interesting would happen. But for the most part, we were quiet academics doing our thing. And something really dramatic did happen over the last ten years: we've seen an explosion of social media, an explosion of digital technology, an explosion of manipulated content and misinformation and lies and conspiracies, and suddenly it's not so quiet anymore.
00;05;54;15 - 00;06;14;09
Dr. Hany Farid
But also the stakes are very different. It used to be that when we were dealing with a national security case or a court case, we would have days or weeks to sort these things out. Now we have seconds or minutes to sort these things out before they go viral and wreak havoc on the world. And so the stakes have gotten higher.
00;06;14;11 - 00;06;53;11
Dr. Hany Farid
It's gotten much, much busier. And it's sort of gotten more difficult to do this, because there are so many more people attacking our institutions, and we have less and less of an ability to trust what we see and hear and read online. Yeah. So with that, about ten years back, maybe a little longer these days, you were part of the consortium that helped to define a tool that Microsoft open-sourced to the world, to small, medium, and large businesses, for dealing with images of the abuse of children online, called PhotoDNA.
00;06;53;14 - 00;07;14;28
Lisa Thee
Yeah. And that's actually coming back up in the news cycle lately, due to some legislation that Europe is considering around the balance of privacy and safety. Do you mind sharing with us a little bit about how you participated in PhotoDNA originally, what its original intention was, where it is today, and what your points of view are on that legislation?
00;07;15;00 - 00;07;36;12
Dr. Hany Farid
Good. So, the internet that we know today is really about 20 years old, right? About the turn of the millennium. So in the very early days of the internet, one of the first things that we saw was a growth, a disturbing growth, in the creation and distribution of so-called child sexual abuse material.
00;07;36;12 - 00;07;56;14
Dr. Hany Farid
Some people call it child pornography. Those in the industry call it CSAM, because we don't want to somehow minimize the idea of what it is. So think crime scene photos. It's a crime scene photo. That's exactly right. For those of you who don't know, by the way, today the average age of a child involved in this material is eight years old, down to a few months old.
00;07;56;14 - 00;08;21;02
Dr. Hany Farid
We are not talking about teenagers playing with their sexuality. We are talking about extremely young kids, in most cases pre-pubescent, down to infants and pre-verbal children, who are being physically and sexually abused, and that material being shared. So we saw this problem in the very early days of the internet. The US government asked the then giants of the technology sector to do something about it, and they didn't.
00;08;21;04 - 00;08;40;02
Dr. Hany Farid
They just didn't do it. There are various reasons for that; we can talk about that later if you want. But in 2008, out of just sheer frustration, the folks at Microsoft reached out to me because of my expertise in digital forensics and asked me to come talk to them and the rest of the technology sector, to try to understand what, if anything, can be done in this space.
00;08;40;04 - 00;09;05;28
Dr. Hany Farid
And after, you know, days of meetings in Washington, DC, I came to the conclusion that this idea that you can actually analyze images and videos at the scale of the internet, even then, 12 years ago, was not possible. But what I also learned at the time was that the National Center for Missing and Exploited Children, NCMEC, was then home to already millions of known CSAM images.
00;09;05;28 - 00;09;26;26
Dr. Hany Farid
Today, it's in the tens of millions. And what I also learned is that that same material, day in and day out, week in and week out, year in and year out, keeps circulating. And so I thought, well, why don't we just catch that stuff? We know it's out there, it's been identified, and maybe I can't catch everything, but let's at least catch the stuff that we can.
00;09;26;26 - 00;09;48;15
Dr. Hany Farid
That we can solve. And so I and some folks at Microsoft, on the engineering team and their policy team and the legal team, worked for about a year and developed a technology called PhotoDNA. And it's a very basic, simple technology. It reaches into any piece of content and extracts a distinct digital signature that allows us to identify that piece of content, and only that piece of content.
00;09;48;15 - 00;10;12;22
Dr. Hany Farid
So once something has been flagged as child sexual abuse material, we extract that signature. And then, on upload, every image can be scanned and compared against that database of known material. This is, by the way, what we do with viruses and malware and other pieces of harmful content that are being distributed online. It's the same basic idea. So it allows you to preserve the privacy of the user while catching things that can be toxic.
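The signature-matching workflow Dr. Farid describes can be sketched in a few lines. PhotoDNA itself is proprietary, so this stand-in uses a toy "average hash" over a grayscale grid; all function names here are illustrative, not PhotoDNA's actual API.

```python
def average_hash(pixels):
    """Compute a simple perceptual signature from a grid of grayscale values.

    `pixels` is a flat list of brightness values (0-255), assumed to be an
    image already downsampled to a small fixed grid. Each bit of the
    signature records whether a cell is brighter than the grid's mean, so
    small changes (recompression, slight edits) barely alter the bits.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count the differing bits between two signatures."""
    return bin(a ^ b).count("1")

def matches_known_database(signature, known_signatures, threshold=5):
    """Flag an upload if its signature is near any known-bad signature."""
    return any(hamming_distance(signature, known) <= threshold
               for known in known_signatures)

# Usage: signatures of previously flagged images are stored once; each
# upload is hashed and compared, which is why the scan is so fast.
known = {average_hash([10] * 32 + [200] * 32)}
upload = average_hash([12] * 32 + [198] * 32)   # a near-duplicate image
print(matches_known_database(upload, known))
```

The design choice worth noting is that only signatures, never the images themselves, need to be stored and compared, which is how this kind of scanning can coexist with user privacy.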
00;10;12;23 - 00;10;35;21
Dr. Hany Farid
That's exactly right. And the reason why I mention that this is what we do with malware and viruses is that the infrastructure that allows us to scan the attachments to your email, the web pages that you are navigating to, the images that are being uploaded, already exists to secure our computers. And surely we can agree that if we're going to deploy technology to secure a computer, we can deploy the same technology to protect our children around the world.
00;10;35;28 - 00;11;01;26
Dr. Hany Farid
Okay, so between 2009 and, well, today, really, that technology has rolled out to hundreds of companies. It is, by far, as I understand it, the single most successful technology for reducing the spread of CSAM. It leads to 99.9% of the reports to the National Center every year, which are, by the way, in the tens of millions of reports a year that they receive.
00;11;01;28 - 00;11;21;26
Dr. Hany Farid
This is not a small corner of the internet. This is an epidemic in our society. And this isn't the dark web; this is the general internet. Most of it is being uploaded to Facebook. So people aren't hiding. And this tells you their sense of immunity, the sense that there's no risk.
00;11;21;29 - 00;11;46;22
Dr. Hany Farid
And that is really troubling, by the way. And so I have, over the last decade, continued to work on and refine this technology, and to develop new technologies, technologies that can help, for example, gaming platforms deal with grooming behavior, where predators are trying to extort or sextort or entice young children on the platform. And this is a space that is difficult to talk about.
00;11;46;28 - 00;12;12;18
Dr. Hany Farid
When you talk to the young kids who are victims of these crimes, it is gut-wrenching, heartbreaking stories of physical and sexual abuse that go on for decades, because that material keeps circulating online. And it's something that I think we as a society have not done enough about. And here's something that you should all think about: in this country, we passed laws to protect the copyright interests of the movie
00;12;12;26 - 00;12;31;06
Dr. Hany Farid
And music industry before we passed laws to protect children online. What does that tell you about the priorities of a society that does that? What is wrong with us, that we value the movie and music industry more than we do the well-being of our children? But that is the reality of the technology sector that we're living in today.
00;12;31;09 - 00;12;55;27
Lisa Thee
Yeah, I saw a staggering statistic once, a single example of somebody who was a victim of this crime who was in her late 20s. What she had shared was that she had been contacted by the Department of Justice to notify her that her images had been used to prosecute a case over 35,000 times.
00;12;56;04 - 00;13;19;27
Dr. Hany Farid
Yep. So it's just the ongoing re-victimization. And, you know, I will tell you, I've talked to the young girls, primarily girls, who've been victims of this. And what many of them will tell you is that the assault was awful, one of the worst days of their lives. But you know what's worse? Knowing that every hour of every day people are distributing that material and finding pleasure in the crime that was committed against them.
00;13;19;27 - 00;13;43;15
Dr. Hany Farid
It is gut-wrenching to them. And many of them have spent years and decades trying to get this material off the internet, in many cases to no avail. And that's why it was so critical that we all came together as industry, academic, and government leaders to help inform the decision makers in Europe about the implications of potentially blocking this technology.
00;13;43;15 - 00;13;59;05
Lisa Thee
Can you share with me whether you were involved in some of that, and how that all unfolded?
00;13;59;05 - 00;14;21;14
Dr. Hany Farid
So what you should know about this is that in the EU, and other parts of the world too, there's a little bit of a tension between security and privacy. We all want as much privacy as we can get. But there are limits, obviously, because, you know, we can't have absolute privacy. We certainly don't have it in the offline world. There are limits. When I go to the airport, I don't have absolute privacy. They can look through my bag. Why do we do that? Because we need to secure air travel. Yeah. So there is tension in the EU, where there are some on the privacy side who say, you should not look at the images that I upload.
00;14;21;17 - 00;14;38;02
Dr. Hany Farid
You should not look at the attachments in my email, because that's an invasion of privacy. There are others, like me, who are saying: yes, I understand that. However, there are kids who are as young as eight years old and four years old and two weeks old, two months old, who are being sexually abused, and their material is making its way through the network.
00;14;38;02 - 00;15;01;21
Dr. Hany Farid
And how do we balance these two? And so the EU is considering legislation that would effectively outlaw programs like PhotoDNA, which have been used for over a decade by these companies to do one thing and one thing only, which is remove child sexual abuse material. Right. And by the way, the same legislators who are trying to dismantle that still want companies to enforce copyright infringement.
00;15;01;24 - 00;15;21;06
Dr. Hany Farid
Where are our priorities here? So I have spoken with a number of people in the EU on both sides of the decision making here. I think it is completely wrong-headed to pit privacy against security, for a number of reasons, not least of which is that protecting children from this is a privacy issue. We are talking about their privacy.
00;15;21;09 - 00;15;40;16
Dr. Hany Farid
You're worried about your privacy, but what about that eight-year-old's and 12-year-old's privacy? Why don't you worry about their privacy? And by the way, for those of you who don't want PhotoDNA, would you please do me a favor and turn off your spam filter and your malware and virus and ransomware filters that protect your computer on a daily basis?
00;15;40;16 - 00;16;01;00
Dr. Hany Farid
It's the same technology. When people look at the world through this privacy lens, they become blind to the realities of having to find a balance between an open and free internet and a safe internet. And I think they have simply fallen on the wrong side of this issue.
00;16;01;00 - 00;16;20;04
Lisa Thee
Thank you so much for explaining that in a way that makes it really accessible to people. I think our primary audience are going to be business leaders across various sectors. So for folks that maybe weren't as aware that this was as large of an issue as it was before, where can they go to have access to this kind of technology?
00;16;20;04 - 00;16;45;01
Dr. Hany Farid
Yeah. So, when Microsoft and I developed the technology, we gave it away for free. There was no business here. We essentially licensed it to the National Center at no cost, and the National Center licenses it to companies at no cost, and then they share their database of known CSAM material. So start with the folks at the National Center for Missing and Exploited Children here in the US; in Canada, there is an equivalent agency called the Canadian Centre for Child Protection.
00;16;45;04 - 00;17;04;19
Dr. Hany Farid
And most countries have their own sort of version of this. But NCMEC is the place to start here in the U.S. to learn more about PhotoDNA and similar technologies. And this is something that people can just put in. It's very lightweight, like putting on a security filter, right? This isn't going to take up engineering resources or time.
00;17;04;26 - 00;17;31;04
Dr. Hany Farid
It's really, it's super lightweight. It's probably a couple hundred lines of code that we wrote. It scans an image in less than a millisecond. I mean, this is really a lightweight technology, and it is honestly the bare minimum you should be doing. It is the bare minimum.
00;17;31;06 - 00;17;50;04
Lisa Thee
Okay. Thank you, Dr. Farid. So I'm going to pivot us a little bit to your day job, which is being a day-in, day-out professor at UC Berkeley. I would love to hear a little bit more about how the trends of Covid are impacting you as a professor, or impacting your students. We'd love to hear a lot more about that.
00;17;50;04 - 00;18;06;24
Dr. Hany Farid
Yeah. Wow. So like everybody in the world, we've all been impacted, the students much more than the faculty. We here at Berkeley are teaching entirely online. The students are not back. There are a few universities that have students back and are doing some hybrid, but this is a very big campus, with 40-some-odd thousand students in an urban setting, and there was a sense that we could not bring the students back safely. And so we are doing everything online. Teaching online is hard.
00;18;06;26 - 00;18;31;07
Dr. Hany Farid
It's really, really hard. It's hard for the students. It's hard for the instructors. I spent the better part of the summer getting ready to teach. I'm teaching two intro courses this semester in computer science. One in the School of Information, one in the computer science department. And I recorded all of my lectures, so that they were delivered asynchronously so students can do it at their own schedule, at their own time.
00;18;31;10 - 00;19;04;15
Dr. Hany Farid
We have students spread all over the world, in different time zones. I would say in terms of overall efficacy, we're maybe 50 to 70% of where we would like to be. I don't fully understand it, Lisa, but there is just a fundamentally different experience of being in a classroom, being in a lecture hall, being able to see your students, being able to talk to them on the way out of and into class, being able to get a cup of coffee with them, having them come to your office and talk to you, versus this Zoom interaction. It is not the same.
00;19;04;17 - 00;19;31;03
Dr. Hany Farid
And by the way, I think that's the reason why MOOCs, the massive open online courses, never really fully took off and replaced the university setting as a place to get a degree: because that one-on-one interaction, particularly when you're teaching, is really special. But, you know, we're muscling through. It's not great. We have another semester of it ahead of us in the spring, but hopefully come fall of 2021 we'll be out from under this and back to normal.
00;19;31;06 - 00;19;51;16
Dr. Hany Farid
You know, I really enjoy Brené Brown's research in these areas. And one of the things she talks about is how we're hardwired to connect. And my guess is that's where things start to break down. It's so hard to really connect with somebody when you can't have all the signals and the in-person interactions to say: hey, is my teaching at a level where you're comfortable?
00;19;51;16 - 00;20;05;11
Dr. Hany Farid
Is it stretching you, but not stretching you too much? And being able to adjust and adapt. And I can tell you, I've been teaching for 20 years now, and even when I teach a big class with hundreds of students in it, I can tell when I have them. Yeah, I can tell from the vibe in the room.
00;20;05;11 - 00;20;23;12
Dr. Hany Farid
I can tell from the looks. I can tell from the fidgeting. I know if I've lost them. I know if I've gone too deep; I know to pull back a little bit. There's this instant feedback as you're lecturing, which you just don't get over electronic communication. So, you know, I worry about the students' well-being. I worry about their education.
00;20;23;12 - 00;20;43;14
Dr. Hany Farid
I worry about the impact on them. Understand: in a four-year degree, one year is 25% of your educational experience. It's a lot. It's a lot of your time. And, you know, we disrupted them last spring, in the fall, this spring, in the summer. It's not great. It really is rough on these students. And I think they're, unfortunately, getting cheated out of the best of their education.
00;20;43;14 - 00;21;03;27
Lisa Thee
We're obviously all trying to make the best of it, but it's by no means ideal. So for yourself, knowing that we're in this environment, where do you go to keep learning and growing and be able to adopt some of this digital transformation?
00;21;03;29 - 00;21;29;14
Dr. Hany Farid
You know, one of the great things, and honestly, one of the things I love most about university life, is that I don't have to go anywhere to keep learning. I am just surrounded by incredibly smart students and faculty, and the academic environment is so rich. It's a place of inquiry and discovery, of curious people and smart people, and it's just wonderful.
00;21;29;14 - 00;21;52;09
Dr. Hany Farid
I just love talking to my students and my fellow colleagues. Just yesterday I was on a panel with a Nobel laureate in physics, talking about the role of technology in society. And it's fantastic. It's just great. And I will tell you, having taught for many years now, the single thing that I hear the most from students after graduation is how much they miss the learning environment.
00;21;52;11 - 00;22;13;24
Lisa Thee
It is really special, and you don't always appreciate it when you're in there, because you're worrying about your tests and your homework and your GPA. But when you leave it, you understand that that learning environment is so rich and so rewarding. Well, you might have answered it already for me, but what is your why? What inspires you to bring your best self on those days that you're not feeling the most motivated?
00;22;13;24 - 00;22;44;19
Dr. Hany Farid
Yeah. And these days, you know, there's a lot of days like that, when you're not feeling particularly optimistic or motivated. A couple of things. I really like the research and the science that I do, and the outreach that I do, and I think it has been, in certain places, impactful. But if I'm honest, I think the biggest impact I have is on my students. Being a teacher and an educator is exciting and motivating: inspiring young minds to explore.
00;22;44;21 - 00;23;02;04
Dr. Hany Farid
And that, for me, over 20 years, has not gotten old. And look, say what you will about young people, and there's a lot you can make fun of. Sure, millennials, you know, we could tease them all day long. But I've got to tell you, this generation of young people is remarkable. They are smart. They are driven. They are motivated.
00;23;02;04 - 00;23;28;06
Dr. Hany Farid
They have a sense of duty and responsibility. And really, many of them want to make the world a better place. They see the injustices. And, you know, being there on the ground and shaping these wonderful minds is really a wonderful privilege. And it never gets old. You know, the line is: the faculty get older and older, and the students stay exactly the same age.
00;23;28;09 - 00;23;48;08
Dr. Hany Farid
And it's inspiring. It's inspiring to see this new generation coming through with their new set of ideas. I can relate to that. I see a lot of optimism for the advancement of shared value in society with this generation, and how we can look at public-private partnerships to solve problems that have probably been around since the beginning of humanity.
00;23;48;10 - 00;24;06;10
Lisa Thee
But now we're starting to get to the places where technology has been able to create computational awareness, so that we can actually start to drive change. So what are some of the emerging trends that you see that you're most excited about in your field? Yeah. I don't know if I would say I'm excited about them.
00;24;06;10 - 00;24;34;04
Dr. Hany Farid
It's more what I'm most concerned about. Well, how about this: here's what I think we should all be paying more attention to, which is the weaponization of technology. In the early days of the internet, we were all, you and I, we saw it, we were so hopeful for what it meant: to democratize access to knowledge, to bring the world closer together, to make all of the information available and accessible.
00;24;34;04 - 00;25;03;29
Dr. Hany Farid
But that's not really what happened, is it? And, you know, I worry that we have gone sideways a little bit in how technology is being used against us as individuals, as societies, and as democracies. And you've been seeing this unfold really now for about a decade. And here's the thing: I'm a technologist, and I believe in the power of technology, but I'm also not a techno-utopian.
00;25;04;01 - 00;25;26;04
Dr. Hany Farid
I don't fundamentally believe in just building it and it will all be fine. We've been trying that for 20 years and it's not working. And so we need to really rethink some very fundamental things, from corporate responsibility, to what technology we should and can develop, to what the regulatory regime looks like to rein in a relatively unregulated industry.
00;25;26;06 - 00;25;50;04
Dr. Hany Farid
I think we need to have technology working more for us and less against us. And I think if we don't start thinking more critically about all aspects of technology, privacy issues, safety issues, addiction issues, and the nature of these devices, we're going to end up being controlled by the technology, as opposed to having the technology work for us.
00;25;50;04 - 00;26;09;01
Dr. Hany Farid
And so I think we just have to start growing up a little bit and stop running full speed into the brick wall of "here's the latest, greatest gadget, let's see what happens next." And I think we're getting there, by the way. I think we all woke up 20 years later with a pretty bad hangover.
00;26;09;03 - 00;26;36;03
Lisa Thee
And now we're all starting to try to think through, like, what's next? What's the next iteration of this technology going to look like? Yeah. And I appreciate you making some of the invisible things that are happening visible for some of our listeners. Because if you're not in these corners and these communities that are trying to do nefarious things, which are scaling as a result of technology, it might be really easy to think, oh, this is a little fringe thing.
00;26;36;05 - 00;27;02;27
Lisa Thee
But last year, as you mentioned, there were tens of millions of reports and 65 million videos and photos shared. Certainly not a minor little problem anymore. Yeah. I think of the misinformation landscape around Covid, around the elections, around climate change, around basic facts. Right. Look, we can have different opinions. We can disagree on lots of things.
00;27;02;27 - 00;27;25;00
Lisa Thee
That's fine. But we have to have a shared factual system. And what we are seeing technology do is create essentially a complete division in society, where people fundamentally live within these echo chambers. And we're not talking about the same things anymore. And that's very dangerous for our society and our democracy. You've done some research in that area recently. Do you mind sharing some of the more surprising findings that you've had?
00;27;25;00 - 00;27;47;02
Dr. Hany Farid
Yeah. So we have been studying both misinformation as it pertains to images and videos, as you and I have talked about, but also just false information, particularly around Covid. Here's what really shocked us. We did a global survey here in the US, in Central and South America, in North Africa, in the Middle East, and in Western Europe.
00;27;47;02 - 00;28;16;05
Dr. Hany Farid
And the belief in false information around Covid, things like 5G causes Covid, things like Bill Gates created the virus in order to create a vaccine, and the vaccine is going to contain a microchip, and that microchip is going to be implanted into you, and we're going to be able to track you: worldwide, between 15 and 20% of people believe that type of information. That is deeply, deeply disturbing.
00;28;16;08 - 00;28;37;21
Dr. Hany Farid
Right. And what has happened, in my belief, is the following. Conspiracies used to be things like: the earth is flat, we didn't land on the moon, JFK was killed by a government conspiracy, we're hiding evidence of aliens. And look, it's easy to make fun of people who believe in what seem like slightly outrageous and unlikely conspiracies.
00;28;37;21 - 00;29;04;07
Dr. Hany Farid
But here's the problem with conspiracies: it's not the conspiracy itself that is dangerous, it's that it leads to distrust of institutions. It leads to distrust of governments, of scientists, of experts, and of the media. And once you enter into conspiracy land, well, then the landscape of Covid misinformation makes sense, because you don't trust the government, you don't trust the media, you don't trust the scientists.
00;29;04;09 - 00;29;46;01
Dr. Hany Farid
And now it's not so funny anymore. It's not that you believe the earth is flat; it's that you believe Covid is being caused by Bill Gates, and therefore you're not going to get a vaccine. And now we have threats to our society. So what's happened is this slow but sure chipping away of trust, and there are a number of reasons for that, part of which, of course, is social media. Because it's very easy on social media to live in your own little echo chamber, to be algorithmically fed content and news and information that conforms to your worldview, to find groups of like-minded people that normalize criminal behavior, abusing children, hateful
00;29;46;01 - 00;30;07;18
Dr. Hany Farid
behavior, white supremacy, or just conspiratorial information, and not have a reality check outside of that bubble. And that is very, very dangerous. That's what we have been seeing unfold over the last few years.

Lisa Thee
So to wrap this up, I'd love to hear a little bit more: what is your proudest accomplishment to date?
00;30;07;18 - 00;30;26;22
Dr. Hany Farid
That's easy. I think the work we did on PhotoDNA is by far the most impactful. And what I love about that project is a couple of things. One is that I've heard from victims how grateful they are to know that at least there's a technology out there that has a chance of eliminating some of this content.
00;30;26;22 - 00;30;50;01
Dr. Hany Farid
And that's really rewarding. But two is that, you know, when you face these global problems, these massive problems at a scale that is unimaginable, tens or hundreds of millions of these uploads, a multi-billion-dollar-a-year industry, the kids getting younger, the crimes getting more violent, you sit back and you're just like, look, I'm just one person.
00;30;50;01 - 00;31;12;11
Dr. Hany Farid
What can I do? Right? But here it was, 2008: me and three or four other people from Microsoft decided, enough. We've had enough of this, and we did something about it. And, you know, it really taught me not to underestimate the power of a small number of people to change the world. It's frustrating. It's hard. It's up and down.
00;31;12;11 - 00;31;36;11
Dr. Hany Farid
More often than not, you won't succeed. But it can work. You can effect global change from your office. And it really was pretty remarkable and inspiring to work on that project.

Lisa Thee
Especially leveraging your own unique skill set, right? You couldn't just put any four people in that room. You had to put the four right people in that room.
00;31;36;11 - 00;32;00;23
Lisa Thee
And I think a lot of our listeners are going to be those people who have these really special skills that their day job incentivizes them to apply to certain issues.

Dr. Hany Farid
That's right, that's right. And look, honestly, we were in the right place at the right time. There was incredible leadership from Microsoft, all the way up to the very top, to really put muscle behind this, not just for six months, not just for six years, but for over a decade.
00;32;00;23 - 00;32;17;12
Dr. Hany Farid
They have been, you know, fighting the good fight. And so everything had to come together. But there were, like, four of us in a room, plugging away at this thing, and a decade later we effected change. That kind of global impact is wonderful to see.
00;32;17;12 - 00;32;37;11
Lisa Thee
In my experience, when you can align your special skills with a mission, with your why, amazing, world-changing things can happen. I think that's right. And I think you're a great example of that. And I hope you inspire our listeners to figure out what their wonderful, special skill is, and that burning problem that gets them out of bed at night
00;32;37;14 - 00;32;53;24
Lisa Thee
that they just can't stop thinking about, and to figure out ways to channel that.

Dr. Hany Farid
That's right. And I'll just add two things. Don't forget that failure will be part of it; in that process, there are lots of ups and downs. And also that change comes in excruciatingly small steps. It's just the reality of the world.
00;32;53;24 - 00;33;19;14
Dr. Hany Farid
And it's okay, right?

Lisa Thee
That's so right. It's just the way it seems to work. But those steps still matter.

Dr. Hany Farid
They do. And you look back and you go, oh wow, look where we got with those really incrementally small steps over a long period of time.

Lisa Thee
So for our listeners that have found so much interest in this topic and want to learn more about you and the work you're doing, where can they find you and keep tabs on what you're up to?
00;33;19;16 - 00;33;41;15
Dr. Hany Farid
That's a good question. I'm actually not on social media: I'm not on Facebook, I'm not on YouTube, and I'm not on Twitter. The easiest places to keep up with me are LinkedIn and my Berkeley web page, which is just farid.berkeley.edu. There are lots of writings and lectures that I've recorded; everything is on that website.
00;33;41;17 - 00;34;02;02
Lisa Thee
Okay. Wonderful. And how can people support your work if they have skills in this area and they want to do more to contribute to society? Are there ways that you can engage people like that?

Dr. Hany Farid
That's a really interesting question. You know, we run an academic lab, so we don't have that scale, but there are many organizations that are doing really good work in this space.
00;34;02;02 - 00;34;32;20
Dr. Hany Farid
Let me just name a few: the National Center for Missing and Exploited Children; the Canadian Centre for Child Protection; an NGO called Thorn, started by Ashton Kutcher and Demi Moore, that is doing incredibly good work in this space; and a wonderful team at Microsoft that has been working on child safety issues for years now. So there are a lot of groups out there, at the NGO level, at the private level, and at the government level, working very hard on these problems.
00;34;32;22 - 00;35;05;19
Dr. Hany Farid
And I would say start there.

Lisa Thee
Wonderful. And I think, for international audiences as well, you can look into the WeProtect Global Alliance; they have a broader international lens on this. And just remember, as you're thinking about your end-of-year giving, the National Center for Missing and Exploited Children and Thorn are 501(c)(3)s. They're where I give my personal dollars, because I know, from working deep in the data, that no dollar is squandered and that it drives systemic change in the world.

Dr. Hany Farid
Yes, thank you. And I'll add to that list the Canadian Centre for Child Protection, which is a small but fierce group of really dedicated people up in Winnipeg, Canada, doing really, really good work on behalf of children. That's where some of my dollars go every year as well.
00;35;05;19 - 00;35;24;16
Lisa Thee
That's wonderful. Thank you so much for your time today. I really enjoyed this conversation.
00;35;24;21 - 00;35;39;17
Dr. Hany Farid
Great talking to you, Lisa. Thank you.

Narrator
Hey everyone, thanks for listening to the Navigating Forward podcast. We'd love to hear from you. At a crossroads of uncertainty and opportunity, how do you navigate forward? We'll see you next time.