george grombacher 0:02
Well, hello, this is George G., and the time is right to welcome today's guest. Strong and powerful: Rebekah Arrigoni. Rebekah, are you ready to do this?
Rebekah Arrigoni 0:08
Absolutely.
george grombacher 0:10
All right, let's go. Rebekah is the CEO of Lodi, a company dedicated to helping people regain their online privacy by using facial recognition technology to identify whether there are any intimate images or videos of you on the internet. Rebekah, tell us a little bit about your personal life, more about your work, and why you do what you do.
Rebekah Arrigoni 0:32
Yeah, I'm married with three kids. I live in Seattle. I have a giant puppy that's about 90 pounds, and that takes up a lot of my time, along with, you know, starting a brand-new company. But it's been really exciting. We've been having a lot of fun with it.
george grombacher 0:50
Three kids and a 90-pound dog. Is the dog the largest of the children?
Rebekah Arrigoni 0:54
He's a Bernese mountain dog, and he is the largest child.
george grombacher 0:59
Was the dog there before the kids were?
Rebekah Arrigoni 1:03
We got him just a year ago, so he's been a very welcome addition, but he's definitely a lot of work. He's a big dog. That's why I do weightlifting, though.
george grombacher 1:15
Okay, now it all makes sense, right? So Lodi, how did that come to be?
Rebekah Arrigoni 1:22
Yeah, so we had run operations for another AI company, and we were working on a facial recognition project for a client. We started thinking up the idea for Lodi, and with that client's permission, we used similar algorithms and a similar data set to create this product. We still use facial recognition, and basically we can find out if people have any non-consensual intimate images or videos on the internet, also known as revenge porn. And then Lodi can help users take those images or videos down via a federal law called the Digital Millennium Copyright Act, or the DMCA.
george grombacher 2:07
So you became familiar with the technology that could, and does, fix this problem. But what was the motivator? Did you just come to realize, wow, this is a huge problem?
Rebekah Arrigoni 2:25
Yeah. So I am not a victim of revenge porn, but there was a point in my life when I was sexually assaulted. And that does stick with you. You think, what if there are photos or images of me during this really intimate moment? For millions of people (in fact, one in three women are sexually assaulted), they don't know if these images could potentially be out there. Ten million people in the United States are affected by intimate image abuse, also known as revenge porn. These terms are often interchangeable, but intimate image abuse is the preferred term, because it isn't actually about revenge; it can be about someone's voyeuristic pleasure. More importantly, the word "revenge" implies that the victim did something that was deserving of this act, and the word "porn" conflates nude photos of oneself with materials that are meant for public consumption. So I do use the phrase "revenge porn," because colloquially that's what it's known as, and it's often what people are searching for. But when they want more information about it, we use the term intimate image abuse.
george grombacher 3:40
That makes a lot of sense. It's a way better term. So, 10 million people, and that's probably an estimate. I wouldn't be surprised if it was a lot higher than that.
Rebekah Arrigoni 3:51
Absolutely. That's just what's reported, and of course many people don't report it; a lot goes unreported.
george grombacher 3:59
Right. Yeah. All right. And so you mentioned that there's an existing law. What was that again?
Rebekah Arrigoni 4:07
Yeah, so there are no federal regulations specifically around non-consensual intimate images, but there is kind of a way around that, and it's called the DMCA, or the Digital Millennium Copyright Act. Essentially, people are able to use this act by saying, I did not give consent for my image or my video to be uploaded to the internet; it was done without my consent. And so they're able to enforce this law. It doesn't penalize the hosting sites that are hosting the images or videos, but it does allow the person who's in the media to get that image taken down. There are also state laws: the majority of states do have laws prohibiting this, ranging from a misdemeanor to a felony. But it's been a difficult process to enforce, and it can also be pretty traumatizing for the victims.
george grombacher 5:08
I'm sure. All right. So how does Lodi actually work? How does it actually know what is or isn't on the internet?
Rebekah Arrigoni 5:20
Yeah. So we have a whole team of data scientists and engineers, and we're able to essentially collect all public images and videos from user-generated adult websites. So we're not going after anything that's professional; it's what's user-generated, and it's not behind a paywall. If people want to see whether they have any media out there that's been uploaded without their consent, they're able to go to the site and upload a profile picture of themselves. We're able to essentially extract data points that can match their face, and then we match those against the data sets that we already have, and serve them those results.
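Rebekah doesn't go into implementation detail here, but the matching step she describes (turning a face into comparable "data points" and searching a pre-indexed data set) is commonly done with face embeddings compared by a similarity threshold. As a rough illustration only, with toy vectors standing in for a real embedding model (Lodi's actual pipeline is not public), a minimal sketch might look like:

```python
import math

# Stand-in for a real face-embedding model (a neural network that maps
# a face crop to a fixed-length vector); here faces are just toy lists.
def embed_face(features):
    norm = math.sqrt(sum(x * x for x in features))
    return [x / norm for x in features]  # L2-normalize the vector

# Pre-indexed "data set" of embeddings collected from public pages.
indexed = {
    "site-a/video-123": embed_face([0.9, 0.1, 0.2]),
    "site-b/image-456": embed_face([0.1, 0.95, 0.1]),
}

def search(profile_image, threshold=0.9):
    """Return every indexed item whose face is close to the query face."""
    query = embed_face(profile_image)
    results = []
    for source, emb in indexed.items():
        # Dot product of unit vectors = cosine similarity, in [-1, 1].
        similarity = sum(q * e for q, e in zip(query, emb))
        if similarity >= threshold:
            results.append((source, round(similarity, 3)))
    return results

print(search([0.88, 0.12, 0.21]))  # matches only site-a/video-123
```

Real systems use high-dimensional embeddings and approximate nearest-neighbor indexes to search millions of items, but the core idea is the same: a tunable similarity threshold decides what counts as a match.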
george grombacher 6:07
Okay. So if I'm curious, you can use an image of my face, use the resources you have, and it pops up and says, okay, George, there are actually some images of you, or some videos of you, on this website, or one or two, whatever it might be. Right. And due to the Digital Millennium Copyright Act, what can I do? What happens from there?
Rebekah Arrigoni 6:39
Yeah, so our data sets have about 98% accuracy, and we do have thresholds in place. So even if we're not totally sure about an image, we still show it to you, just in case there is a match. Then, if there is a match, users are able to save it to an inbox, and we will assist them through that DMCA takedown process. It isn't secret information; you don't need to go through a service, and we actually highlight on our website how people can remove it themselves. But sometimes it can be pretty lengthy, and oftentimes we've found that the hosting sites won't actually remove it unless there's a lawyer involved, or at least the request looks more professional. So when we come in, we have a formal letter that we send to these companies requesting that it gets taken down. Turnaround time has been about two days, and we haven't seen it take more than two weeks.
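The "formal letter" Rebekah mentions is a DMCA takedown notice. Its required elements are set by statute (17 U.S.C. § 512(c)(3)), so it can be sketched as a simple template. This is a generic illustration, not Lodi's actual letter, and all names and URLs below are made up:

```python
# Generic DMCA takedown notice template covering the elements that
# 17 U.S.C. § 512(c)(3) requires a notification to contain.
NOTICE = """\
To the designated DMCA agent of {host}:

1. The copyrighted work is a personal {media_type} of the undersigned.
2. The allegedly infringing material is located at: {infringing_url}
3. Contact information: {name}, {email}
4. I have a good-faith belief that the use of the material is not
   authorized by the copyright owner, its agent, or the law.
5. The information in this notification is accurate, and under penalty
   of perjury, I am the owner (or authorized to act on behalf of the
   owner) of an exclusive right that is allegedly infringed.

Signed: {name}
"""

def build_notice(host, media_type, infringing_url, name, email):
    """Fill the template with the details of one takedown request."""
    return NOTICE.format(host=host, media_type=media_type,
                         infringing_url=infringing_url,
                         name=name, email=email)

letter = build_notice("example-host.com", "video",
                      "https://example-host.com/watch/123",
                      "Jane Doe", "jane@example.com")
print(letter)
```

As Rebekah notes, the content is not secret; a service mainly adds the professional formatting and follow-through that hosts tend to respond to.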
george grombacher 7:43
Wow. Okay. And worst-case scenario: we find my image on XYZ site, we follow the steps, we submit a letter to them, and they don't respond, or they tell me to get lost. Is there a legal intervention available to me?
Rebekah Arrigoni 8:12
There is not through our company, but we do offer resources for people to reach out. There are a bunch of established nonprofits and other companies in this space already, and we are developing partnerships with some of those nonprofits so that we can get Lodi to the people who really need it and may not be able to afford it. We do try to make it an affordable option for individuals as well as content creators, because individuals aren't the only victims of non-consensual intimate media being uploaded without consent: there are content creators in the same space who are trying to make a living, and people are stealing their content. We're able to find that as well. So far we've had a pretty high success rate at getting things taken down; I can think of just a couple of cases where we've had to have a human intervention. But if we aren't able to get it taken down, we do offer resources for people to find lawyers or other options.
george grombacher 9:15
Interesting. Certainly, if I'm a content creator and the way I earn a living is to sell images of myself, and then all of a sudden somebody's stealing them and putting them out there where people can access them without paying, that is a huge problem. I hadn't thought about that.
Rebekah Arrigoni 9:38
Absolutely.
george grombacher 9:42
So is there a finite number? You mentioned user-generated sites. How many are there? Is it an infinite number? Hundreds? Thousands?
Rebekah Arrigoni 9:56
Really, it's millions. We've gone through millions of images and videos so far, and we're constantly adding new information with our system. That way people can even set up automatic alerts. So say somebody signs on and wants to see if there's anything out there of them, and thankfully there isn't: there's always new user-generated content being added all the time, so we'll keep going back to those sites and keep checking for people.
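The automatic alerts Rebekah describes amount to re-running the same search on a schedule and notifying only on new results. A minimal sketch of that idea, where the search function and notification are hypothetical stand-ins:

```python
def run_alert_cycle(search, profile, seen, notify):
    """One monitoring pass: search again, alert on anything not seen before."""
    current = set(search(profile))
    new_matches = current - seen
    for source in sorted(new_matches):
        notify(source)        # e.g. email "a new possible match was found"
    return seen | current     # updated history for the next pass

# Toy demonstration: the second pass surfaces one newly uploaded item.
history = set()
alerts = []
history = run_alert_cycle(lambda p: ["site-a/1"],
                          "me.jpg", history, alerts.append)
history = run_alert_cycle(lambda p: ["site-a/1", "site-b/2"],
                          "me.jpg", history, alerts.append)
print(alerts)  # first alert from pass 1, only site-b/2 from pass 2
```

Keeping a history of already-seen matches is what prevents a user from being re-alerted about the same image on every scan.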
george grombacher 10:34
Do you think we're becoming more aware of the risk of this as we live in a more digital age? Are we putting our own images online less? I guess the question is, do you see trends?
Rebekah Arrigoni 10:56
Yeah, absolutely. We've noticed that it looks like it's affecting people between the ages of 18 and 34, which makes sense, because those are the generations that have had more access to technology and phones and sharing of media. And there have been so many instances of this that have just gone unreported. But also, because we have more technology, it's just easier for people to find this information now, and then take it down. Before, that didn't really even feel like an option, because people just didn't know it was even out there.
george grombacher 11:44
Yeah, interesting. Is there an appetite from a legislative perspective? I think you mentioned that certain states or municipalities have laws in place, but do you see more laws on the horizon?
Rebekah Arrigoni 12:02
I do, actually. We have a lawmaker in Washington who is starting to take more initiative around revenge porn, and we're really hoping to get in on the ground floor with a lot of the legislation around it, especially with using AI and facial recognition. I think there's a lot of misinformation around the use of AI and facial recognition, and a lot of ethics involved. It's a really new technology, and it's changing so quickly. We try to be as transparent as possible about what media we're looking at and what we keep from users. In fact, by default, there is no data saved from users aside from the one profile picture that they upload themselves, and we don't do any training on any of the information that's uploaded. We really do believe that people should have autonomy over their own image. Anything else would go against the whole ethos of our company: we're trying to get these images and videos taken down for people, so we're not going to be a part of hanging on to other people's biometric data. But we do want to have a say, some sort of influence, in the legislative sphere. I think there's a lot of misinformation around it, but also a lot of room for growth, and for making these tools available to people.
george grombacher 13:37
Well, I couldn't agree more. You're familiar with the name, image, and likeness deals for NCAA athletes? Do you see parallels of any kind?
Rebekah Arrigoni 13:52
Yeah, so we actually have some people in our sphere who are also working with deepfakes, trying to basically invoke the DMCA with those as well. Right now Lodi is focusing specifically on adult content, and user-generated content at that. But I do think this technology will eventually be able to cross over into other sectors, and even help with overall reputation management. We're definitely not there yet; we want to keep our focus a little bit smaller and specifically help these people. It's disproportionately affecting women and LGBTQ+ communities, and that's where we want to keep our focus right now.
george grombacher 14:52
What is the breakdown, just roughly?
Rebekah Arrigoni 14:55
Oh, I don't have that stat specifically, but I can definitely get it back to you. It is mostly women, and then LGBTQI+ communities. Men are, of course, affected; I think it's probably underreported, just because of the stigma around men's mental health and asking for support, and around sexual assault in general. But from the data we have, it's disproportionately affecting those other groups.
george grombacher 15:30
Got it. What do you wish that more people knew about this?
Rebekah Arrigoni 15:38
I do hope that it opens up more conversations around non-consensual intimate images and media in general, and more conversations around AI, facial recognition, and ethics. I do think Lodi is going to be able to provide a really amazing service for people, and also for content creators, to keep track of their media and maintain their income as well.
george grombacher 16:13
That makes a lot of sense, certainly. So can you walk us through the process of working with Lodi, and just how it all works?
Rebekah Arrigoni 16:26
Absolutely. So people can head over to golodi.com. We have multiple blog posts highlighting how to do a DMCA takedown on your own; romantic and platonic relationship and dating advice that has been researched and written by our on-staff psychologist; and in-depth explanations of how Lodi works. We offer resources for victims and are actually working on more partnerships with multiple nonprofits to be able to bring Lodi to people for free as well. There are multiple plans to help individuals and, of course, creators search for content and get it removed if Lodi found anything. And then we have our whole onboarding flow: people can do an initial search just by uploading a simple profile image. We'll walk them through that first search and then serve them their results. There's no initial commitment or payment, just a free search that people get to do for peace of mind.
george grombacher 17:26
If you enjoyed this as much as I did, show Rebekah your appreciation and share today's show with a friend who also appreciates good ideas. Go to golodi.com, that's G-O-L-O-D-I dot com, learn about the DMCA takedown process, check out the dating advice they've put together, and take advantage of the free search as well. Thanks again, Rebekah.
Rebekah Arrigoni 17:51
Thank you so much.
george grombacher 17:53
And until next time, remember: do your part by doing your best.
Transcribed by https://otter.ai