The secret to amassing engagement online, especially on social media, is driving divisiveness. Larry Magid, journalist, technology columnist, and President and CEO of ConnectSafely, shares how strengthening online safety is more critical than ever.
Tune into this episode as Eric Chemi and Larry discuss the importance of ConnectSafely’s organizational mission.
0:07 Politely Pushy. Welcome to Politely Pushy. I’m your host, as always, Eric Chemi. Today, we’re joined by Larry
0:12 Magid. He’s the founder and CEO of connectsafely.org. Today, we’re talking about how nonprofits can engage the
0:18 media while continuing to raise awareness about safety and responsibility. Of course, it's a big issue
0:24 here in the age of AI, right? This is one of the big talking points. It’s got national and global ramifications
0:31 for sure. So Larry, thanks so much for making time with me today. A great pleasure. Looking forward to our conversation.
0:36 You know, as a quick summary, when you say to someone, hey, I run connectsafely.org, what does that really mean?
0:43 What do they want to know? What do you really do, and what does the organization do? Well, I kind of map the train and other
0:50 people drive it. Although I do do a lot of driving myself. You know, I basically help set the
0:56 strategy, hire the staff, raise the money. I do get very involved in
1:03 communications, whether it’s as a writer or podcaster or doing videos, but also
1:09 working with our PR team, which happens to be Bospar, full disclosure, and trying to just
1:16 basically set the tone for the organization, and for the most part, though not exclusively, be a spokesperson, although
1:22 we have other people in the group now who are really good spokespeople as well. So I’m trying to make it more of an
1:29 organization. At one point it was me and my co-director, who left the organization about 10 years ago, and it
1:34 really was all about us. But I’m trying to make it more about the message and the organization and the mission of
1:40 trying to make the internet a safer and more civil place to be, working with
1:46 companies and legislators and kids and parents and anybody who will listen,
1:51 to try to get more civil discourse and a safer place for everyone, especially kids. So just real quickly, what’s the
1:59 size of the organization, the budget, how many people? Give me some quantitative metrics before
2:04 we get into some of the qualitative. Yeah, I guess there’s no secret, since as a nonprofit we have to file a 990. So
2:09 it’s all public information. You know, depending on our fundraising success, we could be anywhere from
2:15 500,000 to just under a million in a given year. We’re small. We have two full-time people, but we have a
2:22 number of contributors, I mean, people who are almost full-time. We have our media director, who also does some
2:28 special projects for us. We have our education director, who works for us part-time and also is assistant
2:33 principal of a school for her daytime gig. We have a virtual assistant. She’s great. Rebecca does
2:41 all of our admin work. I don’t even know where she lives. She lives somewhere in the Midwest, I think. What else do we
2:46 have? We have a policy director who works remotely part-time. I’m missing
2:53 a few people, but you know, a handful of folks around the country and even around the world; we actually have
2:58 somebody over in the UK who does some work for us. So I would say about eight people are affiliated with us
3:05 in some kind of paid capacity. And you said the goal is to make the internet a safer place. And you know, a
3:11 lot of people might laugh and say, I think it’s getting more and more dangerous. There’s no way it’s safer now than it was before. It feels like you
3:17 and your band of eight are fighting an uphill battle. Well, yes and no. I mean, in some ways it’s safer. In
3:24 many ways it’s not. It also depends on how you define safety. If you define safety simply in
3:29 terms of physical harm or, you know, horrible things happening, it’s pretty steady. These things have been an issue since
3:36 the very beginning. We actually opened our doors in late 2004, during the period
3:41 we call predator panic, when To Catch a Predator was on the air and attorneys general were trying to ban MySpace and
3:48 there were these horrible concerns about these horrible people who were sexually exploiting children. Those people are
3:55 still out there. But for the vast majority of people online, there are
4:00 annoyances. There are perhaps stresses. There could be mental health
4:05 issues, but it’s not catastrophic for the vast majority of people. There are cases, of course, where
4:12 horrible things have happened to people, but by and large it’s more like an annoyance. I’ll give you an example.
4:19 A day doesn’t go by that I don’t get some kind of a scam attempt. You know, somebody trying to scam me out of money.
4:25 The favorite ones that I get are all these really attractive-looking women who want to start up a conversation with
4:31 me. And as flattering as that may be, I have a feeling they’re not interested in me as much as whatever
4:37 kind of money they might be able to extract from me. But these things happen daily, and they’re just annoying,
4:43 because most of us learn to just blow them off and not respond. But once in a while, somebody does respond and
4:49 gets scammed, gets taken, becomes a victim of fraud. Do you respond?
4:54 Oh, I never respond. Well, I say never; I have a couple of times. Not to those hacks, but if they’re particularly
5:01 amusing, I might. I mean, I’m not sure I’m proud of this, but one time I got a call from somebody who
5:08 claimed to be from, I think, Norton or Microsoft, to tell me that my computer was infected and to help me. And I
5:14 played along with him for a while and eventually said, “Look, I know with absolute certainty this is a scam. I
5:19 can tell because I followed the things you asked me to do and they don’t make any sense, and I’m a techie guy. And,
5:25 by the way, why are you doing this?” And we got into a really interesting conversation. He lived in the Philippines. He was broke. He needed the
5:31 money. And at the end of the conversation, he asked me for a job. I didn’t give him one, but you know, we had a nice talk, and he admitted that he
5:38 was scamming because, in his mind, he had no other economic alternative. So, I’m aware of that.
5:46 What is the main goal? Right. So, in terms of ConnectSafely, you want to make it a safer place. How do you accomplish that? You talked about, hey, you’re
5:52 working with the PR agency, Bospar, to help do that, for example. What are the goals that you guys are trying to
5:58 achieve? How does PR help with that? How do you decide what goal you’re trying to achieve?
6:04 Like, okay, by the end of the year, we’ve engaged with this firm to try to accomplish XYZ goal. How do you
6:10 measure all of these success metrics? It’s very difficult to measure success, because it’s not as if you can test the
6:17 waters and say, well, gee, we did this, this, and this, and the internet is x% safer. It’s just much too vast to even
6:23 begin to measure anything like that for any one organization, or for that matter the entire sum total of all the
6:29 organizations and governments and individuals and corporations that are involved with trust and
6:35 safety. And there are tens of thousands of people who on one level or another are involved in trust and safety. In terms
6:41 of our role, we see, and I’m kind of making this up as I go along, but I think that we have four major
6:46 constituencies. Parents are an important one. Trying to get parents to understand
6:52 what it is their kids are doing online and how they can be helpful. And that could be a four-hour podcast all by
6:58 itself. The other is young people, trying to educate them directly on how to
7:03 use the internet and social media and the various apps in as safe a manner as possible. And by safety, I also include
7:10 privacy and security. And I also include civility these days, and things like protecting yourself against falling for
7:16 and sharing misinformation. Another important constituency is industry, trying to get our partners, like
7:24 Meta. I’m not sure I’ll be able to rattle them all off, but Discord, Google, TikTok, OpenAI,
7:33 Roblox, Amazon Kids, I’ve probably left a couple out. Character AI. Oh,
7:39 Uber is our latest partner. I don’t know if we’ve announced that yet, but maybe I’m breaking news here. Trying to
7:45 get them aware of what they can do to make their platforms as safe as possible. So,
7:50 what do you mean by a partnership? I want to hear the rest of your answer, but what do you mean, like, okay, we’re a partner with them? So who’s paying
7:55 who? What does it mean to be their partner? We are a nonprofit, and we depend on the generosity of our supporters, and so
8:01 these companies will make donations to ConnectSafely. We do not disclose how much any one company
8:08 gives us, but it varies. And together, that’s how we’re able to make up the number I just gave you, the
8:14 roughly 500,000 some years, up to almost a million other years, by all
8:20 of these companies chipping in. And so they’re partners in two ways. One is financial. That’s very important. Two is
8:26 we work with them to advise them on safety. So, for example, I sit on several safety advisory boards at Meta:
8:34 the Reality Labs board, the youth advisory council, the safety advisory council. I sit on the safety advisory
8:41 council at Roblox. Oh, Snapchat is another one of our partners. We used to sit on Snapchat’s
8:48 advisory council; they’ve taken it in a different direction. So we advise them. We
8:53 work very closely to vet some of their things. Like, one of our partners just today submitted some content that
8:59 it wants to send out to educators. We are looking at it and we are giving them feedback. So there’s that level of
9:05 interaction. And then finally, we publish guides and we publish videos and
9:10 we publish animations around their products. A parent’s guide to TikTok,
9:15 a parent’s guide to Instagram, videos, animations, all sorts of resources.
9:21 We do podcasts that talk about some of their products. We also, by the way, cover companies that we don’t partner
9:26 with, because if a company is having an important impact on young people, we want to be there and we want to give
9:32 people advice even if they’re not one of our partners,
9:37 you know? So, it makes me wonder: if I ask questions like, oh, what do you think about so-and-so company,
9:43 is it hard to give a fair answer? We hear about Facebook, Instagram, Meta, all the dangers that
9:49 we’ve heard about with teenage girls, or what’s on Instagram, or predators on there, and all this kind of stuff. We
9:54 know Roblox has a lot of predatory behavior. We know there’s a lot of dangerous things out there.
10:01 But does that cloud your ability to give an honest assessment? It’s a really fair and appropriate
10:06 question. And ironically, it’s a conversation I had with your boss, Curtis, who’s, I think, one of the heads of Bospar, just today, because we
10:14 were talking about how, when he pitches me to media, should he disclose? And my answer is yes, he should. I
10:20 do. Anytime you see an article from me that substantively talks about one of our
10:27 partners, there is a disclosure which says Larry Magid is CEO of ConnectSafely, which receives financial support from the named company. I
10:33 always disclose. So the answer to your question is, I would be naive and lying to you if I said it had no impact. That
10:40 would be ridiculous. Obviously it has an impact. However, having spent 40 years as a journalist, and we haven’t talked
10:46 about my other career, my other background, I really have a very strong
10:52 ethic that I’m not going to let it sway me too much. Just today, I’ll give you an example. If you go to
10:58 larrysworld.com, which is my personal website, and we haven’t posted this on ConnectSafely yet, or
11:03 if you go to my LinkedIn profile or my Facebook profile, you will see an article. It’s about the Charlie Kirk
11:10 discourse, you know, the conversations we’re having in the wake of the Charlie Kirk murder, and
11:16 the enmity and the divisiveness and all of the anger online. And I cite
11:22 some of our partners, and in fact, for the first time ever, I come out in favor of
11:28 modifying Section 230 of the Communications Decency Act. None of our partners are going to like that. But I
11:35 say it as I see it. I mean, there’s no question that the algorithms
11:40 on these platforms are causing some of this divisiveness, because anger
11:45 sells. If you can get people riled up and angry and yelling at each other, that’s going to drive engagement, and
11:51 that drives profit. They are profiting from this because they amplify it. So if you look at Section 230, it was
11:57 written in 1996, when, I think, we barely had AOL. We certainly didn’t have social
12:03 media. It was basically Prodigy back then. Remember Prodigy?
12:08 I was the tech columnist for Prodigy. I also wrote tech columns for AOL and CompuServe back then. And back then,
12:16 these were kind of tantamount to common carriers. You would argue, well, you don’t blame the phone company if
12:22 somebody makes a prank phone call, right? There was no algorithm at the time. No algorithm. They were forums.
12:28 Yeah. And the other reason for 230 is that it immunized them if
12:33 they did try to moderate and something went wrong. Kind of like the Good Samaritan law, right? You’re a
12:39 doctor, you stop by the side of the road, you aid a person who’s been injured in a car accident, they can’t
12:44 sue you because you’re acting as a good Samaritan. That’s exactly what 230 did. It played a very important role.
12:50 Algorithms changed that. And the other part about it is that the Supreme Court ruled, or Congress ruled, I
12:56 don’t know, somehow it was decided that these are carriers. They’re not publishers. Their job is to carry,
13:02 to basically allow you to say what you want. So if you say something that somebody wants to sue over,
13:09 it’s between them and you. It’s not between them and CompuServe or Prodigy. Well, today, with algorithms, I would argue
13:17 that these sites are publishers. So if I put a post up there and they just let it run and my followers see it,
13:23 fine. That’s just me. If I say something incendiary or illegal, sue me, don’t sue them. But if they are then
13:30 publishing it, I’m sorry, amplifying it, it’s like this. Let’s say you’re in
13:35 a small town, and they have a town square, and some crackpot goes in the town square and starts screaming at
13:41 the top of his lungs some rant. I don’t believe he’s committing a crime, and I
13:46 certainly don’t blame the mayor of that town for the fact that some nutcase in his town is screaming, let’s say,
13:52 racist, homophobic, sexist things. But what if the mayor were to go up to him and say, “Hey, here’s a bigger megaphone,
13:57 and let me get a crowd together so they can all hear you. Like, I’m going to go on city hall, you
14:03 know, let me promote your stuff as the mayor of the town. Let me give you more space within the city
14:09 to do it.” Yeah. At that point, I think the mayor would be opening him or herself up to some litigation, because they were part
14:15 of the problem. And that’s what’s happening currently in social media, as I understand how, by the way, to the
14:21 extent that I understand how these algorithms work, because that’s the other problem with algorithms: they’re very difficult for anyone, even the people who
14:27 work for these companies, to know exactly how they work. But we do know they have an impact. But we know they work. Yeah.
14:32 Right. And I know people who have said recently, and maybe this is about, you know, Meta’s AI
14:39 improvements on Instagram, I know people who said, “Oh, I use Instagram now, and the AI is so good that I’ll watch one
14:45 video and boom, I’ll get exactly that kind of stuff coming in
14:50 immediately now.” And it wasn’t true before. So they say it’s much more like TikTok now. And I’m glad you mentioned
14:56 the fact that some people, and I’m one of them, actually like parts of what the algorithm does, because when I actually
15:02 turned it off, there is a way on Facebook that you can turn it off, you basically get a chronological feed of all your friends. Bored the
15:08 heck out of me. I have thousands of friends, because I used to accept friend requests before there was the follower
15:13 model that they now have, and it was boring. And I find Facebook a lot more interesting because of the algorithms.
15:19 I’m hearing from people that I have interacted with, that I want to interact with. And by the way, in my case, they don’t
15:24 all agree with me. I actually have a pretty good spectrum politically of people from the left and the right. We
15:30 have these conversations, which are usually civil. Not always. Sometimes I have to delete comments. But my point is
15:35 that the algorithms do put us into bubbles. And that’s the
15:42 problem. But it’s the reason why Netflix can recommend shows that I actually want to watch, why Amazon can recommend books
15:48 and products that I actually want to buy. These algorithms have a positive purpose, but when it comes to civil
15:54 discourse and politics and other issues, they can skew the
16:00 system in dangerous ways. I’m glad that you mentioned 230, because I knew it was from
16:05 a while ago, I knew it was from the 90s, but I didn’t think of it as, yeah, there were no algorithms at the time. It was just like an AOL message board. There was
16:12 nothing; they were not amplifying anything. And so I’m glad you clarified that, because I think people hearing that
16:18 will realize, oh yeah, it is different now. Those companies were not ranking and running algos and promoting
16:27 and, you know, shadow banning. They weren’t doing any of that stuff. Your post is what you post.
16:32 I remember when one of the executives of CompuServe, I think, was arrested in Bavaria, because the Bavarian
16:38 government was very upset that CompuServe was allowing somebody to distribute Nazi propaganda, and in Germany, unlike America,
16:45 Nazi propaganda is not protected speech. It’s against the law, and they went after CompuServe, and I remember being really
16:50 upset about that. Of course, I find Nazi propaganda horrendous, but I didn’t think that CompuServe should have been
16:56 held responsible for the fact that some Nazi in Germany chose to use their platform. Now, if CompuServe had
17:03 put that on the front page and publicized it, as sometimes happens today, I think I would have had a very
17:08 different attitude. Yeah. So, how are you advising
17:14 companies when they look to you for advice? What are you telling them? Especially now in the world of AI,
17:19 right, where, hey, this law might change, big governments are coming after tech
17:25 companies for what they are allowing or what they’re promoting, what their algorithms are allowing. What is the advice
17:30 that they’re looking to get from you? Well, I think it’s a matter of, first of all, anticipating that these regulations
17:35 are coming. They’re coming very slowly at the federal level in the United States, they’re coming at the state level, and they’re coming
17:41 internationally. The European Union, for example, the European Commission, and the British have passed a number of
17:47 fairly strong laws. So they need to anticipate these laws and try to, not
17:54 counteract them, but just anticipate them and maybe comply in advance, so that the laws either aren’t necessary or, if
18:01 they are imposed, they’re not going to have a disruptive impact on their business. They need to figure out
18:08 this very, very fuzzy and difficult line between free expression and moderation,
18:15 and I don’t have a clear answer for that one, but they need to find ways to allow for free expression but at the
18:22 same time prevent really horrible, dangerous content and not amplify dangerous content. They need to
18:29 do this in a politically charged environment where we’re polarized. So what I think is dangerous,
18:35 you might think is great content, or vice versa. So, you know, that’s not easy, but we talk with them about ways that
18:42 they can do that. How can they, for example? Well, I’ll give you an example. Again, it’s on my mind: the Charlie
18:48 Kirk horrendous murder. One of the things that happened in the immediate aftermath is that people were posting
18:54 videos of this. I’m told, I haven’t seen it, but I’m told, a gruesome image of him
19:00 actually being murdered, you know, the bullet hitting him, etc. You can’t unsee something like that. And
19:06 I’m grateful that I was lucky enough not to stumble on it. That content, I’m not saying it should have
19:12 been banned entirely, but it should have been, first of all, banned for minors. And there should have been an interstitial
19:18 that says, you know, the following is disturbing content. Are you sure you want to look at it? How could they have
19:24 done that immediately? I’m not sure. But those are the kinds of things we talk about, and we’ve been talking to YouTube about that for years, around
19:30 beheading videos. You know, maybe a beheading video was put up by a
19:35 terrorist to glorify their horrible crimes, or maybe it’s put up by a human rights organization to show how horrible
19:42 these terrorists are. I mean, you need to understand the context and why it’s being put up. But the point is, you
19:48 can’t just force that on the general public, and certainly not on children, without ample warning to the public, and,
19:55 frankly, banning it for young children. Yeah, certainly as we’re recording this,
20:02 it’s September 15th, right? This Charlie Kirk story is less than a week old here. It’s been
20:09 everything that you’re talking about, right? Safe content, discourse, amplification, the use of AI.
20:17 Like, everything that you’re talking about, as far as connecting safely and what the role of these big companies has been, all of these
20:23 issues we’re seeing, you know, face to face in the past few days. Oh, absolutely. I mean, you know, hate speech, misinformation, the amount
20:29 of misinformation that came out in the immediate aftermath. I don’t think he had necessarily even died yet.
20:35 I believe he died at the hospital, but whatever. I mean, there was all this finger-pointing and blaming
20:40 this group and that group, and, you know, was it a false flag coming from the right? Was it the lunatic left? Yada yada
20:47 yada. None of that was helpful. And even now, when we know a little more, it still isn’t helpful to point fingers.
20:53 Actually, the governor of Utah, who I might not agree with on everything, has made some really
20:59 good points: it’s the fault of the shooter, but we all need to tone it down a little bit. And he was apparently
21:06 radicalized online. So that says something about all of us who may have contributed to that radicalization.
21:13 The other thing I’m thinking about, and I’ve given a lot of thought to this: oh,
21:19 you know, some people blame politicians, right? And where is the line, whether you’re a politician or just somebody
21:25 posting on social media, where is the line between persuasion and propaganda?
21:30 And where is the line between rightfully pointing out the weaknesses of your opponent? I mean, if you’re a politician
21:35 running for office, you’re running against somebody. It’s certainly your right to point out why you’re better
21:40 than him or her, why they are deficient relative to you, why their
21:46 ideology is abhorrent to your base, or whatever. That’s politics. That’s
21:51 persuasion. It’s been around forever. But that doesn’t mean you call them vermin. That doesn’t
21:57 mean you call them enemies of the people. That doesn’t mean you try to dehumanize them, that you try to imply
22:02 that they’re extremists of the worst possible kind. I mean, when somebody is a rank-and-file conservative
22:09 or a traditional liberal and you call them a fascist or a communist, that’s just incendiary. There’s no need for
22:15 that. You can, you know, run against them and point out their weaknesses and your strengths without
22:22 name-calling. But there is a line somewhere, and I’m trying to figure out where that line is. Honestly, I know the extremes, but I don’t know
22:29 where the margins are on that line, and I think we need to figure that out. I think it was pretty bad even back in
22:35 the 1700s, the very beginning of the country. You read stories about how politicians would talk about
22:41 their opponents. I mean, it was vicious. It might have been worse than it is now. Yeah. And if you ever watch
22:46 videos of Parliament in the UK, they are constantly calling each other things. They’ve been doing it forever. But there’s something about now,
22:53 and I think it’s the amplification on social media. So it’s one thing in the 1700s, or even in the age of television
22:58 in the UK. Okay, so it’s out there, right? It doesn’t spread as quickly. Something nasty can be said to somebody, but
23:04 okay, then it gets put into a newspaper that gets printed eventually, or gets put into a book. It’s not
23:09 immediately disseminated to hundreds of millions of people. And you know the game of telephone, right? I tell you something, and by the
23:15 time it comes around the circle, it’s totally different. That’s what’s happening with the misinformation. So even people who are well-meaning, they
23:21 hear about something, and then they just subconsciously elaborate on it, and suddenly what might have been an
23:28 opinion, a factoid, now becomes misinformation. A perfect example, again back to the
23:34 Charlie Kirk issue: the bullet casing had TRN on it, and right away on the
23:40 internet it said that stands for trans. The guy’s obviously trans. TRN is the
23:46 initials of a Turkish ammunition manufacturer. Every bullet that comes out of that
23:52 company has TRN on it. Whoever buys those bullets, their bullets are
23:58 going to have TRN on the casing. So that was an example of just a crazy rumor going wild that had no basis in
24:04 fact, but it got spread. What would changes to Section 230 do,
24:11 though? Let’s say those rules had already been different in the past week. Then, all of a sudden,
24:16 are these online companies liable for such misinformation? That’s a really interesting question.
24:22 You know, as with anything, the devil’s in the details, and I don’t know how any given court would rule. But in theory at least, if you felt you were
24:29 harmed by what somebody posted on Facebook or Instagram or TikTok or
24:34 wherever, you could not only sue the poster, the person who posted it, you could sue the company, the platform,
24:41 in state court, civil court, and/or, I guess, in federal court. I’m not sure exactly. Again, I’m not a lawyer.
24:48 So changing 230 would allow for that lawsuit. Whether those suits would have any merit
24:53 is another question. And by the way, there is a negative consequence of that. Anytime you open up
24:58 the spigot for lawsuits, you’re going to get, what do they call them, spurious lawsuits. You’re going to get
25:04 lawyers that are out there to make a buck that don’t have a real case. And that’s going to have a negative effect. That’s the downside of it. And
25:10 I don’t know the solution in our society for how you avoid these kinds of spurious
25:16 lawsuits, but they are going to happen. Or those class actions that I see so many of now, where the class action lawyer,
25:22 if they get their 30%, makes a huge amount of money, and then a hundred
25:27 million people, you’ll get this email like, oh, you’ve got, you know, $2. Right, right, it’ll be like a $200 million
25:34 settlement, the lawyer gets 60 million bucks, and I get $2, if you bother filling out the form, which
25:40 you’re not going to fill out for $2, right. You know, but here’s one benefit of those. There’s a thing called cy pres,
25:46 which means some of these class action suits, if there’s money left over after the
25:52 class takes whatever they take, the money can sometimes go to nonprofits. So
25:57 ConnectSafely once got a grant as a result of that, a long time ago. There was a case where I think it
26:03 was $10, and not enough people turned in their application for their $10. There
26:09 was plenty left over. We got a chunk of that. So that’s the bright side. This morning I got an email that said class
26:16 action lawsuit. I’m not kidding. And they said, you know what? I’m going to pull it up. I’m going to read it to
26:21 you. I’m going to read it for you exactly, because maybe this will be a future donor for you guys. It was,
26:28 um, here, I didn’t realize this was going to be that important today, but I’m pulling it up. It was the data breach
26:33 settlement. I’m not going to give the company name. Discount code. Dear customer, we are reaching out regarding a recent class action settlement
26:39 involving us. You’re getting the email; you’re a member of the class. No court ever found us liable for anything.
26:45 Rather, this settlement was reached to allow the company to continue to focus on providing excellent service to its customers. You are eligible for up to a
26:54 $1 credit to be applied to our service fees. This code provides a 25-cent
27:01 discount on service fees and may be used up to four times for a total of $1.
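The arithmetic the two of them toss around here, a 30% contingency fee on a $200 million settlement spread across roughly a hundred million class members, and a 25-cent credit usable up to four times, can be sketched in a few lines of Python. All figures are the speakers' rough, hypothetical round numbers, not data from any actual case:

```python
# Sketch of the class-action math discussed in this conversation.
# All figures are the speakers' hypothetical round numbers.

def settlement_split(total: float, fee_rate: float, class_size: int):
    """Return (lawyer fee, payout per class member) assuming an even split."""
    fee = total * fee_rate
    per_member = (total - fee) / class_size
    return fee, per_member

def credit_value(per_use: float, max_uses: int) -> float:
    """Total worth of a capped per-use credit, like the parking-app code."""
    return per_use * max_uses

fee, payout = settlement_split(200_000_000, 0.30, 100_000_000)
print(f"Lawyer's cut: ${fee:,.0f}")        # $60,000,000
print(f"Per class member: ${payout:.2f}")  # $1.40, roughly the "$2" quoted
print(f"Parking credit: ${credit_value(0.25, 4):.2f}")  # $1.00
```

The gap between the headline settlement number and the per-member payout is why, as Larry notes below, unclaimed leftovers can end up going to nonprofits under cy pres.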
27:08 Lucky you. And in California, the code will not expire. Otherwise, it expires in October
27:13 of next year. So that was that was a parking app I had. So So for $1, no, we
27:18 can’t use you can’t get more than 25 cents per use every time you pay for parking. And of course, if it involves more than 30 seconds of your time, not not to
27:25 overstress your value of your time, but yeah. Um is it even worth it? Exactly. So but but it’s interesting
27:31 nonprofits can actually make make that leftover money um in that. So then you
27:36 know it goes back to like you guys have a limited budget, right? You don’t have a you don’t have the the budget of open AI over here, right? This is a This is
27:43 That’s what I told your boss today when we were talking about fees. Yes, we have a limited budget. Limited budget. So, let’s say you bring
27:48 on a PR agency, for example. That’s budget that you have to spend very carefully. So, what’s the goal for, hey,
27:55 we’re going to make this spend on PR. What are you trying to accomplish in that campaign? Well, I mean, again, I don’t
28:02 think there’s anything proprietary about the conversations we had this morning. We are trying to balance between social
28:07 media and regular media, whatever you call it, television, radio,
28:12 in terms of how you’re putting out your own information, or the kind of companies you want to reach,
28:17 in terms of what kind of coverage we get. So, how much should I spend
28:23 trying to increase my engagement on LinkedIn or Facebook or Instagram? By my, I mean my organization’s, versus trying to
28:30 get a story in the New York Times or an interview on MSNBC or CNN. And and these
28:37 are tough questions because, as you know, I assume you have
28:42 some PR. I don’t know too much about your background in PR, but anytime you try to get a story in the New York
28:48 Times or on television, it’s a crapshoot. The most brilliant PR person can never guarantee success, whereas
28:55 with social media, you still can’t guarantee success, but you can more easily create metrics and try
29:01 to meet them. So we are trying to figure out right now, and we are having conversations with your team,
29:06 about how to balance our budget between the social media aspects of it where you have a great group of people and the
29:13 more, I don’t know what you call it, traditional media, but I also include podcasts, I include blog posts, online
29:20 media, but media that other people write about us. I guess you’d call it earned media, I think is the term
29:25 that PR people use. And the funny thing is, in my case, as somebody who spent 25
29:30 years working for CBS News, who’s worked for CNN, who wrote a column for 20 years at the LA Times, still writes a column
29:37 for the Mercury News. I’m usually on the receiving end of these PR pitches. So, I
29:42 know it mostly from being the journalist and, I’m afraid to tell you, blowing off
29:48 most of the pitches that I get. Of course, I know. You don’t have to be afraid. I think we all know. And well,
29:54 when I was a journalist at CNBC and Bloomberg, 100, 200 emails a day like
29:59 this and it was all assume the answer is no unless I write back and I might write back to one or two a week.
30:06 Absolutely. And by the way, many of them are just kind of silly, and well, most of them, it’s just spam, garbage, that your email got put on a
30:12 list somehow. My standard response used to be, Google me, and if you still think I’m relevant, let me know. Yeah,
30:18 but it’s not that hard to figure out what I write about. But yeah, the point is that, you know, they’re all
30:24 crapshoots, but earned media is more of a crapshoot. But if you get a hit in
30:29 the New York Times or whatever, it can be very valuable, not just because it helps you raise your profile, but
30:36 it’s something that funders like to look at. On the other hand, funders also like to look at reach. They want to
30:41 know, some of them, not all of them, some of them want to know what our social media following is. They
30:47 want to know what our web traffic is. So, these are all things that we put in our annual report that are important to
30:53 us. And obviously, the higher the numbers, the better, in terms of some companies, not all. Some companies don’t
30:59 seem to care, but others really almost look at it like they’re investing in advertising dollars when they fund. I
31:05 don’t like that, but it’s the way it is. They want to know that we have an impact so that, you know, the money they
31:12 give us will have some impact on society. So, they measure these things. Oh, I see. Yes. So, the funders, the
31:19 people that are like the companies that are going to give you guys money to keep the foundation going, to keep the nonprofit going, they’re always looking
31:26 at, I’m sure they’ve got many nonprofits that they’re donating to. They’re probably trying to figure out how can
31:32 they be efficient with that nonprofit spending, and which are the nonprofits they want to continue to fund
31:37 year in, year out. Yeah, I’m sure they do. And again, I only know of a couple of cases where they’ve asked. Usually, they don’t ask,
31:42 but they may have other ways of figuring it out. I mean, I’m sure they can they can see your follower count or whatever.
31:48 Um, so yeah, that’s all relevant, and I wish it weren’t important, but it is. Well, in some ways I don’t wish it
31:55 weren’t important because obviously I don’t want to shout into a void. I mean, I could sit in my uh backyard and scream if all I wanted to do is reach myself
32:01 and the dog next door. I do want us to have an impact on hopefully millions
32:08 of people, because that’s how we can make people more educated to be safer. And it’s also maybe a way we can effect
32:14 change if we have a big audience. How has the PR engagement changed your strategy in that sense? How have you
32:20 adjusted budget in terms of social or traditional media? Well, we’re actually going through that conversation right now, but the point is
32:26 we put budget into social, which we’ve never done in the past. And as a result of that, we have a completely different
32:33 strategy. For example, I’ve shifted virtually all of my professional
32:38 posting over to LinkedIn. I still post on Facebook, but it’s more personal, political, you know, things like that.
32:45 But, but when it comes to things around internet safety, it’s it’s mostly on LinkedIn. I would have used X, but I
32:51 really don’t want to be around X right now. I think it’s kind of turning into a cesspool. But certainly LinkedIn is
32:56 a very strong one. And learning, for example, the value of video. Um, one of the things I have never done before is I
33:03 take selfie videos when I’m at an interesting place that is relevant. Like for example, a few months ago, I was in
33:09 right outside of Oslo at the Internet Governance Forum, and at the recommendation of the Bospar social
33:15 media team, I was doing videos from the floor and posting those on LinkedIn and I think we posted on Instagram as well.
33:22 So things like that, I didn’t realize how important they were. Understanding the difference between engagement
33:30 and reach, you know. I used to think if I didn’t get a lot of comments and a lot of likes, I was failing, but it turns
33:37 out that I needed to look at the number of people who actually saw it, the views. Those are important, maybe more
33:43 important. I’m just learning about those things. It’s been, you know, for somebody who has written books about the
33:48 internet going back to 1984 I was surprised how little I knew about how
33:53 some of these platforms actually work, and it really helps to have professionals. But I’m not here, by the way,
33:59 to pander to your organization, but just the idea that, you know, you think you know, and of course I don’t. I’ve never been
34:05 much of a LinkedIn user. Frankly, I’ve never been looking for a job, I don’t use LinkedIn to hire, so I kind of didn’t
34:11 really take it that seriously. I mean, I have thousands of followers there, but I never really did much on it. But now I’m
34:17 focusing more on it because it’s reaching an audience that our advisers correctly inform us that we really need
34:23 to reach. It’s fascinating. It’s so humbling, right, when you say, “Oh, I’ve been working on a particular topic for 40
34:30 years, and it turns out there’s still a lot of it I don’t know.” Right. Like, you write about the internet for 40 years, and here you go.
34:36 Yeah. And also, by the way, in case you can’t tell from my gray hair, I’m not the youngest kid on the block. And
34:41 it’s really important to be open to ideas from other generations. And I
34:47 mean that with great sincerity. Some of the younger people who work for your firm and work for me
34:53 have experiences that I don’t have. I mean they have a perspective that I don’t have and to ignore that would be
35:00 at my peril because I want to reach everyone. I want to reach old people like me but I want to reach teenagers and I want to reach 20-somethings and 30-somethings
35:06 and young parents and I need to learn from people who are living that part of their lives as I live a different part
35:13 of my life as an older person. Yeah. Yeah, it makes sense. What would you say then, as you look at the next
35:20 12 months? What would be your goals, either for ConnectSafely the
35:25 organization or for, you know, the big picture, the internet in general? Like, what are the micro and
35:30 macro goals of what you want to see accomplished going forward? Well, as a child of the ’60s, I want peace, love, and good vibes all around.
35:38 And to that end, I think ConnectSafely has to double
35:43 down on fighting hate speech, on fighting misinformation, on
35:50 promoting civility, while at the same time continuing the work we do around cyberbullying, around predation,
35:56 around fighting sextortion, which is a major problem among young people. But really thinking
36:03 through how we can define safety as more than just the absence of danger: the
36:09 presence of positivity and the promotion of civil discourse. And so
36:15 that to me is a big goal. How we go about doing that, I don’t have any magic bullets. I hate to use the word
36:20 bullets. I don’t have any magic tools. But I think, again, social media is
36:26 very important. Finding ways to reach out to demographics that we don’t necessarily reach today, or at least not
36:32 in significant enough numbers. Reaching the media that young people listen to. Reaching out to groups
36:39 who maybe have different ideals. Now, by the way, at ConnectSafely, we have people on our staff, we have Republicans, we have Democrats. Not
36:46 everybody is in lockstep internally around politics, but I want to reach out broadly. You know, I was a
36:53 little bothered by the fact that the only two prominent politicians we ever
37:04 had. So we do an event every year called Safer Internet Day. We had a big five-year hiatus during the pandemic,
37:10 but every other year going back to, yeah, 2014, all the way up through 2020,
37:17 we would have events, large events, and our prominent speakers have included Kamala Harris and Chuck Schumer.
37:24 They’re great. I have no problem with that. But I really am working to try to balance that out. And I was really happy that last year, I can’t remember his
37:24 name right now, we had a very prominent Republican California legislator who spoke at our event, spoke very, very,
37:29 very well. As a matter of fact, I’m thinking seriously of asking um Spencer
37:34 Cox, the governor of Utah, Republican governor of Utah, if he’ll speak at our next event. Maybe we’ll even have it in
37:39 Salt Lake. I don’t know. Um, but I love the fact that, you know, there are people on both sides of the aisle who
37:45 are calling for civility and calm and rational conversations. We need to encourage that.
37:52 Certainly the right time for that. Larry, I appreciate everything that you’re doing, everything you talked about today. And I wish you the
37:59 best of luck, because this is one of the great challenges, especially as a young parent myself. I think about trying
38:04 to keep them off of screens as long as possible. They’re still very little, but they’re so smart to
38:11 figure out how to get to YouTube and all the things that I’m trying to keep away from them. So,
38:17 I think about all these issues every day, you know, and so I appreciate all the work that you guys are doing to try to keep the internet a safe place, if
38:24 it’s possible. If it’s possible, hopefully you can do it.
38:30 Please visit us at ConnectSafely.org and look for us wherever you get your propaganda, whether
38:36 it’s Instagram, TikTok, Facebook, what else? LinkedIn, wherever. We’re
38:43 everywhere. And and we have a podcast like you do. Ours is called Doing Tech Right. And you can find that wherever
38:49 you get your podcasts. Awesome. Larry, I appreciate the time today. Thanks so much.
38:55 Thank you to my guest and thanks for listening. Subscribe to get the latest episodes each week. And we’ll see you
39:00 next time.