Online Safety and Security for Today’s Kids: When Government Gets Involved…

Jordan Shapiro
13 min read · Oct 3, 2023

--

In our new podcast episode, Laura Higgins and I discuss online safety with Julie Inman Grant, Australia’s esteemed eSafety Commissioner. With her experience at tech giants like Microsoft, Twitter, and Adobe, Julie brings a unique perspective on internet safety measures and regulations. Learn about her journey leading eSafety since 2017 and the strides being made through the newly created Global Online Safety Regulators Network. Together, we discuss the proactive strategies the commission is implementing to safeguard Australians in the digital realm.

Julie Inman Grant: I’m Julie Inman Grant, the Australian eSafety Commissioner. I’m sure you’re very confused because I have an American accent. You are not hearing wrong. I actually started my career in Washington, DC in the early 1990s. I had big ideals, and of course, even bigger hair at the time. I really landed in tech policy ground zero.

I worked in Congress for my hometown Congressman, and I was working on a range of social justice issues, and he popped his head over the cubicle one day and said, “Hey, Julie, you know you’re working on these hard issues. We’ve got this small little computer company in our electorate called Microsoft. Could you work on tech issues as well?” So this was before there was even an internet. I ended up working in the education and nonprofit sector, going to graduate school at night, studying the intersection between technology and society. And then I was drafted by Microsoft to be one of their first lobbyists back in 1995. So right before the US DOJ antitrust case.

Jordan Shapiro: That is back when Bill Gates still used to lie on the desk with the sweater on.

Julie: Well, I’ll have to tell you about my second day of work. I was so excited to meet my CEO. And he landed. He really didn’t like coming to Washington, DC at the time, you know, he just wanted to get on with it and, you know, create jobs and stimulate innovation. So he arrived in chinos and a polo shirt, and he was going to the White House and going to the Press Club, so we had to ask our outside counsel to change out of his suit in the hotel lobby so that Bill could have a suit. And then I took him over to the White House and he forgot his wallet, and the security guard wouldn’t let him in. I said, “But don’t you know? This is Bill Gates.” He said, “I don’t care who you say you are. You’re not getting into the White House without an ID.” So you know, it was very interesting. And then the next day the headline in the Washington Post was, “Oh, they’re nice. It’s unfortunate that we have to have a DC office. They’re nice people, but they’re just overhead.”

And I think that really describes the state of the broader tech sector today, where the starting point is, “We just want to be left alone to innovate and create, and sorry if what we exfiltrate into the wild might be harming society, you know. You can’t stand in the way of innovation.” So I was part of that cabal. I spent time at Twitter, you know, was so excited to join after the Arab Spring, and speaking truth to power, but very quickly I saw how social media really surfaced the ills of society, of human nature, through misogyny and racism and targeted online harassment, which was designed to silence voices and effective at doing so. Then I spent some time at Adobe. And then I got recruited as a poacher turned gamekeeper, and I now regulate the tech sector.

Jordan: You have kids right?

Julie: I’ve got three. Right in the sweet spot.

Jordan: How do you manage being a big-time government official and also having kids? I’m just an author, and I struggle with my kids.

Julie: Well, I am taking a three-week vacation. It’s the first time in 30 years that I’m not bringing my laptop. This was on doctors’ orders to just disconnect, which I think we all need to do. I’ve got a 17-year-old, and I remember when she was 3 years old, she was more fascinated with my phone and the lights and the Wii than she was with the doll. And I remember thinking, “Wow, these kids” — this is 14 years ago — ”are going to grow up very differently.” I also have 11-year-old twins, who are in the upper end of primary school. Now they claim that they’re the only two kids in sixth grade that don’t have a phone, and I’m actually inclined to believe them. They do have access in the house. They use my account when they’re trying to connect with friends, so I can see everything that’s being said. But you know it’s really hard not to bring home what I see and what my investigators see every day with child sexual abuse material, including self-generated content. We’re seeing that the age of cyberbullying has gone down from 14 to 8. We have 8- and 9-year-olds reporting serious cyberbullying to us. This is a sort of interesting hangover of the pandemic, where I think we were all struggling as parents to work and school remotely, and so we were a little bit more permissive with technology. But once you hand over a smartphone or put your kid on TikTok at 8 or 9 years old, you can’t wrench that back.

Jordan: So what do you do as the Commissioner? How do you think about that from a policy standpoint — what can governments do about it?

Julie: Well, listen. I think, you know, the approach we take is a unique one that we’ve developed over time, because there were no other online safety regulators and we had to write the playbook as we went along. But before we even use our regulatory powers (and we have complaint schemes and take-down powers with high levels of success, and now some systemic regulatory powers), we have to start with prevention through education and fundamental research, building the evidence base. So everything I heard the Surgeon General say last month, you know, these are issues that we’ve been gathering evidence about and advising parents and educators, and even young people themselves, through a co-design process about the healthy use of technology. You know, there are so many mixed outcomes in the research. And I do worry about direct causal links being drawn, you know, “social media is bad, it leads to mental health issues.” We had a terrible situation in 2018, where there was a tragic suicide of a 14-year-old girl and it was all attributed to cyberbullying, when, of course, you know there are much more complex things going on in the background. And what I worry about when adults in the media draw that direct causal link is that children won’t engage in help-seeking, and they will think that taking their own life or doing something more drastic is the way out.

I think we’re the only government in the world that has a youth-based cyberbullying service where we serve as that safety net. When a child, their parent, or an educator reports anything seriously threatening, intimidating, humiliating, or harassing, and it doesn’t come down, we’re that safety net, and we advocate on behalf of that child. We have a 90% success rate in terms of getting that content taken down, which goes a long way to relieving the mental distress that young people experience, not just from acute cyberbullying like death threats and rape threats, which we do see from teenagers, but from the garden-variety being mean — creating drama, starting rumors — which can have a very corrosive impact on children’s lives.

Laura Higgins: Julie, I’ve had the pleasure of working with you for many years in my previous work on helplines and so on. And you know, Australia was the first country to have this regulatory responsibility. So what sort of powers, specifically, do you have? I know you just touched on a couple of things, but a couple of questions on that: How does the industry respond to it? And also, how is it received by everyday citizens and people in the country?

Julie: Well, thank you for those questions. So, you know, countries around the world are considering legislation. For instance, the Online Safety Bill is actively being debated in the UK Parliament, in the House of Lords, I believe, right now. Ireland just set up an Online Safety Commissioner. And of course we’ll see a proliferation of regulators through the Digital Services Act. We’ve already reformed our legislation. The formation of the eSafety Commissioner (we actually started as the Children’s eSafety Commissioner) came about because a well-known female presenter from Australia’s Next Top Model, the Tyra Banks of Australia, was very open about her mental health issues. She had a nervous breakdown. She was getting terrible trolling on Twitter, and she came out of her recovery and got right back on Twitter. I was interviewing for Twitter at the time, so I remember seeing what was playing out. She ultimately ended up taking her life, and it was referred to as the Twitter suicide.

Now, this spawned a petition to the government that said, you know, the government needs to step in. This is 2014. But what the government of the day decided to do was start small and slow with the Children’s eSafety Commissioner, because they were concerned about things like freedom of expression and the specter of censorship. So they started with the hotline function that we already have in terms of taking reports of child sexual abuse material. We’ve had that function, like the National Center for Missing and Exploited Children, for more than 20 years. And then they created this new youth-based cyberbullying scheme. So again, not to be proactive monitors of the internet or to do the content moderation job for the companies, but to serve as that safety net. That’s how we started.

We have a very good relationship with the platforms. They get huge volumes of reports. Often the content moderation is outsourced, and a content moderator in another country who doesn’t understand the culture or the context may have 30 seconds to a minute to decide whether or not something violates the company’s terms of service, so they get it wrong. I mean, if you look at the latest Meta Oversight Board reports, something like 1.3 million requests to review decisions were made. They only got to 50, and more than half of those were found to be wrong decisions. So we do that day in and day out, as an objective third party. But we always prefer to use informal means, because that gets the content down more expeditiously. We know the more quickly we get that content down, the more relief we provide to the children.

We also work with schools and parents, because cyberbullying tends to be peer-to-peer. It’s an extension of conflict that is happening within the schoolyard, and it’s, as you know, very visible to young people and their peers but often hidden from teachers and parents, particularly as things start moving to Snap and to DMs and that sort of thing. So we started with that, and then we moved to what the Government asked me to set up: a revenge porn portal. And I know you did some pioneering work at the revenge porn hotline, but my first inclination was no, I’m not going to call it a revenge porn portal. Revenge for what? That can lead to victim blaming. Let’s call it what it is: image-based abuse. We set that up in 2017 and got much more potent powers in 2018. We’ve helped tens of thousands of Australians get intimate images taken down from all over the internet. We’re starting to get reports of deepfake pornography, so that’s starting to be democratized. But one change that’s of huge concern, I think, to hotlines all over the globe, is that we’ve seen a tripling of reports of sexual extortion. And the demographics of those we’re helping have been turned on their head. You know, relationship retribution, the garden-variety revenge porn or non-consensual sharing of intimate images and videos, tends to impact women and girls. But 90% of the reports we’re getting around sexual extortion, which is backed by organized crime, are young men and boys, mostly between the ages of 18 and 24.

Jordan: Yeah. So you’ve talked about so many things that are really, really scary. So for our listeners, the parents, the educators, the everyday users of the internet, what would be some of the top advice that you’d give us in terms of thinking about it? I don’t want everyone listening to go, “Oh, no, there’s such horrors.”

Julie: It is tough. So we’ve done a lot of work around the pedagogy of online safety education and what works and what doesn’t, you know? Scaring people or judging parents tends to lead to amygdala hijack, and people shut down. We also know that doing the same with kids, or just doing one-off presentations, is not going to help them. And I happen to believe that if we just ban devices rather than teaching what I call the four Rs of the digital age — Respect, Responsibility, building digital Resilience, and critical Reasoning skills — and self-regulation of technology use, then we’re not setting kids up for success in the future. I’d say the same thing about AI.

So I just say to parents that you are the front line of defense. We are the ones that tend to hand over the digital device. In Australia, 42% of two-year-olds have access to a digital device, and by the time they’re four years old it’s 94%. So on our website, esafety.gov.au, we’ve got a parents’ guide for under-fives, and the key message to deliver to kids is: be safe, be kind, ask for help, and make good choices. Then, when they get into the primary years, it really is about those four Rs of the digital age, as I said, and about sitting down and signing family tech agreements. We know that kids are much more likely to adhere to family technology-use rules when they are part of the decision-making process. We’ve got a number of those agreements for young families. All of this is free.

The key advice that I give to parents is to really speak early and often to your kids. Keep the lines of communication open. Let kids know they can come to you if anything goes wrong online. A lot of kids are worried about device denial, so they won’t confide in their parents, or they don’t think their parents can help them. But start by asking those questions at the dinner table: we ask our kids about school, we ask them about sport, so ask them what they’re experiencing online. Co-play and co-view. I could never keep up with my kids on Roblox. I’m like, how the heck do they, at six years old, know how to do that? But I want to know who they are playing with, and we set the parental controls. I make sure my kids use technology in open areas of the house, so I can see how they’re reacting to it. So often, when kids are being cyberbullied, you’ll see a visceral reaction. And that also includes, you know, when kids are playing Fortnite or gaming. You might have them wearing headphones, but if you really want to hear what they’re experiencing, make sure they’re in an open area of the house so you can hear it as well.

Laura: Yeah, I love that advice, Julie. As you know, both of us are parents, and Jordan and I talk about these topics all the time, as do many of our other guests. And I think that’s really good advice. It sounds so simple, but these things actually do become quite challenging for people: to be, just as you say, present, to co-play, to co-participate, to role-model as well. I think it’s really, really great advice to just be there with your kids on this journey, exactly as you said, as you would be for everything else they do in life, and to make that as normal a conversation as possible, as opposed to a “let’s sit down and have the talk.”

Jordan: Is there anything that we should have asked about that we didn’t?

Julie: No, I’m sorry if I went all over the place; I was getting excited about this again. I guess I’d just also mention that one of the things we announced in Washington, DC, in November last year was the creation of the Global Online Safety Regulators Network. As a lone regulator, we kind of felt like we were, you know, climbing up a humongous hill at the front of the peloton with no one drafting behind us. Now we do have organizations around the globe joining us, and we believe that we need to work together, that we can learn from each other, share intelligence and information, and make sure that there isn’t a splintered net of regulation, and that what we’re all doing is harnessing the benefits and the good. AI is a perfect example of a use case that could have tremendous power in helping humans with content moderation on the internet, you know, searching out and sweeping out illegal or seriously harmful content. So we want to see those positive use cases, but we also want to minimize the risks. And we need to do that as governments, together.

Laura: Huge congratulations on that initiative, Julie. I think it’s going to be really interesting to see where things go over the next couple of years. And I think we’re all really grateful for the work that you and your team do in leading the charge. Well done! Thank you.

--

Jordan Shapiro

I wrote some books - Father Figure: How to Be a Feminist Dad & The New Childhood: Raising Kids to Thrive in a Connected World. I teach at Temple University.