Keeping the Metaverse Safe for Kids with Julie Inman Grant
Interview with Julie Inman Grant
Commissioner Inman Grant was the world's first eSafety Commissioner and has had many successes in her more than five years in the role. We talk to Julie about the challenges of keeping the Metaverse safe for our children and for everyone. Plus, Julie gives more general advice on some of the issues parents need to think about when it comes to raising kids in a digital age.
Transcript
Nick Abrahams:
Ladies and gentlemen, welcome back, and I am delighted to be interviewing today someone who I've got an enormous amount of admiration for. It's Australia's eSafety Commissioner, Julie Inman Grant. Julie, welcome to the show.
Julie Inman Grant:
Thank you so much for having me. Not a lot of people say that. I either have too much power or not enough.
Nick Abrahams:
Well, I feel like we've known each other over many years, and I know that eSafety has been a very important topic in your life and has driven you to the role of eSafety Commissioner, which you've now held for over five years, I think. Can you give us a bit of a sense, what is the role of the eSafety Commissioner?
Julie Inman Grant:
Sure. That's a great question, and I think you're right. After 22 years in the industry, I like to say that I started out at tech policy ground zero back in Washington DC with Microsoft in the mid 1990s, when the Communications Decency Act was being considered and the first White House summit on online safety was being held. Things have changed a lot, but things have also stayed the same. It's been quite an honor, as Australia was the first country in the world to set up a national online safety regulator like the eSafety Commissioner, and of course we started as the Children's eSafety Commissioner.
Nick Abrahams:
That’s right.
Julie Inman Grant:
Working to prevent online harms and then protecting Australians from a range of online harms through a set of schemes and programs that have been layered on over time. And I guess the signature that I really brought to the regulatory sphere is asking how we stop just playing that game of whack-a-mole, or whack-a-troll if you prefer. We've got prevention, we've got protection, but it's the proactive change piece that I'm also really interested in: how do we minimize the threat surface for the future, and how do we shift more responsibility back onto the platforms themselves?
Just as legislators all around the world did more than 50 years ago when it came to cars and embedding seat belts. How do we get them to assess risk upfront and build and embed those protections in at the get-go, rather than retrofitting after the damage has been done? Which of course we call safety by design. It's also about understanding tech trends and challenges: how we might harness these new technologies for good, but also how we minimize the risks to the public now and down the track, and communicate what's coming in a way that's accessible to them.
Nick Abrahams:
It's an extraordinary mandate. It seems so broad and all-encompassing, particularly as technology expands our universes. Maybe just on the commission itself, can you give us a bit of a sense of how big it is? How many people? What divisions or business units do you have within the commission?
Julie Inman Grant:
Sure, sure. Yeah. When I walked in the door five and a half years ago, there were 35 people. I'd come from 22 years in the tech sector and from flat hierarchies. I had to do a crash course in the Australian Public Service. And I knew that we had to create much more of a hybrid culture, because technology is always going to outpace policy and regulation. We had to figure out a way to bring in an innovative mindset where we could pivot and anticipate risk rather than just responding to it as it happened. We're now about 200 strong. We've already had our regulation totally reformed. The Online Safety Act of 2021 came into play on January 24th of this year. Beyond having that remit around prevention, protection and proactive change, and I could talk for five to 10 minutes about each of those areas, essentially prevention is about establishing an evidence base that hasn't existed before.
Fundamental research, but also harnessing the insights and intelligence that we get from our threat trends and from our complaint schemes, and making sure that we're developing content that's really audience focused. We started with building prevention materials for kids. Then we had to move very quickly to materials that serve all Australians, and now we're really focused on the most vulnerable or at-risk Australians. We've got a whole new team around diversity and inclusion and vulnerable communities. That's work in and of itself, and actually getting Australians to look at the content, to engage with it, to apply it to their daily technology use and to enable that behavioral change over time is the biggest challenge. We've actually had some success there now that we've had … We have about seven years of runway, and we found that when I came in, about 50% of kids would talk to their parent about something that had gone wrong or would report to a social media site or use conversation controls. We now know that more than 90% of kids will use blocking, muting and reporting tools, and about 70%-
Nick Abrahams:
90% of kids will? That’s amazing. Okay.
Julie Inman Grant:
Yep. And about 70% are now talking to their parents when something goes wrong, something like cyber bullying. Interestingly, we interviewed their parents at the same time and only 50% of the parents recalled having that conversation when a child confided in them. And I think that's part of this generational divide that we're experiencing. I think I told you earlier, about 95% of Australian parents have told us that they see online safety as their most challenging parental problem, but only 10% will actually go and seek out information before something goes wrong. I think there's also this generational divide or this digital disconnect that exists because people of our generation kind of compartmentalize what happens to us online, whereas young people today, their online lives are their lives. They're totally interconnected. We know with youth-based cyber bullying, for instance, almost all cyber bullying is an extension of conflict that's already happening within the school gates.
It's peer to peer, so it does become more invasive and more pervasive. And so you can't just compartmentalize harm or bullying that's happening online versus your real life. There are a lot of cultural and societal factors at play here. That's prevention, and we've talked a little bit about protection and proactive change. The latest branch that I stood up is around tech trends, challenges and futures, developing industry engagement and enablement programs like Safety by Design, doing it with industry rather than to them, and surfacing innovative best practice so that, for some of these more wicked or intractable technology issues, we can show other companies how innovation can be used to tackle things like recidivism or detection of various forms of illegal content.
A lot of innovation can be happening there. We've got about 43 investigators, which is great from where we were, but we've got 43 investigators for about five different schemes: youth cyber bullying; image-based abuse, which is the non-consensual sharing of intimate images and videos; abhorrent violent material; illegal content, including child sexual abuse material and anything instructing in terrorism or crime; and then a new adult cyber abuse scheme. Layered onto that are two new systemic reforms that are really meant to lift safety standards at scale. One of those is a co-regulatory scheme around industry codes. Now, the industry has just finished doing a public consultation around those codes. The first tranche of codes deals with what companies across eight different sectors of the technology industry will be doing to proactively detect, remove, and basically prevent access to illegal content, what we call class one content, so mostly child sexual abuse material and pro-terrorist content. The industry will present the codes to me sometime in the mid-November timeframe, and then I'll have to make an assessment as to whether or not I think they meet appropriate community standards, lift safety standards and go beyond the status quo. I'm hoping for the best, that we're able to register those codes and move on to the second tranche, which will be about protecting under-18s from access to pornography. But if not, there's a provision that will enable me to make standards. And then there's something…
Nick Abrahams:
There's a lot. And I notice in what you've been talking about, it feels very much about the Web2 world, isn't it? You've had tremendous success, whether it's with getting the big social media platforms to behave more responsibly and so forth. And you talk about platform operators. I guess now we're headed into this Web3 world, which brings with it the concept of decentralization and so forth, and, obviously, the metaverse. I was just wondering, do you have your own thoughts around what constitutes the metaverse? What actually do people mean when they talk about the metaverse?
Julie Inman Grant:
I think if anybody tells you that they know exactly what the metaverse will look like, they shouldn’t be believed. If the major platforms do achieve a degree of interoperability, then the metaverse could become one extended 3D world that we want to escape into. Or it could look more like a multiverse with a range of walled garden offerings. It would be great if your kids go to Legoland and then 3D Disneyland, but Sexyland might be right next door, who knows?
Nick Abrahams:
Right. Of course, yeah.
Julie Inman Grant:
So what I think is really interesting about both Web3 and the metaverse. Web3 first: you're talking about crypto and NFTs and blockchain. Essentially I think of it as a philosophy as much as a technology paradigm shift. It's around disintermediating internet gatekeepers, whether they're the big technology players or banking and financial institutions or regulators like us or law enforcement. But what nobody who's architecting this new world or advocating for this new world can tell me is how harm will be remediated. I've asked a bunch of blockchain specialists, if child sexual abuse and online harassment can be held in a public ledger and it's immutable, does that mean it exists forever? What does this mean for notice and takedown? Probably nothing. Another question I've asked is, well, how do you root out bad actors in this world if everything is decentralized and nobody's really accountable or responsible? And I get, “Well, the community will root out the bad actors.” And I'm like, “Oh, yeah, that's working really well today, isn't it?”
Nick Abrahams:
There's like a techno-utopian dream, and that's very much a philosophy. And I know on social media, I get caught a bit by people who don't like it when I mention… I might mention Meta or something in a post about what Meta is doing, and the true decentralization zealots are like, “Meta has no business being in our world.” So it's this very strong view, but yet you're quite right. That immutability of the blockchain means that we've got a real problem with, as you said, content and other things existing there. And then there's the very idea that the community, through DAOs and so forth, can effectively enforce such things. As we've seen, most organizations that describe themselves as DAOs aren't really DAOs in the sense of everyone voting on everything. It's gone into almost a corporatized structure with governance like boards and so forth. So it feels like the true decentralized proposition will struggle over time. We'll see more centralized control, which maybe means that there will be people, or we can find organizations or individuals, who can be responsible for things. But for a regulator, very difficult.
Julie Inman Grant:
Yeah, and I'm under no illusions we'll be regulating that part of the world. But of course, we are seeing crypto and Bitcoin as a huge feature in a lot of the sexual extortion cases that we're seeing. We're seeing crypto, and we've been seeing this for years, being used to hide child sexual abuse hosting mechanisms or to enable the transactions in this kind of content between pedophiles. But what really scares me, and I'm glad you mentioned that techno-utopian dream, because I remember wearing those utopia-colored glasses myself in the mid 1990s.
Nick Abrahams:
I think we all did.
Julie Inman Grant:
And what I'm scared about is that they're going to be coming to you in an Oculus headset or Snap or Apple AR glasses. And this is what's really freaky, I think, about the metaverse. In our tech trends and challenges work, I think we put this forward about 18 months ago. It wasn't called the metaverse then; we referred to it as immersive technologies. But we actually predicted hyper-realistic online sexual assaults. And lo and behold, Nina Jane Patel had that terrible experience. But right now you can do things like webcasting, I suppose.
But your kids are going to be having these glasses on their faces. Not only will they not be able to get dates, but you won't be able to see what they're seeing, and they won't be able to unsee it. So it's not as much about content as it is about conduct. And of course the whole idea of the metaverse is to blur what's real and what's not. When you're talking about hyper-realism, you're probably talking about whole new forms of sex tech and teledildonics and haptics and wearables, and all of these things could actually have so many incredible uses. I think about how I would love to send my kids to ancient Rome in the metaverse so they could experience the sights, the smells and the sounds of the gladiators in the Colosseum.
But I do worry about my kids wandering into online strip clubs. The Center for Countering Digital Hate did a little experiment in Horizon Worlds and found that once every six minutes an avatar posing as a kid was propositioned by an avatar that was ostensibly an adult. So particularly in private spaces, how are we going to protect people from those kinds of harms? And I don't think the answer is regulating now, because what would we be regulating? We apply regulation after the damage has been done. But this is where I think safety by design is critical. We need the architects of Web3 and the metaverse to be thinking about the risks now, to be building in the safety protections, preventing the misuse and weaponization of these technologies. But we still tend to hurtle towards the exciting and the sexy rather than stopping to think about what we want these new online worlds to actually look like and how we want them to function.
Nick Abrahams:
Yeah. And is your office getting complaints and issues around these more immersive technologies now? Is it coming from, I guess, the gaming world, whether it's the Fortnites and the Robloxes? Is it actually happening now, or is it sort of still just simmering under the surface?
Julie Inman Grant:
Well, I think it is probably the gaming companies that are going to have the edge in the metaverse. On one hand, the online gaming industry really understands this; they have the experience of Gamergate, and they know that there are actually more choices, when you think about it, in the online gaming space. So people can vote with their feet if an environment is too toxic or too hostile, whether it's misogynist or racist abuse or otherwise. But the whole idea of a comprehensive metaverse is also going to be dependent on identity that travels, I suppose, with your avatar.
The gaming industry has already been thinking about that kind of interoperability. You're seeing some of the companies really deploying some innovative tools to try and make sure that their platforms are safer. But we still see and hear stuff all the time about kids walking into sex clubs in MeepCity. Well, of course, there was the Knox Grammar situation with Discord, and we're actually seeing a lot more online abuse, sharing of violent extremist content, and child sexual abuse happening over the chat functions in online gaming worlds.
One of the things we tell parents, particularly when they have their kids playing games like Fortnite, where your teenager is paired with 99 other people who could be adult strangers or other foul-mouthed teenagers, is that you should have them playing these games in open areas of the home and probably not wearing headphones, if you really want to hear and see what they're experiencing. Of course, you could take it a step further and co-play and co-view, but some of these environments really aren't fit for purpose for young people. Frankly, what we saw over the pandemic was parents being a lot more permissive with technology use. So now we have all these kids who are 8, 9, and 10 years old coming back to school with their own smartphones, on TikTok, on Snap, on BeReal, on all these platforms that they're really not cognitively and developmentally ready for.
Nick Abrahams:
Right. Very interesting. I mean, I guess for your organization, how are you planning for the future? You've mentioned that you've got a technology trends group, which sounds very forward thinking. How do you see the eSafety Commissioner's role and the organization's role? Because in many respects, if we move into an immersive world, it's almost like Julie Inman Grant becomes our major police presence, because you're in a fully immersive world. Who do we complain to if we get attacked in one of those worlds? How does that all roll out?
Julie Inman Grant:
Well, listen, we're trying to take one step at a time, and it also depends on what the government of the day wants to see us become and enables us to become. One of the things we're doing right now is developing a whole new regulatory operations model, so that we can rethink everything that we do and make sure that our codes and our Basic Online Safety Expectations, which is our key transparency tool, work together. You may have seen that I issued seven legal notices to Microsoft, Skype, Snap, Meta, WhatsApp, Omegle, and a few others. That's just the first tranche, to really get transparent information about what tools they were using to detect grooming and child sexual abuse images and videos. We're working through that process now so that we can see more radical transparency.
I'm going to be using those tools in very different ways, but I want to make sure that our robust systemic regulatory tools, the codes and the Basic Online Safety Expectations, are sitting together and co-located with our investigative team, so that all those insights are working together, and everything that we're gathering from all of this is informing our research and our data and ultimately our education and prevention initiatives.
I think the challenge has been, and I guess I took this from my 22 years in tech, you have to keep moving forward. You have to keep innovating; you have to keep anticipating, and make sure that we're really nimble and can pivot to new threats and trends. That's how I've tried to set us up to the best of my ability. That isn't always easy in a government construct, but I'm doing the best that I can so that we can tackle the challenges that we have now and really harness the insights and learnings and put them towards keeping people safer in the future.
Nick Abrahams:
I mean, you've made an extraordinary impact in your time in the role. You've mentioned Safety by Design a few times, which, until you and your organization started talking about it, I had not been aware of, though obviously I'd heard of Privacy by Design. Can you just give us a little bit of a sense of what it means for organizations that might be looking to deploy, whether it's in the metaverse or indeed in Web2? What does it mean to say Safety by Design?
Julie Inman Grant:
Well, thank you for asking. I actually brought the concept of Safety by Design to Microsoft over 10 years ago when I was their head of global privacy and safety policy and outreach. I was sitting in product design assessments where we were doing risk assessments around accessibility, around security vulnerabilities, and around potential privacy breaches, and I kept saying, “Well, what about personal harms?” I kind of got the little eye roll, like, “Oh, we're going to become an enterprise company, Julie. We're not going to be a social media site.” But I was like, “Hey, Skype is being used as a vector for live-streamed child sexual abuse.” Xbox at the time was pretty toxic. They've done a really good job at moving that forward.
But I guess I figured that I couldn't just be a safety antagonist on the inside, and becoming a poacher turned gamekeeper really did give me the imprimatur to say, “Hey, really, how do we shape the future? If we're not building fundamentally safer products, platforms, and services, there's no way we're going to regulate our way out of online harms. We have to put that responsibility back onto the platforms.” But I also knew we had to do this with industry rather than to industry. So four years ago we sat down with major players in the industry and we worked through what a principles-based framework would look like, with actions underneath them.
So the three key pillars are transparency and accountability; user empowerment and autonomy; and service provider responsibility. We had a lot of the major players sign off on that, but I very much felt that principles are only useful if they're implemented. I think you would probably agree that there are so many principles-based frameworks out there that we all feel a little bit dizzy. So we spent another 18 months and consulted with 180 different organizations to build some risk assessment tools.
So we've got Safety by Design risk assessment tools, one built for startups and one for mid-tier and enterprise companies, and they're free to use. They've been downloaded by companies in 46 countries. But you'll see that they're peppered with really great insights and inputs from a range of companies, from Google to Yubo, Snap to Nextdoor, in terms of how they dealt with issues, really surfacing best practice. This isn't all about being punitive; this is about surfacing innovations. We used to talk a lot in the industry about co-opetition. All the companies can compete on these things, but the area where we should be cooperating is absolutely across safety and sharing how we're making our platforms safer. So we're taking this to the universities. We're taking this to the VCs and investment firms. Often, the VCs are the adults in the room. So we've developed some due diligence clauses. We've developed checklists that they can take to Series A funding: “Hey, have you thought about this? We don't need to have any more cautionary tales or tech wrecks.” We haven't really seen the VCs and investment companies take this on at scale, but maybe you can help us there, Nick.
Nick Abrahams:
Well, I hope so. If there are any VCs listening, it's definitely worth getting across that. In fact, I was working on a project the other day and the question came up: what should we do around designing this? And I was happily able to point that organization to the Safety by Design materials that you have. Because particularly around things like AI and so forth, we are getting into some space which throws up a lot of ethical questions, almost more than legal questions. And so to have a framework like that and a detailed tool is very helpful. So yeah, certainly, you actually got me out of a very difficult spot with that. It worked out very well.
Julie Inman Grant:
Well, thank you. What's really exciting is that, if you look at the new regulators and the regulation that's popping up around the world, in the U.K. with the Online Safety Bill, in Europe with the Digital Services Act, in Canada, in New Zealand, safety by design is now a feature. So for a long time I felt like I was yelling, “Safety by design!” in an open tavern.
Nick Abrahams:
Right.
Julie Inman Grant:
I like to think of it as my greatest hits, but it is starting to catch on, I think, because it makes sense. It just hasn’t been prioritized vis-a-vis security and privacy up until now.
Nick Abrahams:
Yeah. And we will start to see that more and more, as we have frankly with privacy. That was a slow burn for many, many years, and then it obviously became a critical issue. And then cyber similarly, with recent hacks, et cetera, so now that's front of mind. So yeah, I think particularly as we see the more immersive-style platforms come out, safety by design is going to be critical. Just one final question, and this really gets to, not Web3 per se, but that staggering statistic you mentioned: 95% of parents say that dealing with their kids' online behavior is their hardest issue, yet only 10% have actually sought out information to help them with that. Maybe we could help with the disparity in those metrics. What advice do you have, or what tools and resources do you recommend, for parents who are struggling with this issue?
Julie Inman Grant:
Well, listen, I don't think we have to become technological wizards to be able to talk to our kids about staying safer online. We have the judgment, the experience, and the maturity, and we really need to prepare them for the online world. So I don't think device denial, which is a place a lot of parents go, is the right approach, because you can't build digital resilience if you're not allowing your kids to have some of these experiences. It's not a matter of if, but when. But we do have to provide that protective scaffolding to make sure that they're not falling prey to some of the more serious crimes that we see. So some really basic stuff, and I started doing this over the pandemic, so I'm trying to eat my own dog food, as they say, having three kids. We ask our kids what's happening at school, what's happening in sport, what's happening with their friendship groups. We should ask them what's happening online as well. Have them walk us through it: “Are you on Snap? Are you on BeReal?”
That's really for teens. We know that 94% of four-year-olds have access to a digital device, so we need to start by setting parental controls and privacy settings on by default at the highest levels. We've got guidance for parents of under-fives: be safe, be kind, ask for help, make good choices. So we have to start early. When we move into the primary years, it's about the four Rs of the digital age: respect, responsibility, building digital resilience, and critical reasoning skills. I often tell my kids to question everything. Is this a fake account? Is this true? Is this misinformation? Is this fake news? So question everything, honing those judgment and digital literacy skills. And set boundaries up front about how much time they can spend online. We've got a couple of technology agreements up there on the website.
Nick Abrahams:
Okay.
Julie Inman Grant:
And we know that if kids actually sit down with their parents and help design the rules, they're more likely to stick to them. You can do yourself a favor as a parent if you set those parameters early and often. Then you're not going to have to deal with the techno-tantrums that are likely to ensue.
Nick Abrahams:
I feel like these are better than the dog agreements: if we get a dog, I'll look after it. I have heard of people having much better luck with the technology agreements, where you can actually rely on them if you discuss them early on. And we've done that with our kids. We didn't necessarily reduce it to writing, but there was that really clear understanding: if you get this particular device, here are the expectations around it.
Julie Inman Grant:
Yeah. And really basic things. I mean, I've got a teenager. I try to have the kids using technology in open areas of the home so I can see what they're doing and nothing happens behind closed doors. And sorry to scare parents, but what our investigators are seeing more and more of is self-produced child sexual abuse material that is often coerced in the privacy of bedrooms and bathrooms. So you need to know that. Co-play and co-view: you want to know what your kid's experiencing on Fortnite, so maybe have them take off the headphones and play a few matches with them. Be engaged in their online lives the way you are in their everyday lives, and leave conversations open so that kids know they can come to you and that you'll help them through it.
Go to esafety.gov.au to report any form of abuse, but also to get information. You can spend some time walking your kids through some of the materials, or even educating yourself. Be one of the 10% who seek out that information before something goes wrong online, and maybe we can get that number higher.
Nick Abrahams:
Fantastic. Julie, I know you are incredibly passionate about this topic, and it really comes through in the groundbreaking work that you and your team have done. So really, on behalf of, I guess, all of Australia, I thank you and your team, because the sorts of things that you have to deal with on a daily basis are the very worst of human behavior, and you and the team have done a phenomenal job of trying to find a pathway through the darkness. So thank you very much for that, and we wish you all the very best into the future, whatever that Metaverse may hold. Thank you very much, Julie Inman Grant.
Julie Inman Grant:
Thank you so much, Nick.