Start the Conversation Episode 5 Transcript

Toby [00:00:04] It goes to show that much of cyberbullying behaviour is driven by what's happening in the school environment, and that's particularly acute in regional and rural areas. The best equipped families are those that are hardened against the threats. Doubt everything: unless you have had some objective verification of this person's identity, assume that they are not who they say they are. That ability to exercise scepticism and critical reasoning, you know, that's a great way to not become a victim.


Simon intro [00:00:37] The violent extremism landscape is fluid and complex, and it can be difficult to navigate. This podcast series has been developed as a means of providing listeners with some thought-provoking topics within this context: personal insights and journeys, as well as helpful information that could assist someone who is vulnerable to being involved in violent extremism. The Engagement and Support Unit's services focus on early intervention, awareness and resilience against violent extremism. They consult with and support the local community with this information to help mitigate the drivers of violent extremism and raise awareness of the complex factors and vulnerabilities that contribute to these ideologies. Before we begin, we would like to acknowledge the traditional custodians of the lands and airways on which we are meeting and broadcasting today. As we share our learning, we also pay respects to Elders past and present. It is their knowledge and experiences that hold the key to the success of our future generations and promote our connection to Country and community. Please note that views expressed are not necessarily representative of the New South Wales Government. Episodes may contain depictions of violence or sensitive topics that some people may find distressing. For further information, please view our episode notes.


Heather [00:01:56] Hello, I'm Heather Jackson, Director of the New South Wales Countering Violent Extremism Engagement Support Unit.


Rebecca [00:02:03] And I'm Rebecca Shaw, Communications and Community Engagement Manager. And this is Start the Conversation. The Internet has arguably become one of the greatest advancements of modern times. People around the world use it for work, research, communication, and for sharing images and videos. But within the millions of images, videos, apps, games and chat rooms being circulated and housed on computers across the globe, there are vulnerable children and teenagers being exposed to channels of information that are often quite graphic and unfiltered. Some of it can lead to exploitation, or, in the case of the young people we work with at the Engagement and Support Program, help them down the pathway to a violent extremist setting. The responsibility for protecting kids is community wide. It starts with parents, carers and teachers, but everyone has a role to play in keeping children safe from harm online. The internet and online social presence is the norm. Both adults and children are glued to their phones. It is the main pathway for communicating with family and friends, and the space changes all the time. We do know that a lack of understanding about online safety and monitoring can result in children being exposed to content we would otherwise not allow. There's a fine balance there for parents to get that right, and we have to be tuned into it. Today we are chatting with Toby Dagg from eSafety. Their office keeps tabs on suspicious activity in the online space and helps protect people online. It's Australia's first and only regulatory body of its kind. Toby is General Manager of the Regulatory Operations Group at the eSafety Commissioner. In this role, he is responsible for industry regulation efforts, civil investigations into online harms, and education and prevention programs. Prior to taking on his current position, Toby served as the organisation's Chief Operating Officer and was the founding Executive Manager of eSafety's Investigations Branch.
Between 2019 and 2022, Toby served as Vice President of the International Association of Internet Hotlines, providing strategic leadership to global hotlines involved in the fight against online child sexual abuse material. Through various committees, he advises organisations within the banking, law enforcement and not-for-profit sectors about online harms regulation. Prior to joining eSafety, Toby served as a detective with the New South Wales Police Force. In this episode, reference is made to illegal content relating to child exploitation. Some listeners may find this distressing. So welcome, Toby. Thank you so much for joining us today.


Heather [00:04:39] Welcome, Toby.


Toby [00:04:39] It's great to be here. Thank you.


Rebecca [00:04:41] Toby, can you tell us a little bit about your background and also how eSafety works and with what other agencies you collaborate and share information?


Toby [00:04:48] Sure. So I joined the New South Wales Police in 2003 and went into criminal investigation as quickly as I could, mainly in the armed robbery area in a specialist unit, working very closely with State Crime Command. So I developed this real appreciation for the concept of targeted hardening, and that's an approach I think I've taken to my work. I think, probably anticipating a bit of where we're going to go, Bec, the best equipped families are those that are hardened against the threats. And that hardening comes about as a result of education and preparation. So after I left the police force, after about 7 or 8 years, I actually went into juvenile justice and worked in one of the programs associated with the Keep Them Safe reforms. And so I developed a great affinity for and appreciation of the work that you all do. So thank you. I was then fortunate enough to snag a job with the Australian Communications and Media Authority as an online investigator, and I've always been a huge tech nerd. I was playing computer games before I started school: Pong, then Tanks, and on and on from there. So I've always had a real love for technology, and the intersection of technology and crime was something that I found very compelling. I focussed on child abuse material and have done so more or less ever since. The way that we work with other agencies is pretty extensive. We've got memorandums of understanding with all of the Australian police forces, including the Federal Police, and we have particular touchpoints with the Australian Centre to Counter Child Exploitation, which is up in Brisbane and does tremendous work focusing on the threat of grooming and the threat of livestream offences. They receive all of the reports from the US National Center for Missing and Exploited Children, who do terrific work, so we refer a lot of matters that concern under-18s, especially in relation to sexual extortion, to the Federal Police for investigation. But we also work with the Joint Policing Cybercrime Coordination Centre, the JPC3, in Sydney, and so we have touchpoints into the broader cybercrime portfolio as well. We do a lot of work with other government agencies, particularly federal government agencies, including in the TVEC space, and we work very closely with civil society as well, which is a terrific way of amplifying and scaling some of our prevention and education messaging. So just an example, a quick case study of that cooperation: it concerns a recent matter where we had a notification from a state police force that, in investigating alleged terrorism offences concerning a young person, they discovered that this young person had posted a manifesto to several locations online outlining attack methodology, justification and rationale, taking great inspiration from the Great Replacement manifesto, which was authored, of course, by the Christchurch attacker. We were able to identify additional locations where that manifesto was hosted and took removal action to have it taken down, and we were able to do that within about eight hours, which was really quick.


Heather [00:07:34] And so with all of this broad experience and all of the outreach that you have, what qualities do you think make for a good online investigator? I'm just sort of thinking it's a minefield to really be able to identify and touch upon it and find those leads. So how do you adopt that critical thinking and targeting process that you're talking about?


Toby [00:07:55] I think, Heather, the best investigators are those who are driven by an absolutely insatiable sense of curiosity and a very high degree of attention to detail and appreciation for the importance of the minutiae. That then needs to be twinned with an understanding of technology. It's hard to teach the former; we can teach the latter. So if we find people who are already inclined towards pursuing questions until they reach the bedrock answer, then we can teach them about how the internet works. I don't think you can create a good online investigator if you don't have that insatiable curiosity to ask and answer the question: what is going on? So if we're able to find that, great. We're building out a pretty extensive investigations capability at eSafety. We've matured enormously since Covid and we're now about 45 or 50 strong in the investigations branch. The other thing, too, is that we try to diversify as much as possible, so we're not all drawn from law enforcement or from the intelligence services. We take people from the banking sector, from the not-for-profit sector, and as long as they show those essential characteristics, you know, they've got a good chance of getting in.


Rebecca [00:09:09] This isn't a question so much as a statement: my mind's going back to your case study about how it took eight hours to remove the manifesto. I was just thinking about John Horgan's chat with us about manifestos. It takes, I think, an hour and 40 minutes from when the manifesto is posted before the act takes place. I think we need to get that time down. Yeah, I know, that's just where my mind went with that stat. But to set the scene a little for our listeners, and especially parents, can you give us some up-to-date stats on removal requests or reports of image-based abuse? So, like, sharing of illicit media or bullying-related content?


Toby [00:09:46] Sure. So we operate four schemes at eSafety. They're all set out under our legislation, which is called the Online Safety Act. Those schemes deal with: illegal and restricted content, like child abuse material and terrorist and violent extremist content, so if I say TVEC, that's what I'm referring to; image-based abuse, which is the non-consensual sharing of intimate images, and that includes sextortion, or sexual extortion, which I can talk more about; child cyberbullying; and then the fourth scheme is adult cyber abuse. And I think it's fair to say that all of our schemes see pretty dramatic increases quarter by quarter and financial year on financial year. So for example, in relation to the illegal and restricted content team, last full quarter, so that's the first quarter of this financial year, we had complaints about around 12,500 URLs. That's a 41% increase on the same time last year. So that brings us to a total for this year, if we're going along those projected trend lines, of about 50,000 URLs reported to us just through that scheme, and 85% of that content is child abuse material. We've also seen, for the last financial year in terms of TVEC, a 230% increase. So terrorist and violent extremist content still constitutes a very small percentage overall of the reports that we receive, but that is increasing substantially. I mean, a 230% increase by any measure is huge.


Rebecca [00:11:08] Has that particular stat increased in light of some recent sort of events?


Toby [00:11:13] I think if we look back at this financial year, we'll see that that is the case. We were obviously very heavily involved in the response to the Wakeley livestreamed stabbing, and the degree to which the eSafety Commissioner became part of the discourse around the significance of that event was really quite substantial. That definitely increased public awareness of our role, which was terrific. So I think the short answer to your question is yes, we'll see an inflection point when we go back and look at the data from sort of April, May, June, as that matter worked its way through the Federal Court. But we've seen big increases in other areas as well. We're now dealing with about a third more cyberbullying complaints than we did over the last financial year. We're dealing with really complex matters through the cyberbullying scheme as well, cross-platform matters that involve whole school communities, which in regional and rural areas can be really challenging because of the way that the school or the schools have an influence on the social fabric of the community. We're seeing an increase in the number of complaints, too, through the cyberbullying scheme that deal with suicide and self-harm; so this is a cyberbullying perpetrator encouraging a person to self-harm or suicide. Even though those complaints still remain, fortunately, a fairly small percentage, we've seen about an 84% increase in the number of those kinds of complaints, which, in light of some really tragic circumstances over the last four to six weeks of young people taking their lives, not necessarily because of cyberbullying but with cyberbullying as part of the background context, really worries us.


Heather [00:12:50] And do you think, with those massive increases in that reporting, it's because there's greater awareness? Or do you think that there is proportionately a growth in that space and therefore more people reporting? What do you think is the contributor, or one of them?


Toby [00:13:08] That is the perennial question that we ask ourselves, and I think the answer is probably a bit of both. Certainly there's a greater awareness of eSafety's role in relation to the harms that befall children, that are targeted towards children. But I think there's also an increase in incidents. We need to be really careful, though, and I will just caveat that by saying it's a bit of a guess, because the scheme actually requires, in relation to cyberbullying at least, for a person to have first complained to the social media service before they come to us; we're a safety net in relation to cyberbullying and adult cyber abuse. That's not the case with image-based abuse: people come to us immediately, because there's a general prohibition in the Act against sharing material that is intimate without consent. The ability for us to hazard a guess as to incidents is based in part on our understanding of how this is affecting the community through media reporting too, rather than just information that we're getting through the complaints scheme. But if severity is any indication of an increase in incidents, then yes, we're certainly seeing growth in that area.


Heather [00:14:07] And you touched on Wakeley there, which I agree really put the eSafety Commissioner and the role that you do under the spotlight, which can be a good thing coming from a challenging incident. And we've seen some horrific events being livestreamed, both domestically and internationally. And it's a natural question for the community or parents to ask eSafety: why can't you just take it down? Why are you letting our kids see this?


Toby [00:14:33] Well, that was very much what drove us to respond to Wakeley in the way that we did in the immediate aftermath of the livestreamed attack, which at that point hadn't been declared by the New South Wales Police Commissioner as a suspected terror attack. We were already working with the social media platforms from about 8:30 that night to highlight the fact that we were seeing this material being propagated across platforms, particularly X, and to ensure that they were alerted to what we were seeing, where we were seeing it and the kinds of features that accompanied that distribution. The following morning was when we learned that that declaration had been made by Commissioner Webb. And that then led us to a position of saying, well, objectively, that increases the seriousness of the incident that we're dealing with. And from our assessment of the Online Safety Act, we were satisfied, and the eSafety Commissioner was satisfied, that issuing notices to the platforms where the material was being distributed was the only prudent course of action available to us, for two purposes mainly. One, because we were really concerned about the extent to which young people using those services were likely to encounter the material. Anyone who was using X at that time would have seen that videos of the livestream were immediately available, as soon as you opened the app, in everybody's feed. And I heard from lots and lots of parents in the ensuing days about the fact that they had encountered this material without any expectation or preparation. The other was more connected to social cohesion as an outcome. I mean, we're really alive to the fact that the kinds of social tensions that led to the riot outside the church, a riot in which New South Wales police officers were injured, some of them seriously,
those tensions were still very much present within the community, and we saw it as incumbent on the platforms to take steps to suppress the availability of the material so that those tensions could start to be eased. So we were working on sort of two axes: one in relation to young people, one in relation to those broader social cohesion objectives. So the answer to that question, why can't you just take it down, really goes to the way that these platforms are designed and operate. It is so easy to propagate the material without any effort. And one of the reasons we were concerned about X at the time is because the proliferation of the material seemed to be growing at a faster rate than on other platforms, which is why we prioritised that platform; it was certainly more available on that platform than others, based on our assessment on the Tuesday. And so part of the answer, too, involves us overseeing regulatory mechanisms that are intended not just to achieve takedown, but to place the onus back on service providers to deal with the availability of that material online, to take proactive steps. For example, in relation to child abuse material, there are codes in operation now that are enforceable, under which Tier 1 social media services, the largest social media services and those at higher risk based on the fact that that's where Australians are, need to take proactive steps to remove known child abuse material. So we're really well and truly across the threshold now of an era in Australian online content regulation where we're working at two ends of the problem: one, point application of power to achieve takedown in acute circumstances; and two, the creation of this structural response, which places the onus back on the platforms to do the right thing.


Rebecca [00:18:01] Safety by design.


Toby [00:18:02] Safety by design, transparency, enforceable codes and standards. I mean, that's the way that we can shift the nature of the platforms, their design, their objectives, their incentives and create durable change that lifts standards rather than has us always working at the top of the mountain trying to chip our way down.


Rebecca [00:18:21] Who's enforcing that? The safety by design mechanisms on the platforms.


Toby [00:18:25] So safety by design itself isn't enforceable, but I think it's true to say that that's the ethos that underpins a lot of what eSafety is trying to do. It's been very strongly championed by Julie Inman Grant, the Commissioner, and I was at a conference a couple of years ago overseas where she was named the queen of safety by design, which I think is very well deserved. It's very much her passion for safety by design that has driven that discourse and driven the change. And we're seeing companies now overtly adopt safety by design as a principle of action, which is great. But the codes and standards are up to us to enforce, so we assess compliance. There's a staged approach to those being brought online: at the end of this year we'll have codes and standards in place and enforceable that deal with that upper edge of material, child abuse material. And then over time we'll see the phase two codes come online, which deal with children's access to, say, pornographic material. That's got a bit of a longer tail to it, and we'll see where that heads. But we're having very productive conversations with the industry.


Rebecca [00:19:29] We've been talking quite generally, I guess, about the content, but for any listeners wondering specifically what it might look like: what is illegal or restricted content? What does that include, and what are the motives of the perpetrators using this content?


Toby [00:19:43] Something I've given a lot of thought to.


Rebecca [00:19:45] It's not an easy question.


Toby [00:19:47] And I think for any of us who've been working in this area for any length of time, you always remember the first moment that you see child abuse material; it just becomes indelibly burned into your brain. So we're really talking about that kind of content, Bec. Images and videos of children being sexually abused, being physically abused. We're talking about material that advocates the doing of a terrorist act, so manifestos are a good example of that. We're talking about material that depicts attempted murder, violent terrorism. They're all contained within the illegal component of restricted and illegal content. When it comes to restricted content, we're talking about material that is typically inappropriate for children to see. So that's pornography, real violence, certain material that might be fictional in nature but is detailed enough for it to be totally unsuitable for children, inappropriate for children. In terms of what motivates people to produce and consume that upper edge of material: there are very large and well-developed offender communities online that are responsible for the creation of what is often referred to as first generation material, that is, new child abuse material. And the reason I differentiate new versus known child abuse material is that the known material is well vetted by organisations like the US National Center for Missing and Exploited Children, which reduces that material to a specific fingerprint. And that fingerprint is then shared with companies like Meta, for example, and others working in the online space so that they can proactively detect the presence of that content on their networks. It's a very effective way of limiting the distribution of known child abuse material.
Offenders in these communities are abusing children in their care, and typically the transmission chain is this: the material is produced through the abuse of a child, uploaded to a file hosting service, with that link then shared on a network like the Tor network, which is an encrypted, hidden network. And then it makes its way onto the internet through secondary distribution channels. What often motivates those offenders is not money, although there are some who sell the material, but notoriety and reputation. And it's interesting: there's a very strong sense in the community, one that contributes to the kinds of cognitive distortions that justify the abuse of children, that says this is not your problem, it's society's problem. Society's got the issue, not you; that society doesn't recognise, you know, sexual relations between adults and children as being valid expressions of love is their problem, not your problem. And so being able to contribute to that through the production of material is a lot of what drives these offenders, it would seem. And then there are offenders who are working in the very upper reaches of sadistic material as well, which says a lot about their motivations in terms of their own psychology. So psychology and socialisation sit at the heart of a lot of this, from my perspective at least; others might have a different view, but that's certainly the way I've seen it. When it comes to TVEC, I mean, the internet is just a perfect distribution mechanism for material that is capable of radicalising and achieving propaganda aims, as you see in your work every day. And the interesting intersection between some of those more extreme elements that move from theory to practice and those that enable them through the shifting of the Overton window and contribution to discourse around outsiders, for example, is one of the areas that I find really fascinating.


Heather [00:23:14] Something that we've been finding, and not en masse so it's not a trend, but quite interesting to us, is that some of our clients, or people that come to our sort of attention, have that mix of very strong ideological grievances but also a mix of holding files, and there have sometimes been charges of child pornography, extreme gore, other sorts of aspects. There's that intermingling between the violent extremism and that sort of real hardcore child pornography. Are you seeing much relationship between the two, or the interface between the two, or is it just, sort of, it's a cesspit out there?


Toby [00:23:55] Look, I'll start from: it's a cesspit out there. That is factually accurate. We don't see a lot of what is produced as a result of, say, forensic analysis of phones and devices; that's very much where the police sit. But we know from speaking to police colleagues and listening to experts talk about this that when they seize devices that are related to a child abuse investigation, it's very common for them to find gore material and bestiality material. And often there is a sort of potpourri of paraphilias. So, you know, sort of sexual...


Rebecca [00:24:26] Potpourri, like a salad bar.


Toby [00:24:28] Salad bar: sexual perversions that can be bundled up into one person. And sometimes what these offenders are seeking out is the novelty of sensation. It's really important to understand, too, that not every person who commits a sexual offence against a child is a paedophile. I mean, sometimes we use the terms sex offender and paedophile interchangeably, but often those offenders are opportunistic or situational offenders, and they may be driven by other things, like power, as opposed to a sexual interest in children. So it's sort of no surprise to hear that the basket of content that is extracted from phones contains...


Heather [00:25:04] All levels of extremes.


Toby [00:25:04] Extremes. Yeah. Yeah.


Heather [00:25:07] And just moving on to, dare I say, a lighter topic, if there is such a thing today. So there's a lot of discussion about bullying online. You know, when we grew up it was bullying in the playground, and obviously that still exists, but there are these absolute keyboard warriors, and this online bullying is happening, and it's happening at a really young age as well. Who are you finding, in terms of your reporting stats, whether it be by gender or age or whatever, are the subjects of bullying, and what sort of platforms are they mainly reporting or identifying as where it's occurring? And have you seen any significant trends of late? You know, we know that Facebook is for old people now and everyone uses Snapchat. Are you seeing those different platforms move?


Rebecca [00:25:54] Yeah, and I guess, are the subjects of the bullying the ones that are doing the reporting, or another cohort?


Toby [00:26:00] Yeah, thanks. Maybe I can work backwards from that, because one of the other things that I've noticed...


Toby [00:26:07] Let me take question sub-part two.


Toby [00:26:10] What we've noticed is that very often those who are the subject of bullying are also the perpetrators of bullying. It is a very complex social phenomenon. I think sometimes we look for goodies and baddies in our work, and that can be really satisfying, to kind of feel like you're always on the side of good. But when it comes to children and young people who are the perpetrators, and I use that term advisedly because we're not talking about criminal offences here, but who are responsible for bullying, they are very often victims in other ways in their lives too. That is because there is sometimes a very strong flavour of neurodiversity represented in their own lives, and that can sometimes make it difficult for them to identify where the appropriate boundaries are in relation to behaviour online. But it also makes them a lot more susceptible to cyberbullying themselves and a lot more likely to be the victim of bullying in the playground. So it's important just to note that it's not always clear to us where the responsibility lies. And so we take a very educative approach when it comes to cyberbullying. We work a lot with schools; schools are the most effective way for us to achieve change. We often work at the deputy principal level, talk to the school about what we're seeing, and then broker relationships and conversations about applying not sanctions but corrective discussions. So the tools available to us under the cyberbullying scheme are intended to be educative as opposed to punitive. They're not backed by civil penalties, for example; they're backed by other interventions from the court, such as injunctions.
And it's really important, I think, just to pause on that, because we don't want to punish children for this kind of conduct, but we want to ensure that we're creating the basis for their understanding of citizenship to emerge, and their understanding of the importance of civil discourse and of a productive way of resolving conflict to emerge. That's not our responsibility; it's others' responsibility, including the parents'. But to the extent we can contribute to that, we see it as a really good thing. And sometimes children and young people just don't understand the impact of what they're doing, particularly if they're unthinkingly using terms like KYS, or kill yourself, to the wrong child, one who is already susceptible to suicidal ideation and who might have a background of attempts at self-harm or suicide. That can be really harmful to have contained in a message. But of course, the broader social, family and individual context is almost certainly not going to be known by someone who is responsible for cyberbullying material. So, back to the first part of the question, which is around what we're seeing as far as statistics are concerned. I think we're talking about a 31% increase in the number of cyberbullying complaints that we're dealing with. The average age of a child who is targeted by cyberbullying is 13, and it overwhelmingly affects girls more than it does boys, at least as far as our complaint stats represent it. So it's about a 66% split: roughly two-thirds girls, with the remainder being boys. And it's very often centred on school. So we see this really interesting graph, and it was disrupted by Covid but we're seeing it stabilise again: if you chart all of our complaints, you see this sort of sawtooth pattern that roughly corresponds to terms and holidays nationally.
And it goes to show that much of cyberbullying behaviour is driven by what's happening in the school environment, and that's particularly acute in regional and rural areas. The school is the focus of so much social and community activity, and for some areas that's why cyberbullying becomes such a profoundly distressing, destabilising issue, particularly when you have a young person take their own life as a consequence of some of the behaviours that are taking place within the school.


Heather [00:29:57] Isn't it really interesting that you're seeing patterns by term? So everyone takes a break from bullying in the holidays?


Rebecca [00:30:04] I was going to say when does it spike - in term or out of term?


Toby [00:30:06] It spikes in term, yeah. Yeah. So the peak is somewhere in the middle of the term usually, and then it subsides very noticeably while school holidays are underway.


Rebecca [00:30:15] You could say that they would be spending more time online in the holidays, you know, so that's an interesting stat. Looking at stats, one that we draw on from eSafety quite frequently in communications is that a third of young people have reported being exposed to online content promoting terrorism. I know you spoke of it being a small aspect of your requests, but do you get many requests for violent extremist content to be removed? And I guess it's an interesting question for us, because we find materials that circulate can begin as very simple comic satire, like characters such as Pepe the Frog, who's become a Nazi symbol. And often kids would not even know that it's potentially harmful. So how do you know when something needs to be removed in that sort of context?


Toby [00:30:58] Yeah. So that's a great question, and it often relies on context. To answer the question about stats, yes, we've seen a very substantial increase in the number of reports to us about terrorist and violent extremist content. While in absolute terms they still represent a fraction of overall reports, with the focus absolutely being child abuse material, last financial year we still saw a 230-odd per cent increase in the number of terrorist and violent extremist content reports provided to eSafety. I think that's symptomatic of the changing nature of the online environment and the fact that since Christchurch we've had a number of very high profile live streamed attacks, or attacks that have produced materials like manifestos. So I'm thinking about Halle, I'm thinking about Buffalo the year before last. The extent to which the perpetrators of this violence can quickly scale their content, and the way it's picked up by those who are sympathetic to it, is really quite staggering. And so the challenge for us is always about containment as opposed to elimination. You just can't sterilise the Internet. The Christchurch video, for example, is still available online. It's not hard to find. You're not going to find it from a Google search, but you will find it if you know how to follow links. And so the intersection between some of those extreme right wing memes, like Pepe the Frog and the NPC meme and all that classic sort of 4chan-style satirical, and I use that term advisedly, lobby of far right ideas, is a really important entry point into normalising the perpetration and the depiction of violence.
One of the things that we found really disturbing in the hours after the Christchurch attack, when that video landed on 8chan and started to make its way very rapidly across the Internet, was that there were multiple versions of the video quickly being produced by the 4chan community and related communities, overlaid with a sort of Call of Duty style graphical layout. And that's exactly what the perpetrator was intending to achieve. Because of the use of a helmet cam, sorry, a chest cam, it looked like Call of Duty. And that was very deliberate on his part. It instantly became memetic in nature. And so you started to see these communities talk about his high score, and when the Halle attacker was preparing his own attack, which fortunately didn't lead to anything like the casualties in the Christchurch attacks, you know, he was attempting to beat the Christchurch attacker's high score. And so it goes. Payton Gendron, who was responsible for the Buffalo shooting, used exactly the same set-up as the Christchurch attacker. And so we see these sort of durable ideas start to cement themselves, not just in the way the methodology is constructed, but in the aesthetic of far right violence. And you see it as well in the way the National Socialist Network portrays itself on the streets. They're all in balaclavas, they're all wearing identical black outfits. There is this strong kind of extreme right wing cool that is part of the attraction, I think, of belonging to some of these communities, because you don't just clothe yourself in ideas. You clothe yourself literally in the romance of the extreme right.


Heather [00:34:23] I think we were just touching on, before we started the podcast, how much Tarrant has influenced young men particularly, even internationally. When we were talking to John Horgan on our other podcast, he was saying that kids in rural middle America have shrines to Tarrant. And I just think this is the absolute scourge of the Internet, being able to deify him and those horrific acts. Now, talking about young men and particularly vulnerability online, because that's what we see a lot in the Engagement & Support Program: young, what I call children really, 13, 14, 12, really sort of vulnerable online when it comes to being exposed to misogynistic content or online grooming. And we're starting to have more of that conversation, both in terms of the rise in domestic and family violence, we're seeing a rise in familial assaults, and then at the more extreme end in violent extremism, that increase in misogyny. And the whole idea that the role of women, the rights of women, have gone too far and that it's a man's world. What do you think about this? Have you seen that flow, and are you seeing the increase that we're seeing? And do you have any comments on figures like Andrew Tate who are taking quite a leadership role in this?


Toby [00:35:47] So we relatively recently published some research to the eSafety website that anyone who's listening to this is very welcome to check out. I think you'll find it fascinating. It was produced through collaboration with Michael Flood and with Josh Bruce and others, as well as our terrific research team at eSafety. The study involved both focus group discussions and individual interviews with 117 young men between the ages of 16 and 21 about their attitudes towards everything from online pornography to the role of influencers in their lives. And there's actually a pretty extensive case study in that report about Andrew Tate and attitudes towards him. The story is mixed, based on what we were able to learn from the young men who participated. On the one hand, there are young men who explicitly credit Andrew Tate with changing their attitude towards their life, towards health and fitness, towards their ability to make an income. Some are very supportive of Andrew Tate's, and I use this term advisedly, efforts to counteract the feminist narrative. They say that he is doing important work to recentre masculinity as the dominant narrative within society, and that's to your point, Heather, that the rights of women have gone too far, and that Andrew Tate should be considered a champion of men in that sense. And that is obviously having an impact on the way some of these young men view women and view their role as men in society. But then at the same time, there are plenty of young men in that survey who said, you know, Andrew Tate is just full of B.S. It's obvious how artificial his persona is, and that he's just trying to sell himself and generate an income by positioning himself on the margins of how we regard notions of masculinity in society.
People also referred to the fact that he's been charged by Romanian police with trafficking offences, and so they're able to see through his confected persona, I think, in pretty clear ways, which is really encouraging. The Man Cave, and we refer to this research in the report, did a survey of about 1,300 young Australian men. I think 25% of those young men said that they really look up to Andrew Tate as a role model, which I find deeply disturbing. You asked for my opinion. I think he's a complete and utter fraud. You know, someone who presents himself as being master of his domain, yet unable to leave a country because of bail or probation conditions, doesn't strike me as a particularly effective voice. Bit of a contradiction. Yeah, that's right. He's, by all accounts, allegedly a violent criminal who, beyond the trafficking allegations, has demonstrated an absolute embrace of violence against women. And I think to the extent that he acts as any kind of role model, he's an appalling one. And he again acts as an entree into some of those more extreme beliefs. I think we were talking about Elliot Rodger, too, before the podcast started, when we think about the extremist masculinities that converge within the overall construct of extreme right wing ideas, and then incel ideology as well. John Coyne and the ASPI team have done a great job pointing to the dangers that incel ideology poses for national security. It's not just about individual threats towards women and girls; this potentially has the capability of turning into a full blown national security threat.
The collision of both neo-Nazi entry points and Andrew Tate entry points is incredibly concerning, and I think over time we'll understand just how active that intersection has been in contributing to acts of violent terror or violent extremism perpetrated by individuals, as opposed to the model that hitherto characterised violent extremism, where you connected yourself to a group and acted in concert with that group. That notion of stochastic, individual, unpredictable terrorism seems to be the dominant paradigm today.


Heather [00:39:57] So essentially what you're saying, besides Andrew Tate being a foul man, is that when we look at these young, vulnerable men who come to him from a fitness or self-esteem point of view and then go deeper into his doctrines, you can see that slippery slope into more extremist ideology, those right wing sort of issues. He's coming in as an entry point.


Toby [00:40:22] Absolutely. In the same way that during Covid we saw the creation, or the fostering and nurturing, of some really extreme ideas around public health measures, stimulated by entry points in the wellness community: totally innocuous ideas around healthy lifestyles and pure eating. Over time, and this is in part fuelled by algorithmic recommendation engines as well, which have driven people into the more extreme reaches of content, that led to people moving from, you know, juice cleanses to seeing Covid measures as indicative of some sort of draconian effort to control and enslave them. There are lots of well-documented pathways that individuals have followed between that entry point and that more extreme end point.


Rebecca [00:41:13] I guess, for people listening with reservations in terms of reporting: young people especially are often unwilling to report being a victim, whether through embarrassment or shame or fear of not being taken seriously, or whatever the case may be. Would you encourage everyone to report, no matter how trivial they believe the content or issue to be? And what are some of the implications for these young people, in terms of after-effects, if they do have these sorts of fears or reservations?


Toby [00:41:37] Yeah, sure. So I think it depends on what we mean by report. If it is to take some sort of action, yes, young people and parents should feel very confident in taking an action. And I might just give the example of sexual extortion, which is a perennial concern of ours, an acute concern, and one that has been growing in leaps and bounds since Covid. I'm pleased to say that so far we're tracking for a reduction in the number of sexual extortion matters reported to us, financial sexual extortion, which is a product of organised criminal gangs often operating from overseas. That's great, but it's still a very serious source of concern. And we know, through speaking to coroners around the country, that coroners are investigating matters where young people have taken their own lives as a consequence of being the target of sexual extortion, which is just beyond devastating. So the message that we articulate when it comes to sexual extortion is, number one, tell someone. Tell someone, don't take any action in response to their demands, so don't pay, and block. But the telling someone bit is the most important part of it. And in cases where young people don't necessarily have access to a parent or carer that they can trust, it doesn't always have to be a parent or carer, but it should be another trusted adult if that person is in their life. There is such a moment of revelation, I think, for young people when they bring someone into the private crisis and understand that it is actually soluble, that actions can be taken right now to fix the problem. The relief. I was speaking to a law enforcement colleague a couple of years ago, a former Internet Crimes Against Children commander in the United States, and he said something that has just stuck with me ever since.
And he said that in his advocacy work, he's encouraging parents to have the following conversation with their children: to say, no matter what happens, we're just down the hall. You know, if it's 2:00 in the morning, you're not alone. All you need to do is come wake us up and say, Mum, Dad, auntie, uncle, grandma, grandpa, whoever is responsible for caring for the child, I need help. That was just such a powerful message for me. I find it really emotional. I'm the parent of two young girls myself, and the idea that they would ever suffer in their bedrooms alone, feeling like they can't disclose, is just so devastating.


Rebecca [00:43:46] Remarkable that it eludes so many families, though. Yeah, just that disconnect.


Toby [00:43:51] Yeah, it can happen so quickly as well. There's a coroner's report about a young fellow in Victoria who took his own life. There was a period of about 90 minutes between him receiving the message that was the trap closing shut, the demand for money on pain of having the intimate material distributed to everybody, and him taking his life. It's such a short window.


Rebecca [00:44:11] Just reach out to anyone.


Toby [00:44:12] He was at home. So there is no blame to him at all. I mean.


Heather [00:44:16] You can only imagine how bad he felt, and how bad the parents felt. Yeah. And just a bit of advice: we do touch on sextortion a little, in terms of some of our clients being vulnerable to it because they're online so much, and we know that young people are sharing more of their lives online. We have parents come up to us and say, you know, my daughter is sharing an image of herself with her boyfriend, or they're being asked for material, et cetera. What would you say to parents facing that at 14, 15, 16 years old, when they find images or come across them, or it's just a bit of fun, Mum? What sort of advice would you give them, besides blowing their stack? Because we know that doesn't work.


Toby [00:45:01] Yeah. Yeah, that's right. And nor does taking devices away; it tends to aggravate the tension and the conflict between parent and child. And I should just say I'm using parent here to refer to all of those adults who have caring responsibilities. The importance of early, open, honest communication just cannot be overstated. I'm having these conversations with my own family members who have children right in that danger zone of age, and those conversations are going really, really well. In the case of one of my young relatives, he just hadn't thought about the potential for that material to compromise him, and had just assumed that anyone contacting him on any of the social media services he uses is well intentioned and who they say they are. I think it really is good to remind kids that, because they put so much of themselves online, building a profile of them is child's play for a motivated offender. You know, we've got good evidence. A guy called Paul Raffile from the Network Contagion Research Institute has published really extensively about a group called the Yahoo Boys, a West African-based organised crime group, now declared a dangerous organisation by Meta, which is great. So they're no longer able to use Facebook and Instagram to overtly advertise themselves. But they've published manifestos, I'm talking hundreds of pages of scripts, methodologies, tips and tricks for hooking what they call clients and then developing the relationship in such a way as to yield, you know, 500 bucks, a thousand bucks at a time, destroying a child's life along the way. The way these groups are organised, and the kind of power they have to manipulate children and young people, just can't be underestimated.
I mean, they are very motivated offenders. In the same way that child abuse offenders are motivated, they are online and they are constantly searching for victims. And so, back to my earlier comments about target hardening, it's a great example of how you can harden your kids by saying: doubt everything. Unless you have had some objective verification of this person's identity, assume that they are not who they say they are. That ability to exercise scepticism and critical reasoning, you know, that's a great way to not become a victim. But one of the other things that I've seen, which was just so brave, and such a great way to neutralise the power that an offender has over a target of sexual extortion, involved a 17 year old bloke who came to us and said, look, I don't think there's anything you can do now, because this is what's happened. He was trapped in a sexual extortion exchange. It was the classic methodology, because there is a methodology: it tends to go Instagram for initial contact, and then on to Snapchat, which is where the exchange of material happens and where the threats are made. So that's exactly what happened to him. He masturbated, shared a video over Snapchat, and they revealed themselves as offenders. And the first thing that he did was send a message to his entire social group, including his family, on every social media service he used, to say: this is really embarrassing, but if you get a video of my penis, I've been the target of sexual extortion. I'm really sorry, I can't tell you how embarrassed I am, and I know that you'll be embarrassed too, so feel free to delete it. It instantly neutralised the power that the offenders had. And he let them know that. He's like, I've told everyone, so eff off. Yeah.


Rebecca [00:48:28] Yeah. Well, I think you've actually answered our next question, which is about, you know, what simple tools and tactics can young people adopt if they're suspicious of an account or image? And I think you've said, you know, just don't trust anything, you know. Yeah.


Toby [00:48:38] And I would say too, Bec: don't pay. Don't pay, whatever view you might have in that moment that if you just give them something, they'll go away. They never will, because you become the bank. It's a fishhook and you are just reeled in constantly. We've had cases where victims of sexual extortion have paid not just hundreds of dollars or thousands of dollars, but tens of thousands of dollars. And think about the motivations of these offenders: as soon as they know that you can pay, then you're their income.


Heather [00:49:06] And should not only people report it to you, but should they report it to police?


Toby [00:49:10] They should. They should. It's a criminal offence; it's a form of blackmail, obviously. What state police have told us is that very often they don't have any obvious means of investigating if the conduct is taking place offshore. The AFP has done an absolutely fantastic job disrupting sexual extortion, financial sexual extortion targeting under-18s. This is all public: they ran an operation called Operation Huntsmen, which was intended to achieve both onshore and offshore disruption, and as a result their numbers have reduced substantially. So that's excellent. The AFP has the ability to work offshore like that; New South Wales Police, for example, is a lot less able to effect those international outcomes. But reporting it, I think, even if it's just to create an event number so that you can track your engagement with law enforcement, is very wise. And esafety.gov.au is where you will find an absolute mountain of information that shows how to respond, what to report, and what actions can be immediately taken, particularly when it comes to in-app reporting and blocking features, which are a great way to create some barrier between you and the offender.


Heather [00:50:20] We talk a lot about young people and posting online, but I think sometimes what we forget to talk about is parents posting online. And I think because we've come from fairly grimy backgrounds in terms of what we've seen and what we've looked at, we see a picture of a child in a bath that seems innocuous, but it really is sort of fodder for people to exploit. So I think not only young people, but also parents and caregivers and grandparents should be really aware of what they're putting online.


Toby [00:50:49] I completely agree. And in fact, in 2015, and I will say the site that was the subject of this reporting is still live and active, we showed that a particular site on the clear web, so just the normal part of the web that you can access using a conventional browser, was responsible for distributing about 2.7 million images, innocent images of children harvested from social media, and they were exhaustively organised and targeted towards those with a sexual interest in children. We might assume that because we know who is in our social media network, they can be trusted. I don't think that is always the case. We can't vet our entire social media network, and it's unfortunately a stark reality that those who are abusing children are often within the familial unit. So exercise a high degree of caution there. One of the things that I've seen a lot of my friends do as their children hit school is post school photographs, in their uniforms, where you can clearly see the emblem of the school. I had a good friend do that about a year and a half ago, and while I was waiting for a flight one afternoon, I decided to build up as much of an intelligence profile as I could on them, from the perspective of someone who knew nothing about their account, because their account was completely open. And I showed them the fruits of that and said, I know who your children are based on nothing more than this photo. We need to be a lot smarter about that, and we need to assume that there are bad actors who are looking for an opportunity to come in and destroy a family.


Rebecca [00:52:28] Parents just don't realise how seemingly innocent pics can be used maliciously online. Now, you're a big tech guy, so I have to ask about AI here. I'd like to know, has AI had any part to play in how people are engaging with content online or developing content? And has it changed your approach to investigations in any way?


Toby [00:52:50] Yeah, thanks. I thought you were going to ask me for some advice about restarting your computer. And I was just going to say, look.


Rebecca [00:52:54] I've got that one.


Toby [00:52:57] So AI is already disrupting a lot of the harms landscape in ways that are not advantageous to us. We're hearing a lot from our law enforcement colleagues that the ability to produce photorealistic depictions of children makes their work to identify victims impossibly difficult. And in Australia we've got some of the best victim ID experts in the world. In fact, they've been brought here because they are the best in the world. And those people, who I have enormous respect for, are saying that they are often driven to despair because they just cannot distinguish between what is real and what is not real. They're looking for things like background detail to identify where a child is, but the background detail is generated by artificial intelligence. For us, we're not seeing a lot of that material filtering into our complaints yet, but we will at some point. Only about 1% of the image-based abuse material we see is generated deepfake material, and 1% is still a lot, but we expect it to increase substantially as the technology becomes more and more democratised, and as these consumer facing apps that you can just access via a web browser proliferate. For example, nudify, and I'm using air quotes here, services, where you take an image, provide it to the consumer facing model, and the model then infers what a person without clothing looks like and provides a photorealistic depiction of that. That is incredibly widely available, and very often those services are advertised on major social media services as well. We are really just at the dawn of this age, and trying to understand what the threat landscape looks like in 5 to 10 years just blows my mind. It really does. I'm a very heavy user of generative AI. I think it's an absolutely fascinating technology and I'm constantly amazed at the results that are produced.
And as a result of my own constructive use of it, my life has improved in all sorts of different ways. I can get training advice. I'm a karateka, so I'm preparing for a grading in December, and it's terrific motivation and specific advice about my kata, for example. That's great. But to know that offenders are getting the same value out of a lot of these models, by pushing them towards outputs that are intended to support and enable their own criminal activity, is incredibly disturbing. One of the things we're especially worried about is the way that generative AI has supercharged scripting around grooming, for example. You can imagine a situation where you've got a standalone model running on your machine. It might not necessarily come with the guardrails that a ChatGPT does, and because you're reasonably adept at scripting, or you've been able to pull a script from the internet, you've got hundreds of instances of generative AI models engaging with targets in a way where it is not at all apparent to the target that they're not actually conversing with a human being. I've had plenty of times, and I'm sure we all have, where a conversation, and in these cases it was with a real human being, feels so realistic, yet being able to discriminate between a person on the other end with a keyboard and a generative artificial intelligence model is more or less impossible now. In fact, I saw an amazing survey the other day that showed that a majority of Americans, and I think we'd probably see the same thing here, believe that large language models are conscious. Offenders can then exploit that, the fact that we anthropomorphise technology in ways that are really, really unhelpful. We assume that these technologies are human.
If you aren't even thinking that you might be contacted by a large language model, and you've got offenders running these farms, these multiple instances, to scale and automate their work without being questioned. And we're only a year and a half down the track of what large language models and multimodal foundation models can produce and will be able to produce.


Heather [00:56:49] And I think from a violent extremist perspective, something we're looking at and talking about a lot with academics and experts like yourself is misinformation and disinformation. What I really found with October 7th in the Middle East, and the subsequent attacks between Israel and Palestine, was how much misinformation and disinformation was put out on both sides: graphic videos of attacks, or children being harmed in very graphic ways, that were put on mainstream media as fact and later found out to be false. That really amplifies people with grievances, or even just people that have family and friends suffering in that area. It amplifies their beliefs and their sense of conflict, when the content ends up being generated to do exactly that. And I feel there's such an element of powerlessness, but also that need to critically think, to test: am I actually talking to a person? Am I following the script?


Toby [00:57:55] And is this high impact photograph of a deceased child real? The way that media is consumed now, on the likes of TikTok, where you just get a second of an impression of a video before you scroll to the next one: the aggregate of all of those impressions, if the algorithm is driving you to content that supports your particular worldview around Hamas, around Israel and Hezbollah or whatever, has the ability to gradually shape attitudes and then lead to a tipping point where you say, enough is enough, I need to do something, I need to take an action. And your action is then justified on the basis of entirely synthetic content. That is our reality now.


Rebecca [00:58:38] That's exactly it.


Heather [00:58:39] So it's grim.


Rebecca [00:58:43] There's no specific answer.


Heather [00:58:45] It is grim, but we're all experiencing it. Young people are growing up in it, and parents and caregivers, as you say, are trying to parent through it. What do you think would be the best supports for young kids and parents? So many parents feel powerless against the Internet. They feel that the child knows more about the computer than they do at home. As a parent, and as the Deputy eSafety Commissioner, what would your words of advice be to parents?


Toby [00:59:13] I think the answer to that question is, first, don't assume that you can't know as much as your child does. I hear this a lot from parents, that their kids have this magical connection to the computer and this innate understanding of technology. That's only because they've been exposed to technology from the very earliest stages of cognitive development and maturity. It doesn't mean they have any greater capacity than we do to understand the underlying technology. So I think it's important for parents to understand what the technology is and how it works: what the apps are that are installed on the phone, how they work, how friendships are connected, how social networks develop over time, and who your children are engaging with. That can be through co-play as well: sitting down with your child and saying, show me how you use this app so I can understand it, and educating yourself about the tools that are embedded in social media services and devices themselves that allow you to very effectively control your children's access to content. Instagram, for example, in a very welcome development, has created a whole new suite of teen controls to ensure that parents can make messages unavailable overnight, for example. Sleeplessness is a major source of concern; it produces anxiety and all sorts of deleterious impacts for children. The way the controls now make contact lists private by default, for example, is a great way of preventing sexual extortion. Knowing those tools are available to you, and how to enable them on your devices and apps, I think is a great way to start. And there is plenty of information available on the eSafety website. Shameless plug: esafety.gov.au


Rebecca [01:00:48] There really is though.


Toby [01:00:49] But empowerment comes through knowledge and understanding. I've found that being able to have those conversations with my own family members, to say, have you thought about trying this or that? If you use this app, what are you doing in terms of using your controls to prevent the situation you just told me about? That is actually really welcome, and it leads to some really productive conversations with the children and young people in those family members' lives.


Rebecca [01:01:12] I mean, I guess just to add to that question, and this is our last question, I promise, but outside of the family, what can the greater community do to help? I mean, we might be looking at young people who need support outside the home because they don't have an environment within the home that offers that safety net. Where can they go? Where can they find resources and support?


Toby [01:01:32] Well, I think you're probably better placed to answer that question in some ways. I suppose I'd say that we need to understand that children don't come to these orientations and viewpoints without any kind of influence. The home environment is obviously an incredibly powerful crucible for the forming of those viewpoints and values, and having sympathy, and really an understanding from a perspective of compassion, I think is a great way as a community for us to ensure that those young people are given the right support and then provided a pathway out of holding some of those views. I mean, I think often about some of those terrible images we saw at the height of ISIS, where we had Australian parents taking their children into a conflict zone and then exposing them to the most appalling, unthinkable violence. What hope do those children have of anything remotely approaching a normal childhood, one that leads to development along a spectrum we generally associate with healthy childhoods? Those adverse childhood experiences are so profound in shaping adult viewpoints that understanding that, and doing it from a perspective of compassion, I think is really important. I was a youth liaison officer when I was in the police, and that work probably took me to a point where I could no longer see the young people we were dealing with as offenders, but as products of an environment that, really, it breaks your heart to see how some of these kids are raised. So I guess that would be my message for the community. Let's approach this from a perspective of understanding, compassion and humanity, rather than from one of punitive sanction.


Heather [01:03:04] I saw a great quote the other day, where a young adult was saying, don't have a conversation around me, have it with me. And I think that's exactly right, and it's where we're coming from: understanding, compassion, and bringing them into that conversation. So the online space is obviously this big challenge that we all face, and it's difficult for parents to keep up with the increasing number of apps and games and the way that kids now interact on the different platforms that you're supposed to understand and engage with. We really need to take responsibility. We think it's about having open conversations with your children around online safety, and making it a safe place for them to speak up if something doesn't seem right, or they've had a strange approach, or they're having an interaction that doesn't make sense. Despite the recent push to ban social media for under 16s, technology isn't going anywhere. Kids are not going to stop using devices to communicate with their friends. So it's about getting wiser, learning and adapting. So Toby Dagg, Deputy eSafety Commissioner, thank you so much for joining us. In three words, what would you say about technology?


Toby [01:04:08] eSafety can help.


Rebecca [01:04:10] Absolutely. Thank you so much for joining us today Toby.


Heather [01:04:13] Thanks Toby.


Simon intro [01:04:16] You have been listening to Start the Conversation, a podcast series produced by the New South Wales Countering Violent Extremism Engagement and Support Unit. For more information, please see the episode notes or visit www.steptogether.nsw.gov.au.


Last updated:

17 Dec 2024


