Start The Conversation Episode 8 Transcript

Ben [00:00:03] There are people out there who want to mislead you and want to spread falsehoods and want to manipulate you and it's about putting measures in place, almost like a process in place to stop that happening. We've almost lost this skill of critical thinking.


Simon intro [00:00:40] The violent extremism landscape is fluid and complex, and it can be difficult to navigate. This podcast series has been developed as a means of providing listeners with some thought-provoking topics within this context: personal insights and journeys, as well as helpful information that could assist someone who is vulnerable to being involved in violent extremism. The Engagement and Support Unit services focus on early intervention, awareness and resilience against violent extremism. They consult with and support the local community with this information to help mitigate the drivers of violent extremism and raise awareness of the complex factors and vulnerabilities that contribute to these ideologies. Before we begin, we would like to acknowledge the traditional custodians of the lands and airways on which we are meeting and broadcasting today. As we share our learning, we also pay respects to Elders past and present. It is their knowledge and experiences that hold the key to the success of our future generations and promote our connection to Country and community. Please note that views expressed are not necessarily representative of the NSW Government. Episodes may contain depictions of violence or sensitive topics that some people may find distressing. For further information, please view our episode notes.


Rebecca [00:01:39] Hi, I'm Rebecca Shaw, Communications and Community Engagement Manager for the NSW Countering Violent Extremism, Engagement and Support Unit. This is Start the Conversation. Today we are chatting with Ben James. Ben is a journalist, editor and author with more than a decade of experience working at news organisations in Australia and the UK. He has been editor of AAP FactCheck since February 2022, having previously worked across a variety of News Corp titles. AAP FactCheck, the fact-checking arm of the Australian Associated Press, focuses on investigating claims of political significance and counteracting misinformation shared on social media. They have no political affiliations or agenda, simply a focus on revealing the facts. FactCheck also actively monitors traditional media sources and various social media platforms for material that is suitable for fact checking. The final decision on what material is accepted rests with Ben. Also joining us today is Johanna Hough. Johanna works with us as a Senior Project Officer in the NSW Engagement and Support Unit. A qualified teacher, she has worked in youth engagement projects and in tech safeguarding and education, so she has a particular interest in today's topic. Misinformation is pervasive and can be difficult to spot, but building the skills to recognise reliable information doesn't have to be complicated. Fake videos, news and misinformation are everywhere at the moment. And what exactly is the link between the spread of fake news and radicalisation or violent extremism? Bad actors are constantly seeking faster communication channels and broader distribution opportunities to twist online narratives, using misinformation and disinformation for their own political purposes. Terrorism and law enforcement experts Kristy Campion, Jamie Farrell and Kristy Milligan wrote a paper on misinformation used by fringe groups during the pandemic in Australia.
It found evidence of situations where government information and media reportage was perceived as unreliable. The authors wrote that this situation created yet another space for extreme ideological narratives to emerge and exploit. So how exactly do we identify and respond to hateful conspiracy theories and fake news? This might be the perfect time to hand over to our expert. Ben, thank you so much for joining us today.


Ben [00:04:09] Thanks for having me.


Rebecca [00:04:10] Can you tell us and our listeners what exactly is the difference between misinformation and disinformation?


Ben [00:04:16] Yes, it's a good question, a good place to start. Put simply, misinformation is the spread of false information by those who genuinely believe it to be true. We saw a lot of that during the Covid pandemic: a lot of people who genuinely believed claims about the vaccines, spreading what is false information, but with a genuine belief that those claims were true. Disinformation is also false information, but it's spread by those who know it's false. So it's really a question of intent. Say, for example, something like the Bondi Junction stabbings that we saw earlier this year. Somebody initially put out a false name, and that was disinformation intended to cause damage to a particular group, a particular religious group. So that was clear disinformation. Many people then shared and spread it thinking it was true. That is misinformation. So, as I say, it really is a question of intent. But when it comes to the damage and harm that it causes, there really is no difference.


Johanna [00:05:21] Great. Thanks, Ben. Now we know the difference, although your answer to this may differ depending on whether it's misinformation or disinformation: who are the bad actors in Australia at the moment in terms of creating this fake news?


Ben [00:05:35] Yeah, look, again, it's a really good question. And I think the problem we face is that everyone has the potential to spread mis and disinformation. It's something we're constantly looking at, trying to get a handle on who is behind these falsehoods. And that's not just something that's isolated to Australia; there's a lot of research and investigation going on around the world. But we've recently been looking into the spread of fake news off the back of major events. I mentioned the Bondi Junction attack, but also the horrific incident we saw in the UK recently, the stabbings in Southport. For those who don't know, three girls were killed in an attack on a Taylor Swift-themed dance class. A teenager was arrested and is currently before the courts, alleged to have carried out that attack. As a result of that attack, there was a lot of misinformation, which ultimately, in part, led to the widespread riots that we witnessed in the UK. So we have been looking into a lot of the major players behind the false claims that emerged from these major events, and we've roughly grouped them into three categories. And this is from speaking to experts around the world; this seems to be a bit of a pattern that's emerging. So in the first category you have foreign bad actors. These are often state-backed disinformation spreaders, and they're not so much concerned with the event itself. They don't have a particular goal in relation to whatever event has taken place; they have just got a goal to sow discord within communities and countries. So anything like a major event, as I say, the Bondi attack or the Southport stabbings, even things like elections and referendums, as we saw with the Voice referendum.
It was an incredibly polarising democratic event and a perfect opportunity to sow discord. So that's the first group. The second group are those that do have a particular goal, a particular ideological stance, who want to capitalise on events. So with Southport in the UK, you have the far-right groups and anti-immigration groups who immediately pounced on that event, very much to drive their agenda, in this case an Islamophobic, anti-immigration one. And then you've got this third group who are perhaps not ideologically linked to anything as such; it's more just a drive to monetise these events, or just grow power and influence. And you look to people like, perhaps, Andrew Tate, who, I don't really know how you'd describe him; social media influencer is probably giving him a bit too much credit. It certainly doesn't seem like he has much of a strong ideological link to anything in particular. But he's monetising this, he's gaining new followers and, I guess, creating more power and more influence for himself. And it's not an Andrew Tate problem alone. You mentioned Australia; it's something that we've seen over the last few years in Australia. A very good example: during the pandemic, a lot of new influencers, alternative voices, popped up across social media, initially railing against the vaccine mandates, the vaccines and lockdowns. There was a lot of mis and disinformation being pushed. I'm sure a lot of them did have genuine concerns, but I think there's also a cohort that were in it just to create that platform and create that audience, effectively to monetise it. And when things did open up again and the borders reopened, the focus shifted away from COVID.
And it was interesting to see how a lot of those characters and those groups quickly pivoted to new areas, new topics. Last year it was the Voice. It was incredible to see so many of these people that were so invested in vaccines and mandates all of a sudden drop that and move on to the Voice, and there were various narratives around secret plots and UN agendas. Then, as soon as the Voice disappeared in October last year, they moved on again, to the Middle East and things happening there. Now it tends to be global bodies like the UN and the World Health Organisation. So, as I say, I'm sure there are some there spreading misinformation who do have these genuine concerns, but there's also an element of capitalising on whatever the event is to spread misinformation and build that platform and build that profile. I mean, some people are just narcissists, some people just like the attention, but I think some people are also trying to create a living and make money from it.


Rebecca [00:10:37] Cost of living.


Ben [00:10:38] Yes.


Rebecca [00:10:40] Look, you did touch on this earlier and sort of speaking to that intent. Can you explain how people with extreme ideological narratives might exploit misinformation and disinformation for their own political purposes? So how would this present specifically?


Ben [00:10:54] Just because it's fresh in my head and we looked at it quite a lot, I'll go back to what happened in the UK and Southport again. Just as a reminder for those who don't know: a horrific event. I think it was a Monday morning. There were a number of young girls on school holiday at this Taylor Swift-themed dance class in this fairly insignificant town called Southport. A number of people were stabbed in an attack. Three girls died, there were a number of injuries, and there was an arrest soon afterwards. It turned out to be a 17-year-old, and he is currently before the courts, alleged to have murdered these three girls. The background to that happening is that, in the UK at the moment, there are real, genuine concerns about uncontrolled migration, particularly small boats crossing the English Channel from France and mainland Europe, and the UK authorities not having any control over this and not knowing who's entering the country. That was a huge part of the recent general election; it was a narrative that kept coming up. There are clearly genuine concerns from a lot of people around migration in the UK. But yes, that is the backdrop to this horrific attack. And it was horrific. If you want to create a narrative of innocence cruelly being taken away, three schoolgirls at a Taylor Swift-themed dance class, it doesn't get much more shocking than that. You had this situation where people were terrified, they were shocked; an awful event had happened. And I feel it myself when something like this happens. You're trying to find answers, trying to work out how this could happen. Who would do this?


Rebecca [00:12:36] Natural response. Yeah.


Ben [00:12:37] At the same time, you've got this really difficult situation where the police are engaged in this incredibly complex investigation. And it's not like the Bondi attacks, where Joel Cauchi, the attacker, was killed. You had this 17-year-old who survived. What to do with that, yeah. Automatically there are restrictions in place. He was underage, so he's given anonymity until a judge says otherwise. So you've got the police and the media restricted in terms of what they can say, and you've got this massive vacuum of information, all these traumatised people trying to make sense of what's happened. And then you have groups, far-right groups and Islamophobic individuals and people with an anti-immigration agenda, who pounced on it. They saw this vacuum of information, people desperately searching for answers, and it doesn't take much. We saw it; we saw the posts start coming through: I heard he was a Muslim. I heard he arrived by boat last year. Apparently he's got this Arabic name. That's all you need to start things off. And then you have people perhaps more from the mainstream, who it would perhaps be unfair to call far right, and they just say things like, oh, I don't know if these rumours are true, or maybe the police are hiding something, and that builds and builds and builds. As I say, you've got the police and the mainstream media playing by the rules; they're limited in what they can say. And by the time any rebuttal comes, or, in this case, by the time a judge, probably about a week later, allowed this teenager to be named, it's too late. The damage is done.


Rebecca [00:14:18] It's a perfect storm situation, isn't it?


Ben [00:14:20] Yeah, yeah. And it's really difficult. As I say, within a few days you've got riots in the streets of the UK. You've got people throwing bricks at mosques and all sorts. And it turns out there's no suggestion that the man who was arrested was Muslim, and he was born in Wales. So, yeah, it's so easy for those with an ideological stance to take advantage of these situations.


Rebecca [00:14:47] And that really is a perfect example, isn't it?


Ben [00:14:50] Yes, yeah.


Johanna [00:14:51] Ben, you mentioned before about, I guess, the mainstream media playing by the rules and then other people that are spreading disinformation deliberately for their own purposes. And you just gave the perfect example of that. Thinking now about tools, how does AI impact the spread of mis and disinformation?


Ben [00:15:09] Yeah, look, AI is a huge concern, and it is top of every fact checker's agenda. I think there are two main concerns about AI. There's the ability to produce mis and disinformation at scale, and there is also the use of AI-generated images and audio to trick and deceive people. And look, it's pretty scary stuff. We are in uncharted territory, really, and certainly as a fact-checking organisation we're trying to get to grips with it. It's something we're battling every day. Just talking to the problem of scale, something we've been looking into recently is an operation that's run by, from what we can tell, just a couple of people, who are using AI to generate polarising content which they then dress up very loosely as satire. This content is created by AI, perfectly crafted to make the most of social media algorithms and ultimately drive ad revenue to their website. But it's done by just two people, and they can achieve that scale so quickly and easily, without any real outlay in terms of time or cost. That kind of thing simply wasn't possible ten years ago; you'd need an army of skilled propagandists, I guess. And that is a real concern. I think it's up to fact checkers and other media organisations to work out how we can use AI to our advantage to combat that. And it's difficult, because we don't want to sacrifice any accuracy from our side of things, but you kind of feel like you've got to fight fire with fire. So that's something we're looking into. And obviously the other thing, as I say, is AI-generated images and audio. A year ago it was a bit of a joke, really. We used to see some of the images that were coming through: people with seven fingers, people with their lips barely matching up to what they were saying.
But it's a different ball game now.


Johanna [00:17:13] Much more sophisticated.


Ben [00:17:14] You look at some of the stuff now and it is incredible. And I think in another six months' time it's going to be somewhere else again. So it is a real concern, and it's probably a bit above my pay grade in terms of regulation and what governments and social media companies need to do there. But yeah, it's something we need to be alive to, for sure.


Rebecca [00:17:35] Definitely. I mean, you talk about algorithms and AI. I mean, it's quite clear that technology isn't going anywhere, that something else is going to pop up. And we just have to learn to adapt and live with it. Better the devil you know type thing. So what steps can everyone at home take to protect themselves against misinformation and disinformation?


Ben [00:17:55] Yeah, look, it is the question, and it's difficult. I tend to think of it as a mindset and a habit. I don't think it's a healthy thing to go through life not trusting anyone or anything, but it is about being aware of what's out there, being alive to the fact that there are people out there who want to mislead you and want to spread falsehoods and want to manipulate you. And it's about putting measures in place, almost like a process, to stop that happening. I think one of the main problems is the speed at which falsehoods and mis and disinformation spread across the internet and social media. So whenever anyone asks for my top tip, what they can do, it always comes back to the same process for me. It's what we do day to day in fact checking, whether we think about it consciously or not. And initially that is to pause. We are incredibly busy; we're always scrolling through whatever social media while buying a coffee and talking to someone at the same time. We need to take ourselves off that autopilot and engage in some critical thinking. For us, we talk about a three-question filter that we run all information through, a three-step process, and nine times out of ten we find that gets you out of trouble. So whenever we see a claim, the first thing we do is pause. We tell people to pause. The next thing is to ask yourself: who is it that's making this claim? Who is presenting us with this information? Why might they be saying it? Have they got an agenda? Have they got a grudge? Are they qualified to be making such statements? What's their background? Have they said anything like this before?
So that's the first thing we always ask in this three-question process. The next one is: what's the evidence? Is the person making the claim providing any evidence? Can we find any evidence? Can we test it against anything? And then the third question is: what do trusted sources say? Is this claim supported? Is it contested? This is all about wider reading, breaking out of whatever the algorithm usually serves up to us and reading across the political spectrum, or whatever it is, to find out whether trusted sources back up what's being said. It's a really simple process, and often it does just take a couple of seconds. It's not like every time you come across a bit of information you sit down and tick off these three questions. Once you get into the habit, you're almost doing it unconsciously, without thinking about it. It can be as simple as thinking about who it is that said it, looking at their Twitter profile or whatever and seeing what their affiliations are.


Rebecca [00:20:42] And just to add to that, obviously you're someone with high media literacy. Are these three questions, or three steps, the same for someone with high media literacy versus someone perhaps more vulnerable to being fooled?


Ben [00:20:55] Look, we go through exactly the same process. I think the difference is that we have the time to do it, the time to go a bit more in depth. We have access to, and are used to using, various digital tools. And as a fact checker, it's easy to reach out to experts and get leading academics or other figures to respond and confirm whatever it is. But it is ultimately the same process. As I say, the benefit that we have is mainly time; it's our job, it's what we're paid to do. But as an approach for anyone and everyone to prevent themselves falling foul of mis and disinformation, just having those three questions in mind, that alone will save you nine times out of ten. It's just, as I say, we've almost lost the skill of critical thinking. We're so used to doing so many things at once.


Rebecca [00:21:56] I know. Everyone's so rushed. You say slow down, but I mean, people don't have the luxury of time these days. Retraining.


Ben [00:22:03] Yeah. And it is difficult. I guess, you know, I'm a professional fact checker, but occasionally I fall foul of things. I'll read something one day while I'm flicking through whatever social media app and think, oh, okay, that sounds interesting. Then a couple of days later, one of my fact checkers will raise it as a potential check and say, have you seen this? And I'm like, oh, I read that and thought it was real.


Rebecca [00:22:25] They're clever. They're getting more and more clever, aren't they?


Ben [00:22:27] Yeah. But it really is a habit, as I say. It's something I think initially you need to consciously think about doing, but then just build it into your kind of daily routine, really.


Johanna [00:22:43] What about kids who may not have been able to access that information or education around those three steps, and may just be on their phone scrolling through social media? You just mentioned social media as a prime source of fake information. Do you think there's any sense in this recent discussion from politicians around banning social media for under-16s, or do you think there are ways we can educate young people, and that's more the answer?


Ben [00:23:09] Yeah. Again, in terms of banning social media, it's probably above my pay grade, but I think it just reinforces how important media literacy is. I don't think it's a particularly controversial statement to say that more of this needs to be done at a much younger age, within schools. This is a huge threat to the future of democracy. You've got widespread misinformation out there, and you need a tooled-up, skilled public who can process that information in a healthy way. I think it just speaks to the need for media literacy, and I think that really needs to be prioritised from a much younger age.


Rebecca [00:23:59] And parents' media literacy, obviously, because that's going to filter down in the household from a young age, isn't it. You referenced this earlier in that fact-checking is your job, and you obviously have the specific tools. So what are these tools? What are the digital tools that you'd recommend, that are publicly available, I suppose, for everyone?


Ben [00:24:19] Tools absolutely are important, though I don't like people to get too hung up on tools. As I say, the main thing is having that mindset and that curiosity and engaging in that critical thinking. But yes, there are certainly hundreds of tools out there available to us, for all manner of niche subjects. I think if there's one critical tool that everyone should know about, it is reverse image search. So much of the mis and disinformation that we come across now is based on images and videos. It's so easy with AI to distort or create images, but equally it's just as easy to take a real image and recaption it or take it out of context. And it can be so powerful. Again, going back to the recent riots we've seen in the UK, there are a couple of examples that we saw. There was a narrative going around that it's not the far right causing all the riots; you've got armed Muslim men on the streets who are attacking British people. And there was a really powerful image going around of a group of young men holding aloft what looked like swords. The caption was something like: these Muslim men were pictured in Birmingham yesterday and they're running around hunting white people. It was a large group, they looked like they could be Muslim, and they were holding large knives, swords. A quick reverse image search showed it to be a Sikh wedding, I think from five years before. They were ceremonial swords that are typical at those weddings, and it was part of a celebration dance.


Rebecca [00:26:10] So, for the layman, because obviously we're talking like everyone has an understanding of all these tools: how does the reverse image tool work? You capture the image and you put it through, like, a filter on a website?


Ben [00:26:22] Yeah, it's even easier than that. There are lots of reverse image search tools out there. The Google reverse image tool is probably the easiest, and for those on Chrome, it really is just a simple right click on an image and you can select to reverse image search. I probably should have said at the start: a reverse image search basically takes an image and looks for any other uses of that exact image, or similar images, online. So in this case, with the men carrying swords, it searched for all other uses of that image, and it came up with the original usage, which was on some guy's Facebook post, his album of this wedding from five years ago or something like that.


Rebecca [00:27:11] It really is that simple though. I mean, that's an amazing thing for people to know.


Ben [00:27:14] Yes. Yeah. And as I say, there are many different reverse image search tools, and they all have their pros and cons, but you can't go too far wrong with the Google one; that's certainly the easiest. As I say, it's so easy for someone with an agenda to spread disinformation by simply recaptioning an image, and we need to find equally easy ways to...


Rebecca [00:27:39] We need to be less gullible.


Johanna [00:27:41] Yeah, absolutely. And also, I think about time, the time it might take to do that. Parents being time poor while they're running around trying to do their jobs and cooking and what have you, and the kids are in their room on their own devices. I guess I'm wondering about if a parent has time to be able to do that, and that's great. But what are some of the other ways that parents can help their kids or teens to understand misinformation, particularly when sometimes I guess the kids are more digital natives than the parents?


Ben [00:28:11] Yeah, look, it's, it's... Kids know everything these days. They do know everything. And yeah, I mean, it's tricky. I mean, if you're going to start telling your teens not to take any notice of Andrew Tate, then they're probably going to do the complete opposite.


Rebecca [00:28:27] It's why the social media ban would probably have the adverse effect.


Ben [00:28:30] So, certainly from speaking to experts within this space, they talk much more about not focusing on particular claims and people, but instead explaining that, hey, there are people out there who wish to deceive you and mislead you, and talking about some of the methods and manipulation techniques that are used. That has been shown to be really effective, in particular what is known as inoculation theory. This is the idea that you introduce a technique that may be used, explain to a person how it's used effectively against people, and maybe even introduce a false claim. That helps build up knowledge and immunity, so that when they're met with that technique in real life, they've already come across it. Much like a vaccine: the idea is that you give the person a little bit of the virus, the body gets used to it, and then, in real life, your body's ready to fight it. That's something we focus on, particularly with media literacy, and as I said, there's a lot of research behind it. So, techniques such as scapegoating and the use of emotional language: just by making young people aware that these techniques are used, you hopefully enable them to spot them when they're met with them in real life. So, when it comes to parents talking to teens and children, my suggestion would be: don't go in with specific people and specific claims and specific narratives, but talk more generally and teach them about some of the techniques people use, what they might see, and how people might try to fool them. I think that idea that people are out there trying to fool you is a very powerful one. People don't like the idea that someone is trying to trick them and deceive them.
So I think that is a very powerful message to get across, rather than focusing on don't do this, don't do that.


Rebecca [00:30:39] Yeah, that's true. Just bringing it back to the politicians, because it's a bit current at the moment as well. What about when misinformation becomes mainstream, from previously trusted sources? Thinking about instances when politicians have presented misinformation at press conferences and things like that. Are the goalposts on what is a trusted source shifting?


Ben [00:31:02] It's a difficult one. And look, I don't want to stick the boot in too much, but I think politicians twisting and spinning information and facts is nothing new; it has been happening for a long time. With the likes of Trump, though, you've got potentially the most well-known man in the world, who is particularly adept at this. So yes, it is a major problem. Do I think the goalposts have moved? All I can really say is that I'd go back to the need to read widely and look at a variety of trusted sources. I think the most important thing, and this is very much a new problem, is breaking out of our bubble. We're all served up the same content from the same people through the algorithms. It's easy as well, isn't it? I think we all need to read a bit more widely, across the political spectrum, and look for that consensus across sources. It's not easy, because we're all time poor, but that is a particular problem that we have at the moment, and it's really important.


Rebecca [00:32:21] I mean, it's a problem too, that news is just being consumed on social media now as well, because often it's condensed, it is videos, it's a flick of your finger, and you're not actually getting the full story.


Ben [00:32:31] And it's something that media organisations need to look at as well. They need to realise how people are consuming news. And look, there are some fantastic examples of media organisations doing a great job in that space, and it's certainly something at AAP FactCheck that we're trying to do. But this whole idea that we'll do things our way and people will come to us, I don't think you can really operate like that anymore. I think you need to meet the audience where it's at, in the way that they want to consume things.


Rebecca [00:33:01] You've moved into TikTok. Can you tell us a little bit about that, and why that move?


Ben [00:33:06] Yeah, look, traditional fact checks appear on websites, and they are rigorously researched, and they can be quite lengthy at times, detailed and involved. And there certainly is a market for that. There are people that enjoy consuming content in that way. But we were certainly aware that there was an audience we were not attracting through that. So we did some work with TikTok a couple of years ago where we worked with some social media influencers on the platform. We basically spoke to them about media literacy and talked to them about some of the tips and tools that we use when talking about information. And there were all sorts of people, comedians, a party planner, a really diverse group, and they turned this often quite dry information into these fantastic videos, using their style to attract their usual audience. And it was a huge success.


Rebecca [00:34:10] We might need to borrow these influencers.


Ben [00:34:12] But I think, certainly for us, it speaks to trusting people who know that platform and know that medium. I mean, some of the content that we produce I don't particularly like. I kind of think it looks a bit odd, but it's not for me. It's realising that there's a huge audience out there who desperately need this information.


Rebecca [00:34:35] You have to adapt.


Ben [00:34:37] Yeah. It's about adapting and, as I say, meeting them where they're at, in the style and the content that they're used to.


Johanna [00:34:47] Yeah. Wonderful. Ben, thank you so much. We've learnt so much from you today. I'm definitely going home to reset my algorithm and also to look up some reverse image tools. In terms of key takeaways, I think the three steps you've given us are really important: really pause and think about who is making the content, think about what the evidence is behind it and what their goal might be, and then look for trusted sources. So those are my main takeaways from speaking with you today. But please, we'd love to hear if there's anything else you'd like to emphasise for our listeners today. And I think some of them would be interested in following the work of AAP FactCheck, if you could point us in the right direction.


Ben [00:35:28] Yeah, I think that sums it up well. As I say, there are plenty of fancy tools out there on the internet that you can use, but I really do think it is that mindset and engaging in that critical thinking. And as you say, stopping and pausing initially and considering what it is that you're seeing and consuming and potentially sharing. It is that three-step process, those three questions. Stop: who is it that is sharing this? Why might they be saying it? Have they got a grudge? Have they got an agenda? Are they qualified to say it? Then interrogate the evidence: what evidence is being provided here? Can I find any evidence? Can I test it against anything? And then the final step: what do trusted sources say? As you say, this is lateral reading, breaking out of our bubbles and looking for consensus across trusted sources. So that is the key message we come back to time and time again. We always try to think of fresh takes on our media literacy work, but it is that same three-step process that we always come back to. That's what we do as fact checkers, and I'd say that is best practice for everybody, really. And in terms of AAP FactCheck, yes, we regularly do debunks on viral claims that are doing the rounds on social media in Australia and beyond. And we're also regularly checking the claims made by politicians, typically federal politicians, but state politicians as well. So our website is the place to go for that, but also our social media channels. As I say, TikTok and vertical video is a new thing for us. We've got a fantastic team starting off that journey for us, and we very much see it as the future and a way to attract that younger audience and get across some pretty dense, tricky subjects in under a minute.
So yes, it's not easy, but hopefully we can have some people follow us and keep an eye on what we're doing.


Johanna [00:37:35] So Ben, you mentioned the speed at which misinformation and disinformation can travel. Is there any way we can get in front of this information and get the real facts out there?


Ben [00:37:44] Yeah, I think we've certainly recognised in the years that we've been doing this that while debunking is incredibly important, it is not the silver bullet, and you need a different approach. As I say, we saw with things like Southport the speed at which this mis- and disinformation travelled, and by the time the debunks were out there it was kind of too late. So we now see the answer to this problem as being a three-pronged approach. We have our debunks, which react to the claims that are out there, and our aim is to get those out as quickly as possible to combat false claims. But pre-bunking is something new that we're looking at at the moment and have started to work on. The idea of pre-bunking is really getting ahead of the problem, anticipating what false claims are going to appear and basically warning people about them. And there are some events where you can predict what's going to come up. We certainly see around election time all the same kinds of claims coming up about rigged elections, and in particular a lot of claims from sovereign citizen groups which have absolutely no basis in fact, but every election they spread like wildfire. So coming up to the next federal election we'll be doing a lot of pre-bunking, effectively warning people, saying, hey, you might see this out there, here's why it's rubbish.


Johanna [00:39:09] Where can they find that pre-bunking information?


Ben [00:39:12] We will be focusing that on our social media channels. As I say, vertical video is a new thing for us, and we see that as a particularly effective platform to get that message across on. So yes, pre-bunking is very good for getting ahead of the problem. Obviously there are some things that you can't predict, so it doesn't fit perfectly, but we can certainly use pre-bunking to warn people about manipulation techniques and certain narratives that are used time and time again. So we see that as a really, really important part of the puzzle. And then the other vital thing is media literacy. We can't get ahead of everything, we can't predict everything, we can't tackle every claim through a debunk. So media literacy is vital to tool up the general public with the skills they need to do the fact checking themselves effectively. So yes, we very much see it as that three-pronged approach, debunks, pre-bunks and media literacy, and we see that as the major piece of the puzzle to tackle this problem.


Rebecca [00:40:09] That's great, thanks so much Ben. And just a little note for our listeners as well: we will list all of the tools mentioned throughout this podcast episode in the show notes, so you can look them up yourselves and put them into practice.


Simon intro [00:40:23] You have been listening to Start the Conversation, a podcast series produced by the NSW Countering Violent Extremism Engagement and Support Unit. For more information, please see the episode notes or visit www.steptogether.nsw.gov.au.


Last updated:

17 Dec 2024

