BRISBANE, Australia (Reuters) - At a beachfront park in Brisbane's north, suspended Australian doctor William Bay told a gathering that an upcoming referendum to recognise the country's first inhabitants and enshrine an Indigenous advisory body in the constitution would "open a gateway to unending tyranny and lawlessness".
The proposal was "equivalent to Germany's Enabling Act of 1933, which turned Hitler into the Fuhrer", Bay said in the speech in August, which he posted on Facebook for his 14,000 followers. The advisory body could "control the parliament and the government, thus replacing our system of representative democracy", added Bay, who lost his medical licence in 2022 after protesting against COVID-19 vaccines.
Dozens of campaigners who built substantial audiences during the COVID era by opposing Australia's pandemic response have turned their focus to undermining the Oct. 14 referendum, analysis of social media posts by independent fact-checkers shows.
Many of their claims bear little resemblance to the proposal Australians will vote on: to establish a body called the Voice to Parliament to provide non-binding advice to lawmakers on matters concerning Indigenous Australians.
These influencers are playing an outsized role in the debate, spreading falsehoods that threaten to put the landmark vote at risk of failing, eight political analysts and anti-misinformation experts told Reuters. The direct link between COVID agitators and misinformation about the Voice has not been previously reported in detail.
Polls show support for the Voice has slumped from about two-thirds in April to less than 40% this month. While factors cited by political commentators include lack of bipartisan support, uncertainty about the Voice's scope and a lacklustre "Yes" campaign, the experts who spoke to Reuters said some of the decline can be attributed to misinformation.
Facebook owner Meta increased funding for third-party fact-checkers in July, but a month later 40% of posts from accounts flagged for sharing "misinformation or toxic narratives related to the referendum" went viral, according to previously unpublished research by Reset.Tech Australia reported by Reuters for the first time. The internet advocacy group defines "viral" as receiving more than 100 engagements within 24 hours.
Just 4% of posts on Facebook containing independently assessed misinformation about the electoral process were marked or taken down after three weeks, said Reset.Tech, which monitored 99 misleading posts with a combined reach of 486,000 people across Facebook, X (formerly known as Twitter) and TikTok.
Not one X post containing electoral misinformation was marked or taken down in the monitoring period, before or after being reported, Reset.Tech said.
X, which laid off many staff after billionaire Elon Musk bought the platform in 2022, did not respond to a request for comment. The company's civic integrity policy says the use of its services to manipulate or mislead people about elections is a violation of its user agreement.
TikTok labelled or removed one-third of the misleading posts, making it the most proactive platform in the study, Reset.Tech said.
"Many of the accounts pushing electoral misinformation narratives turned to a style of anti-lockdown politics during the pandemic," said Reset.Tech Australia executive director Alice Dawkins. "Some of these accounts have since attained new levels of virality in the lead up to the referendum, particularly on X."
A Meta spokesperson said the company wanted healthy debate on its platforms but it was "challenging to always strike the right balance" when some users "want to abuse our services during election periods and referendums".
TikTok's Australian public policy director Ella Woods-Joyce said the company was focused on protecting "the integrity of the process and our platform while maintaining a neutral position".
In relation to the referendum, Australia's Electoral Commission has seen "more false commentary about electoral processes spread in the information ecosystem than we've observed for previous electoral events", its media and digital director Evan Ekin-Smyth told Reuters.
Under a giant fig tree, Bay urged his mostly middle-aged audience - and Facebook following - to "scrutineer" polling booths to "make sure it is counted correct", in remarks reminiscent of unsubstantiated vote-rigging claims by former U.S. president Donald Trump over his 2020 loss.
Speaking to Reuters, Bay denied spreading misinformation, saying he considered his claims accurate. He acknowledged his statements "may carry some weight" given his public profile related to the pandemic.
At the same event, local member of parliament Luke Howarth spoke against the Voice, sticking to the conservative opposition's argument that the proposal would be ineffective and divisive because it would extend additional rights to some people based on race.
'POLLUTE YOUR OPINION'
Australia's tough pandemic lockdown and vaccine measures triggered numerous protests, often inspired by social media influencers and anti-vaccine campaigners.
"Covid seemed to awaken in people a complete distrust of authority and lack of confidence in the state," said David Heilpern, dean of the Southern Cross University law school, who studies anti-government movements. "It certainly will have an effect on the vote."
Bay is far from alone in the anti-Voice online ecosystem that has emerged from the pandemic.
A Qantas pilot who quit over the airline's COVID vaccine mandate, Graham Hood, now hosts a webcast that he shares with 142,000 Facebook followers.
His guest on July 10, far-right senator Pauline Hanson, told viewers the Voice would turn Australia's Northern Territory into a breakaway "Aboriginal black state" and add extra seats in parliament "which they can make purely for Aboriginal, Indigenous people".
Tristan Van Rye, an electrician with 22,000 Facebook followers after protesting against COVID vaccines, wrote in a July 10 post that the Indigenous body would "take control of certain beaches, nature reserves, national forests and either totally restrict access to all Australians, or charge them fees to access the land". Hood, Hanson and Van Rye did not respond to Reuters' questions about the spread of misinformation.
The Voice was proposed by Aboriginal leaders in 2017 as a step toward healing a national wound dating back to colonisation. Unlike Canada, the U.S. and New Zealand, Australia has no treaty with its Indigenous people, who make up about 3.2% of its population and lag national averages on socioeconomic measures.
Ed Coper, director of communications agency Populares, said that for voters facing a new issue like the Voice, "it is a lot easier to see misinformation on social media and have that pollute your opinion while you're (still) forming that opinion".
One X account labelled by misinformation researchers as possibly fake due to its high volume of anti-Voice content was ultimately linked to a real person, a retired cleaning-business owner from Melbourne.
"I've only got political within the last two years," the account operator, Rosita Diaz, 75, told Reuters by phone. "99.9% of what I post is 100% correct. I would say 100% but some people would turn around and call me a liar. Sometimes I might get something wrong."
Diaz said she had been suspended by Facebook "seven or eight" times over posts deemed false. She now mostly posts on X, where she has 20,600 followers and pays for a subscription, meaning her posts appear more frequently on users' feeds.
MISINFORMATION BILL
Australia's left-leaning Labor government, which supports the Voice, introduced draft legislation this year that would allow the media regulator to determine what constitutes misinformation and fine social media companies that fail to curb it.
The bill, which is still in public consultation, has been criticised by Voice opponents as government censorship. But it may not become law until after the referendum.
A spokesperson for Communications Minister Michelle Rowland said the government wants the bill passed this year but social media platforms are expected to comply with a voluntary code of conduct when it comes to the Voice.
The Yes campaign, meanwhile, has accused the No camp of deliberately spreading misinformation as part of its strategy. A spokesperson for Advance Australia, which is coordinating the No campaign, told Reuters there were "tens of thousands of (No campaign) hats and t-shirts out there and we're not responsible for what people say while they're wearing them".
Elise Thomas, an analyst with the Institute for Strategic Dialogue, said a lack of evidence-based research meant Australians may never gain a full picture of how disinformation and misinformation influence the referendum outcome.
"That's a shame, both for us here in the present and for future generations of Australians trying to understand this moment in history," she said.
(Reporting by Byron Kaye; Editing by Praveen Menon, Daniel Flynn and David Crawshaw)