The Queensland Liberal National Party (LNP) recently pursued a bold political strategy to use artificial intelligence (AI) to shape public perception of current Premier Steven Miles, a move that not only highlights the transformative potential of AI in political campaigns, but also sparks a major debate about its ethical implications.
Globally, the use of AI in political campaigns is on the rise. Recent elections around the world have used AI to analyze voter behavior, create targeted messages, and even generate persuasive content.
The UK general election saw the use of AI through the development of AI-generated politicians, and February 2024 saw a powerful use of AI in Pakistan when Imran Khan and his Pakistan Tehreek-e-Insaf party generated an AI video of Khan delivering a victory speech that he had written while in prison.
However, the LNP's approach in Queensland, albeit a lighter-touch one, marks a notable escalation in the Australian context. The video features realistic depictions of Miles dancing to popular songs from the early 2000s, with captions aimed at swaying voters' opinions by questioning his leadership.
That's smart, but is it ethical?
Although such videos are technologically impressive, the role of AI in political campaigns has been called into question. Negative campaigning is a common strategy around the world, and society has come to expect negative posts and comments from opposing parties. For example, in the 2022 Australian federal election, the Labor Party used video editing tools to manipulate images of then Prime Minister Scott Morrison.
What makes the Queensland LNP example unique is that it uses AI to manipulate a real individual's physical appearance and movements.
The Labor Party has also come under scrutiny recently over an AI-generated TikTok video featuring Opposition Leader Peter Dutton.
The video leverages AI to manipulate Dutton's appearance and behavior, and also provides an example of how AI technology can be used to create realistic and compelling content.
AI can be simultaneously persuasive and misleading, challenging the limits of acceptable political debate and highlighting the need for robust regulatory frameworks.
The Queensland Electoral Commission said the state's electoral laws do not explicitly mention AI. They do, however, cover the publication of false statements about a candidate's character or conduct, while the freedom of political expression still allows for negative campaigning.
When politics and pop culture collide
From a campaigning perspective, there has been a big shift towards a more light-hearted and culturally relevant approach, and short-form video platforms are a great way to engage with a generation that isn't yet politically minded.
These platforms are incredibly powerful tools. But because platforms like TikTok are driven by recommendation algorithms, content must be designed to capture users' attention. One effective strategy is to incorporate elements of pop culture and current trends, which can help turn serious topics into more entertaining content.
As a result, to use these platforms effectively, politicians, governments and large organisations must adopt these pop culture tropes, regardless of the seriousness of the topics being addressed. This has fuelled a growing trend towards "politainment".
But politicians are also increasingly using the platforms to project authenticity. In Queensland, both party leaders have used personal accounts to portray themselves as “normal” Australians, and the techniques they use to do so have focused on domestic topics like cooking – connecting with food is common internationally, notably in Italy, but is a relatively new approach in Australia.
After all, it is not uncommon for political parties to use digital manipulation for strategic purposes. The question remains whether there should be rules governing the use of AI in election campaigns.
Using AI is generally fine, but it needs to be clearly disclosed.
Free speech in political life is crucial, but clearly identifying the use of AI is essential to maintain transparency and trust. Restricting official accounts could further complicate matters by shifting AI-generated content to more informal, harder-to-police sources.
The Queensland example highlights the opportunities and challenges of incorporating advanced technology into political campaigns, and as AI continues to evolve, its role in shaping the political landscape will only expand.
Political parties, regulators and citizens will need to navigate this territory carefully, embracing the transformative potential of AI while ensuring that the integrity of the democratic process is maintained.

