AI Experts Warn 46% Of All Aussie Jobs Could Be Automated By 2030

This article is for general information purposes only and isn’t intended to be financial product advice. You should always obtain your own independent advice before making any financial decisions. The Chainsaw and its contributors aren’t liable for any decisions based on this content.



AI can be used to make our lives easier, but it is also a frightening tool that could see many of us out of a job – at least according to some experts.

Australian Academy of Technological Sciences and Engineering (ATSE) CEO Kylie Walker told The Canberra Times AI could replace anywhere between 25 and 46 percent of all Aussie jobs by 2030.

Walker and a group of 13 other AI experts have called, in a new report, for a $1 billion national artificial intelligence initiative to produce more than 100,000 digitally skilled workers over the next decade.

“Our strong institutions, innovation ecosystem, effective governance and high rate of technology adoption position Australia to lead the world in the development of Responsible AI,” Walker said.

“Australia needs to invest in AI research and coordination between academia and industry to foster a culture of research, innovation and risk-taking.

“This culture of innovation needs to be backed by a laser focus on the science and technology skills we urgently need in the Australian workforce.”

Government cracks down on AI

This all comes as the Albanese Government plans to strengthen Australia’s Basic Online Safety Expectations to “address gaps, emerging harms, and further clarify the government’s expectations of industry”, Minister for Communications Michelle Rowland announced yesterday at the National Press Club.

“Under the proposed changes, services using generative AI would explicitly be expected to proactively minimise the extent to which AI can be used to produce unlawful and harmful material,” Rowland said.

“We are also proposing a new expectation that industry consider the best interests of children in the design and operation of their services. This could include implementing appropriate age assurance mechanisms and technologies, to prevent children from accessing age-inappropriate material.”

Rowland said the rise of AI over the past couple of years has heightened concerns about misinformation.

“Over the past two years, it has also become harder to distinguish between AI generated images and genuine ones,” she said.

“And while this technology has incredible potential as a positive tool, it has also been used to create images designed to humiliate, embarrass or even abuse – others.”