News Corp Australia outlines its approach to using AI in its newsrooms
Conference Blog | 08 September 2025
Like most in the industry, News Corp Australia has been grappling with how to effectively integrate AI into its editorial departments while upholding ethical standards and ensuring accuracy.
The company’s director of newsroom innovation, Rod Savage, presented his vision for how journalists should use AI at the INMA Asia/Pacific News Media Summit 2025.
News Corp Australia’s AI strategy has benefited from not immediately jumping on the bandwagon, allowing the organisation to learn from the missteps of more gung-ho competitors.
“I think there are no prizes for coming in first anymore. There used to be, right? There used to be a first mover advantage,” Savage said.
That’s because the pace of AI’s technological advancements is so rapid that mistakes are easy to make. And in a business like news where trust and credibility are paramount, mistakes can be devastating.
“You literally can do things today that you couldn’t do last week,” Savage said. “How many of us were talking about Nano Banana last week? I don’t think any of us were. But it’s equally easy to slip on a banana peel, right? To trip over spectacularly, this is what keeps me up at night.”
Setting up clear AI guidelines is crucial
To help its newsrooms navigate such murky waters, News Corp Australia set up an AI ethics board with representatives from across the company. The board established six key pillars that any use of AI must adhere to in order to be viable.

Savage laid them out as follows:
- Augment: Does your AI augment what you are already doing, or does it replace your work? AI should be your assistant; it should not do everything for you.
- Fair: Is your AI giving out reasonable output? Be wary of bias that can be built into the data your AI is trained on.
- Reliable: How much has your AI been tested? If it does not work consistently, it is not worth it.
- Secure: How safe is the data of your staff, clients, and audience? If it is not 100% secure, do not use that tool.
- Explainable: How transparent is your AI use? It is not just about declaring to the audience when you are using AI. Your staff also need to understand when and how AI should be used.
- Accountable: Is your AI accountable to a human? You hit publish; the AI does not.
Convincing cynical journalists
Even with this comprehensive approach to responsible AI usage, Savage conceded that some of the journalists he works with have remained hesitant. After all, journalists by nature tend to be sceptical people.
To get more colleagues on board, Savage stressed the importance of having honest dialogues and listening to the real concerns many have about AI: “We had some excellent debates with some very, very senior journalists. It’s like a press conference where you take every single question and you don’t say no more questions.”

While AI presents sensitive challenges to the news industry, the opportunities it opens up are undeniable. Savage said he is confident News Corp Australia is ready to deploy AI to reinvent editorial workflows.
“I think we’ve popped the lid,” Savage said. “I think we’ve got our ducks in a row, we’ve got an ethical framework, we’ve got adoption and training, we’ve got internal expertise, and we’ve got an excited workforce. The New York Times said a little while ago they were all in on AI — I think we’re absolutely all in.”