From the Building Workforce Capability with AI event, Melbourne, November 20.
At our recent Building Workforce Capability with AI event in Melbourne, aged care leaders gathered to explore how artificial intelligence can transform workforce development and improve care outcomes. From addressing administrative burden to creating personalised learning experiences, the conversation moved beyond the hype to focus on practical implementation.
Will Egan, CEO of Ausmed, opened the session by framing AI as a powerful tool for building workforce capability, followed by an engaging panel discussion featuring:
- Elyssia Clark (Benetas - General Manager of Marketing, Insights and Customers)
- Dr Jack Feehan (RMIT - AI in Aged Care Research Fellow)
- Dr Tom Kelly (Heidi Health - CEO)
- David Diviny (Nous - Chief Data and Analytics Officer and Head of AI)
Here are 10 key takeaways to help you embed AI safely and strategically into your organisation.
1. AI Offers a Once-in-a-Generation Opportunity to Reimagine Learning
Traditional e-learning platforms are built on rigid, pre-programmed systems that deliver the same content to everyone, regardless of their existing knowledge or experience. As Will Egan highlighted, this "one-size-fits-all" approach has created significant redundancy in workforce training.
Generative AI presents a transformative alternative. By moving beyond relational databases to large language models, organisations can now deliver truly individualised learning experiences that adapt to each team member's capability level, learning style, and specific knowledge gaps.
But the transformation goes even deeper. Will described a fundamental shift from didactic to experiential learning. Current training typically presents a set of facts, then tests whether learners can remember them three minutes later through multiple choice questions. It's passive and disconnected from real practice.
AI enables something radically different: learners can now apply skills in practice through simulated scenarios. Imagine a care worker having a realistic conversation with a simulated resident about a complaint, where the AI evaluates their ability to show empathy, accurately describe workplace policies, and handle the situation appropriately. The learning experience then adapts in real time, dialling up focus on areas of weakness and reducing time spent on demonstrated strengths.
This dual shift from standardised to personalised and from didactic to experiential represents a fundamental reimagining of how we develop workforce capability in aged care.
2. Start with Administrative Pain Points, Not Efficiencies
When considering where to implement AI, don't begin with abstract efficiency gains. Instead, identify the administrative tasks that are consuming valuable time and causing frustration among your team.
Dr Jack Feehan advised: "Sit down and map out what your biggest problems are, particularly from an administrative side. What are you spending heaps of man hours on? And especially if those man hours are hating every second of it, there's probably a tool that can make it better or completely remove it."
Clinical documentation emerged as a prime example. Dr Tom Kelly shared that aged care workers can spend 40% of their day on documentation. AI-powered medical scribes like Heidi can reduce this burden dramatically, reclaiming up to two hours per day for direct care work.
3. The Time-to-Care Equation Changes Everything
One of the most compelling arguments for AI adoption in aged care is its potential to restore time for meaningful resident interactions. When documentation time drops from three hours to one hour per day, that's not just an efficiency gain – it's a fundamental shift in how care workers spend their time.
This reclaimed time directly addresses one of aged care's most pressing challenges: ensuring workers have sufficient time to deliver quality, person-centred care. As the panel discussed, this isn't about doing more with less; it's about redirecting human capacity towards the work that truly requires human connection and clinical judgement.
4. Frontline Workers Hold the Best Implementation Ideas
Don't make AI adoption an exclusively top-down initiative. David Diviny emphasised the importance of engaging frontline workers early in the process, referencing research from the UTS Human Technology Institute called "Invisible Bystanders."
The research found that when frontline nurses were educated about AI basics and given diaries to track opportunities for AI use in their daily work, they generated far better ideas than executive workshops produced. Frontline staff understand the pain points intimately and often already use AI tools in their personal lives. Involving them creates better solutions and stronger buy-in for implementation.
5. Just Start – But Start Strategically
Elyssia Clark from Benetas captured this principle succinctly: "Just start. Don't be afraid. Test, test and learn. Fast fail is a big mandate in our organisation. We try a lot of different things. Many things haven't worked. Some have, but just be brave and be open to the different tools that are out there."
The key is to balance action with strategy. Begin with pilot projects in contained areas where you can measure impact. Select one pain point, choose a tool, test it with a small group, and evaluate the results. If it works, scale it. If it doesn't, learn from it and move to the next opportunity on your list.
6. Governance and Guardrails Aren't Obstacles – They're Enablers
Will Egan stressed that AI governance isn't about slowing down innovation; it's about creating the frameworks that allow safe, confident adoption. At Ausmed, they've developed comprehensive policies around data security, transparency about AI use, and structured evaluation of any AI tool before implementation.
Their approach includes mandatory proof-of-concept testing, vendor due diligence processes, and clear guidelines about when AI is appropriate versus when human judgement is required. These guardrails have actually accelerated their AI adoption by removing ambiguity and building organisational confidence.
Elyssia also stressed the need for good governance and, most importantly, a psychologically safe environment for discussing AI opportunities.
7. Measure What Matters: Focus on Outcomes, Not Activity
David Diviny repeatedly emphasised the importance of an evaluative mindset: "What are the outcomes you're trying to influence? What's your theory of change with AI? And then measure those outcomes and track progress."
Don't just measure whether staff completed AI training or how many tools you've deployed. Measure the outcomes that matter to your organisation: improvements in care quality, reductions in clinical incidents, increased staff satisfaction, better compliance rates, or enhanced resident outcomes. This outcome focus helps justify investment and guides continuous improvement.
8. AI Can Transform Compliance from Burden to Enabler
The panel explored how AI is already transforming regulatory compliance in aged care. Rather than experiencing compliance as a constant administrative burden, AI tools can automate evidence collection, flag potential issues before they become problems, and ensure documentation meets required standards.
Dr Jack Feehan noted that AI can help organisations move from reactive compliance to proactive quality improvement. When AI handles the routine documentation and monitoring, human attention can focus on interpreting trends, addressing root causes, and genuinely improving care quality rather than simply ticking compliance boxes.
Will Egan took this a step further in his opening address, emphasising the opportunity AI now presents to intelligently ask the learner what they know and teach appropriately to achieve compliance. This turns the traditional model - of purely measuring compliance based on training completion - on its head.
9. The Need to Manage Unstructured Data
David Diviny explained that because AI can now work with unstructured data efficiently, the definition of usable data has broadened significantly.
This also means, however, that organisations need to be mindful of structuring their qualitative data in a way that makes sense. It's no longer good enough to type notes up and put them in a shared cloud drive; metadata, tabular layouts and effective storage mechanisms matter to AI systems when retrieving information.
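To make the point concrete, here is a minimal sketch of what "structured qualitative data" might look like in practice. The `CareNote` record, its field names, and the `filter_notes` helper are all hypothetical illustrations, not a standard or a tool discussed at the event; the idea is simply that attaching metadata to free-text notes lets a retrieval system narrow candidates before searching the text itself.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical example: a care note stored as a structured record with
# metadata, rather than free text dropped into a shared drive.
@dataclass
class CareNote:
    resident_id: str
    author: str
    created: date
    category: str          # e.g. "falls", "medication", "nutrition"
    text: str
    tags: list[str] = field(default_factory=list)

def filter_notes(notes, category=None, since=None):
    """Metadata lets a retrieval layer narrow the candidate set
    before any semantic search over the note text itself."""
    return [
        n for n in notes
        if (category is None or n.category == category)
        and (since is None or n.created >= since)
    ]

notes = [
    CareNote("R-014", "J. Smith", date(2025, 11, 1), "falls",
             "Resident steadied self on rail; no injury observed."),
    CareNote("R-014", "A. Lee", date(2025, 11, 12), "medication",
             "Evening dose administered as charted.", tags=["routine"]),
]

recent_falls = filter_notes(notes, category="falls", since=date(2025, 10, 1))
print(len(recent_falls))  # one matching note
```

The same principle applies regardless of the storage technology: the metadata fields, not the prose, are what make the notes findable at scale.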
10. Tailor the Message to Your Executive Team
When asked for an elevator pitch to persuade change-resistant executives, Dr Tom Kelly was pragmatic: "Whatever's the most painful thing that you could change for them, there'll be some use case that you could come up with." If they're resistant to any change, he suggested, it might be time to focus your efforts elsewhere.
David Diviny offered a more strategic approach: identify existing services that are constrained or rationed because you lack the people to deliver them. Frame AI as a way to provide these services at greater scale or with more personalisation. This shifts the conversation from "new technology to adopt" to "problems we can finally solve."
Moving Forward: Your Next Steps
The overarching message from the Building Workforce Capability with AI event was clear: the time for AI in aged care is now, but implementation must be thoughtful, strategic, and grounded in improving outcomes for residents and staff alike.
As you consider your organisation's AI journey, remember that this technology isn't about replacing human connection – it's about enabling it. By reducing administrative burden, personalising learning and development, and freeing up time for meaningful care interactions, AI can help your workforce deliver the quality of care that drew them to aged care in the first place.
Start small, measure what matters, involve your frontline workers, and build the governance frameworks that enable confident adoption. The future of aged care workforce capability is being written now, and the organisations that act thoughtfully and strategically will be best positioned to attract, develop, and retain the capable workforce that quality care demands.
About the Speakers
Will Egan - Ausmed - CEO
As CEO of Ausmed, Will Egan is passionate about helping health and aged care providers unlock the potential of their people. He's leading Ausmed's next evolution: bringing AI into learning and capability development in a safe, meaningful way.
Through new features like AI-assisted reflection, adaptive assessments and intelligent policy chat tools, Ausmed is showing how artificial intelligence can enhance critical thinking and personalise learning without compromising governance or human connection.
At Building Workforce Capability with AI, Will shared how providers can embrace this technology confidently, using it to build more capable, future-ready teams across the care sector.
Elyssia Clark - Benetas - General Manager of Marketing, Insights and Customers
With a wealth of executive leadership experience across healthcare, technology, financial services and consulting, Elyssia Clark speaks directly to the everyday aged care leader navigating the emerging role of AI in their organisation. As General Manager, Customer, Insights & Marketing at Benetas, Elyssia oversees five teams spanning marketing, communications, sales, customer service and research & data analytics, bringing strategic, data-driven insight to every facet of customer experience and workforce engagement.
At Benetas, she has led early, practical applications of generative AI, from expediting content development and multilingual video explainers to exploring social robotics and voice-based information services, all with a focus on safety, governance and real-world workforce enablement.
A former media spokesperson for SEEK, Elyssia is the Co-Chair of the Research Society Human Insights Conference, a past elected Director of the Research Society Board, a regular CX Awards judge and a frequent speaker at aged care, HR and customer experience conferences across Australia.
Dr Jack Feehan - RMIT - AI in Aged Care Research Fellow
Dr Jack Feehan is the Deputy Director of Healthy Longevity and Chronic Disease at RMIT's School of Health and Biomedical Sciences. An Allied Health Professional (Osteopath) by training, he holds a PhD in Gerontology from the University of Melbourne and has authored more than 100 publications. His research spans ageing, chronic disease, and digital health, and has attracted over A$2 million in competitive research funding.
He leads an AI in Aged Care program in partnership with Directed Electronics and CSIRO, focused on translating evidence and technology into real-world impact. The program aims to improve care quality, strengthen operational capacity, and lift efficiency across home-care organisations, supporting providers to deliver safer, more responsive, and person-centred services at scale.
Dr Tom Kelly - Heidi Health - CEO
Dr Tom Kelly is co-founder and Chief Executive Officer of Heidi Health, and leads one of Australia's fastest-growing AI businesses in healthcare. A trained clinician who completed surgical training and worked in frontline practice, Tom founded Heidi to reduce administrative burden for doctors and improve patient care.
Under his leadership, Heidi has returned over 18 million hours to frontline clinicians since its inception and is used in over 2 million patient encounters every week worldwide. He combines medical experience with deep technical and commercial expertise, guiding Heidi's scaling, fundraising and operational deployment while emphasising clinician trust, data privacy and pragmatic, safe use of generative AI in clinical settings.
Tom is a regular speaker on AI leadership in health, focusing on responsible AI adoption, clinician-centred product design, and the practical challenges of bringing AI safely into routine care.
David Diviny - Nous - Chief Data and Analytics Officer and Head of AI
David Diviny is a leading strategic thinker on Generative AI in the public sector. He has delivered a wide array of AI and machine-learning projects that surface novel insights from large datasets and embed GenAI into enterprise processes. He leads cross-disciplinary analytics and engineering teams that translate advanced models into measurable policy and operational outcomes, with an emphasis on rigorous model governance, risk management and stakeholder-centred deployment.
David advises senior public-sector and enterprise leaders on data strategy, capability uplift and the practical governance needed to scale responsible AI. He is a regular presenter on topics including operationalising GenAI, evidence-based evaluation of AI systems, and aligning analytics to organisational change, bringing a pragmatic, outcomes-focused approach that bridges technical, policy and organisational perspectives.

