AI in Healthcare: How to Use It Without Breaking Confidentiality

Every week, more New Zealand clinicians are using AI in their day-to-day work. Some are doing it cautiously, under approved frameworks. Others are quietly typing patient details into ChatGPT because it saves them twenty minutes. The gap between those two realities is where your organisation’s biggest liability lives.
This isn’t a case against AI. The evidence for its value in healthcare is compelling. It’s a case for using it correctly: understanding where the risks sit, which tools have passed scrutiny in the New Zealand context, and what your obligations actually are.
———————————————————————————————————————————
The Risk Most Leaders Are Underestimating
Here’s the scenario that plays out constantly: a clinician, buried in documentation after a long shift, pastes patient notes into a free AI tool to generate a referral letter. It works. Nobody finds out.
The problem isn’t whether anyone finds out today. It’s that the data may now be retained by the AI provider, used to train future models, and stored outside New Zealand, beyond the reach of our health information governance structures.
Health New Zealand’s guidance is unambiguous: employees and contractors must not enter any personal, confidential or sensitive patient or organisational data into unapproved LLMs or Generative AI tools. Doing so will almost certainly constitute a breach of the Privacy Act 2020 and the Health Information Privacy Code. There’s no goodwill exemption for time-pressured clinicians.
There’s also an equity dimension worth naming. Generative AI has largely been developed with a focus on major global languages, leaving Te Reo Māori and other minority languages under-represented, creating risks of misunderstanding, inaccuracy, and miscommunication. In a sector where Māori health outcomes are already inequitable, that’s a clinical risk, not a minor technical footnote.
————————————————————————————————————————————
Where AI Is Already Working Safely
The clearest win right now is AI scribes. During a pilot at Hawke’s Bay Hospital’s Emergency Department, Heidi Health reduced average documentation time from around 17 minutes to just over four minutes per patient, with clinicians able to see an extra patient per shift and after-shift administrative work falling by up to 81%.
The human impact was just as significant. One psychiatrist entered the trial with self-rated stress at 10 out of 10, on the verge of leaving the profession. By the trial’s end, they rated it at two out of 10. That’s a retention story as much as a productivity one.
Following stringent reviews covering privacy, cybersecurity, and data sovereignty, Heidi was endorsed by the National AI and Algorithm Expert Advisory Group (NAIAEAG) and is now being rolled out across New Zealand’s emergency departments, with 1,000 clinician licences and 100 for mental health crisis teams.
Beyond scribes, AI can also add immediate value in areas that carry no privacy risk at all: scheduling, roster planning, internal communications. If there’s no patient-identifiable data in the input, the primary risks don’t apply. That’s genuinely low-hanging fruit most organisations haven’t picked.
————————————————————————————————————————————
Building a Confidentiality-Safe AI Culture
Approved tools are necessary but not sufficient. Most privacy exposure comes from staff who simply don’t know what they can and can’t do. A few things that actually work:
Make the approved path easier than the workaround. If endorsed tools are harder to access than ChatGPT, you’ve already lost. Heidi achieves high adoption because it’s genuinely faster than the alternative and the compliance is built into the experience.
Train on the why, not just the what. “Don’t use ChatGPT with patient data” isn’t enough. People need to understand what a Privacy Act breach means for the patient, for them personally, and for the organisation.
Apply Māori data sovereignty principles from the start. Relevant iwi, hapū, whānau, and Māori organisations should be included in decision-making as partners throughout the conception, planning, governance, design, and implementation of AI tools. This is a Te Tiriti obligation, not a consultation checkbox.
Keep clinicians accountable. There’s a critical distinction between AI for clinical decision support and AI making clinical decisions: clinicians must maintain final accountability, with AI serving as a tool. AI-generated documentation needs to be reviewed before it becomes a clinical record.
————————————————————————————————————————————
The Workforce Angle
New Zealand’s healthcare sector is in the middle of a documented burnout crisis: 70% of GPs rate themselves as moderately to highly burnt out, nursing burnout sits at 62%, and excessive administrative workload is identified as a primary driver. The organisations that get AI right aren’t just reducing compliance risk; they’re addressing one of the most grinding parts of clinical work.
The organisations that win the recruitment and retention competition of the next decade will be those that can tell candidates: we’ve invested in tools that make your working life materially better, and we’ve done it without compromising the patients in your care.
That combination is rarer than it should be. And right now, it’s a genuine differentiator.
————————————————————————————————————————————
At Frontline, we work with healthcare organisations across Aotearoa to structure hires for the long term, including the capability needs that come with AI adoption. If you’re building a workforce strategy that accounts for where the sector is heading, we’d like to talk.
————————————————————————————————————————————
Sources: Health New Zealand | Te Whatu Ora, Generative AI and LLM guidance, 2025 · RNZCGP, MCNZ AI Statement submission, October 2025 · npj Digital Medicine, AI governance in Aotearoa NZ, 2023 · PMCSA, Capturing the benefits of AI in healthcare for Aotearoa NZ, December 2023 · MBIE, NZ AI Strategy, July 2025 · Heidi Health, NZ ED rollout, November 2025 · NZ Herald, Heidi rollout and privacy analysis, November 2025 · Newsroom, Health NZ security flaw report, March 2026 · RNZCGP, 2024 Workforce Survey · Weston K., Nursing Praxis in Aotearoa NZ, 2024 · NZMJ, Aotearoa NZ doctor shortage, 2024