Can you use AI for record keeping without violating GDPR?
A GP finishes a consultation and asks ChatGPT to write a short journal note:
“Summarize this: 54-year-old man, chest pain last 3 days, using atorvastatin, previous MI.”
In a matter of seconds, the doctor receives a tidy journal note back.
It seems effective - but what many people don't consider is that the patient data may have been sent to an external AI service without a data processing agreement.
In many cases, this may violate data protection rules.
Many people are surprised by this - because AI itself is not illegal in healthcare.
The challenge is what happens to the patient data after it is sent to an AI tool.
More and more doctors and therapists are testing AI tools to write journal entries, dictate referrals or summarize consultations.
Tools like ChatGPT do this impressively well.
But here an important question arises:
Is it legal to use AI on patient data?
Short answer:
Yes - but only if certain requirements are met.
The problem is that many general AI services are not designed for health data and can therefore violate data protection rules if used directly for record keeping.
This guide explains:
- what the GDPR actually requires
- why general AI tools can be problematic
- what an AI system must have to be safe in clinical use
Why health data is given extra protection
The General Data Protection Regulation classifies health data as special categories of personal data.
This means that it has stricter protection than ordinary personal data.
Examples of health information:
- diagnoses and treatment history
- symptoms and clinical observations
- medication use
- test results
- discharge summaries and journal notes
- information about mental health
- genetic and biometric data
The rules are set out in GDPR Article 9, which regulates the processing of special categories of personal data.
Source:
GDPR art. 9
https://gdpr-info.eu/art-9-gdpr/
Can doctors use ChatGPT for record keeping?
Many doctors and therapists have tested this.
But in practice, it can create serious privacy problems.
Rule of thumb:
If you have not signed a data processing agreement with your provider, you should not send patient data to the system.
1. Lack of data processing agreement
When you enter patient information into an AI tool, you are sending data to an external service.
Under the GDPR, this is considered data processing by an external party.
To make this legal, there must be a data processing agreement between the clinic and the provider.
If this does not exist, the processing may be in violation of the GDPR.
Source:
GDPR art. 28
https://gdpr-info.eu/art-28-gdpr/
2. Data can be stored outside the EEA
Several AI services store data in the US or other third countries.
The transfer of health data outside the EEA requires special legal mechanisms, such as:
- EU Standard Contractual Clauses
- Binding Corporate Rules
- adequacy decisions
This makes the use of general AI services legally more complex in the healthcare sector.
3. Limited control over storage and deletion
When patient data is sent to an external AI service, you need to know:
- where the data is stored
- how long it is stored
- who has access
If this is not clearly regulated, it may violate information security requirements.
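To make "how long data is stored" concrete, here is a minimal Python sketch of an automated retention check. The record IDs, dates and ten-year period are placeholders for illustration only, not guidance on actual statutory retention periods.

```python
# Hypothetical retention sketch: records older than the agreed retention
# period are flagged for deletion. The ten-year period and the dates are
# placeholders, not guidance on actual statutory retention periods.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365 * 10)  # placeholder retention period

records = {  # record_id -> stored-at timestamp (illustrative data)
    "rec-001": datetime(2010, 1, 5, tzinfo=timezone.utc),
    "rec-002": datetime(2024, 6, 1, tzinfo=timezone.utc),
}

now = datetime.now(timezone.utc)
expired = [rid for rid, stored in records.items() if now - stored > RETENTION]
for rid in expired:
    del records[rid]  # in a real system: secure, logged deletion

print(expired)  # -> ['rec-001']
```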
What happens if patient data ends up in the wrong system?
Breaches of data protection rules in the healthcare sector can lead to:
- supervision from the Norwegian Data Protection Authority (Datatilsynet)
- fines of up to 20 million euros or, for undertakings, up to 4 % of global annual turnover, whichever is higher
- compensation claims from patients
- in serious cases, supervisory proceedings against healthcare professionals
This does not mean that AI is dangerous - but that the tool must be designed to handle health data.
When can AI be legally used in healthcare?
AI is not illegal in healthcare.
But the processing must have a valid legal basis.
The most common are:
1. Necessary for health care
Healthcare professionals can process health information if it is necessary for:
- diagnostics
- treatment
- record keeping
This is regulated by GDPR Article 9(2)(h).
2. Fulfillment of duty of care
In Norway, healthcare professionals have a statutory duty to keep records.
This follows from the Health Personnel Act §39.
If AI is used as a tool to fulfill this obligation, it may be legal - provided that privacy is safeguarded.
Source:
Health Personnel Act §39
https://lovdata.no/lov/1999-07-02-64/§39
What requirements must an AI tool meet in healthcare?
To be used safely in clinical work, an AI system must normally have:
1. Data processing agreement
Regulates how patient data is processed.
2. Storage within the EEA
To avoid complicated third country transfers.
3. Encryption
Data must be encrypted both:
- during transfer
- during storage
4. Access control
Only authorized personnel should have access.
5. Logging
All accesses to patient data must be logged.
These requirements follow from GDPR Article 32 on information security.
Source:
https://gdpr-info.eu/art-32-gdpr/
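To make these requirements more concrete, here is a minimal Python sketch of what Article 32-style measures can look like in code: encryption at rest, a simple access check, and an audit log. It assumes the third-party `cryptography` package; the class, roles and record IDs are hypothetical and not taken from any real clinical system.

```python
# A minimal sketch of Article 32-style measures: encryption at rest,
# a simple access check, and an audit log. Class, roles and record IDs
# are hypothetical; requires the third-party `cryptography` package.
import logging
from datetime import datetime, timezone

from cryptography.fernet import Fernet

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("patient_data_audit")

AUTHORIZED_ROLES = {"physician", "nurse"}  # placeholder role model


class PatientRecordStore:
    """Keeps journal notes encrypted at rest and logs every access."""

    def __init__(self, key: bytes) -> None:
        self._fernet = Fernet(key)
        self._records: dict[str, bytes] = {}  # record_id -> ciphertext

    def save(self, record_id: str, note: str, user: str, role: str) -> None:
        self._check_access(user, role, "save", record_id)
        self._records[record_id] = self._fernet.encrypt(note.encode("utf-8"))

    def read(self, record_id: str, user: str, role: str) -> str:
        self._check_access(user, role, "read", record_id)
        return self._fernet.decrypt(self._records[record_id]).decode("utf-8")

    def _check_access(self, user: str, role: str, action: str, record_id: str) -> None:
        allowed = role in AUTHORIZED_ROLES
        # Every access attempt is logged, whether it succeeds or not.
        audit_log.info(
            "%s user=%s role=%s action=%s record=%s allowed=%s",
            datetime.now(timezone.utc).isoformat(), user, role, action, record_id, allowed,
        )
        if not allowed:
            raise PermissionError(f"{user} ({role}) may not {action} {record_id}")


# Usage: generate a key, store a note, read it back as an authorized user.
store = PatientRecordStore(Fernet.generate_key())
store.save("rec-001", "54-year-old man, chest pain for 3 days.", user="dr_a", role="physician")
print(store.read("rec-001", user="dr_a", role="physician"))
```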
How MediVox handles privacy
AI in health requires solutions that are designed for clinical use.
Developed specifically for healthcare professionals, MediVox is designed to meet the legal and technical requirements that apply to the processing of patient data in the healthcare sector.
Among the measures are:
- a data processing agreement with all customers
- storage of data in Norway
- encrypted transmission and storage
- access control and logging
- clear routines for deleting data
Want to see how it works in practice?
We're also hosting a webinar where we'll demonstrate how AI can be used for record keeping, letters and documentation in clinical work.
You can see more about the webinar here:
Streamlining medical notes - how AI can free up doctors' time
https://events.medivox.ai/
Checklist: Is the AI tool you use GDPR compliant?
Go through these questions:
- Is there a data processing agreement?
- Do you know where your data is stored?
- Is data stored within the EEA?
- Is data encrypted during transmission and storage?
- Is there access control and logging?
- Are patients informed about the use of AI?
If you cannot answer yes to all of these questions, you should assess the solution further before using it on patient data.
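If you want to make the checklist operational, a simple pre-flight check can be expressed in code. The sketch below is illustrative; the field names are invented and should be adapted to your own assessment.

```python
# A hypothetical pre-flight check mirroring the checklist above.
# The field names are invented; adapt them to your own assessment.
from dataclasses import dataclass


@dataclass
class AiToolAssessment:
    has_data_processing_agreement: bool
    storage_location_known: bool
    stored_within_eea: bool
    encrypted_in_transit_and_at_rest: bool
    has_access_control_and_logging: bool
    patients_informed: bool

    def ready_for_patient_data(self) -> bool:
        """All checklist answers must be yes before patient data is sent."""
        return all(vars(self).values())


assessment = AiToolAssessment(
    has_data_processing_agreement=True,
    storage_location_known=True,
    stored_within_eea=True,
    encrypted_in_transit_and_at_rest=True,
    has_access_control_and_logging=True,
    patients_informed=False,  # a single open item is enough to block use
)
print(assessment.ready_for_patient_data())  # -> False
```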
FAQ - frequently asked questions
Do I need patient consent to use AI?
Not necessarily.
If AI is used as a tool for diagnostics, treatment or record keeping, the legal basis may be that the processing is necessary for the provision of health care.
But patients should be informed about how their data is processed.
Can the AI provider see patient data?
It depends on the system.
Access may be necessary in certain cases, for example when troubleshooting.
In such cases, access should be restricted and logged.
Are AI journals legal in Norway?
Yes.
AI can be used as a tool for writing journal notes, as long as healthcare professionals remain professionally responsible for the content of the journal.
The journal must still meet the requirements of the Health Personnel Act.
Can I use ChatGPT if I remove names?
Not necessarily.
Even without a name, information such as age, diagnosis, symptoms or rare conditions can make a patient identifiable.
This means that even partially anonymized information can be personal data under the GDPR.
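A small sketch makes the point concrete: stripping a known name from a note leaves the quasi-identifiers untouched. The name and note below are made up for illustration.

```python
# Naive redaction sketch: removing a known name does not anonymize the
# note. The name and note are made up for illustration.
import re


def redact_name(text: str, name: str) -> str:
    """Replace a known name; everything else is left intact."""
    return re.sub(re.escape(name), "[REDACTED]", text)


note = "Ola Nordmann, 54, chest pain for 3 days, previous MI, on atorvastatin."
print(redact_name(note, "Ola Nordmann"))
# -> "[REDACTED], 54, chest pain for 3 days, previous MI, on atorvastatin."
# Age, history and medication can still single out a patient in a small
# practice, so the output remains personal data under the GDPR.
```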
Does data have to be stored in Norway?
No, it doesn't.
But it normally needs to be stored within the EU/EEA if it contains personal data.
Can AI tools use patient data for model training?
Only if this is explicitly regulated in the data processing agreement.
In many cases, this will require pseudonymization or anonymization of data.
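As an illustration, pseudonymization can be done with a keyed hash (HMAC), so only the key holder can recreate the link between pseudonym and patient. The sketch below is simplified; real key handling belongs in a key management system, and pseudonymized data remains personal data under the GDPR.

```python
# Sketch of pseudonymization with a keyed hash (HMAC): a patient ID is
# replaced by a stable pseudonym that only the key holder can recreate.
# The key below is a placeholder; real keys belong in a key management
# system. Pseudonymized data is still personal data under the GDPR.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"  # placeholder only


def pseudonymize(patient_id: str) -> str:
    """Derive a stable, hard-to-reverse pseudonym from a patient ID."""
    digest = hmac.new(SECRET_KEY, patient_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]


print(pseudonymize("NO-1987-000123"))  # same input always -> same pseudonym
```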
What happens if a patient requests deletion?
Patients have the right to erasure in many situations.
But in the health service, the journal retention obligation often takes precedence over the right to erasure.
A situation many doctors will recognize
In a busy clinical environment, it's easy to test new tools to save time.
As a result, many doctors have tried pasting a journal entry or discharge summary into an AI tool to improve the wording or create a summary.
It's understandable - the need for better tools for clinical documentation is considerable.
However, when patient data is sent to general AI services without a data processing agreement or control over where the data is stored, it can also create legal and privacy challenges.
Conclusion
AI can be a powerful tool for healthcare professionals.
But when patient data is involved, the solution must meet strict requirements:
- privacy
- data security
- legal responsibility
The most important question is therefore not whether you use AI, but whether the tool is designed to handle patient data.
Sources
GDPR - General Data Protection Regulation
https://gdpr-info.eu
Norwegian Data Protection Authority - Artificial intelligence and privacy
https://www.datatilsynet.no/regelverk-og-verktoy/rapporter-og-utredninger/kunstig-intelligens/
Norwegian Data Protection Authority - Privacy and artificial intelligence (recommendations)
https://www.datatilsynet.no/regelverk-og-verktoy/rapporter-og-utredninger/kunstig-intelligens/anbefalinger/
Health Personnel Act
https://lovdata.no/lov/1999-07-02-64