This guidance has been reviewed by the Health and Care Information Governance Working Group, including the Information Commissioner's Office (ICO) and National Data Guardian (NDG).
Microsoft 365 Copilot, including Copilot Chat (called copilot from this point), is a tool that uses artificial intelligence (AI). Health and care organisations may use copilot to help with day-to-day tasks.
Copilot has both a free and a paid-for version. This guidance relates to the paid-for enterprise version, which integrates with Microsoft 365 apps.
Copilot has many uses, for example: answering questions about the contents of documents, summarising work meetings, or automating some simple tasks.
Copilot uses 2 main sources of information:

- publicly available information from the internet
- information held in your organisation’s Microsoft 365 systems, such as emails and documents

The information held in your organisation’s systems is likely to include personal data and may include data that is considered sensitive.
Health and care organisations can use copilot to make tasks such as searching documents, taking notes and other admin work more efficient. The IT systems that hold the health and care records used for your care do not use Microsoft 365. For example, the IT system used by your GP practice to record your medical notes is not provided by Microsoft.
Your data may, however, be used by copilot if it is held in the Microsoft 365 applications used by your health or care organisation. For example:

- a letter about your appointment written in Microsoft Word
- an email about your care sent using Microsoft Outlook
Health and care organisations will look at what information they hold in Microsoft 365 systems before they start using copilot. This will help them to decide whether copilot should access that information or not. If not, they will move the information so it cannot be accessed by copilot.
Health and care organisations will check that copilot is only used in the ways they planned.
The use of copilot will not directly impact your care. Copilot will never be used to decide the care you receive. It will not replace a healthcare professional’s expertise or affect their decisions about your care.
Data protection laws give you rights over how your personal data is used. These rights still apply when an organisation uses copilot. You should see your health or care organisation’s privacy notice for more information about your individual rights.
For general information about copilot, how it works and its benefits, please see the Copilot hub page.
Microsoft 365 is not deployed as a clinical system. While copilot agents may be able to interact with other local systems, they should not access patient health record systems or triage systems.
We therefore do not expect extensive data about your patients and service users to be used by copilot. This will, however, depend on how your organisation currently uses Microsoft 365 applications. The same applies to any personal data stored within your Microsoft system, such as HR records. You should always ensure any personal data is appropriately restricted to only those who require access.
If you are part of NHS.net Connect, your use must comply with the Acceptable Use Policy. If you do not use NHS.net Connect, your organisation should have its own Acceptable Use Policy, which will likely have similar requirements.
Copilot comes pre-trained and does not use your organisational data for further training.
Copilot warns users after each prompt that the responses generative AI produces are not guaranteed to be 100% factual. Users must use their judgement when reviewing outputs before sending them to others or using them in official documentation. Copilot can provide useful drafts and summaries to help users achieve more, and it gives users the chance to review AI-generated content rather than fully automating these tasks. As a user, it is your responsibility to check the accuracy of copilot outputs, particularly if any personal data is involved. This includes verifying information or statistics against an independent second source where possible.
Copilot must not be used for clinical decision-making, diagnostics, or as a substitute for a healthcare professional’s expertise. The tool can support clinical administration but should not be used to provide individuals with care, and therefore cannot be relied upon to inform treatment decisions, determine patient care pathways, or interpret health and care data. There is a clinical safety case and hazard log available for copilot.
Copilot may be used by staff to seek specialist advice, such as legal or compliance advice. In these cases, staff should always check the guidance they receive from copilot with a suitably qualified expert to confirm that it is accurate.
It is important to note that large language models can introduce new functions unintentionally, and can also be used in ways that go beyond the tool’s intended purpose. This can lead to copilot inadvertently being used in ways that are not approved by your organisation. It is therefore important for your organisation to monitor use on an ongoing basis.
You should speak to your information governance (IG) team or your Data Protection Officer (DPO) if you need support relating to the use of information by copilot. You can also speak to your Caldicott Guardian about ethical, lawful and appropriate uses of information.
If you are adopting copilot, you are encouraged to assess what information you hold in your tenant to confirm exactly which data sets copilot may use. You can use your Information Assets and Flows Register (IAFR) to help with this task, updating it where necessary.
If you are processing personal data using copilot, a Data Protection Impact Assessment (DPIA) is recommended because the technology is new and has the potential to process large amounts of data. It is important to review the different use cases for copilot, as the risk can vary depending on its intended use. A template DPIA for copilot has been provided to help you understand and assess the processing. You should record in your DPIA the information you have assessed as being used in your tenant. You may decide to have separate DPIAs for specific uses of copilot.
It may also be necessary to update DPIAs for other processes or projects where copilot will be used as part of those processes, for example, if you have a DPIA for a research project that will be supported by copilot, ensure that the DPIA for that project also reflects this. See guidance from the Health Research Authority (HRA) for further information on when a DPIA is required for research projects.
You should update your privacy notice for patients, service users and staff to inform individuals about your use of copilot where this involves processing of personal data. You should also assess which other methods of communicating the data use are needed to meet your organisation’s transparency requirements. This could include posters, emails or information leaflets. We have developed a privacy notice template that you can use to develop your wider privacy notices, along with the recommended wording below to include for copilot:
[Your organisation’s name] uses copilot, an AI tool, to help with certain tasks such as:

- answering questions about the contents of documents
- summarising work meetings
- automating some simple tasks
Copilot does not replace the expertise of your health and care professional and is never used to make decisions about the care you receive. The rights set out in this privacy notice will apply when it is used.
When generating responses, copilot processes data held in the shared tenant in its original location and does not store it elsewhere.
Copilot saves and stores prompts and responses in the individual user’s mailbox. These remain within your organisation’s Microsoft tenant and are stored in line with your wider Microsoft 365 setup.
The lawful basis for processing personal information will vary for each task, and it is important that you assess this on a case-by-case basis. As the purpose of copilot is to aid staff in performing certain tasks within the Microsoft 365 environment, the lawful basis for using copilot will typically be the same lawful basis that applies to the underlying task. For example:

- if copilot is used to summarise documents as part of an existing administrative process, the lawful basis will be the one you have already identified for that process
- if copilot is used to draft a letter supporting an individual’s care, the lawful basis will be the one that already applies to that care task
Because copilot is integrated with your Microsoft tenant, it applies all existing Microsoft security, compliance and privacy controls that you have already deployed in your tenant. Organisations are responsible for monitoring their own privacy settings.
If you are on the NHS.net Connect shared tenant, find information on how to do this.
Ensuring data minimisation (for example, that copilot only accesses and uses the data that is strictly necessary for its task) depends on a number of factors, including what information is held in your tenant and how access to it is configured. You can achieve this by:

- reviewing the information held in your tenant and moving or excluding anything copilot should not access
- restricting access permissions so that staff (and therefore copilot) can only reach the data they need
- monitoring your privacy and sharing settings on an ongoing basis

One way such an access review might look in practice is sketched below.
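As an illustration of the kind of access review described above, the sketch below uses the Microsoft Graph API to flag files in a document library whose sharing links are scoped to the whole organisation or to anonymous users, since broadly shared files are the ones copilot is most likely to surface to unintended users. This is a minimal sketch, not a supported tool: the drive ID, the token acquisition and the decision to check only sharing-link scope are assumptions for this example, and your own tenant, permissions model and tooling may differ.

```python
# Illustrative sketch only: flag files whose sharing links have a broad scope.
# Assumes an app registration with Files.Read.All and a valid access token.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "..."  # hypothetical: obtain via your organisation's auth flow
DRIVE_ID = "..."      # hypothetical: the document library to review

HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def list_items(drive_id):
    """Yield items in the root of a drive, following Graph pagination."""
    url = f"{GRAPH}/drives/{drive_id}/root/children"
    while url:
        resp = requests.get(url, headers=HEADERS)
        resp.raise_for_status()
        data = resp.json()
        yield from data.get("value", [])
        url = data.get("@odata.nextLink")  # present when more pages remain

def broad_sharing_links(drive_id, item_id):
    """Return sharing links scoped wider than specific named users."""
    url = f"{GRAPH}/drives/{drive_id}/items/{item_id}/permissions"
    resp = requests.get(url, headers=HEADERS)
    resp.raise_for_status()
    return [
        p for p in resp.json().get("value", [])
        if p.get("link", {}).get("scope") in ("organization", "anonymous")
    ]

for item in list_items(DRIVE_ID):
    flagged = broad_sharing_links(DRIVE_ID, item["id"])
    if flagged:
        scopes = {p["link"]["scope"] for p in flagged}
        print(f"Review access to '{item['name']}': shared with scope {scopes}")
```

Files flagged in this way can then be moved, or have their sharing links removed, before copilot is enabled, in line with the steps above.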
The National Acceptable Use Policy (AUP), which applies to NHS.net Connect users, is designed to prevent copilot being used in a way that constitutes automated decision-making. Local AUPs must do the same and can use the national policy as a basis. It is important that this is socialised locally (see 'How can we make sure staff are using it correctly?'). Copilot returns results to the user, who chooses whether to use or disregard them. This human check is essential to ensure that no automated decision-making is taking place.
Organisations should consider local arrangements for socialising the expectations around copilot use. You could do so through staff communications, training or local policies.
The AUP should be socialised with users of copilot.
In addition to socialising expectations, organisations should implement audit arrangements, which may involve regularly reviewing how people are using copilot and its outputs, for example by summarising audit log exports (a sketch of this follows).
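As an illustration, the sketch below summarises copilot activity per user from a CSV export of your audit log. The file name, the column names ('UserIds', 'Operations') and the operation name 'CopilotInteraction' are assumptions for this example; check them against the export format your own audit tooling produces.

```python
# Illustrative sketch only: summarise copilot activity from an audit log export.
# The column and operation names are assumptions; adjust to match your export.
import csv
from collections import Counter

EXPORT_FILE = "audit_log_export.csv"  # hypothetical file name

interactions = Counter()
with open(EXPORT_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Count one interaction per user per logged copilot operation.
        if "CopilotInteraction" in row.get("Operations", ""):
            interactions[row.get("UserIds", "unknown")] += 1

# A simple per-user summary to support a regular audit review.
for user, count in interactions.most_common():
    print(f"{user}: {count} copilot interactions")
```

A regular summary like this can help identify unusually heavy use, or use by staff who have not completed the expected training, which can then be followed up through your normal audit process.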
The process for complying with individuals’ rights under the UK General Data Protection Regulation (UK GDPR) is not significantly altered by the use of copilot. Information continues to be stored in your Microsoft tenant, and your existing processes will likely be sufficient, though this may need to be assessed on a case-by-case basis. Further information and considerations can be found in the template copilot DPIA.
The addition of copilot to your Microsoft suite will not significantly change your records management practices and should not cause issues. It may, however, highlight existing issues. For example, copilot may surface data that an individual has access to but should not have (or did not previously know they had access to). For NHS.net Connect users, this may occur across the shared tenant, meaning that users in other organisations may be able to surface your data if it is public. Staff should report any access to data that they should not have, or do not need, through their organisation’s information incident reporting process.
To prevent this, organisations are encouraged to check their privacy settings. For organisations using NHS.net Connect, read guidance on how to do this.
The national data opt-out (NDOO) may apply to disclosures of data beyond individual care, for example, for research and planning purposes.
The NDOO does not apply to specific tools that are used to support processing (that is, an individual cannot use the NDOO to opt out of the processing of their data by copilot). However, it may apply to the project, programme or purpose that copilot is supporting in specific circumstances. For example, if copilot is used to support a research or planning project that relies on confidential patient information disclosed under section 251 support, the NDOO applies to that disclosure, and data relating to individuals who have opted out must be excluded before it is used in copilot.
By contrast, where the data being used in copilot is not reliant on section 251 support (for example, because it has been rendered anonymous or provided with consent), the NDOO does not apply.
This would not prevent you from using copilot to support a task associated with the same individual’s care, such as writing them a letter about an appointment, as the opt-out would not apply to that purpose.
You should ensure that for any processing that is subject to the NDOO, the relevant DPIA reflects use of copilot where appropriate and details how you will exclude relevant data from copilot’s searches.
See Understanding the national data opt-out for more information on when the NDOO applies.
The Information Commissioner’s Office has produced a consultation series on how data protection laws apply to the development and use of generative AI.