Is Copilot Safe?

Is Copilot safe? It’s redefining productivity and efficiency in businesses, but there are still a number of security considerations businesses need to address – we show you what they are and how to mitigate the risks.

18.04.24 Charles Griffiths

There is always tension between efficiency and security.

We all witnessed this during COVID-19, when IT departments deployed Microsoft Teams before they could fully understand its underlying security model – or how well-maintained their organisation’s M365 permissions, groups, and link policies were.

While Microsoft 365 Copilot is an excellent productivity tool, businesses still need to be aware of a number of security considerations.

So, what is Copilot and is it safe for your business?

Understanding Microsoft 365 Copilot

Microsoft 365 Copilot is an AI-powered tool that assists users with tasks like drafting emails, summarising content, analysing data or creating documents. What makes it more powerful than tools like ChatGPT is its integration with Microsoft 365 applications and the Microsoft ecosystem.

This means Copilot can access everything a user has worked on, such as documents, emails and presentations, to provide more personalised, context-aware answers.

Which raises the question:

Is Copilot for M365 Safe?

Microsoft has implemented strong encryption, training boundaries, tenant isolation, and more to ensure the safety and security of all data supplied to its new AI tool.

However, Copilot can use everything a 365 user has access to. This means businesses need to be extra cautious when granting employees access to the content they need to complete their jobs.

Yet according to Microsoft, more than 50% of identities in Microsoft 365 are ‘super admins’ – meaning they’ve inadvertently been granted access to all permissions and resources in their organisation.

Those businesses are risking their data security.

So, how does Microsoft manage risk for you, and what can you do to make your sensitive data safer when using Copilot?

How Microsoft Manages Risk for You

Tenant Isolation

The Microsoft architecture means that Copilot only uses data from the current user’s M365 tenant. The AI tool will not pull data from other tenants where the user may be a guest, nor from tenants that might be set up with cross-tenant sync.

Training Boundaries

In its privacy policy, Microsoft assures users that Copilot is not trained on any of their business data. So, you don’t have to worry about your proprietary data showing up in Copilot’s responses to other businesses.

Two-Way Data Encryption with MS Server

Microsoft encrypts the user’s M365 data between the server and client, and at rest to protect it from unauthorised access. This means that if there’s a data leak, it is unlikely to be from Microsoft’s servers.

Data Retention Policies

Microsoft has put in place data retention policies that ensure the user’s data is only retained for as long as necessary.

Integrating Copilot safely

Risks You Need to Manage

Businesses need to manage a number of elements and settings themselves — failing to do so opens up potential security risks.

Granting and Revoking Permissions

Granting users access to only what they need for a project sounds like an excellent idea. However, most companies are unable to easily enforce the principle of least privilege (PoLP) in Microsoft 365.

Microsoft 365 permissions can be quite complex, particularly for the average user (who likely doesn’t work in the IT department). There are many ways in which a user can access data in Microsoft’s ecosystem:

  • Direct user permissions
  • Microsoft 365 group permissions
  • Guest account access
  • SharePoint local permissions
  • External/public access
  • Link access (anyone with a link)

The average organisation has 1 in 10 M365 files open to all employees.

To make matters worse, the power to grant or revoke permissions is mostly in the hands of end users instead of IT or security teams. As is often the case, end users quickly move on to the next “almost due” task without giving a thought to the permissions they’ve granted.
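To make the scale of the problem concrete, here is a minimal Python sketch of the kind of least-privilege audit an IT team might run over a sharing report. The report format, field names, and scope values are illustrative assumptions, not a real Microsoft 365 export schema:

```python
# Hypothetical sketch: flag over-shared files from a permissions export.
# The entry format and scope names below are assumptions for illustration,
# not an actual Microsoft 365 report schema.

# Sharing scopes that exceed least privilege and warrant review
RISKY_SCOPES = {"anyone_with_link", "public", "all_employees"}

def flag_risky(entries):
    """Return the entries whose sharing scope is broader than it should be."""
    return [e for e in entries if e["scope"] in RISKY_SCOPES]

# Sample data standing in for an exported sharing report
entries = [
    {"file": "payroll.xlsx", "scope": "anyone_with_link"},
    {"file": "roadmap.docx", "scope": "direct_user"},
    {"file": "clients.csv",  "scope": "all_employees"},
]

for e in flag_risky(entries):
    print(f"REVIEW: {e['file']} is shared via '{e['scope']}'")
```

Even a rough audit like this surfaces the files Copilot could draw on that no one intended to expose – the first step towards enforcing PoLP before enabling the tool.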

Sensitivity Labels

Microsoft depends on sensitivity labels to enforce DLP policies, apply encryption, and broadly prevent data leaks.

It is difficult for humans to consistently apply the correct sensitivity labels to files, and labels commonly lag behind or become outdated as new data is created.

Encrypting or blocking data often adds friction to workflows, and labelling technologies aren’t always available for all file types. Plus, the more labels a company or business uses, the more confusing it can become for employees.

The complexity of sensitivity labels is directly proportional to the size of the organisation.

Throw in an AI like Copilot with the ability to create usable data instantly and the efficacy of label-based data protection starts to diminish.
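One way to keep labelling honest is to measure coverage before rolling out Copilot. The sketch below is a hedged illustration in Python – the file metadata, label names, and dates are invented for the example, not pulled from any real API:

```python
from datetime import date

# Illustrative sketch: measure sensitivity-label coverage across files.
# All file names, labels, and dates here are hypothetical sample data.

files = [
    {"name": "q4-forecast.xlsx", "label": "Confidential", "labelled": date(2023, 1, 10)},
    {"name": "team-rota.docx",   "label": None,           "labelled": None},
    {"name": "contract.pdf",     "label": "Internal",     "labelled": date(2021, 6, 2)},
]

# Files with no label at all are invisible to label-based DLP policies
unlabelled = [f["name"] for f in files if f["label"] is None]
coverage = 1 - len(unlabelled) / len(files)

print(f"Label coverage: {coverage:.0%}")
print("Unlabelled:", unlabelled)
```

A coverage figure like this gives you a baseline: if a meaningful share of files is unlabelled, label-based protection will not keep pace with AI-generated content.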

Humans

AI-generated content keeps getting better and better. However, it can still generate incorrect content, so employees must take care not to blindly trust the AI to create safe and accurate responses.

Imagine a scenario in which Copilot drafts a proposal for a client but the proposal includes sensitive data belonging to another client. The user sends the email without properly scrutinising its content, and now you have a privacy or data breach scenario on your hands.

In this scenario, Copilot isn’t the main problem – it’s the human!

Yes, Copilot allows anyone to generate content instantly. But it’s up to the business to implement checks and balances that eliminate such errors and ensure that company standards are maintained.

Work/Web Functions

Copilot will only use data that users permit it to use, and that can include sources from the web. In “work” mode, Copilot draws only on your organisation’s data, so if you properly vet the people you collaborate with and the information they contribute, your project stays insulated from outside data and threats.

You can toggle the “web” function to grant Copilot permission to use sources from the web – just be careful with your prompts.

Trust Policy

Your trust policy (or lack of it) with people outside your organisation will greatly impact the safety of your data with Copilot enabled.

In the wrong hands, Copilot could be used to surface sensitive data instantly and wreak havoc on the business.

Now might be a good time to implement zero-trust policies with people outside your organisation.

Get Copilot-Ready

Copilot’s powerful feature set makes it a great addition to any workforce. But setting up Copilot securely can feel daunting.

AAG’s comprehensive support helps you get the most out of Copilot. An initial consultation and readiness assessment ensures that the new services can be accessed securely, while customised training helps your team understand Copilot features and its applications in their workflows. We’ll even run regular updates based on your usage to keep your Copilot services optimised.

Free Copilot Demo Call

You can book a free Copilot demo call with one of our team today.
Book Your Free Demo

Related stories

Browse more articles from our experts and discover how to make better use of IT in your business.

AI & Automation
Business

How AAG Uses AI to Cut the Friction for Our Customers

09 Apr, 2026

Discover how AAG uses AI and Automation to triage, route and resolve IT support requests faster, and improve your wider operations. Read more

Business
News

Welcoming Klingspor Abrasives to AAG

07 Apr, 2026

We're delighted to announce we will be supporting the UK division of Klingspor Abrasives with day-to-day IT support. Read more

Business
News

AAG Welcomes SDE Group Onboard

01 Apr, 2026

AAG IT Services continues to grow, welcoming SDE Group as a new partner, where we will support them with their IT and technology needs. Read more