News Analysis

Microsoft 365 Copilot Is Now on General Release. Are Your Permissions in Order?

By David Barry
Microsoft 365 Copilot goes into general release today. While it promises many benefits for the workplace, organizations should proceed with caution. Here's why.

After months of build-up, Microsoft 365 Copilot is generally available for enterprise customers starting today. But while many will herald the arrival of the AI assistant for work, not every organization is necessarily prepared for it.

A Lesson From Delve 

Today's release includes enterprise-grade security, privacy, compliance and responsible AI to ensure all data processing happens inside the Microsoft 365 tenant, according to Microsoft.  

More to the point, Copilot has access to all the data people use and create in an enterprise. This includes all documents, emails, Teams meetings and chats, and more — which could be a mixed bag for all concerned.

As with so much of technology, the issue doesn’t lie in the technology itself, but rather in how it is implemented. Depending on the maturity level of your data governance and security strategy — particularly around data classification, access governance and permissions management — Copilot may inadvertently serve up information an employee shouldn’t necessarily see.

We saw this happen before with Microsoft Delve. Microsoft released Delve in 2014, hailing it as “a new way of working – proactive, transformational, and delightful. Delve is the first in a new breed of intelligent and social work experiences.”

Delve served up information, documents, trending conversations and more based on an individual’s history as represented in the Office Graph. Only sometimes those documents were ones people shouldn’t have had access to. The system was operating as designed — the issue came down to the permission settings.

Copilot operates under a similar model. According to Microsoft, "The permissions model within your Microsoft 365 tenant can help ensure that data won't unintentionally leak between users, groups and tenants."

So the question is: is your organization ready for Microsoft 365 Copilot? While tens of thousands of enterprise users in Microsoft’s Early Access Program (EAP) were already using Microsoft 365 Copilot before today, every enterprise that can afford it will now have access.

Invest in Data Governance and Security Before Copilot Deployment

Ensuring Copilot operates as intended boils down to ensuring only the right people have access to the right type of data, Curtis Johnstone, engineer and Microsoft MVP at Quest Software, told us. Microsoft 365 Copilot uses whatever Microsoft 365 data the end user has access to in order to fuel its engine and provide value.

This model, he said, maximizes the value generative AI can offer the end user, but it can also create blind spots where the right data permissions are lacking. Johnstone cites the example of someone inadvertently granted access to a sensitive HR file in SharePoint. While they may not know about it, he said, Copilot might offer up data from that sensitive document in an attempt to generate useful content. Even if permissions are not explicitly breached, a lack of oversight into access can become a serious security risk.
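
For illustration, here is a minimal sketch of what a pre-deployment permissions sweep could look like, using the Microsoft Graph API to flag files shared with the entire organization. It assumes an Azure AD app registration with the Files.Read.All application permission and a valid access token; the drive ID is a placeholder for whichever document library you want to audit.

```python
# Minimal sketch: flag org-wide sharing links in a SharePoint document
# library via Microsoft Graph, before enabling Copilot for its users.
# Assumes an access token with Files.Read.All; DRIVE_ID is a placeholder.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"          # acquire via an MSAL client-credentials flow
DRIVE_ID = "<drive-id>"           # the document library to audit
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def children(item_id="root"):
    """Yield every item in the drive, walking folders recursively."""
    url = f"{GRAPH}/drives/{DRIVE_ID}/items/{item_id}/children"
    while url:
        page = requests.get(url, headers=HEADERS).json()
        for item in page.get("value", []):
            yield item
            if "folder" in item:
                yield from children(item["id"])
        url = page.get("@odata.nextLink")  # follow pagination

def broad_permissions(item_id):
    """Return sharing links scoped to the whole organization."""
    url = f"{GRAPH}/drives/{DRIVE_ID}/items/{item_id}/permissions"
    perms = requests.get(url, headers=HEADERS).json().get("value", [])
    return [p for p in perms
            if p.get("link", {}).get("scope") == "organization"]

for item in children():
    links = broad_permissions(item["id"])
    if links:
        print(f"Org-wide link on {item.get('name')}: "
              f"{len(links)} permission(s), review before Copilot rollout")
```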

“The more an organization invests in best practices for data governance and security, the less of an issue oversharing will be for AI, and the easier it will be to deploy Copilot with the appropriate safeguards,” he said.

So what can organizations do to minimize that risk? To limit data exposure while remaining cost-effective, Johnstone suggests starting with a pilot for a small, targeted set of people or teams who stand to benefit most, then scaling use based on where Copilot provides value.

“As wider adoption takes hold across the organization, leaders can gain more confidence that the right safeguards are in place, and the tangible business value for these new AI capabilities is worth the associated risks and cost,” he said.

Governance by Design

However, it’s not just about settings; the work begins well before that, with a design philosophy Microsoft calls Responsible AI, said Rich Wood, VP of Microsoft Alliances at RightPoint.

“I call it a design philosophy, but it’s a commitment they’re making to their customers and really, to society. Principles like ‘Privacy and Security’ and ‘Transparency’ are literally built into every iteration of Copilot. Actual features follow from there,” he said.

Enterprises, of course, have their role to play, too, he added. “Obviously, nobody should be turning generative AI loose without at least an assessment of the current systems and repositories, an audit of the environment it’s going to be ingesting, and a plan for what use cases to explore."

Assuming enterprises are doing all of that, good governance begets safer AI. In the case of Microsoft 365 Copilot, organizations that have been careful about solid governance of their SharePoint and OneDrive environments will be that much closer to being ready for Copilot, he added.

It is essential to remember that while Copilot is a catchall term for Microsoft’s generative AI assistants, there will be Copilots beyond just Microsoft 365 and traditional productivity apps, even if those are the most obvious use cases.

When looking to deploy Copilot, a foundational part of the work will be identifying key use cases to explore and paths organizations can pursue to determine the real business value. “This is the most important part. Do not just roll it out there and expect it to grow up in the wild — guide it. Have a plan. In other words, document, prove out, and calculate the ROI of each business case,” he added.

Related Article: Information Governance Is Boring, But Necessary

A Learning Journey

Copilot will only have access to what an individual can see, added Andrew Pope of Designing Collaboration. The risk, however, is that when Copilot prepares content, it will not always be able to differentiate between sensitive and unrestricted content. Users will need to ensure either that content sources are classified or that everything is reviewed before sending or publishing.

“This will be a long learning journey. Those who have extensive classification and taxonomy systems are more likely to benefit in the short-term,” he said.


“Consider Copilot as one tool among a set of many others. Start with problems, issues, areas that we can improve. Then look to see whether Copilot is the right tool in this situation.”

Organizations also need to develop scenarios, such as improving meetings or compiling reports more efficiently. This may require some experimentation to find the right application, and will vary from team to team.

Having Copilot communities where individuals can test scenarios and report back on what worked and what did not will be valuable, letting members share insights and collaborate on improving how the tool is applied.

“It may well be best to test out these in controlled spaces — not sharing what Copilot has created outside of learning communities, until we are satisfied that we have a practical application other than as an AI assistant, where it can help with basic tasks,” Pope said.
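
As a purely illustrative sketch of the kind of pre-publication check such a learning community might trial, the snippet below flags a few markers of potentially sensitive content in a Copilot-generated draft. The patterns and terms are invented examples, not a real classification scheme; in practice, sensitivity labels and DLP policies would do this job properly.

```python
# Illustrative sketch only: a naive pre-publication check for drafts
# Copilot produces, of the kind a learning community might trial.
# The patterns and marker terms below are hypothetical examples; real
# deployments would rely on sensitivity labels and DLP policies instead.
import re

SENSITIVE_PATTERNS = {
    "possible SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "possible card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "classification marker": re.compile(
        r"\b(confidential|internal only|do not distribute)\b", re.I),
}

def review_draft(text: str) -> list[str]:
    """Return human-readable flags for content that needs a second look."""
    flags = []
    for label, pattern in SENSITIVE_PATTERNS.items():
        for match in pattern.finditer(text):
            flags.append(f"{label}: '{match.group(0)}'")
    return flags

draft = "Summary for the team. Internal only: margins fell 4% in Q2."
for flag in review_draft(draft):
    print("Review before sharing ->", flag)
```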

Related Article: How Copilot in SharePoint Can Extend Its Workplace Reach

Assessing Readiness

Microsoft has a strong track record in data security and privacy, said Anurag Gurtu, CPO at StrikeReady. That said, the best strategy for organizations looking to deploy Microsoft Copilot involves a combination of factors, he said, including:

  • Conducting a privacy impact assessment (PIA): Evaluate the potential risks associated with Copilot in the context of the organization’s specific use cases.
  • Setting clear policies: Establish policies and guidelines for employees regarding the use of Copilot and its limitations.
  • Monitoring continuously: Audit Copilot’s interactions with and access to sensitive data on an ongoing basis (a rough sketch of what this could look like follows this list).
  • Integrating and training: Ensure seamless integration of Copilot with existing security infrastructure and invest in employee training to raise awareness of the privacy implications and potential risks.
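
On the monitoring point, the snippet below sketches one way to poll the Office 365 Management Activity API for Copilot-related audit records. It assumes an Azure AD app with the ActivityFeed.Read permission and an already-started Audit.General subscription; treat the exact operation name for Copilot interactions as an assumption to verify against your own tenant’s logs.

```python
# Sketch: poll the Office 365 Management Activity API for Copilot-related
# audit records. Assumes an Azure AD app with ActivityFeed.Read permission
# and an existing Audit.General subscription; the exact Copilot operation
# name is an assumption to verify against your tenant's logs.
import requests

TENANT_ID = "<tenant-guid>"       # placeholder
TOKEN = "<access-token>"          # acquire via MSAL against manage.office.com
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def list_content_blobs():
    """List available Audit.General content blobs for the current window."""
    resp = requests.get(f"{BASE}/subscriptions/content",
                        params={"contentType": "Audit.General"},
                        headers=HEADERS)
    resp.raise_for_status()
    return resp.json()            # list of {contentUri, contentCreated, ...}

def copilot_events():
    """Yield audit records whose operation looks Copilot-related."""
    for blob in list_content_blobs():
        records = requests.get(blob["contentUri"], headers=HEADERS).json()
        for record in records:
            if "copilot" in record.get("Operation", "").lower():
                yield record

for event in copilot_events():
    print(event.get("CreationTime"), event.get("UserId"),
          event.get("Operation"))
```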

Microsoft should prioritize user consent and transparency, allowing individuals to understand when and how their data is being used by Copilot. Clear opt-in/opt-out mechanisms should be in place, he added.

Enterprises, for their part, must ensure that their deployment of Copilot complies with data protection regulations such as GDPR or HIPAA, depending on their geographic location and industry.

“By proactively addressing these privacy and security considerations, both Microsoft and enterprises can harness the potential benefits of Copilot while safeguarding sensitive information and ensuring compliance with privacy regulations,” he said.

About the Author
David Barry

David is a Europe-based journalist of 35 years who has spent the last 15 following the development of workplace technologies, from the early days of document management through enterprise content management and content services. Now, with the development of new remote and hybrid work models, he covers the evolution of technologies that enable collaboration, communication and work, and has recently spent a great deal of time exploring the far reaches of AI, generative AI and general AI.

Main image: Philipp Katzenberger | unsplash