The Top Five Questions Legal Should Ask IT During Copilot for Microsoft 365 Adoption
It’s no longer hype. Organisations are broadly looking to adopt Gen AI to drive productivity across their workforce.
There’s also end-user demand to satisfy. People use Gen AI in their personal lives and want to use it at work. Workers increasingly weigh employment options by the Gen AI tools a prospective company offers. From the top down, organisations must ensure they’re keeping up in the competition for both talent and productivity.
And it’s proving to be effective. A recent Forrester study projected increased productivity, cost reductions, reduced onboarding time, and an impressive ROI estimated at up to 450%.
All of this is driving mass adoption, at a pace and scale never seen before.
Large, conservative enterprise organisations, which have historically been slow to adopt new technology, are now purchasing tens of thousands of Copilot for Microsoft 365 licences and pushing adoption at a rapid pace.
Amidst all these exciting developments, the key question on the mind of every General Counsel or Chief Legal Officer is: How do we implement Copilot for Microsoft 365 responsibly?
Here are five questions legal can ask IT to help ensure a Copilot for Microsoft 365 rollout addresses legal considerations. The good news is that addressing them will also drive greater Copilot for Microsoft 365 efficiency and better results!
Question #1: How can we ensure Copilot for Microsoft 365 doesn’t expose sensitive information to employees who shouldn’t have access?
An end-user will typically have access to Outlook, OneDrive, Teams, and other applications, but they also likely have access to information they aren’t aware of, such as files on certain SharePoint or OneDrive sites. Suppose an employee has shared a file using the option ‘share with anyone in my organisation.’ Previously, this posed little risk, because ‘anyone in my organisation’ still needed to know the link to access the file. When an end-user sends a prompt to Copilot for Microsoft 365, however, the response is grounded in every file the end-user can access, link or no link. Copilot’s response could therefore contain information the end-user was never intended to see, or doesn’t realise they can see.
To address this concern, organisations can leverage Microsoft Purview’s data security and compliance capabilities to identify and label sensitive content, including data subject to GDPR, HIPAA, PCI DSS, and other privacy regulations, as well as sensitive business materials the organisation defines, e.g., a confidential M&A target or intellectual property. To speed up Copilot implementation, organisations can protect data overexposed to Copilot in stages, based on risk categories. For example, first identifying and remediating overexposed data that contains sensitive content allows end-users to get access to Copilot quickly; the team can then address permission hygiene on overexposed data that does not contain sensitive content.
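As a starting point for that triage, IT can inventory files carrying organisation-wide sharing links. Below is a minimal sketch using the Microsoft Graph API, assuming an app registration with permission to read drive items and an access token obtained elsewhere; the non-recursive scan of a single drive is a simplification for illustration, not a full remediation tool:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def find_org_wide_links(token: str, drive_id: str) -> list[dict]:
    """List files in a drive's root folder whose sharing links are scoped
    to the whole organisation ('anyone in my organisation')."""
    headers = {"Authorization": f"Bearer {token}"}
    overexposed = []

    url = f"{GRAPH}/drives/{drive_id}/root/children"  # root only, for brevity
    while url:
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        page = resp.json()
        for item in page.get("value", []):
            # Inspect each item's permissions for organisation-scoped links.
            perms = requests.get(
                f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
                headers=headers,
            ).json()
            if any(
                (perm.get("link") or {}).get("scope") == "organization"
                for perm in perms.get("value", [])
            ):
                overexposed.append(
                    {"name": item["name"], "webUrl": item.get("webUrl")}
                )
        url = page.get("@odata.nextLink")  # follow paging until exhausted
    return overexposed
```

The resulting list gives the remediation team a prioritised queue: items that are both organisation-shared and carry sensitivity labels would be addressed first, per the staged approach above.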
Question #2: How can we avoid having Copilot for Microsoft 365 use outdated or inaccurate data?
Because Copilot for Microsoft 365 grounds its responses in whatever data is accessible, that data could be outdated or factually incorrect. Many organisations retain broadly due to a “keep everything” data culture, or have not implemented automated disposition practices aligned with corporate retention policies.
Copilot for Microsoft 365 will be a better tool for its end-users if it draws on accurate, current data. Rolling out Copilot for Microsoft 365 is a great time to revisit data retention and disposition policies. Good retention and disposition practices reduce data volumes based on age and duplication, helping to avoid poor Copilot for Microsoft 365 responses. These same retention policies will also reduce the data subject to future legal holds, lowering litigation costs and making it a win-win for both legal and IT.
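As one illustration of what automated disposition can look like in Microsoft 365, the sketch below applies a retention label to stale files via the Microsoft Graph driveItem retention label endpoint. The label name and five-year threshold are hypothetical, and appropriate records-management permissions are assumed; real policies would be driven by the corporate retention schedule rather than a single age cutoff:

```python
from datetime import datetime, timedelta, timezone
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
STALE_AFTER = timedelta(days=5 * 365)   # hypothetical 5-year threshold
LABEL_NAME = "Pending Disposition"      # hypothetical retention label

def label_stale_items(token: str, drive_id: str) -> None:
    """Apply a retention label to files not modified within the threshold."""
    headers = {"Authorization": f"Bearer {token}"}
    cutoff = datetime.now(timezone.utc) - STALE_AFTER

    url = f"{GRAPH}/drives/{drive_id}/root/children"
    while url:
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        page = resp.json()
        for item in page.get("value", []):
            modified = datetime.fromisoformat(
                item["lastModifiedDateTime"].replace("Z", "+00:00")
            )
            if "file" in item and modified < cutoff:
                # PATCH .../items/{id}/retentionLabel sets the label by name;
                # disposition then follows whatever the label dictates.
                requests.patch(
                    f"{GRAPH}/drives/{drive_id}/items/{item['id']}/retentionLabel",
                    headers={**headers, "Content-Type": "application/json"},
                    json={"name": LABEL_NAME},
                ).raise_for_status()
        url = page.get("@odata.nextLink")
```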
Question #3: How can we prevent data loss or leakage?
By taking the steps above to define sensitive content and implement automatic data classification, organisations can build on that good data hygiene and apply proactive controls to reduce the risk of data loss or leakage. With Microsoft Purview Data Loss Prevention (DLP), organisations can block users from sending sensitive information to a personal email account or other unauthorised destinations. Proactive controls can warn end-users, block the activity, log it, and route the incident to a central team for action. These controls can also identify Copilot responses that contain information from an already classified file; the new file then inherits that classification and its associated controls.
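That inheritance behaviour is what keeps Copilot-generated content inside the same control perimeter as its sources. Conceptually, it follows a “most restrictive label wins” rule, sketched below; the label names and rankings are hypothetical illustrations, not Purview’s internal implementation:

```python
# Hypothetical sensitivity ranking, lowest to highest.
LABEL_RANK = {"Public": 0, "General": 1, "Confidential": 2, "Highly Confidential": 3}

def inherited_label(source_labels: list[str]) -> str:
    """Return the label a derived document should inherit: the most
    restrictive label found among its source files."""
    if not source_labels:
        return "General"  # hypothetical default for unlabelled sources
    return max(source_labels, key=lambda label: LABEL_RANK.get(label, 0))

# A Copilot summary drawing on one Confidential file inherits Confidential,
# along with whatever DLP controls attach to that label.
print(inherited_label(["General", "Confidential"]))  # -> Confidential
```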
Question #4: How can we monitor and control the prompts and responses in Copilot for Microsoft 365?
Out of the box, Gen AI technology comes with guardrails: Copilot for Microsoft 365 includes built-in responsible AI controls designed to filter inappropriate prompts and responses.
In addition, proper implementation of a classification schema and DLP policy controls will protect prompts and responses at an individual level.
If concerns remain that end-users will ask for sensitive or confidential information, additional layers of protection can be applied, such as limiting what end-users can ask in a prompt or blocking questions on topics the organisation deems sensitive. Responses can likewise be evaluated against organisational policies to prevent sensitive information from being surfaced to an end-user who shouldn’t have access to that content, as sketched below.
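What such an additional layer looks like varies by product, but the underlying pattern is a policy gate that screens text before it reaches the model and again before the answer reaches the user. A conceptual sketch follows; the blocked topics and patterns are hypothetical placeholders, not any vendor’s rule set:

```python
import re

# Hypothetical organisational policy: topics users may not prompt about,
# and content patterns that must never surface in a response.
BLOCKED_TOPICS = ("project titan acquisition", "executive compensation")
BLOCKED_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),    # US SSN-like pattern
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),   # card-number-like pattern
]

def screen_prompt(prompt: str) -> bool:
    """Return True if the prompt is allowed under organisational policy."""
    lowered = prompt.lower()
    return not any(topic in lowered for topic in BLOCKED_TOPICS)

def screen_response(response: str) -> str:
    """Redact policy-violating patterns before the response is shown."""
    for pattern in BLOCKED_PATTERNS:
        response = pattern.sub("[REDACTED]", response)
    return response
```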
Question #5: If Copilot creates new documents, how do we align our data retention policies with our eDiscovery practices?
Today, retention of Copilot prompts is tied to the retention policy for Microsoft Teams. If an end-user asks Copilot for Microsoft 365 to summarise a Word document, the prompt and response are retained in the end-user’s mailbox.
There is industry debate around whether this should be considered another data source, like a chat, but technically it isn’t: it’s transient content, best thought of as a convenience copy, and it doesn’t need to be kept for records retention purposes. Still, an organisation may want to retain Copilot for Microsoft 365-generated data for employment litigation scenarios, forensic investigations, and similar needs, so organisations should weigh these scenarios and put policies in place for this type of data.
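If legal decides this content is in scope for a matter, it can be collected with the same tooling as other mailbox data. Below is a minimal sketch using the Microsoft Graph eDiscovery API to open a case and stage a tenant-wide mailbox search, assuming an app with eDiscovery permissions; the case name and content query are hypothetical placeholders, and the exact query syntax for isolating Copilot interactions should be confirmed against current Purview documentation:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def create_copilot_search(token: str) -> str:
    """Create an eDiscovery case with a tenant-wide mailbox search and
    return the new search's id."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }

    # 1. Open a new eDiscovery case.
    case = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases",
        headers=headers,
        json={"displayName": "Copilot Interaction Review"},  # hypothetical
    )
    case.raise_for_status()
    case_id = case.json()["id"]

    # 2. Add a search scoped to all tenant mailboxes, where Copilot
    #    prompts and responses are retained.
    search = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases/{case_id}/searches",
        headers=headers,
        json={
            "displayName": "Copilot prompts and responses",
            "contentQuery": "Copilot",  # placeholder KQL; confirm syntax
            "dataSourceScopes": "allTenantMailboxes",
        },
    )
    search.raise_for_status()
    return search.json()["id"]
```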
Bonus question that everyone should be asking: What is a realistic timeframe for responsibly adopting Gen AI?
One of the most important and time-consuming aspects of preparing for Gen AI adoption is ensuring data is controlled and secure. On average, full deployment at an enterprise-level organisation takes six months. One way to speed up time to value is to take an agile approach and roll out Gen AI in stages. In this way, Gen AI tools get into the hands of the most critical end-users first, and learnings from these early adopters can be applied when rolling the programme out to other groups within the organisation.
This approach can accelerate adoption and is more effective than trying to fix problems and implement controls after broader teams have already started adopting. While an agile approach is not new, layering in analysis of sensitive information and access controls lets an organisation reduce the risk associated with deploying Gen AI.
Learn more about Epiq’s Responsible AI and Copilot for Microsoft 365 Adoption Readiness assessment and services, including how it can support your organisation.
Epiq has experience advising multiple client organisations as well as implementing Copilot for Microsoft 365 within Epiq itself. Our largest engagement to date has been for a single organisation with 25,000 Copilot for Microsoft 365 licences and over 100,000 end-users. Through this work, we’ve developed our Responsible AI and Copilot for Microsoft 365 Adoption Readiness assessment and services to help legal and IT teams quickly put Gen AI to work within their organisations. This advisory programme addresses the most pressing questions that a General Counsel or Head of Legal should ask IT to ensure they successfully adopt Copilot.
Jon Kessler, Vice President, Information Governance, Epiq
Jon Kessler is Vice President of Information Governance for Epiq, where he leads a global team that focuses on Microsoft Purview, responsible AI adoption, data migration, legal hold, data privacy, insider risk management, data lifecycle management, data security advisory, and M&A data process management. He has extensive experience working on complex matters across diverse industries and often provides formal compliance and eDiscovery training to government agencies and Fortune 500 companies.
The contents of this article are intended to convey general information only and not to provide legal advice or opinions.