

Navigating Copilot Adoption: Key Legal Considerations for Data Governance
Legalweek 2025 Session Recap
The widespread interest in AI for users’ personal and professional lives has driven significant demand for tools like Copilot for Microsoft 365. However, implementing any AI tool raises important governance, security, and privacy considerations.
Balancing strategy, support, and implementation is key to ensuring successful adoption of Copilot for Microsoft 365 across your organisation. This was the focus of the Legalweek 2025 session, “Navigating Copilot Adoption: Key Legal Considerations for Data Governance.” This blog summarises the discussion and advice from industry leaders on managing the demand and navigating the use of AI tools like Copilot for Microsoft 365, while prioritising the security and privacy of sensitive data.
Paul Renehan, Senior Director of Advisory and Implementation, Information Governance Practice, Epiq, moderated the panel. The panelists included:
- Rachel O'Shea, Director, Technical Specialist Management — Data Security, Microsoft
- Steven Berrent, Director of IT Innovation & Architecture, Steptoe LLP
- Lawrence Briggi, Manager of Corporate Litigation Support, IBM
- Trisha Sircar, Partner and Privacy Officer and Head of the Global Privacy, Data & Cybersecurity Practice, Katten Muchin Rosenman LLP
Challenges of Copilot Adoption
Across organisations, panelists agreed that successful Copilot for Microsoft 365 adoption requires intentional frameworks, access management, and encryption systems to prevent the leakage of sensitive data. There’s a range of reactions in the legal industry to AI — from excitement to apprehension. Clear communication is essential for addressing fears and misconceptions, especially among in-house teams. Lawrence Briggi noted that the industry lacks case law and consensus on the ‘correct’ way to approach a framework for the successful adoption of these tools.
While some clients are enthusiastic about the potential of AI, others worry about governance gaps, particularly around data security and retention. Establishing a thoughtful AI governance framework, and training end users on responsible AI best practices that meet business needs, are further critical steps to address the people side of these concerns.
Best Practices for Responsible Implementation
During the session, panelists explored the various practices that companies can adopt to support successful Copilot for Microsoft 365 adoption:
- Transparent Communication: Rachel O’Shea and Paul Renehan advocated for clear communication with both internal teams and clients. Organisations should address risks from the perspective of stakeholders, fostering trust and comfort.
- Phased Implementation: According to Trisha Sircar, a phased implementation approach is critical. Involving diverse stakeholders, from IT teams to external counsel, ensures that all angles of adoption are addressed. Trisha also emphasised the inevitability of AI being used across the entire workplace, urging organisations to embrace it and plan for its integration rather than shy away from it.
- Hands-On Experience: Steven Berrent encouraged IT professionals to get under the hood of AI platforms. Understanding the tools firsthand equips IT teams to ask informed questions and navigate challenges effectively.
- Education and Licensing: O’Shea emphasised the importance of understanding Microsoft contracts, the specifics of licenses, and available protections. Understanding the differences between Microsoft licenses positions users to adapt to changing requirements while maximising security and productivity. Organisations should leverage Microsoft partners like Epiq to educate themselves on available security protections and efficient license management.
Addressing Risks
Risk management was a central theme of the panel discussion, and two types of data leakage risks were identified: requests made to Copilot and sensitive information input into the tool. To mitigate these risks, organisations should:
- Implement models with built-in security measures (e.g., Data Loss Prevention) and sensitivity labels.
- Restrict access rights and ensure proper permissions are enforced.
- Carefully govern data inputs to avoid exposing sensitive information.
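To make the last point concrete, here is a minimal, hypothetical sketch of input governance: a pre-submission filter that redacts common sensitive patterns before text reaches an AI assistant. The pattern names and `redact` function are illustrative assumptions, not part of any Microsoft product; in practice, organisations would rely on platform controls such as Purview DLP policies and sensitivity labels rather than ad-hoc regexes.

```python
import re

# Hypothetical illustration only: real deployments should use platform
# DLP controls, not hand-rolled pattern lists like this one.
SENSITIVE_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(text: str) -> tuple[str, list[str]]:
    """Replace matches with [REDACTED:<label>] and report which labels fired."""
    findings = []
    for label, pattern in SENSITIVE_PATTERNS.items():
        if pattern.search(text):
            findings.append(label)
            text = pattern.sub(f"[REDACTED:{label}]", text)
    return text, findings

prompt = "Summarise the matter for client jane.doe@example.com, SSN 123-45-6789."
clean, flags = redact(prompt)
print(clean)   # sensitive values replaced with [REDACTED:...] markers
print(flags)   # which categories were detected
```

A filter like this can also log the `flags` it raises, giving governance teams visibility into what users are attempting to send to the tool.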
Looking Ahead: Copilot in 2025
As AI adoption accelerates, panelists predicted that 2025 will continue to be a year of mass implementation. Sircar was optimistic: AI is here to stay, and organisations should embrace it.
With the right questions and considerations, legal teams can ensure responsible implementation and successful adoption, helping their organisation get the most out of Copilot for Microsoft 365.
Learn more about Epiq’s Responsible AI and Copilot for Microsoft 365 Adoption Readiness assessment and services, including how they can support your organisation.
For teams that have already implemented Copilot but need help getting the most out of it for their legal teams, see Epiq’s Legal Department AI Training offering.
The contents of this article are intended to convey general information only and not to provide legal advice or opinions.