Aligning Microsoft Tools With NYC Bar AI Recording Guidance
On Dec. 22, 2025, the New York City Bar Association's Professional Ethics Committee issued Formal Opinion 2025-6, addressing the ethical obligations arising when attorneys or clients use artificial intelligence tools to record, transcribe or create summaries of their conversations.1
Among other things, the opinion:
- States that an attorney should obtain client consent before engaging AI to record a call, and should consider whether recording, transcribing and summarizing is well advised in the specific circumstances, including issues of confidentiality and privilege;
- Explains that if an attorney knows that a client is recording a call with an AI tool, the lawyer should advise the client of the disadvantages of doing so; and
- Recommends that attorneys review for accuracy any AI-generated transcripts, summaries or other artifacts of meetings with counsel, in order to effectuate their ethical duties.
While the opinion focuses on attorney-client communications between outside counsel and their clients, its guidance is not so limited; it applies equally to in-house counsel communicating with internal business partners.
For organizations using Microsoft 365 and Copilot, the opinion raises immediate questions about platform configuration, data governance and e-discovery readiness.
Recording and transcription are no longer just discrete, intentional acts by meeting participants. They can also be triggered by default settings, enabled automatically when Copilot features are used or initiated by any meeting participant. The resulting data artifacts can proliferate across mailboxes, OneDrive accounts and SharePoint sites in ways that create novel challenges for preservation and collection.
This article explores those technical realities and offers guidance for aligning the Microsoft 365 environment with the opinion's requirements.
1. The opinion emphasizes obtaining client consent before recording. How does Microsoft Teams handle consent notification, and what configuration options are available?
Microsoft Teams provides several mechanisms for consent notification, though organizations must actively configure them to meet the opinion's requirements.
When recording or transcription is enabled for a meeting, Teams displays a banner at the top of the meeting window notifying participants. This provides basic notice, but it may not constitute adequate consent in all circumstances — particularly in two-party consent jurisdictions or when participants join via phone dial-in.
More robust consent controls are available through the Teams admin center. Organizations can configure a participant-agreement setting that requires express consent from users before they can fully participate in a recorded meeting. Users who do not acknowledge consent may still view the meeting, but they cannot unmute, share their screen, turn on their camera or add chat content.
This feature provides technical enforcement of consent requirements, though organizations should verify that it meets their specific legal obligations.
Additionally, it is good practice for meeting hosts to provide verbal notice at the start of any recorded meeting involving legal matters. A sample statement might be: "This meeting is being recorded and transcribed. A banner is displayed at the top of the window. If you do not consent to participating in a recorded session, you may disconnect now and view the recording later."
2. The opinion distinguishes between recordings that persist — i.e., a kept record of a conversation that may be relied upon years later — and those that don't, i.e., those that exist only long enough to support other Copilot features. What are the Copilot configuration options in Teams meetings, and how do they affect data retention?
This is a critical distinction, and Microsoft provides two primary configuration modes for Copilot in Teams meetings that have dramatically different implications for data persistence and e-discovery.
"Only During the Meeting" Mode
In this configuration, Copilot relies on a hidden, temporary transcript that is not retained after the meeting ends. Once the meeting concludes, the prompts and responses generated during the meeting are no longer available.
The only way for participants to retain their Copilot interactions is to manually copy and paste them into a different location before the meeting ends. There is no way to save the temporary transcript.
This mode significantly reduces e-discovery exposure and aligns well with the opinion's recognition that not all recordings need to persist.
"During and After the Meeting" Mode
In this configuration, meeting participants can start a recording or transcript during the meeting. When the meeting ends, the recording and transcript are saved to OneDrive or SharePoint, depending on how the meeting was scheduled. Copilot prompts and responses are stored in the user's Exchange mailbox.
This creates multiple durable artifacts across different storage locations, each with its own preservation and retention considerations.
Recommendation
Organizations seeking to minimize data persistence for meetings involving legal matters should consider confirming that "Only during the meeting" is the default, while creating a separate meeting policy or template for situations where recording is specifically authorized and appropriate.
This approach affords meeting participants the benefit of Copilot functionality while ensuring that the underlying content is not retained once the meeting ends.
3. Where are these AI-generated artifacts actually stored? How does this affect data mapping and legal holds?
Data storage raises significant and complex technical issues that directly affect preservation, collection and governance. Microsoft's organizational structure for data storage results in a variety of locations where content may reside.
How that data can be accessed, edited, managed and deleted differs by location. Understanding these storage patterns is essential for effective preservation and collection.
Meeting Recordings and Transcripts
For meetings scheduled from Outlook calendars — i.e., Teams meetings — recordings and transcripts are stored in the meeting organizer's OneDrive account in the Recordings folder.
For meetings scheduled through Teams channels — i.e., channel meetings — they are stored in the SharePoint site for that team.
This means legal holds must target different locations depending on how meetings were scheduled, and the meeting organizer may be different from legal hold custodians in a matter.
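For teams that want to verify these locations programmatically, the following Python sketch uses the Microsoft Graph API to list the contents of a meeting organizer's OneDrive Recordings folder. It is illustrative only: it assumes an app registration with application-level Files.Read.All permission and the msal and requests packages, the tenant, client and user values are placeholders, and channel-meeting recordings would instead require querying the team's SharePoint site.

```python
# Minimal sketch: enumerate a meeting organizer's OneDrive "Recordings" folder
# via Microsoft Graph, e.g., to confirm what recording and transcript files
# exist before scoping a legal hold. Assumes an Azure AD app registration with
# application permission Files.Read.All; tenant/client/user values below are
# placeholders.
import msal
import requests

TENANT_ID = "your-tenant-id"             # placeholder
CLIENT_ID = "your-app-client-id"         # placeholder
CLIENT_SECRET = "your-app-secret"        # placeholder
ORGANIZER_UPN = "organizer@example.com"  # meeting organizer / custodian

def graph_token() -> str:
    """Acquire an app-only Graph token via client credentials."""
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    return result["access_token"]

def list_recordings(upn: str) -> list[dict]:
    """List items in the user's OneDrive Recordings folder (path-based addressing)."""
    url = f"https://graph.microsoft.com/v1.0/users/{upn}/drive/root:/Recordings:/children"
    headers = {"Authorization": f"Bearer {graph_token()}"}
    items = []
    while url:
        resp = requests.get(url, headers=headers, timeout=30)
        resp.raise_for_status()
        data = resp.json()
        items.extend(data.get("value", []))
        url = data.get("@odata.nextLink")  # follow paging, if any
    return items

if __name__ == "__main__":
    for item in list_recordings(ORGANIZER_UPN):
        print(item["name"], item.get("lastModifiedDateTime"))
```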
Copilot Interactions — Prompts and Responses
These are stored in a hidden folder within the Exchange mailbox of the participant who executed the Copilot prompt.
When mailboxes are collected using Microsoft Purview eDiscovery, these hidden folders are automatically included in the search and collection scope. However, because Copilot interactions are stored separately from the associated transcript and recording, they add another layer of complexity.
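Where an eDiscovery (premium) case already exists, a collection over custodian mailboxes can be scripted against the Microsoft Graph security API, as in the hedged sketch below. The case ID, display name and keyword query are placeholders, the endpoint shape should be verified against current Graph documentation, and graph_token() refers to the authentication helper sketched above.

```python
# Minimal sketch: create a draft eDiscovery (premium) search in an existing
# Purview case, scoped to all case custodians. Because the search runs against
# custodian mailboxes, the hidden folder holding Copilot prompts and responses
# is picked up along with ordinary email. Case ID and query are placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
CASE_ID = "REPLACE-WITH-CASE-ID"  # an existing ediscoveryCase

def create_copilot_search(token: str) -> dict:
    """Create a draft search over all case custodians' data sources."""
    body = {
        "displayName": "Copilot interactions - meeting follow-up",
        "description": "Custodian mailbox collection; hidden Copilot folders included automatically",
        "contentQuery": '"Project Falcon"',  # illustrative keyword query only
        "dataSourceScopes": "allCaseCustodians",
    }
    resp = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases/{CASE_ID}/searches",
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        json=body,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```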
AI Meeting Notes From Copilot Facilitator
When Copilot Facilitator is added to a meeting, the storage location of the resulting AI meeting notes depends on factors such as who joins the meeting first, who turns on Facilitator during the meeting and the order of other actions taken during the meeting. This creates additional complexity for identifying relevant data sources.
Intelligent Recap AI Summaries
These are displayed in the Teams interface after a meeting, but are not stored as separate artifacts in Microsoft 365. They are not subject to Purview eDiscovery or data life cycle management.
This creates a gap where content that appears authoritative to users may not be discoverable through standard e-discovery processes.
Audio Recaps
Audio recaps are stored in the user's OneDrive in the Recordings > AudioRecaps folder. They have a fixed 60-day retention period that cannot be modified through standard administrative settings, creating a significant governance challenge, discussed below.
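Because that window cannot be extended, one practical mitigation is to monitor the AudioRecaps folder and export anything relevant well before it expires. The sketch below illustrates one such check using the same Graph path-based addressing as above; the 50-day alert threshold is an arbitrary example, not a Microsoft setting.

```python
# Minimal sketch: flag audio recap files approaching the fixed 60-day window so
# they can be exported before they expire. Reuses graph_token() and the
# path-based OneDrive addressing from the earlier sketch.
from datetime import datetime, timezone
import requests

ALERT_AFTER_DAYS = 50  # arbitrary alert threshold ahead of the 60-day cutoff

def flag_expiring_audio_recaps(upn: str, token: str) -> list[tuple[str, int]]:
    """Return (file name, age in days) for audio recaps older than the threshold."""
    url = f"https://graph.microsoft.com/v1.0/users/{upn}/drive/root:/Recordings/AudioRecaps:/children"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    now = datetime.now(timezone.utc)
    flagged = []
    for item in resp.json().get("value", []):
        created = datetime.fromisoformat(item["createdDateTime"].replace("Z", "+00:00"))
        age_days = (now - created).days
        if age_days >= ALERT_AFTER_DAYS:
            flagged.append((item["name"], age_days))
    return flagged
```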
Practical Implication
Organizations must update their data maps to account for these multiple storage locations. Legal hold procedures should be revised to ensure preservation actions target all relevant locations, not just traditional custodian mailboxes and OneDrive accounts.
Consider adding questions to custodian interviews about Copilot usage, meeting recording practices and any known meetings that may be relevant to the matter.
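One lightweight way to operationalize this is to capture the storage locations described above in a simple, reviewable structure that feeds the data map and custodian-interview checklist. The Python snippet below does nothing more than restate this article's storage summary as data; each entry should be validated against the organization's own tenant configuration before it is relied upon.

```python
# Illustrative data-map entries for Copilot/Teams meeting artifacts,
# summarizing the storage locations described above. A starting point only;
# validate each entry against your own tenant's configuration.
ARTIFACT_DATA_MAP = [
    {
        "artifact": "Meeting recording/transcript (calendar-scheduled meeting)",
        "location": "Meeting organizer's OneDrive, Recordings folder",
        "preservation_note": "Organizer may not be a named custodian; add OneDrive to hold scope",
    },
    {
        "artifact": "Meeting recording/transcript (channel meeting)",
        "location": "SharePoint site of the team",
        "preservation_note": "Hold must target the team's SharePoint site, not a mailbox",
    },
    {
        "artifact": "Copilot prompts and responses",
        "location": "Hidden folder in the prompting user's Exchange mailbox",
        "preservation_note": "Included automatically in Purview mailbox collections",
    },
    {
        "artifact": "Copilot Facilitator AI meeting notes",
        "location": "Varies with join order and who enables Facilitator",
        "preservation_note": "Confirm location per meeting during custodian interviews",
    },
    {
        "artifact": "Intelligent recap summaries",
        "location": "Displayed in Teams only; no separate stored artifact",
        "preservation_note": "Not collectible through standard Purview eDiscovery",
    },
    {
        "artifact": "Audio recaps",
        "location": "User's OneDrive, Recordings > AudioRecaps folder",
        "preservation_note": "Fixed 60-day retention; export promptly if relevant",
    },
]

def locations_for_hold() -> set[str]:
    """Distinct storage locations a legal hold may need to reach."""
    return {entry["location"] for entry in ARTIFACT_DATA_MAP}
```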
4. The opinion emphasizes the duty of competence and the need to review AI-generated content for accuracy. What accuracy concerns may arise with Teams AI features?
The opinion's concern about AI accuracy is well founded. Benchmarking and testing reveal that AI-based transcripts and summarizations remain susceptible to meaningful inaccuracies.
Sources of Inaccuracy
Transcription errors commonly result from microphone quality, distance of the speaker from the microphone, background noise, internet connection issues, speaker accents and participants talking over one another. These technical factors can lead to inaccurate interpretation and the absence of nuance.
AI-generated summaries compound these issues by imputing meaning and intention to speakers, potentially mischaracterizing the nature of statements or advice.
The Evidentiary Risk
Meeting recordings and transcripts can carry significant weight in litigation because they purport to be verbatim or substantially verbatim accounts of events.
In addition to being used in discovery, statements may be admissible at trial under various theories, including opposing-party statements, admissions against interest, impeachment, present-sense impression and recorded recollection.
An inaccurate AI-generated transcript could be used as evidence of what someone said or meant, even if it doesn't accurately reflect the actual conversation.
Practical Considerations
As the opinion suggests, it is likely not practical or feasible for attorneys to verify all AI-recorded conversations involving legal matters. The sheer volume of meetings, the distributed nature of where transcripts and summaries are stored, and the reality that multiple participants may each have their own AI-generated versions make comprehensive review unrealistic for most organizations.
What organizations can do is take a risk-based approach. For meetings where the substance is particularly significant, such as board discussions, regulatory matters or sensitive investigations, participants can be instructed not to enable recording or transcription in the first place, avoiding the creation of potentially inaccurate AI-generated records.
Alternatively, for meetings where a formal record is sufficiently important to justify costs, organizations might consider using professional transcription services or court reporters, rather than relying on AI-generated transcripts that carry accuracy risks.
Organizations should also ensure that employees understand the limitations of AI-generated meeting content and do not treat Copilot summaries or transcripts as authoritative records without recognizing they may contain errors. Training and awareness efforts can help prevent overreliance on content that may not accurately reflect what was actually said.
5. Are there other AI features that create artifacts without direct user action that organizations should be aware of?
Yes. However, this is an area where the opinion's framing of consent becomes particularly challenging.
While the opinion emphasizes that both ethical and legal concerns require lawyers to notify clients in advance that their conversations will be recorded to create AI-generated summaries, several Microsoft 365 features can generate AI content without affirmative user action.
Auto-Generated Document Summaries
Around August 2024, Microsoft began automatically generating Copilot summaries when users with Premium Copilot licenses open Word documents stored in OneDrive or SharePoint.2
This can occur without requiring any affirmative action from the user; simply opening a document may trigger summary generation. These summaries are stored in a hidden folder within the user's mailbox and appear similar to email messages in review platforms.
Governance Implications
This creates several concerns:
- Volume explosion, e.g., every document opened potentially generates a record;
- Privilege complexity, e.g., privileged content gets summarized, too;
- Accuracy risks, e.g., AI summaries may be inaccurate representations of the document; and
- Reviewer confusion, e.g., reviewers may misinterpret autogenerated summaries as intentional user actions.
Organizations should decide in advance whether to cull, produce or treat these auto-generated summaries differently from intentional Copilot interactions.
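If the decision is to treat them differently, it must be executable at review time. The sketch below shows one hypothetical triage step that partitions collected items into auto-generated summaries and intentional Copilot interactions; the item_class field and the AutoSummary marker are placeholders rather than actual Microsoft metadata values, and would need to be replaced with whatever field the review platform or export load file actually exposes.

```python
# Hypothetical review-set triage: separate auto-generated document summaries
# from intentional Copilot interactions so each can follow its own workflow.
# "item_class" and "AutoSummary" are placeholder names, not real metadata.
def partition_copilot_items(items: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split items into (auto_generated, intentional) buckets."""
    auto_generated, intentional = [], []
    for item in items:
        item_class = item.get("item_class", "")  # hypothetical field name
        if "AutoSummary" in item_class:          # hypothetical marker value
            auto_generated.append(item)
        else:
            intentional.append(item)
    return auto_generated, intentional
```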
Teams Chat Rewrite Feature
An interesting counterexample is the "Rewrite and Adjust" feature in Teams Chat. This allows someone to use Copilot to help draft chat messages, making them longer, shorter, more formal or more casual.
However, this interaction with Copilot does not create a Copilot interaction stored in the user's mailbox. If the user accepts Copilot's suggested changes, the Copilot response becomes the user's chat message, which is classified as a chat message, not a Copilot interaction.
This means there is no record that the person used AI to craft the message, which could be relevant in matters where the user's state of mind or original phrasing matters.
Key Takeaway
Not everything that looks like it should be discoverable actually is, and some things that are discoverable are stored in unexpected locations or generated without user awareness. Organizations need to understand these nuances to develop appropriate legal hold and collection strategies.
6. What practical steps should organizations take to configure their Microsoft 365 environment in alignment with the opinion's requirements of consent, confidentiality and accuracy?
Organizations should approach this as a cross-functional effort involving legal, IT, information governance and compliance.
The specific steps will depend on each organization's risk tolerance, litigation profile and business needs for these collaboration features, but several key areas warrant attention.
Configuration Assessment
A threshold step is understanding what is currently enabled in your environment. Many organizations discover that features were enabled by default or during pilot programs without full consideration of the legal and governance implications.
Working with IT to document current settings and any planned changes is the foundation for informed decision-making.
Policy Alignment
Organizations should evaluate whether their existing policies adequately address AI-assisted recording, transcription and summarization. This evaluation should include acceptable use policies, meeting recording policies and data retention policies.
The opinion's guidance may necessitate updates to address consent requirements, permissible uses and handling of AI-generated content involving legal matters.
E-Discovery Readiness
Legal hold procedures and collection workflows may need to be revised to account for the multiple storage locations and unique characteristics of Copilot-related artifacts.
Organizations should consider whether their current preservation model — whether in-place holds or collect-to-preserve — adequately addresses content with short or fixed retention windows.
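For organizations that script their preservation workflows, the hedged sketch below shows one way to stand up an eDiscovery (premium) case, add a custodian with mailbox and OneDrive/SharePoint sources, and apply an in-place hold through the Microsoft Graph security API. The endpoint shapes are simplified and should be verified against current Graph documentation; graph_token() is the authentication helper sketched earlier, and the matter name and email address are placeholders.

```python
# Minimal sketch: create a Purview eDiscovery (premium) case, add a custodian
# with mailbox and OneDrive/SharePoint sources, and place the custodian on
# hold via Microsoft Graph. Simplified; verify endpoints before use.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def post(token: str, path: str, body: dict | None = None) -> dict:
    """POST helper that returns the parsed JSON response, if any."""
    resp = requests.post(
        f"{GRAPH}{path}",
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        json=body or {},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json() if resp.text else {}

def hold_custodian(token: str, matter_name: str, custodian_email: str) -> None:
    # 1. Create the case.
    case = post(token, "/security/cases/ediscoveryCases", {"displayName": matter_name})
    case_id = case["id"]
    # 2. Add the custodian.
    custodian = post(
        token,
        f"/security/cases/ediscoveryCases/{case_id}/custodians",
        {"email": custodian_email},
    )
    custodian_id = custodian["id"]
    # 3. Associate the custodian's mailbox and OneDrive as data sources.
    post(
        token,
        f"/security/cases/ediscoveryCases/{case_id}/custodians/{custodian_id}/userSources",
        {"email": custodian_email, "includedSources": "mailbox, site"},
    )
    # 4. Place the custodian's sources on hold (in-place preservation).
    post(
        token,
        f"/security/cases/ediscoveryCases/{case_id}/custodians/{custodian_id}/applyHold",
    )
```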
Ongoing Governance
Perhaps most importantly, organizations need a mechanism for ongoing monitoring and assessment. Microsoft continues to release new features at a rapid pace, and e-discovery implications are often unclear at launch. A one-time configuration exercise is insufficient; organizations need a process for evaluating changes as they occur.
The right approach will vary based on organizational context. Some organizations may choose restrictive configurations that minimize data creation, while others may accept greater data proliferation in exchange for productivity benefits, with appropriate governance controls.
The key is making informed, deliberate choices, rather than allowing default settings to drive outcomes.
Conclusion
Formal Opinion 2025-6 provides important ethical guidance on AI-assisted recording, transcription and summarization. However, implementing that guidance in a modern Microsoft 365 environment requires understanding the technical realities of how these features work, where they store data, and what governance controls are and are not available.
Organizations that proactively address these technical considerations will be better positioned to meet their ethical obligations while also realizing the productivity benefits these tools offer.
Staci Kaliner is a managing director, Martin Tully is a partner and John Collins is a managing director at Redgrave LLP.
The views expressed in this article are those of the authors and do not necessarily represent the views of their law firm or any of its clients.