Litigation, Technology, and Ethics: The Importance of Technological Competence (2025 Update)

Thomson Reuters Practical Law

Most lawyers understand that their ethical obligation of competence requires the knowledge, skill, and preparation necessary to handle any particular matter, whatever its complexity.

However, for the past two decades or longer, many lawyers have struggled with this long-established ethical principle in the face of rapid technological change, and continue to do so.  Some lawyers may be ignorant of evolving technologies and their impact on litigation practice or fear them as unduly complex, while others rely on technology too much and fail to understand its limitations.  For many litigators, e-discovery has remained terrain “where angels fear to tread” (United States v. O'Keefe, 537 F. Supp. 2d 14, 24 (D.D.C. 2008) (Facciola, Mag. J.)).

It is hard to imagine a litigation of any size or complexity where technology, specifically electronically stored information (ESI), does not come into play.  These issues are no longer relegated to substantial cases or large corporate matters; they pervade all litigation, particularly in a post-COVID world.  Most clients regularly communicate via email, text message, and instant message (whether through collaboration platforms or third-party applications), and most engage and actively participate in some form of social media.  And what lawyer has not conducted depositions, court hearings, or even trials via collaborative video technology?  Due to the explosion of electronic platforms and the increasing reliance on this ever-changing technology in virtually every aspect of litigation, technological ignorance has long ceased to be an excuse for practitioners.  The rise of generative artificial intelligence (GenAI) underscores this reality and the need for technological competence.  (See Practice Note, Key Legal Issues in Using Generative AI: Overview (US).)

Rules and Guidance

Most bar associations, state bars, and courts have issued at least some guidance on what a lawyer’s duty of competence means in the e-discovery context.  Notable guidance includes the following, each described in more detail below:

  • The American Bar Association’s (ABA) revision to Comment 8 of Rule 1.1 of the Model Rules of Professional Conduct and later opinions, including Formal Opinion 512 relating to GenAI tools.
  • Ethical opinions from the State Bar of California and other state bar associations.
  • The 2015 amendments to the Federal Rules of Civil Procedure and later case law.
  • GenAI hallucination cases and judicial responses.

ABA Model Rule 1.1

Model Rule 1.1 requires a lawyer to provide competent representation to a client.  This requires “the legal knowledge, skill, thoroughness and preparation reasonably necessary for the representation” (Model Rules of Prof’l Conduct R. 1.1).  Comment 8 to this rule clarifies that a lawyer must keep abreast of changes in the law and its practice, including the “benefits and risks associated with relevant technology” as part of this duty.

As the ABA explained in the 2012 Committee Report accompanying the amendment to Comment 8:

“[T]echnology is such an integral – and yet at times invisible – aspect of contemporary law practice . . . The proposed amendment . . . does not impose any new obligations on lawyers.  Rather, the amendment is intended to serve as a reminder to lawyers that they should remain aware of technology, including the benefits and risks associated with it, as part of the lawyer’s general ethical duty to remain competent in a digital age.” 

(ABA Comm’n on Ethics 20/20, Resolution & Report on Tech. & Confidentiality, at 3 (August 2012), available at americanbar.org.)

As later opinions clarify, lawyers cannot abdicate their responsibility to understand technology or delegate that responsibility entirely to IT departments, outside vendors, paralegals, or secretaries.  As technology evolves, particularly in the area of cybersecurity, lawyers have more to monitor than ever before.  (See ABA Formal Ethics Op. 483 (2018) (duty to monitor for and respond to data breach); ABA Formal Ethics Op. 477R (2017) (securing communication of protected client information).)  This duty to monitor includes the need to provide supervision and oversight in relation to the use of GenAI.

As the 2024 ABA Legal Technology Survey Report noted, there has been a significant increase in AI adoption among law firms, with 30 percent of respondents using AI tools, up from 11 percent in 2023.  To address this rapidly growing usage, the ABA issued Formal Opinion 512 “Generative Artificial Intelligence Tools” on July 29, 2024, which provides guidance on the ethical use of GenAI in legal practice consistent with the Model Rules.  The opinion highlights six key ethical duties that GenAI implicates:

  • Competence (Rule 1.1).
  • Confidentiality (Rule 1.6).
  • Communication (Rule 1.4).
  • Candor and Meritorious Claims (Rules 3.1, 3.3, 8.4).
  • Supervision (Rules 5.1, 5.3).
  • Fees (Rule 1.5).

With respect to Rule 1.1, the opinion confirms that lawyers must do one of the following:

  • Develop a reasonable understanding of the capabilities and risks of any GenAI tool they plan to use.
  • Draw on the expertise of others who can provide this guidance.

Although lawyers are not required to become GenAI experts, they must recognize the potential for error and ensure the tool's output is accurate and appropriate for the intended task.  This includes critically reviewing and independently verifying AI-generated content before relying on it in client work or court filings.

The opinion acknowledges that the appropriate amount of review and verification depends on the GenAI tool and the specific task it performs but warns that lawyers may not abdicate their responsibilities by relying solely on a GenAI tool to perform tasks that call for professional judgment.  Moreover, because GenAI tools are evolving rapidly, lawyers must stay informed; the opinion recommends continuing legal education, reading materials targeted at the legal profession, and consulting with knowledgeable individuals or AI experts.  As a later part of this Article demonstrates, lawyers who fail to follow this guidance risk significant legal consequences, including numerous potential sanctions and reputational harm.

State Bar Guidance

Most state bar organizations have provided some form of guidance on a lawyer’s duty of technological competence.  They generally follow the ABA’s approach, often adopting the ABA’s Comment 8 verbatim (see, for example, Del. Supreme Court, Order Amending Rules 1.0, 1.1, 1.4, 1.6, 1.17, 1.18, 4.4, 5.3, 5.5, 7.1, 7.2, and 7.3 of the Del. Lawyers' Rules of Prof’l Conduct, R. 1.1, cmt. 8 (Jan. 15, 2013) (adopting cmt. 8 verbatim); Mass. Rules Advisory Comm., Proposed Revisions to Mass. Rules of Prof’l Conduct, R. 1.1, cmt. 8 (July 1, 2013) (proposing to adopt cmt. 8 verbatim)).  According to the ABA's list, as of April 2023, 39 states had adopted some form of statement about technological competence, while another six states had adopted comments but no statement.  Additionally, the District of Columbia recently amended the comments to its Rule 1.1 with language similar to the ABA's approach (D.C. Ct. App. Order No. M284-24 (Apr. 7, 2025)), and Puerto Rico recently adopted a new Rule 1.19, titled “Technological Competence and Diligence,” that goes beyond the ABA model, effective January 1, 2026 (Puerto Rico Supreme Court Order ER-2025-02 (June 17, 2025)).

Furthermore, many states provide specific additional guidance defining competence with technology.  For example, in 2015, the State Bar of California issued an extremely influential formal opinion (which continues to be routinely cited) holding that a lawyer is not competent to handle complex cases involving ESI without sufficient understanding of the technical skills, knowledge, and aptitude required to conduct e-discovery (see Cal. State Bar Standing Comm. on Prof’l Resp. & Conduct, Formal Op. No. 2015-193 (June 30, 2015)).

The opinion stated that a lawyer undertaking complex litigation involving e-discovery should be able to perform at least nine specific tasks, including the ability to:

  • Initially assess e-discovery needs and issues, if any.
  • Implement, or cause to be implemented, appropriate preservation procedures.
  • Analyze and understand a client's ESI systems and storage.
  • Advise the client on available options for collection and preservation of ESI.
  • Identify custodians of relevant ESI.
  • Engage in competent and meaningful meet and confer with opposing counsel concerning an e-discovery plan.
  • Perform data searches.
  • Collect responsive ESI in a way that preserves the integrity of that ESI.
  • Produce responsive ESI in a recognized and appropriate manner.

(Cal. State Bar Standing Comm. on Prof'l Resp. & Conduct, Formal Op. No. 2015-193, at 2-4 (June 30, 2015), citing Pension Comm. of the Univ. of Montreal Pension Plan v. Banc of Am. Sec., LLC, 685 F. Supp. 2d 456, 462-65 (S.D.N.Y. 2010).)

The opinion directs that lawyers “who handle litigation may not simply ignore the potential impact of evidentiary information existing in electronic form.”  A lawyer who is not sufficiently competent in these areas must instead choose one of three options:

  • Obtain sufficient learning before undertaking the matter.
  • Associate with or consult technical consultants or competent counsel.
  • Decline the representation.

The opinion further directs that lack of competency in e-discovery matters can, in certain circumstances, result in ethical violations (Cal. State Bar Standing Comm. on Prof'l Resp. & Conduct, Formal Op. No. 2015-193, at 1, 7 (June 30, 2015), citing Rule 3-110(C) (“a lack of technological knowledge in handling e-discovery may render an attorney ethically incompetent to handle certain litigation matters involving e-discovery, absent curative assistance … even where the attorney may otherwise be highly experienced”)).

At least three states (New York, Florida, and North Carolina) require mandatory continuing legal education (CLE) that includes technology training (22 NYCRR § 1500.22(a) (requiring 1 CLE hour in “Cybersecurity, Privacy and Data Protection”); 27 N.C.A.C. Ch. 1D, Section .1518 (“at least 1 hour shall be devoted to technology training as defined in Rule .1501(c)(17) of this subchapter and further explained in Rule .1602(e) of this subchapter”); In re Amends. to Rules Regulating The Fla. Bar 4-1.1, 6-10.3, 200 So. 3d 1225 (Fla. 2016) (“We amend subdivision (b) (Minimum Hourly Continuing Legal Education Requirements) to change the required number of continuing legal education credit hours over a three-year period from 30 to 33, with three hours in an approved technology program”) (emphasis added)).  New Jersey also approved a new CLE requirement in “technology related subjects” in April 2025, but did not immediately provide details on the requirement or its effective date (April 2, 2025 Notice - Attorney Responsibilities as to Cybersecurity & Emerging Technologies).

Recent state bar guidance also addresses the use of GenAI tools similarly to Formal Opinion 512, cautioning lawyers to verify AI-generated content and ensure compliance with ethical obligations, and emphasizing the need for supervision, accuracy, and the protection of client confidentiality.  For example, the Pennsylvania and Philadelphia Bar Associations jointly issued Opinion 2024-200 “Ethical Issues Regarding the Use of Artificial Intelligence,” a comprehensive guide cautioning attorneys against relying on GenAI outputs without independent verification, and warning of potential violations of:

  • Rule 1.1 (competence).
  • Rule 1.4 (communication).
  • Rule 1.6 (confidentiality).
  • Rule 1.7 (conflicts of interest).
  • Rule 3.1 (meritorious claims and contentions).
  • Rule 3.3 (candor to the tribunal).
  • Rules 5.1 and 5.3 (duty to supervise).
  • Rule 5.5 (unauthorized practice of law).
  • Rule 8.4 (misconduct).

Similar opinions from other jurisdictions include:

  • The California State Bar’s “Practical Guidance for the Use of Generative Artificial Intelligence in the Practice of Law” (November 16, 2023).
  • The Florida Bar’s Ethics Opinion 24-1 (January 19, 2024).
  • The Michigan Bar’s JI-155 (October 27, 2023).
  • The New York City Bar Association’s Formal Opinion 2024-5 (August 7, 2024).
  • The Texas State Bar Committee on Professional Ethics’ Opinion 705 (February 2025).
  • The New Jersey Supreme Court Committee on Artificial Intelligence and the Courts’ “Preliminary Guidelines On New Jersey Lawyers’ Use of Artificial Intelligence” (January 24, 2024).

Additional jurisdictions continue to issue guidance at a rapid pace; the New York City Bar created a May 2025 summary of ethics opinions and reports related to GenAI that compiles formal ethics opinions from California, Washington, D.C., Florida, Kentucky, Michigan, North Carolina, New York, Pennsylvania, Minnesota, Missouri, West Virginia, the Massachusetts Attorney General, and the New York Attorney General.

2015 Amendments to the Federal Rules of Civil Procedure and Later Case Law

Given the numerous articles already written on the 2015 amendments to the Federal Rules of Civil Procedure and their impact on e-discovery, there is no need to repeat those insights here (see generally Amii N. Castle, A Comprehensive Overview: 2015 Amendments to the Federal Rules of Civil Procedure, 64 U. Kan. L. Rev. 837 (2016); Clare Kealey, Discovering Flaws: An Analysis of the Amended Federal Rule of Civil Procedure 37(e) and Its Impact on the Spoliation of Electronically Stored Evidence, 14 Rutgers J.L. & Pub. Pol’y 140 (2016); K. Alex Khoury, Electronic Discovery, 67 Mercer L. Rev. 859 (2016); Sarah Himmelhoch & Neeli Ben-David, Rule 26 Proportionality: Have the 2015 Amendments Brought Common Sense to the Preservation Obligation?, 68 DOJ J. Fed. L. & Prac. 81 (2020)).  For a chart summarizing these changes, see Legal Update, Overview of December 2015 Amendments to the Federal Rules of Civil Procedure.

While the changes to proportionality have rightfully garnered significant litigation attention, the changes to Rule 37(e) have also been extremely influential, particularly as new forms of ESI have become increasingly commonplace.  Consider, for example, text messages that have become as ubiquitous today as emails were a decade ago.  Numerous federal cases throughout the country have weighed in on the need to preserve text messages, and virtually all do so under the framework of Rule 37(e) (see Colonies Partners, L.P. v. Cty. of San Bernardino, No. 5:18-cv-00420-JGB (SHK), 2020 WL 1496444 (C.D. Cal. Feb. 27, 2020), report and recommendation adopted, No. 5:18-cv-00420-JGB (SHK), 2020 WL 1491339 (C.D. Cal. Mar. 27, 2020) (imposing sanctions under Rule 37(e) for spoliation of text messages); First Fin. Sec., Inc. v. Freedom Equity Grp., LLC, No. 15-CV-1893-HRL, 2016 WL 5870218 (N.D. Cal. Oct. 7, 2016) (same); Paisley Park Enters., Inc. v. Boxill, 330 F.R.D. 226 (D. Minn. 2019) (imposing sanctions under Rule 37(b) and 37(e) after rejecting the claim that the cell phones at issue were personal cell phones because the cell phones were used for business purposes); Sinclair v. Cambria Cty., No. 3:17-CV-149, 2018 WL 4689111 (W.D. Pa. Sept. 28, 2018) (imposing sanctions for text message spoliation under Rule 37(e))).

Under the mandate of Rule 37(e), counsel must ensure their clients take reasonable steps to preserve ESI, and where a party acted with the intent to deprive another party of the information's use in the litigation, sanctions may be imposed even without a finding of prejudice. Therefore, technological competence is more important than ever, as even the failure to disable auto-delete settings can subject the client (and counsel) to sanctions (see Mod. Remodeling, Inc. v. Tripod Holdings, LLC, No. CCB-19-1397, 2021 WL 3852323, at *10 (D. Md. Aug. 27, 2021) (imposing sanctions under Rule 37(e) and holding that “[f]ailure to either disengage the auto-delete setting or to back up messages to a cloud server prior to deleting them from a device ‘is sufficient to show’ that a defendant ‘acted unreasonably.’”)).

Other forms of ESI, such as social media, are not exempt from this requirement (see Torgersen v. Siemens Bldg. Tech., Inc., No. 19-CV-4975, 2021 WL 2072151 (N.D. Ill. May 24, 2021) (imposing sanctions under Rule 37(e) for intentional deletion of a Facebook page)). In this day and age, it is simply not enough to tell a client to preserve everything.  Counsel must provide further guidance, including but not limited to:

  • Investigating any and all available forms of client ESI.
  • Having sufficient knowledge of the client’s systems to identify where relevant information may exist.
  • Informing the client on how to preserve that information, including how to disable auto-delete.

(See DR Distributors, LLC v. 21 Century Smoking, Inc., 513 F. Supp. 3d 839 (N.D. Ill. 2021) (holding that reliance on the client’s representations was inadequate, ordering spoliation sanctions under Rule 37(e), and finding counsel so woefully uneducated regarding technology that the court ordered counsel to attend at least eight hours of continuing legal education on ESI).)  For more detailed information, see Practice Note, Sanctions for ESI Spoliation Under FRCP 37(e): Overview.

GenAI Hallucination Cases and Judicial Responses

As set forth above, the rise of GenAI tools, whether general purpose (for example, ChatGPT, Gemini, Copilot) or legal-focused (such as CoCounsel and Harvey AI), has introduced new ethical and practical challenges for attorneys.  While these tools can enhance efficiency when researching, drafting, and communicating with clients, they also pose profound risks related to accuracy, confidentiality, and candor to the tribunal.  Indeed, it is well established that the current generation of GenAI tools is prone to inventing content in response to user queries and is more likely to fabricate information than to return no information at all.  Despite appearances, GenAI tools do not think at all, much less think critically.  As ABA Formal Opinion 512 explains: “G[en]AI tools lack the ability to understand the meaning of the text they generate or evaluate its context.  Thus, they may combine otherwise accurate information in unexpected ways to yield false or inaccurate results.  Some G[en]AI tools are also prone to ‘hallucinations,’ providing ostensibly plausible responses that have no basis in fact or reality.”

Numerous cases throughout the country emphasize the dire nature of these risks, with attorneys facing sanctions and disciplinary actions at an alarming rate, almost invariably for citing non-existent facts, cases, or other materials.  In one of the earliest (and hence most cited and discussed) hallucination cases, Mata v. Avianca, Inc. (678 F. Supp. 3d 443 (S.D.N.Y. 2023)), the court sanctioned attorneys who submitted a brief containing fictitious case citations generated by ChatGPT, imposing a $5,000 fine and ordering the attorneys to send letters informing their client, and the judges to whom the fake opinions were attributed, of their actions.  The court emphasized the duty to verify all legal authorities and warned against blind reliance on AI tools, setting out several harms flowing from the submission of fake opinions.  Similarly, in Lacey v. State Farm Gen. Ins. Co. (2025 WL 1363069 (C.D. Cal. May 5, 2025)), the court criticized the use of AI-generated content in filings without proper review, striking the offending briefs and awarding $31,100 to the opposing party.  And in Park v. Kim (91 F.4th 610 (2d Cir. 2024)), the Second Circuit referred the responsible attorney to the court's grievance panel for further investigation after half of the cases the attorney cited in her appellate reply brief were fabricated.  In extreme cases, sanctions can extend beyond the attorney, with at least one court dismissing multiple matters for repeat violations (ByoPlanet Int'l, LLC v. Johansson, 2025 WL 2091025 (S.D. Fla. July 17, 2025)).

Even attorneys at sophisticated firms with established policies and artificial intelligence committees are not immune, as Johnson v. Dunn demonstrates (No. 2:21-CV-1701-AMM, 2025 WL 2086116 (N.D. Ala. July 23, 2025)). Indeed, hallucination cases have become so commonplace that United States v. Hayes provides a list of nine such cases within the short span of one paragraph (763 F. Supp. 3d 1054 (E.D. Cal. 2025), reconsideration denied, 2025 WL 1067323 (E.D. Cal. Apr. 9, 2025)).  Perhaps more alarming, these are but a handful of the decisions that have drenched the legal landscape since the release of ChatGPT in November 2022 opened the floodgates of GenAI.  In fact, one researcher’s website devoted to tracking these types of cases listed 276 GenAI hallucination cases as of August 16, 2025.   

In response to this proliferation of erroneous and inaccurate AI-generated information, many judges have begun implementing safeguards, such as requiring attorneys to certify either that no part of a document was generated by an AI tool or that a human reviewed any AI-generated text relied on in briefing or other court filings.  While most judges who have addressed GenAI have instituted such certification requirements, a few have banned the use of GenAI tools entirely.  See, e.g., Judge Christopher Boyko’s Standing Order on The Use of Generative AI in the Northern District of Ohio (“no attorney for a party, or a pro se party, may use Artificial Intelligence (“AI”) in the preparation of any filing submitted to the Court.  Parties and their counsel who violate this AI ban may face sanctions including, inter alia, striking the pleading from the record, the imposition of economic sanctions or contempt, and dismissal of the lawsuit”).  A useful collection that tracks and summarizes judicial standing orders and local rules on the use of AI is maintained at https://rails.legal/resources/resource-ai-orders/.

Best Practices

Now that lawyers’ e-discovery responsibilities have been the subject of significant consideration and clarification, counsel must face the continuing challenge of evolving technology like GenAI while simultaneously adopting tangible best practices to ensure compliance with their ethical obligations.

Beyond the obvious risk of neglecting their ethical duties, lawyers increasingly face clients who hold them responsible for failures in matters requiring technological competence (see, for example, Indus. Quick Search, Inc. v. Miller, Rosado & Algois, LLP, No. 13 CIV. 5589 (ER), 2018 WL 264111 (S.D.N.Y. Jan. 2, 2018) (allowing certain malpractice claims based on alleged negligence in e-discovery preservation to move forward)).  Discovery-on-discovery disputes are also on the rise.  These disputes examine a party’s collection, retrieval, and production efforts and are costly in terms of time, money, and both a client’s and a lawyer’s reputation.  (For information on the key considerations for counsel seeking or resisting discovery about a party’s efforts to preserve data and comply with discovery requests, see Practice Note, Discovery on Discovery (Federal).)

Moreover, recent cases demonstrate that courts are willing to impose severe sanctions for deficiencies in this area (see, for example, Staubus v. Purdue Pharma, L.P., No. C-41916 (Tenn. Cir. Ct. Sullivan Cty. Apr. 6, 2021) (entering default judgment against pharmaceutical manufacturer for failure to conduct an adequate search for responsive information and making misleading statements regarding the same); DR Distributors, LLC v. 21 Century Smoking, Inc., 513 F. Supp. 3d 839 (N.D. Ill. 2021) (imposing evidentiary curative measures and awarding attorneys’ fees for spoliation of ESI); Charlestown Cap. Advisors, LLC v. Acero Junction, Inc., 337 F.R.D. 47 (S.D.N.Y. 2020) (awarding attorneys’ fees and costs and ordering evidentiary preclusions for failure to take reasonable steps to preserve ESI); Skanska USA Civ. Se. Inc. v. Bagelheads, Inc., 75 F.4th 1290, 1314 (11th Cir. 2023) (affirming order imposing an adverse inference on grounds that plaintiff company failed to take reasonable steps to preserve employees’ text messages when it failed to suspend auto-delete functions or back up the phones); Freeman v. Giuliani, 691 F. Supp. 3d 32, 56 (D.D.C. 2023) (entering default judgment against defendant for failure to take reasonable steps to preserve text and chat messages); Oakley v. MSG Networks, Inc., No. 17-CV-6903 (RJS), 2025 WL 2061665, at *6, *12 (S.D.N.Y. July 23, 2025) (awarding attorneys’ fees and permitting evidence of spoliated text messages at trial, finding that both plaintiff and his counsel failed to take reasonable steps to preserve evidence, and noting that “preservation is a process that requires attorneys to take follow-up steps to ensure electronic evidence is preserved after the issuance of a litigation hold. . . [I]t is not enough for counsel to issue a hold and assume the client took adequate steps to preserve relevant data.”) (internal citations omitted)).  For information on the sanctions a court may impose, and under what standard, when relevant evidence is destroyed or lost, see Spoliation Sanctions by US Circuit Court Chart.

In light of this guidance defining what competence means in the technological sphere (such as the nine subject matters enumerated by the State Bar of California), it seems clear that mere familiarity with computers, email, and mobile devices, and even a basic understanding of data storage, is insufficient to meet the ethical obligation of technological competence in the third decade of the third millennium.

Therefore, all lawyers involved in litigation must step back and critically ask themselves whether they have the skill, knowledge, and ability to:

  • Adequately interview a client’s IT representatives to understand the client’s fundamental IT issues, including the operation of any retention policies and the IT infrastructure.
  • Assess where relevant ESI may be located, including non-traditional sources such as instant and ephemeral messages, internal collaborative platforms, third-party applications, text messages, and social media, and ensure that auto-delete functions are successfully managed.
  • Identify the legal issues involved with the generation, receipt, transfer, storage, preservation, and destruction of ESI.
  • Ascertain the impact of technology decisions, implementations, and changes on a client’s legal rights and obligations.
  • Identify situations where the concerns and challenges surrounding GenAI could be implicated, address such concerns and challenges as necessary, and safely and effectively employ GenAI if they intend to use it.
  • Ensure that the rights of a client and any non-parties (such as trade secrets, privilege, or privacy rights) are adequately protected in addressing the preservation, collection, and production of ESI.

In any given case, counsel should at least observe the following five directives for best ethical practices in litigation:

  • Determine what preservation steps need to be taken as soon as possible in the course of litigation.  This includes considering not only any issues regarding a client’s ESI, including the potential need to disable auto-delete functions, but also the need to put opposing parties and relevant non-parties on notice to preserve ESI.
  • Engage opposing counsel early in the process.  Doing so permits counsel to:
    • reach agreements about ESI issues that may eliminate future unnecessary discovery disputes (for more information, see Article, Learning to Cooperate);
    • avoid possible sanctions or future challenges to the efforts made in the preservation, collection, and production of ESI; and/or
    • ensure appropriate protection of the rights and property interests of parties and non-parties.
  • Assess whether to associate with more experienced and qualified counsel.  Counsel should consider whether they have sufficient knowledge and experience to meet e-discovery challenges.  There are several law firms and lawyers with specialized knowledge in the technology arena with whom less experienced counsel can associate.  Taking this step comports with ethical requirements (see Model Rules of Prof’l Conduct R. 1.1, cmt. 2 (“[c]ompetent representation can also be provided through the association of a lawyer of established competence in the field in question”); see also Model Rules of Prof’l Conduct R. 1.1, cmt. 6 (“the reasonableness of the decision to retain or contract with other lawyers outside the lawyer’s own firm will depend upon the circumstances, including the education, experience and reputation of the nonfirm lawyers”)).
  • Appreciate the complexities and nuances of the processes surrounding the identification, preservation, collection, review, and production of ESI.  This includes understanding how to implement preservation protocols, undertake defensible searches, and make productions effectively and in acceptable formats.  Inexperienced lawyers can:
    • seek guidance and advice from experienced IT professionals within their practice and firm;
    • build on their experience by taking continuing legal education courses, which are increasingly available to every practitioner nationwide (and are now required in several states); and
    • take advantage of the many free opportunities for education from e-discovery vendors about specific systems, collection techniques, predictive coding, and a myriad of other specific issues.
  • Understand and acknowledge the implications and challenges posed by GenAI.  This includes taking steps to satisfy their ethical obligations such as:
    • developing a basic understanding of how GenAI functions, its capabilities, and its risks;
    • staying informed through continuing education, reading, or consulting experts as GenAI evolves rapidly;
    • checking local court and judicial rules for specific GenAI guidance;
    • critically reviewing and verifying all AI-generated content before relying on it in client work or court filings;
    • building time for citation checking into the standard practice routine (an illustrative first-pass sketch follows this list);
    • disclosing GenAI use when appropriate, including to clients;
    • avoiding use of confidential client data in public GenAI tools;
    • maintaining human oversight over legal work; and
    • establishing and following internal policies governing the use of GenAI tools, including training, supervision, and risk management procedures.
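
For lawyers comfortable with basic scripting, a lightweight automated first pass can support (but never replace) the citation-checking step above.  The following Python sketch is purely illustrative, and its details are assumptions rather than anything drawn from this Article: the regular expression, function names, and sample text are hypothetical, the pattern covers only a few common federal reporter formats, and it will miss or mis-read many citations.  Its only purpose is to pull citation-like strings out of a draft so that a human can locate and read each cited authority in an official source before filing.

import re

# Purely illustrative sketch (hypothetical names and pattern): extract
# citation-like strings from a draft so each one can be manually located
# and read in an official reporter or database before filing.  The pattern
# covers only a few common federal reporter formats and will miss others.
CITATION_PATTERN = re.compile(
    r"\b\d{1,4}\s"                                   # volume number
    r"(?:U\.S\.|S\. Ct\.|F\.2d|F\.3d|F\.4th|"        # common federal reporters
    r"F\. Supp\.(?: [23]d)?|F\.R\.D\.|WL)"
    r"\s\d{1,7}\b"                                   # first page or Westlaw number
)

def extract_citations(draft_text: str) -> list[str]:
    """Return a de-duplicated list of citation-like strings found in the draft."""
    found = []
    for match in CITATION_PATTERN.finditer(draft_text):
        cite = match.group(0)
        if cite not in found:
            found.append(cite)
    return found

if __name__ == "__main__":
    sample = (
        "See Mata v. Avianca, Inc., 678 F. Supp. 3d 443 (S.D.N.Y. 2023); "
        "Park v. Kim, 91 F.4th 610 (2d Cir. 2024)."
    )
    for cite in extract_citations(sample):
        print("Locate and read before filing:", cite)

A tool of this kind merely organizes the manual verification that the ethics opinions discussed above require; it does not verify anything itself, and every flagged authority still must be read and confirmed by a lawyer.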

The views expressed in this article are those of the authors and not necessarily those of Redgrave LLP or its clients.