Deepfakes in the Courtroom: Challenges in Authenticating Evidence and Jury Evaluation

*Colin Livingston

I. Introduction

In a UK child custody dispute, a mother presented a “heavily doctored recording” in court to portray the father as “violent and threatening” in an effort to deny him access to his children.[1] The father’s attorney successfully challenged the recording’s authenticity and warned that “it would never occur to most judges that deepfake material could be submitted as evidence.”[2] Advances in artificial intelligence (AI) have enabled the creation of “deepfake” content that convincingly portrays individuals doing or saying things that never occurred.[3] Courts must now assess the authenticity of evidence that may have been altered in ways that make manipulation difficult to detect.[4] As AI technology advances and deepfakes become easier to produce,[5] courts can expect more cases that challenge the application of traditional evidentiary rules and frameworks.[6]

II. Authenticating Evidence

Concerns regarding AI and deepfakes have already surfaced in United States courts,[7] forcing judges to confront how this technology can be used in litigation. In a wrongful death lawsuit against Tesla,[8] “Plaintiff[s] submitted a [r]equest . . . for Tesla to admit the” authenticity of a video in which Elon Musk made statements about the safety of Tesla’s Autopilot feature.[9] In response, Tesla argued that Musk’s status as a public figure makes him the subject of many deepfake videos and that it therefore could neither admit nor deny the video’s authenticity.[10] The court rejected this argument, unwilling to set a precedent that “Mr. Musk, and others in his position, can simply say whatever they like in the public domain” and then avoid liability by claiming their statements were potentially deepfakes.[11] AI raised a different evidentiary question in a Washington criminal trial, where the State sought to introduce an AI-enhanced video recording to compensate for the original’s low resolution and motion blur.[12] Concerned with preserving the integrity of the evidence, the court excluded the enhanced video on relevancy grounds and because of the danger of unfair prejudice to the defendant.[13] These cases illustrate the challenges courts face in authenticating potentially AI-altered or AI-enhanced evidence and underscore the need for courts to adapt their evidentiary rules.

Currently, Federal Rule of Evidence 901 governs the authentication of evidence in federal courts, and in conjunction with that rule, courts may decide preliminary questions of admissibility under Rule 104.[14] Attempting to address the growing concerns regarding deepfakes and AI, the Advisory Committee on Evidence Rules, on November 8, 2024, considered proposed Rule 901(c), which “if adopted, would govern ‘potentially fabricated or altered electronic evidence.’”[15] The proposed rule would apply only when the opponent of the evidence demonstrates that a reasonable jury could find that the evidence has been altered by AI.[16] If the opponent makes that showing, the court would admit the evidence only if the proponent demonstrates that its probative value outweighs the risk of unfair prejudice.[17] Rule 901(c) would be the first federal evidentiary rule to address advancements in AI, but while “technology evolves rapidly, . . . the rul[em]aking process [remains] slow.”[18] At the time of this writing, the proposed amendment to Rule 901 has yet to be adopted.[19]

An additional concern is the potential for increased litigation costs.[20] Deepfake evidence will increase the need for digital forensic experts capable of detecting AI alterations and authenticating evidence.[21] These experts can charge anywhere from a few hundred dollars to “thousands [of] dollars per project,” depending on a matter’s complexity and legal significance.[22]

III. Juror Skepticism and the Liar’s Dividend

The growing presence of AI and deepfakes may also affect how jurors evaluate the weight and credibility of evidence at trial.[23] Advancements in AI are not the first instance of social change influencing juror expectations.[24] Around the year 2000, attorneys confronted what became known as the “CSI effect,” a shift that heightened jurors’ expectations regarding the evidence offered at trial.[25] A similar shift may be emerging: in a 2019 Pew Research Center survey, 66% of Americans said “they at least sometimes come across altered videos and images that are intended to mislead.”[26] It is plausible that this figure has risen as deepfakes have become more sophisticated and widespread.[27] While it remains to be seen whether AI and deepfakes will trigger another paradigm shift in jurors’ beliefs, there is legitimate concern that jurors may grow increasingly skeptical of evidence presented at trial.[28]

The concern that deepfakes will influence jurors becomes even more rational in light of a social phenomenon called the “liar’s dividend.”[29] As deepfakes become more prevalent, litigants may begin asserting that evidence presented against them is a deepfake, a tactic already deployed in the Tesla litigation discussed above.[30] The liar’s dividend makes this tactic increasingly credible as the public grows more aware of the threats deepfakes pose, because “a skeptical public will be primed to doubt the authenticity of real audio and video evidence.”[31]

IV. Conclusion

The emergence of AI and deepfakes presents significant evidentiary and procedural challenges for courts.[32] Beyond making evidence harder to authenticate, this technology could make jurors more skeptical of even legitimate evidence and raise litigation costs by requiring expensive forensic experts.[33] Although early cases and proposed rules show some adaptation, the full extent of AI’s impact on evidentiary standards and trial practice remains uncertain.[34]

*Colin Livingston is a second-year student at the University of Baltimore School of Law where he is a Staff Editor for Law Review and a Distinguished Scholar of the Royal Graham Shannonhouse III Honor Society. Prior to law school, Colin earned a Bachelor of Arts in Government and Politics from the University of Maryland, College Park. Next summer he will be working as a Law Clerk for Eccleston & Wolf.


[1] Patrick Ryan, ‘Deepfake’ Audio Evidence Used in UK Court to Discredit Dubai Dad, The Nat’l (Feb. 8, 2020), https://www.thenationalnews.com/uae/courts/deepfake-audio-evidence-used-in-uk-court-to-discredit-dubai-dad-1.975764.

[2] Id.

[3] James Ellis Arden, Deepfakes and Digital Evidence at Trial: Who Are You Going to Believe, the AI or Your Lying Eyes?, A.B.A.: GP Solo Mag. (June 4, 2024), https://www.americanbar.org/groups/gpsolo/resources/magazine/2024-may-june/deepfakes-digital-evidence-trial/.

[4] Natalie Runyon, Deepfakes on Trial: How Judges Are Navigating AI Evidence Authentication, Thomson Reuters (May 8, 2025), https://www.thomsonreuters.com/en-us/posts/ai-in-courts/deepfakes-evidence-authentication/.

[5] Frank Young, A Deepfake Evidentiary Rule (Just in Case), Univ. Ill. Chi.: Law Library (July 3, 2025), https://library.law.uic.edu/news-stories/a-deepfake-evidentiary-rule-just-in-case/ (stating that “[t]he legal profession has been anticipating a ‘tsunami of deepfake evidence’ dropping into exhibit lists” because of “the widespread availability of free or inexpensive apps for creating deepfakes.” (citations omitted) (quoting Chuck Kellner, The End of Reality? How to Combat Deepfakes in Our Legal System, A.B.A. J. (Mar. 10, 2025, at 9:10 CT), https://www.abajournal.com/columns/article/the-end-of-reality-how-to-combat-deepfakes-in-our-legal-system)).

[6] Id.; see Runyon, supra note 4.

[7] See Runyon, supra note 4.

[8] Lora Kolodny, Tesla Settles Lawsuit Over Autopilot Crash that Killed Apple Engineer, CNBC (Apr. 8, 2024), https://www.cnbc.com/2024/04/08/tesla-settles-wrongful-death-lawsuit-over-fatal-2018-autopilot-crash.html.

[9] Audrey Mitchell, Deepfaked Evidence: What Case Law Tells Us About How the Rules of Authenticity Needs to Change, Berkeley Tech. L.J.: Blog (June 23, 2025), https://btlj.org/2025/06/deepfaked-evidence-what-case-law-tells-us-about-how-the-rules-of-authenticity-needs-to-change/; Defendant Tesla, Inc.’s Opposition to Plaintiffs’ Motion to Compel Re Tesla, Inc.’s Supplemental Responses to Written Discovery; Motion for the Deposition of Elon Musk; and Motion for Sanctions at 4, Sz Hua Huang v. Tesla, Inc., No. 19CV346663 (Cal. Super. Ct. Santa Clara Cnty. Apr. 20, 2023), https://cdn.arstechnica.net/wp-content/uploads/2023/04/tesla-opposition.pdf.

[10] See sources cited supra note 9.

[11] Tentative Ruling on Plaintiff’s Motion to Compel Further Responses To Request for Admission (Set 3 and 3), Request for Admission as to Genuineness of Documents (Set 3), Form Interrogatory No. 71.1, Special Interrogatories (Set 3, 4 and 5), Request for Production (Set 5 and 6), the Deposition of Elon Musk and for Sanctions at 29, Sz Hua Huang v. Tesla, Inc., No. 19CV346663 (Cal. Super. Ct. Santa Clara Cnty. Apr. 27, 2023), https://cdn.arstechnica.net/wp-content/uploads/2023/04/musk-deepfake-ruling.pdf; see also Mitchell, supra note 9.

[12] Findings of Fact and Conclusions of Law Re: Frye Hearing on Admissibility of Videos Enhanced by Artificial Intelligence at 1–2, State v. Puloka, No. 21-1-04851-2 KNT (Wash. Super. Ct. King Cnty. Mar. 29, 2024).

[13] Id. at 6.

[14] See Fed. R. Evid. 901, 104.

[15] Kellner, supra note 5 (quoting Advisory Comm. on Evidence Rules, Jud. Conf. of the U.S., Agenda for Committee Meeting November 2024, at 241 (2024), https://www.uscourts.gov/sites/default/files/2024-11_evidence_rules_committee_meeting_agenda_book_final_10-24.pdf).

[16] Id. (quoting Advisory Comm. on Evidence Rules, Jud. Conf. of the U.S., Agenda for Committee Meeting November 2024, at 241 (2024), https://www.uscourts.gov/sites/default/files/2024-11_evidence_rules_committee_meeting_agenda_book_final_10-24.pdf).

[17] Id. (quoting Advisory Comm. on Evidence Rules, Jud. Conf. of the U.S., Agenda for Committee Meeting November 2024, at 241 (2024), https://www.uscourts.gov/sites/default/files/2024-11_evidence_rules_committee_meeting_agenda_book_final_10-24.pdf).

[18] See Young, supra note 5 (citing Advisory Comm. on Evidence Rules, Jud. Conf. of the U.S., Agenda for Committee Meeting November 2024, at 241 (2024), https://www.uscourts.gov/sites/default/files/2024-11_evidence_rules_committee_meeting_agenda_book_final_10-24.pdf).

[19] See Fed. R. Evid. 901; Advisory Comm. on Evidence Rules, Jud. Conf. of the U.S., Agenda for Committee Meeting November 2025, at 5 (2025), https://www.uscourts.gov/sites/default/files/document/2025-11_evidence_rules_commitee_agenda_book_final.pdf (“The Committee has prepared, but not approved, a new Rule 901(c) that would require the opponent to provide evidence sufficient to support a finding that a challenged item was a deepfake”).

[20] Kellner, supra note 5.

[21] See id.

[22] Id.

[23] See Jonathan A. Porter, Will Juries Latch onto Deepfake Concerns Like They Did to Scientific Evidence During the Infamous “CSI Effect”?, A.B.A.: Criminal Justice Section (Feb. 8, 2024), https://www.americanbar.org/groups/criminal_justice/resources/committee-articles/juries-deepfake-concerns-csi-effect/.

[24] Id.

[25] Id. (explaining that after the television show “CSI” became popular, “[a] study found that nearly half of jurors expected to see some kind of scientific evidence in every type of criminal case, and nearly a quarter of jurors expected to see DNA evidence in every type of criminal case.”).

[26] Jeffrey Gottfried, About Three-Quarters of Americans Favor Steps to Restrict Altered Videos and Images, Pew Rsch. Ctr. (June 14, 2019), https://www.pewresearch.org/short-reads/2019/06/14/about-three-quarters-of-americans-favor-steps-to-restrict-altered-videos-and-images/ (citing Amy Mitchell et al., Pew Rsch. Ctr., Many Americans Say Made-Up News Is a Critical Problem That Needs To Be Fixed 5 (2019)); see also Porter, supra note 23.

[27] Porter, supra note 23.

[28] Id.

[29] Bobby Chesney & Danielle Citron, Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security, 107 Calif. L. Rev. 1753, 1785 (2019).

[30] Id.; see supra text accompanying notes 9–11.

[31] Chesney & Citron, supra note 29, at 1785.

[32] See discussion supra Part II.

[33] See generally discussion supra Parts II, III (explaining that the rise of deepfakes may require updates to evidentiary rules, increase reliance on costly forensic experts, and shape jurors’ perspectives on whether evidence is legitimate).

[34] See discussion supra Part II.
