I posted earlier today about a lawyer who filed unchecked ChatGPT-generated material, complete with hallucinated cases. But even if lawyers manage to avoid that, I'm sure that many self-represented litigants will be using ChatGPT, Bard, and the like, and won't know to properly check the results.
Indeed, a quick CourtListener search turned up three self-represented filings (1, 2, 3) that expressly noted that they were relying on ChatGPT. That suggests there are many more that used ChatGPT but didn't mention it. (To my knowledge, there's no requirement to disclose such matters.)
Note also that self-represented litigants are quite common: Even setting aside prisoner filings (since I'm not sure how many prisoners have access to ChatGPT), in federal court, "from 2000 to 2019, … 11 percent of non-prisoner civil case filings involved plaintiffs and/or defendants who were self-represented." And I expect this would be even more common in state courts, for instance in divorce and child custody cases, where I'm told self-representation is especially frequent. (Family court plaintiffs might feel they need to file for divorce even though they can't afford a lawyer, and defendants might get sued for divorce or over child custody even though they don't have any money that the plaintiff could recover.) And even without limiting matters to such categories of cases, it appears that "The caseload of most California judges now consists primarily of cases in which at least one party is self-represented."
See also this post from late February for an early query along these lines, in which one commenter did mention that he had used ChatGPT-3 for a state court filing; and see this post from January for the DoNotPay traffic-ticket-litigation story. If you know of any cases involving pro se litigants using ChatGPT and similar programs, please let me know.