Happy Friday! It’s tech reporter Alexandra Sternlicht here to help you finish the week.
Yesterday, after a brief four-and-a-half-hour deliberation, a jury in the Southern District of New York found FTX founder and former CEO Sam Bankman-Fried guilty on all seven charges against him, including wire fraud and conspiracy to commit money laundering. On the same day, the Federal Trade Commission unredacted portions of its lawsuit alleging that Amazon algorithmically stifled competition to monopolize e-commerce.
A common thread in the evidence presented in both cases: Signal’s self-deleting messages.
“Amazon executives systematically and intentionally deleted internal communications using the ‘disappearing message’ feature of the Signal messaging app,” reads the agency’s freshly unredacted complaint. At SBF’s trial, prosecutors claimed the crypto kingpin and his inner circle communicated in Signal chat groups with messages that had an auto-delete function.
While corporate use and misuse of self-deleting messages are suddenly in the spotlight, disappearing messages have long plagued another important group of regulators: parents.
This week I interviewed a Florida-based mother for whom self-deleting Snapchat messages have caused real anguish.
The mom I interviewed, Webster, banned her five children from Snapchat after an incident she believes involved a predator coercing her daughter on the platform. She told me that two years ago her then-13-year-old daughter connected with a 20-year-old male drug dealer on the app. The two formed a relationship, which she says ended when her daughter was caught with a THC vaporizer pen—allegedly supplied by the man—at her middle school. The man was never charged with selling drugs or preying on a minor because “it was almost impossible to prove it was him,” due to Snapchat’s disappearing messages, says Webster. “It was a terrifying, gut-wrenching experience as an adult.”
A Snap representative says the platform offers extra protections for teens “to help keep the focus on connecting with close friends, preventing unwanted contact from strangers, and providing an age-appropriate content experience.” (Snapchat is also being sued, alongside Meta, TikTok, and YouTube, in a Northern California personal injury suit that alleges the platforms are harmful and addictive to children.)
This is nothing new. Ten years ago, right after Snapchat garnered an $800 million valuation, a parent who served as a director at a digital forensics firm penned a USA Today essay about the “nefarious use” of self-deleting messages. “Kids who are bullied or adults who are stalked may struggle to prove their cases. Predators can lure victims, comforted by the assumed ephemeral nature of the communication,” she wrote.
Though Signal launched in 2014, years after Snapchat, the end-to-end encrypted messaging app had 40 million monthly active users by January 2022, per the BBC, and over 100 million downloads on the Google Play store alone. The app, owned by the nonprofit Signal Foundation, does not host throngs of young users like Snapchat does, but it has become a favorite of tech workers, journalists, and others seeking an added layer of privacy for sensitive messages. (Signal did not respond to Fortune’s queries about the ongoing litigation involving its disappearing-message feature.)
With legal scrutiny now trained on disappearing-message platforms, will companies, parents, and regulators force communications back onto traditional channels like iMessage? The jury is still out.
Alexandra Sternlicht
Want to send thoughts or suggestions to Data Sheet? Drop a line here.
Today’s edition was curated by David Meyer.