Renowned actor Scarlett Johansson has recently made headlines for her decision to pursue legal action against OpenAI, the company behind the popular ChatGPT platform, over allegations that it used a new voice that closely resembled her own.
The controversy arose when OpenAI introduced a ChatGPT voice assistant named Sky, whose voice bore a striking resemblance to Johansson's. This similarity prompted Johansson to take action, claiming that the company had never obtained her consent.
In a statement, Johansson revealed that OpenAI CEO Sam Altman had approached her in September with an offer to voice ChatGPT's new assistant. After careful consideration, she declined for personal reasons.
Although OpenAI initially denied that Sky's voice was derived from Johansson's, the company paused use of the voice following legal threats from the actress. Johansson expressed shock, anger, and disbelief over the situation, emphasizing that she had no involvement in the creation of the AI voice.
The incident has sparked a debate about the ethics of using celebrity voices, or close imitations of them, in AI products without explicit consent. Johansson's case serves as a reminder of the importance of obtaining proper authorization before incorporating a real person's voice into virtual assistants.
As the controversy continues to unfold, Johansson and OpenAI are expected to address the dispute through legal proceedings and determine the appropriate course of action moving forward.