That Little Question of Humanity…
If Scarlett Johansson can’t own her voice, what does the future hold for the rest of us?
We’ve talked about AI and copyright here: Can companies like OpenAI take whatever content they want to feed their models? And what happens when that content is, well, an individual’s human characteristics? A person’s voice and likeness belong to them and therefore have value, right? It’s a question Hollywood has been demanding an answer to over the past year (it’s part of what drove the actors’ strike), and it was reframed on Monday, when the new OpenAI chatbot seemingly replicated Scarlett Johansson’s voice, even though she’d turned down their offer. Twice.
As I read ScarJo’s statement about her interactions with OpenAI’s Sam Altman — allegations that OpenAI quickly moved to disprove — I kept thinking back to my conversation with Marvel actor and director Clark Gregg, whom I spoke with for the final episode of Season Two, which will drop in a few weeks. We met last fall, when we testified before Congress in support of the Data Privacy Act. Clark was a prominent spokesperson for actors’ rights against the threat of AI during the SAG-AFTRA strikes, using his visibility to try to ensure that actors at all levels aren’t simply scanned and replaced.
During our interview, he told me that actors are trying to protect their ownership of their own voice, likeness, and eye and face scans, much as writers during the strike were trying to maintain that they are something different from AI — “That they have an ability to copyright and have authorship in a way that a computer does not. We're trying to say that our face, our likeness, our voice, our — you know, whatever we call our personality — belongs to us. And that's something, because of biometrics, that's covered in that Data Privacy Act.” We are officially data.
Clark and others are advocating that a person should not only own his or her own scan — data containing their image, voice and movement that can be cloned by AI to create a digital replica — but also be able to participate in some kind of economy around that scan. There have been scary stories about companies paying voiceover artists $400 to record some clips, telling them the recordings would be used only for academic research purposes…only for the artists to later hear themselves voicing deepfake YouTube videos. They are being paid a pittance to replace themselves, and they are losing the right to control how their talent is used. (See ScarJo v. OpenAI, above.)
Clark has been scanned and re-directed, so to speak. But, he says, it’s never quite him: “They can change your facial expression. They can put tears in your eyes… All of a sudden you're kind of turning over your poetry, we would like to think, and they're putting a few more words in it.”
“They can change your facial expression. They can put tears in your eyes… All of a sudden you're kind of turning over your poetry, we would like to think, and they're putting a few more words in it.”
— Clark Gregg
He has spoken to Congress in support of the No Fakes Act, which seeks to protect actors’ biometric data and likeness. Actors are asking for the Three C’s: Consent, Compensation and Consideration. Consideration? Clark told me that actors are selling their scans and signing away their rights to them in perpetuity — even after they’re dead.
So who is ultimately benefiting from these deals? Is it about increasing profit or improving an art form? Clark says that as far as his investigation into tech’s motives goes, “They don't seem super capable of explaining how it's going to evolve, or even be that clear about how it works. So there's been this idea, and I think it's one — I don't want to seem paranoid — that seems especially present in this moment in capitalism, which is: The profit drives the decision. And I think that's not bearing out to be necessarily that good for human people. It needs to be tempered somehow, and that should no longer be looked at as some kind of anti-business, anti-capitalist, but pro-human idea.”
When Altman reached out to Johansson, he was up front about what their partnership would entail — you know, get the actress who voiced the personal chatbot in Her to be the voice of a real-world personal assistant. He also framed it in terms of its benefit to humanity. As ScarJo wrote in her personal statement:
“He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and A.I… He said he felt that my voice would be comforting to people.”
The questions at this moment, Clark says, seem very large: “Like, if you can do something, should you? The funny thing is that all of us have spent time watching or working on dystopian sci-fi things where these are all the early beats of the — pretty soon we end up as human batteries or running away from the hunter killer robots, one of whom looks like Arnold.”
“I've been scanned a bunch,” he explained. “The idea that we're going to just — for the sake of profit, for the sake of profit for corporations, as opposed to profit for people — we're going to turn ourselves into a bunch of zeros and ones? It almost feels like The Matrix, but we're deliberately climbing into the canisters and plugging ourselves in.”
Thinking back to ScarJo’s quote: People. Remember them? They are the terrifying missing piece of this discussion. This technology has raised questions about what it means to be human: What makes us us, and who gets to define it? Most of all, why are we being so complacent when even mega-actors are having their identities ripped off? What does that mean for the rest of us?! Please join the discussion. Because your voice is valuable, too.
Worth the Read
Researchers at the AI company Anthropic claim to have an understanding of how large language models work, and even figured out how to turn off certain features. I’m optimistic!
The consultant behind those deepfake Biden robocalls telling Democrats not to vote during the New Hampshire primaries was fined $6m.
Axios reports that Amazon is using AI to use battery storage more efficiently and lower emissions. Yes, but don’t energy-hungry data centers kind of wipe out those gains…?