Event Recap | Proof of Personhood: Provenance and AI Avatars
AI avatars can look identical to us, and they can speak languages that we humans can’t. They can perform consistently without food or sleep in any time zone and, at their most advanced, they can carry on full conversations with humans or even other avatars. They can unlock new business opportunities, develop creative projects, and supercharge productivity in an array of industries by enabling us to augment and scale our likenesses.
However, these digital clones introduce profound challenges for intellectual property protection and digital rights. Misuse or unauthorized replication of likenesses can compromise trust, exploit identities, and destabilize creative economies.
Recently, we hosted a conversation with experts to explore these issues. We discussed ethical and legal boundaries; transparency, consent, and control; and the role of content provenance in protecting intellectual property and digital identities.
Henry Ajder, CAI advisor and Founder of Latent Space Advisory, hosted the event. He moderated a group discussion with our three guests about compensation models, levers of control over the use of likenesses, and what we can learn from the aversion people feel after discovering they've been fooled by AI.
Natalie Monbiot, Founder of Virtual Human Economy, spoke about how AI avatars have evolved in the last several years, and she shared examples of commercial applications in which AI avatars can benefit the humans they're based on. She also explained how trust with businesses and users is critical to the effective use of AI avatars and how Content Credentials are one of many tools that can help build that trust.
Kelsey Farish, a media and AI lawyer who specializes in working with creatives and performers on digital replica rights, described the importance of good industry practice, technological safeguards, and regulation. She also discussed the need to define reality, protect creative autonomy, and balance power in creative contracts, noting that standard contracts granting broad future usage rights haven't caught up with the sophistication of today's AI technology.
Erik Passoja, Co-Chair of the LA New Technology Committee at SAG-AFTRA and an accomplished actor in his own right, spoke about his experience having his likeness from a performance capture used in a video game without his knowledge or consent. He provided an overview of current California and federal digital identity protection laws and the additional legislation needed to help enforce them. He also described a model for using provenance data in the content authentication process to verify consent, and he stressed the urgent need for provenance at the point of image or voice capture.
“We performers are the canaries in the coal mine of digital identity,” he said. “You take away a carpenter’s voice, and they can still build. You take away a voice actor’s voice, and they’ll never work again.”