Community Events

Last year we launched a series of members-only events, including an open-source office hour, a panel with experts on emerging mis- and disinformation threats, a UX working session, and a community event co-hosted with Leica. We're continuing these conversations to create more opportunities to connect. Whether you're an implementer, designer, creator, or educator, we hope you'll discover new ideas and practical ways to contribute to the content authenticity movement.

How to get started: 

To RSVP to events, you must be a CAI member and signed in.

  • Not a CAI member yet? Join us!
  • Create your account to see and RSVP to virtual and in-person events, working sessions, and more. When creating your account, use the same email address you used when signing up as a CAI member. Reference your membership welcome email and reach out with any questions.

🗓️ Tuesday, September 30, 12:00-12:45pm ET

This event is free and open to the public. 

The rapid proliferation of AI-generated content is threatening the integrity of our information ecosystem. Misinformation and disinformation are just one dimension of a broader set of challenges to our ability to reliably know and understand the world, known collectively as epistemic risks.

High-volume, low-quality synthetic content continues to contaminate the sources from which we find and share information. If AI models are trained on those outputs, they threaten to create a digital environment of decreasing reliability, diversity, originality, and nuance, while amplifying harmful biases and inaccuracies. Other epistemic risks include the erosion of trust in expertise and institutions, attention scarcity, and algorithm-induced echo chambers.

What effects will all of this have on our societies? What tools, policies, and infrastructures do we need to prevent information degradation and preserve the integrity of knowledge? How can we build epistemic resilience and strong foundations for knowledge in this evolving landscape?

Join Henry Ajder, CAI advisor and founder of Latent Space Advisory; Elizabeth Seger, Associate Director of Digital Policy at Demos; and Andy Dudfield, Head of AI at Full Fact, for a critical conversation on epistemic risk, the role content provenance technology can play in mitigating information decay, and how societies can adapt to maintain epistemic security amid increasingly complex information threats.

RSVP