Fay Johnson is an expert in the field of Trust & Safety. Her career combines deep expertise in behavioral science with hands-on experience in product development, content moderation, and online safety at leading technology companies including Meta, Twitter, and Nextdoor.
Johnson has designed and implemented systems that promote respectful interactions, fairness, and community trust, shaping how billions of people engage online. While at Meta, Fay was the product lead on designing and building the Oversight Board, and its integration into the broader Facebook ecosystem.
She is currently a fellow at Harvard University’s Berkman Klein Center for Internet & Society, where she is doing applied research and product development focused on depolarization and increasing civility in online discourse.
During our chat, we discussed Meta’s decision to cut back on fact checking and its short- and long-term consequences.
We also dove into the roles of platforms like Meta, Twitter, and Nextdoor – specifically, whether they have any moral or ethical responsibility beyond their Terms and Conditions.
We then talked about some of the lessons she learned while running her product teams, and what the future may hold for these platforms as technologies like AI and Community Notes emerge as possible remedies for what ails our online communities.
I hope you learn as much as I did from Fay.
Watch Episode:
This is a public episode. If you’d like to discuss this with other subscribers or get access to bonus episodes, visit truethirty.substack.com/subscribe