With political polarization reaching record levels in the United States, trust in our democracy is at a crossroads—only about one-third of Americans feel our system is functioning effectively, and nearly 60% experience stress and frustration when discussing politics with those who hold different views. In a recent San Francisco Chronicle op-ed, I offered a more hopeful perspective, drawing on the Neely Center’s efforts to foster thoughtful, respectful dialogue among young people. We believe that bridging divides can begin with cultivating empathy, active listening, and mutual understanding.
Polarization is just one of several complex challenges we’re working to address. We also face widespread misinformation, growing mistrust, and the difficulties many young people encounter as they grapple with feelings of isolation, loneliness, and mental health issues. At the Neely Center, our work addresses these challenges by bringing leaders, organizations, and students together—across diverse backgrounds and disciplines—to identify ethical policy interventions and innovative design choices that align emerging technologies with human-centered values.
Our efforts are rooted in the conviction that by actively engaging diverse voices, we can provide leaders and decision-makers with the insights and tools they need to guide technology toward greater understanding, resilience, and well-being—helping us move toward a more hopeful and collaborative future. I’m proud of the meaningful contributions our team and partners are making on this front. Read on to check out some of what we’ve been up to!
Sincerely,
Nate Fast, Director
Informing the European Union's Protection of Minors
Ravi Iyer, Managing Director of the Neely Center, recently joined a panel with California Assemblymember Buffy Wicks and Martin Harris Hess, Head of Protection of Minors for the EU, to discuss policy approaches to protecting children online. A recording of the event is available online. The Neely Center also provided formal input into the upcoming code of practice for the Digital Services Act and participated in several workshops to support this work. We remain actively engaged with EU regulators in shaping the implementation of the Digital Services Act in ways that protect minors.
Informing Ofcom's Measurement Strategy
Ravi Iyer also delivered a keynote presentation at Ofcom's public event on "Evaluating Effectiveness of Online Safety Measures." During his presentation, he introduced both the Neely Center Design Code for Social Media and the Neely Indices. Ravi continues to serve on Ofcom's academic panel, contributing insights to support the implementation of the United Kingdom's Online Safety Act.
Sponsoring Build Peace 2025 in the Philippines
The Neely Center was proud to sponsor the Build Peace Conference in 2024 and 2025. Technology impacts societies worldwide, and some of the most profound effects—both positive and negative—are seen in global majority countries. Last year, in Nairobi, we led several workshops to share insights on global platform design trends and gather input from diverse communities on their ideal technology products. This year, in the Philippines, we facilitated similar discussions on the progress of global design governance and measurement initiatives. Our goal is to inform global policymakers and engage peacebuilders on how best to design AI, social media, and mixed reality systems for the benefit of all.
Impact Guild Forum 2024 - Democracy & Media: Where Do We Go From Here?
Nate Fast, Director of the Neely Center, was an invited speaker at the Impact Guild Forum 2024. Organized by the UTA Foundation—a nonprofit committed to harnessing the power of media, entertainment, and the arts for social impact—the Forum convened thought leaders to explore media's role in democracy and its potential to shape our shared future. Nate contributed to the plenary session, "Persuasion at Scale: Artificial Intelligence, Data, and the Future of Storytelling," offering insights on how emerging technologies are transforming the ways we communicate and foster connection in today's digital landscape.
How Social Media Changed from March 2023 through May 2024
In a recent Substack post, Matt Motyl, a Senior Advisor at the Neely Center, analyzes longitudinal data from the Neely Social Media Index to explore how positive and negative experiences have changed over time. His analysis of the rankings leads to four main points:
Traditional social media platforms (e.g., X/Twitter, Facebook, Reddit, TikTok) continue to have higher rates of negative experiences than direct communication services (e.g., FaceTime, text messaging) and more focused, niche social media platforms (e.g., LinkedIn, Pinterest).
Service rankings are relatively stable between the first and last waves. The main exceptions are LinkedIn and Instagram, which both dropped on both negative experience dimensions. Nextdoor also leapfrogged several platforms on the "learning something useful" dimension, while Facebook and X (Twitter) both dropped several spots.
Pinterest has taken the top spot for the fewest negative experiences and has held strong near the top of the list for the highest rates of learning experiences. Given that Pinterest does not prioritize user-to-user communication, it makes sense that it has the lowest rate of users reporting meaningful connections with each other on the platform.
X (Twitter) has the highest rate of negative experiences and has held that position across all five waves of the survey. X (Twitter) also showed the biggest drop in the rate of users reporting learning something useful or important, falling from 7th to 13th out of 16.
Read the full post here.
Advancing Regulatory Coherence at State, Federal, and International Levels
The effects of technology platforms span state, federal, and global levels, requiring responses from policymakers across jurisdictions. Despite varying contexts, the challenges these policymakers face are strikingly similar. To address them, the Neely Center, in partnership with the Knight Georgetown Institute and the Tech Law Justice Project, convened a gathering of state officials, legal scholars, federal regulators, and technology experts in Washington, DC. The discussion focused on identifying pathways to effective, feasible, and constitutional policy solutions. One key outcome of this gathering was the development of a design taxonomy that is now being used across jurisdictions to guide regulatory responses. Ongoing collaborations leveraging this taxonomy aim to produce syndicated policy options ready for implementation in 2025.
The 8th Annual Psychology of Technology Conference 2024
The Digital Business Institute at Boston University’s Questrom School of Business hosted the 8th Annual New Directions in Research on the Psychology of Technology conference on October 12-13, 2024. This year’s theme, “The Quantified Society,” brought together a diverse group of industry leaders, behavioral scientists, technologists, and AI experts dedicated to fostering a healthy psychological future as AI becomes an integral part of daily life. Conference speakers included Madeleine Daepp, Microsoft Research; Johannes Eichstaedt, Stanford University; Emily Saltz, Google Jigsaw; Glenn Ellingson, Civic Health Project; Tara Behrend, Michigan State University; Andrea Liebman, Swedish Psychological Defence Agency; Chloe Autio, Autio Strategies; Ben Waber, MIT Media Lab & Humanyze; Dokyun "DK" Lee, Boston University; Kurt Gray, Ohio State, and many others (see full list here). The keynote was delivered by Luis von Ahn, CEO and co-founder of Duolingo. Special thanks to Carey Morewedge and Questrom for organizing and hosting.
Neely Center Design Code Informs European Design Governance
The Neely Center showcased its Design Code for Social Media at multiple key engagements across Europe. We participated in a European Commission-sponsored workshop focused on protecting minors and co-hosted an event in Brussels with civil society groups influencing the implementation of the Digital Services Act through the Council on Technology and Social Cohesion, which we co-chair. Additionally, the Council organized a workshop titled "Prosocial Tech Design Governance: Exploring Policy Innovations" at the European University Institute in Florence on October 9-10, 2024, bringing together leading scholars advancing regulatory approaches. Across these events, the Neely Center Design Code for Social Media played a pivotal role in shaping discussions on how technology platforms can be designed to positively impact society.
Neely Center Presents Design Code and Indices at Daniels Fund Lecture at University of Utah
Ravi Iyer, Managing Director of the Neely Center, delivered an invited talk as part of the University of Utah's Daniels Fund Lecture Series, which aims to provide students with an ethical perspective on current events. In his talk, Ravi shared insights into the challenges and opportunities for creating a more ethical and socially responsible social media environment. Drawing from his experiences at platforms like Facebook, he discussed how thoughtful design choices can mitigate negative societal impacts. Key topics included the importance of effective content moderation, incentivizing positive engagement, and addressing the influence of algorithms on user behavior.
Developing an AI Ethics for Leaders Curriculum
Under the leadership of David Evan Harris, a Senior Advisor to the Neely Center, the "AI Ethics for Leaders" course has been taught over several semesters at the University of California, Berkeley, with two sections offered this year: one for undergraduates and one for international students. Ravi Iyer, Managing Director of the Neely Center, guest lectured last semester and again this semester, and the Neely Center is continuing to refine the curriculum so it can be offered at other institutions. USC's Neely Center both conceptualized the course and funded its initial curriculum.
Neely Center Presents to the Trust and Safety Community
As part of our outreach to technologists, Ravi Iyer presented our Design Code for Social Media and findings from the Neely Indices to hundreds of technology professionals at Trustcon 2024—the world’s leading conference for the Trust and Safety community. The Neely Center also participated in the Trust and Safety Research Conference, which brings together academics and industry professionals working in trust and safety.
Salon on AI and Human Flourishing
Neely Center Director Nate Fast was invited to speak on a panel at a private gathering of influential technologists, researchers, investors, and policymakers in San Francisco. The day's discussion focused on how to ensure that human-AI interactions improve, rather than worsen, the problem of social isolation and loneliness that many are experiencing. In the coming year, the Neely Center will be working with partners to identify policy and design solutions to address this issue. Learn more about the HumanConnections.AI event here.
Using the Neely Social Media Index to Understand Newsfeed Quality in the 2024 Election
In a recent Tech Policy Press article examining the quality of social media newsfeeds in the lead-up to the 2024 election, the author cited the Neely Social Media Index. Emerging evidence from the index highlights the impact of declining information quality on user engagement, suggesting that, in the long term, platforms may have an incentive to invest in content integrity. According to the USC Neely Social Media Index, 30 percent of adults reported seeing content they considered "bad for the world" on social media, particularly on platforms like X and Facebook.
Design Ideas at the 2024 Online Safety Forum in Nigeria
Our partners at Search for Common Ground and Build Up continue to champion the power of design in enhancing the global impact of technology platforms. Their efforts are particularly significant in contexts where civil society groups lack trust in governments to make fair content-related decisions. At the 2024 Online Safety Forum in Lagos, Nigeria, they showcased several innovative design ideas, including contributions from the Neely Center's Design Code for Social Media. Their session, titled "Designing for Good: The Role of Prosocial Tech Design in Ensuring Safety and Cohesion," explored how thoughtful tech design can promote safety and foster social harmony. Key takeaways from the session are available here.
EY.ai Global AI Advisory Council to Guide AI Strategy
EY invited thought leaders from business, government, and academia to join the EY.ai Global AI Advisory Council. This newly established council unites top thinkers to guide EY’s AI strategy and address the rapid technological and market shifts shaping AI today. The council’s focus spans several domains, including customer experience, talent, human behavior, and industry impact. We’re thrilled to share that our director, Nate Fast, has been invited to join the Council. Nate will contribute insights on ethical considerations in democratizing AI, joining a diverse group of leaders to help navigate the opportunities and challenges of this transformative field.
Council on Technology and Social Cohesion
Last week, the Council on Technology and Social Cohesion held its 6th workshop of the year in Waterloo, Ontario, in partnership with the University of Waterloo's Grebel Peace Incubator, the Centre for International Governance Innovation (CIGI), GoodBot, and the Balsillie School of International Affairs. Members of the Council presented a draft "Blueprint for Prosocial Design Governance." Through a series of talks and panels, including a talk by Nate Fast, Director of the Neely Center, participants discussed a system-wide analysis of the "Tech for Good" movement and toured Waterloo's renowned local prosocial tech ecosystem.
Thanks for reading. As always, if you have any thoughts, suggestions, or questions about our work, don’t hesitate to reach out!