Imagine doing tech research… but from outside the tech industry? What an idea…
More like this: Nodestar: Turning Networks into Knowledge w/ Andrew Trask
So much of tech research happens within the tech industry itself, because it requires data access, funding, and compute. But what the tech industry has in resources, it lacks in independence, scruples, and a public interest imperative. Alix is joined by Brandi Guerkink from The Coalition of Independent Tech Research to discuss her work at a time when platforms have never been so opaque and funding has never been so sparse.
Further Reading & Resources:
Disclosure: This guest is a PR client of our consultancy team. As always, the conversation reflects our genuine interest in their work and ideas.
**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**
Governments around the world are using predictive systems to manage engagement with even the most vulnerable. Results are mixed.
More like this: Algorithmically Cutting Benefits w/ Kevin De Liban
Luckily, people like Soizic Pénicaud are working to prevent the modern welfare state from becoming a web of punishment for the most marginalised. Soizic has worked on algorithmic transparency both inside and outside government, and this week she shares her journey: from incrementally improving these systems (boring, ineffective, hard) to escaping the slow pace of government and looking at the bigger picture of algorithmic governance, and how it can build better public benefit in France (fun, transformative, and a good challenge).
Soizic is working to shift political debates about opaque decision-making algorithms to focus on what they’re really about: the marginalised communities whose lives are most affected by these systems.
Further reading & resources:
**Subscribe to our newsletter to get more stuff than just a podcast — we host live shows and do other work that you will definitely be interested in!**
Seeing is believing. Right? But what happens when we lose trust in the media put in front of us?
More like this: The Toxic Relationship Between AI and Journalism w/ Nic Dawes
We talked to Sam Gregory, a global expert and leading voice on this issue for the past 20 years, to get his take. We started way back in 1991, when Rodney King was assaulted by four police officers in Los Angeles. Police brutality was (and is) commonplace, but something different happened in this case: someone caught it on video with a camcorder. It changed our understanding of the role video could play in accountability. And in the past 30 years, we’ve gone from using video as evidence and advocacy to AI slop threatening to seismically reshape our shared realities.
Now apps like Sora provide impersonation-as-entertainment. How did we get here?
Further reading & resources:
**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**
Post Production by Sarah Myles
What happens when AI models try to fill the gaping hole in the media landscape where journalists should be?
More like this: Reanimating Apartheid w/ Nic Dawes
This week Alix is joined by Nic Dawes, who until very recently ran the non-profit newsroom The City. In this conversation we explore journalism’s newfound toxic relationship with AI and big tech: can journalists meaningfully use AI in their work? If a model summarises a few documents, does that add a new layer of efficiency, or inadvertently oversimplify? And what can we learn from big tech positioning itself as a helpful friend to journalism during the Search era?
Beyond just accurately relaying facts, journalistic organisations also represent an entire backlog of valuable training data for AI companies. If you don’t have the same resources as the NYT, suing for copyright infringement isn’t an option. So what then? Nic says we have to break out of the false binary of ‘if you can’t beat them, join them!’
Further reading & resources:
**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**
Mozilla Foundation wants to chart a new path in the AI era. But what is its role now, and how can it help reshape the impacts and opportunities of technology for… everyone?
More like this: Defying Datafication w/ Abeba Birhane
Alix sat down with Nabiha Syed to chat through her first year as the new leader of Mozilla Foundation. How does she think about strategy in this moment? What role does she want the foundation to play? And crucially, how is she stewarding a community of human-centered technology builders in a time of hyper-scale and unchecked speculation?
As Nabiha says, “restraint is a design principle too”.
Plug: We’ll be at MozFest this year broadcasting live and connecting with all kinds of folks. If you’re feeling the FOMO, be on the lookout for episodes we produce about our time there.
Further reading & resources:
**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**