Taiwan’s former Digital Minister Audrey Tang and economist Glen Weyl share an infectious techno-optimism. In a surprisingly spiritual conversation, they tell us why they want to invite more, not less, technology into the democratic process. We also talk to Bianca Wylie from Digital Public about the limitations of tech as a tool of salvation.
Resources
Group Chat is funded by the Rideau Hall Foundation and the Flanagan Foundation.
Transcript
[00:00:00] Sabreena Delhon: Hi, I'm Sabreena Delhon. Welcome to Group Chat, where we make sense of what's happening in our democracy with a few friends.
[00:00:10] Glen Weyl: Actually, there was an earthquake in New York just after we left there too. So I started to get other feelings of divine intervention. Audrey, literally leaving earthquakes in her wake.
[00:00:20] Audrey Tang: Yeah. His Holiness the Dalai Lama says that's how those good ideas break earth, right? When things break earth, the earth is going to be shaken a little bit.
[00:00:31] Sabreena Delhon: Audrey Tang was Taiwan's first digital minister, a hacker turned politician, and now a tech emissary. She's the co-creator of the book, Plurality: The Future of Collaborative Technology and Democracy, along with Glen Weyl, an economist and the founder of the nonprofit, RadicalxChange. I don't think I'm exaggerating when I say the future of democracy depends on how we use technology and our values around it. Their vision of plurality is infectious in its techno-optimism. It invites technology into the democratic process for community and consensus building, not outrage and profit.
[00:01:16] Audrey and Glen get into plurality in more detail and talk about tech in spiritual and maybe even challenging ways. You will also hear from a Group Chat alum throughout the episode–Bianca Wylie.
[00:01:32] Bianca Wylie: Hi Sabreena, great to be here!
[00:01:34] Sabreena Delhon: Bianca is a partner at Digital Public, a digital rights agency, and someone I paint as more of a techno-soberist.
[00:01:44] But first, here's Audrey Tang and Glen Weyl, and their case for how technology can strengthen democracy. I want to start with how we think about democracy. You both believe we need to look at it in a different way, as a technology. Can you tell me what that means to you, democracy as a technology?
[00:02:09] Glen Weyl: I think technologies are things that we expect to get better and to sometimes work and sometimes not, and you expect to have bugs all the time. You expect to have viruses, and you expect to be doing something against those. So these are all the things that like, there's gonna be attacks on the democratic system, there's gonna be stuff going wrong and like you gotta do something about that. And so I kind of feel like it would be great if we could have more of that attitude.
[00:02:38] Audrey Tang: Yeah, and there's also the other side, which is to technologists. Many technologists want to contribute to a society where they have more meaningful conversations with people, have a meaningful public square. Many people in the early days of the internet felt that way. And so they would also like to contribute toward that direction as well. So by saying democracy is a technology, it is also an invitation to technologists to contribute.
[00:03:06] Sabreena Delhon: So how does thinking about democracy as a technology change our practice of democracy and change our practice of politics?
[00:03:14] Audrey Tang: Yeah, you can think of democracy in terms of bandwidth. That is to say, how richly can you provide your input to the collective decision making, the collective agenda setting? Now, if it is just one vote every four years, then that vote among eight candidates is just three bits of information. And you can also analyze it in terms of latency. How long do you have to wait until the next meaningful input? And for many systems, that's one year, or two years, or four years, right?
[00:03:49] And so if you think of democracy as a technology, then naturally, you would ask, are there ways to involve more people in a shorter time period, ways that show the ongoing divisions, that show the things people are in dispute over, without having to wait for four years? You can have shorter rounds and you can have richer conversations.
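To make the arithmetic concrete, here is a minimal sketch in Python, purely illustrative and not drawn from the book: choosing one of eight candidates carries log2(8) = 3 bits, and the "latency" between elections sets how many bits of citizen input per year that channel allows.

```python
import math

# Information content of one ballot: choosing 1 of 8 candidates
# carries log2(8) = 3 bits.
candidates = 8
bits_per_vote = math.log2(candidates)

# "Latency" = how long until the next meaningful input.
# Bandwidth here is bits of citizen input per year.
for years_between_votes in (4, 2, 1):
    bandwidth = bits_per_vote / years_between_votes
    print(f"Vote every {years_between_votes} year(s): "
          f"{bandwidth:.2f} bits/year of input")

# Richer, more frequent channels (deliberations, surveys, Polis-style
# conversations) raise both the bits per interaction and the frequency.
```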
[00:04:11] Sabreena Delhon: Mm-hmm. Well, you present a joint vision for the present and the future called plurality. Picture yourself at 10, 11, 12 years old. How would you explain plurality to your younger self?
[00:04:26] Glen Weyl: Well, I happen to have a two-and-a-half-year-old and a five-year-old daughter, so it's something I have to try to do a bit. I guess I would say that there are lots of different types of people in the world, and it's sometimes hard to talk to them, it's sometimes hard to work with them because we don't understand them. The reason that we're making all these computers and so forth is to allow us to work with and understand them better. That's the point of all of this.
[00:04:58] Audrey Tang: Yeah, that's great. I think the fact that we're now having these conversations across a lot of space and across a lot of time also, that is somehow taken as a given.
[00:05:11] But before the internet, it was not a given at all. So just the ability to connect across unlikely distances, that by itself is very, very powerful. And plurality is about saying that it is not just the distance in terms of time. It is also the distance in terms of values, cultures, ethnicities, ideologies. There's all sorts of ways to inter-network between those different pockets.
[00:05:41] Sabreena Delhon: So what makes plurality different from other paths that we've explored or that we're on when it comes to technology?
[00:05:49] Audrey Tang: Well, the internet was the kind of pinnacle of thinking when it comes to plurality. So in a sense, we have always been on the plurality path since the internet started. It's just that there are now a few applications on the internet that dominate people's attention, people's time, people's imagination, while the rest of the internet is experimenting a lot in terms of how to collaboratively fact check, how to do collaborative journalism, how to make sure that you can always bridge common knowledge across people who hold very different political views, and so on. All of these are part of the internet. So in a sense, plurality is just to say: look, there are corners of the internet that carry the original promises of the internet, instead of the attention-seeking, addiction-building corners that command our attention. There are other ways that we can spend our time, and when we spend our time there, we feel much more empowered and much more democratic.
[00:07:04] Glen Weyl: And, you know, to a large extent, all we're doing, I think, is returning to some just like very basic eternal truths. Gandhi often said that there are some very basic, very broadly held truths that we just forget over and over again. And you know, one way I like to put it is, if you just look at the Christian Bible, the New Testament, it says, you know, what is proscribed? Idolatry. Don't build an artifact that you place above your creator in your hierarchy of worship. That's really the core of what, you know, we have an issue with, in a lot of technology. So if we just apply those ideas to technology, I think we'd be in very good shape. And you know, really all the book is doing is just kind of like working out systematically, like what does that mean?
[00:07:57] Sabreena Delhon: In reading your material and preparing for today, I felt that current of faith running through what you have developed together, and it made me think about how third spaces have decreased. They're not as available as they used to be; people aren't practicing faith as much as they used to. Other third spaces are eroding, and that's part of democratic backsliding right now. So I did wonder about how intentional you are with that as well. Like how explicit are you, can you be, in these types of conversations about drawing on faith, going back to first principles, you know, can you speak to that?
[00:08:38] Audrey Tang: Yeah. Taiwan is the second most religiously diverse place in the world. I'm a Daoist. My grandparents are Catholics on my father's side, and there's also a lot of Buddhism, especially Tibetan, in our family, and so on. So I think the fact that we can easily talk about how those spiritual traditions influenced our technical design enriches the conversation around design, around technology. Because otherwise, you're basically giving up those very rich metaphors and ideas, and you're forced to use a very sterile vocabulary to talk about possibilities. And then you just build one large skyscraper or something, with choices of elevators and elevator music, but they're all very vertical; there's no cross-linking between the various cultures in a transcultural way. So I do think these spiritual or faith-based metaphors or languages serve a very important transcultural role that bridges across different cultural practices.
[00:09:53] Glen Weyl: Yeah, I think often the metaphor that a lot of technologists have in their mind is that, you know, you're on top of the earth. And what science and technology are letting us do is dig down, throw away superstition, get to the core of truth at the center. And I would encourage us to think in an opposite way–that we're on the surface of a sphere and we're building trees out into the infinite abyss of mystery. And the farther those trees grow out, the more we can glimpse that mystery. The more we slip the surly bonds of Earth to touch the face of God.
[00:10:32] Sabreena Delhon: Okay, listeners, let's come back down to Earth. Bianca Wylie, let's bring you in here. You've joined us on Group Chat before. Thanks for being here again.
[00:10:40] Bianca Wylie: Thanks for having me.
[00:10:41] Sabreena Delhon: So what do you think about the metaphor that Glen brings up here, that technologists see themselves as on top of the world drilling down, instead of seeing how technology can help us look up?
[00:10:56] Bianca Wylie: So I think the idea of where are we looking from is a good consideration, right? And there's this phrase, “the view from nowhere.” You know, like something's happening, but we're not naming where and who in the action. So I think it's a really good idea for us to consider the vantage point. But then I also think, the next step to that, is getting down to the material. Who are we talking about specifically? What are the relationships specifically? Because I think there's a bit of escape potential when you go into the abstract. And so I think I'd offer that as a balance, you know, as a “yes and” to the idea of thinking about the vantage and the, you know, where we're looking from.
[00:11:39] Sabreena Delhon: So it seems like part of this is a bit of a notion of purity in technology and that's kind of being, I don't know, troubled right now in a really intense way. What do you think is the purpose of technology?
[00:11:55] Bianca Wylie: It's relations, it's between people. And so the purpose of technology can be myriad. You know, there's a lot you can do with it, but first of all, you can't excise the fact that there's this human influence. So, to really go directly to “what is the purpose of technology?”, I think we have to know it's guided by human intention and values, which I think is aligned with what Audrey and Glen talk about. But I think it's the size and proportion of how much power we assign to people. ‘Cause for me, we gotta put the hand really heavy on the people part of that equation, and not escape that part by looking at the technology so much.
[00:12:32] Sabreena Delhon: This is really helpful, Bianca. Thank you so much. Right now let's go back to Audrey and Glen.
[00:12:40] I'm gonna shift gears now and play a clip from a young person that we spoke to recently, Ottavia Paluch. We asked her where she gets her information, including about elections. This is what she said.
[00:12:53] Ottavia Paluch: Being a Gen Z kid, Twitter has been enormous. And it's just a really easy way to get up to speed on what's going on in the world and in Canada. And I think, well, I know on social media, it's hard to distinguish stuff that's been vetted and fact checked from stuff that isn't, and when your friends and peers are posting stuff online that they assume is true but may not actually be true, it is concerning 'cause you might not realize it yourself.
[00:13:23] Sabreena Delhon: So you each are inviting more technology into the democratic process and that can seem like a risk in this time. How does plurality deal with AI-driven threats like deep fakes and misinformation?
[00:13:38] Audrey Tang: Yeah. First of all, the bridge-making algorithms we introduced in the book were prototyped and deployed in a very scalable, open-source way on Twitter, back when it was still called Twitter.
[00:13:55] Sabreena Delhon: Let me jump in here to break down a few things. Okay. What is a bridge-building algorithm? It's about finding common ground for people with different viewpoints. We're building bridges. This is different from algorithms that focus on polarization and outrage. Governments, for example, use platforms with bridge-building algorithms to collect public input for policymaking. Back to Audrey now.
[00:14:22] Audrey Tang: Now that it's called x.com, you can still join the Community Notes, which is the collaborative contextualizing system that Twitter uses, and it's explicitly modeled after and inspired by one of the earlier bridge-making algorithms we introduce in the book, also called Polis.
[00:14:44] The idea is that if there's a piece of context that is broadly accepted by people who would otherwise not accept the same things, then those things should be floated to the top, instead of only the more divisive things being floated to the top. And in general, I think Gen Z people understand that just because you see a video online, it doesn't mean that something like what the video showed actually happened. They all understand now that you can easily change faces, you can easily change voices. Some people are catching up to the fact that you can now do it interactively, like in real time, right? And so I think very soon, when everyone has the ability and the potential of just wearing anyone's face in real time, then we will switch to a very different norm. We'll switch to the old norm where you check the source. You understand that this comes from somebody who is very rigorous, like a journalist, like somebody who writes an academic paper, and things like that. So the source will be the main thing that you look for when checking whether the ideas in the video are true or not.
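For readers curious how a bridge-making algorithm can work in practice, here is a minimal, hypothetical sketch in Python. It is not the actual Polis or Community Notes implementation (those use more sophisticated clustering and matrix factorization); it only illustrates the core idea Audrey describes: group participants by how they vote, then surface the notes with the best worst-case approval across groups.

```python
from collections import defaultdict

# Toy vote matrix: votes[participant][note] = +1 (helpful) or -1 (not helpful).
votes = {
    "p1": {"note_a": +1, "note_b": +1, "pivot": +1},
    "p2": {"note_a": +1, "note_b": -1, "pivot": +1},
    "p3": {"note_a": +1, "note_b": -1, "pivot": -1},
    "p4": {"note_a": +1, "note_b": +1, "pivot": -1},
}

# Step 1: split participants into two opinion groups using a divisive
# "pivot" item (real systems infer the groups from the whole vote matrix).
groups = defaultdict(list)
for person, ballot in votes.items():
    groups[ballot["pivot"]].append(person)

# Step 2: a note's bridging score is its *worst* approval rate across
# groups, so only notes endorsed by both sides float to the top.
def bridging_score(note):
    rates = []
    for members in groups.values():
        helpful = sum(1 for p in members if votes[p].get(note) == +1)
        rates.append(helpful / len(members))
    return min(rates)

for note in ("note_a", "note_b"):
    print(note, round(bridging_score(note), 2))
# note_a scores 1.0 (endorsed by both groups); note_b only 0.5.
```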
[00:16:03] Glen Weyl: I would encourage us to get past the word “fact checking” and maybe use the idea of “fact building” a little bit more because it invites us to participate in a process and to understand that “what is a fact” is a socially co-created thing and it evolves, and it's never final. There's an expression from André Gide, which is, “admire those who seek the truth and fear those who have found it.” And I think fact checking puts us into that position of, you know, certainty, and fact building maybe into the position of exploration and joint seeking.
[00:16:43] Sabreena Delhon: Bianca, I wanna bring you back in here.
[00:16:45] Bianca Wylie: I'm here, Sabreena.
[00:16:47] Sabreena Delhon: What do you think about fact building as a collective process, and using technology to help us do that?
[00:16:55] Bianca Wylie: I think that fact building as a collective process is a piece of the puzzle, but it also points us to who's taking part in that collective process and who isn't, and I think we have to look at the moment we're in right now and see how deep we are in a technocracy. And I say this because we're looking at democracy and authoritarianism as the two major words to contend with, and that's real. I mean, we are living beside authoritarianism right now in Canada. And I think it's more than that, and this is pluralist again: this is how I'm living in the world, this is how you're living in the world, how do we coexist if we have different belief systems, different value systems? Because I just think that's a pragmatism that we need to embrace, and we need to get into harm reduction measures, because there are people who don't want to live in a society with people that don't agree with them.
[00:17:55] And to me, this is an important piece of consideration when we talk about coming together to do something collectively. I think from the times of neoliberalism to now, we need to understand, culturally, that there are a lot of people who are uninterested in the collective project, and you have to start your thinking from that point. You cannot start from the point of assuming everyone wants to come and participate.
[00:18:21] Sabreena Delhon: So what does this mean for mis- and disinformation? Can tech help with that?
[00:18:26] Bianca Wylie: With mis- and disinformation, I don't think the scale of the internet is conducive to the types of relationships where you can really move someone in their beliefs.
[00:18:39] So I don't think it's nothing, but in terms of the iterative process of how we make truth and how we understand information and how we learn, or how we change our minds, I do not think that scales. I think that happens in trusted relationships between people and sometimes between organizations. So, unsatisfyingly, the answer here is: it might be helpful, but it's also really difficult to weigh the helpful against the harms that are inherent to scale. Relationships that are healthy and deep are inefficient. And so I think we have to be pragmatic enough to say, yes, there's all this potential. But like policy, it's always about trade-offs.
[00:19:22] So I think a lot of people in technology have a hard time believing that that scale potential can't somehow be leveraged for good. And I think there's a difficult truth in the fact that relationships that are good and healthy and rely on trust, are inefficient and do not scale.
[00:19:37] Sabreena Delhon: So what's your overall take on the techno-optimism–can tech bring us together?
[00:19:45] Bianca Wylie: So when I think about this question, I think about democracy writ large, and I think one of the points Audrey made was about democracy being a technology. And so I ask myself, if we haven't been able to use the technology of democracy to bring us together, where are we getting the confidence that a more technocratic vehicle is gonna do it? Like that's where I struggle because I think all of these systems do have opportunities with them–whether it's technology systems, whether it's democracy systems–all of them do. But the truth is that what comes as a benefit from these systems comes on the back of responsibilities that we all have to each other.
[00:20:28] And I think, unfortunately, when I look at the history of liberal democracy and representative democracy, we have not been using that system to bring ourselves together well. We haven't been using that system to share our material wealth well. We haven't raised the floor to the point that everybody in Canada has a decent and good life. And we haven't done it with the democracy technology. So I am not bullish about the idea that we can push that failure into technical systems, which, of course, inherit our values and our approaches. So I think this is where you can't escape technocracy right now. It's failing us, whether it's in internet systems or whether it's in democracy. This technocracy is not working for everyone, and I think that's the moral and spiritual question of the moment we're in.
[00:21:25] Sabreena Delhon: Thanks, Bianca. Let's go back to Audrey and Glen. I wanted to speak with you both about news and material that's relevant to our conversation in Canada. So we saw the release of two significant reports about foreign interference in our democracy, along with some key findings. China was seen as, quote, a “prolific actor in local and federal elections.”
[00:21:49] And also that diaspora communities in Canada experience some of the most harmful impacts from foreign interference. This includes having their families threatened who live outside of Canada, and of course immigrants make up almost a quarter of the country's population. This is a pivotal moment for us right now. How do we leverage it to be a healthier democracy? What's your advice?
[00:22:14] Glen Weyl: Audrey's had the benefit of the prolific nature of Chinese actions online and has learned how to benefit from that prolific behavior. So maybe she has something to share.
[00:22:28] Audrey Tang: Yeah. The idea of prebunking is quite important. Reports, such as the one that you just cited, help make sure that before the interference, before the polarization attacks happen, people already see it coming. It's very important that we make sure that everyone in this society, even before those conspiracy theories, even before those artifacts of polarization appear, sees what they will look like.
[00:23:00] You can show actually, literally, what it looked like in other jurisdictions when they tried it before and so on. Because it is, as Glen said, not about the check mark that says, “oh, we've checked the facts.” It's not like that. That never inoculates a mind. It is the act of going through the fact building process, of starting it somewhere, that inoculates.
[00:23:25] So the idea is not that we fact check after the fact. The idea is that we take the kind of reports you just mentioned and turn them into viral ideas so that people already know, “oh, this is coming, and this is how you can spot it, and this is how you can counter it when it comes.” So for example, we already knew that there would be polarization attacks contesting the election process, the counting process, which is why we invite everyone with their cameras to attend the counting ceremony. So it's paper only, and it takes more than four hours, like five hours, six hours, to finish. But the thing is, every single paper ballot is captured by multiple cameras belonging to multiple party members from multiple angles.
[00:24:18] Sabreena Delhon: To jump in here, Taiwan's last presidential election was in January 2024. Instead of using technology to count ballots, they were manually and publicly counted. Poll workers read out loud and showed each ballot for public scrutiny. Observers said this level of transparency helped with public trust, a win for analog.
[00:24:42] Audrey Tang: And it's this radical participation and transparency that, when the attacks, the deep fakes and cheap fakes alleging election rigging, actually came the night after the election, enabled us to very quickly say, “see, we prebunked that, and nothing like that actually happened, if you look at the documents.”
[00:25:06] Sabreena Delhon: A few takeaways from Audrey are: no burying our heads in the sand; think about preemptive actions; and there's one more tool, a surprising one, that Audrey uses to fight misinformation–humor.
[00:25:23] Audrey Tang: So during the pandemic, we had a hotline. Now, in early 2020, we started rationing out masks. And at the time there was a young boy, I think 10 years old, who called the hotline saying, “okay, it's great you're rationing out masks, but all I got was pink ones. I don't want to wear these to school, I'm a boy, I'll get laughed at, right, I'll get bullied.” And so very quickly the people in each ministry started talking about how to respond to this challenge, right? And imagine if we did the debunking the top-down way. If the Ministry of Education comes out and says, “bullying is bad,” would that fix anything?
[00:26:03] It would not fix anything, right? But it's going to go viral. If we don't respond to this issue, then maybe a couple of days from that point, it becomes something that decimates, right, people's willingness to wear masks. So we have to do something. And so at the 2:00 PM press conference, everybody wore pink, regardless of gender. The Ministry of Health and Welfare, all the staff and so on, wore at least a pink mask. And then the fashion brands and everything you saw on social media, they all turned pink. And so overnight, pink became the most trendy color. And so the next day, I'm sure the boy was the envy of the class because only he had the rare collection of the colour that was the most hip in the nation. And so that's humor over rumor. It is a story that brings a smile–I see both of you smiling–to people. And it is viral in the sense that everybody can join, and it's like the ice bucket challenge, but more comfortable I guess, just by wearing something pink to join the movement.
[00:27:14] And so that not only stopped the bullying, it also stopped any polarization attack that our adversaries could mount around the use of masks, for example. If you prebunk with humorous memes and messages, then it captures the attention in a way that's not violent, and then the disinformation, the polarization, becomes much harder to mount around those topics.
[00:27:40] Sabreena Delhon: Very good advice. I've got another clip for both of you. This time we spoke with Julia Kamin from the Prosocial Design Network, an organization that aims to make the web a more respectful place.
[00:27:52] Glen Weyl: A good friend of ours, yes.
[00:27:53] Sabreena Delhon: Very good. I had a feeling. We asked her what she thinks is the biggest hurdle facing pro-democracy or pro-social technology. Here's what she said.
[00:28:02] Julia Kamin: It's not gonna be surprising to you, or to any listener, that most of the spaces where people are engaging online are tech platforms that are profit-driven. And the way they make their profits, by and large, is by increasing engagement. And the broad consensus view is that within these tech platforms, pro-social technologies don't necessarily increase engagement. And oftentimes, we hear, they view them as costs. So the challenge in getting them to adopt pro-social technologies is immense. And I think most folks believe that if we want tech companies to adopt pro-social technology, it's gonna have to come from harder pressures, such as legislation and regulation.
[00:28:55] Sabreena Delhon: So, Julia uses the word “immense” when it comes to big tech changing their ways, working for collaboration instead of profits. We've been talking about, you know, an ideological mind shift. Talk to me about government regulation. Is that part of the answer?
[00:29:14] Audrey Tang: First of all, I think there are some voluntary movements in this space. So even before we passed the Anti-Fraud Act in Taiwan, which I'll introduce in a moment, YouTube had already voluntarily adapted the Community Notes system from Twitter to YouTube. So nowadays on YouTube you have exactly the same bridge-building notes floating to the top when it comes to videos that may contain misleading information and so on. And I bet they introduced that because they have seen the research that shows this kind of recontextualization service actually increases quality without decreasing engagement. So it's a net positive for the YouTube ecosystem, and they introduced it. So there's that. But I do agree that without some sort of liability, they will not adopt this quickly across the industry. So Taiwan passed the Anti-Fraud Act, which says: now that we're going to see deep fakes or cheap fakes on those large platforms, if you post an advertisement on a platform that is large, that reaches more than 10% of our population, then no matter whether you're domestic or foreign, you need to secure it by getting a public-key signature. Not just from the person funding it, but also from the person appearing in it. So if it is Audrey Tang telling you to, I don’t know, buy the book Plurality, or something like that in the advertisement, then I will have to digitally sign it, or provide an attestation in some equivalent way, a provenance that says, “okay, I endorse this deep fake version of myself.” And if I do not provide such a signature, if the platform went ahead with it anyway, and if somebody gets conned out of $1 million, then in Taiwan, that platform will be liable for that $1 million. So this anti-fraud law is just the latest in a series of such regulations. And the great thing is that we didn't come up with this idea. This idea was crowdsourced using a technique called alignment assemblies.
[00:31:34] In a mini-public of 450 people, statistically representative of the Taiwanese population, they jointly came up with ideas such as this one, which is now law. And so when people feel very strongly about something, democracy needs to have the bandwidth for them to come together and figure out these kinds of things. And the latency also needs to be short enough so that, a few months after the deliberation, it becomes law.
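As a rough illustration of the attestation requirement Audrey describes, here is a hypothetical sketch in Python using the cryptography package. The key names, ad payload, and platform check are invented for illustration; the Act's actual formats and verification flow are not specified here.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The person appearing in the ad holds a keypair; the public key is known
# to the platform (e.g., via a verified account or registry).
endorser_key = Ed25519PrivateKey.generate()
endorser_pub = endorser_key.public_key()

ad_content = b"Audrey Tang endorses this advertisement for the book Plurality."

# The endorser signs the exact ad payload, producing a provenance attestation.
attestation = endorser_key.sign(ad_content)

def platform_accepts(ad: bytes, signature: bytes) -> bool:
    """Platform-side check: only serve the ad if the signature verifies."""
    try:
        endorser_pub.verify(signature, ad)
        return True
    except InvalidSignature:
        return False

print(platform_accepts(ad_content, attestation))           # True: serve the ad
print(platform_accepts(b"tampered content", attestation))  # False: reject it
```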
[00:32:02] Sabreena Delhon: Glen, any reflections on regulation and legislation and the kind of culture and ideological shift we've been exploring?
[00:32:09] Glen Weyl: Well, look, I think that there are good people doing this kind of work at all the companies, and they don't always get the support or the praise that they need. And part of that is like the corporate machine. But the reality is that is a little bit of a cop out. A lot of it is that it's not demanded by the public. The public doesn't understand it and the media doesn't highlight it. We can't just keep pointing fingers in a circle about who's causing the problem.
[00:32:33] We actually have to get together and raise up that work. And blaming the companies doesn't really accomplish that much. It's the lack of collective action, not the fault of the civil servants or the fault of the politicians. It's that none of us are getting together in any of those places and surfacing that good work. So, rather than pointing fingers, let's take that responsibility. You know, we say: don't ask why nobody's doing it. You're the nobody.
[00:33:06] Sabreena Delhon: Bianca, are you still there?
[00:33:09] Bianca Wylie: I'm here.
[00:33:10] Sabreena Delhon: So are we the nobody? How much power do we have as people to turn all of this around?
[00:33:18] Bianca Wylie: We are the nobody. But we also haven't used that well in our non-technology option set, would be the argument here. And as someone who has done work and works in public with participatory process, I believe in process. I believe we need to come together and make decisions. I think finding facts, finding truth, all of this work collectively, is very critical,
[00:33:40] it's important. But at the same time, our systems shouldn't demand that people be in the room for those systems to work for them. That's the whole effort behind anything that has this rights, responsibilities, liberal democracy frame. So this to me is that question about public responsibility. We still have to think about everybody and we have to think about everybody both when they don't participate and when we disagree with them. That to me is the calling of the people in a democracy. But it's a lot more work than what we're doing right now.
[00:34:15] Sabreena Delhon: Well, let's give you the last word here. Where do we go from here? What steps do you think we need to take to make technology work for us, for democracy?
[00:34:25] Bianca Wylie: So I'm gonna answer this question in the Canadian context. In the Canadian context, if everyone that lived in Canada had access to housing, access to food, a decent life, what they needed, and if we were doing repair in terms of nation-to-nation-to-nation federalism–if all of these conditions were met, technology could do a lot more.
[00:34:52] So I would just offer that for people who are enthusiastic about technology, that's fine, but then pour your energy now into raising the floor. Because if everyone has a decent life in Canada, if everything is healthy on that front, you can innovate away and there's a lot less fear of harm and there's a lot more opportunity to potentially accelerate some of what's working.
[00:35:14] But I would caution on the idea that it can somehow resolve those issues, which are deeply material. They have to do with money, they have to do with personal health, and they have to do with wellbeing. All of these things are critical to a good foundation. And if that's there, innovate away.
[00:35:38] Sabreena Delhon: A special thanks to Audrey Tang, Glen Weyl, Bianca Wylie, Julia Kamin, and Ottavia Paluch. And thank you for listening to Group Chat. I'm Sabreena Delhon, CEO of the Samara Centre for Democracy. Group Chat is executive produced by Debbie Pacheco. The Group Chat team also includes Farha Akhtar, Andrea Mariko Grant, and Beatrice Wayne. Theme music is by Projectwhatever. The Samara Centre for Democracy is a non-partisan charity that produces groundbreaking research, dynamic events, and educational resources that advance a vibrant culture of civic engagement across Canada. Donate to support our work and check out our other podcasts @samaracentre.ca. If you like us, help spread the word about our show. Subscribe, rate, and review this podcast. If you teach, share it with your class. A special thanks to the Flanagan Foundation and the Rideau Hall Foundation for their support.