Transcript of the "Stories of Impact" podcast episode

Upholding Democratic Values in the Internet Age with Dr. David O’Brien

Tavia Gilbert: Welcome to Stories of Impact. I’m producer Tavia Gilbert, and in every episode of this podcast, journalist Richard Sergay and I bring you conversations about the newest scientific research on human flourishing, and how those discoveries can be translated into practical tools. This season of the Stories of Impact podcast explores the vital question of citizenship in a networked age: how the internet has impacted the meaning of community, how information is newly interpreted with digital tools, and how our civic understanding has changed with the involvement of big data. Today, we’ll hear from David O’Brien, Assistant Research Director at the Berkman Klein Center for Internet and Society at Harvard University. Dr. O’Brien talks to Richard about how we as citizens can bring our democratic values to bear on social media platforms; the role privacy plays online in allowing democracy and democratic citizenship to flourish; and more.

Richard Sergay: Citizenship in a Networked Age. Why don't we start there? What does that mean to you?

David O'Brien: Well, it's a timely question to ask, I think. This is the question of our time. I've always thought of citizenship as being the sum total of our experiences and our relationships with one another, a kind of a sense of community that exists at the local level, at the regional level, at the national level, maybe even globally these days. And the Internet, of course, has dropped a lot of those barriers. It's no longer just the people in our immediate vicinity, but we can connect with anyone around the world pretty seamlessly. So citizenship is all of those things rolled together and how we make decisions collectively and for our small communities that we're a part of.

Richard Sergay: And citizenship in a networked age—what does that mean to you?

David O'Brien: I think it's very different. If we look back to past decades, I imagine, including before I came to exist on this wonderful earth, that citizenship had more to do with what was immediately around us. And like I said, I think those barriers have dropped quite a bit over time. And the Internet has kind of forced us to think a little bit more globally at times, has put us into the shoes of other people, but it's also had a tendency to separate us. It's facilitated, I think, finding other people just like yourself as much as it has inviting you to step into other people's shoes from wherever they might be. And I think that's something that profoundly marks this era as well.

Richard Sergay: There's a term that's been bandied about for a while because of the Internet, that we are now global citizens. Is that relevant to today? Does that mean something to you?

David O'Brien: I think it does to me personally, and I think it ought to mean something to all of us. I don't know that it necessarily resonates with everybody, but yeah, I think that we're far more able to dig into the affairs of other countries, of other parts of the world. We're able to actually peer into those communities and to see what's going on. It's easy to break down language barriers now, thanks to things like Google Translate, where you can, at the click of a button, translate a webpage completely in a second to a language that you understand. And certainly we've become more global in this way. The tools are there, I guess that's to say. Whether we are, in all ways, in practice, more of a global community, I guess I'm not really sure. Sometimes I would say yes, and sometimes I would say no.

Richard Sergay: Does citizenship in a networked age demand a different type of engagement in community than pre-Internet days, in your mind?

David O'Brien: It has to, and look, the Internet has really broken down a number of things. It's been, in some ways, a democratizing force, right? It's enabled us to organize together in spaces that didn't used to exist. It's made it very easy to communicate with large groups of people. And whether you're on the supply side of that or on the consuming side of that, I think it has changed just about everything. You go onto the Internet these days, and you want to find out what people are thinking about a particular issue, chances are, you fire up your Twitter feed, right? And you're suddenly getting your fire hose, as it were, of everybody's thoughts, in some cases rather raw thoughts, about whatever the latest thing is that's going on. And that makes it quite different, I think, than in the past, where, you know, we had just a few voices, the Walter Cronkites of the world, telling you what the news was every night at 6 PM or so. There are many more of those voices now than ever before. It's in some ways a little bit less of a professional environment, although certainly there are still professional news organizations, and there are still politicians and so on, but it definitely has changed quite a bit. It's noisier. And the signal-to-noise ratio is definitely not very high.

Richard Sergay: So this cacophony of voices that you refer to, how does that impact citizenship in the 21st century?

David O'Brien: I think it puts the burden on individuals more than it has in the past, and to some extent also groups of people, to figure out what's what. It's up to you now to use your own devices to sort of parse things like what's truth and what isn't, what's opinion and what's actual fact. And that makes it pretty different than in the past as well.

Richard Sergay: Many say that we are living in our own echo chambers because of the Internet, and citizenship therefore is defined by values which we adhere to in these echo chambers. That can be quite dangerous, can't it?

David O'Brien: It can. There's a tendency we've found in some of our research at the Berkman Klein Center, and in research generally in the field, that these days people's opinions are strongly shaped by how they perceive their own world, and they tend to seek out opinions just like theirs more than anything else. That seems to be intensifying over time. And it's almost like a filter, right, that gets put in front of people at times, and they're able to filter out, I think, opinions that might not resonate, opinions with which they might not agree. And I think it has a very polarizing effect.

Richard Sergay: In fact, some say we live in this almost post-truth or post-fact society because of the Internet. Would you agree with that?

David O'Brien: That's definitely changed a lot, especially in the last four years. And while there's lots of research on this, it's hard to pin down exactly what the problem is. So, first, let me just say yes, I think I completely agree we're in some sort of post-truth era now where everything is contestable as a factual matter.

We're also seeing right now, in the middle of an ongoing pandemic, the scientific process actually playing out in real time. That's very different, I think, than how it would have gone years ago, and it is a rough process, a lot of experimentation where we try to divine truth at the end, a scientific truth of some kind. And along the way, you might end up with bad hypotheses. And we're seeing now, as we sort out coronavirus, its effects on people, what we know about the virus, how it affects us, what the best treatments are. Even just in the last six months, many times over, it feels like the truth about this public health crisis has changed quite a bit on the ground. Should we wear masks? Should we not wear masks? How many feet do we stay apart? Can you get reinfected? Are there any homeopathic treatments that would be appropriate for it, and so on? And that's just a small example, of course, and a public health example at that.

And then there's this whole political environment, which has really complicated things, where everything can be politicized now, including public health issues that, you know, five, ten years ago, it felt like would have been a little bit more grounded in science and in medicine. Whereas today, things are a little bit more a matter of opinion about how someone looks at something. And to be fair, that's always been there a little bit. In some ways it's a question of policy, whether you, for example, implement a mandate that everybody wear a mask, because there are a lot of equities that, say, a government would have to sort out. And yet at the same time, there is still a scientific grounding underneath all of this, one that, as we've seen, has sort of shifted a little bit in the last six months. And what technology and the Internet have given us is that we are seeing all of this, right? In real time, just sort of playing out. And it's certainly opened up a lot more conversation about it—not always good conversation; there's bad information all over the Internet, it's a very disorderly environment, and the quality of information isn't always very high either.

So are we in a post-truth era? Yes, absolutely. What's the next era that comes after this one? I don't know. Hopefully it's something that, you know, has a better factual basis to it, but certainly, more than ever, you can seek out the facts that you want to support your views. And that's something that seems to be backed up a little bit by the idea of echo chambers, too, or filter bubbles, as some people have described this, where really you are kind of shaping your own information environment through your preferences. And somehow, the technologies that we use to communicate are in a way reinforcing that. And that does seem, itself, very problematic, too. We haven't really settled this out. But it's changing just in the way that technology is changing, too. I don't think that it will stay like this forever, but again, the big question seems to be what comes next.

Richard Sergay: You referred to this pandemic as obviously a public health crisis, but I can bring it back to citizenship in terms of the good of the community. If half the population isn't treating it as a public health crisis and isn't wearing masks, etc., it has an enormous impact on the good of the community. So what's that say to you about citizenship today? Is it irrevocably broken?

David O'Brien: Well, I sure hope it isn't. I'm still a little bit optimistic that despite whatever rocky time we're going through right now, we can eventually turn this around in a meaningful way. Again, I think it goes back a little bit to people sort of defining their view of the world however they want, according to whatever values they have. And that seems to have an outsized impact on the way that people right now are relating to each other as citizens, where there seems to be a part of the population here in the United States that perhaps would not promote as strongly some of the public health measures that the other half would. That goes back to the mask-wearing mandates problem that we've seen playing out, which, of course, is just one of many examples, and one made harder because it's politicized. And so I think that kind of marks the era in a way. It can have a lot of detrimental effects. It's hard to know exactly how some of this will resolve. I think that, over time, parts of it will work themselves out. And part of it might take actual interventions, whether those are governmental interventions or interventions on the part of a private-sector company, like one of the Internet platforms. And that, too, is part of the question here: how do we try to right the ship a little bit?

Richard Sergay: I was thinking back to 9/11, where coming out of that catastrophe, there was a common sense of purpose in the United States, a national sense of purpose, whereas with the pandemic, for some reason, that common sense of purpose for the common good seems to have been lost in translation. So, there's a 20-year gap there. What do you think happened in terms of our civic understanding?

David O'Brien: I'm definitely not the expert on this topic, so what I'm going to say here is very anecdotal and a little bit personal to me. It does feel, though, that there is a part of the population, maybe for a number of reasons, that feels like it's been left a little bit behind in terms of where the country has gone with its policies. And part of this seems to be attributable to globalization. Some of this all ties back, again, to sort of technological development over time, but the promises made to certain generations, I think, didn't really meet their expectations. Just a personal example: My father-in-law served in the Navy, and after he got out of the Navy, he was able to attend college and eventually got a master's degree, became an engineer, worked for IBM for 25 or 30 years or so, and was able to retire with a pension. That is such an uncommon story today. It's so interesting to me, too, because nobody stays at a company or an employer for that long. Nobody gets pensions anymore either. So I think some of the economic realities are, in a way, catching up a little bit with some of the policies and the directions that technology has pulled us in and tugged us along. And so I personally think that that probably has something to say about where we are now. I would really leave it, I guess, to some of the political scientists out there, though, to help us understand where we are.

And just another quick point, too, since we're hot off the election, and this is again a little bit personal, but I think it's something that a lot of people are feeling right now: it feels like we don't really have a good grip on where we are as a population politically at the moment. And I'd point to what looks a lot like a failure in polling for the second time in four years, where the way people were talking about how they expected the election to come out didn't really pan out, at least in a way that matched that narrative. And that I think is interesting, too, and maybe it even speaks a little bit to the polarization and the division that we're feeling right now, because expectations, again, aren't really being met. And it does seem, with such a close election, that we are kind of far apart from at least half the population, no matter which side we're on. And that's a little bit troubling, because it doesn't feel like we have a good grip on the status quo either.

Richard Sergay: So let's get back to your expertise, which is around the Internet and privacy. One of the recommendations of the citizenship report is to promote the value of privacy for personal moral development. What does that recommendation mean to you?

David O'Brien: You know, I often get asked this question by students: What is privacy good for? Why does it matter? Why do we have it? And there are a lot of ways that we can go about answering that question. But I think one that might resonate, especially with younger people right now, has a little bit to do with how you experienced part of your teenage years, which might have been, as it was for me, spending a lot of time in your bedroom and just thinking about things, contemplating the world. Of course, that's changed a little since I was a teenager; social media didn't exist back then, whereas now you're still connected to the world. But being able to discover yourself, being able to ask questions, not just of your parents, but just in general, seeking out ideas, thoughts, inquiry, having these freedoms, I think, really shapes an individual. And we need to have spaces like this in order to sort of become the people that we ultimately are in adulthood, and even beyond as we continue to live, and without them, I think we lose something. So that's one thought around privacy and why it might matter and how it could be tied back to that ideal.

Another is that I once had a computer scientist, a really smart person on the faculty at Harvard's School of Engineering, ask—and he wasn't joking either, "Why don't we have laws that self-execute in the way that a computer program would, such that if you violate a law, you automatically have to endure the consequence of that law, whatever it is?" And to me, the answer in part ties back again to privacy. It's because we want some ambiguity in the law. We want people to be able to explore ideas that might be a little bit uncomfortable and to seek out thought on different things as a way of changing society over time. That, in a way, is more feature than bug, right? We want to be able to shift norms over time. And we can point back to a number of things that have happened even within the last 20 years, from the acceptance of the gay community and of marriage looked at from a more universal perspective. This is something that took decades to change, but it's one that sort of marched forward, and it started in a very private sort of way, allowing the community to more or less develop, and over time, people became more comfortable with it. Or look at some of the changes in drug laws around the country. Now marijuana is legal for recreational sale in a really large number of states. That would have been a crazy thought back in the '90s, that that was actually on the horizon. And yet here we are. And again, it's a thing we allowed to change over time without having that sort of self-executing law. And so an ideal like this, I think, helps preserve some of that capacity for change, because it is something that we want. We don't want to have a community that sticks to just the mainstream.

Richard Sergay: Your colleague, Mark Zuckerberg, who was then at Harvard and started Facebook there, said, privacy is dead. There is no privacy on the Internet. Now, his views have changed somewhat over the years. What does it mean, if you can't assure privacy, for citizenship in a networked age?

David O'Brien: I think it puts all of what I just said a little bit at risk. I would counter a little bit, though: I think that statement, that privacy is dead, can both be true and not true at the same time, believe it or not, because I have always taken that statement to mean that we have the potential with computers and with the Internet to know more than ever about people as they use programs, as they go to websites, as they communicate with their friends and family and colleagues and so on. But that doesn't necessarily mean that there are always invasions happening. It means that the potential is always there. And so there's this sort of ethical barrier that needs to exist, and practically does exist, where just because you can build systems that are capable of this style of tracking doesn't mean that you have to, or that you should. And if you look at how Mark Zuckerberg's perspective seems to have changed over time, I think he ultimately is very pro-privacy and trying to claw some of that back. Some of the changes he's announced at Facebook within the last few years reflect that, with Facebook moving to a more private model, where their goal over time seems to be to take things like the messages that you send between people using Facebook Messenger and construct them in a way where the company is not actually able to see those communications. However—

Richard Sergay: Let me just jump in and say, isn't it like Pandora's box at this point? To have started the conversation one way about what privacy is and isn't, and now to be trying, in your words, to claw it back. To use another cliche, the genie's out of the bottle on this one.

David O'Brien: Yes. Yeah. That's absolutely true. And you know, another reality that I have personally learned over time, just from experiencing everything in this world, is that you can't turn back the sands of time. Once change starts, we can't ever hope to go back to the days of having the Walter Cronkite-like figures giving us the news. That's totally impractical, and no matter how much we might idealize that, that's unlikely to be the future. We need to find ways of coping with what comes next. To that end, that's where I go back to thinking about things like first principles and having a set of values to start from and to build upon, and to try to implement into systems, into architectures, that help preserve those ideals in different ways. And so, you're absolutely right. It is a Pandora's box and there are tons of issues with this, and maybe we'll even get a chance to talk about some of them, but there are trade-offs in having a more pro-privacy model versus some of what we have now.

Richard Sergay: I mean, in fact, one of the issues, and this was early on in the Internet, is that anonymity on the Internet poses its own set of problems.

David O'Brien: And it continues to, as a matter of fact, and it's both a benefit and a detriment at the same time. Famously, in the early 1990s, the New Yorker had that wonderful cartoon, “no one on the Internet knows that you're a dog”—that's more true than ever today, it seems like. So look at some of the questions that we're grappling with, some of which were a little bit more prominent during the 2016 election: the question of whether there are Russian agents online holding themselves out to be American citizens, but organizing political events and gatherings in really polarizing ways. There's this baseline question about authenticity: What is authenticity? How do we know if we have it or not? Can we quantify it? Who has a right to have an audience and to influence? Those are huge questions right now. And so, in a way, some of the questions around anonymity tie back to this, but for good reasons, we also want anonymity. And we have to remember the Internet is a global phenomenon, right? It exists everywhere. And there are political dissidents in autocratic regimes who rely on the ability to be anonymous, or at least pseudonymous, in order to promote their own ideas of what democracy ought to be in the countries where they reside, without having the governments so easily step in and take them away to put them in jail or, in some cases, to kill them.

Richard Sergay: So, since you are a lawyer, and the First Amendment is something you know well, I want to come to the question of how these platforms, which consider themselves technology platforms and not media platforms, start to rein in what otherwise might be called “free speech.” And the best example, and the most current example, is what we're seeing with President Trump, as we go through the final count of the US election, where Twitter and Facebook are flagging facts in question. So how do you draw that line between free speech, the technology platform, and facts? It's a vague sort of thought, but they're all intertwined.

David O'Brien: I think this is an excellent question. And let's just start maybe with a baseline, a little jog back to how the First Amendment works. The focus of the First Amendment is actually on government action, right? "Congress shall make no law," which means that it's the US government that is generally not able to infringe on freedom of expression. Platforms, right, the Internet platforms, private companies, are not subject to the First Amendment. But over time, what they've tried to do as they've developed what first started as their terms of service, that wonderful contract that you'd be presented with when you signed up for the service—

Richard Sergay: —which no one reads—

David O'Brien: —exactly, right. And then later what they call their community standards, which also can at times look a little bit contractual, by the way. But it's a set of rules, sometimes loose guidelines, of how they govern their service online and how they expect you to act—

Richard Sergay: —which no one understands. Go ahead—

David O'Brien: [laughs] Right, exactly. They've really tried to embrace a sort of Western democratic view of freedom of expression. We're pretty unique as a country here in the United States; a First Amendment equivalent really does not exist in most other places. It's also interesting to compare US foreign policy, which at times has really promoted some of these platforms abroad in the hope that they would be democratizing forces promoting things like freedom of expression. What we found out over time, though, and to be honest it was not really a surprise to a lot of people, is that these can be really toxic ecosystems where, in part because of that “no one on the Internet knows that you're a dog” kind of problem, people are willing to do all sorts of things online that they probably wouldn't do in person, especially if you knew who they were, if they were a member of your community. And that ranges from just saying awful things to people, to hate speech, to acts that might incite violence, to being manipulative and deceptive. And then of course, there's this broader problem, right, around facts and sharing information, sharing low-quality information or information that is intentionally architected to deceive and to manipulate and to pull people in different directions, to sow discord, and so on.

So they've evolved a set of guidelines over time, and the problem has always been one of scale. Take Facebook, because they are the largest of all the large social media platforms: they have something around 3 billion users, right, which is almost half the world's population, if you count each of those people as just being one. So it's quite a lot of people. How do you start to moderate global conversations when you're dealing with a scale of posts that is on the order of billions every day? It's impossible; you can't do this with human reviewers alone. You have to set up automated tools to try to do it and to use a combination of things. And that's kind of how this works now. So they have these guidelines. Over time, you know, they developed them first internally, where they were private, and gradually they started making them more public. And we're at a point now where they will actually issue a quarterly report about what they're finding on their platform and what actions they're taking.

And more recently, within the last couple of years, that's included labeling information, labeling posts on the platforms to try to contextualize them in some ways. They all do it a little bit differently, but I'll just give you a couple of quick examples. On YouTube, if you go to a YouTube channel like RT, or Russia Today, you'll now find a little box at the bottom that says that RT is funded by Russia, right? It's essentially a Russian news organization, and that's supposed to help you as an individual, you know, know something contextual about what it is that you're watching, although they won't go much further than that. And then more recently we've seen fact-checks. So Facebook has a partnership with these fact-checking organizations, and it's a little bit complicated how it works, but there's a queuing system where Facebook is able to identify sort of low-quality posts or information, things that are questionable, that it passes off to a third-party fact-checker, which then sort of looks it up and comes up with an answer, and then might label something as being accurate or inaccurate or of a low reputational quality.
And that then affects how that post might appear in your feed as a user. If it's marked as being inaccurate, for example, it might be pushed down to the bottom of your feed. And Twitter has something very similar. And that, too, has been in the news a lot, right, with respect to the president's tweets. That tends to be where we hear about him most. Twitter, too, will label posts, for example where the facts haven't really settled out. We're still in the wake of the election, and we're seeing a lot of this right now, where the president is alleging that there's been voter fraud. Each of those tweets gets labeled in a way where it says that this is a disputed claim—again, trying to contextualize information for other users who might happen upon the post.
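The pipeline described here, flag questionable posts, route them through a queue to a third-party fact-checker, attach the resulting label, and demote disputed content in the feed, can be sketched in a few lines of code. The following is a minimal, illustrative sketch only; the names (Post, moderate, rank_feed) and the demotion penalties are hypothetical assumptions for the example, not any platform's actual API or policy.

```python
# Illustrative sketch of a label-and-demote moderation pipeline (hypothetical names).
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Post:
    text: str
    engagement: float            # baseline ranking signal
    label: Optional[str] = None  # e.g. "disputed", "false", or None

def moderate(posts: List[Post],
             looks_questionable: Callable[[Post], bool],
             fact_check: Callable[[Post], str]) -> None:
    """Queue questionable posts for a third-party fact check and record the label."""
    for post in posts:
        if looks_questionable(post):
            post.label = fact_check(post)  # e.g. returns "disputed" or "false"

def rank_feed(posts: List[Post]) -> List[Post]:
    """Demote labeled posts so they appear lower in the feed."""
    penalty = {"disputed": 0.5, "false": 0.1}  # assumed demotion factors
    return sorted(posts,
                  key=lambda p: p.engagement * penalty.get(p.label, 1.0),
                  reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Election results certified by state officials", engagement=80),
        Post("Massive voter fraud changed the outcome!!!", engagement=95),
    ]
    # Toy heuristics stand in for the platform's classifiers and fact-check partners.
    moderate(feed,
             looks_questionable=lambda p: "fraud" in p.text.lower(),
             fact_check=lambda p: "disputed")
    for post in rank_feed(feed):
        print(post.label or "no label", "->", post.text)
```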

Richard Sergay: So I'm curious, since you mention Cronkite, who was one of my icons as a journalist growing up, and we used to call him the voice of God, which we no longer have; we now have this plethora of information that spans the Internet. As you think about citizenship going forward, are you an optimist or a pessimist in terms of the harm that the Internet has done, or not, to what citizenship means in the 21st century?

David O'Brien: Oh, that's a tough question. I think I'm a little bit pessimistic at the moment, we'll put it that way.

Richard Sergay: Or a realist? Either way, you know.

David O'Brien: Yeah, yeah. Realist and pessimist combined, I think. So I'm sure, you know, as a journalist, that this is something you've experienced, too, over time. Our media ecosystem used to be very different. And there's no question that it was the Internet that changed a lot of this. It's shrunk newsrooms, it's completely gutted local news; that's a completely different landscape today than it used to be. And yet at the same time, there's more news than ever, which is just sort of interesting. And so I think it has taken out the professional quality of a lot of the things that we see online. And then on top of that, there's this other trend that's happened over time, where commentary, and to some extent opinion, has been mixed in a little bit with the reporting, and it's hard for the average person to discern the differences between these things.

When I was in college, one of my favorite books that I read in a communications class was How to Watch TV News. It was a short book, just about a hundred pages or so, that gave this wonderful breakdown of how news has become this business. And in a way it mimics a lot of what we see on the Internet, too, where it's trying to keep you, especially cable news is trying to keep you, engaged constantly, to keep your eyeballs on the screen. "And now this!" and "Breaking news" this, and I'm sure you're familiar with all of this. And I don't know that it's led to a better quality environment overall. And of course, it's not just cable news, although I think that's a core part of the problem; the Internet has had this effect, too.

And, surprisingly, or maybe not so surprisingly, anyone can have an audience on the Internet, huge audiences. You look at some of the people who use TikTok these days, right, which is popular with the young kids. You know, I was just reading an article the other day about one of the TikTokers who now has more than a hundred million followers. And that's just an unbelievable number, far greater than people probably could have expected back in the Walter Cronkite days or a little bit thereafter, and a lot of power to influence. And I don't think we've really settled out the norms around how we use these technologies, how we think of influence and reach. And this has affected almost everything. If you look at YouTube, another great example, where a lot of people are now producing just from their basements, right, product placement, the way people advertise, this has all changed. And it's really hard, again, for users to discern a lot of these things; there aren't disclosures that are set up, there isn't a good legal framework. And even if there were, I'm not sure that it would be easy to enforce it, because it's so much different than it was, because there are so many more producers of information. We used to call this “citizen journalism” back in the early days of the blogosphere; we're way past that now. This looks like something very different. And I have a hard time even pinning a word on exactly what it is and how it's changed.

Richard Sergay: Looking forward, though, in terms of being a good citizen, it seems, at least here in the United States and elsewhere, the rise of nationalist populism has cornered the market in a way, not that we didn't see it in the 1930s in Germany, but it's accelerated because of these tech and media platforms. What does it mean for citizenship going forward?

David O'Brien: I think it's incumbent on all of us, as individuals and as a community, to find ways to adapt ourselves here. Because technology is this fantastic breaker of norms, and it's reset everything at such a rapid clip, we've had a hard time catching up to this as a society. It's like the next thing comes out before we're really able to adapt ourselves. And so I think I would start there, by saying that, one, we really need to be cognizant of just this fact, because again, in the same way that we can't go back to the Walter Cronkite days, we can't expect to be able to really roll back the technological dial to a different time, either. But that isn't to say we can't do things as individuals. I mean, first, and this is something I really like about the report, we need to become better listeners of each other. That speaks a little bit to that partisan divide that we see right now in the country. We need to find ways to relate to one another as well, and we need to come up with norms. And this is going to be the hardest part, I think: developing norms at a faster rate that better matches these technological changes over time. And we need to ask more of the companies, right, that offer some of these services. Another thing that has changed over time, and it speaks to what you said about Mark Zuckerberg having changed his opinion of how he looks at his own company, is that they are now a little bit more aware of these problems. The question of what to do about it and how to go about it, I think, is still a tricky one for a lot of them, as they try to navigate a complicated environment of regulation and governments, not to mention just angry mobs of people who don't like the things that they're doing. But we still need to keep pressing them, I think, to do more and to make sure that technology isn't just benefiting their bottom line, but is also doing something to enhance citizenship meaningfully.

Richard Sergay: So if I hear you correctly, we all have to find more common ground.

David O'Brien: I think that's right. It's going to take, and I hope this isn't a hollow phrase, a whole-of-society kind of response; that's really what's required right now. We need change at every single level, in other words. And I think it would help to first go back to those first principles, to reevaluate the ones that we've relied on in the past. And again, looking at this report, looking at some of the recommendations it has, I think, is at least a good starting point. It will be a monumental effort to figure out how to encode these things into, say, technology and the Internet and the platforms. But certainly, we need to start with a common agreement about what it is that we're trying to achieve here if we're going to hope for this to succeed.

Richard Sergay: David, thank you very much. We appreciate your time and insight.

David O'Brien: It was a pleasure. Thank you.

Tavia Gilbert: I want to reiterate Dr. O’Brien’s last point. He says, “We need to start with a common agreement about what it is that we’re trying to achieve here if we’re going to hope for this to succeed.” That closing point leaves us with plenty of space to define what “this” is. This year, I’ve often thought that not only has 2020 brought us a public health crisis, a financial crisis, and a crisis of leadership and politics, but we’re also experiencing a failure of imagination. What is it that we, as citizens — whether we’re thinking locally, nationally, or internationally — imagine for a better world? What kind of world do we want to create for ourselves and those generations that follow us? O’Brien recommends first principles, and in my memory, the first principle I learned, as so many of us do, is, “Do unto others as you would have them do unto you.” Perhaps that’s the place to return, to begin again, to recreate a planet that truly serves all of its inhabitants.

We’ll be back with another episode in two weeks, continuing our focus on Citizenship in a Networked Age. Richard will speak with Filippo Trevisan, assistant professor at the School of Communication and Deputy Director at the Institute on Disability and Public Policy at American University. Dr. Trevisan talks about our evolving views on social media and offers insight on how we can use—and how we are already using—the Internet as a tool to improve representation in our democratic process:

Filippo Trevisan: You know, we've talked about all of the negatives of these platforms, but there are a lot of positives as well, and, you know, I see it. A lot of my work relates to how people come together to advocate for better health policy and healthcare, disability support. That's what a lot of my research looks at. And, you know, those are voices that tend to be sidelined in a lot of debates, but find strength in numbers and are able to find each other online in ways that were impossible before the Internet due to a variety of barriers.

Tavia Gilbert: We look forward to bringing you the full interview with Dr. Trevisan in that next episode. In the meantime, if you liked today’s Story of Impact, we’d be grateful if you’d take a moment to subscribe to the podcast, rate and review it, and if you’d share or recommend this program to someone you know. Your support helps us reach new audiences. You can hear all of our podcasts at storiesofimpact.org, and you can find Stories of Impact videos at templetonworldcharity.org/our-impact. This has been the Stories of Impact podcast, with Richard Sergay and Tavia Gilbert. This episode written and produced by Talkbox and Tavia Gilbert. Assistant producer Katie Flood. Music by Aleksander Filipiak. Mix and master by Kayla Elrod. Executive producer Michele Cobb. The Stories of Impact podcast is generously supported by Templeton World Charity Foundation.