under the sea code monkey

Agency and Trust in a Digital Democracy

1 Share

Last week I was on a panel about ‘Democracy and the Digital Commons’ at Suffolk University.  At the start of the panel, each of us gave a 5-10 minute talk to help frame the discussion.  While there’s no transcript of the panel itself, here are my notes for the intro.  (Quick context: each of us tied our talk to the Boston Marathon bombing and in particular Reddit’s response to it.)

The internet was supposed to bring about a new world, a better world in which everyone had a voice. With the rise of blogging and social media, the machinery of publishing was being democratized. And more democracy is always good, right?

Well, not necessarily. Democracy has its flaws and its dangers, just like any other system of governance. As Winston Churchill said, “Democracy is the worst form of government, except for all those other forms that have been tried.”

I want to focus on one particular long-understood flaw of democracy, which is this: in a democratic system, every voice counts equally, even though some voices are much better informed on a given topic than others.

We can view what happened after the marathon bombing through this lens. Reddit led a crowdsourced manhunt – another way of saying that is that they led a democratized manhunt. Anyone, regardless of their experience with intelligence investigations, could contribute to the discussion. It’s not surprising, then, that the investigation went awry – that the crowd failed to consider how misidentifying suspects could harm innocent people, or how a public manhunt might influence the behavior of the perpetrators. The vast majority of the people participating had no training or experience in the matter, so why would they know?

Here’s a question for the room. A loved one of yours has been mysteriously murdered, and you have to choose who investigates – the local police, or a community of people on Reddit. Which do you choose?

Now I want you to imagine that the prime suspect is a local police officer. Who do you want to investigate – the local police, or a community of people on Reddit?

Here we have a great tension. It’s better and more effective to delegate power to those with the expertise to use it. But it’s only better and more effective when the experts act fairly and in the best interests of those who’ve delegated power to them. And there’s no way to be 100% sure that they’ll act in your best interests. This is called the Principal-Agent Problem.

We can see the Principal-Agent Problem at work with the issue of misinformation more broadly. Techno-Utopians have championed social media as a way to make every user a publisher. But there’s real value in the training that journalists get. There are cultural standards they internalize and then enact, including verification processes, disclosing conflicts of interest, and a refusal to plagiarize or fabricate quotes.

I for one am relieved to be able to delegate this work to journalists at ProPublica or the New York Times or the Boston Globe. I may not agree with what they choose to report on or the opinions in their editorials, but I trust that they’ll follow journalistic norms. I trust them to act on my behalf, as my news-gathering agents. But not everyone does. There’s been a concerted effort for decades, which has risen to a fever pitch over the last few years, to portray them as biased, as liars, even, lately, as traitors deserving of violence.

If we stop viewing journalists as our “news-gathering agents”, who replaces them? We’ve got to trust someone to gather our news, because I sure am not capable of doing it all for myself. So who do we trust instead?

One response is, “We’ll trust those in power, we’ll trust the President”, which, no matter what party you belong to, should give you pause.

Another, more optimistic response is to say, “The crowd! We’ll trust the crowd.” In other words, screw agents – let’s all be principals.

But what does the crowd give us? It gives us clickbait passed around every corner of Facebook. It gives us waves of abuse and harassment on Twitter. It gives us lies that spread faster than truth.

It turns out this result is not very satisfying! So there are calls for someone to exercise power and try to fix these problems. People ask Facebook to stop the spread of misinformation. They ask Twitter to stop abuse on their platform. And in doing so, they’re asking the tech platforms to act as their agents.

The tech platforms are hesitant to do this, I think rightly. “Who are we to determine which journalists are legitimate and which are not?” Facebook asks. “Who are we to determine what’s rudeness and what’s abuse?” Twitter asks.

But if not them, then who? They’ve designed their platforms as a meeting ground of millions of principals rather than a place where people can delegate responsibility to agents. The platforms don’t empower people to address these problems, so the only solution is to appeal to the people behind the platforms. And behind the scenes, of course, these companies are profoundly non-democratic.

And so you end up with a site like Twitter, where many users feel coerced into letting Twitter act as an agent on their behalf despite having no mechanisms to hold it accountable. So Twitter has power its users don’t want to grant it, and that Twitter itself doesn’t want to use, but that must be used for the commons to remain remotely functional.

So how do we move forward? I have three main ideas.

First, I think we need to change how we design our digital platforms. Web applications are governance systems made of electricity and silicon rather than ink and parchment. When you ban a person from your website, it’s not that different from asking the sheriff to walk that no-good rascal to the edge of town. And if we view web platforms as systems of governance, then we can see just how naive and inadequate sites like Twitter or Facebook or Reddit are. The use of the phrases “upvoting” and “downvoting” on Reddit seems almost insulting. Users aren’t upvoting a person to represent them in a specific situation or downvoting a proposed policy they don’t want to see adopted. Or over on Twitter – people have been using external tools like shared blocklists for years to try to establish some semblance of collective control that the platform itself refuses to grant them.

Specifically, I think we need to design systems that encourage us to delegate power to those we trust – voluntarily, and revocably. Because we need agents that we can trust to act on our behalf, but we also need ways to withdraw power from those we no longer trust. If we can do this on our platforms, we won’t have to beg for intervention from the people behind the platforms.
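To make the idea concrete, here is a minimal sketch, in Python, of what voluntary and revocable delegation might look like as a data structure. All of the names here (`DelegationRegistry`, `delegate`, `revoke`, `agent_for`) are hypothetical illustrations for this talk’s idea, not any existing platform’s API.

```python
class DelegationRegistry:
    """Tracks which agent, if any, each principal has chosen per topic."""

    def __init__(self):
        # Maps (principal, topic) -> agent they currently trust.
        self._delegations = {}

    def delegate(self, principal, topic, agent):
        """Voluntarily grant an agent power to act for `principal` on `topic`."""
        self._delegations[(principal, topic)] = agent

    def revoke(self, principal, topic):
        """Withdraw power from an agent the principal no longer trusts."""
        self._delegations.pop((principal, topic), None)

    def agent_for(self, principal, topic):
        """Return the chosen agent, or the principal themselves if none."""
        return self._delegations.get((principal, topic), principal)


# Usage: delegate moderation decisions, then withdraw that power later.
registry = DelegationRegistry()
registry.delegate("alice", "moderation", "trusted_mod")
print(registry.agent_for("alice", "moderation"))  # trusted_mod acts for alice
registry.revoke("alice", "moderation")
print(registry.agent_for("alice", "moderation"))  # power reverts to alice
```

The key design property is in `agent_for`: when no delegation exists (or one has been revoked), power defaults back to the principal, so users are never locked into an agent they no longer trust.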

Second, we need to nourish existing systems of trust and adapt them to online spaces. A lot of tech industry rhetoric has centered on replacing trust; blockchain, for instance, is supposed to be trustless. But humans will always have to trust each other, and we’ve developed some pretty good cultural norms and social systems to facilitate that trust. We shouldn’t just throw them away.

Which brings me to my third point. Our legal system has a solution for the Principal-Agent Problem. It doesn’t always work, but it does help a lot of the time. This solution is the concept of fiduciary duty. This is what requires doctors to act in the best interests of their patients, lawyers to act in the best interests of their clients, and bankers to act in the best interests of their customers. Why not require platforms to act in the best interests of their users?

Nothing we do is going to permanently solve the Principal-Agent Problem. There will always be some amount of misinformation and abuse in our digital commons. But that’s not an excuse to turn away from the issue. By thinking carefully and compassionately about these problems we can improve our approach to them.

Read the whole story
codersquid
3 days ago
reply
chicago
Share this story
Delete

Comment on Tolerating Uncertainty by Shauna


I did! I love that essay:

“When human beings in Shakespeare’s plays reach irritably after fact and reason in their dealings with one another, they break, or break the world. Certain truths and chains of logic bind souls, and drive people mad. There is no room for slack or play within them. Hamlet’s hunger for certainty drives him to the edge of suicide; Lear’s demand that his daughters prove their love undoes his kingdom. The isolated mind uses all its tools and power to protect and justify itself—but so long as its judgment is driven by suspicion and fear, it will never be able to diagnose its own flaws.”

And later:

“But as a man of systems and facts, once he was infected with distrust he could not convince himself to trust again. Every use of logic justified the wound in his heart. The world became a conspiracy against him.”

If you require perfect certainty to live in harmony with others, then you will always be at odds with the world. We have to be willing to be wrong about each other in order to occasionally be right. I saw on Twitter that there was a woman who had a goal of getting 100 rejection letters, because it meant she was taking risks with her writing. Perhaps I should have a goal of feeling disappointed with 100 people, as a sign I am taking risks with my trust.

1 public comment
duerig
12 days ago
This is very interesting. Logic doesn't lead us to truth. Logic simply leads us to the ends our assumptions imply. I think this is how smart (sometimes genius) people often end up with absurd beliefs.

When we are open to others and their ideas, they push the small boat of our beliefs in different directions. Even when we reject those directions, they help keep those mental waters from stagnating.

If I isolate myself from others, if I stop learning because I believe I have found truth, it doesn't matter how logical I am. I will slowly drift in the direction that the stagnant currents take me. There will be no correction, no vitality. Step by step I will move deeper into absurdity, the logic and intelligence will simply build in that same direction. Until one day I wake up and look around and I have drifted off into absurdityville and believe something I would have scoffed at in my earlier days when I still looked for new things to learn in the world.

Of course, it can be even worse when I surround myself with people who all have the same direction of drift as I do. At that point, we not only drift but are actively assisted into the absurd by those around us.

Comment on Liberty is a gift from mankind by Shauna


Thanks for linking me to Luis’s talk! It’s great, I’m sorry I missed it at the time.

I agree that it’s a different way of getting at the same set of ideas. I’m not sure what the barrier is to libertarian-leaning folks adapting the capability approach to liberty, other than a lack of empathy. (It’s easy to ignore the importance of capability when your circumstances have allowed you to develop your abilities in a way that’s satisfying and rewarding to you.)


Super Princess Saves the Night


Almost five years ago now, frustrated by the lack of trans-inclusive children’s books that my best friends’ kids had as options, I sat down and wrote a story about a tiny trans/gender non-conforming superhero named Super Princess.  It’s so exciting to finally be able to put this book out into the world.  This is a story about the magic of empathy and the importance of approaching the world with love instead of fear.  

All profits are being donated to the Trans Women of Color Collective.

Learn more about Super Princess and buy the book here.


Democracy as an information system


Democracy is an information system.

That’s the starting place of our new paper: “Common-Knowledge Attacks on Democracy.” In it, we look at democracy through the lens of information security, trying to understand the current waves of Internet disinformation attacks. Specifically, we wanted to explain why the same disinformation campaigns that act as a stabilizing influence in Russia are destabilizing in the United States.

The answer revolves around the different ways autocracies and democracies work as information systems. We start by differentiating between two types of knowledge that societies use in their political systems. The first is common political knowledge, which is the body of information that people in a society broadly agree on. People agree on who the rulers are and what their claim to legitimacy is. People agree broadly on how their government works, even if they don’t like it. In a democracy, people agree about how elections work: how districts are created and defined, how candidates are chosen, and that their votes count—even if only roughly and imperfectly.

We contrast this with a very different form of knowledge that we call contested political knowledge, which is, broadly, things that people in society disagree about. Examples are easy to bring to mind: how much of a role the government should play in the economy, what the tax rules should be, what sorts of regulations are beneficial and what sorts are harmful, and so on.

This seems basic, but it gets interesting when we contrast both of these forms of knowledge across autocracies and democracies. These two forms of government have incompatible needs for common and contested political knowledge.

For example, democracies draw upon the disagreements within their population to solve problems. Different political groups have different ideas of how to govern, and those groups vie for political influence by persuading voters. There is also long-term uncertainty about who will be in charge and able to set policy goals. Ideally, this is the mechanism through which a polity can harness the diversity of perspectives of its members to better solve complex policy problems. When no one knows who is going to be in charge after the next election, different parties and candidates will vie to persuade voters of the benefits of different policy proposals.

But in order for this to work, there needs to be common knowledge both of how government functions and how political leaders are chosen. There also needs to be common knowledge of who the political actors are, what they and their parties stand for, and how they clash with each other. Furthermore, this knowledge is decentralized across a wide variety of actors—an essential element, since ordinary citizens play a significant role in political decision making.

Contrast this with an autocracy. There, common political knowledge about who is in charge over the long term and what their policy goals are is a basic condition of stability. Autocracies do not require common political knowledge about the efficacy and fairness of elections, and strive to maintain a monopoly on other forms of common political knowledge. They actively suppress common political knowledge about potential groupings within their society, their levels of popular support, and how they might form coalitions with each other. On the other hand, they benefit from contested political knowledge about nongovernmental groups and actors in society. If no one really knows which other political parties might form, what they might stand for, and what support they might get, that itself is a significant barrier to those parties ever forming.

This difference has important consequences for security. Authoritarian regimes are vulnerable to information attacks that challenge their monopoly on common political knowledge. They are vulnerable to outside information that demonstrates that the government is manipulating common political knowledge to their own benefit. And they are vulnerable to attacks that turn contested political knowledge—uncertainty about potential adversaries of the ruling regime, their popular levels of support and their ability to form coalitions—into common political knowledge. As such, they are vulnerable to tools that allow people to communicate and organize more easily, as well as tools that provide citizens with outside information and perspectives.

For example, before the first stirrings of the Arab Spring, the Tunisian government had extensive control over common knowledge. It required everyone to publicly support the regime, making it hard for citizens to know how many other people hated it, and it prevented potential anti-regime coalitions from organizing. However, it didn’t pay attention in time to Facebook, which allowed citizens to talk more easily about how much they detested their rulers, and, when an initial incident sparked a protest, to rapidly organize mass demonstrations against the regime. The Arab Spring faltered in many countries, but it is no surprise that countries like Russia see the Internet openness agenda as a knife at their throats.

Democracies, in contrast, are vulnerable to information attacks that turn common political knowledge into contested political knowledge. If people disagree on the results of an election, or whether a census process is accurate, then democracy suffers. Similarly, if people lose any sense of what the other perspectives in society are, who is real and who is not real, then the debate and argument that democracy thrives on will be degraded. This seems to be Russia’s aim in its information campaigns against the US: to weaken our collective trust in the institutions and systems that hold our country together. This is also the situation that writers like Adrian Chen and Peter Pomerantsev describe in today’s Russia, where no one knows which parties or voices are genuine, and which are puppets of the regime, creating general paranoia and despair.

This difference explains how the same policy measure can increase the stability of one form of regime and decrease the stability of the other. We have already seen that open information flows have benefited democracies while at the same time threatening autocracies. In our language, they transform regime-supporting contested political knowledge into regime-undermining common political knowledge. And much more recently, we have seen other uses of the same information flows undermining democracies by turning regime-supported common political knowledge into regime-undermining contested political knowledge.

In other words, the same fake news techniques that benefit autocracies by making everyone unsure about political alternatives undermine democracies by making people question the common political systems that bind their society.

This framework not only helps us understand how different political systems are vulnerable and how they can be attacked, but also how to bolster security in democracies. First, we need to better defend the common political knowledge that democracies need to function. That is, we need to bolster public confidence in the institutions and systems that maintain a democracy. Second, we need to make it harder for outside political groups to cooperate with inside political groups and organize disinformation attacks, through measures like transparency in political funding and spending. And finally, we need to treat attacks on common political knowledge by insiders as being just as threatening as the same attacks by foreigners.

There’s a lot more in the paper.

[This short piece was co-authored with Bruce Schneier, and originally appeared at Lawfare.]


670 ballots in a precinct with 276 voters, and other tales from Georgia's primary

https://www.msn.com/en-us/news/politics/670-ballots-in-a-precinct-with-276-voters-and-other-tales-from-georgias-primary/ar-BBLBUA4

WASHINGTON - Habersham County's Mud Creek precinct in northeastern Georgia had 276 registered voters ahead of the state's primary elections in May.

But 670 ballots were cast, according to the Georgia secretary of state's office, indicating a 243 percent turnout. Georgia is one of four states that use voting machines statewide that produce no paper record for voters to verify, making them difficult to audit, experts say.

  Difficult indeed. Coincidentally (we hope), 83% of the county vote was for the outgoing secretary of state, Kemp.

  It really only takes one story like this to prove the larger proposition that unauditable electronic voting machines are a menace to democracy. Only obvious errors like this bubble to the surface; who knows what goes on in other cases?