Engagement Best Practices

Creating Dialogue and Deliberation in a Polarized Political Climate


Photo by Lucas Sankey on Unsplash

In the aftermath of the United States presidential election, bitter partisan divides have resulted in protests, hate crimes, riots, and other forms of strife across the country. Within the political system, polarization has had a serious effect: the most partisan members of the public are less likely to support compromise, and are more likely to have a disproportionate impact on electoral politics (e.g. primaries). Calls for unity and healing expect people to instantly look beyond months, even years, of deeply adversarial political discourse which frames the “other” as fundamentally different.

How can we have better and more productive conversations about politics that don’t seek to attack and defend, but to understand? In contrast to the extreme, winner-take-all nature of political discourse in the US, our friends at the National Coalition for Dialogue and Deliberation (NCDD) define these concepts as follows:

“Dialogue is not about winning an argument or coming to an agreement, but about understanding and learning. Dialogue dispels stereotypes, builds trust and enables people to be open to perspectives that are very different from their own. Dialogue can, and often does, lead to both personal and collaborative action. Deliberation is a closely related process with a different emphasis. Deliberation emphasizes the importance of examining options and trade-offs to make better decisions.”

1. Establish ground rules for participation.

Different people will have different expectations of what is acceptable, so you will need to define exactly what is appropriate and what will not be tolerated. While it’s important to establish a clear code of conduct for participants, it is even more important to enforce those guidelines consistently. For example, Twitter has come under fire for failing to deal adequately with online harassment on its platform and for its inability to enforce its own guidelines (e.g. abusive online behaviour, spamming, etc.). When moderators “model” appropriate behaviour and move swiftly to curb negative online behaviour, participants quickly learn how to contribute to a healthy discussion.

2. Listen first.

Entering the dialogue with an open mind is the most difficult step. You may find yourself thinking instinctively about how much you disagree, how wrong the other person is, or only listening at the most surface level in order to formulate arguments against what you’re hearing. This is a surefire way to sour the conversation and descend into old habits of attacking and defending.

While listening, do not spend this time tuning out arguments with which you disagree or thinking of rebuttals. Genuinely try to understand where the other party is coming from. Ask questions that seek to learn, not to challenge.

3. Find common ground and determine shared facts.

In a polarized world, people start to believe that the “other” is fundamentally different from them, and that there is no starting point from which they can even begin to understand each other. People are increasingly polarized on multiple dimensions, including marital preferences, where they choose to live, and more. They no longer see conflicting opinions with which they can disagree; they see the “other” as a threat to their way of life.

However, for all their differences, there is still common ground to be found. For example, the economy is the top issue amongst people of all partisan stripes. Where people differ is on the best way to improve the economy or create jobs. Too often the conversation jumps straight to where people differ, but fails to establish that there is a shared starting point for discussion: people, regardless of party, are concerned about the impact that the economy is going to have on themselves, their families and communities. Once both sides are able to find that common ground, it humanizes the other side. They are able to see the other person as someone like them, with hopes, aspirations and concerns, rather than just a caricature of a Democrat/Republican.

More challengingly, people often simply don’t trust the facts and figures that are being discussed – and for good reason. Governments have given citizens cause for mistrust, skepticism, and cynicism in the past, resulting in an overall lack of public trust. How do you get people to agree on the facts on which discussions can be built? Paradoxically, this might have to start with governments. Citizens will not trust the facts and figures as long as they believe that information is being kept from them. Governments can leverage increased transparency to prove themselves trustworthy. This includes Freedom of Information legislation, open data by default, real-time disclosures, and more. Empowering citizens to hold governments accountable is the first step towards rebuilding public trust.

4. Diminish the impact of bots, sock puppets, and other fake accounts.

How do you know the person you’re retweeting is who they say they are? Can you even be certain that you’re interacting with a human being? An analysis of tweets by Alessandro Bessi and Emilio Ferrara found that 20% of political tweets sent during the US presidential debates were produced by bots, primarily meant to influence or manipulate political discussions. The researchers noted several key consequences of bots in political discourse, including increased political polarization and the spread of misinformation or unverified information. Bots aren’t interested in (or capable of) rational thought or discussion. By their very nature, bots pump out simplistic information in a unidirectional fashion and fail to engage, which only adds to the existing noise around politics.
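Bessi and Ferrara’s actual classifier is far more sophisticated, but two of the signals commonly used to flag bot-like accounts are easy to illustrate: unnaturally fast posting and highly repetitive content. The sketch below is a toy heuristic, not the researchers’ method; the threshold and weights are invented for illustration.

```python
from collections import Counter

def bot_likelihood(timestamps, texts):
    """Crude bot-likeness score in [0, 1] from two signals:
    posting rate and repetitiveness. Purely illustrative.

    timestamps: posting times in seconds, sorted ascending
    texts: the corresponding post bodies
    """
    if len(timestamps) < 2:
        return 0.0
    # Signal 1: average seconds between posts (bots post unnaturally fast).
    avg_gap = (timestamps[-1] - timestamps[0]) / (len(timestamps) - 1)
    rate_score = 1.0 if avg_gap < 60 else 0.0  # 60s threshold is invented
    # Signal 2: share of posts that are exact duplicates of the most common one.
    most_repeated = Counter(texts).most_common(1)[0][1]
    dup_score = most_repeated / len(texts)
    # Equal weighting is arbitrary; real systems learn these weights.
    return 0.5 * rate_score + 0.5 * dup_score
```

An account blasting the same slogan every few seconds scores near 1.0; a person posting varied messages hours apart scores near 0.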

Twitter and other popular social media platforms allow anyone to publish content without having to verify their identity – or even whether they’re a real person. Facebook’s annual report shows that up to 11.2% of its accounts could be fake, and that the company itself struggled to determine which accounts were fake or real. This includes duplicate accounts: people create multiple accounts under various pseudonyms, even adding fake pictures, and act with impunity online. When users cannot be held accountable for what they say, it is easy to act upon knee-jerk reactions instead of taking the time to consider the impact and influence of one’s words.

5. Develop platforms which, by design, encourage dialogue and deliberation.

Facebook has come under fire after the election for the ease with which misinformation is shared and propagated. While Mark Zuckerberg has claimed that “of all the content on Facebook, more than 99% of what people see is authentic,” the company’s own reports have demonstrated that upwards of 10% of accounts could be fake. By design, Facebook facilitates knee-jerk reactions and interactions which require little effort or thought. Its algorithms deliver content that aligns with users’ pre-existing views while filtering out opposing viewpoints. Sensational content, regardless of its truthfulness, is easily shared without context at the click of a button.

These features are the antithesis of dialogue and deliberation, and civic technology has a duty to tackle these challenges to civil, reasonable discourse. How can we build tools which, by design, encourage listening first before speaking? How can we develop platforms that take into account the values of considered judgement, or reasonableness? This means incorporating features and functions which encourage users to be more informed and educated. This also means finding ways to curb negative online behaviour such as trolling, harassment, or cyberbullying, which decrease the overall quality of discourse and deter people from participating.

Here at PlaceSpeak, we have found digital identity authentication to be an effective deterrent to negative online behaviour. We’ve succeeded in creating dialogue and discussion on challenging and controversial topics, without a single troll, by getting people to stand behind the statements they make. We’re not the only ones working on this issue. For example, Civil Comments requires users to rate the quality and civility/rudeness of two other comments before they can participate. They must then rate their own comment according to the same metrics before it is published. The hope is that self-reflection will result in more nuanced dialogue instead of simply dropping a comment.
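Civil Comments’ internal implementation isn’t public here, but the flow described above – rate two peer comments, then self-rate, then publish – can be sketched as a simple gate. The function and parameter names below are our own invention, not Civil Comments’ API.

```python
def submit_comment(text, peer_ratings, self_rating, publish):
    """Gate a comment behind peer review, in the spirit of Civil Comments.

    text: the user's draft comment
    peer_ratings: civility scores (1-5) the user just gave to two
        randomly assigned peer comments
    self_rating: the score the user then gave their own draft
    publish: callback that actually posts the comment
    """
    if len(peer_ratings) != 2:
        raise ValueError("rate two peer comments before posting")
    if self_rating is None:
        raise ValueError("rate your own comment before posting")
    # The ratings themselves would feed a moderation queue (not shown);
    # the act of rating is what prompts self-reflection before posting.
    return publish(text)
```

The design point is that the gate costs honest participants a few seconds, while forcing everyone to read and judge others’ civility before adding their own voice.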

How else can civic technology facilitate dialogue and deliberation in a polarized political climate? How can we roll back partisan discourse and engage genuinely with the issues, and our fellow citizens? We’d love to hear your solutions and what you’re working on in the comments below.


1 Comment

  1. Carol Zeising December 4, 2016

    Mary, I thought the article on creating dialogue in a polarized political climate was excellent. Some of the very things you suggested are what I try to do when engaged in these types of conversations. If everyone tried this we would not have such rudeness as displayed in Facebook comments on postings.

