President Barack Obama, with Facebook CEO Mark Zuckerberg, holds a town hall meeting at Facebook headquarters in Palo Alto, California, April 20, 2011. (Official White House Photo by Lawrence Jackson)
Recent revelations showed that Cambridge Analytica was able to obtain and use data from 50 million Facebook users to create psychographic profiles. This allowed them to deliver highly targeted and customized political advertising during the 2016 United States presidential election. The leaked blueprint for the Trump campaign indicated that they were able to “target 10,000 different ads to different audiences” in the lead-up to the election.
These findings raise significant issues around consent and privacy. The majority of users are not aware of how their data is used or what rights they have to their own information. According to one survey, 61% of respondents were “unaware of Facebook’s right to share their information with third parties.” While users need to take some responsibility for the information that they share with Facebook (and by extension, many third-party sites and apps), most Terms and Conditions are opaque and confusing, even for tech-savvy users.
The 2018 Edelman Trust Barometer highlights this growing mistrust in platforms such as social media and search engines: only 51% of respondents said that they trusted social media. Despite this, governments and decision-makers rely heavily on social media platforms such as Facebook and Twitter to reach the public.
A New Paradigm
While users debate whether to delete their Facebook accounts in the wake of these revelations, one thing is clear: we need a new paradigm for engaging with the public that goes beyond targeting them during the election cycle. We need to look at ways of building an inclusive civic network that facilitates and encourages ongoing political engagement between elections – one that treats the public as equal partners in the decisions that impact them and the places where they live, work, and play.
The following factors are crucial for building platforms that earn trust and encourage ongoing civic participation:
1. Be privacy-respecting
Privacy by Design (PbD) principles ensure that privacy measures are architected into the very structure of the platform. Instead of responding to a privacy breach, PbD works proactively to ensure that such circumstances do not happen in the first place. For example, PbD assumes that privacy is the default. User information is kept private and never shared with third-party individuals or organizations by default – users have to agree to make their information public.
This is in contrast to Facebook and other social networks, where users have to opt out if they don’t want their information to be shared or sold. The process of adjusting privacy settings can be challenging even for tech-savvy users. In the wake of the Cambridge Analytica scandal, articles instructing users on how to revoke third-party access to their Facebook data were rapidly shared as people struggled to navigate their privacy settings. Any platform that aims to build trust must ensure that users are clearly informed of, and in charge of, their own privacy and data settings.
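The privacy-by-default principle can be made concrete in code. The sketch below is purely illustrative – the class and setting names are hypothetical, not any real platform’s API – but it shows the key idea: every sharing flag starts off, and nothing becomes visible without an explicit, per-setting opt-in.

```python
from dataclasses import dataclass, field


@dataclass
class PrivacySettings:
    """Privacy by Design: every sharing flag defaults to off (private)."""
    share_with_third_parties: bool = False
    profile_public: bool = False
    allow_app_data_access: bool = False


@dataclass
class UserAccount:
    username: str
    privacy: PrivacySettings = field(default_factory=PrivacySettings)

    def opt_in(self, setting: str) -> None:
        """Sharing requires an explicit, per-setting opt-in from the user."""
        if not hasattr(self.privacy, setting):
            raise ValueError(f"unknown privacy setting: {setting}")
        setattr(self.privacy, setting, True)


# A new account starts fully private; nothing is shared until the user opts in.
user = UserAccount("alice")
print(user.privacy.share_with_third_parties)  # False: private by default
user.opt_in("profile_public")
print(user.privacy.profile_public)  # True: only after an explicit opt-in
```

The design choice worth noting is that privacy is enforced by the defaults themselves, not by a settings page the user must hunt down after the fact.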
2. Not be based on advertising revenue
In 2017, Facebook reported $39 billion in advertising revenue. While social media advertising has proven to be an effective means for increasing reach and influence, it has reinforced echo chambers and accentuated political divides, which have profound implications for citizen engagement and civic participation.
One element of Facebook advertising is its pay-per-click (PPC) model. Facebook charges, on average, between $0.20 and $0.80 USD each time a user clicks on a link, such as a targeted political ad. Advertising-based models are thus incentivized to show users content that aligns with their pre-existing viewpoints, which they are more likely to click on and engage with. The algorithm is also self-reinforcing: as users click on or interact with (e.g. like, “love”, or comment on) more of the same kind of content, they are shown more of it in the future.
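This self-reinforcing loop can be illustrated with a toy simulation. This is not Facebook’s actual algorithm – the numbers and the update rule are hypothetical – but it captures the mechanism described above: content a user already agrees with gets clicked more often, each click is rewarded with more reach, and the feed drifts toward a single viewpoint.

```python
import random

random.seed(0)

# Toy model of an engagement-driven feed with two content "camps", A and B.
# The user prefers camp A, so they click A-content more often; every click
# nudges the feed's mix further toward the clicked camp.
weights = {"A": 0.5, "B": 0.5}      # share of the feed shown from each camp
click_prob = {"A": 0.8, "B": 0.2}   # user's chance of clicking each camp

for _ in range(1000):
    # Show one item, chosen in proportion to each camp's current feed share.
    camp = random.choices(list(weights), weights=list(weights.values()))[0]
    if random.random() < click_prob[camp]:
        weights[camp] += 0.01        # a click is rewarded with more reach
        total = sum(weights.values())
        weights = {k: v / total for k, v in weights.items()}  # renormalize

# Starting from an even 50/50 split, camp A ends up dominating the feed.
print(weights)
```

Even though the feed starts perfectly balanced, the small per-click reward compounds: the preferred camp gets shown more, so it gets clicked more, so it gets shown more still – exactly the echo-chamber dynamic the advertising model incentivizes.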
A platform for inclusive civic engagement needs to ensure that all participants’ voices are given equal weight in the decision-making process – not only those of industry or lobby groups which can afford to pay to amplify their message. Any advertising-based model is inherently incentivized to reinforce existing biases, rather than expose the public to diverse perspectives. We are already seeing the results: echo chambers and partisanship have been exacerbated to the point where people on both sides of the aisle cannot agree on fundamental facts or statistics, let alone build consensus or engage reasonably with people they disagree with.
3. Facilitate healthy dialogue
Cambridge Analytica’s tactics may be effective at manipulating voter behaviour, but there is little evidence thus far on whether this has the capacity to sustain or encourage ongoing engagement. It is too early to tell whether people who were influenced by these highly targeted political ads will vote in the next election, or become more engaged in their own communities.
Genuine engagement starts with ensuring that the public is informed – that there is a shared basis from which dialogue can happen, both between decision-makers and citizens, and amongst citizens, across partisan lines. Once people are informed, they can start to make a meaningful impact on their communities, starting with voting and extending to other forms of civic participation, such as engaging with public consultation processes (in-person or online), co-creating solutions to local issues, starting local initiatives, attending community events, and more.
Healthy dialogue also depends on inclusive online spaces that are free of trolls, bots, astroturfing, and other methods of manipulating and undermining online democratic engagement. Cambridge Analytica’s tactics work because they tap precisely into how Facebook rewards knee-jerk reactions. How can we build tools which, by design, encourage listening before speaking? How can we develop platforms that take into account the values of considered judgement and reasonableness? How can we curb negative online behaviour, which decreases the overall quality of discourse and deters people from participating?
Here at PlaceSpeak, we have found digital identity authentication to be an effective deterrent to negative online behaviour. We’ve succeeded in creating dialogue and discussion on challenging and controversial issues by getting people to stand behind the statements they make. We’re not the only ones working on this issue. For example, Civil Comments requires users to rate the quality and level of civility of two other comments before they can participate. Meanwhile, the Deliberatorium shows competing answers for a given question next to each other – instead of hiding opposing perspectives from participants, the system makes it hard to avoid them. A platform for genuine engagement must incorporate features which encourage users to participate in more informed and productive dialogue, by design.
If you found this post interesting and useful, we’d appreciate if you would share and subscribe to our blog.
To get started with your online public consultation, visit placespeak.com.