Facebook’s Problem with Political Ads - Can Human Rights Due Diligence Help?
28 September 2017
Last week, in response to growing public pressure, social media giant Facebook agreed to provide information to U.S. Congressional investigators on advertising purchased by Russian-linked accounts seeking to influence the 2016 U.S. presidential campaign. The ads apparently disseminated false and deliberately divisive information via the social media platform to targeted voter groups.
Facebook CEO Mark Zuckerberg issued a public statement saying the company had “found and shut down thousands of fake accounts that could be attempting to influence elections in many countries” and announced new commitments to make political ads on the platform more transparent. This will involve steps to disclose who paid for ads and allow users to visit advertiser pages. Zuckerberg also pledged to engage with election commissions around the world and expand sharing of threat information on misuse of social media platforms with security and other tech companies.
Too soon to tell.
It is too soon to tell whether these steps will be sufficient to combat attempts to unduly influence elections, or whether new calls for comprehensive legislation requiring greater transparency from companies like Facebook and Twitter will gather momentum.
For now, seen from the perspective of the broad corporate responsibility agenda, it is worth asking if living up to due diligence requirements in the United Nations Guiding Principles on Business and Human Rights (UNGPs) could have helped Facebook address the challenges it is currently facing.
The responsibility to respect human rights as set out in the UNGPs requires that business enterprises:
- Avoid causing or contributing to adverse human rights impacts through their own activities, and address such impacts when they occur; and
- Seek to prevent or mitigate adverse human rights impacts that they are directly linked to by their operations, products, or services through their business relationships, even if they have not caused or contributed to those impacts.
The UN Human Rights Office’s Interpretive Guide on the UNGPs states that an adverse impact occurs “when an action removes or reduces the ability of an individual to enjoy his or her human rights.”
The UN Human Rights Committee, which monitors state compliance with the International Covenant on Civil and Political Rights, in its General Comment on the right to participate in public affairs, voting rights and the right of equal access to public service, affirms that:
“…elections must be conducted fairly and freely…Persons entitled to vote must be free to vote for any candidate … without undue influence or coercion of any kind which may distort or inhibit the free expression of the elector's will. Voters should be able to form opinions independently, free of violence or threat of violence, compulsion, inducement or manipulative interference of any kind.”
Did Facebook’s policies (or lack thereof) on political advertising contribute to “adverse human rights impacts”? There are no immediate or easy answers, but there are several points worth reflecting on further.
Unintended consequences.
Advertising has appeared directly on Facebook user news feeds since 2012, and the service has become a critical tool for political campaigns in their efforts to target potential voters. Facebook has moved swiftly into this space with dedicated staff, some with backgrounds in politics, who assist with outreach, engagement, and performance tracking.
To what extent did Facebook consider the unintended consequences of political advertising on its platform?
Is there any evidence the company took steps to identify or prevent adverse impacts associated with advertising that was deliberately false or misleading?
Failure to monitor who is using social media platforms for political advertising could contribute to the kind of “undue influence” or “manipulative interference” in elections referenced by the UN Human Rights Committee. Facebook clearly has business relationships with the users who purchase advertising space. Expecting it to catch every abuse may be unrealistic, but it is also the case that more can and should be done to stop the worst violators.
Facebook uses algorithms to identify problematic postings, and individuals upset by specific content can complain. Content that violates Facebook’s guidelines is routinely removed. Some of those decisions appear to be machine-determined, with action triggered once a certain number of complaints reaches the company. But these mechanisms can also be abused, as was seen in 2014 when Ukrainian activists were wrongly suspended from Facebook after a barrage of unwarranted take-down requests instigated by Russian trolls.
What now?
So what should social media companies be doing now, given these challenges?
In the media world, advertising standards guide the policies and actions of newspapers and other media outlets for products such as tobacco and alcohol, and crimes such as human trafficking.
Facebook pushes back against suggestions that it is primarily an advertising and media company. But with many of its more than 2 billion monthly users increasingly relying on the platform for the majority of their news and information, the time has come to look to the experience and standards of media and other industries as part of efforts to shape advertising policies and practices for the future.
While Facebook users must also be more vigilant and responsible consumers, the company needs to do more, given its global reach. As one commentator put it, when information “is false, when it is purchased and manipulated to affect the outcome of an election, the effect is enormous. When the information purveyors are associated with a foreign adversary…we’re into a whole new realm of power.”
Facebook and other social media companies have an opportunity now to consult broadly on what reasonable due diligence should look like when it comes to advertising on their platforms. That means consulting political analysts, lawyers, civil society groups, human rights practitioners, and media experts, among others.
Existing multi-stakeholder platforms like the Global Network Initiative (GNI), which aims to advance freedom of expression and privacy in the Information and Communications Technology (ICT) industry, could be helpful in this process.
The aim is to identify potential adverse human rights impacts of social media business practices and develop ways to prevent and mitigate them.
The right to vote in free and fair elections is a fundamental human right. If voters are relying on news and information primarily from a single source, then the source’s owner has to see itself as a critical actor in protecting democratic processes. Facebook needs to demonstrate it is willing to play its role responsibly.
After all, selling ads seeking to shape public opinion on political issues or candidates is not the same as selling soap.