Yesterday the Electoral Commission published their first report into spending around the 2017 general election, which included a series of suggested reforms aimed at providing further transparency in political campaigning. Whilst those reforms are to be welcomed, the recent self-regulatory action by the tech platforms is far more significant, and there’s still plenty more to do.
Electoral law sets out rules that apply to candidates, political parties and non-party campaigners who are aiming to influence the outcome of elections in the UK.
These rules aim to limit spending, provide transparency for voters about the sources of funding and provide clarity on what campaigning activity the money pays for.
The Electoral Commission – amongst other things – provides guidance on the rules, helps enforce adherence to the rules and publishes reports relating to the rules.
Since social media became a dominant force in society, enforcement of these rules has faced a fundamental problem: the ability to know which political ads are being run – and by whom – has been severely curtailed.
During recent elections and referenda, political advertisers could run ads targeted at specific groups and be fairly confident of evading scrutiny both of their messaging (provided that no pesky journalists were inadvertently served the ad) and of how much was spent on it. This kind of social media advertising is known as a ‘dark ad’.
I could have run £275,000 worth of political advertising on social media during the last general election – using money that was given to me by a shady Russian oligarch – and there’s a good chance that the Electoral Commission would be none the wiser.
People who saw my ads would have no idea that it was me who ran them, or who gave me the money to pay for them, and if they didn’t think the content was true, there was no regulatory body they could report it to.
It’s no surprise then that shady practices have begun to develop.
Some groups, such as Vote Leave and Leave.EU, have been accused of accepting in-kind donations from people and companies outside the UK. This is illegal as political parties and campaigners in the UK are only allowed to accept domestic donations.
And some foreign states have been accused of meddling in elections by running ad campaigns; it has now been established, for example, that Russian-sponsored campaigns were seen by tens of millions of voters in the USA during the 2016 presidential election.
Did this happen during recent elections and referenda in the UK? Neither the Electoral Commission nor Parliament has any real way of knowing (though it would be very unusual, given events in France, Germany and the USA, if it hadn’t).
If it did, it’s a pretty sizeable breach of our democracy and a reason to further regulate political advertising.
The report published yesterday includes a series of suggested reforms aimed at providing further transparency on sources of income, on how much money was spent, and on which activities.
One suggestion is to require campaigners to include an ‘imprint’ – some wording as to who is responsible for the ads – on digital communications in the same way as is required of print media.
They also want a change in the way campaigners submit their expenditure so that there’s more clarity on where money is being spent. Currently there’s one umbrella category for all advertising; in future the Electoral Commission would like to know what was spent on social media, search, billboards and so on, without having to sift through a load of receipts.
And the Electoral Commission would like improved punitive powers; the current limit on fines is £20,000, which is a drop in the ocean for mainstream political parties, which typically spend many millions of pounds on campaigning.
Whilst these are all good and reasonable suggestions that should certainly be adopted, none of them would enable the Commission to monitor bad actors who use the anonymity of the internet to run campaigns.
Fortunately for the Electoral Commission, the tech platforms have stepped in and self-regulated.
Facebook have stated that they are ending ‘dark ads’; going forward, people will be able to see all the ads a Page is running on Facebook, whether or not the viewer is in the intended target audience. They are also promising to create an archive of election-related ads so that it’s easier for journalists and campaigners to hold the sponsors of ads to account. Twitter have promised similar measures.
This change will make a much bigger difference than any of the aforementioned reforms, as it will help provide transparency around the universe of messaging and targeting being used by campaigners.
Getting the platforms to make these changes wasn’t easy. It took US Senators proposing a bill called the “Honest Ads Act”, and Facebook, Twitter and Google’s senior executives being hauled in front of a Congressional hearing.
We are the fortunate beneficiaries of the fact that the US government have taken the issue seriously; it’s a big step forwards for holding those who seek to influence elections using social networks to account.
The final and hugely important step to improving trust in political campaigns will be to create a system for pre-clearance of factual claims being made by political parties.
Until people have confidence that the facts and figures used to justify promises and attacks are independently fact-checked, the public will remain sceptical about the truthfulness of campaigns, and false information will continue to shape the narrative of elections.
The Electoral Commission in their report reiterated the fact that they “do not regulate the content of political campaign messages or advertisements, including mis-information” and nor are they “seeking an extension to our remit to include these issues”.
The Electoral Commission’s fear is that if they are required to act as a fact-checker or “truth commission” for political advertising, they risk damaging their reputation for regulating political finance.
It’s an understandable position, but it’s a shame that they haven’t seen fit to include the creation of a body that could do so as part of their recommendations. It’s a glaring omission.