Data exploitation in digital political campaigns and its implication on electoral democracy


The last decade has seen an explosion in the use of digital tools in political campaigning. Perceptions have also changed in that time. The use of social media was first heralded as a positive revolution in the mass engagement of voters. More recently, serious allegations of misinformation, misuse of personal data, and foreign interference have emerged, along with concerns that our democracy may be under threat.

Unwanted Witness considers that there are certain baseline safeguards that should be in place.


Democratic engagement is increasingly mediated by digital technology, from campaigning to the transmission of election results. These technologies rely on collecting, storing, and analysing personal information to operate, and they raise novel challenges for all electoral stakeholders in protecting our data from exploitation.

The entire election cycle is increasingly data-dependent. This is particularly true of digital political campaigns, which are ever more data-driven. This campaign environment presents novel challenges due to the scale and range of data available, together with the multiplicity, complexity, and speed of profiling and targeting techniques. All of this is characterised by opacity and a lack of accountability. Existing legal frameworks designed to curtail this exploitation often fall short, either in substance or in enforcement.

At Unwanted Witness, we are interested in what is “behind the curtain” – what data has been collected and inferred about you that results in your being targeted with particular content.

Through the amassing and processing of vast amounts of data by platforms, data brokers, and other intermediaries and trackers, individuals are profiled based on their stated or inferred political views, preferences, and characteristics. These profiles are then used to target them with news and other content aimed at influencing and manipulating their views, which raises questions about the transparency, fairness, and accountability of both the data use and the campaign. This, in turn, may lead to a number of harms: harms to individual autonomy, harms to civic participation, and other more diffuse harms to the integrity of the political process, including political polarisation.

In attempting to address these issues, it is essential to examine the relevant legal frameworks in place. For instance, the Data Protection and Privacy Act 2019 does not offer sufficient safeguards, and the absence of implementing regulations makes it hard to enforce. The electoral laws have not been updated to sufficiently address changes in digital campaigning; they are disjointed and lack teeth, an issue exacerbated by a lack of resources, coordination, and enforcement action. As a result, they risk being ineffective.

In March 2018, it was revealed that Cambridge Analytica, a British political consulting firm that combined misappropriation of digital assets, data mining, data brokerage, and data analysis with strategic communication during electoral processes, had acquired and used personal data about Facebook users obtained from an external researcher who had told Facebook the data was being collected for academic purposes.

The personal data of up to 87 million Facebook users was acquired via some 270,000 users who gave a third-party app permission to access their data. That permission also gave the app access to information on each user’s friend network; as a result, data on about 87 million users was collected, the vast majority of whom had never given Cambridge Analytica permission to access their data.
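The scale of this friend-network amplification can be illustrated with back-of-the-envelope arithmetic. The figures are the publicly reported ones above; the per-user average is simply derived from them:

```python
# Publicly reported figures from the Cambridge Analytica case.
consenting_users = 270_000    # users who installed the third-party app
affected_users = 87_000_000   # approximate total profiles harvested

# Each consenting user exposed, on average, the profiles of hundreds of
# friends who never granted any permission themselves.
friends_per_user = affected_users / consenting_users
print(round(friends_per_user))  # roughly 322 additional profiles per app user
```

In other words, each person who clicked “allow” handed over, on average, the data of more than three hundred others.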

In Kenya, Cambridge Analytica secretly ran campaigns during the 2013 and 2017 elections. The company worked with 360 Media Limited to develop online campaigns portraying “Raila Odinga as a blood-thirsty individual who is also sympathetic to Al-Shabaab and having no development agenda,” whilst portraying the incumbent President Kenyatta as “tough on terrorism, and being good for the economy.” The Jubilee Party downplayed Cambridge Analytica’s role, saying it had hired the firm’s parent company to assist with branding.

In Uganda’s presidential elections of 2011 and 2016, President Museveni delivered a pre-recorded message using an automated “robocall” system, asking millions of mobile phone users to vote for the “man in the hat,” a reference to his trademark headgear. The NRM’s use of people’s mobile numbers without their knowledge or permission raised ethical and privacy concerns.

We, therefore, need effective safeguards that reflect changes in digital campaigning both now and looking into the future. We need to see actors, from government, regulators, to political parties, taking measures to resist the current race to the bottom.


Digital campaign tools can make it easier and cheaper for legitimate campaigners to communicate with voters. It is a sign of a healthy democracy when campaigners tell voters about their policies and political views. However, we recognise that new techniques for reaching voters could reduce confidence in the integrity of elections and referendums. These techniques can be misused. For example, it could be easier for foreign individuals or regimes to illegally use people’s personal data to influence voters online without any physical presence in the country. Uganda-based campaigners may also try to get around spending limits through hidden digital activity.


This section seeks to complement the above, with specific recommendations (some of which may already be legal requirements) to political parties and campaign groups as to how to avoid data exploitation.


The Data Protection and Privacy Act 2019, which lays down rules on the use of personal data, is necessary; but without proper regulations, the law is insufficient and cannot be fully functional. In its current state, the DPA can easily be manipulated, eventually affecting the credibility of the entire electoral process. Political parties should, therefore, join our efforts to push for the enactment of data protection regulations and the amendment of the various electoral laws to cater for the move to digital campaigns. Otherwise, there is no way we can have a free and fair election under the current legal regime.


Political parties and campaign groups must fully comply with the Data Protection and Privacy Act 2019, be accountable for all the work they do both directly and indirectly, and subject that work to close public supervision. They must ensure that the use of data in techniques such as profiling and targeting (by them and those with whom they work) complies with all the requirements of data protection and privacy law, including principles such as transparency, fairness, purpose limitation, the requirement to have a legal basis, rights such as the right to information, and obligations such as conducting a data protection impact assessment and applying due diligence to ensure that those third parties they work with comply with the law.


Political parties should also as a minimum:

  • Be transparent about their data processing activities, including publicly identifying the mechanisms they use to engage with voters (e.g. social media, websites, direct messaging).
  • Be transparent about what data they collect, how they collect it, its sources, and how they use it.
  • Adopt and publish data protection policies.
  • Carry out and publish data protection audits and impact assessments.
  • Specify their legal basis for each use of personal data (including any sensitive data such as that revealing political opinions).
  • Be transparent about the companies they contract with as part of campaigns, both to obtain data and to further process it (including profiling and targeting), such as data brokers and political advertising companies, as well as the campaign tools/software they are using – both in-house and external.
  • Make publicly available timely information on expenditure for online activities, including paid online political advertisements and communications. This should include information on the companies assisting in online activities, including the amount spent on each company’s services.
  • Be transparent on political ads and messaging, ensuring that the public can easily recognise political messages and communications and the organisation behind them and that this information is also available in an accessible online database. Make available information on any targeting criteria used in the dissemination of such political messages.
  • Ensure all online and offline advertisements are publicly available and submitted to the relevant authority.
  • Publish a complete, easily accessible and understandable list of any campaign groups that have financial or informal collaborative campaigning relationships with them, including all third parties and joint campaigners.
  • Facilitate the exercise of data rights by individuals, including by providing information about how their data is processed and providing timely access to it.


The Electoral Commission has a key mandate of protecting the integrity of the democratic process and it must, therefore, demonstrate a level of preparedness and experience to deal with the challenge that social media and other digital platforms present for elections and politics.

To that end, we recommend the following:

Since the Electoral Commission has an obligation to take measures to ensure that the entire electoral process is conducted under conditions of fairness and transparency, there is a need to:

  1. Build up capacities (such as data analysis capacity) to detect attempts to manipulate the information environment. This includes increasing the Commission’s data scraping and analysis capacity, as well as improving its internal organisation to make such units a fundamental part of its mission to protect the electoral process.
  2. Design structures and management models that provide key insights. These insights can help answer questions such as how many political actors are running digital campaigns, who connects with whom, how often, and for what purpose.

This is especially relevant when it comes to voter suppression attempts that manipulate voting process information. It will go a long way in helping the Commission make accurate, fact-based decisions on these complex issues of data exploitation.

  • The second priority is the financial side of the problem; few steps have been taken to address it. Although political finance regulations are highly detailed, they fail to consider online activities by political parties and candidates. These regulations need to be updated, taking into consideration that regulating how parties spend online might help to reduce the use of social media to manipulate public opinion.
  • The Electoral Commission must demonstrate the ability to monitor and counter digital information operations that seek to sway election results by altering public opinion online, which sometimes go beyond sheer disinformation and coordinated smear campaigns. This should be done through well-articulated and binding guidelines.
  • The Electoral Commission should update campaign standards and principles to reflect the importance of online campaigning. This should include an update of monitoring methods: the selection of media for monitoring (content monitoring), and the revision of spending monitoring, transparency, and data requirements for platforms and intermediaries. This will go a long way in ensuring an orderly electoral process.
  • The Electoral Commission should ensure that campaigners are required to provide more detailed and meaningful invoices from their digital suppliers to improve transparency.


Legislators and governments must develop, strengthen, and enact updated legal frameworks. These must then be enforced by those empowered to do so: courts, oversight bodies, and regulators.

  • Regulators must be empowered to provide clear and binding guidance, take action (both proactively and in response to complaints), and enforce the law. They must be able to conduct their work without external pressure, and to request information from – and if necessary take action against – all parties involved in the electoral cycle.
  • There is a need for joint cooperation and enforcement between regulators at national, regional and international levels. Threats to elections come from diverse actors and require the engagement of multiple regulators as well as coordination among them. Other laws including advertising, telemarketing/ anti-spam, communications and cybersecurity may also come into play.
  • The Ministry of ICT and National Guidance should issue a binding code of practice, code of conduct, or equivalent that applies to all actors involved in political campaigns, with any violations subject to appropriate enforcement action.

These recommendations will not be our final view on these issues. We certainly do not claim to have all the answers. We also recognise that no single political actor is responsible for all the concerns raised by digital campaigning. Continuing co-operation with others such as the Ministry of ICT and National Guidance, Electoral Commission and the Parliament is vital. For our part, we will continue to monitor the trends, and put forward our views when we think we can help promote public confidence in the electoral process.

For God and My Country!

About Unwanted Witness Uganda

Unwanted Witness Uganda is a non-partisan and non-profit organization working on the intersection of technology and human rights. 

We stand out as an internet/online-based human rights organization in Uganda that seeks to put the power of change in the hands of citizens, using the internet and online media to guarantee internet freedoms and improve the human rights situation in the country.

The core business of Unwanted Witness is the protection and defence of digital rights and internet/online freedoms through highlighting various human rights abuses. This is done in keeping with its motto: Amplifying voices, changing lives.