Talking about Big Data: The Challenge of Privacy & Data Protection in International Development

Image: Server room


Privacy is a fundamental human right recognized in the UN Declaration of Human Rights. In an ever more connected and digitalized world, international development organizations need to make sure that data protection is a core component of project planning and implementation, based on the principle of “privacy by design”. However, considering the advance of big data, we also need to discuss standards for the ethical use of data beyond personal data.


I recently came across a blog post from the Guardian column “Secret Aid Worker” dealing with data protection and privacy in development work. The bottom line was that NGOs do not take data protection seriously, which more or less confirms my own impression. I would not go so far as to claim that people violate data protection regulations out of bad intentions and knowingly put those they want to help at risk. Rather, I think there is a lack of sensitivity to the topic: to its fundamentals, to the ethical questions associated with it, and to the risks personal data can pose to the most vulnerable and marginalized in a world that is ever more digitalized and connected.

A human-rights-based approach to privacy and data protection

Privacy is a fundamental human right recognized in the UN Declaration of Human Rights, the International Covenant on Civil and Political Rights and many other international and regional treaties. The right to privacy underpins human dignity and autonomy. It is also an essential foundation for many other human rights, such as freedom of expression and freedom of association, and it applies offline as well as online. Data protection regulation is thus essential to protect the privacy of the people we work with in international development. It is, however, often regarded as an obstacle to innovation and effective implementation rather than as an enabler of human rights protection and effective development work in line with the “do no harm” standard. With the advance of digital technologies and data analytics, this ignorance is increasingly worrying.

In view of the current situation, I believe we need to address this challenge on three levels:

  1. We need to ensure that our organizations comply with the law when we are legally responsible for the implementation of a project. From May 2018 onwards, European development organizations must comply with the EU General Data Protection Regulation (GDPR).
  2. We need to establish guidelines and define “red lines” for cases where our organization is not legally responsible and the national law of the partner country, where it exists, applies.
  3. Considering the digital age and big data, we need to take this discussion to the next level and discuss standards for an ethical use of data beyond personal data.

Privacy by design as a means to comply with the law

The best way to ensure good data protection practice is a “privacy by design” approach. This means that data protection is a core component of the project planning and implementation process, just like cross-cutting issues such as gender. Accordingly, you should outline in a concept which data-sensitive questions might arise and how you plan to address them: which types of data you plan to collect and for what purpose, how you plan to use, transfer and store the data, who has access to it, which risks and harms might occur, how you plan to mitigate them, and who is legally responsible for implementation. Ideally, your organization should also include a budget line for data protection in the project budget, and legal as well as technical expertise should be at hand. Privacy by design also means re-assessing the data protection concept regularly, especially when new data sources or technological solutions are introduced. These changes can be quite fundamental to an organization’s processes and procedures. A good starting point is setting up a data protection toolbox including a self-assessment, good practice examples, a “whitelist” of tools and a list of external experts, including organizations such as Privacy International and Tactical Tech.
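The questions such a concept answers can be captured in a simple structure that a project team revisits each planning cycle. The sketch below is purely illustrative (all field names and example entries are hypothetical, not taken from any real framework), but it shows how a self-assessment from such a toolbox might flag the questions a project has not yet answered:

```python
# Hypothetical sketch of a privacy-by-design self-assessment.
# Each key mirrors one question a data protection concept should answer.
data_protection_concept = {
    "data_types": ["survey responses", "GPS locations"],
    "purpose": "baseline study for a water access project",
    "storage": "encrypted server, EU region",
    "access": ["project lead", "M&E officer"],
    "risks": ["re-identification of respondents"],
    "mitigations": ["pseudonymization", "aggregation before sharing"],
    "legal_responsibility": "implementing NGO",
}

def open_questions(concept):
    """Return the questions of the concept that are still unanswered,
    so the team knows what to address before collecting any data."""
    required = ["data_types", "purpose", "storage", "access",
                "risks", "mitigations", "legal_responsibility"]
    return [key for key in required if not concept.get(key)]
```

A re-assessment then simply means running the check again whenever a new data source or tool enters the project.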

The cornerstones of current legislation as guidelines

Guidelines for cases where your organization is not legally but ethically responsible should be based on current legislation such as the GDPR and take national contexts into account. The GDPR rests on a set of data-management principles, including well-known standards such as purpose specification, data minimization and use limitation, as well as individual rights such as consent and access requirements for the collection, use and disclosure of personal information. The African Union Convention on Cyber Security and Personal Data Protection from 2014 mirrors large parts of the GDPR. Often, however, regulations exist but are vague, not legally enforced or not implemented in practice, as in the case of the AU Convention. Organizations should therefore develop their own guidelines, define “red lines” and complement them with practical tools as outlined above.
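Two of these principles, data minimization and the protection of direct identifiers, translate directly into how field data is handled. The following Python sketch is my own illustration (the record fields, the salt and the helper names are hypothetical): a survey record is reduced to the fields the stated purpose actually needs, and the respondent’s name is replaced by a salted hash so records can still be linked without exposing the person — note that under the GDPR this counts as pseudonymization, not anonymization:

```python
import hashlib

def pseudonymize(identifier: str, salt: str) -> str:
    """Replace a direct identifier with a salted SHA-256 hash so records
    remain linkable without storing the person's name in the dataset."""
    return hashlib.sha256((salt + identifier).encode("utf-8")).hexdigest()

def minimize(record: dict, allowed_fields: set) -> dict:
    """Keep only the fields needed for the stated purpose
    (the data minimization principle)."""
    return {k: v for k, v in record.items() if k in allowed_fields}

# Hypothetical raw survey record from a field project
raw = {"name": "A. Example", "village": "X", "age": 34, "hiv_status": "+"}
clean = minimize(raw, {"village", "age"})
clean["id"] = pseudonymize(raw["name"], salt="project-secret-salt")
```

The salt must be kept separately from the dataset; anyone holding both can re-identify respondents, which is exactly the kind of risk a data protection concept should name and mitigate.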

Imagine, for example, that your organization advises the Ethiopian government on the implementation of an e-participation tool for city planning. The Ethiopian constitution guarantees privacy in a detailed manner, but the country is known for massive surveillance and the interception of electronic communication of opposition members and journalists. How do you act?

The advance of big data

In the last decade, we have seen huge changes in the collection and use of personal data through the advent of mass data generation from digital sources such as smartphones, health trackers and smart housing solutions. Data-driven applications are going to play a much bigger role in our lives and are already beginning to do so, for example in health insurance pricing models based on users’ health-tracker data. So far, there is no effective regulation of big data in sight, and political actors are only beginning to understand its foundations, legitimations, biases and the realities big data creates. With the advance of big data, already disputed aspects of data protection will be challenged even further.

Take the principle of informed consent: even today, it is argued that users do not read terms and conditions when signing up for a service. The leading data protection specialist Prof. Dr. Wolfgang Hoffmann-Riem, a former judge at the Federal Constitutional Court in Germany, therefore speaks of an “Einwilligungsfiktion” (fiction of consent). Transferred to the context of international development, it becomes obvious that the principle is hard to implement in regions where people do not even share the same sensitivity to privacy.

Data sovereignty vs. data protection

Other principles such as use limitation, purpose specification and data minimization are generally difficult to reconcile with big data. The idea inherent in big data is that data is collected and combined without necessarily knowing beforehand for which future purposes it will be used. Some studies and reports therefore conclude that we need a new approach to data management altogether in order to exploit the innovation potential of large-scale data analysis while protecting people from misuse of their data. In this view, the focus shifts from the collection of personal data to its use, and to empowering individuals to control how their data is used, a concept known as “data sovereignty”. However, this concept is of limited use in the context of international development and humanitarian aid.
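The shift from restricting collection to controlling use can be made concrete: under a “data sovereignty” model, a dataset would carry the purposes its subjects have approved, and every use would be checked against that list. The class below is a hypothetical sketch of that idea in a few lines of Python, not an implementation from any of the regulations or frameworks discussed here:

```python
class Dataset:
    """A dataset tagged with the purposes its data subjects approved.
    Control happens at the moment of use, not at collection time."""

    def __init__(self, records, approved_purposes):
        self.records = records
        self.approved_purposes = set(approved_purposes)

    def use(self, purpose):
        """Release the records only for an approved purpose."""
        if purpose not in self.approved_purposes:
            raise PermissionError(f"purpose not approved: {purpose}")
        return self.records
```

The sketch also makes the limitation visible: the check only works if the individual was in a position to approve purposes in the first place, which is precisely what cannot be assumed in many development and humanitarian contexts.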

An ethical approach to data: International development organizations are taking the lead

Interestingly, the international development and humanitarian community might have other solutions: many organizations, such as UNOCHA, Médecins Sans Frontières and Oxfam, have defined responsible data frameworks or data sharing policies. UN Global Pulse’s Data Privacy Advisory Group has even published privacy and data protection principles specifically aimed at big data research. A study carried out by Leiden University’s Centre for Innovation and the Governance Lab (GovLab) at New York University mapped and compared 17 responsible data approaches. These results can serve as a toolkit for organizations seeking to establish their own responsible data framework and handle data more effectively and responsibly. Ideally, such approaches protect the individual both from harm arising from data-driven solutions themselves, such as discrimination based on profiling, and from the privacy risks associated with data processing.

Privacy: Not a second-class human right

At times, people in my sector argue that we cannot and should not apply the “heavy” data protection principles underlying the GDPR, especially in countries that do not share the same understanding and consciousness of privacy as we do. Misconceiving privacy as a second-class human right, however, carries a dangerous notion: it implies that human rights are negotiable and subject to discussion. What we need instead is improved data literacy and a better understanding of the data ecosystem and its implications in an interconnected, data-driven world, a task that does not stop at international development. Some players in our sector are even taking the lead.

Image: Lea Gimpel

Lea Gimpel works at the intersection of international development, digital rights and innovation. She currently heads the company-wide strategic project “Digital Transformation” at Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ).
