Florencia Goldsman

This is not just another of the dozens of articles about COVID-19 apps already published in the media around the world. This post is based on the assertion that we women experience a continuum of surveillance over our bodies, and that this control is exacerbated during health crises. We are not safe when we move around cities. We are raped or killed just for walking down the street, but we are also killed inside our homes, especially when we have to self-isolate and are in charge of caretaking. So let's look at the pros and cons of implementing COVID-19 apps from a cyber-feminist and intersectional perspective.

The inequalities imposed by the old neoliberal system we're all familiar with are felt today, in the midst of this crisis, in the flesh. That's why we argue that the search for solutions should not centre on techno-solutionism: the idea that (digital) technology as we know it today, in its exacerbated commercial version, will fix our precarious lives. The various technological implementations presented as instant solutions to the historical and structural flaws of our health, education, transportation and scientific development systems will not work by magic.

We need to situate ourselves in the present context. Today, more than ever, it is appropriate to once again point out the intimate link between the voracious spread of capitalist agribusiness and the etiology of recent epidemics, say Silvia Citro and María Luz Rosa. "It is politically necessary to repeat this, so we don't forget: the thousands of deaths from the persistent social plagues of coloniality, violence and capitalist inequality, those other deaths, of generally poor and not so white people, are already part of the 'global landscape' of capitalism... They are deaths that are seen as normal, bearable... This yellow-white plague that cuts across social classes is new, and we non-poor white people cannot bear it. That is the sad novelty of this."

In the face of the shock, paralysis and fear caused by the COVID-19 catastrophe, we have seen governments around the world hasten to enlist technology as an ally to help curb the pandemic. We will see later that, given the way these new technologies are designed, what they actually control is people. Here is a (non-exhaustive) list of the global tracking applications shared by Sursiendo, which from southern Mexico shows how each country or region is offered a different COVID-19 app.

In fact, what we want to make clear in this article is that independently of the requirements made urgent by the present crisis, nothing justifies the deployment of new initiatives without first weighing up the risks or without applying safeguards when fundamental rights involving our most private information are at stake. And guess what? Our bodies are also involved.

To begin with, the risks associated with the use of such data are varied and well documented. We have seen how mobile data and metadata, such as location data, have been used to track our use of public spaces, monitor our movements in protests, and persecute activists in street demonstrations.


Desirable notions for the protection of our privacy

In many Latin American countries, especially Brazil, Ecuador, Guatemala and Mexico, different organisations and institutions are concerned about government proposals for technological implementations to track the spread of the virus. There is no clarity about the use, limitations and safeguards in the processing of sensitive personal data that will be massively captured through these applications.

Marta Peirano, a journalist who specialises in surveillance technology and systems, warns that "we have to be very careful because many governments and companies are justifying the development of applications and the use of data that are generally not allowed. They are justifying it based on the success of apps in Asia, but first, they are not the same applications and second, using applications without prior testing is only an invasion of privacy."

There are, however, contributions that geolocation and tracking systems (known as contact tracing) can make, when used in an equitable manner, to help track the spread of the virus. But this is only possible with guarantees that the surveillance infrastructure created to deal with the coronavirus pandemic will be dismantled once the threat has passed.

Amnesty International and more than 100 other organisations recently issued a statement demanding limits on digital surveillance. The organisations outlined eight conditions that should be met in government projects, aimed at limiting the scope of permits for this type of surveillance technopolicy.

  • Surveillance measures must be "lawful, necessary and proportionate".
  • Expanded monitoring and surveillance powers must be time-bound.
  • Increased collection of personal data must only be used for the purposes of responding to the COVID-19 pandemic.
  • The security and anonymity of the data collected must be protected and this must be proven on the basis of evidence.
  • Digital surveillance must avoid facilitating discrimination and marginalisation.
  • Any sharing of data with third parties must be based on law.
  • There should be safeguards against abuse and citizens subjected to surveillance must have access to effective remedies.
  • Data collection efforts should include means for "free, active, and meaningful participation of relevant stakeholders", in particular public health experts and marginalised population groups.

Experience shows a familiar problem: uses and abuses by governments once they start employing this type of tool. China, for example, is the extreme case of citizen surveillance. It must be taken into account, because the Asian giant has never scaled back once it has adopted ever more extreme surveillance technologies. According to Peirano, "If after this emergency there are governments that do not abandon citizen surveillance because they have discovered that these applications are much cheaper than exercising other types of control, it is very likely that the public will not find out."

How do COVID-19 apps work?

While each app has its own specificities that vary from country to country and from company to company, the main thing is that they replace a week's worth of manual contact tracing with instantaneous signals sent to a central server. The results of coronavirus tests are also sent to that server, and because the possible contacts are known, people are assigned to a stratified isolation and social distancing regime while the anonymity of infected persons is preserved. People with symptoms can request a test through the app.

It is important to reiterate that apps should be just one gear in a system that starts with check-ups available to the entire population, access to quality health infrastructure and reliable epidemiological information aimed at disease prevention.

While there are points of consensus on how these applications can make a positive contribution, two questions hinge on key factors: the time and distance of exposure. It is not clear how the technological limitations of Bluetooth proximity calculations will influence public health decisions about notifying potentially infected persons. The same applies to GPS location, which is even less accurate. Is it better for these apps to be slightly hypersensitive and risk over-notifying people who may not actually have been within two metres of an infected user for long enough, or should they use higher thresholds, so that a notified person can be more confident they were actually exposed?
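The trade-off above can be made concrete with a small sketch. Everything here is illustrative: the path-loss constants, the two-metre cutoff and the fifteen-minute rule are assumptions for the example, not the parameters of any real app, and real Bluetooth distance estimates are far noisier than this.

```python
# Illustrative sketch only: the constants and the notification rule are
# hypothetical, not taken from any deployed contact-tracing app.

def estimated_distance_m(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Rough distance estimate from Bluetooth signal strength (RSSI).
    Walls, pockets and phone models make this very imprecise in practice."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def should_notify(samples, max_distance_m=2.0, min_minutes=15):
    """samples: list of (minutes_observed, rssi_dbm) readings for one contact.
    Notify only if enough close-range exposure time accumulates."""
    close_minutes = sum(
        minutes for minutes, rssi in samples
        if estimated_distance_m(rssi) <= max_distance_m
    )
    return close_minutes >= min_minutes
```

Lowering `max_distance_m` or raising `min_minutes` trades over-notification for missed exposures, which is exactly the public-health policy question, not a purely technical one.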

It is very likely that in the days following the publication of this post these obstacles will have been overcome. But at the time of writing, a review of the literature reflects the uncertainties regarding the distance calculation between healthy and infected people, and the doubts regarding the exposure time needed to reduce the margin of error of the predictions provided by the available technologies.

On the one hand, the nature of COVID-19 transmission suggests that an application can only provide very "crude" information on the spread of the virus, says the journal Wired: "A phone is typically able to determine its position with an accuracy between 7 and 13 meters in urban areas, according to a study published last year, and accuracy may often be less precise. The COVID-19 virus seems to spread between people who are within a few feet of each other."

And on the other hand, we know that in many countries of the world only the most privileged have internet access, can afford mobile data or own a smartphone. The Big Data Institute explains that, to be effective, at least 60% of a country's population would need to participate in this kind of contact tracing. Bad news: the digital divide remains wide. In the least developed countries (LDCs), only 19% of people were online in 2019, according to the International Telecommunication Union.


An app for each COVID

A good explanation about how an app should collect, mix and then share private data was illustrated by a comic book collaboratively translated into several languages. Meanwhile, the Electronic Frontier Foundation reports that some apps rely on one or more central authorities that have privileged access to information about users' devices. "For example, TraceTogether, developed for the government of Singapore, requires all users to share their contact information with the app’s administrators. In this model, the authority keeps a database that maps app identifiers to contact information. When a user tests positive, their app uploads a list of all the identifiers it has come into contact with over the past two weeks. The central authority looks up those identifiers in its database, and uses phone numbers or email addresses to reach out to other users who may have been exposed. This places a lot of user information out of their own control, and in the hands of the government."
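The centralized model the EFF describes can be sketched in a few lines. This is a hedged illustration, not Singapore's actual protocol: the class, identifiers and phone numbers are invented, and the point is only to show where the privacy cost sits, in the authority's mapping from app identifiers to real contact details.

```python
# Hypothetical sketch of the centralized contact-tracing model described
# above; names and data shapes are illustrative, not TraceTogether's.

class CentralAuthority:
    def __init__(self):
        # The authority holds the mapping from app identifier to real
        # contact details -- the privacy cost of this model.
        self.registry = {}  # app_id -> phone number or email

    def register(self, app_id, contact_info):
        self.registry[app_id] = contact_info

    def report_positive(self, contact_log):
        """contact_log: app identifiers an infected phone saw recently.
        Returns the real contact details the authority would reach out to."""
        return [self.registry[app_id]
                for app_id in contact_log if app_id in self.registry]

authority = CentralAuthority()
authority.register("id-alice", "+65-555-0101")
authority.register("id-bob", "+65-555-0102")
# A third user tests positive and uploads her two-week contact log:
to_notify = authority.report_positive(["id-alice", "id-unknown"])
```

Note that the lookup happens entirely on the authority's side: users never learn who reported them, but the government learns who met whom.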

According to the EFF there are other models that don't have an authority storing real contact information. Instead, infected users can upload their contact logs to a central database, which stores anonymous identifiers for everyone who may have been exposed. "Then, the devices of users who are not infected can regularly ping the authority with their own identifiers. The authority responds to each ping with whether the user has been exposed. With basic safeguards in place, this model could be more protective of user privacy. Unfortunately, it may still allow the authority to learn the real identities of infected users. With more sophisticated safeguards, like cryptographic mixing, the system could offer slightly stronger privacy guarantees."
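The second model can be sketched the same way. Again this is an assumption-laden toy, not any real system: the identifiers are random tokens, and the "cryptographic mixing" the EFF mentions is omitted, which is why the authority could still correlate uploads with infected users.

```python
# Hedged sketch of the anonymized central-database model: the authority
# stores only opaque identifiers, never names or phone numbers.
import secrets

class ExposureDatabase:
    def __init__(self):
        self.exposed_ids = set()  # anonymous identifiers of possible contacts

    def upload_contact_log(self, contact_ids):
        # An infected user uploads the identifiers they encountered.
        self.exposed_ids.update(contact_ids)

    def ping(self, my_ids):
        # Healthy users regularly ask whether any of their own rotating
        # identifiers appear in the exposure set.
        return any(i in self.exposed_ids for i in my_ids)

db = ExposureDatabase()
alice_ids = [secrets.token_hex(8) for _ in range(3)]  # rotating tokens
db.upload_contact_log([alice_ids[1]])  # an infected contact overheard Alice
```

After the upload, `db.ping(alice_ids)` returns `True` and Alice learns she may have been exposed, without the database ever holding her identity.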

"You've been in contact with someone who has tested positive for coronavirus, ask for the test and isolate until you know the result." The proposal published by Apple and Google would broadcast a similar message, "a list of keys associated with infected individuals to nearby people with the app. This model places less trust in a central authority, but it creates new risks to users who share their infection status that must be mitigated or accepted," says the EFF.

Lastly, some apps require authorities, such as health officials, to certify that a person is infected before they may alert other app users. Other proposals could allow users to self-report infection status or symptoms, but this could lead to significant numbers of false positives, which could undermine the usefulness of the app.

What is at stake with these contact tracing applications is our autonomy and the use of our most private data. How will this contradiction be resolved in countries where people historically distrust authorities and health officials? What will happen in countries where there are no regulations on the use of personal data?

Finally, the constant surveillance carried out as part of the commercial exploitation of our data is still fresh in our memory. Facebook, Google and data analytics companies have been accumulating geolocation data for years, in great detail and for purely commercial purposes.

We know, as Marta Peirano explains, that there are many companies that will take advantage of this moment when we have to save lives, to violate our privacy. In a country where 77% of the population does not believe elections are fair, 85% think that corruption is widespread, and 66% doubt the judicial system, we are not going to believe that our governments are going to fully respect the protection of our health data, are we? 

Since governments began to declare a state of emergency and lockdowns, cyber-surveillance measures have been rapidly legitimised. These technocratic possibilities already existed and were latent, just waiting for the opportunity to flourish. The most ubiquitous is the tracking of people's movements through cell phones. That's why it has to be questioned from every possible angle. 

Perhaps we should dream about technologies that do not solve problems based on everyone having their own cell phone. Perhaps we can collectively come up with ways to take care of ourselves and each other, to manage databases while respecting privacy, with a more ethical and transparent approach to data collection. 

We must think of new collaborative micro-politics that allow us to live again and to create technologies. These can be simple affective techniques: getting in touch to ask people around us how they feel, from something as silly as sharing a meme to make them laugh, to teaching them to connect to a secure VoIP platform, or refraining from forwarding news items that fuel mass panic.

Contact friends with whom we can make a radio program, approach groups that are already creating chains of mutual aid to bring food to neighbours in need, and/or share a garden or pots on the balcony to re-create a subsistence garden. To finally create new technologies to re-exist. To focus no longer on the solutions brought to us by Silicon Valley's octopus-like businesses or by governments with a thirst for surveillance of their citizens, but to look at all the available digital (and analog) technologies that are around us.
