Has COVID-19 changed the face of tech ethics forever?

COVID-19 is changing what we consider the ethical use of technology. While these changes are aimed at solving a short-term global crisis, they may have permanent ramifications.

The outbreak of the novel coronavirus has fundamentally changed many aspects of the world economy, business, and our day-to-day lives. With many businesses experiencing significant disruption and a large portion of the global workforce left in a precarious position, life as we know it, both professionally and personally, is undeniably different for most of us.

Within the technology space, this has manifested in a variety of ways, whether through the obligatory move towards robust remote working policies facilitated by digital technologies, or through the increased emphasis on maintaining critical digital infrastructure and networks. These technological shifts are proving instrumental in keeping people working, businesses solvent, and the economy moving forward. Tech has also played a direct role in slowing the spread of the coronavirus, giving health professionals the firepower they need to get ahead of the virus.

However, there are concerns that efforts to curtail the spread of the coronavirus carry serious ramifications for privacy and ethics in the use of technology. Technologies are being used and developed right now to save lives and protect health services that even six months ago we would never have considered. All of this raises key questions about the impact of relaxing privacy measures in the hope of saving lives and keeping citizens safe. Will the implementation of more invasive technological measures set a new precedent that is carried through even after the pandemic has subsided?

It's a question that has been considered countless times throughout the information age whenever issues of privacy and surveillance have been raised, but it has arguably never been as pragmatically relevant as it is now.


Tracing tech and limitations

Tech has also had a profound impact on global efforts to tackle the coronavirus directly. One of the more obvious examples is the contact tracing and tracking applications being built by governments and public health institutions to alert citizens to the extent of their exposure. Google and Apple have chipped in to help make this happen, announcing joint plans to build tracing tools and frameworks on which governments can base their applications.

These tools work by using Bluetooth signals to record potential coronavirus exposures, based on data voluntarily submitted by users: if a user indicates they have tested positive for the virus, the phones of people who have recently been in close proximity to them are alerted. Such efforts will be especially important once lockdown measures start easing, as public health officials will be determined to control and trace the disease in order to avoid a massive resurgence.

Of course, Apple and Google's technology immediately raises key privacy concerns, primarily over the transmission of personal data. Such concerns were raised in the United States, with US senator Richard Blumenthal arguing that the two companies will "have a lot of work to do" if they want to convince the public they are taking privacy seriously. Fortunately, the two tech giants are working to bake in privacy-minded precautions, anonymising exchanged Bluetooth keys and corresponding health data so that identities are safeguarded. The system is also opt-in and is designed to work in conjunction with applications from governments and health authorities. Data is stored directly on user devices rather than on central servers, relying on proximity detection instead of centralised location data.
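To make that decentralised approach a little more concrete, here is a minimal sketch of how on-device exposure matching can work, assuming phones broadcast rotating random identifiers over Bluetooth and only an infected user's own identifiers are ever published. The class and function names are hypothetical and greatly simplified; this is not the actual Apple/Google Exposure Notification API, which derives identifiers from cryptographic keys.

```python
# Simplified, illustrative sketch of decentralised exposure matching.
# Names and data formats are assumptions, not the real Exposure Notification API.
import os

def new_rolling_identifier() -> bytes:
    """Generate a random, rotating identifier to broadcast over Bluetooth."""
    return os.urandom(16)

class Device:
    def __init__(self):
        self.my_identifiers = []            # identifiers this device has broadcast
        self.observed_identifiers = set()   # identifiers heard from nearby devices

    def broadcast(self) -> bytes:
        ident = new_rolling_identifier()
        self.my_identifiers.append(ident)
        return ident

    def observe(self, ident: bytes) -> None:
        # Observations stay on the handset; nothing is uploaded to a central server
        self.observed_identifiers.add(ident)

    def report_positive(self) -> list:
        # On a positive test, only the user's own broadcast identifiers are shared
        return list(self.my_identifiers)

    def check_exposure(self, published_identifiers: list) -> bool:
        # Matching against published identifiers happens entirely on-device
        return any(i in self.observed_identifiers for i in published_identifiers)

# Usage: two phones exchange identifiers while in close proximity
alice, bob = Device(), Device()
bob.observe(alice.broadcast())
published = alice.report_positive()    # Alice tests positive and opts in to sharing
print(bob.check_exposure(published))   # True: Bob's phone flags a potential exposure
```

The key design choice illustrated here is that the server (omitted entirely in this sketch) only ever needs to redistribute identifiers volunteered by people who test positive; it never learns who met whom.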

So, setting aside whether people inherently trust Apple and Google, on the surface everything sounds relatively airtight. Some issues persist with the proposed system as it stands, however, chiefly around the effectiveness of the underlying Bluetooth technology itself. The most significant is Bluetooth's limited ability to determine the distance between two interacting mobile devices, with reported inaccuracies for devices between six and 30 feet apart.

These limitations could easily result in false positives being delivered to users who were well clear of infected users, or even to those living in a large apartment block, given Bluetooth's ability to penetrate solid surfaces. While signal measurements such as received signal strength indication (RSSI) are designed to offer finer-grained distance estimates, even these are affected by factors such as device orientation and whether the phone is in a backpack or otherwise shielded from the other device's signal. Overall, while the Bluetooth system is probably the most effective at upholding privacy standards (even though such a system would undoubtedly have received major pushback in any other circumstances), it might not be the most effective at doing the actual job.
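To illustrate why RSSI is such a blunt instrument, the sketch below applies the standard log-distance path-loss model to convert a signal reading into a distance estimate. The reference signal strength and the attenuation figure for a phone in a backpack are illustrative assumptions rather than measured constants.

```python
# Rough sketch of RSSI-based distance estimation using the log-distance
# path-loss model. Reference values below are illustrative assumptions.

def estimate_distance(rssi_dbm: float,
                      rssi_at_1m: float = -60.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Estimate distance in metres from a received signal strength reading."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exponent))

# A phone two metres away in free space might read around -66 dBm...
print(round(estimate_distance(-66.0), 1))   # ~2.0 m

# ...but with roughly 10 dB of extra attenuation from a bag or a body in the way,
# the same two-metre contact now looks like a far more distant one.
print(round(estimate_distance(-76.0), 1))   # ~6.3 m
```

The same reading can therefore correspond to very different real-world distances depending on orientation and obstructions, which is exactly the false-positive problem described above.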

In addition, Apple and Google have already run into issues with some countries looking to build on top of their technology. The UK's National Health Service (NHS) was reportedly at odds with the two firms over their stipulation that publicly crafted applications must not create centralised databases of citizen data, which, in Apple and Google's view, could lead to future exploitation. The same goes for France, which recently and publicly urged the two firms to ease their privacy protections around tracing, admitting that its plans for an application would not work under Apple and Google's current guidelines.

It's not a stretch to suggest that more draconian measures, like those employed by China or even South Korea, have been far more effective in limiting the spread of the novel coronavirus, although such measures obviously need to be weighed against user privacy and, more broadly, the ethical standards governing the collection and use of personal data.


Ethical exceptions worth it?

Apple and Google's tracing technology isn't the only instance where forgoing regular ethical standards is raising concern. China's approach to containing the virus has been exceptionally heavy-handed, including the use of drones to monitor citizens and issue directives over whether they need to wear masks, isolate themselves, or take other measures.

There have even been reports in China of drones equipped with thermal imaging cameras checking residents' temperatures, and of facial recognition being used to deliver information to specific individuals. We may have come to expect this kind of activity from China, and it has been effective at helping to curtail the virus, but it remains at the more dystopian end of the spectrum when it comes to tech ethics.

So, are the more heavy-handed approaches worth implementing if they lead to lives being saved? Prominent technologist and tech ethics expert Anne Currie says that while she wouldn't necessarily advocate for China's approach, there is a degree to which ethical considerations must be eased if we are to save a considerable number of lives.

"Tech ethics in the good times and tech ethics in the bad times are extremely different. When you've got hundreds of thousands of lives on the line, we all do occasionally need to suspend some of our privileges. That is just the reality of the situation," she says

"Right now, we are in a battle. We're in a battle with an implacable other. We're not battling with a competitor at work and we're not battling with another country, as difficult as that may be. We are battling with a virus that doesn't care at all about us. It doesn't care about fairness, diversity, privacy, or any of the good things that we generally value. It will just kill us if we don't act and that has changed where our priorities lie, which is the right thing to happen."   


Permanent impact

While Currie says that the focal point of tech ethics until now has been privacy, she expects this to shift as priorities become more about keeping people from dying, which can be facilitated by things like mass surveillance. She says this is set to have rather permanent ramifications for tech ethics in general, with discussions of privacy coming across as somewhat irrelevant as the sphere changes. Currie argues ethicists will then pivot their conversations away from keeping data private and towards how a society with a higher degree of surveillance and monitoring should work, keeping their eyes on events and encouraging people to question them.

"In six months, this will be the new normal. It's unclear that people will apply much pressure to go back, not least because we'll all be very aware that the next pandemic is just around the corner if we revert to our old behaviour, and we won't want to set up all our systems again from scratch at the loss of thousands of lives," Currie continues.

"Ethical frameworks reflect social pressures, and the balance may well shift from individual rights to privacy to collective safety. This doesn't worry me as much as it might. I suspect our greater danger comes from automated (AI-driven) decisions that we cannot challenge. Radical openness about data will be the new norm after covid and it might help protect us from opaque computer-generated judgments."

Whether this will fully come to fruition remains to be seen, but it is hard to imagine that no long-term change to tech ethics and privacy arrangements will come about as a result of what is, for the majority of people, a new way of living and doing business over an extended period. Even when everything opens back up and business returns to normal, some conditions from any newly crafted ethical standards are bound to be left over.

While that may depend on how those standards shift over the next six to 24 months, these are decisions that will have to be made much more quickly than usual, with the health of the global population at the forefront of policymakers' minds.
