Trust is essential if contact tracing efforts are to succeed. However, the recent revelation that Singapore’s contact tracing data is available to law enforcement for criminal investigations, despite previous reassurances to the contrary, has caused alarm both locally and internationally. Will this alarm, and the mistrust it has provoked, pose difficulties for Singapore’s fight against the ongoing pandemic?

It is important to situate these worries in historical context. Contact tracing for the sake of public health has a long and storied history, raising moral questions aplenty. Disadvantaged communities around the world, in particular, have long been suspicious that contact tracing efforts might perpetuate – and mask – further discrimination and exploitation. Take, for instance, contact-tracing efforts in the USA in the 1980s, during the early years of the AIDS epidemic. At a time when homosexuality was still illegal in many states, compiling lists of gay men and their sexual partners – who were believed to be the source of the epidemic – felt risky and stigmatising. Even as public officials argued that AIDS patients had a moral duty to disclose their sexual history, there was widespread distrust that a public health issue was being used as cover to justify discrimination against the gay community.

Optimists might say that, as a global society, we have learnt from these past mistakes, and that the art of trust-building has been mastered in contemporary contact-tracing efforts. However, the recent situation with TraceTogether shows that this might not be the case. It is, of course, reassuring that the Government appears to be listening to concerned voices – issuing a mea culpa acknowledging its mistake, and a series of clarifications about the use of TraceTogether data. However, even as local officials navigate the situation, we see two important caveats in how trust in the TraceTogether app has been approached. These caveats need to be identified if the violated trust of many citizens is to be successfully repaired. In doing so we rely on decades of behavioural research, which points to one crucial assumption that any trust-building effort has to work with: “Trust is in the eye of the beholder!” This assumption means that it is not the developer or the government that decides whether people should trust the TraceTogether app; it is the user who decides whether they perceive the app as trustworthy. And it is in this perception process that we see problems so far:

First, the government’s approach to building trust was, and continues to be, to point to certain technical features of the TraceTogether app and token and to label them as trustworthy. Early last year, experts were brought in to inspect the TraceTogether token and attest to its privacy-preserving features. And, in response to the backlash after the recent announcement, public officials insisted that citizens could easily write in to have their stored TraceTogether data deleted. While this approach to building community trust is empirically backed, it breaks down in this particular context because user trust in contact-tracing technology is not determined by its privacy-preserving features. Rather, users assign trustworthiness to a piece of technology by assessing the trustworthiness of its creator – in this case, the government. In other words, technology has no intentions, but the person, or government, that creates the technology does. If the creator of the technology is worthy of trust, the technology is too.

Second, trust and consistency go hand in hand. Trust develops when people perceive a pattern of consistency between word and deed. So, walk the talk – alignment between talk and action is essential, and it reduces the need for coercive state imposition. Inconsistencies between word and deed not only make people doubt your actions and decisions – challenging your perceived integrity – but also cast doubt on your competence. Much has been said, for instance, about how the public was earlier assured that TraceTogether data would be used only for contact tracing, only to have this contradicted by the recent acknowledgement that the data would, in fact, be used for certain criminal investigations. Further, insisting that TraceTogether data is necessary for criminal investigations also seems inconsistent, given the expectation that the use of TraceTogether is temporary and will no longer be necessary once the pandemic abates. Elsewhere, legal approaches have been used to strictly delimit the use of contact-tracing data to the pandemic. In New York, for example, a bill was recently signed into law that protects the confidentiality of contact tracing information. Given how the discourse surrounding TraceTogether has, from the beginning, highlighted its privacy-preserving aspects and its centrality in the fight against COVID-19, it is surprising that such legal interventions were not pursued until very recently.

With these two points in mind, it is easy to understand where things went wrong early on. To begin with, there was an over-reliance on the idea that emphasising the technical features of the contact tracing app could induce trust in users. This, combined with a disregard for the consistency principle in developing user trust – the government did not “walk the talk” – resulted in uncertainty about the real intent of the technology. Ultimately, these perceived violations of trust undermine use of the app and, in turn, contact tracing efforts. Unfortunately, a further consequence is that any future effort by the government in this area is now very likely to be viewed in light of the recent violations of trust.

Our own research on what it takes to repair trust may offer some advice on how to move on now that the damage is done. Citizens want to see what they can expect from the government in the future. In this process it is crucial that the government succeeds in communicating that no bad intentions were involved, and that it possesses the competence to improve the use of TraceTogether. The first lesson, then, is that denial of any kind will not help in this case. The government should accept that its citizens perceive it with distrust, and take responsibility for ensuring that their concerns are addressed. Promises alone, however, will not do the job. Solutions will need to be offered and explained in detail – not only to improve the deployment of the app, but also to help citizens feel safe interacting with the government again. And, finally, it is important to ensure that these solutions are acted on consistently. It is encouraging to already see a few acknowledgements of error, and the beginnings of legal interventions that delimit the use of TraceTogether data. However, as active debate continues – both in parliament and in the digital public sphere – more concerns and points of distrust are likely to surface. It is therefore vital that these are addressed quickly and effectively in the ways we have suggested, so that TraceTogether can play a meaningful and lasting role in Singapore’s fight against COVID-19.