Alexandra Sanyal

Privacy versus Trust: Mitigating Fear in a Data-Driven World



In our data-driven world, the uses and abuses of personal data are on many people's minds. From targeted ads on social media to recurring instances of identity theft and credit card fraud, we find ourselves asking the same question: is there really such a thing as privacy anymore?


The COVID-19 pandemic, and our journey toward reopening, has deepened questions and fears around this topic — especially as contact tracing and data-collection mechanisms rise to the fore. But what are we really worried about? Wouldn't you rather share your personal information than risk contracting or spreading a deadly virus such as this one?


Perhaps we should take a step back and reexamine what really scares us: is it whether or not our data is private? Or rather, is it that we do not trust that our data is being used appropriately? Distinguishing between maintaining privacy and feeling comfortable trusting authorities is key to continuing to advance data-driven design and strategy while mitigating consumer fears.


First, let's define the two:

As the Oxford English Dictionary has it, our notion of privacy is related to "seclusion; freedom from interference or intrusion — as a matter of choice or right." Our desire for privacy is based on certain freedoms that allow us to choose what we want to share, with whom, when, why, and how. It's based on autonomy.


Our notion of trust, on the other hand, revolves around "a firm belief in the reliability, truth, or ability of someone or something" — in other words, a belief that what we fear will not come to pass, because we have placed our trust in someone. It's important to note that trust is chiefly followed by the preposition in, implying that trust is fundamentally connected to that which we are trusting — we trust in something, or someone. It's based on a two-part relationship.


When we apply these two concepts to the topic of personal data, we begin to see a difference: privacy is difficult to define and maintain because it's effectively "us against the world" — that is, the act of hiding our data from whatever might abuse it. What makes this scary is that we don't really know what we want privacy from (who, what, etc.). All we know is that our data is vulnerable, in the sense that it may not always be visible to us alone.


Trust, on the other hand, is a much more human concept, because it implies that we are engaging with some other person, or mechanism, in an agreement: we've made the choice to give them access to our vulnerable data because we believe it will not be misused.


Thus, perhaps the question we ought to be asking is not "is my data private?" but rather "can I trust who or what is using my data?" For data-driven technologies, this means creating a trusting partnership between a company and a client — human to human. Although a company may use a device to collect and process a client's data, it is the company's responsibility to respect the client's boundaries, and their privacy, while still doing its job.


As contact tracing develops and companies begin to monitor individuals' health data, it's imperative that employers and employees establish a strong foundation of trust so that tracing can be conducted efficiently and respectfully.


Trust will not solve every issue related to data privacy, but without it, we stand no chance of outgrowing data-driven fear.





© 2020 Cambridge Blockchain.  All rights reserved.
