
We should all care more about trust on the Internet


Trust should not be understood as the opposite of control, but as a shared good or resource that determines how productive any combination of control and freedom will be when getting things done collectively. High trust should be in everyone's interest, yet the current iteration of the internet and digital interactions often erode trust, causing real harm. Secure identity and individual integrity are key to a future where digital interactions foster trust.

Fostering Trust in the Digital Age: The Mission Behind Truid

We started Truid because of one conviction and one observation. We’re convinced that high-trust teams, organizations and societies are freer, more effective and more likely to produce outcomes that benefit all. We’ve also observed that the current iteration of the internet and digital interactions are not conducive to high levels of trust and, in fact, often erode trust on a downright dangerous scale.

As people move ever more of their interactions online, they trust each other less, trust news and facts less, and trust the economic and political institutions that underpin modern society less. This is no fringe phenomenon: it has real implications for people's lives and livelihoods and for the functioning of society as a whole, and research bears this out.

Meme: “Trust is good, but control is better”

Intelligent people disagree on what trust actually is and how to foster it. A famous quote, attributed to Lenin, says “trust is good but control is better”, positing a tradeoff between trust and control. Variants of this view underpin the theoretical frameworks of neoclassical economics and theories of governance and security, where people are seen as narrowly self-interested and deceitful when given the chance.

Actual empirical behavioral research (in e.g. game theory), however, shows that distrustful and deceitful behavior is typically contextual, and that it is a consistent trait of only a quite small minority of the population. Most people are actually quite trustworthy unless they’re put in a situation which is explicitly deceitful.

Trust as a Measure of Effective Collective Coordination

If trust isn’t the opposite of control, what is it then? We can view the level of trust as a measure of how successfully we coordinate when getting things done in a collective setting. To simplify, there are two ways to coordinate action to get things done:

  • Coordination can be top-down, where strict rules and prescriptive procedures determine how things get done. We can call this method CONTROL. It typically works well in predictable settings where it is clear what needs to be done, and it is often low-risk and resource-efficient.
  • Coordination can also be bottom-up, where decentralized decision-making and adjustment determine the outcome. We can call this method FREEDOM. It typically works well in unpredictable settings where it is unclear what needs to be done, but it is high-risk and resource-inefficient.

In practice, both control and freedom are needed to successfully coordinate towards achieving an objective, and typically several combinations of the two factors can produce similar outcomes. This can be described schematically as an isoquant curve, as used in economics to describe how different combinations of two factors of production lead to the same output.

Rebuilding Trust in the Digital Realm: Solutions for Identity and Integrity

Trust can then be understood as the position of the curve relative to the origin: for any combination of control and freedom, a higher level of trust will lead to a higher output (or less input will be needed to produce the same output). Trust is then a catalyst or “lubricant” for collective action, as well as a “technology” or resource that fosters productivity.
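As a rough illustration, consider a hypothetical Cobb-Douglas-style production function in which trust acts as a multiplier on whatever mix of control and freedom is used. The functional form and the numbers below are illustrative assumptions, a minimal sketch of the intuition rather than a formal model.

```python
# Illustrative sketch: a hypothetical Cobb-Douglas-style production function
# where output depends on the mix of control and freedom, scaled by trust.
def output(control: float, freedom: float, trust: float, alpha: float = 0.5) -> float:
    """Output from a given mix of control and freedom, with trust as a multiplier."""
    return trust * (control ** alpha) * (freedom ** (1 - alpha))

# Two different mixes can sit on the same isoquant (same output)...
print(output(control=4, freedom=1, trust=1.0))  # 2.0
print(output(control=1, freedom=4, trust=1.0))  # 2.0

# ...and a higher level of trust lifts output for any mix, i.e. the same
# output now needs less input: the isoquant moves toward the origin.
print(output(control=4, freedom=1, trust=1.5))  # 3.0
```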

To any organization, a resource that increases the value of all input factors is of course tremendously valuable. And yet, in the current technological iteration, we are experiencing an erosion of this resource in society as a whole. Why trust is eroding is of course a multi-factor problem, but two factors that clearly stem from how the internet works today are problems with Identity and threats to Integrity.

In the digital realm, it is difficult both to prove who you are and to know who you’re really interacting with. When identity is uncertain, fraudsters are hard to catch and the context makes narrow self-interest more reasonable, so people generally become less trustworthy.

This, combined with the inherently open structure of the internet and the fact that all interactions leave digital traces, also means that individual integrity is always at risk from monitoring and data theft. The closed systems of the internet giants, which overcome these problems on a superficial level (and this explains a significant part of their initial attractiveness), in some ways actually exacerbate the problem, since they are all built on ad revenue and thus on monetizing user data and analytics.

To start rebuilding trust on the internet, we need to move forward from the current setup (moving back towards a pre-digital past is hardly an option). We need a system for securely proving identity that safeguards integrity and is simple enough to use to allow for mass adoption. The benefits are significant on an individual level, but on a societal level they’re a game changer.

Join the conversation on LinkedIn -->