Towards a Sociology of Decentralisation

Technical advancements lead not only to changes in business processes but also to changes in how stakeholders behave, think and interact. The main problem is identifying the potential stakeholders in the first place so that the expected impact can be evaluated and controlled. In the case of blockchain, who counts as a stakeholder differs greatly depending on whom you ask. As a result, projects might only assess the effect they have on their own developer community and user base, disregarding communities unknown to the developers. This post will therefore not deal with any specific stakeholder, but with the sociological implications of decentralisation in general, the psychological dispositions that shape them, and the care governance processes have to take as a result.

TLDR: Who is included as a stakeholder varies across projects. However, independent of who the stakeholders are, decentralisation will have an impact on the sociological level of blockchain governance.

1. Reassurance and control

Humans are social, group-oriented beings, and nearly every group has a leader who takes responsibility and offers guidance. This can mean greater organisation and security for individual group members: in times of doubt, they have a leader, a central oracle of sorts, to turn to for advice. At the same time, if things do not go well, the leader is expected to take full responsibility and might even be blamed. Attributing positive and negative outcomes to the behaviour of individuals rather than to external circumstances is known as the fundamental attribution error: if my group is doing great, it is because we are awesome; if things go wrong, someone else must have messed up.

In decentralised applications, individuals can take full control over their financial interactions, i.e. store and transfer value on a trustless basis, where every interaction is invoked and executed solely by the user. The user has complete control and complete responsibility (the short sketch below illustrates what this looks like in practice). Exactly this runs counter to human group norms. It might be great for individuals who have trained the skills necessary to take that responsibility, but it tends to be scary for everyone else, who will feel far more comfortable in a society or organisation that allows them to shift accountability onto another entity. Ultimately, when we discuss user experience in decentralised applications, this problem has to be addressed.
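To make the point concrete, here is a minimal sketch of what "complete control" means in practice, assuming web3.py (v7+) and a hypothetical local node at http://localhost:8545; the private key and recipient address are placeholders for illustration only. The user alone builds, signs and broadcasts the transfer, and once it is mined there is no support desk to reverse it.

```python
# Minimal sketch of a self-custodied transfer, assuming web3.py v7+ and a node
# exposed at http://localhost:8545. Key and recipient are illustrative placeholders.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))

# The user alone holds the private key; nobody can act on (or undo) their behalf.
account = w3.eth.account.from_key("0x" + "11" * 32)            # hypothetical key
recipient = "0x000000000000000000000000000000000000dEaD"        # hypothetical recipient

tx = {
    "to": recipient,
    "value": w3.to_wei(0.1, "ether"),
    "gas": 21_000,
    "gasPrice": w3.eth.gas_price,
    "nonce": w3.eth.get_transaction_count(account.address),
    "chainId": w3.eth.chain_id,
}

signed = account.sign_transaction(tx)   # only the key holder can produce this signature
tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction)   # .rawTransaction on web3.py < 7
print("broadcast:", tx_hash.hex())      # once mined, responsibility sits entirely with the user
```

There is no "forgot password" flow and no chargeback in this picture: every safeguard a centralised service would normally provide has to be replaced by the user's own diligence.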

Centralised applications have the advantage of being able to respond to these social norms and provide the necessary support structures. Decentralised applications, by contrast, have to provide enough education and support to empower users to understand their actions and take responsibility for them, even when they mess up. The main problem here is that we can hardly expect users to change behaviour that took centuries to form.

TLDR: Humans have developed a reliance on central entities for reassurance and control. While centralised applications can cater to these trained habits, decentralised applications still have to account for this intrinsic reliance on authority.

Centralised organisations usually offer customer support

2. Us vs. Them

As humans, we like to identify with a culture, a style, a belief system and so on. This is ultimately how we come up with stereotypes and how we define our heritage and origin. In short, stereotypes are social structures in the brain that we have developed about each other over time. Generally, stereotypes help us in daily life: they allow us to belong to a group and feel included, to recognise situations and group norms, and to make sense of our surroundings. The downside is that we tend to suppress or devalue other groups while making our own stand out, often on entirely irrational grounds. Arguably, this is also one of the reasons why humans developed different governing systems, each responding to different beliefs, values and ideologies in order to represent different groups.

In crypto, similar group behaviour can be observed among user communities, for instance Bitcoin vs. Ethereum vs. EOS. Within larger groups, we will always find smaller sub-groups with differing opinions; one example is the debate over scaling Ethereum through layer-one versus layer-two solutions, with each proposal responding to the differing priorities of users, developers and researchers. Usually, if the larger group comes under attack, the sub-groups put their differences aside to support each other.

While it is easy to identify with, or discriminate against, the visible traits of known identities, it is much harder with anonymous groups: if a person remains unknown, you cannot pick out aspects of their personality or beliefs to disagree with. Furthermore, the documentation and group discussions of individual projects are usually openly accessible, so anyone who is interested can engage in discussions on updates and governance structures. The danger lies in projects that operate behind closed doors and discourage external contribution, platforms that require membership applications before users may engage, or those that outright censor individuals who voice opinions that diverge from group norms. Any project, decentralised or centralised, therefore has to take special care to prevent censorship.

TLDR: The distinction between in-group and out-group norms allows individuals to belong to a group, but also to discriminate against each other. Highly decentralised projects potentially find it easier to include all stakeholders than consortium blockchains do. (For a comprehensive explanation of the difference, please refer to my previous post.)

Communication barrier between groups

3. Shifting responsibility

Generally, if no one has assigned an individual a precise task, it is easy to ignore the indirect responsibilities that come with one’s decision-making. A classic example is civilians who walk past an accident, or past someone in need, without offering help; they are referred to as bystanders. This is why, after a crash, it is better (if you can) to call out a specific individual, e.g. ‘The man with the black hair and the brown jacket, please help me up.’ You are assigning him direct responsibility for a task, and if he does not act on it, he can be held directly responsible.

This relates to the previous two points. On Twitter, at conferences and in talks, I often see people shifting responsibility away from themselves onto others or onto unknown entities, and unknown entities cannot assume responsibility. It happens, for example, whenever the conversation is redirected to topics outside a person’s research focus or interest. In many cases, once the problem becomes too immense, a later generation is simply expected to understand it and take care of it. Similar patterns can be observed with regulation, economic or environmental disasters, and failed social interactions.

Developer teams building the decentralised organisations and social structures of tomorrow aim to coordinate a broad group of unknown individuals. Not engaging in efficient governance design from an early stage onwards will only push stakeholders into defining interaction strategies and processes themselves, perhaps even unconsciously. These processes may then evolve arbitrarily over time, without clear goals, structures or mechanisms in place to counter the potential negative impacts. Any project has only limited room for trial and error in its decision-making, and changing unwanted or dangerous habits and mindsets at a later stage is significantly more difficult and time-consuming, especially when there is no central authority to make decisions.

TLDR: Shifting responsibility is trained behaviour. However, in the case of decentralised platform design and governance, it can have long-lasting effects that are difficult to counterbalance since various (often undefined) stakeholders control the decision making.

4. The ease of anonymous decision making and privacy

Throughout past centuries, humans had neither the possibility nor the incentive to engage with strangers anonymously. Instead, we built relationships based on trust, favours and repayment, which made us dependent on each other to some degree. On decentralised platforms, individuals depend instead on the incentive structures and deterrents in place. While most centralised platforms require identity registration, decentralised platforms make it possible to stay largely anonymous; arguably, once you cannot be held accountable, it becomes significantly easier to act purely in your own interest.

One concern with publishing all developer talks and updates is that the individuals involved are forced to take on personal accountability. Consequently, they might not act in the platform’s best interest whenever that forced accountability does not favour all stakeholders, as it does in most cases. The question arises: who is allowed to stay anonymous, and who does not have that privilege because of their position as developer, researcher or community manager?

Ultimately, this section is for everyone who engages on decentralised platforms, or who is involved in their decision-making processes, to represent their own or their group’s interests. Users seem to have an urge for a direct connection, to put a face to a platform or dApp. While this may engage more users and might allow for more effective decision-making, it can also end up limiting the development process.

TLDR: How much anonymity is actually favourable for a decentralised platform? Who is allowed to stay anonymous, and to what extent? While humans have an urge to connect and engage with known entities, indulging it might not be in the best interest of the platform’s design and evolution.

Overall…

There is little research into how the design and organisation of social systems affect individual behaviour and group norms, which leaves us with ‘what-if’ scenarios and hypothetical outcomes. This post provides a starting point, which I aim to expand on further, and I hope it engages others to contribute. When we talk about user experience and incentive structures, these issues are often touched upon but rarely clearly acknowledged. It is on every single stakeholder, whoever that might be, to think about their role and to maximise the positive impact they can have on group dynamics and on the social implications of decentralised platform design.

If you liked this article and would like to make sure others read it as well, don’t forget to give it a few claps 👏 (you can clap up to 50 times).

Special thanks to CleanApp for reviewing. Also, check out their stories in CryptoLaw!

For future thoughts, follow me on Twitter.
