User:CognitiveMMA/sandbox
There are different problem-solving methods, each effective in a different domain. Consensus is only one such method, and it fails where a topic is too new or disruptive for any consensus to exist. Wikipedia isn’t meant to promote unsupported ideas, but avoiding new and disruptive topics altogether could make Wikipedia useless. On the other hand, strongly prioritizing consensus-based reasoning biases decisions heavily towards System 1 (intuitive, fast) reasoning, which works well in some cases but terribly in others. When it works terribly, we call it “groupthink”. Is it time to begin exploring how to evolve Wikipedia’s editing process so that it takes these different problem-solving domains into account?
Some useful ways to understand these different problem-solving domains are summarized in Table 1 below:
| Property Characterizing Problem Domain | Description |
| --- | --- |
| Signal-to-Noise Ratio | Ranges from a low signal-to-noise ratio, in which the solution resides with as few as one expert able to calculate the optimal solution, to a high signal-to-noise ratio, in which the solution is distributed throughout the entire group and is obtained through consensus. |
| Bandwidth | Ranges from low bandwidth, in which each participant can understand the entire problem definition and the entire solution, to high bandwidth, in which each participant can understand only a small part of the problem definition and a small part of the overall solution. Efforts must be coordinated so that each participant addresses a different part of the problem, and the partial solutions must be coordinated to create the overall solution. |
| Relevance and Engagement (Resonance Frequency) | Ranges from low relevance and engagement (low resonance frequency), in which the problem is outside the range of topics the group is motivated to solve and/or knowledgeable about, so it does not resonate, to high relevance and engagement (high resonance frequency), in which the problem is within that range, so it resonates. |

Table 1: Sectors of human decision-making.
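To make the taxonomy in Table 1 concrete, here is a minimal sketch in Python of how a problem might be scored along these three dimensions and routed to a matching problem-solving method. The class name, the 0.5 thresholds, and the method labels are hypothetical illustrations, not part of any existing Wikipedia tooling:

```python
from dataclasses import dataclass

@dataclass
class ProblemProfile:
    """Hypothetical 0.0-1.0 scores along the three dimensions in Table 1."""
    signal_to_noise: float  # low: the answer sits with one expert; high: it is distributed across the group
    bandwidth: float        # low: one person can grasp the whole problem; high: it must be decomposed
    resonance: float        # low: the group is neither motivated nor knowledgeable; high: it is both

def suggest_method(p: ProblemProfile) -> str:
    """Route a problem to a decision method based on where it falls in the taxonomy.

    The 0.5 thresholds and method labels are illustrative assumptions only.
    """
    if p.resonance < 0.5:
        return "recruit or defer: the group is unlikely to engage productively"
    if p.bandwidth >= 0.5:
        return "coordinated decomposition: split the problem and integrate partial solutions"
    if p.signal_to_noise >= 0.5:
        return "consensus: the answer is distributed and everyone can see the whole problem"
    return "expert review: a small number of specialists can calculate the answer"

# Example: a well-understood, engaging topic that any editor can grasp end to end
print(suggest_method(ProblemProfile(signal_to_noise=0.9, bandwidth=0.2, resonance=0.8)))
```

The point is not the specific thresholds, but that an explicit profile of a problem could, in principle, determine which method is applied, rather than defaulting to consensus everywhere.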
Recent research in the biological sciences is beginning to reveal that nature exhibits a collective intelligence far more powerful than any collective intelligence humans have been able to build into a software platform, because this natural intelligence optimizes collective outcomes for every participant in the group.
In nature, complex systems such as multicellular organisms exhibit emergent behaviors where individual components work collaboratively, optimizing the overall outcome for the system. This coordination, akin to a form of collective intelligence, ensures the survival and functionality of the organism, with each cell contributing to the stability of the whole.
This is a profoundly important idea to spread, because the greatest and most existential challenges facing humanity are also problems of optimizing collective outcomes for each participant in a group. It is also important because it can be demonstrated that current human decision-making processes generally lack this capacity to optimize collective outcomes, since doing so requires specific infrastructure that is missing from platforms such as Wikipedia. The fact that nature solved this problem billions of years ago, and that we can potentially model and copy that solution, is meaningful to Wikipedia if true.
How can Wikipedia’s processes leverage this General Collective Intelligence (GCI) so that problem-solving approaches such as consensus aren’t applied where they don’t function well, but are applied where they do? And how can awareness be spread that this need for a meta-collective-intelligence strategy (i.e. the need to integrate GCI functionality) even exists?
Some symptoms of the need for GCI functionality are:
• Edit Warring: This occurs when editors repeatedly override each other's contributions, reflecting a lack of consensus or an inability to collaboratively resolve content disputes.
• Biased Editing: When editors with strong personal biases influence content, it can lead to a skewed representation of topics, rather than a balanced and neutral point of view.
• Inconsistent Application of Rules: Wikipedia's guidelines and policies may be applied inconsistently, leading to uneven content quality across different articles.
• Difficulty in Managing Complex or Controversial Topics: Certain topics, especially those that are new, complex, or controversial, may struggle to find a stable and balanced representation due to varying expert opinions and the rapid evolution of information.
The reality is that a variety of valid reasoning processes and sources of information might be used in editing a Wikipedia page, often without any single page version standing out as clearly correct above the statistical noise.
Another, less visible, part of this problem is that individual editors lack the capacity to consider every possible permutation and combination of every component of every proposed edit in order to assess which is most fit, that is, which best optimizes the usefulness of a given Wikipedia page.
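To get a rough sense of the scale involved, here is a back-of-the-envelope sketch; the figure of 40 proposed changes is an arbitrary assumption, chosen only to show how quickly the space of candidate page versions outgrows what any individual editor could evaluate:

```python
import math

# Hypothetical figure, chosen only to illustrate the growth rate;
# it does not describe any particular article or edit history.
proposed_changes = 40

subsets = 2 ** proposed_changes               # each proposed change is either included or not
orderings = math.factorial(proposed_changes)  # if the changes also interact depending on order

print(f"{subsets:,} possible subsets of changes")   # 1,099,511,627,776
print(f"about {orderings:.2e} possible orderings")  # ~8.16e+47
```

Even at this modest scale, exhaustively evaluating every candidate version is out of the question for a single editor.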
Somehow, natural systems formed from groups of entities have developed the capacity to adapt, through various interventions, and discover such self-sustaining solutions. This is a kind of collective intelligence. It is a meta-collective intelligence, or “general collective intelligence”, because it is potentially capable of adapting to incorporate any way of defining a given problem, and of incorporating whichever problem-solving strategy or solution works best (is most fit).
One of the most remarkable things about this collective intelligence is the ability of natural systems to detect which, among a stunningly vast number of options, is most fit for achieving an outcome, despite the statistical noise.
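One way to build intuition for this is a toy statistical sketch. It does not model any biological mechanism; it only shows how aggregating many independent, very noisy evaluations tends to single out the truly fittest option even when each individual evaluation is nearly worthless on its own. All the numbers below are arbitrary illustrative assumptions:

```python
import random

random.seed(0)

# Toy model: 20 candidate page versions with hidden "true" fitness values,
# where every individual evaluation is dominated by noise.
num_versions = 20
true_fitness = [i / (num_versions - 1) for i in range(num_versions)]  # the best version has fitness 1.0

def noisy_evaluation(version: int) -> float:
    """A single evaluation: the true fitness buried under noise far larger than the differences."""
    return true_fitness[version] + random.gauss(0, 1.0)

def fitness_of_chosen_version(evaluations_per_version: int) -> float:
    """Average many independent evaluations per version, then pick the version that looks best."""
    averages = [
        sum(noisy_evaluation(v) for _ in range(evaluations_per_version)) / evaluations_per_version
        for v in range(num_versions)
    ]
    chosen = max(range(num_versions), key=lambda v: averages[v])
    return true_fitness[chosen]

for n in (1, 10, 100, 10_000):
    print(f"{n:>6} evaluations per version -> chose a version of true fitness {fitness_of_chosen_version(n):.2f}")
```

With a single noisy evaluation per version the selection is essentially random, while with many independent evaluations the chosen version’s true fitness tends towards 1.0, the best available.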
Literature across various domains supports the argument that a Wikipedia platform updated to incorporate General Collective Intelligence functionality, one able to act as a meta-collective intelligence by integrating multiple sources of collective intelligence, could be vastly more effective than any individual editor, especially in a complex environment like Wikipedia.