WikiProject Intellectual Diversity: Difference between revisions

From Encyclosphere Project Wiki

Revision as of 18:23, 7 April 2026

WikiProject Intellectual Diversity aims to help reinforce Wikipedia's original, firm commitment to intellectual diversity. In particular, we seek to uphold the original principles of fair decision-making (giving new users and minority views a fair hearing), genuine neutrality (not taking a position on controversial issues), transparent governance (which would spread out the authority of an intellectually homogeneous inner circle), and responsiveness to the public (which presently includes viewpoints not found in great numbers on Wikipedia).

Goals

We seek to help Wikipedia to:

  • Ensure fair and open decision-making and governance. Wikipedia's processes for reaching decisions on both editorial and personnel questions need to be made more transparent, open, and provably fair. We stand against the idea that those who close hard-fought discussions are thereby declaring a "consensus." We also want to make it easier for Wikipedia to make significant policy changes where warranted. Therefore, we will monitor policies governing debate closure and the role of closers, proposals for legislative or formal voting processes, any discussion of the difficulty of making significant policy changes or of governance reform generally, and the application of WP:IAR ("Ignore all rules") as it affects fairness.
  • Broaden the scope of permissible sources. Over the years, Wikipedia has continually tightened the screws on what sources may be used. We stand for a movement in the opposite direction: permitting responsibly written sources that represent the views of currently disfavored ideologies, parties, nationalities, religions, and other viewpoints. So we will monitor policies on reliable-source designation, the perennial sources list, source blacklists, the emphasis on secondary sources over primary sources, and any debate that turns on whether a source's exclusion reflects ideological rather than quality-based judgment.
  • Roll back contradictory qualifications to the neutrality policy. The original Wikipedia neutrality policy was at its best when it represented competing viewpoints fairly. Our view is that attention to "due weight" and "fringe theories," and the close connection between these determinations and tendentious policies about "reliable sources," tends to undermine robust neutrality. Thus, we will monitor discussions about NPOV policy in general; "due weight," "fringe theories," and related doctrines; the treatment of viewpoints that dissent from Western, academic, secular, or mainstream-media consensus; any discussion in which one side is arguing for the right to be represented at all; notability and inclusion standards; and proposals for alternative or competing article frameworks.
  • Rein in over-aggressive blocking by Administrators, holding the powerful to higher standards of accountability. We are concerned by the tendency of many Administrators to exercise authority aggressively and unfairly. So we will monitor policies and high-profile actions in these areas: blocking (especially indefinite blocks), admin elections (RfA), de-adminship, the accountability of high-authority users (CheckUsers, Bureaucrats, ArbCom, Oversight members, and others), and whether those exercising significant power are identifiable and subject to review. We also seek ways to shed more light on influence over important editorial decisions, particularly influence associated with paid editing (especially by PR firms or government actors) and with third parties operating through private back-channel discussions. If such influence cannot be reliably reined in, perhaps it should be officially permitted, to create a fair playing field.
  • Work to retain editors and create a more inclusive atmosphere. We are very concerned that many editors are driven away by aggressive threats; we want editors to be treated with more genuine kindness by long-time editors and other powerful voices here. So we will monitor blocking threats and the policies invoked in making such threats, including canvassing and coordination policy, policy on the use of AI (especially as a pretext to block new users), and civility and conduct rules as they affect editorial participation.
  • Engage the public more. Our view is that Wikipedia's current intolerance and narrow scope drive away many potential contributors. Thus, we will monitor proposals to enable reader ratings and feedback systems; how Wikipedia handles public complaints; and any mechanism by which a broader audience can assess article quality or neutrality. We may also monitor or help improve proposals for alternative articles on popular topics.

We will work to advance these principles not all at once—dramatic change not being how Wikipedia works—but iteratively, in concrete, achievable steps.

Scope

Our scope extends to policy pages, guidelines, and essays relevant to these categories of reforms. We do not assert any authority over such pages. Rather, we will be reviewing and proposing changes to them. We will do so in various appropriate venues where issues are being discussed and principles applied, including talk pages of major policies and guidelines (e.g., Wikipedia talk:Neutral point of view, Wikipedia talk:Reliable sources, and Wikipedia talk:Verifiability).

We will pay close attention to RfCs, the Village Pump, dispute resolution venues, and occasionally ArbCom proceedings. We will also track relevant noticeboards where policy interpretation and enforcement questions are frequently addressed, including the Administrators' noticeboard, ANI, the Reliable sources noticeboard, the NPOV noticeboard, the Conflict of interest noticeboard, and the BLP noticeboard.

Policy Scanner

The Policy Scanner, which we maintain, monitors dozens of policy talk pages, noticeboards, RfCs, and Village Pump discussions daily, flagging items relevant to our mission. The results are posted below and archived periodically. The scanner is a traditional Ruby script, augmented by LLM output at key points, but a human reviews all output and posts the results by hand. Of course, anyone may follow the links provided and write whatever is permitted on the pages. We do not instruct people what to say or how to vote.
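The scanner's actual source is not reproduced on this page. Purely as an illustrative sketch, and with the caveat that the term list, function names, and data shapes below are our own assumptions rather than the script's real internals, a keyword-based relevance filter in Ruby might look like this:

```ruby
# Hypothetical sketch of a relevance filter for discussion items.
# Fetching pages from the MediaWiki API and any LLM augmentation are
# out of scope here; this only shows the keyword-flagging step.

require "json"

# Watch terms drawn from the goals above; purely illustrative.
WATCH_TERMS = [
  /reliable sources?/i, /perennial sources?/i, /\bNPOV\b/,
  /due weight/i, /fringe theor/i, /indefinite block/i,
  /\bRfA\b/, /ignore all rules/i, /paid editing/i
].freeze

# Takes an array of hashes with :page and :excerpt keys and returns
# the subset that matches at least one watch term, annotated with the
# patterns that hit.
def flag_relevant(items)
  items.filter_map do |item|
    text = "#{item[:page]} #{item[:excerpt]}"
    hits = WATCH_TERMS.select { |re| text.match?(re) }
    { page: item[:page], matched: hits.map(&:source) } unless hits.empty?
  end
end

if __FILE__ == $PROGRAM_NAME
  sample = [
    { page: "Wikipedia talk:Reliable sources",
      excerpt: "Proposal to revise the perennial sources list" },
    { page: "Wikipedia talk:Manual of Style",
      excerpt: "Serial comma discussion" }
  ]
  puts JSON.pretty_generate(flag_relevant(sample))
end
```

In this sketch, only the first sample item would be flagged (its page title and excerpt both match watch terms); the Manual of Style item would be dropped. A real scanner would also need to deduplicate repeat hits across daily runs before handing candidates to a human reviewer.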