Social Bias in Information Retrieval

Source: Addressing Social Bias in Information Retrieval, in Experimental IR Meets Multilinguality, Multimodality, and Interaction

"Many algorithmic processes are opaque and the reasons for this may vary. For instance, it is more often than not difficult to interpret results from models induced by new machine learning techniques such as deep learning." (This is especially why we need to work on explainability.)

As a counterargument, there are social and economic challenges to achieving algorithmic transparency, such as the need for developers and owners of such processes to protect trade secrets, or users' privacy concerns.

Friedman and Nissenbaum's definition of bias: a computer system is biased if

  1. its results are slanted in unfair discrimination against particular persons or groups
  2. the observed discrimination is systematic within the system

Indeed, in recent years, many researchers have found that search engines, through the result sets they present to users, tend to reinforce a view of the social world that aligns with the status quo.
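Friedman and Nissenbaum's definition is qualitative; as a minimal sketch of how "systematic slant" in result sets might be checked empirically (assuming hypothetical group labels per ranked document and a reciprocal-rank attention model, neither of which comes from the source), one could compare position-discounted exposure across groups over repeated result lists:

```python
# Illustrative sketch only (not from the paper): one way to probe whether a
# ranked result list is *systematically* slanted toward a group, in the spirit
# of Friedman and Nissenbaum's definition. Group labels and the rank-discount
# model are assumptions made for this example.
from collections import defaultdict

def exposure_by_group(ranked_groups, discount=lambda rank: 1.0 / (rank + 1)):
    """Return each group's share of position-discounted exposure.

    ranked_groups: group label of the document at each rank (rank 0 = top).
    discount: how quickly user attention decays with rank (reciprocal here).
    """
    totals = defaultdict(float)
    for rank, group in enumerate(ranked_groups):
        totals[group] += discount(rank)
    norm = sum(totals.values())
    return {group: value / norm for group, value in totals.items()}

# Hypothetical result lists for the same query across repeated runs; a skew
# that persists across runs, not just in one list, is what "systematic" means.
runs = [
    ["majority", "majority", "minority", "majority", "majority"],
    ["majority", "majority", "majority", "minority", "majority"],
]
for shares in map(exposure_by_group, runs):
    print(shares)
```

If one group's exposure share stays well above its share of the relevant documents across queries and runs, the slant is systematic rather than incidental, which is the second condition in the definition above.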

Related: To Live in Their Utopia, Data Distributions, Algorithms of Oppression