Filter bubbles and the commoditised individual

The relationship between the individual and the digital realm is, overwhelmingly, ‘post-modernist.’ By this, it can be understood that the relationship between society and digital worlds is one where “people… are, in effect, consuming themselves … their desires, sense of identity, and memories are replicated and then sold back to them as products” (McCaffery 1992, pg. 6). In particular, algorithms “amplify ideological segregation by automatically recommending content an individual is likely to agree with” (Flaxman, Goel & Rao 2016, pg. 299). In this way algorithms effectively ‘sell’ the individual’s ‘sense of identity’ back to them, entrapping them in an echo chamber that forces the individual to “consume themselves” (McCaffery 1992, pg. 6). Crawford’s (2016) case study of Amazon’s ‘recommended products’ function demonstrates how algorithms “claim to know a public” and “calculate publics” (Crawford 2016, pg. 80). The ‘filter bubbles’ (Pariser 2011) created by such algorithms are unwittingly accepted by individuals.

This leads to a conundrum best described by McLuhan’s analogy of “fish being entirely unaware of water, insofar as water is environmental” (as quoted in Strate 2017, pg. 245). Being immersed in the digital sphere means that the internet becomes “autocratic… making decisions without our knowledge, invisible to us, presenting a singular worldview” (Crawford 2016, pg. 82). Facebook advertisements exemplify the burgeoning impact of algorithms and how the individual’s sense of identity is literally ‘sold’ back to them. Much like Amazon, Facebook targets advertisements at a “calculated public” (Crawford 2016, pg. 80), or rather a ‘calculated individual.’ Using my own Facebook experience as a case study, it is clear that a startlingly sexist archetype is being sold to me: my feed is inundated with advertisements from the cosmetic, fashion and fitness industries.
Through McLuhan’s theoretical lens, this case study can be seen as “technology tak[ing] command only because human beings cede their responsibility, not willingly but out of ignorance” (as quoted in Strate 2017, pg. 245). Immersed as we are in social media, like ‘fish in water,’ the act of scrolling through Facebook is normalised; upon closer inspection, however, it is clear that this behaviour perpetuates a ‘passive’ acceptance of a “singular” and “calculated” (Crawford 2016, pg. 80) worldview. Algorithms thereby commoditise the digital public, ‘selling’ the individual a ‘sense of identity.’


  • Crawford, K (2016): “Can an Algorithm be Agonistic? Ten Scenes from Life in Calculated Publics”, Science, Technology, & Human Values, Vol. 41 No. 2, pp. 77-92
  • Flaxman, S, Goel, S & Rao, J (2016): “Filter Bubbles, Echo Chambers, and Online News Consumption”, Public Opinion Quarterly, Vol. 80, pp. 298-320
  • McCaffery, L (1992): “Introduction: The Desert of the Real”, Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Science Fiction
  • Pariser, E (2011): The Filter Bubble: What the Internet Is Hiding from You, Penguin Press, New York
  • Strate, L (2017): “Understanding the Message of Understanding Media”, Atlantic Journal of Communication, Vol. 25 No. 4, pp. 244-254
