In his 2013 essay “From Digital Natives to Digital Wisdom,” Marc Prensky claims that an important aspect of exercising digital wisdom is actively seeking out alternative perspectives and capitalizing on the enhanced access to those perspectives that technology affords. I agree that wisdom in any sphere of life is closely tied to seeking out and listening to many different voices. However, algorithms, data tracking, cookies, and social media feeds do not necessarily set up the modern internet user for success in this arena; instead, we often run the risk of being stuck in our own digital echo chambers.
Let us first consider the Facebook News Feed. Facebook is the biggest social network on the planet, with over 2.7 billion monthly active users worldwide (Clement, 2020). For the vast majority of users, the News Feed serves as the home base and primary content organizer, ultimately determining how users interact with the platform at any given moment, including the content they consume or engage with. What shows up on a news feed is determined by a complex algorithm, but the end goal is to show the user content that they will find meaningful. Facebook determines this with a relevance “score,” assigned based on past behaviors (posts, likes, replies, comments, and shares on a user’s profile), past engagement with content from certain publishers, and whether the content has been shared from within the user’s friend network (Rubin, 2020). In short, the more a user interacts with the things they like or are interested in, the more opportunities they will have to engage with that same type of content in the future.
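To make the mechanics concrete, here is a minimal toy sketch of relevance-style feed ranking. Facebook’s actual algorithm is proprietary and vastly more complex; the signals, weights, field names, and numbers below are all illustrative assumptions, chosen only to show how past engagement compounds into future exposure.

```python
# Toy relevance scoring -- all weights and data fields are hypothetical,
# not Facebook's real (proprietary) algorithm.

def relevance_score(post, user):
    """Score a post for a user from engagement-style signals."""
    score = 0.0
    # Past engagement with this publisher (likes, comments, shares).
    score += 2.0 * user["publisher_engagement"].get(post["publisher"], 0)
    # Shares from within the user's friend network carry extra weight.
    if post["shared_by_friend"]:
        score += 3.0
    # Topics the user has already interacted with.
    score += sum(1.0 for t in post["topics"] if t in user["liked_topics"])
    return score

user = {
    "publisher_engagement": {"LocalNews": 4, "SportsDaily": 1},
    "liked_topics": {"politics", "soccer"},
}
posts = [
    {"publisher": "LocalNews", "shared_by_friend": False, "topics": ["politics"]},
    {"publisher": "SportsDaily", "shared_by_friend": True, "topics": ["soccer"]},
    {"publisher": "ArtWeekly", "shared_by_friend": False, "topics": ["painting"]},
]
# Rank the feed highest-score first: content similar to what the user has
# already engaged with rises to the top -- the feedback loop behind echo chambers.
feed = sorted(posts, key=lambda p: relevance_score(p, user), reverse=True)
```

Note what the sketch makes visible: the unfamiliar publisher (here, the hypothetical "ArtWeekly") scores zero and sinks to the bottom of the feed, even though nothing about it is low quality. The ranking simply has no signal for content the user has never engaged with.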
Facebook’s attempt to narrow the scope of what users interact with is not innately bad; after all, it helps regulate the overwhelming amount of content available on the internet and aims to make the content on offer meaningful and hyper-relevant to the user. But this narrowing of scope carries a risky consequence: the algorithm does not inherently help the user see, read, or hear alternative perspectives and thus become more digitally wise. For most users, the algorithm produces a fairly robust echo chamber in which the consumer is constantly exposed to ideas, sources, and content that they probably already like or subscribe to. Especially where news content is concerned, it is a breeding ground for confirmation bias.
So my question is this: how do we “beat the algorithm” in digital spaces (both literally and metaphorically speaking) in order to remain mindful of our information sources and consider the underlying assumptions behind what we consume? How do we self-identify potential echo chambers and intentionally seek out alternative perspectives?
I believe the answer lies (at least partially) in media literacy education. According to Hobbs (2010), media literacy is a subset of digital literacy skills involving:
- the ability to access information by locating and sharing materials and comprehending information and ideas;
- the ability to create content in a variety of forms, making use of digital tools and technologies;
- the ability to reflect on one’s own conduct and communication by applying social responsibility and ethical principles;
- the ability to take social action by working individually and collaboratively to share knowledge and solve problems as a member of a community;
- the ability to analyze messages in a variety of forms by identifying the author, purpose, and point of view and evaluating the quality and credibility of the content.
In a study of 400 American high school students, those who participated in a media literacy program showed substantially higher levels of media knowledge and news/advertising analysis skills than other students (Martens & Hobbs, 2015). Perhaps more importantly, information-seeking motives, media knowledge, and news analysis skills each independently contributed to adolescents’ proclivity toward civic engagement (Martens & Hobbs, 2015), and civic engagement naturally requires dialogue with others within and outside of an individual’s immediate circle. In other words, the more students were able to critically consider the content they were consuming and the motives behind why they were consuming it, the more they wanted to engage with alternative perspectives and be active, responsible, productive members of a larger community.
Understanding how the Facebook algorithm works is one form of media literacy education and it can certainly go a long way in helping users of that particular platform identify and avoid echo chambers therein. However, echo chambers can exist outside of the Facebook algorithm to the extent that any given individual fails to seek out opinion-challenging information. Therefore, in an attempt to lean into media literacy and, by extension, civic engagement, here are three simple but meaningful tips that the digitally wise might find useful:
- Habitually check multiple news sources; comparing coverage is one of the most reliable ways to get more complete information and a wider range of viewpoints.
- Intentionally reach out and interact with people of different perspectives, both on and offline; take care to discuss new ideas with facts, patience, and respect.
- Be aware of your own biases; wanting something to be true doesn’t make it factual. (GCF Global, 2020)
These “tricks of the trade” are not revolutionary, nor do they find their origins with the dawning of the internet; yet these are the very practices that have, perhaps, become more difficult to actively employ in digital spaces. Reminders about the simple things never hurt, because simple concepts are not necessarily simple to enact. The passive internet user is the one most likely to get caught in an echo chamber; the digitally wise will make consistent efforts to challenge their own thinking and intentionally seek out alternative voices, even if it takes a little more elbow grease to do so.
Clement, J. (2020). Number of monthly active Facebook users worldwide as of 2nd quarter 2020. Statista. https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/
GCF Global. (2020, October 8). What is an echo chamber? Digital Media Literacy. https://edu.gcfglobal.org/en/digital-media-literacy/what-is-an-echo-chamber/1/
GCFLearnFree. (2019, June 18). What is an echo chamber? YouTube. https://www.youtube.com/watch?v=Se20RoB331w&feature=emb_logo
Hobbs, R. (2010). Digital and media literacy: A plan of action. The Aspen Institute Communications and Society Program & the John S. and James L. Knight Foundation. https://knightfoundation.org/reports/digital-and-media-literacy-plan-action/
Martens, H. & Hobbs, R. (2015). How media literacy supports civic engagement in a digital age. Atlantic Journal of Communication, 23, 120–137.
Prensky, M. (2013). From digital natives to digital wisdom: Hopeful essays for 21st century learning. Corwin, 201-215.
Rubin, C. (2020). 10 ways to beat the Facebook algorithm in 2020. UseProof. https://blog.useproof.com/facebook-algorithm