Content Prioritization – Algorithm IP

Algorithms control our daily lives far more than we may realize, and Noble (2018) argues that the way content is presented by search engines such as Google is not neutral. These ad-driven ranking systems reflect and reinforce existing racism and sexism, prioritizing engagement and revenue over accuracy and justice. In this paper, I will discuss how these content prioritization algorithms work, the ways they systematically oppress marginalized groups, how they influence my professional and personal life, and how we can influence search results ourselves.

Content Prioritization

Content prioritization is essentially the way content is delivered to an individual based on many different factors. These factors include the keywords used, where the keywords appear on the web page, the page’s validity based on importance and trustworthiness, the location of the user, how recent the page is, and how likely the page is to be a scam (Google, 2019). While this is what Google itself tells us, the explanation seems oversimplified, outdated, or potentially even deceptive. For instance, in February 2025, a US DOJ hearing revealed that Google uses a click-driven system called Navboost, which prioritizes heavily clicked content. The case showed that Google uses Chrome-derived browsing data in a “popularity” ranking signal, beyond activity on Google’s own sites, though it did not establish cookie-based, per-user ranking personalization (U.S. Department of Justice, 2025). Let’s dive into different areas of content prioritization and their impacts on different groups of people.
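To make these signals concrete, here is a toy sketch of a scoring function that blends the factors described above (keyword match, trustworthiness, freshness) with a Navboost-style click-popularity signal. The signal names and weights are my own illustrative assumptions, not Google’s actual implementation; the point is simply that when clicks carry enough weight, a heavily clicked page can outrank a more trustworthy one.

```python
# Toy ranking sketch: hypothetical signals and weights, not Google's real system.
from dataclasses import dataclass

@dataclass
class Page:
    keyword_match: float   # 0-1: how well the query terms match the page
    trust: float           # 0-1: estimated trustworthiness of the source
    freshness: float       # 0-1: how recently the page was updated
    clicks: int            # historical clicks on this result
    impressions: int       # how often the result was shown

def score(page: Page, w_match=0.4, w_trust=0.2, w_fresh=0.1, w_clicks=0.3) -> float:
    """Weighted blend of relevance signals; higher scores rank higher."""
    click_rate = page.clicks / page.impressions if page.impressions else 0.0
    return (w_match * page.keyword_match
            + w_trust * page.trust
            + w_fresh * page.freshness
            + w_clicks * click_rate)

# A heavily clicked but less trustworthy page can outrank a more trustworthy one.
clickbait = Page(keyword_match=0.7, trust=0.3, freshness=0.9, clicks=900, impressions=1000)
reference = Page(keyword_match=0.8, trust=0.9, freshness=0.4, clicks=200, impressions=1000)
print(score(clickbait) > score(reference))  # True: engagement outweighs trust here
```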

Areas of Content Prioritization

The way content appears in front of us is always evolving, and if it is not regulated effectively, it may lead to severe consequences. Below, I look at a few ways this technology is used and how it impacts everyone, from the racism and sexism Noble describes to my own personal and professional life.

Search Engines

Google has the top search engine by far, with a global market share of 90.4% (Statcounter Global Stats, 2025) and an estimated 5 trillion searches per year. We are all connected to Google, which means it can have major influence over entire cultures and societies. We see what the system wants us to see, and those controlling it can shift the narrative around a particular event (e.g., the Israel-Gaza conflict), group of people (e.g., immigrants), or political candidate (e.g., Zohran Mamdani), raising concerns about censorship of certain regions or demographics.

Social Media

Social media algorithms are particularly harmful and appear to be a root cause of the division in society today. YouTube, a Google company, is a great example: it uses a recommender system designed to keep users on the platform. According to Ribeiro et al. (2020), YouTube “users consistently migrate from milder to more extreme content.” While that paper emphasizes pathways to far-right groups, I believe the same dynamic leads others down different paths to radicalization. The evidence shows in my own life: my father and I, leaning in opposite directions politically, receive videos about the same political event, but the sources are never the same. I am also shocked when I ask him about certain events my side is discussing, only to find that his side is staying quiet about the issue. Could algorithms be the actual downfall of free society?
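To illustrate the feedback loop behind that migration, here is a hypothetical sketch of an engagement-maximizing recommender. The “extremity” score and the assumption that users engage most with content slightly more extreme than what they already watch are my own simplifications, not YouTube’s actual model; the sketch only shows how always recommending the highest-predicted-engagement video can ratchet a feed toward extremes.

```python
# Hypothetical recommender sketch: the "extremity" field and the engagement
# rule are illustrative assumptions, not YouTube's real system.
catalog = [{"title": f"video_{i}", "extremity": i / 10} for i in range(11)]

def predicted_watch_time(video, user_extremity):
    # Assumption: users engage most with content slightly more extreme than
    # what they already watch.
    return 1.0 - abs(video["extremity"] - (user_extremity + 0.1))

user_extremity = 0.0  # the user starts with mild preferences
for step in range(5):
    pick = max(catalog, key=lambda v: predicted_watch_time(v, user_extremity))
    user_extremity = pick["extremity"]  # watching shifts the user's baseline
    print(step, pick["title"], round(user_extremity, 1))
# Output drifts from video_1 to video_5: each recommendation is a bit more
# extreme than the last, because that is what maximizes predicted engagement.
```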

ChatGPT and LLMs

There seems to be a shift toward LLMs when it comes to searching, as they can produce instant answers to any question they are asked. The data on which LLMs are trained is key to how they present information, and the lack of regulation or a federal oversight agency in the US is another cause for concern. As Timnit Gebru argues on 60 Minutes (CBS News, 2023), why is the technology industry unregulated when other major industries, such as airlines, food, and pharmaceuticals, are not? It feels as though the industry can manipulate the US legislative system without consequence through its extraordinary wealth, and AI is just the next step toward greater power over citizens.

Impacts on Equity

Google and other tech companies seem to have a vice grip on information and can control almost any narrative. While I would like to think this is not intentional and that they are actively trying to prevent further harm to already oppressed groups, I can only imagine they will do whatever it takes to stay on top, even if that means appeasing the country’s leaders by pushing the narrative those leaders want. It is a frightening point in time in this regard: Google’s power over information gives it the ability to quickly suppress the truth, or to oppress through the information that rises in the algorithm.

Professional Impact

Algorithms and PageRank directly affect my professional life in a number of ways. The first is my book business, which runs on Amazon and TeachersPayTeachers (TpT). The results of paying for ads make me wonder whether Amazon rewards me by boosting my organic visibility, leading to more organic sales than if the ads were off. After the research I did for this paper on Google’s Navboost, I believe click-throughs on paid ads could be factored into organic SEO rank. Another relevant professional impact is how my company approaches AI. The company is European-centric, and it clearly takes a proactive approach to preparing for upcoming EU regulation. This is profoundly different from the way an American company would approach it, and I would like the US to adopt a similar stance, although I know it will not happen.
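Since PageRank comes up here, a minimal sketch of the algorithm may help show what it actually computes: a page’s importance is roughly the chance that a random surfer ends up on it, which rises when important pages link to it. The tiny link graph below is made up for illustration and has nothing to do with my store’s actual links or Amazon’s ranking system.

```python
# Minimal PageRank sketch (power iteration); the graph is a made-up example.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

graph = {"home": ["product", "blog"], "blog": ["product"], "product": ["home"]}
print(pagerank(graph))  # "product" ranks highest: both other pages link to it
```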

How Can We Impact Search Results?

Working with Amazon and TpT has shown me how I can impact search results through changes to keywords, descriptions, paid ads, and more, but I will take a different route in this discussion. We can affect search results by mindfully checking the source on a page before clicking. If a headline is inflammatory, not clicking, sharing, or liking it will effectively lower its rank. If we develop the habit of being consciously aware of the links we click or do not click, and share these strategies with those who are less digitally literate, it could have far-reaching effects on what is shown to us.

References

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press. https://doi.org/10.2307/j.ctt1pwt9w5

Google. (2019, October 24). How Google Search works (in 5 minutes) [Video]. YouTube. https://www.youtube.com/watch?v=0eKVizvYSUQ

U.S. Department of Justice, Antitrust Division. (2025, February 18). Call with Google engineer H. J. Kim [Exhibit PDF]. https://www.justice.gov/atr/media/1398871/dl

Ribeiro, M. H., Ottoni, R., West, R., Almeida, V. A. F., & Meira Jr., W. (2020). Auditing radicalization pathways on YouTube. Proceedings of the 2020 ACM Conference on Fairness, Accountability, and Transparency (pp. 131–141). https://doi.org/10.1145/3351095.3372879

CBS News. (2023, March 5). ChatGPT and large language model bias [Video]. 60 Minutes. https://www.cbsnews.com/news/chatgpt-large-language-model-bias-60-minutes-2023-03-05/

Statcounter Global Stats. (2025, September). Search engine market share worldwide. https://gs.statcounter.com/search-engine-market-share

ChatGPT. (2025, October 19). Assistance with outlining, source verification, and citation formatting for “Content Prioritization – Algorithm IP.” OpenAI. https://chatgpt.com/share/68f4f0c6-2f20-8005-a616-e9126b38de7e

AI Disclaimer: Portions of this paper—including the outline development, source verification, and APA reference formatting—were completed with the assistance of ChatGPT (OpenAI, GPT-5 model). The tool was used to organize ideas, locate and summarize credible sources, and ensure proper citation formatting. All interpretations, edits, and final arguments are my own.
