Search Term "minorities" Data Analysis #1
- tylerprasad0
- Oct 3, 2021
- 3 min read
Updated: Sep 7, 2022
In my data analysis project, I used the term “minorities” as input for Google and YouTube searches. After choosing my search term, I broke “minorities” into four additional subqueries: “why do minorities,” “are minorities,” “how do minorities,” and “what minorities.” I entered the base term and each subquery into Google and recorded the suggested searches, search results, images, recommended videos, dates, and advertisements. Looking across these different types of searches and results, I gathered a substantial amount of data that helped me determine whether the algorithm was efficient and accurate.

After all, most algorithms have been shown to be biased in some way, largely because data about the user is now valuable. For example, in the PC Magazine article “How Companies Turn Your Data Into Money,” Max Eddy states, “The collected data has value because of how it's used in online advertising, specifically targeted advertising: when a company sends an ad your way based on information about you, such as your location, age, and race” (Eddy). Because data about a person has value, companies are willing to spend money tailoring their algorithms and websites toward that user's interests, which ultimately creates a clear bias.

In addition, when I began searching, I noticed that most of the information regarding minorities pointed back to the same things: definitions of what minorities are, along with misconceptions and judgments about them. This led me to recognize the points made in “YouTube, the Great Radicalizer.” For example, Zeynep Tufekci states, “This situation is especially dangerous given how many people -- especially young people -- turn to YouTube for information” (Tufekci).
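The collection step described above can be scripted rather than done by hand. Below is a minimal sketch in Python that builds the same five search terms and the request URLs for Google's autocomplete suggestions. It assumes the unofficial `suggestqueries.google.com` endpoint, which is not a documented API and may change, be rate-limited, or return different results per region; the actual fetch is kept in a separate helper since it needs network access.

```python
import urllib.parse

# Seed term and subqueries from the project; the seed itself is searched too.
SEED = "minorities"
SUBQUERIES = [
    SEED,
    f"why do {SEED}",
    f"are {SEED}",
    f"how do {SEED}",
    f"what {SEED}",
]

# Unofficial autocomplete endpoint (an assumption -- not a supported API).
SUGGEST_ENDPOINT = "https://suggestqueries.google.com/complete/search"


def suggest_url(query: str) -> str:
    """Build the autocomplete request URL for one query."""
    params = urllib.parse.urlencode({"client": "firefox", "q": query})
    return f"{SUGGEST_ENDPOINT}?{params}"


def fetch_suggestions(query: str) -> list:
    """Fetch suggested searches for a query (requires network access)."""
    import json
    import urllib.request

    with urllib.request.urlopen(suggest_url(query), timeout=10) as resp:
        # Observed response shape: [original_query, [suggestion, ...], ...]
        return json.loads(resp.read().decode("utf-8"))[1]


if __name__ == "__main__":
    for q in SUBQUERIES:
        print(q, "->", suggest_url(q))
```

Logging the suggestions for each subquery over time would make it easier to compare what the algorithm surfaces for different phrasings of the same topic.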
Tufekci makes it clear that companies such as Google and YouTube know that many users turn to their platforms for information, and with this knowledge they exploit those users by feeding them specifically catered content that is more likely to bring them back for more. I also noticed that most of the results about minorities concerned how minorities are treated and what it means to be a minority, suggesting that most of what appears comes from the perspective of non-minorities, or in other words the majority, and making it clear that the bias in algorithms is rooted in their creators. This echoes Nicholas Young's point in “I Know Some Algorithms Are Biased—because I Created One,” where Young states, “To address issues with the algorithm, we can push for algorithms’ transparency, where anyone could see how an algorithm works and contribute improvements” (Young). If how an algorithm works were visible to users, some of its bias could be countered: users would understand why certain results appear, be informed of what is being advertised and catered to them by the creators, and as a result be able to filter out information designed to hook them on the platform.
Works Cited
Young, Nicholas T. “I Know Some Algorithms Are Biased-Because I Created One.” Scientific American Blog Network, Scientific American, 31 Jan. 2020, https://blogs.scientificamerican.com/voices/i-know-some-algorithms-are-biased-because-i-created-one/.
Eddy, Max. “How Companies Turn Your Data into Money.” PCMAG, PCMag, 10 Oct. 2018, https://www.pcmag.com/news/how-companies-turn-your-data-into-money.
Tufekci, Zeynep. “YouTube, the Great Radicalizer.” New York Times, The New York Times Company, 11 Mar. 2018, https://coinse.io/assets/files/teaching/2019/cs489/Tufekci.pdf.