AI for Teachers, An Open Textbook: Edition 1

Behind the Search Lens: Effects of Search on Society

Social effects

More and more, there is a feeling that everything that matters is on the web and should be accessible through search.7 As L. M. Hinman puts it, “Esse est indicato in Google (to be is to be indexed on Google).” As he also notes, “citizens in a democracy cannot make informed decisions without access to accurate information.”1,8 If democracy stands on free access to undistorted information, search engines directly affect how democratic our countries are. Their role as gatekeepers of knowledge is in direct conflict with their nature as private companies dependent on ads for income. Therefore, for the sake of a free society, we must demand accountability from search engines and transparency in how their algorithms work.1

Creation of filter bubbles

Systems that recommend content based on user profiles, including search engines, can insulate users from exposure to differing views. By feeding users content they like, they create self-reinforcing biases and “filter bubbles”.1,5 These bubbles, created when newly acquired knowledge is based on past interests and activities,2 cement biases as solid foundations of knowledge. This can be particularly dangerous for young and impressionable minds. Thus, open discussions with peers and teachers and collaborative learning activities should be promoted in the classroom.
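
To see how quickly such a bubble can form, consider a toy simulation in Python (all names and numbers here are invented for illustration, not taken from any real system). The recommender always serves the topic with the highest estimated interest, and every click nudges that estimate further up:

    import random

    # A hypothetical user profile: estimated interest per topic (invented numbers).
    topics = {"sports": 0.5, "science": 0.5, "politics": 0.5, "arts": 0.5}

    def recommend(profile):
        # Always serve the topic with the highest estimated interest -- no exploration.
        return max(profile, key=profile.get)

    random.seed(1)
    favourite = "science"  # the user's actual preference, unknown to the system

    for step in range(50):
        shown = recommend(topics)
        # The user clicks their favourite more often, but clicks anything sometimes.
        clicked = random.random() < (0.7 if shown == favourite else 0.5)
        topics[shown] += 0.1 if clicked else -0.05

    print(topics)  # one topic's score runs away; the user's real favourite may never surface

Because only the top-scoring topic is ever shown, the profile largely confirms itself; a real system would need deliberate exploration or diversity constraints to counteract this.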

Feedback loops

Search engines, like other recommendation systems, predict what will be of interest to the user. When the user clicks on what was recommended, the system takes this as positive feedback, and this feedback affects which links are displayed in the future. But if a user clicked on the first link displayed, is it because they found it relevant, or simply because it was the first result and thus the easiest to choose?

Implicit feedback is tricky to interpret, and when predictions are based on incorrect interpretations, the effects are even trickier to predict. When results of a certain nature are repeatedly shown - and are the only thing that the user gets to see - they can even end up changing what the user likes and dislikes: the case of the self-fulfilling prediction.
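
A toy simulation (again in Python, with invented relevance numbers) makes the position-bias problem concrete: a naive ranker that orders results by observed click-through rate can keep a less relevant result on top, simply because the top slot gets examined far more often:

    import random

    random.seed(0)

    # Two hypothetical documents: B is genuinely more relevant than A.
    true_relevance = {"A": 0.3, "B": 0.6}
    clicks = {"A": 1, "B": 1}   # smoothed click counts used as implicit feedback
    shows = {"A": 1, "B": 1}

    def ranked():
        # Naive ranker: order by observed click-through rate.
        return sorted(clicks, key=lambda d: clicks[d] / shows[d], reverse=True)

    for _ in range(1000):
        for position, doc in enumerate(ranked()):
            shows[doc] += 1
            # Users examine the top result far more often than the second one.
            examined = random.random() < (1.0 if position == 0 else 0.4)
            if examined and random.random() < true_relevance[doc]:
                clicks[doc] += 1

    print(ranked())  # A tends to stay on top: 1.0 * 0.3 beats 0.4 * 0.6

Document A's effective click rate on top (0.3) exceeds B's in second place (0.4 × 0.6 = 0.24), so the ranking never corrects itself, even though B is twice as relevant.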

In one city in the United States, a predictive policing system was launched that points out which areas of the city are at high risk for crime, so that more police officers can be deployed to those areas. Since these officers knew an area was flagged as high risk, they were especially vigilant and stopped, searched, or arrested more people than they normally would. The arrests thus validated the prediction, even where the prediction was biased in the first place. Worse, the arrests became data for future predictions about the same areas and areas similar to them, compounding the bias over time.10
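
This runaway loop is easy to reproduce in a toy model (Python, with invented numbers, not data from any real deployment): two districts with identical actual crime, patrols sent where the records point, and incidents recorded only where officers are present to observe them:

    import random

    random.seed(3)

    # Two districts with the SAME true crime rate; district 0 merely starts
    # with more *recorded* incidents (historical bias in the data).
    TRUE_RATE = 0.1
    records = [60, 50]

    for day in range(200):
        # "Prediction": patrol the district with more recorded incidents.
        patrolled = 0 if records[0] >= records[1] else 1
        for district in (0, 1):
            # Incidents are only recorded where officers are there to observe them.
            patrol_intensity = 5 if district == patrolled else 1
            for _ in range(patrol_intensity):
                if random.random() < TRUE_RATE:
                    records[district] += 1

    print(records)  # roughly [160, 70]: the gap keeps widening

The districts never differed in actual crime; the growing gap in the records is produced entirely by where the system chose to look.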

We use prediction systems so that we can act on the predictions. But acting on biased predictions affects future outcomes, the people involved, and thus society itself. “As a side effect of fulfilling its purpose of retrieving relevant information, a search engine will necessarily change the very thing that it aims to measure, sort, and rank. Similarly, most machine learning systems will affect the phenomena that they predict.”10

Fake news, extreme content and censorship

Fake news (false stories that appear as news) is increasingly prevalent in online forums, social media sites, and blogs, all available to students through search. Small, focused groups of people can drive up ratings for specific videos and websites with extreme content. This increases the content’s popularity and appearance of authenticity, gaming the ranking algorithms.5 Yet, to date, search engine companies have adopted no clear and explicit policy to control fake news.1
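
A small sketch (Python, with an invented engagement log) shows why ranking signals based on raw engagement counts are so easy to game, and why counting distinct users is one simple mitigation:

    from collections import Counter

    # Hypothetical engagement log: (user_id, video_id) pairs.
    organic = [(f"user{i}", "science_doc") for i in range(100)]            # 100 viewers, one vote each
    coordinated = [(f"troll{i % 5}", "extreme_clip") for i in range(300)]  # 5 accounts, 300 votes
    events = organic + coordinated

    # Naive ranking signal: raw engagement counts. Easily gamed.
    raw = Counter(video for _, video in events)
    print(raw.most_common())     # [('extreme_clip', 300), ('science_doc', 100)]

    # Counting distinct users per video blunts this particular attack.
    by_user = Counter(video for _, video in set(events))
    print(by_user.most_common()) # [('science_doc', 100), ('extreme_clip', 5)]

Five coordinated accounts outrank a hundred genuine viewers under the naive count; real manipulation is harder to detect than this, but the underlying weakness is the same.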

On the other hand, search engines systematically exclude certain sites and certain types of sites in favor of others.9,1 They censor content from some authors, despite never having been selected by the public for such a task. They should therefore be used with awareness and discrimination.

------------------------------------------------------------------------------------------------------
1 Tavani, H. and Zimmer, M., Search Engines and Ethics, The Stanford Encyclopedia of Philosophy (Fall 2020 Edition), Edward N. Zalta (ed.)
2 Englehardt, S. and Narayanan, A., Online Tracking: A 1-million-site Measurement and Analysis, extended version of a paper at ACM CCS 2016
3 Google Privacy and Terms
4 Microsoft Privacy Statement
5 Milano, S., Taddeo, M. and Floridi, L., Recommender Systems and Their Ethical Challenges, AI & Society, 35: 957–967, 2020
6 Tavani, H. T., Ethics and Technology: Controversies, Questions, and Strategies for Ethical Computing, 5th edition, Hoboken, NJ: John Wiley and Sons, 2016
7 Hillis, K., Petit, M. and Jarrett, K., Google and the Culture of Search, Routledge Taylor and Francis, 2013
8 Hinman, L. M., Esse Est Indicato in Google: Ethical and Political Issues in Search Engines, International Review of Information Ethics, 3: 19–25, 2005
9 Introna, L. and Nissenbaum, H., Shaping the Web: Why the Politics of Search Engines Matters, The Information Society, 16(3): 169–185, 2000
10 Barocas, S., Hardt, M. and Narayanan, A., Fairness and Machine Learning: Limitations and Opportunities, yet to be published
