AI for Teachers, An Open Textbook: Edition 1

Behind the Search Lens: Effects of search on the individual

While search engines provide a very useful service, they also have some negative impacts, both on the individual user and on society as a whole. Being aware of these impacts can help us shield ourselves and those who depend on us.

Data and Privacy

Most websites, search engines and mail clients collect information about their users. Much of this data is tied to the identity of the user through IP addresses. The data is then used to serve targeted advertisements and personalised content, to improve the services provided, and to do market research. However, search engines do not always disclose all the information they collect, everything they do with that information once it has been collected, or even where they collect it.[1] For example, studies show that Google can track users across nearly 80% of websites.[2]
Search engines both display information about a person when others search for them, and collect and process information about a person whenever they use the search engine themselves. Moreover, data collected on one user who gave their consent can be used to draw inferences about another user who did not consent, but whom the search engine judges to have a similar profile.

All this data, both raw and processed, gives rise to privacy and security concerns. Search providers, governments and users can all take measures to prevent privacy breaches. In Europe, search engine companies are viewed as “controllers of personal data”, as opposed to mere providers of a service, and can thus be held responsible and liable for the content that is accessible through their services. However, privacy laws often concern confidential and intimate data, whereas even harmless information about people can be mined to create user profiles based on implicit patterns in the data. Those profiles (whether accurate or not) can then be used to make decisions affecting them.[1]

Moreover, how a law is enforced varies from country to country. Under the GDPR, a person can ask a search engine company to remove a search result that concerns them; yet even if the company removes it from its index in Europe, the page can still show up in results outside Europe.[1]
Finally, while company policies can shed some light on their practices, research shows that there is often a gap between stated policy and actual practice.[2]
Read more on data here and here.

Reliability of Content

Critics have pointed out that search engine companies are not fully open about why they show some sites and not others, or why they rank some pages higher than others.[1]

The ranking of search results is heavily influenced by advertisers who sponsor content. Moreover, big search engine companies provide many services other than search, and content from these services is often boosted in the search results. In Europe, Google has been formally charged with prominently displaying its own products and services in its search results, regardless of their merits.[1]

Large companies and web developers who study ranking algorithms can also influence rankings by playing on how a search engine defines the popularity and authenticity of websites. Of course, the criteria that the search engine's programmers judged important are themselves open to question.

All of this affects how reliable the search results are. It is always a good idea to use multiple sources and multiple search engines, and to discuss the content used in schoolwork.

Autonomy

A search engine, with its ranking system, recommends content. By not revealing the criteria used to select this content, it reduces user autonomy. For example, had we known that a suggested web page was sponsored, or was selected on popularity criteria we do not identify with, we might not have chosen to use that content. By taking away informed consent, search engines and other recommender systems exert a controlling influence over our behaviour.

Autonomy is having control over processes, decisions and outcomes.[7] It implies liberty (independence from controlling influences) and agency (the capacity for intentional action).[7] Systems that recommend content without explanation can encroach on users’ autonomy: they nudge users in a particular direction by engaging them only with what they are predicted to like and by limiting the range of options to which they are exposed.[5]

------------------------------------------------------------------------------------------------------
[1] Tavani, H., Zimmer, M., “Search Engines and Ethics”, The Stanford Encyclopedia of Philosophy (Fall 2020 Edition), Edward N. Zalta (ed.)
[2] Englehardt, S., Narayanan, A., “Online Tracking: A 1-million-site Measurement and Analysis”, extended version of a paper at ACM CCS 2016.
[3] Google Privacy and Terms
[4] Microsoft Privacy Statement
[5] Milano, S., Taddeo, M., Floridi, L., “Recommender systems and their ethical challenges”, AI & Society 35, 957–967, 2020.
[6] Tavani, H.T., Ethics and Technology: Controversies, Questions, and Strategies for Ethical Computing, 5th edition, Hoboken, NJ: John Wiley and Sons, 2016.
[7] Hillis, K., Petit, M., Jarrett, K., Google and the Culture of Search, Routledge Taylor and Francis, 2013.
 
