BiasAlert: supporting bias identification in search queries
P. Mellatdoust; G. Di Fede; L. Alrabie; C. Cappiello; S. Andolina
2025-01-01
Abstract
Bias in search queries can lead to biased search results and influence the formation of people's opinions on the searched topics. Large language models (LLMs) have the potential to augment modern web browsers to mitigate some of their intrinsic limitations and help users find more reliable and useful information. We conducted a formative study (N=10) that explores the challenges and search strategies of people when searching the Web to verify the truthfulness of pieces of information. Based on the findings, we developed a browser plugin that uses LLMs to support users in query formulation and refinement by signaling the presence of possible biases in search queries, assessing the reliability of search results, and suggesting alternative queries to facilitate unbiased information exploration.
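As a rough illustration of the query-checking step described in the abstract, the sketch below shows how a browser plugin might send a search query to an LLM backend and receive a bias judgment together with alternative formulations. The endpoint URL, the prompt wording, and the response shape (biased, explanation, alternatives) are assumptions made for illustration, not the plugin's actual implementation.

// Hypothetical sketch: ask an LLM to flag possible bias in a search query
// and propose more neutral alternatives. Endpoint, prompt, and response
// schema are illustrative assumptions, not the authors' implementation.

interface BiasCheck {
  biased: boolean;          // whether the query appears to presuppose an answer
  explanation: string;      // short rationale to show to the user
  alternatives: string[];   // neutral reformulations to suggest
}

async function checkQueryBias(query: string): Promise<BiasCheck> {
  const prompt =
    `Decide whether this web search query is leading or biased, explain why, ` +
    `and suggest up to three neutral reformulations. ` +
    `Reply as JSON with keys "biased", "explanation", "alternatives".\n` +
    `Query: "${query}"`;

  // Placeholder endpoint; a real plugin would call its own backend or an LLM API.
  const res = await fetch("https://example.com/llm/complete", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  return (await res.json()) as BiasCheck;
}

For example, a leading query such as "why are vaccines dangerous" would likely be flagged, with the explanation surfaced next to the search box and alternatives such as "vaccine safety evidence" offered as one-click reformulations.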


