Tuesday, January 30, 2018

Word of the Day: filter bubble

Word of the Day WhatIs.com
Daily updates on the latest technology terms | January 29, 2018
filter bubble

A filter bubble is an algorithmic bias that skews or limits the information an individual user sees on the internet. The bias is caused by the weighted algorithms that search engines, social media sites and marketers use to personalize user experience (UX).

Filter bubbles, which affect an individual's online advertisements, social media newsfeeds and web searches, essentially insulate the person from outside influences and reinforce what the individual already thinks. The word bubble, in this context, is a synonym for isolation; it alludes to the isolator, a plastic bubble famously used in the 1970s to sequester a young patient with immunodeficiencies.

While the goal of personalization is simply to present the end user with the most relevant information possible, it can also distort the user's view of reality because it prioritizes information the individual has already expressed an interest in. The data used to personalize the user experience and create an insulating bubble comes from many sources, including the user's search history, browsing choices and previous interactions with web pages.
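
To make the mechanism concrete, here is a minimal sketch in Python of how a weighted personalization algorithm can form a bubble. It is an illustration only, not any real engine's code; the item topics, click history and boost weight are all hypothetical. Each prior click on a topic raises the score of new items on that topic, so the ranked feed steadily converges on what the user already likes.

    from collections import Counter

    def personalized_ranking(items, click_history, boost=2.0):
        """Rank items, boosting topics the user has clicked before.

        items         -- list of (title, topic) pairs
        click_history -- list of topics the user previously clicked
        boost         -- extra weight per prior click on the same topic
        """
        topic_clicks = Counter(click_history)
        # Each prior click on a topic raises the score of new items
        # on that topic, so the feed drifts toward past interests.
        scored = [
            (1.0 + boost * topic_clicks[topic], title, topic)
            for title, topic in items
        ]
        scored.sort(key=lambda s: s[0], reverse=True)
        return [(title, topic) for _, title, topic in scored]

    # Hypothetical example: two prior clicks on "politics-left"
    # push similar items to the top and bury the opposing view.
    items = [
        ("Tax plan analysis", "politics-left"),
        ("Border policy op-ed", "politics-right"),
        ("Local weather report", "weather"),
        ("Election commentary", "politics-left"),
    ]
    history = ["politics-left", "politics-left", "weather"]
    for title, topic in personalized_ranking(items, history):
        print(title, "[" + topic + "]")

Running the sketch ranks both "politics-left" stories above the lone "politics-right" item; with a few more clicks the opposing viewpoint effectively vanishes from the top of the feed, which is the filter bubble in miniature.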

While default search settings are convenient, they also tend to skew an individual's perception of what products and information the rest of the world sees. Users should periodically review the privacy and personalization settings of the browsers and social media sites they use. Doing so helps prevent query results from becoming unnecessarily discriminatory and newsfeeds from being weaponized.

Quote of the Day

"Several personalization engines launched in 2017 -- easy-to-use platforms for marketing professionals to segment their different audiences from disparate data sources, to target content delivery and to measure results." - Geoffrey Bock

Trending Terms

personalized search

dark post

fake news

algorithmic accountability

data discrimination

weaponized information

Learning Center

Cognitive hacking: Understanding the threat of bad data
Cognitive hacking and bad data were used during the 2016 presidential election, and they can be used against enterprises. Here's what you need to know.

Beware the 'thought police': The dangers of human and AI integration
Humans could experience endless cognitive enhancements as AI integration evolves, but we should also be wary of the potential negative consequences.

Tim O'Reilly: The flawed genie behind algorithmic systems
Platform companies rely on algorithmic systems that have objective functions. Learn why that's good -- and why it's bad.

SAP adds functionality targeting gender bias in job ads
Words and context matter in job ads, especially with gender. Gender bias can be subtle, and context is everything; employers who catch it can improve their recruiting.

Mathematician warns against weapons of 'math' destruction
Cathy O'Neil's recently published 'Weapons of Math Destruction' shines a light on the drawbacks of data science.

Training data for algorithms must be right, not just plentiful
During his presentation at AI World in Boston, the CEO at PARC warned that more training data isn't a replacement for the right training data.

Writing for Business

Facebook announced that ________ prioritizing what friends and family share in newsfeeds.

a. it is

b. they are

Answer

Stay In Touch
For feedback about any of our definitions or to suggest a new definition, please contact me at: mrouse@techtarget.com

Visit the Word of the Day Archives and catch up on what you've missed!

FOLLOW US

Twitter | RSS
About This E-Newsletter
This e-newsletter is published by the TechTarget network. To unsubscribe from WhatIs.com, click here. Please note, this will not affect any other subscriptions you have signed up for.
TechTarget

TechTarget, Whatis, 275 Grove Street, Newton, MA 02466. Contact: webmaster@techtarget.com

Copyright 2016 TechTarget. All rights reserved.
