Tuesday, July 24, 2018

Word of the Day: explainable AI (XAI)

explainable AI (XAI)

Explainable AI (XAI) is artificial intelligence that is programmed to describe its purpose, rationale and decision-making process in a way that can be understood by the average person. XAI is often discussed in relation to deep learning and plays an important role in the FAT ML framework (fairness, accountability and transparency in machine learning).


XAI provides general information about how an AI program makes a decision by disclosing:

  • The program's strengths and weaknesses.
  • The specific criteria the program uses to arrive at a decision.
  • Why a program makes a particular decision as opposed to alternatives.
  • The level of trust that's appropriate for various types of decisions.
  • What types of errors the program is prone to.
  • How errors can be corrected.

An important goal of XAI is to provide algorithmic accountability. Until recently, AI systems have essentially been black boxes: even if the inputs and outputs are known, the algorithm used to arrive at a decision is often proprietary or difficult to understand, even when the program's inner workings are open source and freely available.
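For readers who want a concrete picture of what an "explainable" decision can look like, here is a minimal, hypothetical sketch in Python. It assumes the scikit-learn library; the loan-approval framing, feature names and toy data are invented for illustration and are not drawn from any particular XAI product. It trains a small decision tree, a model whose rules can be read by a person, and prints both the learned rules and the relative weight of each criterion, two of the disclosures listed above.

# Illustrative sketch only: an inherently interpretable decision tree whose
# decision rules can be printed and inspected. Assumes scikit-learn is
# installed; the loan-approval features and data below are hypothetical.
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy applicants: [income_in_thousands, debt_ratio, years_employed]
X = [[40, 0.50, 1], [85, 0.20, 6], [30, 0.70, 2], [95, 0.10, 10],
     [60, 0.40, 4], [25, 0.80, 1], [70, 0.30, 8], [50, 0.60, 3]]
y = [0, 1, 0, 1, 1, 0, 1, 0]  # 1 = approve, 0 = deny

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The learned rules are human-readable; here, the "explanation" is the model itself.
features = ["income", "debt_ratio", "years_employed"]
print(export_text(model, feature_names=features))

# Which criteria carried the most weight in the decisions?
for name, weight in zip(features, model.feature_importances_):
    print(f"{name}: {weight:.2f}")

Inherently interpretable models such as shallow trees or linear models trade some accuracy for this kind of transparency; for opaque deep learning models, post-hoc explanation techniques (feature-attribution methods, for example) aim to provide comparable insight after the fact.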

As artificial intelligence becomes increasingly prevalent, it is more important than ever to disclose how bias and trust are being addressed. The EU's General Data Protection Regulation (GDPR), for example, includes a "right to explanation" clause.

Quote of the Day

"You don't make your company successful by buying a bucket of AI. It's not about waving an AI wand -- there's no intrinsic value by itself. It's about picking the right thing to do with AI." - Carl Hillier


Trending Terms

unsupervised learning
machine learning
deep learning
GDPR
synthetic intelligence
supervised learning
algorithmic accountability

Learning Center

Intelligent information management is ready for AI, blockchain
With AIIM changing its name to reflect the modern tech landscape, intelligent information management professionals discussed emerging technologies and their effect on the industry.

Companies want explainable AI, vendors respond
The call is on for explainable AI: A big theme at the Strata Data Conference was the need for more transparency into AI and machine learning technologies.

Humans and AI tools go hand in hand in analytics applications
Organizations are pairing up humans and AI tools in analytics applications in an effort to ensure that the output of machine learning algorithms and other forms of artificial intelligence is accurate and relevant.

Developing AI apps free from bias crucial to avoid analytics errors
Developing AI around biased data or unrealistic models can lead to less effective applications -- and possibly land enterprises in regulatory hot water.

Artificial brains are no-brainers for business, but societal challenges inevitable
The benefits of artificial intelligence to businesses are already being reaped by early adopters.

Writing for Business

Some employees fear that computers will take over their jobs with the _______ of machine learning, but that is not the case.
A. raise
B. rise
Answer


Stay In Touch
For feedback about any of our definitions or to suggest a new definition, please contact me at: mrouse@techtarget.com


Visit the Word of the Day Archives and catch up on what you've missed!



TechTarget, Whatis, 275 Grove Street, Newton, MA 02466. Contact: webmaster@techtarget.com

Copyright 2018 TechTarget. All rights reserved.
