Tech glossary

Defining IT & technology terms

Machine learning

Machine learning is the practice of training computer systems to learn from data without requiring explicit, programmed instructions. A subtype of Artificial Intelligence (AI), machine learning leverages algorithms and statistical models to identify patterns and predict future outcomes.

Most machine learning initiatives fit within two models: supervised and unsupervised. Supervised machine learning begins with a known, labeled dataset — often called “training data” — and uses that data to make predictions, which are compared against actual outcomes in order to further refine the algorithm. Unsupervised machine learning leverages unlabeled data, letting the system discover patterns and structure on its own without predefined outcomes.
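
Below is a minimal sketch contrasting the two models, written in Python and assuming the scikit-learn library (an assumption for illustration; the article does not reference any specific tools). A supervised classifier is trained on labeled data and scored against actual outcomes, while an unsupervised clustering algorithm is run on the same data with the labels withheld.

  # A minimal sketch of the two models, assuming scikit-learn
  # (a common Python ML library; not referenced in the original article).
  from sklearn.cluster import KMeans
  from sklearn.datasets import load_iris
  from sklearn.linear_model import LogisticRegression
  from sklearn.metrics import accuracy_score
  from sklearn.model_selection import train_test_split

  # A labeled dataset: features (X) plus known outcomes (y) serve as training data.
  X, y = load_iris(return_X_y=True)

  # Supervised: train on labeled data, then compare predictions against actual outcomes.
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
  classifier = LogisticRegression(max_iter=1000).fit(X_train, y_train)
  print("Supervised accuracy:", accuracy_score(y_test, classifier.predict(X_test)))

  # Unsupervised: withhold the labels and let the algorithm find structure on its own.
  clusters = KMeans(n_clusters=3, random_state=0).fit_predict(X)
  print("Unsupervised cluster assignments:", clusters[:10])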

Machine learning can be applied to a wide range of business problems, and it is particularly useful in areas where conventional algorithms have proven insufficient. Examples of machine learning innovation include self-driving cars, email filtering and speech-recognition software.

Related terms

  • Artificial Intelligence
  • Digital Innovation
  • Machine learning operations (MLOps)

Featured content for machine learning

  • Insight report: The Path to Digital Transformation: Where Leaders Stand in 2023
  • eBook: Outsmarting Ransomware: A Quick Response Guide for Military
  • eBook: 4 Technology Trends Impacting Effective Military Operations
  • Datasheet: Elevate the Military With AI
