hmu.ai
AI Dictionary

Inference

Definition

The process of using a trained machine learning model to make predictions on previously unseen data.

Deep Dive

In the context of machine learning, inference refers to the process of using a *trained* model to make predictions or decisions against *previously unseen* data. After a model has undergone a training phase where it learns patterns and relationships from a dataset, it is then deployed to apply this learned knowledge to new inputs. This step is distinct from training, as it focuses on generating outputs rather than adjusting the model's internal parameters.
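The training/inference distinction above can be sketched in a few lines of Python. This is a minimal illustration, not a real system: the weights and the spam-filter framing are hypothetical stand-ins for parameters a model would have learned during training. The key point is that inference only applies the frozen parameters to new input; nothing is updated.

```python
import math

# Parameters assumed to have been learned during a (hypothetical)
# training phase. During inference they stay fixed.
WEIGHTS = [1.5, -2.0]
BIAS = 0.25

def predict(features):
    """Inference: score a new, unseen input with the frozen weights."""
    z = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    prob = 1.0 / (1.0 + math.exp(-z))  # logistic activation
    return "spam" if prob >= 0.5 else "not spam"

# Apply the trained model to a previously unseen input.
print(predict([2.0, 0.5]))  # -> spam
```

Note that `predict` never touches `WEIGHTS` or `BIAS`; adjusting those parameters from data is the job of training, which happens before deployment.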

Examples & Use Cases

  • A spam filter classifying a newly arrived email as "spam" or "not spam"
  • A recommendation engine suggesting products to an online shopper based on their browsing history
  • A self-driving car identifying pedestrians and other vehicles in its real-time video feed

Related Terms

Model Deployment · Prediction · Training

Part of the hmu.ai extensive business and technology library.