AI Term: Underfitting

Underfitting occurs in machine learning when a model is too simple to capture the underlying structure of the data, resulting in poor performance on both the training data and new, unseen data.

Imagine you’re trying to learn a new musical instrument, but you only practice for a few minutes each week. Even though you’re putting in some effort, you never get very good, because you’re simply not practicing enough. This is a bit like underfitting.

In machine learning, a model is trained on a set of data called the training set. If the model is underfitting, it means it’s not even doing a good job at predicting the outcomes on the training set, let alone on new, unseen data. This is usually because the model is too simple or hasn’t been trained enough to understand the patterns in the data.
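As a concrete illustration, here is a minimal sketch of this (assuming Python with NumPy and scikit-learn, neither of which this entry itself names): a straight-line model is fit to clearly curved data, and because it is too simple for the pattern, it scores poorly even on its own training set.

```python
# Hypothetical illustration: fit a straight line to quadratic data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))            # inputs
y = X[:, 0] ** 2 + rng.normal(0, 0.5, size=200)  # curved (quadratic) target + noise

# A linear model cannot represent this curved pattern.
model = LinearRegression().fit(X, y)
print(f"R^2 on the training set: {model.score(X, y):.2f}")  # near 0: underfitting
```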

To avoid underfitting, we might need to make the model more complex, for example by adding more layers to a neural network or more features to a linear regression model. We might also need to train the model for longer, or provide more training data so the model has more examples to learn from.
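Continuing the sketch above (same assumed libraries, same synthetic data), adding features is one way to do this: expanding the input with polynomial features gives the linear model enough capacity to capture the curve, and the training score improves accordingly.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(0, 0.5, size=200)

# The plain linear model underfits this data...
simple = LinearRegression().fit(X, y)
# ...but adding squared features makes the model expressive enough.
richer = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

print(f"linear model, train R^2:    {simple.score(X, y):.2f}")  # near 0
print(f"quadratic model, train R^2: {richer.score(X, y):.2f}")  # near 1
```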

On the other hand, it’s also important to avoid making the model too complex or training it for too long, because that can lead to overfitting, where the model learns the training data too well and performs poorly on new data.
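This trade-off can be made visible by comparing training and held-out scores as model complexity grows. In the sketch below (same assumed libraries, hypothetical synthetic data), the lowest-degree model underfits, the degree that matches the true pattern does well on both sets, and a much higher degree tends to score better on the training set than on held-out data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = X[:, 0] ** 2 + rng.normal(0, 0.5, size=60)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Degree 1 underfits; degree 2 matches the true pattern; a much higher
# degree tends to open a gap between training and held-out scores.
for degree in (1, 2, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    print(f"degree {degree:2d}: train R^2 = {model.score(X_train, y_train):.2f}, "
          f"test R^2 = {model.score(X_test, y_test):.2f}")
```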
