Bias vs. Variance in Machine Learning

Hello, my fellow machine learning enthusiasts! Sometimes you might feel that you have fallen into a rabbit hole and there is nothing you can do to make your model better. Well, in that case, you should learn about “Bias vs. Variance” in machine learning.

So, the very first question that pops up in one’s head is: what are this bias and variance, and what do they have to do with my machine learning model?

Well, let’s just start…


What is Bias???

Bias is basically a measure of how far our model’s predictions are from the actual values.

A high-bias condition means that the model is not “fitting” the dataset very well, i.e. the training error will be large. On the other hand, low bias means that the model fits the dataset very well and the training error will be really low.
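To make this concrete, here is a minimal sketch (plain NumPy, with a synthetic sine-wave dataset of my own choosing, not from the article) of what high bias looks like: a straight line is too simple for a sine wave, so even the error on the training data stays large.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = np.sin(x) + rng.normal(0, 0.1, size=x.shape)  # nonlinear target + a little noise

# A degree-1 polynomial (a straight line) is too simple for a sine wave,
# so it underfits: this is the high-bias regime.
coeffs = np.polyfit(x, y, deg=1)
train_mse = np.mean((y - np.polyval(coeffs, x)) ** 2)
print(f"training MSE of the linear fit: {train_mse:.3f}")
```

Even though this error is measured on the very data the line was fitted to, it stays well above the noise level — the telltale sign of underfitting.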


What is Variance???

When the trained model doesn’t perform as well on the testing or validation set as it did on the training set, it might be suffering from high variance.

High variance refers to the condition where the model is not able to make predictions on the test or validation set that are as good as those it made on the training dataset. In low-variance conditions, the model performs as well on the testing dataset as it did on the training dataset.
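Here is a quick illustration (again plain NumPy with made-up data; the dataset sizes and polynomial degree are my own illustrative choices): a very flexible polynomial fitted on a handful of points drives training error close to zero but does much worse on fresh test points — the high-variance symptom described above.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_data(n):
    x = rng.uniform(-3, 3, n)
    return x, np.sin(x) + rng.normal(0, 0.3, n)

x_train, y_train = make_data(15)   # deliberately tiny training set
x_test, y_test = make_data(200)

# A degree-10 polynomial on 15 points is flexible enough to chase the noise.
coeffs = np.polyfit(x_train, y_train, deg=10)
train_mse = np.mean((y_train - np.polyval(coeffs, x_train)) ** 2)
test_mse = np.mean((y_test - np.polyval(coeffs, x_test)) ** 2)
print(f"train MSE: {train_mse:.4f}, test MSE: {test_mse:.4f}")
```

The large gap between the two numbers, rather than either number on its own, is what signals high variance.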


But how do the two play a role in determining the efficiency of the model???

Well, we can say:

  1. High Variance-High Bias –> The model is uncertain and inaccurate on average
  2. Low Variance-High Bias –> The model is consistent but inaccurate.
  3. High Variance-Low Bias –> The model is uncertain but accurate.
  4. Low Variance-Low Bias –>  The model is consistent and accurate (IDEAL).
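The four combinations above can actually be measured: refit the same kind of model on many freshly sampled training sets and look at its predictions at one fixed input. The average miss is the bias, the spread is the variance. A hedged sketch (plain NumPy; the sine target, degrees, and sample sizes are my own assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
x0 = 1.0                      # the single input where we study the prediction
true_value = np.sin(x0)

def fit_and_predict(degree):
    """Train on a fresh random dataset and predict at x0."""
    x = rng.uniform(-3, 3, 30)
    y = np.sin(x) + rng.normal(0, 0.3, 30)
    return np.polyval(np.polyfit(x, y, degree), x0)

results = {}
for degree in (1, 9):
    preds = np.array([fit_and_predict(degree) for _ in range(300)])
    bias_sq = (preds.mean() - true_value) ** 2   # systematic error of the average model
    variance = preds.var()                       # spread of predictions across datasets
    results[degree] = (bias_sq, variance)
    print(f"degree {degree}: bias^2={bias_sq:.4f}, variance={variance:.4f}")
```

The simple degree-1 model lands in the low-variance/high-bias cell (consistent but inaccurate), while the flexible degree-9 model lands in the high-variance/low-bias cell (uncertain but accurate on average).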


Well, that’s enough of the theory; now let us see how things play out in the real world…


Features of a model with high Bias are:

  • Underfitting: A model with high bias uses too simple an approach to fit the dataset.
  • Low Training Accuracy: A model with high bias does not fit the data very well, which leads to low training-set accuracy.
  • Inability to solve complex problems: A model with high bias is unable to learn the complexity of the dataset because it is a simple model.

Features of a model with high Variance are:

  • Overfitting: A model with high variance is highly complex and tends to overfit the dataset.
  • Low Testing Accuracy: A model with high variance overfits the training data; it achieves an artificially high training accuracy but fails to repeat that performance on the testing set.
  • Low performance on simpler problems: A model with high variance tends to overfit data, hence leading to over-complication of a simple problem.

Let’s introduce a concept that will help us understand what is required and when:


Bias – Variance Trade-off :

Bias and variance are complementary to each other. In other words, if we try to decrease a model’s bias, that might result in an increase in its variance. Similarly, decreasing the variance might increase the bias.

Hence, we can say that it is nearly impossible for a model to have both very low bias and very low variance. This complementary relationship between the two is called the Bias-Variance Trade-off.
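The trade-off becomes visible if we sweep model complexity and track both training and validation error. A minimal sketch (plain NumPy, synthetic sine data and polynomial degrees chosen by me for illustration): training error keeps falling as the model gets more complex, while validation error falls and then rises again.

```python
import numpy as np

rng = np.random.default_rng(3)
x_train = rng.uniform(-3, 3, 40)
y_train = np.sin(x_train) + rng.normal(0, 0.3, 40)
x_val = rng.uniform(-3, 3, 200)
y_val = np.sin(x_val) + rng.normal(0, 0.3, 200)

train_mse, val_mse = {}, {}
for degree in (1, 3, 5, 15):
    c = np.polyfit(x_train, y_train, degree)
    train_mse[degree] = np.mean((y_train - np.polyval(c, x_train)) ** 2)
    val_mse[degree] = np.mean((y_val - np.polyval(c, x_val)) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse[degree]:.3f}, "
          f"val MSE {val_mse[degree]:.3f}")
```

The sweet spot in the middle — where validation error is lowest — is the balance the trade-off asks us to find.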


[Figure: Bias vs. Variance]


A high Bias condition leads to “Underfitting” and a high Variance condition leads to “Overfitting”.


[Figure: Overfitting vs. Underfitting]


Yaa… There is a problem, but what’s the solution then???

This solution comes from Dr. Andrew Ng’s machine learning course on Coursera. In short: to fix high variance (overfitting), try getting more training examples, using a smaller set of features, or increasing regularization; to fix high bias (underfitting), try adding features, adding polynomial features, or decreasing regularization.
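One of those levers, regularization, is easy to demonstrate. As a hedged sketch (plain NumPy, closed-form ridge regression on polynomial features; the data, degree, and lambda values are my own illustrative choices, not from the course), here is how adding an L2 penalty reins in a high-variance fit:

```python
import numpy as np

rng = np.random.default_rng(4)
x_train = rng.uniform(-3, 3, 15)
y_train = np.sin(x_train) + rng.normal(0, 0.3, 15)
x_test = rng.uniform(-3, 3, 200)
y_test = np.sin(x_test) + rng.normal(0, 0.3, 200)

def poly_features(x, degree):
    # Scale x into [-1, 1] first so higher powers stay well conditioned
    return np.vander(x / 3.0, degree + 1)

def ridge_fit(X, y, lam):
    # Closed-form ridge regression: solve (X^T X + lam * I) w = X^T y
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

X_tr, X_te = poly_features(x_train, 10), poly_features(x_test, 10)
mse = {}
for lam in (0.0, 1.0):
    w = ridge_fit(X_tr, y_train, lam)
    mse[lam] = np.mean((y_test - X_te @ w) ** 2)
    print(f"lambda={lam}: test MSE {mse[lam]:.3f}")
```

With `lam=0.0` this is an unregularized degree-10 fit on 15 points — classic high variance — while `lam=1.0` shrinks the weights and does noticeably better on the test set.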


And there you have it.

I hope you loved learning something new.

Thanks for reading.

