The method of least squares is commonly used to find the best fit in linear regression. The reason the sum of absolute residuals (|y − y_pred|) is not used instead is as follows:
Least squares deliberately squares each error so that the fit responds more strongly when values deviate far from the prediction; that said, you could minimize the absolute residuals instead if you wished.
Consider the following scenario: you plot your data's actual values on a graph and see that they lie along a straight line, apart from a few outliers with much larger values than you would expect. You might be tempted to think the best fit is the straight line through the bulk of the values. But because the fitted line must account for every value that occurred, least squares adapts more to the outliers than an absolute-value criterion would: squaring the residuals magnifies the outliers' influence, effectively treating the straight line of typical values as if it were partly a coincidence.
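The scenario above can be sketched numerically. The snippet below is a minimal illustration with made-up data: four points on the line y = 2x plus one large outlier. It compares the least-squares fit (via NumPy's polyfit) with a least-absolute-deviations fit, found here by a simple grid search purely for illustration; the grid bounds and data values are assumptions, not from the original answer.

```python
import numpy as np

# Hypothetical data: points on the line y = 2x, with one large outlier at the end.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 2.0, 4.0, 6.0, 20.0])  # last value is the outlier

# Least-squares fit: minimizes the sum of squared residuals.
slope_ls, intercept_ls = np.polyfit(x, y, 1)

# Least-absolute-deviations fit: minimizes the sum of |residuals|.
# Found by brute-force grid search here, just to keep the example self-contained.
slopes = np.linspace(0.0, 6.0, 601)
intercepts = np.linspace(-5.0, 5.0, 201)
slope_l1, intercept_l1 = min(
    ((s, b) for s in slopes for b in intercepts),
    key=lambda p: np.sum(np.abs(y - (p[0] * x + p[1]))),
)

# Squaring magnifies the outlier's residual, so the least-squares line
# is pulled toward it, giving a noticeably steeper slope.
print(f"least-squares slope:       {slope_ls:.2f}")
print(f"absolute-residual slope:   {slope_l1:.2f}")
```

Here the absolute-residual fit stays close to slope 2 (the line through the typical points), while the least-squares slope is pulled well above it by the single outlier, which is exactly the behavior described above.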
To learn more about least squares, see:
brainly.com/question/2141008