Minimizing the sum of the squared deviations around the line is called least squares estimation.

Least squares estimation minimizes the sum of squared deviations between the observed data, on the one hand, and their expected values under the estimated regression function, on the other. It is called least squares estimation because it gives the least possible value for the sum of squared errors. Finding the best estimates of the coefficients is often called "fitting" the model to the data, or sometimes "learning" or "training" the model.
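Here is a minimal sketch in Python of how this works; the data points are made up for illustration, and it uses NumPy's np.linalg.lstsq, which solves for the coefficients that minimize the sum of squared residuals:

    import numpy as np

    # Made-up sample data for illustration.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

    # Design matrix [1, x] for the line y = b0 + b1*x.
    X = np.column_stack([np.ones_like(x), x])

    # lstsq returns the coefficients that minimize the sum of
    # squared deviations between observed y and fitted values.
    (b0, b1), *rest = np.linalg.lstsq(X, y, rcond=None)

    residuals = y - (b0 + b1 * x)
    print("intercept:", b0, "slope:", b1)
    print("sum of squared errors:", np.sum(residuals**2))

Any other choice of intercept and slope would give a larger sum of squared errors, which is exactly what "least squares" means.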
It factors to (x + 12)(x - 12).

This is the same as (x - 12)(x + 12), since we can multiply two expressions in either order; this is like saying 7*5 is the same as 5*7.

I used the difference of squares rule to factor x^2 - 144. It might help to write x^2 - 144 as x^2 - 12^2, then compare it to a^2 - b^2 = (a - b)(a + b).
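As a quick check, a computer algebra system gives the same factorization; here is a small sketch using SymPy (the expression x^2 - 144 comes from the question above):

    import sympy as sp

    x = sp.symbols('x')

    # Difference of squares: x^2 - 144 = x^2 - 12^2 = (x - 12)(x + 12).
    print(sp.factor(x**2 - 144))         # (x - 12)*(x + 12)

    # Multiplying in either order expands back to the original.
    print(sp.expand((x + 12)*(x - 12)))  # x**2 - 144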
Answer:
It's a function
Step-by-step explanation:
A function can have only one output for each input. Since all the x-values of the points are different, each input (x-value) is paired with exactly one output (y-value), so the relation is a function.
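Here is a tiny sketch in Python of that test; the points are made up for illustration, and the check is simply whether any x-value repeats:

    # Made-up points; a set of points is a function when no x-value repeats.
    points = [(1, 5), (2, 3), (3, 3), (4, 8)]

    xs = [x for x, _ in points]
    print(len(set(xs)) == len(xs))  # True: one output per input

Note that repeated y-values (like the 3s above) are fine; only a repeated x-value would break the function rule.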