Least squares is a method for conducting linear regression analysis in statistics. The result of a least squares analysis is a line equation whose slope is the regression coefficient and whose y-intercept is the intercept coefficient. The method minimizes the sum of the squares of the vertical distances from the data points to the regression line, which is why it is called "least squares." Because the intercept is computed from the slope, you must calculate the regression coefficient first.

### Things You'll Need

- Computer/calculator (for large data sets)

Calculate the average, or mean, of the x values and of the y values. Both data sets must contain the same number of values (n).
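This first step can be sketched in a few lines of Python; the data values here are hypothetical placeholders:

```python
# Hypothetical sample data: paired x and y observations.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)              # both lists must have the same length n
mean_x = sum(x) / n
mean_y = sum(y) / n
```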

Calculate the regression coefficient (k). The regression coefficient is the sample covariance of x and y divided by the sample variance of x. When you substitute the definitions of these two quantities into the formula, the shared factor of n - 1 cancels, and what remains is the ratio of two sums: the sum of the products of the x and y deviations from their means, divided by the sum of the squared x deviations. For large data sets, it is more efficient to use a computer program or spreadsheet to calculate these sums.
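A minimal sketch of this ratio of sums, again using hypothetical data:

```python
# Hypothetical sample data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Regression coefficient k: sum of products of deviations from the means,
# divided by the sum of squared x deviations (the n - 1 factors cancel).
num = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
den = sum((xi - mean_x) ** 2 for xi in x)
k = num / den
```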

Calculate the intercept coefficient. Solve the general linear regression equation for the intercept and collect the constant terms: the intercept equals the mean of the y values minus the regression coefficient times the mean of the x values. The Greek letter "alpha" sometimes represents the intercept coefficient, but most people are accustomed to seeing the line equation y = mx + b, where m is the slope and b is the y-intercept.
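This step is a single subtraction once the means and the slope are known. The numbers below are the hypothetical values that the sample data in this article's earlier sketches would produce:

```python
# Hypothetical means and slope (would come from the earlier steps).
mean_x, mean_y = 3.0, 6.02
k = 1.99

# The least-squares line passes through (mean_x, mean_y),
# so solving y = k*x + b at that point gives the intercept.
b = mean_y - k * mean_x
```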

Use the regression coefficient as the slope (m) and the intercept coefficient as the y-intercept to write the equation for the regression line. Drawing this line on a graph of your data lets you see the trend and judge how well the line fits.
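The whole procedure can be collected into one small function; the data and names here are illustrative, not part of the original article:

```python
def least_squares_line(x, y):
    """Return (slope, intercept) of the least-squares regression line."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Slope: ratio of the two sums from the covariance/variance formula.
    k = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
    # Intercept: the line passes through the point of means.
    b = mean_y - k * mean_x
    return k, b

# Hypothetical data set.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
k, b = least_squares_line(x, y)
print(f"y = {k:.2f}x + {b:.2f}")   # the regression line equation
```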
