Regression analysis is a statistical procedure that seeks to predict the value of one variable, the dependent variable, based on the values of one or more independent, or predictor, variables. The simplest variety is a linear, or straight-line, regression between two variables. If you assign the independent-variable values to a graph's x-axis and the dependent-variable values to the y-axis, the resulting linear regression line has a slope called beta (b).
The Regression Equation

In a simple linear regression, the value of the dependent variable (y) equals a constant, known as the y-intercept (a), plus the product of beta and the independent variable's value (x), plus an error term (e). In math notation, y = a + bx + e. The y-intercept is the value of y when x equals zero. Beta, the slope, tells you how much y changes for each unit change in x. The error term accounts for unpredictable variation in the relationship between x and y, but is usually not shown on a regression graph.
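The equation above can be sketched in a few lines of Python. The intercept and slope values here are illustrative examples, not figures from the article, and the error term is omitted, just as it is on a typical regression graph:

```python
# A minimal sketch of the regression equation y = a + b*x.
# The values a=2.0 and b=0.5 are illustrative assumptions.
def predict(x, a=2.0, b=0.5):
    """Return the predicted y for a given x, ignoring the error term e."""
    return a + b * x

print(predict(0))   # at x = 0, y equals the y-intercept: 2.0
print(predict(10))  # 2.0 + 0.5 * 10 = 7.0
```

Note that calling the function at x = 0 returns the y-intercept directly, matching the definition in the text.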
Least Squares Method

Frequently, connecting the (x, y) data pairs on a graph does not produce a straight line. To form a linear regression line, you apply the least squares method, which finds the straight line that best fits the data by minimizing the sum of the squared vertical distances between the data points and the line. Beta is the slope of this line. A horizontal line, meaning a beta of zero, implies that y does not depend on x, because y remains the same whatever value x assumes.
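The least squares fit described above has a standard closed-form solution for the two-variable case: beta is the covariance of x and y divided by the variance of x, and the intercept follows from the fact that the fitted line passes through the point of means. A minimal sketch, using illustrative data chosen to lie exactly on a known line:

```python
def least_squares(xs, ys):
    """Fit y = a + b*x by ordinary least squares.

    Returns (a, b): the y-intercept and the slope (beta) of the line
    minimizing the sum of squared vertical distances to the points.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # beta = sum of (x - mean_x)(y - mean_y) over sum of (x - mean_x)^2
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    # The fitted line always passes through (mean_x, mean_y).
    a = mean_y - b * mean_x
    return a, b

# Example data lying exactly on y = 1 + 2x, so the fit recovers a=1, b=2.
xs = [0, 1, 2, 3]
ys = [1, 3, 5, 7]
a, b = least_squares(xs, ys)
print(a, b)  # 1.0 2.0
```

With noisy data the same function returns the best-fitting line rather than an exact one; a beta near zero would signal that y does not depend on x.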