In statistics, linear regression models the relationship between a dependent variable and one or more explanatory variables using a linear function. When the model includes two or more explanatory variables, it is called a multiple linear regression. Multiple regression, on the other hand, is a broader class of regressions that encompasses both linear and nonlinear regressions with multiple explanatory variables.
Regression analysis is a common way to discover a relationship between dependent and explanatory variables. However, this statistical relationship does not mean that the explanatory variables cause the dependent variable; it only indicates a significant association in the data. Linear regression attempts to draw the line that comes closest to the data by finding the slope and intercept that minimize the regression errors. However, many relationships in data do not follow a straight line, so statisticians use nonlinear regression instead.
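As a minimal sketch of how a simple linear regression finds that line, the following Python snippet fits a slope and intercept to a small invented dataset using NumPy's least-squares polynomial fit (all data values here are assumptions chosen for illustration):

```python
import numpy as np

# Hypothetical data: a noisy, roughly linear relationship (invented values)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Ordinary least squares: find the slope and intercept that minimize
# the sum of squared vertical distances from the fitted line to the data
slope, intercept = np.polyfit(x, y, 1)

print(f"fitted line: y = {slope:.2f} * x + {intercept:.2f}")
```

For this data the fitted slope comes out close to 2, matching the roughly 2-unit rise in y per unit of x.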
It is rare that a dependent variable is explained by only one variable. In such cases, an analyst uses multiple regression, which attempts to explain a dependent variable using more than one independent variable. Multiple regressions can be linear or nonlinear.
Consider an analyst who wishes to establish a linear relationship between the daily change in a company's stock price and other explanatory variables, such as the daily change in trading volume and the daily change in market returns. If the analyst runs a regression with the daily change in the company's stock price as the dependent variable and the daily change in trading volume as the independent variable, this would be an example of a simple linear regression with one explanatory variable. If the analyst then adds the daily change in market returns to the regression, it becomes a multiple linear regression.
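The analyst's two-variable setup can be sketched as follows. All the numbers are invented, and the price changes are constructed to follow an exactly linear relationship so the recovered coefficients are easy to verify; real data would include noise:

```python
import numpy as np

# Hypothetical daily changes (invented values for illustration):
volume_chg = np.array([0.5, -0.2, 0.8, 0.1, -0.4, 0.3])   # trading volume
market_chg = np.array([1.0, -0.5, 0.7, 0.2, -0.8, 0.4])   # market returns
# Price changes constructed as exactly 0.1 + 0.5*volume + 1.0*market
price_chg = np.array([1.35, -0.5, 1.2, 0.35, -0.9, 0.65])

# Design matrix: a column of ones for the intercept plus one column
# per explanatory variable
X = np.column_stack([np.ones_like(volume_chg), volume_chg, market_chg])

# Multiple linear regression via least squares:
# price_chg ≈ b0 + b1 * volume_chg + b2 * market_chg
coeffs, residuals, rank, _ = np.linalg.lstsq(X, price_chg, rcond=None)
b0, b1, b2 = coeffs
print(f"intercept={b0:.2f}, volume coef={b1:.2f}, market coef={b2:.2f}")
```

Because the data were built from an exact linear relationship, the fit recovers the coefficients 0.1, 0.5, and 1.0 that generated it.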