A **Logistic Regression Model** is a statistical model in which one or more independent variables determine a binary outcome.

For example, a company sends out mailers encouraging customers to buy a product. The company has data showing each customer's age and whether or not they bought the product.

You can see the data implies that older people are more likely to buy the product.

Can this be modeled? A simple linear regression model will not work well. A fitted straight line extends beyond the value 1 (and below 0), and it would be silly to say there is more than a 100% chance of anything happening.
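To see the problem concretely, here is a minimal sketch using made-up age/purchase data: fitting an ordinary least-squares line to a 0/1 outcome happily predicts values above 1 and below 0.

```python
# Sketch: fitting a straight line to a 0/1 outcome can produce
# "probabilities" outside [0, 1]. The data below is invented for illustration.
import numpy as np

ages = np.array([18, 22, 30, 38, 45, 52, 60, 65], dtype=float)
bought = np.array([0, 0, 0, 1, 0, 1, 1, 1], dtype=float)

# Ordinary least-squares line: bought ≈ slope * age + intercept
slope, intercept = np.polyfit(ages, bought, 1)

print(slope * 80 + intercept)  # an 80-year-old: prediction greater than 1
print(slope * 10 + intercept)  # a 10-year-old: prediction less than 0
```

The line is a fine trend summary, but it cannot be read as a probability.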

The key to remember for this example is you want to **predict probability**, and probability ranges from 0 to 1.

## Logistic Regression Model Formula

To get the formula for a Logistic Regression Model, you apply the Sigmoid Function to the Simple Linear Regression equation. Start with the Linear Equation and the Sigmoid Function:

    y = b0 + b1*x
    p = 1 / (1 + e^(-y))

Solve for *y* inside the Sigmoid Function:

    y = ln(p / (1 - p))

Then substitute this value of *y* into the Linear Equation:

    ln(p / (1 - p)) = b0 + b1*x
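Solving that last equation for *p* gives the familiar S-curve, p = 1 / (1 + e^(-(b0 + b1*x))). A minimal sketch, with hypothetical coefficients b0 and b1 chosen purely for illustration:

```python
# Logistic regression formula: p = 1 / (1 + e^(-(b0 + b1*x))).
# The coefficients below are made up; a real model would estimate them.
import math

def predict_probability(x, b0, b1):
    """Apply the sigmoid to the linear equation b0 + b1*x."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

p = predict_probability(50, b0=-4.0, b1=0.1)
print(p)  # always strictly between 0 and 1
```

Whatever value *x* takes, the output stays between 0 and 1, which is exactly the property a probability needs.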

Applying the Logistic Regression formula transforms the straight line of a Linear Regression Model into an S-shaped curve. With this formula you can predict probability.

Take four random values for the independent variable *x* and project them onto the curve. These projections are the fitted values.

This information lets you estimate a probability for each customer. It works slightly differently if you want to make a binary prediction. In this case, you predict whether the customer will buy the product.

To make a prediction, you choose an arbitrary horizontal cutoff line; the 50% line is a fair choice. Any projected value on the Logistic Regression Model that falls below this line becomes a "no" prediction, and any value above the line becomes a "yes" prediction.
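The projection-then-cutoff step can be sketched as follows. The four ages and the sigmoid coefficients are invented for illustration:

```python
# Sketch: project four customers onto the curve, then apply a 50% cutoff.
# Coefficients b0 and b1 are hypothetical, not fitted to real data.
import math

def predict_probability(age, b0=-4.0, b1=0.1):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * age)))

for age in [20, 35, 45, 60]:  # four random customers
    p = predict_probability(age)
    decision = "yes" if p >= 0.5 else "no"
    print(f"age {age}: p = {p:.2f} -> {decision}")
```

With these coefficients the two younger customers fall below the 50% line (predict "no") and the two older ones fall above it (predict "yes").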

After predictions are made, a **confusion matrix** is used to give the accuracy of the predictions.

The second column of the top row gives the number of **false positives**: outcomes predicted to happen that in reality did not happen. The first column of the second row gives the number of **false negatives**: outcomes predicted not to happen that in reality did.
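A minimal sketch of building that 2x2 matrix by hand, using made-up actual and predicted labels (1 = bought, 0 = did not buy):

```python
# Sketch: count the four confusion-matrix cells for invented labels.
actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 1, 0, 1, 0]

tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)  # predicted yes, really no
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)  # predicted no, really yes

# Top row: [true negatives, false positives]; bottom row: [false negatives, true positives]
matrix = [[tn, fp], [fn, tp]]
accuracy = (tp + tn) / len(actual)
print(matrix, accuracy)
```

Accuracy is simply the correct predictions (the diagonal) divided by the total number of predictions.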