# Logistic Regression Models in R

In logistic regression models, the response variable ($y$) takes categorical (binary, dichotomous) values such as 1 or 0 (TRUE/FALSE). The model estimates the probability of the binary response through a mathematical equation relating the response variable to the predictor(s). The built-in function `glm()` can be used to perform logistic regression analysis.

Logistic regression works with odds rather than probabilities directly. If $p$ is the probability of success and $q = 1 - p$ the probability of failure, the odds in favour of success are $\frac{p}{q}=\frac{p}{1-p}$.

Note that a probability can be converted to odds, and odds can be converted back to a probability. However, unlike a probability, odds can exceed 1. For example, if the probability of an event is 0.25, the odds in favour of that event are $\frac{0.25}{0.75}=0.33$, while the odds against the event are $\frac{0.75}{0.25}=3$.
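As a quick check, the conversions above can be reproduced in R (the variable names are illustrative):

```r
# Converting between probability and odds (values from the example above)
p <- 0.25
odds_for     <- p / (1 - p)        # odds in favour: 0.25 / 0.75 = 0.33
odds_against <- (1 - p) / p        # odds against:   0.75 / 0.25 = 3
p_back       <- odds_for / (1 + odds_for)  # odds back to probability

odds_for
odds_against
p_back
```

Running the last three lines prints 0.3333, 3, and 0.25, confirming that the two forms carry the same information.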

In the built-in dataset `mtcars`, the column `am` describes the transmission mode (0 = automatic, 1 = manual), which is a binary value. Logistic regression models between the response variable `am` and other columns (regressors) such as `vs`, `wt`, `hp`, and `cyl` can be fitted as follows:

### Logistic regression with one dichotomous predictor

```r
logmodel1 <- glm(am ~ vs, family = "binomial", data = mtcars)
summary(logmodel1)
```

### Logistic regression with one continuous predictor

```r
logmodel2 <- glm(am ~ wt, family = "binomial", data = mtcars)
summary(logmodel2)
```
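The fitted model can also be used to obtain predicted probabilities rather than logits, via `predict()` with `type = "response"`. A minimal sketch (the hypothetical car with `wt = 3` is an illustrative input, not from the original):

```r
# Fit the weight-only model on the built-in mtcars data
logmodel2 <- glm(am ~ wt, family = "binomial", data = mtcars)

# Fitted probability of a manual transmission (am = 1) for each car
prob_manual <- predict(logmodel2, type = "response")
head(round(prob_manual, 3))

# Probability for a hypothetical car weighing 3,000 lbs (wt = 3)
predict(logmodel2, newdata = data.frame(wt = 3), type = "response")
```

Because heavier cars in `mtcars` tend to be automatics, the predicted probability of a manual transmission decreases as `wt` increases.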

### Logistic regression with multiple predictors

```r
logmodel3 <- glm(am ~ cyl + hp + wt, family = "binomial", data = mtcars)
summary(logmodel3)
plot(logmodel3)
```
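A common follow-up is to turn the fitted probabilities into predicted classes and compare them with the observed transmission modes. The 0.5 cutoff below is an illustrative choice, not part of the original text:

```r
# Fit the multiple-predictor model on the built-in mtcars data
logmodel3 <- glm(am ~ cyl + hp + wt, family = "binomial", data = mtcars)

# Classify each car as manual (1) if its fitted probability exceeds 0.5
pred_class <- ifelse(predict(logmodel3, type = "response") > 0.5, 1, 0)

# Cross-tabulate predicted vs. observed transmission
table(predicted = pred_class, observed = mtcars$am)
```

The diagonal of the resulting table counts correctly classified cars; off-diagonal cells count misclassifications.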

Note: in a logistic regression model, both dichotomous and continuous variables can be used as predictors.

In R, the coefficients returned by logistic regression are logits, i.e. the log of the odds. To convert a logit to an odds ratio, exponentiate it; to convert a logit to a probability, use $\frac{e^\beta}{1+e^\beta}$. For example,

```r
logmodel1 <- glm(am ~ vs, family = "binomial", data = mtcars)
logit_coef <- coef(logmodel1)              # log-odds (logits)
exp(logit_coef)                            # odds ratios
exp(logit_coef) / (1 + exp(logit_coef))    # probabilities
```