Curvilinear Regression in R: A Quick Reference

Introduction to Curvilinear Regression in R Language

In this post, we will learn about some basics of curvilinear regression in R.

Curvilinear (non-linear) regression analysis is used to determine whether a non-linear trend exists between $X$ and $Y$.

Adding more parameters to an equation generally results in a closer fit to the data. A quadratic or cubic equation will always have an $R^2$ at least as high as that of the linear regression model, and a cubic equation will usually have a higher $R^2$ than a quadratic one.
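As a quick illustration of this point, here is a minimal sketch (the simulated x and y values and the object names fit1, fit2, fit3 are purely illustrative) that fits linear, quadratic, and cubic models and compares their $R^2$ values; the $R^2$ never decreases as terms are added.

# simulated example data (illustrative only)
set.seed(1)
x <- 1:20
y <- 5 * log(x) + rnorm(20)

fit1 <- lm(y ~ x)                       # linear
fit2 <- lm(y ~ poly(x, 2, raw = TRUE))  # quadratic
fit3 <- lm(y ~ poly(x, 3, raw = TRUE))  # cubic

# R-squared values of the three fits (non-decreasing as terms are added)
c(summary(fit1)$r.squared, summary(fit2)$r.squared, summary(fit3)$r.squared)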

Logarithmic and Polynomial Relationships

The logarithmic relationship can be described as follows:
$$Y = m\, \log(x) + c$$
The polynomial relationship can be described as follows:
$$Y = m_1x + m_2x^2 + m_3x^3 + \cdots + m_nx^n + c$$

The logarithmic example is more akin to a simple regression, whereas the polynomial example is multiple regression. Logarithmic relationships are common in the natural world; you may encounter them in many circumstances. Drawing the relationships between response and predictor variables as a scatter plot is generally a good starting point.

Consider the following data that are related in a curvilinear form,

Growth   Nutrient
   2        2
   9        4
  11        6
  12        8
  13       10
  14       16
  17       22
  19       28
  17       30
  18       36
  20       48

Performing Curvilinear Regression in R

Let us perform a curvilinear regression in R language.

library(ggplot2)   # needed for ggplot() and stat_smooth()

Growth   <- c(2, 9, 11, 12, 13, 14, 17, 19, 17, 18, 20)
Nutrient <- c(2, 4, 6, 8, 10, 16, 22, 28, 30, 36, 48)
data <- data.frame(Growth, Nutrient)

# scatter plot with a smoothed curve
ggplot(data, aes(Nutrient, Growth)) +
  geom_point() +
  stat_smooth()
(Figure: Curvilinear Regression in R, a scatter plot of Growth against Nutrient with a smoothed curve)

The scatter plot shows that the relationship appears to be a logarithmic one.

Linear Regression in R

Let us carry out a linear regression using the lm() function, taking the $\log$ of the predictor variable rather than the predictor itself.

# fit a linear model of Growth on log(Nutrient)
mod <- lm(Growth ~ log(Nutrient), data = data)
summary(mod)

Call:
lm(formula = Growth ~ log(Nutrient), data = data)

Residuals:
    Min      1Q  Median      3Q     Max 
-2.2274 -0.9039  0.5400  0.9344  1.3097 

Coefficients:
              Estimate Std. Error t value Pr(>|t|)    
(Intercept)     0.6914     1.0596   0.652     0.53    
log(Nutrient)   5.1014     0.3858  13.223 3.36e-07 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 1.229 on 9 degrees of freedom
Multiple R-squared:  0.951,     Adjusted R-squared:  0.9456 
F-statistic: 174.8 on 1 and 9 DF,  p-value: 3.356e-07
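To see how well the logarithmic model fits, the fitted curve can be overlaid on the scatter plot. Here is a minimal sketch using base R graphics and the objects created above (the grid object newdat is only an illustrative name):

plot(Growth ~ Nutrient, data = data)
# predict over a fine grid of Nutrient values and draw the fitted logarithmic curve
newdat <- data.frame(Nutrient = seq(min(data$Nutrient), max(data$Nutrient), length.out = 100))
lines(newdat$Nutrient, predict(mod, newdata = newdat))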

FAQs about Curvilinear Regression in R

  1. Write in detail about curvilinear regression models.
  2. How can one visually assess whether a curvilinear relationship exists between the response and predictor variables?
  3. What are the consequences of estimating a curvilinear relationship with a simple linear regression model?

Learn about Performing Linear Regression in R


Backward Deletion Method Step by Step in R

Introduction to Backward Deletion Method

When many predictor variables are available, the aim is to find the most statistically significant model the data will support. There are two main approaches: forward stepwise regression and the backward deletion method.

In Forward Stepwise Regression: start with the single best variable and add more variables to build the model into a more complex form.

In Backward Deletion (Backward Selection) Regression: put all the variables in the model and reduce it by removing variables until you are left with only significant terms.

Backward Deletion method (Step by Step Procedure)

Let’s start with a big model and trim it until you get the best (most statistically significant) regression model. The drop1() command examines a linear model and determines the effect of removing each term from the existing model. Complete the following steps to perform a backward deletion. Note that there are also dedicated R packages for backward and forward selection of predictors.

Step 1: (Full Model)

Step 1: To start, create a “full” model (all variables in the model at once). Since it would be tedious to type every variable name, one can use a shortcut: the dot notation.

# the dot means "all other variables in mtcars"
mod <- lm(mpg ~ ., data = mtcars)

Step 2: Formula Function

Step 2: Let’s use the formula() function to see the response and predictor variables used in Step 1.

formula(mod)
(Output of formula(mod): mpg as the response, with all other mtcars variables as predictors)

Step 3: Drop1 Function

Step 3: Let’s use the drop1() function to see which term (predictor) should be deleted from the model.

drop1(mod)
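If you prefer not to scan the AIC column by eye, the candidate term can also be identified programmatically. A minimal sketch (the object name d is illustrative):

d <- drop1(mod)
# row names include "<none>" (keep the model as is) and each term;
# the row with the smallest AIC identifies the candidate to drop
rownames(d)[which.min(d$AIC)]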

Step 4: Remove the Term

Step 4: Look for the term whose removal gives the lowest AIC value. Re-form the model without that variable (the one that is non-significant or yields the lowest AIC). The simplest way to do this is to copy the model formula to the clipboard, paste it into a new command, and edit out the term you do not want.

# for example, if cyl is the term whose removal gives the lowest AIC:
mod1 <- lm(mpg ~ . - cyl, data = mtcars)

Step 5: Examine the Effect

Step 5: Examine the effect of dropping another term by running the drop1() command once more:

drop1(mod1)

If removing another variable gives a lower AIC, remove that variable, and carry out this process repeatedly until you have a model that you are happy with.
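Rather than retyping the formula at every round, the update() function can refit the model with one more term removed. A minimal sketch (the term dropped here, vs, is only illustrative; drop whichever term drop1() flags at that step, and the name mod2 is likewise illustrative):

# refit without one more term, then check the remaining terms again
mod2 <- update(mod1, . ~ . - vs)
drop1(mod2)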

FAQs about the Backward Deletion Method in R

  1. Write a step-by-step procedure to perform the Backward Deletion Method in R.
  2. How can one examine the effect of dropping a term from the model?
  3. What is the use of the formula() function with an lm() model?
  4. What is the use of the drop1() function in R?

Learn more about lm() function


Performing Linear Regression in R: A Quick Reference

Introduction to Performing Linear Regression in R

Regression builds a function of independent variables (also known as predictors, regressors, explanatory variables, or features) to predict a dependent variable (also called a response, target, or regressand). Here we will focus on performing linear regression in the R Language.

Linear regression predicts the response with a linear function of the predictors: $$y=\beta_0+\beta_1x_1+\beta_2x_2+\cdots + \beta_kx_k,$$ where $x_1, x_2, \cdots, x_k$ are the predictors and $y$ is the response to predict.

Before performing the regression analysis, it is very helpful to compute the coefficient of correlation between the dependent and independent variables, and to draw a scatter diagram of the two.
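For the example used below (mpg and wt from the mtcars data), this preliminary check might look like the following sketch; the scatter diagram itself is drawn with plot() later in this section.

# correlation between the response (mpg) and the predictor (wt)
cor(mtcars$mpg, mtcars$wt)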

Performing Linear Regression in R

Load the mtcars data, and check the data structure using str().

str(mtcars)

If you have data stored in an external file, such as a CSV file, you can use the read.csv() function to load the data into R. To learn about importing data files in R, follow the link: Import Data files in R
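A minimal sketch of loading such a file (the file name mydata.csv and the object name mydat are hypothetical):

# read a comma-separated file from the working directory (hypothetical file name)
mydat <- read.csv("mydata.csv", header = TRUE)
str(mydat)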

Suppose we want to check the impact of weight (wt) on miles per gallon (mpg), test the significance of the regression coefficient, and examine other statistics to assess the goodness of fit of our model.

mod <- lm(mpg ~ wt, data = mtcars)
summary(mod)
(Output of summary(mod): coefficient estimates, tests, and goodness-of-fit statistics for the fitted model)

Now look at the components of the results stored in mod:

names(mod)

Getting Coefficients and Different Regression Statistics

Let us get the coefficients of the fitted regression model in R

mod$coef
coef(mod)

To obtain the confidence intervals of the estimated coefficients, one can use the confint() function:

confint(mod)

Fitted values from the regression model can be obtained by using fitted()

mod$fitted
fitted(mod)
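Predictions for new observations (rather than the original data points) can be obtained with predict() and a new data frame. A minimal sketch (the weights in newwt are illustrative values; wt is measured in 1000 lbs in mtcars):

# predicted mpg for two hypothetical car weights
newwt <- data.frame(wt = c(2.5, 3.5))
predict(mod, newdata = newwt)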

The residuals of the regression model can be obtained using the resid() (or residuals()) function:

mod$resid
resid(mod)

One can check the formula used to perform the simple/multiple regression. It will tell you which variable is used as the response and which are used as explanatory variables.

formula(mod)

Graphical Representation of Relationship

To visualize the relationship between variables or pairs of variables graphically, one can use the plot() or pairs() functions. Let us draw the scatter diagram between the dependent variable mpg and the explanatory variable wt using the plot() function.

plot(mpg ~ wt, data = mtcars)
(Figure: Scatter Plot and Performing Linear Regression in R, mpg plotted against wt)

One can add the best-fitted line to the scatter plot. For this purpose, use abline() with an object of class lm, such as mod in this case:

abline(mod)

There are many other functions and R packages to perform linear regression models in the R Language.

FAQs about Performing Linear Regression Models in R

  1. What is the use of the abline() function in R?
  2. How can a simple linear regression model be visualized in R?
  3. How can one obtain the fitted/predicted values of a simple linear regression model in R?
  4. Write a command that saves the residuals of an lm() model in a variable.
  5. State the step-by-step procedure for performing linear regression in R.

To learn more about the lm() function in R, visit https://itfeature.com