**Regression analysis** is a statistical process for estimating the relationship between a dependent (criterion) variable and one or more independent variables (predictors). It explains how the criterion changes as the selected predictors change; most commonly it estimates the conditional expectation of the dependent variable given the predictors, i.e. the mean of the dependent variable when the explanatory variables are held at fixed values. The three main use cases for regression analysis are determining the strength of predictors, forecasting an effect, and predicting a trend.

**Types of regression:**

- Linear regression
- Logistic regression
- Polynomial regression
- Stepwise regression
- Ridge regression
- Lasso regression
- ElasticNet regression

**Logistic regression** is used when the dependent variable is dichotomous (binary). It estimates the parameters of a logistic model and is a form of binomial regression, modelling the relationship between a two-valued criterion and its predictors. Logistic regression equation (the log-odds as a linear function of the predictors):

l = log_b(p / (1 − p)) = β₀ + β₁x₁ + β₂x₂
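
As a sketch of how this looks in practice, here is a minimal example using scikit-learn's `LogisticRegression` on invented toy data (the threshold of 5 and the noise level are arbitrary choices for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# toy data: one predictor; the dichotomous criterion is 1 when x (plus noise) exceeds 5
x = rng.uniform(0, 10, size=(100, 1))
y = (x.ravel() + rng.normal(0, 1, size=100) > 5).astype(int)

model = LogisticRegression()
model.fit(x, y)

# predicted probabilities follow the fitted logistic curve
p = model.predict_proba([[2.0], [8.0]])[:, 1]
print(p)  # small probability for x = 2, large for x = 8
```

The fitted coefficients play the role of β₀ and β₁ in the log-odds equation above.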

**Polynomial regression** is used for curved (nonlinear) data and is fitted with the least squares method. The purpose, as in any regression analysis, is to model the expected value of the dependent variable y as a function of the independent variable x. Polynomial regression equation (degree n):

y = β₀ + β₁x + β₂x² + … + βₙxⁿ + ε
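
A minimal sketch of degree-2 polynomial regression with scikit-learn, on invented noiseless quadratic data, is to expand x into the feature columns [1, x, x²] and then fit by ordinary least squares:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# noiseless quadratic data: y = 2 + 0.5 x - 1.5 x^2
x = np.linspace(-3, 3, 50)[:, None]
y = 2 + 0.5 * x.ravel() - 1.5 * x.ravel() ** 2

# expand x into [1, x, x^2], then fit by least squares
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(x, y)

print(model.predict([[0.0]]))  # recovers the intercept, ~2.0
```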

**Stepwise regression** fits a regression model by choosing the predictive variables automatically. At each step a variable is added to, or removed from, the set of explanatory variables according to a selection criterion (for example an F-test, AIC, or BIC). The stepwise approaches are forward selection, backward elimination, and bidirectional elimination. The model fitted at each step is the ordinary multiple linear regression

y = β₀ + β₁x₁ + … + βₖxₖ + ε

restricted to the currently selected predictors.
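
Forward selection can be sketched with scikit-learn's `SequentialFeatureSelector` (available in scikit-learn ≥ 0.24; it scores candidates by cross-validation rather than by classical F-tests). The data below are invented so that only two of the five candidate predictors matter:

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# five candidate predictors, but only columns 0 and 2 actually drive y
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(0, 0.1, size=200)

# forward selection: greedily add the variable that improves the fit most
selector = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=2, direction="forward"
)
selector.fit(X, y)

print(selector.get_support())  # columns 0 and 2 are selected
```

Passing `direction="backward"` instead gives backward elimination.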

**Ridge regression** is a method for analyzing multiple regression data that suffer from multicollinearity. When multicollinearity occurs, the least squares estimates are unbiased, but their variances are large, so they may be far from the true values. By adding a degree of bias to the regression estimates, ridge regression reduces the standard errors. Ridge regression formula (L2-penalized least squares):

β̂ = (XᵀX + λI)⁻¹Xᵀy
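
A minimal sketch on invented, nearly collinear data, checking scikit-learn's `Ridge` against the closed-form solution β̂ = (XᵀX + λI)⁻¹Xᵀy (the intercept is disabled so the comparison is direct):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# two nearly collinear predictors (x2 is almost a copy of x1)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(0, 0.01, size=200)
X = np.column_stack([x1, x2])
y = X[:, 0] + X[:, 1] + rng.normal(0, 0.1, size=200)

model = Ridge(alpha=1.0, fit_intercept=False)
model.fit(X, y)

# closed form with lambda = alpha = 1.0
beta = np.linalg.solve(X.T @ X + 1.0 * np.eye(2), X.T @ y)
print(model.coef_, beta)  # the two estimates agree
```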

**Lasso regression** is a regression analysis technique that performs both variable selection and regularization. It uses soft thresholding and selects only a subset of the provided covariates for use in the final model. Lasso regression solves the L1-penalized least squares problem:

min over β of (1 / 2n) ‖y − Xβ‖₂² + λ‖β‖₁
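
The variable-selection behavior can be sketched with scikit-learn's `Lasso` on invented data where only three of ten covariates are informative (the value alpha = 0.1 is an arbitrary illustrative choice):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# ten candidate covariates, but only the first three matter
X = rng.normal(size=(200, 10))
y = 4.0 * X[:, 0] - 3.0 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 0.1, size=200)

model = Lasso(alpha=0.1)
model.fit(X, y)

# soft thresholding zeroes out the irrelevant coefficients entirely
print(model.coef_)
```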

**ElasticNet regression** is a regularized regression method that linearly combines the L1 and L2 penalties of the lasso and ridge methods. It is used for support vector machines, metric learning, and portfolio optimization. The penalty function is defined as:

λ₁‖β‖₁ + λ₂‖β‖₂²
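
In scikit-learn's `ElasticNet`, the mix of the two penalties is controlled by `l1_ratio`. A minimal sketch on invented data (the alpha and l1_ratio values are arbitrary illustrative choices):

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)

X = rng.normal(size=(200, 8))
y = 3.0 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(0, 0.1, size=200)

# l1_ratio mixes the two penalties: 1.0 is pure lasso, 0.0 is pure ridge
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X, y)

print(model.coef_)  # the two informative coefficients dominate, slightly shrunk
```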

Here's a simple implementation:

```
# importing libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# random predictor values on [0, 11)
x = 11 * np.random.random((10, 1))

# y = a * x + b
y = 1.0 * x + 3.0

# create and fit a linear regression model
model = LinearRegression()
model.fit(x, y)

# predict y over a grid of new x values
x_pred = np.linspace(0, 11, 100)
y_pred = model.predict(x_pred[:, np.newaxis])

# plot the results
plt.figure(figsize=(3, 5))
ax = plt.axes()
ax.scatter(x, y)
ax.plot(x_pred, y_pred)
ax.set_xlabel('predictors')
ax.set_ylabel('criterion')
ax.axis('tight')

plt.show()
```

**Output:**

A scatter plot of the ten random (x, y) points with the fitted regression line drawn through them.