Logistic regression

The idea of logistic regression: based on the existing data, fit a regression formula for the classification boundary and classify samples according to it (it is mainly used to solve binary classification problems).
The sigmoid function

g(z) = 1 / (1 + e^(-z))

Its graph is an S-shaped curve: it maps any real input into the interval (0, 1), approaches 0 as z → −∞, approaches 1 as z → +∞, and passes through the point (0, 0.5).
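As a quick sanity check (a minimal sketch using NumPy), the key properties of the sigmoid function can be verified numerically:

```python
import numpy as np

def sigmoid(z):
    # Logistic (sigmoid) function: maps any real number into (0, 1)
    return 1 / (1 + np.exp(-z))

print(sigmoid(0))    # 0.5
print(sigmoid(10))   # close to 1
print(sigmoid(-10))  # close to 0

# Useful symmetry: g(-z) = 1 - g(z)
print(np.isclose(sigmoid(-2), 1 - sigmoid(2)))  # True
```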

Logistic regression model function

Using the sigmoid function, the logistic regression model function is constructed:

h_θ(x) = g(θ^T x) = 1 / (1 + e^(-θ^T x))

Taking the value of the sigmoid function as the posterior estimate that a sample belongs to class 1, p(y=1|x; θ), we get:

p(y=1|x; θ) = h_θ(x)

Correspondingly, the posterior estimate that a sample belongs to class 0, p(y=0|x; θ), is:

p(y=0|x; θ) = 1 − h_θ(x)

Combining the two formulas above gives:

p(y|x; θ) = h_θ(x)^y · (1 − h_θ(x))^(1−y)
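The combined formula p(y|x; θ) = h^y (1 − h)^(1−y) can be checked numerically; the value of h below is made up for illustration. For y = 1 it reduces to h, and for y = 0 it reduces to 1 − h:

```python
# Hypothetical model output h = h_theta(x) for one sample
h = 0.8

def p(y, h):
    # Combined formula: p(y|x; theta) = h^y * (1 - h)^(1 - y)
    return h**y * (1 - h)**(1 - y)

print(p(1, h))  # reduces to h, i.e. 0.8
print(p(0, h))  # reduces to 1 - h, i.e. about 0.2
```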

Then the likelihood function over m samples is:

L(θ) = ∏_{i=1}^{m} h_θ(x_i)^{y_i} · (1 − h_θ(x_i))^{1−y_i}

Log-likelihood function:

l(θ) = log L(θ) = ∑_{i=1}^{m} [ y_i log h_θ(x_i) + (1 − y_i) log(1 − h_θ(x_i)) ]

Taking the average negative log-likelihood J(θ) = −l(θ)/m as the loss and finding its partial derivative:

∂J(θ)/∂θ_j = (1/m) ∑_{i=1}^{m} (h_θ(x_i) − y_i) x_{i,j}

Update the parameters θ with learning rate α:

θ_j := θ_j − α · (1/m) ∑_{i=1}^{m} (h_θ(x_i) − y_i) x_{i,j}
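As a sketch on small synthetic data (made up here for illustration, not the dataset used in the full implementation), the gradient formula and the update rule can be sanity-checked against a finite-difference approximation:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def loss(x, y, w):
    # Averaged negative log-likelihood J(theta)
    h = sigmoid(x @ w)
    return -(y * np.log(h) + (1 - y) * np.log(1 - h)).mean()

def gradient(x, y, w):
    # Analytic gradient: X^T (h - y) / m
    m = len(y)
    return x.T @ (sigmoid(x @ w) - y) / m

# Tiny synthetic data (assumed for illustration)
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))
y = np.array([0., 1., 1., 0., 1.])
w = rng.normal(size=3)

# Compare the analytic gradient with a central finite difference
eps = 1e-6
num_grad = np.array([
    (loss(x, y, w + eps * e) - loss(x, y, w - eps * e)) / (2 * eps)
    for e in np.eye(3)
])
print(np.allclose(gradient(x, y, w), num_grad, atol=1e-6))  # True

# One gradient-descent update: theta := theta - alpha * grad
alpha = 0.1
w_new = w - alpha * gradient(x, y, w)
print(loss(x, y, w_new) < loss(x, y, w))  # the loss decreases
```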

Implementing logistic regression with gradient descent:
# Implementing logistic regression with gradient descent
import numpy as np
import matplotlib.pyplot as plt
import mglearn

x, y = mglearn.datasets.make_forge()
x = np.hstack((np.ones((len(x), 1)), x))  # prepend a bias column of ones
plt.scatter(x[:, 1], x[:, 2], c=y)
y = y.reshape(-1, 1)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def loss_function(x, y, w, m):
    left = np.multiply(y, np.log(sigmoid(np.matmul(x, w)))).sum()
    # fixed: the second term is log(1 - h), not sigmoid(1 - x @ w)
    right = np.multiply((1 - y), np.log(1 - sigmoid(np.matmul(x, w)))).sum()
    return -(left + right) / m

def gradient_Descent(x, y):
    epoch = 1000
    learning_rate = 0.01
    m, n = x.shape
    w = np.zeros((n, 1))
    loss = []
    for i in range(epoch):
        w_grad = np.matmul(x.T, (sigmoid(np.matmul(x, w)) - y)) / m
        w = w - learning_rate * w_grad
        loss.append(loss_function(x, y, w, m))
    return w, loss

w, loss = gradient_Descent(x, y)

# Plot the decision boundary: w0 + w1*x1 + w2*x2 = 0  =>  x2 = -(w1*x1 + w0)/w2
x_test = np.linspace(8, 12, 100).reshape(-1, 1)
plt.plot(x_test, (-w[1] * x_test - w[0]) / w[2], c='g')
plt.show()

# Plot the training loss curve
plt.plot(range(1000), loss)
plt.show()

Implementing logistic regression with sklearn:
# Logistic regression with sklearn
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score, roc_curve
import matplotlib.pyplot as plt

# Load the data
cancer = load_breast_cancer()
x = cancer.data    # sample features
y = cancer.target  # sample labels

# Split into training and test sets
x_train, x_test, y_train, y_test = train_test_split(x, y, random_state=2)

# Model
lr = LogisticRegression(class_weight='balanced')

# Train the model
lr.fit(x_train, y_train)

# Predict
y_pred = lr.predict(x_test)

# Model evaluation: accuracy
print('accuracy: ', accuracy_score(y_test, y_pred))

# Model evaluation: roc_auc_score
print('roc_auc_score: ', roc_auc_score(y_test, y_pred))

# ROC curve
fpr, tpr, _ = roc_curve(y_test, lr.predict_proba(x_test)[:, 1])

# Plot
plt.plot(fpr, tpr, label='roc_auc')
plt.title('roc_curve')
plt.xlabel('FPR')
plt.ylabel('TPR')
plt.legend()
plt.show()
