Multiple linear regression by gradient descent (code implementation)

from numpy import genfromtxt
import numpy as np
import matplotlib.pyplot as plt
from sklearn import linear_model

data = genfromtxt(r"\Delivery.csv", delimiter=',')
Take a look at the data.
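The Delivery.csv file itself is not reproduced in the article. As a self-contained sketch (the values below are made up for illustration), you can write a small CSV with the same three-column shape (miles, number of deliveries, delivery time) and load it the same way:

```python
from numpy import genfromtxt

# Hypothetical stand-in for Delivery.csv: miles, num_deliveries, time
csv_text = "100,4,9.3\n50,3,4.8\n100,4,8.9\n100,2,6.5\n50,2,4.2\n"
with open("Delivery_demo.csv", "w") as f:
    f.write(csv_text)

data = genfromtxt("Delivery_demo.csv", delimiter=',')
x_data = data[:, :-1]   # first two columns are the features
y_data = data[:, -1]    # last column is the label
print(x_data.shape, y_data.shape)  # (5, 2) (5,)
```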

x_data holds the feature columns and y_data the labels, so we need to fit three parameters: θ0, θ1, θ2.
lr = 0.0001  # learning rate
theta0 = 0
theta1 = 0
theta2 = 0
epochs = 1000  # maximum number of iterations
Next, the procedure is the same as for simple linear regression: solve for the parameters by gradient descent on the least-squares cost.

def compute_error(theta0, theta1, theta2, x_data, y_data):
    totalError = 0
    for i in range(0, len(x_data)):
        totalError += (y_data[i] - (theta1 * x_data[i, 0] + theta2 * x_data[i, 1] + theta0)) ** 2
    return totalError / float(len(x_data))

def gradient_descent_runner(x_data, y_data, theta0, theta1, theta2, lr, epochs):
    # Total number of samples
    m = float(len(x_data))
    for i in range(epochs):
        theta0_grad = 0
        theta1_grad = 0
        theta2_grad = 0
        # Sum the gradients over all samples, then average
        for j in range(0, len(x_data)):
            theta0_grad += -(1 / m) * (y_data[j] - (theta1 * x_data[j, 0] + theta2 * x_data[j, 1] + theta0))
            theta1_grad += -(1 / m) * x_data[j, 0] * (y_data[j] - (theta1 * x_data[j, 0] + theta2 * x_data[j, 1] + theta0))
            theta2_grad += -(1 / m) * x_data[j, 1] * (y_data[j] - (theta1 * x_data[j, 0] + theta2 * x_data[j, 1] + theta0))
        # Update theta0, theta1, theta2
        theta0 = theta0 - (lr * theta0_grad)
        theta1 = theta1 - (lr * theta1_grad)
        theta2 = theta2 - (lr * theta2_grad)
    return theta0, theta1, theta2

theta0, theta1, theta2 = gradient_descent_runner(x_data, y_data, theta0, theta1, theta2, lr, epochs)
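The two nested loops above can also be written with matrix operations. This is a minimal vectorized sketch of the same update rule, checked on synthetic data (made up here, since the article's CSV is not included) generated from known coefficients:

```python
import numpy as np

def gradient_descent_vec(X, y, lr=0.0001, epochs=1000):
    """Same averaged-gradient update as the loop version, vectorized.

    X has shape (m, 2); theta = [theta0, theta1, theta2].
    """
    m = len(X)
    Xb = np.hstack([np.ones((m, 1)), X])  # prepend a column of 1s for theta0
    theta = np.zeros(3)
    for _ in range(epochs):
        error = y - Xb @ theta        # residuals for all samples at once
        grad = -(Xb.T @ error) / m    # averaged gradient of the squared error
        theta -= lr * grad
    return theta

# Synthetic check: y generated from theta = [1, 2, 3] (hypothetical values)
rng = np.random.default_rng(0)
X = rng.random((100, 2))
y = 1 + 2 * X[:, 0] + 3 * X[:, 1]
theta = gradient_descent_vec(X, y, lr=0.1, epochs=5000)
# theta converges toward [1, 2, 3]
```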
Plot the result.
ax = plt.figure().add_subplot(111, projection='3d')
ax.scatter(x_data[:, 0], x_data[:, 1], y_data, c='r', marker='o', s=100)  # red dots for the samples
x0 = x_data[:, 0]
x1 = x_data[:, 1]
# Generate the mesh grid
x0, x1 = np.meshgrid(x0, x1)
z = theta0 + x0 * theta1 + x1 * theta2
# Plot the 3D surface
ax.plot_surface(x0, x1, z)
# Set axis labels
ax.set_xlabel('Miles')
ax.set_ylabel('Num of Deliveries')
ax.set_zlabel('Time')
plt.show()

Multiple linear regression with sklearn

Loading and splitting the data works the same as usual.
Create the model.
model = linear_model.LinearRegression()
model.fit(x_data, y_data)
Print the fitted parameters.
# coefficients
print('coefficients:', model.coef_)
# intercept
print('intercept:', model.intercept_)
Test the model
x_test = [[10, 45]]
predict = model.predict(x_test)
print('predict:', predict)
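LinearRegression solves the least-squares problem in closed form, and predict() is just the linear combination intercept + coef · x. As a sanity check with hypothetical numbers (the y values below are generated from a known rule so the fit can be verified), the same parameters can be recovered with NumPy's lstsq:

```python
import numpy as np

# Hypothetical stand-in data: columns are (miles, num_deliveries);
# y is generated from intercept=2, coef=[0.05, 1.0] for verification
X = np.array([[100.0, 4], [50, 3], [100, 4], [100, 2], [50, 2]])
y = 2.0 + 0.05 * X[:, 0] + 1.0 * X[:, 1]

# Least-squares fit with an explicit intercept column -- the same
# problem that model.fit() solves
Xb = np.hstack([np.ones((len(X), 1)), X])
theta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
intercept, coef = theta[0], theta[1:]

# A prediction is just intercept + coef . x
x_test = np.array([10.0, 45.0])
predict = intercept + coef @ x_test
print('predict:', predict)  # 2 + 0.05*10 + 1*45 = 47.5
```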

Plotting: generate a mesh grid and draw the 3D surface so the quality of the fit can be inspected visually.
ax = plt.figure().add_subplot(111, projection='3d')
ax.scatter(x_data[:, 0], x_data[:, 1], y_data, c='r', marker='o', s=100)
x0 = x_data[:, 0]
x1 = x_data[:, 1]
# Generate the mesh grid
x0, x1 = np.meshgrid(x0, x1)
z = model.intercept_ + x0 * model.coef_[0] + x1 * model.coef_[1]
# Plot the 3D surface
ax.plot_surface(x0, x1, z)
# Set axis labels
ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('z')
plt.show()
