Introduction: This article provides Python solutions to the programming assignments in Andrew Ng's machine learning course, to help you better understand and study machine learning.
Python is widely used as a powerful programming language in Andrew Ng's machine learning course. Here we provide Python versions of the assignment solutions. Note that for brevity we give only the main code, without complete comments or detailed explanations; when working through it yourself, make sure you understand what each line of code does.
Assignment 1: Linear Regression
Requirement: implement linear regression in Python and optimize the parameters with gradient descent.
A Python code example follows:
```python
import numpy as np

# Model function
def model(X, w, b):
    return np.dot(X, w) + b

# Loss function (mean squared error)
def loss(X, y, w, b):
    predictions = model(X, w, b)
    return np.sum((predictions - y) ** 2) / len(y)

# Gradient descent
def gradient_descent(X, y, w, b, learning_rate=0.01, num_iterations=1000):
    m = len(y)
    for i in range(num_iterations):
        predictions = model(X, w, b)
        dw = (2 / m) * np.dot(X.T, (predictions - y))
        db = (2 / m) * np.sum(predictions - y)
        w = w - learning_rate * dw
        b = b - learning_rate * db
    return w, b

# Train the model
X = np.array([[1, 1], [1, 2], [2, 2], [2, 3]])  # input data
Y = np.array([3, 3, 4, 4])                      # target values
w, b = gradient_descent(X, Y, np.zeros(2), 0)
print('w:', w, 'b:', b)
```
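As a sanity check on the gradient-descent result, the same problem can be solved in closed form with the normal equations. The sketch below (not part of the original assignment) folds the bias into the weight vector by appending a column of ones and solves with `np.linalg.lstsq`:

```python
import numpy as np

# Same toy data as the assignment above.
X = np.array([[1, 1], [1, 2], [2, 2], [2, 3]], dtype=float)
Y = np.array([3, 3, 4, 4], dtype=float)

# Append a ones column so the bias is learned as one extra weight.
Xb = np.c_[X, np.ones(len(X))]

# Closed-form least-squares solution.
theta, *_ = np.linalg.lstsq(Xb, Y, rcond=None)
w_closed, b_closed = theta[:-1], theta[-1]
print('closed-form w:', w_closed, 'b:', b_closed)
```

With enough iterations and a suitable learning rate, the gradient-descent parameters should approach this closed-form solution.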
Assignment 2: Logistic Regression
Requirement: implement logistic regression in Python and optimize the parameters with gradient descent.
A Python code example follows:
```python
import numpy as np
from scipy.optimize import minimize
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def model(X, w):
    return sigmoid(np.dot(X, w))

# Negative log-likelihood; w comes first so the function can be
# passed directly to scipy.optimize.minimize.
def loss(w, X, y):
    predictions = model(X, w)
    cost = np.sum(y * np.log(predictions) + (1 - y) * np.log(1 - predictions)) / len(y)
    return -cost

titanic_data = np.genfromtxt('titanic.csv', delimiter=',')
x = titanic_data[:, :-1]
y = titanic_data[:, -1]
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2)
x_train = np.c_[np.ones(len(x_train)), x_train]  # prepend a bias column
x_test = np.c_[np.ones(len(x_test)), x_test]
y_train = np.array([0 if i <= 0.5 else 1 for i in y_train])
y_test = np.array([0 if i <= 0.5 else 1 for i in y_test])

best_params = minimize(loss, np.zeros(x_train.shape[1]),
                       args=(x_train, y_train), method='CG')
best_w = best_params.x[1:]  # feature weights
best_b = best_params.x[0]   # bias term
predictions = model(x_test, best_params.x)
predictions = np.array([1 if p > 0.5 else 0 for p in predictions])
print('accuracy:', accuracy_score(y_test, predictions))
```
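The example above relies on a `titanic.csv` file and on `scipy.optimize.minimize`. To match the assignment's actual requirement of optimizing with gradient descent, here is a self-contained sketch that minimizes the same negative log-likelihood by plain gradient descent on synthetic data (the data-generation rule and hyperparameters are assumptions for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Synthetic, linearly separable data: label 1 when x1 + x2 > 1.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 2, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 1).astype(float)
Xb = np.c_[np.ones(len(X)), X]  # prepend a bias column

w = np.zeros(Xb.shape[1])
lr = 0.5
for _ in range(2000):
    p = sigmoid(Xb @ w)
    grad = Xb.T @ (p - y) / len(y)  # gradient of the negative log-likelihood
    w -= lr * grad

preds = (sigmoid(Xb @ w) > 0.5).astype(float)
print('training accuracy:', (preds == y).mean())
```

The gradient `Xᵀ(p − y) / m` is the same one used in the CG-based solution above, just applied with a fixed learning rate instead of a line-search method.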