2.6 Vectorizing Logistic Regression's Gradient Output

db can be expressed as:

$$db = \frac{1}{m}\sum_{i=1}^{m} dz^{(i)}$$

dw can be expressed as:

$$dw = \frac{1}{m} X\, dZ^{T}$$

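To see why these matrix expressions compute the full-batch gradients, it helps to expand them (this is only a restatement of the definitions above, with $dZ = A - Y$ as a $1 \times m$ row vector, matching the code below):

$$dZ = A - Y = \begin{bmatrix} a^{(1)} - y^{(1)} & a^{(2)} - y^{(2)} & \cdots & a^{(m)} - y^{(m)} \end{bmatrix}$$

$$dw = \frac{1}{m} X\, dZ^{T} = \frac{1}{m}\left( x^{(1)} dz^{(1)} + x^{(2)} dz^{(2)} + \cdots + x^{(m)} dz^{(m)} \right)$$

The single matrix product therefore accumulates the per-example gradients without any explicit loop over the training examples.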
A single iteration of the vectorized gradient descent algorithm then proceeds as follows:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Forward propagation: X is (n_x, m), w is (n_x, 1), b is a scalar, Y is (1, m)
Z = np.dot(w.T, X) + b           # shape (1, m)
A = sigmoid(Z)                   # shape (1, m)

# Backward propagation: vectorized gradients over all m examples
dZ = A - Y                       # shape (1, m)
dw = 1 / m * np.dot(X, dZ.T)     # shape (n_x, 1)
db = 1 / m * np.sum(dZ)          # scalar

# Gradient descent update with learning rate alpha
w = w - alpha * dw
b = b - alpha * db
