2.6 Vectorizing Logistic Regression's Gradient Output

db can be expressed as:

db=\frac{1}{m}\sum_{i=1}^{m}dz^{(i)}

dw can be expressed as:

dw=\frac{1}{m}X\cdot dZ^T
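A minimal sketch to check this matrix form: multiplying X by dZ^T sums the per-example products X^{(i)} dz^{(i)} in one shot, so it must agree with an explicit loop over examples. The sizes n_x and m and the random dZ below are hypothetical stand-ins for a real A - Y.

```python
import numpy as np

rng = np.random.default_rng(1)
n_x, m = 4, 6                       # hypothetical: 4 features, 6 examples
X = rng.standard_normal((n_x, m))   # one training example per column
dZ = rng.standard_normal((1, m))    # stand-in for A - Y, shape (1, m)

# matrix form: a single matrix product replaces the Python loop
dw_vec = 1/m * np.dot(X, dZ.T)      # shape (n_x, 1)

# equivalent explicit sum over the m examples
dw_sum = sum(X[:, [i]] * dZ[0, i] for i in range(m)) / m

print(np.allclose(dw_vec, dw_sum))  # True
```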

A single iteration of gradient descent then proceeds as follows:

Z = np.dot(w.T, X) + b        # forward pass: linear part, shape (1, m)
A = sigmoid(Z)                # activations (predictions)
dZ = A - Y                    # gradient of the cost w.r.t. Z
dw = 1/m * np.dot(X, dZ.T)    # gradient w.r.t. weights, shape (n_x, 1)
db = 1/m * np.sum(dZ)         # gradient w.r.t. bias, a scalar

w = w - alpha*dw              # parameter update with learning rate alpha
b = b - alpha*db
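Putting the single-iteration update inside a loop gives a complete, runnable training sketch. The toy dataset, learning rate alpha = 0.5, and iteration count below are hypothetical choices for illustration; the update itself is exactly the five-line iteration above.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# hypothetical toy data: 2 features, 100 examples, labels from a linear rule
rng = np.random.default_rng(0)
m = 100
X = rng.standard_normal((2, m))                     # shape (n_x, m)
Y = (X[0] + X[1] > 0).astype(float).reshape(1, m)   # shape (1, m)

w = np.zeros((2, 1))
b = 0.0
alpha = 0.5

costs = []
for _ in range(200):                 # repeat the vectorized iteration
    Z = np.dot(w.T, X) + b
    A = sigmoid(Z)
    # cross-entropy cost (small epsilon guards against log(0))
    cost = -np.mean(Y*np.log(A + 1e-12) + (1 - Y)*np.log(1 - A + 1e-12))
    costs.append(cost)
    dZ = A - Y
    dw = 1/m * np.dot(X, dZ.T)
    db = 1/m * np.sum(dZ)
    w = w - alpha*dw
    b = b - alpha*db

print(costs[0] > costs[-1])          # cost decreased over training -> True
```

Note that no explicit for-loop over the m training examples appears anywhere; the only remaining loop is over gradient-descent iterations, which cannot be vectorized away.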
