2.6 Vectorizing Logistic Regression's Gradient Output
db can be expressed as: $db = \frac{1}{m}\sum_{i=1}^{m} dz^{(i)}$
dw can be expressed as: $dw = \frac{1}{m} X\, dZ^{T}$
For a single iteration, the vectorized gradient-descent update proceeds as follows:
Z = np.dot(w.T,X) + b
A = sigmoid(Z)
dZ = A-Y
dw = 1/m*np.dot(X,dZ.T)
db = 1/m*np.sum(dZ)
w = w - alpha*dw
b = b - alpha*db
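The update steps above can be assembled into a complete training loop. The sketch below is a minimal illustration, assuming the column-wise data layout used throughout these notes (X of shape (n_x, m) with one example per column, Y of shape (1, m)); the function and variable names (`gradient_descent`, the toy data) are illustrative, not from the original.

```python
import numpy as np

def sigmoid(z):
    # Elementwise logistic function
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, Y, alpha=0.1, num_iters=1000):
    # X: (n_x, m), one training example per column; Y: (1, m) labels
    n_x, m = X.shape
    w = np.zeros((n_x, 1))
    b = 0.0
    for _ in range(num_iters):
        Z = np.dot(w.T, X) + b       # (1, m) linear outputs
        A = sigmoid(Z)               # (1, m) predictions
        dZ = A - Y                   # (1, m) error terms
        dw = np.dot(X, dZ.T) / m     # (n_x, 1) gradient w.r.t. w
        db = np.sum(dZ) / m          # scalar gradient w.r.t. b
        w -= alpha * dw
        b -= alpha * db
    return w, b

# Toy data (hypothetical): two features, four examples;
# the label simply copies the first feature, so it is linearly separable
X = np.array([[0., 0., 1., 1.],
              [0., 1., 0., 1.]])
Y = np.array([[0., 0., 1., 1.]])
w, b = gradient_descent(X, Y, alpha=0.5, num_iters=2000)
preds = (sigmoid(np.dot(w.T, X) + b) > 0.5).astype(float)
```

Note that no explicit `for` loop over the m examples remains: `np.dot` and broadcasting handle all examples at once, which is the point of vectorization.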