DeepLearning.ai Deep Learning Course Notes
Course 1: Neural Networks and Deep Learning
Week 2: Basics of Neural Network Programming
2.4 Logistic Regression Gradient Descent
For a single training example, the logistic regression loss function is:
$$z=w^Tx+b$$

$$\hat y=a=\sigma(z)$$
$$L(a,y)=-(y\log(a)+(1-y)\log(1-a))$$
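As a concrete check of the formulas above, here is a minimal NumPy sketch of the forward pass for one example (the function names and example values are illustrative, not from the course):

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, b, x, y):
    # z = w^T x + b, a = sigma(z), L(a, y) = -(y log a + (1-y) log(1-a))
    z = np.dot(w, x) + b
    a = sigmoid(z)
    loss = -(y * np.log(a) + (1 - y) * np.log(1 - a))
    return a, loss
```

With `w = 0` and `b = 0` the prediction is `a = 0.5`, so the loss equals `log 2` regardless of the label.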
The backward propagation (gradient computation) for this logistic regression is:
$$da=\frac{\partial L}{\partial a}=-\frac{y}{a}+\frac{1-y}{1-a}$$
$$dz=\frac{\partial L}{\partial z}=\frac{\partial L}{\partial a}\cdot \frac{\partial a}{\partial z}=\left(-\frac{y}{a}+\frac{1-y}{1-a}\right)\cdot a(1-a)=a-y$$
$$dw_1=\frac{\partial L}{\partial w_1}=\frac{\partial L}{\partial z}\cdot \frac{\partial z}{\partial w_1}=x_1\cdot dz=x_1(a-y)$$
$$dw_2=\frac{\partial L}{\partial w_2}=\frac{\partial L}{\partial z}\cdot \frac{\partial z}{\partial w_2}=x_2\cdot dz=x_2(a-y)$$
$$db=\frac{\partial L}{\partial b}=\frac{\partial L}{\partial z}\cdot \frac{\partial z}{\partial b}=1\cdot dz=a-y$$
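The chain-rule results above collapse into a few lines of NumPy. The sketch below (names are illustrative) returns the single-example gradients:

```python
import numpy as np

def backward(x, y, a):
    # dz = dL/dz = a - y; dw_i = dL/dw_i = x_i * dz; db = dL/db = dz
    dz = a - y
    dw = x * dz   # elementwise, so dw[0] = x_1 * dz and dw[1] = x_2 * dz
    db = dz
    return dw, db
```

For example, with x = (1, 2), y = 1, and a = 0.8 this gives dz = -0.2, dw = (-0.2, -0.4), and db = -0.2, which matches evaluating the longer form (-y/a + (1-y)/(1-a)) · a(1-a) = -1.25 · 0.16 = -0.2.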
The gradient descent updates can then be written as:
$$w_1:=w_1-\alpha\ dw_1$$
$$w_2:=w_2-\alpha\ dw_2$$
$$b:=b-\alpha\ db$$
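Putting the forward pass, the gradients, and the update rule together, one gradient-descent loop on a single example can be sketched as follows (the values of x, y, and the learning rate alpha are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([1.0, 2.0])  # one example with two features (illustrative)
y = 1.0
w = np.zeros(2)
b = 0.0
alpha = 0.1               # learning rate (illustrative)

for _ in range(100):
    a = sigmoid(np.dot(w, x) + b)   # forward: z = w^T x + b, a = sigma(z)
    dz = a - y                      # backward: dz = a - y
    dw = x * dz                     #           dw_i = x_i * dz
    db = dz                         #           db = dz
    w = w - alpha * dw              # w_i := w_i - alpha * dw_i
    b = b - alpha * db              # b  := b  - alpha * db
```

Since this single example has label y = 1, repeated updates push the prediction a = sigma(w^T x + b) toward 1.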