3.7 Derivatives of activation functions

Derivative of the sigmoid function:

$$g(z) = \frac{1}{1+e^{-z}}$$
$$g'(z) = \frac{d}{dz}g(z) = g(z)\left(1-g(z)\right) = a(1-a)$$
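As a quick numerical check, here is a minimal NumPy sketch (the helper names `sigmoid` and `dsigmoid` are illustrative, not from the course code) comparing the analytic derivative $a(1-a)$ with a central finite-difference estimate:

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def dsigmoid(z):
    # g'(z) = g(z) * (1 - g(z)) = a * (1 - a)
    a = sigmoid(z)
    return a * (1.0 - a)

z = np.array([-2.0, 0.0, 3.0])
eps = 1e-6
numeric = (sigmoid(z + eps) - sigmoid(z - eps)) / (2 * eps)  # central difference
print(np.allclose(dsigmoid(z), numeric))  # True
```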

Derivative of the tanh function:

$$g(z) = \frac{e^{z}-e^{-z}}{e^{z}+e^{-z}}$$
$$g'(z) = \frac{d}{dz}g(z) = 1-\left(g(z)\right)^{2} = 1-a^{2}$$
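The same kind of check works for tanh, using NumPy's built-in `np.tanh` (the helper name `dtanh` is illustrative):

```python
import numpy as np

def dtanh(z):
    # g'(z) = 1 - (tanh(z))^2 = 1 - a^2
    a = np.tanh(z)
    return 1.0 - a ** 2

z = np.array([-1.0, 0.0, 2.0])
eps = 1e-6
numeric = (np.tanh(z + eps) - np.tanh(z - eps)) / (2 * eps)  # central difference
print(np.allclose(dtanh(z), numeric))  # True
```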

Derivative of the ReLU function:

$$g(z) = \max(0, z)$$
$$g'(z) = \begin{cases} 0 & \text{if } z < 0 \\ 1 & \text{if } z \geq 0 \end{cases}$$

(Strictly speaking, the derivative is undefined at $z = 0$; in practice it is simply set to 0 or 1 there.)
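A minimal sketch of ReLU and its derivative in NumPy (the helper names are illustrative; $z = 0$ is assigned derivative 1 here, following the convention above):

```python
import numpy as np

def relu(z):
    # g(z) = max(0, z)
    return np.maximum(0.0, z)

def drelu(z):
    # g'(z) = 0 if z < 0, 1 if z >= 0 (z = 0 handled by convention)
    return np.where(z < 0, 0.0, 1.0)

z = np.array([-1.5, 0.0, 2.0])
print(relu(z))   # [0. 0. 2.]
print(drelu(z))  # [0. 1. 1.]
```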

The Leaky ReLU function and its derivative:

$$g(z) = \max(0.01z, z)$$
$$g'(z) = \begin{cases} 0.01 & \text{if } z < 0 \\ 1 & \text{if } z \geq 0 \end{cases}$$
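An analogous sketch for Leaky ReLU, with the negative-side slope exposed as a parameter (`alpha=0.01` reproduces the formula above; the function names are illustrative):

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    # g(z) = max(alpha * z, z)
    return np.maximum(alpha * z, z)

def dleaky_relu(z, alpha=0.01):
    # g'(z) = alpha if z < 0, 1 if z >= 0
    return np.where(z < 0, alpha, 1.0)

z = np.array([-2.0, 0.0, 3.0])
print(leaky_relu(z))   # [-0.02  0.    3.  ]
print(dleaky_relu(z))  # [0.01 1.   1.  ]
```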
