# 1.12 Numerical Approximation of Gradients

An important test for a backpropagation neural network is gradient checking. Its purpose is to verify that the gradients computed during backpropagation are correct.

![](https://2314428465-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-Le0cHhI0S0DK8pwlrmD%2F-Le0cKOp1vaxoORIi4ak%2F-Le0ccUMoXtbcRdkrVmb%2F2import.png?generation=1556953075491748\&alt=media)

For a nonzero, very small $$\varepsilon$$, the approximation error of the two-sided difference can be written as $$O(\varepsilon^2)$$; the big-$$O$$ notation here bounds the **approximation error**. (A one-sided difference, $$\frac{f(\theta+\varepsilon)-f(\theta)}{\varepsilon}$$, only achieves an error of $$O(\varepsilon)$$, which is why the two-sided form is preferred for gradient checking.)

The gradient of a function $$f$$ at a point $$\theta$$ can be approximated as:

$$
g(\theta)=\frac{f(\theta+\varepsilon)-f(\theta-\varepsilon)}{2\varepsilon}
$$

where $$\varepsilon>0$$ and is sufficiently small.
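The two-sided difference above can be sketched in a few lines of Python. This is a minimal illustration, not the full gradient-checking procedure; the function `numerical_gradient` and the test function $$f(\theta)=\theta^3$$ are chosen here for demonstration.

```python
def numerical_gradient(f, theta, eps=1e-2):
    """Two-sided difference: g(theta) ≈ (f(theta+eps) - f(theta-eps)) / (2*eps)."""
    return (f(theta + eps) - f(theta - eps)) / (2 * eps)

# Example: f(theta) = theta^3, whose exact derivative is 3*theta^2.
f = lambda t: t ** 3

approx = numerical_gradient(f, 1.0)  # ≈ 3.0001
exact = 3 * 1.0 ** 2                 # = 3.0
error = abs(approx - exact)          # ≈ 1e-4, on the order of eps^2
```

With $$\varepsilon=0.01$$ the error is about $$10^{-4}$$, consistent with the $$O(\varepsilon^2)$$ bound; a one-sided difference with the same $$\varepsilon$$ would leave an error on the order of $$10^{-2}$$.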

