# 1.8 Vanishing Gradients with RNNs

[![](https://github.com/fengdu78/deeplearning_ai_books/raw/master/images/8fb1c4afe30b7a0ede26522b355068ba.png)](https://github.com/fengdu78/deeplearning_ai_books/blob/master/images/8fb1c4afe30b7a0ede26522b355068ba.png)

**cat** (label 1) is singular, so **was** should be used; **cats** (label 2) is plural, so **were** is used.

The sentence in this example contains a long-term dependency: a word near the beginning determines a word much later in the sentence. But the basic **RNN** model (label 3) is not good at capturing such long-range dependencies.

Backpropagation through an **RNN** is difficult because of the vanishing-gradient problem: the output error at later time steps (label 6) has very little influence on the computations at earlier time steps (label 7). In other words, it is hard for the network to remember whether it saw a singular or plural noun and then, much later in the sequence, generate **was** or **were** accordingly. In English the intervening content (label 8) can be arbitrarily long, yet a basic **RNN** is dominated by local influences: the output $$\hat y^{<3>}$$ is mainly affected by nearby values, and the output at label 6 is hard to influence from inputs near the start of the sequence (label 10). Whatever that output is, right or wrong, its error struggles to propagate back to the early part of the sequence, so the network has trouble adjusting the computations there.

During backpropagation, as the number of layers (time steps) grows, gradients can not only decrease exponentially but also increase exponentially. Vanishing gradients are the primary problem when training **RNN**s, but exploding gradients also occur. Exploding gradients are easy to spot, because exponentially large gradients make the parameters so large that the network blows up: you will see many **NaN**s (not-a-number values), which means the computation has numerically overflowed.
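A toy numeric sketch (not from the course; the matrix and step count are made up for illustration) makes the exponential behavior concrete. Pushing a gradient back through $T$ time steps multiplies it by the recurrent Jacobian at every step, so with a fixed Jacobian the gradient norm scales roughly like its largest singular value raised to the power $T$:

```python
import numpy as np

T = 50  # number of time steps to backpropagate through

def backprop_norm(scale):
    """Norm of a gradient pushed back through T steps with Jacobian scale * I.

    Using a scaled identity is the simplest case: each backward step just
    multiplies the gradient by `scale`, so the norm behaves like scale**T.
    """
    W = scale * np.eye(4)
    grad = np.ones(4)
    for _ in range(T):
        grad = W.T @ grad
    return np.linalg.norm(grad)

print(backprop_norm(0.9))   # roughly 0.01 -- vanishes
print(backprop_norm(1.1))   # roughly 235  -- explodes
```

Even a small deviation of the Jacobian's scale from 1 turns into an exponentially vanishing or exploding gradient over 50 steps, which is why long-range dependencies are so hard for a basic RNN to learn.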

[![](https://github.com/fengdu78/deeplearning_ai_books/raw/master/images/ac5d647140997ba713c376fb097ea0e2.png)](https://github.com/fengdu78/deeplearning_ai_books/blob/master/images/ac5d647140997ba713c376fb097ea0e2.png)

The fix for exploding gradients: **gradient clipping**. If the gradient vector exceeds some threshold, rescale it so it never becomes too large.
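A minimal sketch of clipping by gradient norm, under the assumption of a single flattened gradient vector (the threshold and values here are made up for illustration). If the norm exceeds the threshold, the gradient is rescaled to that norm while keeping its direction; otherwise it is left unchanged:

```python
import numpy as np

def clip_gradient(grad, max_norm):
    """Rescale grad so its L2 norm is at most max_norm (direction preserved)."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

g = np.array([3.0, 4.0])                    # norm 5: too large
clipped = clip_gradient(g, max_norm=1.0)    # rescaled to norm 1
small = clip_gradient(np.array([0.1, 0.2]), max_norm=1.0)  # unchanged
```

Deep learning frameworks ship this operation built in, e.g. `torch.nn.utils.clip_grad_norm_` in PyTorch, which clips the combined norm over all of a model's parameter gradients.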

