
【PaddlePaddle Hackathon 2】16、doc for rrelu #4725

Merged
merged 7 commits into from May 31, 2022
Changes from 3 commits
2 changes: 2 additions & 0 deletions docs/api/paddle/nn/Overview_cn.rst
@@ -127,6 +127,7 @@ Padding层
" :ref:`paddle.nn.LogSoftmax <cn_api_nn_LogSoftmax>` ", "LogSoftmax激活层"
" :ref:`paddle.nn.Maxout <cn_api_nn_Maxout>` ", "Maxout激活层"
" :ref:`paddle.nn.PReLU <cn_api_nn_PReLU>` ", "PReLU激活层"
" :ref:`paddle.nn.RReLU <cn_api_nn_RReLU>` ", "RReLU激活层"
" :ref:`paddle.nn.ReLU <cn_api_nn_ReLU>` ", "ReLU激活层"
" :ref:`paddle.nn.ReLU6 <cn_api_nn_ReLU6>` ", "ReLU6激活层"
" :ref:`paddle.nn.SELU <cn_api_nn_SELU>` ", "SELU激活层"
@@ -381,6 +382,7 @@ Padding相关函数
" :ref:`paddle.nn.functional.log_softmax <cn_api_nn_cn_log_softmax>` ", "log_softmax激活函数"
" :ref:`paddle.nn.functional.maxout <cn_api_nn_cn_maxout>` ", "maxout激活函数"
" :ref:`paddle.nn.functional.prelu <cn_api_nn_cn_prelu>` ", "prelu激活函数"
" :ref:`paddle.nn.functional.rrelu <cn_api_nn_cn_rrelu>` ", "rrelu激活函数"
" :ref:`paddle.nn.functional.relu <cn_api_nn_cn_relu>` ", "relu激活函数"
" :ref:`paddle.nn.functional.relu_ <cn_api_nn_cn_relu_>` ", "Inplace 版本的 :ref:`cn_api_nn_cn_relu` API,对输入 x 采用 Inplace 策略"
" :ref:`paddle.nn.functional.relu6 <cn_api_nn_cn_relu6>` ", "relu6激活函数"
34 changes: 34 additions & 0 deletions docs/api/paddle/nn/RReLU_cn.rst
@@ -0,0 +1,34 @@
.. _cn_api_nn_RReLU:

RReLU
-------------------------------
.. py:class:: paddle.nn.RReLU(lower=1./8., upper=1./3., name=None)

RReLU激活层(RReLU Activation Operator)。计算公式如下:

.. math::

\text{RReLU}(x) =
\begin{cases}
x & \text{if } x \geq 0 \\
ax & \text{ otherwise }
\end{cases}

其中,:math:`x` 为输入的 Tensor;训练阶段,负值斜率 :math:`a` 从均匀分布 :math:`U(lower, upper)` 中随机采样,预测阶段则固定为 :math:`(lower + upper)/2`。
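For reviewers of this PR, the formula above can be sketched in plain Python. This is an illustration of the math only, not Paddle's implementation; the name `rrelu_reference` is made up here, and the defaults mirror the `lower`/`upper` parameters in the signature above:

```python
import random

def rrelu_reference(x, lower=0.125, upper=1.0 / 3.0):
    """Element-wise RReLU per the formula above (pure-Python sketch).

    Non-negative inputs pass through unchanged; each negative input is
    scaled by a slope `a` drawn uniformly from [lower, upper].
    """
    return [v if v >= 0 else random.uniform(lower, upper) * v for v in x]

out = rrelu_reference([-2.0, 0.0, 3.0])
# out[0] lies somewhere in [-2 * upper, -2 * lower]; out[1] and out[2] pass through
```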

参数
::::::::::
- lower (float,可选) - 负值斜率的随机值范围下限,`lower` 包含在范围中。支持的数据类型:float。默认值为0.125。
Collaborator:

The parameter names need to be bold; for reference:

  • lower (float,可选) - 负值斜率的随机值范围下限,lower 包含在范围中。支持的数据类型:float。默认值为0.125。

- upper (float,可选) - 负值斜率的随机值范围上限,`upper` 包含在范围中。支持的数据类型:float。默认值为0.333。
- name (str, 可选) - 操作的名称(可选,默认值为None)。更多信息请参见 :ref:`api_guide_Name`。
Collaborator:

, -->

Contributor (author):

Done.


形状:
::::::::::
- input: 任意形状的Tensor,默认数据类型为float32。
- output: 和input具有相同形状的Tensor。
Collaborator @Ligoml, May 10, 2022:

  • Nothing needs to be added after 形状
  • --> -

Contributor (author):

Done.


代码示例
:::::::::
COPY-FROM: paddle.nn.RReLU:RReLU-example
33 changes: 33 additions & 0 deletions docs/api/paddle/nn/functional/rrelu_cn.rst
@@ -0,0 +1,33 @@
.. _cn_api_nn_cn_rrelu:

rrelu
-------------------------------

.. py:function:: paddle.nn.functional.rrelu(x, lower=1. / 8., upper=1. / 3., training=True, name=None)

rrelu激活函数(RReLU Activation Operator)。计算公式如下:

.. math::

\text{RReLU}(x) =
\begin{cases}
x & \text{if } x \geq 0 \\
ax & \text{ otherwise }
\end{cases}

其中,:math:`x` 为输入的 Tensor;训练阶段,负值斜率 :math:`a` 从均匀分布 :math:`U(lower, upper)` 中随机采样,预测阶段则固定为 :math:`(lower + upper)/2`。
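The `training=True` parameter in the signature above switches between the two slope rules. A hedged pure-Python sketch of that behavior (the name `rrelu_reference` is illustrative, not Paddle's source; the inference-time fixed slope `(lower + upper) / 2` follows the RReLU evaluation convention):

```python
import random

def rrelu_reference(x, lower=0.125, upper=1.0 / 3.0, training=True):
    # Training mode: sample a fresh negative slope per element.
    # Inference mode: use the fixed mean slope (lower + upper) / 2,
    # which makes the output deterministic.
    if training:
        slope = lambda: random.uniform(lower, upper)
    else:
        fixed = (lower + upper) / 2.0
        slope = lambda: fixed
    return [v if v >= 0 else slope() * v for v in x]

# Deterministic in inference mode: negative inputs scale by (0.125 + 1/3) / 2
print(rrelu_reference([-4.0, 2.0], training=False))
```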

Collaborator:

The Chinese doc needs to stay consistent with the English one; the English version includes a reference to the paper.

Contributor (author):

Done.

参数
::::::::::
Collaborator:

The parameter list is missing a description of x? The source has 5 parameters; see for reference: https://github.com/PaddlePaddle/docs/blob/develop/docs/templates/common_docs.py#L40

Contributor (author):

Done.

- lower (float,可选) - 负值斜率的随机值范围下限,`lower` 包含在范围中。支持的数据类型:float。默认值为0.125。
- upper (float,可选) - 负值斜率的随机值范围上限,`upper` 包含在范围中。支持的数据类型:float。默认值为0.333。
- training (bool) - 标记是否为训练阶段。 默认: True。
Collaborator:

training has a default value, so it is an optional parameter.

Contributor (author):

Done.

- name (str, 可选) - 操作的名称(可选,默认值为None)。更多信息请参见 :ref:`api_guide_Name`。
Collaborator:

, -->

Contributor (author):

Done.


返回
::::::::::
``Tensor`` ,数据类型和形状同 ``x`` 一致。

代码示例
:::::::::
COPY-FROM: paddle.nn.functional.rrelu:rrelu-example