Work items: Contributions Welcome #4423

Open
gramalingam opened this issue Aug 9, 2022 · 2 comments

gramalingam commented Aug 9, 2022

Note

New: use this query to find all issues with the “contributions welcome” tag


Creating this pinned issue to track work items, especially those where contributions would be welcome.

Features:

  • Improving function-definition infrastructure: Issue 3139

    • This is partially in place now.
    • Further improvements are needed, however, in the interface for obtaining a function-definition from the OpSchema for a target version.
  • Reference implementation of ops: see Issue 4432 (a minimal illustrative sketch appears after this list)

  • Add support for model-local functions in version-converter. (Issue 4370)

  • An ONNX sanitizer tool: Issue 4476

  • Shape inference testing infrastructure: please see Issue 4160
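
To make the reference-implementation item above concrete, here is a minimal sketch of the idea: a NumPy implementation of an op paired with a small test that checks it against known-good values. The function and test names here (`reference_relu`, `test_reference_relu`) are hypothetical illustrations, not the actual ONNX reference-evaluator API.

```python
import numpy as np


def reference_relu(x: np.ndarray) -> np.ndarray:
    # Elementwise Relu, max(0, x): a stand-in for a NumPy-based reference implementation of an op.
    return np.maximum(x, np.zeros_like(x))


def test_reference_relu() -> None:
    # A reference implementation lets backend results be checked against a known-good answer.
    x = np.array([-1.5, 0.0, 2.0], dtype=np.float32)
    expected = np.array([0.0, 0.0, 2.0], dtype=np.float32)
    np.testing.assert_allclose(reference_relu(x), expected)


if __name__ == "__main__":
    test_reference_relu()
    print("reference_relu ok")
```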

Cleanup:

  • Improve distinction between interface and implementation (see below for details).

Minor:

Documentation:

  • Clarification of specs of ops: please see Issue 3651 for more details.

New Ops:

gramalingam pinned this issue Aug 9, 2022
@gramalingam
Contributor Author

Another work item: we need to improve the separation between interface and implementation in ONNX, which will improve our agility when implementation aspects of ONNX need to change. Currently, it is often unclear whether a change will break some external user, because there is no clear distinction between interface files and implementation files, and too many details are exposed in include files (which may or may not be intended as external interfaces).

@jcwchen
Member

jcwchen commented Aug 22, 2022

Contributions are also welcome for the To-do items in ONNX triaged work items that are not yet assigned.

github-merge-queue bot pushed a commit that referenced this issue Jul 10, 2023
### Description
These changes have been made to support the GELU operator as a function
op.

### Motivation and Context
Adds support for the [GELU: Gaussian Error Linear Unit](https://paperswithcode.com/method/gelu) activation function, which was requested in #4933. #4423 also lists GELU under the new-ops section of `Contributions Welcome`.

As per the discussion in #4933, I have added GELU as a context-dependent function-op that uses the attribute `approximate` to select one of the two possible function-body definitions.

The first function definition is the regular GELU:
`GELU(x) = x * Φ(x) = 0.5 * x * (1 + erf(x / sqrt(2)))`

The second is the fast approximation based on `tanh`:
`GELU(x) = 0.5 * x * (1 + tanh(sqrt(2/π) * (x + 0.044715 * x^3)))`

This implementation uses the [PyTorch docs for
GELU](https://pytorch.org/docs/stable/generated/torch.nn.GELU.html?highlight=gelu#torch.nn.GELU)
as a reference.

PS: I also refactored `onnx/defs/math/defs.cc` to bring the operator
implementation of `mish` right next to its doc string.

---------

Signed-off-by: pranshupant <pranshupant@gmail.com>
Co-authored-by: G. Ramalingam <grama@microsoft.com>
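
To illustrate the two function bodies described in the commit above, here is a minimal NumPy sketch of the exact and `tanh`-approximate GELU computations, with an `approximate` argument mirroring the attribute that selects between them. This is an illustrative reference-style computation, not the ONNX function definition itself, and it assumes SciPy is available for the error function.

```python
import math

import numpy as np
from scipy.special import erf  # elementwise error function for the exact formulation


def gelu(x: np.ndarray, approximate: str = "none") -> np.ndarray:
    """NumPy sketch of the two GELU variants described above."""
    if approximate == "tanh":
        # Fast approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
        return 0.5 * x * (1.0 + np.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * np.power(x, 3))))
    # Exact form: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + erf(x / math.sqrt(2.0)))


x = np.linspace(-3.0, 3.0, 7, dtype=np.float32)
print(gelu(x))          # exact GELU
print(gelu(x, "tanh"))  # tanh approximation
```

For inputs in the typical activation range the two variants agree closely; the `approximate="tanh"` path simply avoids evaluating the error function.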