Update CI pipeline README (#6086)
1. Replace the Azure DevOps badges with the GitHub Actions badge in the README
2. Update the pipeline description in docs/CIPipelines.md
3. Schedule the workflow to run daily to report coverage to Codecov.
Fixes #6084

Signed-off-by: Justin Chu <justinchu@microsoft.com>
Co-authored-by: G. Ramalingam <grama@microsoft.com>
justinchuby and gramalingam committed Apr 15, 2024
1 parent 07fc158 commit 16ee58b
Showing 3 changed files with 5 additions and 8 deletions.
4 changes: 3 additions & 1 deletion .github/workflows/main.yml
@@ -1,6 +1,8 @@
 name: CI
 
 on:
+  schedule:
+    - cron: '0 0 * * *' # every day at midnight for reporting code coverage to codecov
   push:
     branches:
       - main
@@ -78,7 +80,7 @@ jobs:
       if: matrix.os == 'windows-latest'
       uses: microsoft/setup-msbuild@6fb02220983dee41ce7ae257b6f4d8f9bf5ed4ce # v2.0.0
       with:
-        msbuild-architecture: ${{ matrix.architecture }}
+        msbuild-architecture: x64
 
     - name: Install external protobuf - Windows
       if: matrix.protobuf_type == 'External' && matrix.os == 'windows-latest'
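Note that the hunks above only add the daily trigger and adjust the MSBuild architecture setting; the step that actually reports coverage to Codecov lives elsewhere in main.yml and is not shown in this diff. As a rough sketch of what such a step typically looks like with the codecov/codecov-action action (illustrative only; the report path and options below are assumptions, not copied from this workflow):

```yaml
# Illustrative only -- not part of this diff. Assumes an earlier step has
# already produced a coverage.xml report.
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v4
  with:
    files: coverage.xml        # assumed report path
    fail_ci_if_error: false    # an upload hiccup should not fail the build
```

With the new cron entry, a run like this happens at least once a day on main, so the Codecov dashboard stays current even when no pull requests are merged.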
5 changes: 1 addition & 4 deletions README.md
@@ -7,16 +7,13 @@ Copyright (c) ONNX Project Contributors
<p align="center"><img width="40%" src="https://github.com/onnx/onnx/raw/main/docs/onnx-horizontal-color.png" /></p>

[![PyPI - Version](https://img.shields.io/pypi/v/onnx.svg)](https://pypi.org/project/onnx)
-[![Build Status](https://dev.azure.com/onnx-pipelines/onnx/_apis/build/status/Windows-CI?branchName=main&label=Windows)](https://dev.azure.com/onnx-pipelines/onnx/_build/latest?definitionId=5&branchName=main)
-[![Build Status](https://dev.azure.com/onnx-pipelines/onnx/_apis/build/status/Linux-CI?branchName=main&label=Linux)](https://dev.azure.com/onnx-pipelines/onnx/_build/latest?definitionId=7&branchName=main)
-[![Build Status](https://dev.azure.com/onnx-pipelines/onnx/_apis/build/status/MacOS-CI?branchName=main&label=MacOS)](https://dev.azure.com/onnx-pipelines/onnx/_build/latest?definitionId=6&branchName=main)
+[![CI](https://github.com/onnx/onnx/actions/workflows/main.yml/badge.svg)](https://github.com/onnx/onnx/actions/workflows/main.yml)
[![CII Best Practices](https://bestpractices.coreinfrastructure.org/projects/3313/badge)](https://bestpractices.coreinfrastructure.org/projects/3313)
[![OpenSSF Scorecard](https://api.securityscorecards.dev/projects/github.com/onnx/onnx/badge)](https://api.securityscorecards.dev/projects/github.com/onnx/onnx)
[![REUSE compliant](https://api.reuse.software/badge/github.com/onnx/onnx)](https://api.reuse.software/info/github.com/onnx/onnx)
[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
[![Black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)


[Open Neural Network Exchange (ONNX)](https://onnx.ai) is an open ecosystem that empowers AI developers
to choose the right tools as their project evolves. ONNX provides an open source format for AI models, both deep learning and traditional ML. It defines an extensible computation graph model, as well as definitions of built-in operators and standard
data types. Currently we focus on the capabilities needed for inferencing (scoring).
4 changes: 1 addition & 3 deletions docs/CIPipelines.md
@@ -10,9 +10,7 @@ SPDX-License-Identifier: Apache-2.0

|   | When it runs | Config | Test |
-- | -- | -- | -- |
-[Linux-CI](/.azure-pipelines/Linux-CI.yml) | Every PR | <ul><li>Ubuntu-20.04</li><li>DEBUG=1 or 0</li><li>ONNX_USE_LITE_PROTO=OFF</li><li>ONNX_USE_PROTOBUF_SHARED_LIBS=OFF</li><li>ONNX_BUILD_TESTS=1</li><li>ONNX_WERROR=ON</li><li>ONNX_ML=1 or 0</li></ul>| <ul><li>ONNX C++ tests</li><li>Style check (flake8, mypy, and clang-format)</li><li>Test doc generation</li><li>Test proto generation</li><li>Verify node test generation</li></ul> |
-[Windows-CI](/.azure-pipelines/Windows-CI.yml) | Every PR | <ul><li>windows-2019</li><li>ONNX_USE_LITE_PROTO=ON</li><li>ONNX_USE_PROTOBUF_SHARED_LIBS=ON</li><li>ONNX_BUILD_TESTS=1</li><li>ONNX_WERROR=ON</li><li>ONNX_ML=1 or 0</li></ul>| <ul><li>Test building ONNX in conda environment</li><li>Test doc generation</li><li>Test proto generation</li><li>Verify node test generation</li></ul> |
-[Mac-CI](/.azure-pipelines/MacOS-CI.yml) | Every PR | <ul><li>macOS-10.15</li><li>DEBUG=1</li><li>ONNX_USE_LITE_PROTO=ON or OFF</li><li>ONNX_ML=1 or 0</li><li>ONNX_BUILD_TESTS=1</li><li>ONNX_WERROR=ON</li></ul>| <ul><li>ONNX C++ tests</li><li>Test doc generation</li><li>Test proto generation</li><li>Verify node test generation</li></ul>|
+[CI / Test](/.github/workflows/main.yml) | Every PR | <ul><li>linux-latest</li><li>DEBUG=1 or 0</li><li>ONNX_USE_LITE_PROTO=OFF</li><li>ONNX_USE_PROTOBUF_SHARED_LIBS=ON or OFF</li><li>ONNX_BUILD_TESTS=1</li><li>ONNX_WERROR=ON</li><li>ONNX_ML=1 or 0</li></ul>| <ul><li>ONNX C++ tests</li><li>Test doc generation</li><li>Test proto generation</li><li>Verify node test generation</li></ul> |
[Windows_No_Exception CI](/.github/workflows/win_no_exception_ci.yml) | Every PR  | <ul><li>vs2019-winlatest</li><li>ONNX_DISABLE_EXCEPTIONS=ON</li><li>ONNX_USE_LITE_PROTO=ON</li><li>ONNX_USE_PROTOBUF_SHARED_LIBS=OFF</li><li>ONNX_ML=1</li><li>ONNX_USE_MSVC_STATIC_RUNTIME=ON</li><li>ONNX_DISABLE_STATIC_REGISTRATION=ON or OFF</li></ul>| <ul><li>Only ONNX C++ tests</li><li>Test selective schema loading</li></ul> |
[Lint / Optional Lint](/.github/workflows/lint.yaml) | Every PR |<ul>ubuntu-latest</ul>| <ul><li>Not required -- it shows lint warnings for suggestions in PR</li><li>misspell</li><li>shellcheck</li></ul> |
[Lint / Enforce style](/.github/workflows/lint.yaml) | Every PR |<ul>ubuntu-latest</ul>| <ul><li>flake8</li><li>isort</li><li>black</li><li>mypy</li><li>clang-format</li><li>unix line endings</li><li>c++ namespace rules</li><li>Auto-generated files are up to date</li></ul> |
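For context, the configuration column of the new CI / Test row corresponds to a build matrix in the GitHub Actions workflow. The sketch below is illustrative only: the job name, matrix keys, and the final build/test commands are assumptions, and the authoritative definition is .github/workflows/main.yml.

```yaml
# Illustrative sketch of the "CI / Test" configurations as a GitHub Actions matrix.
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        debug: [0, 1]                         # DEBUG=1 or 0
        onnx_ml: [0, 1]                       # ONNX_ML=1 or 0
        protobuf_shared_libs: ["ON", "OFF"]   # ONNX_USE_PROTOBUF_SHARED_LIBS
    env:
      DEBUG: ${{ matrix.debug }}
      ONNX_ML: ${{ matrix.onnx_ml }}
      ONNX_USE_LITE_PROTO: "OFF"
      ONNX_USE_PROTOBUF_SHARED_LIBS: ${{ matrix.protobuf_shared_libs }}
      ONNX_BUILD_TESTS: 1
      ONNX_WERROR: "ON"
    steps:
      - uses: actions/checkout@v4
      - name: Build and test (simplified placeholder)
        run: |
          python -m pip install -e . -v
          pytest
```

Expressing the DEBUG / ONNX_ML / protobuf combinations as a matrix keeps the table above and the workflow aligned: each listed value fans out into its own job run.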
