Development guidelines

Development

  • Integration tests: encapsulated tests that cover the minimum possible functionality.
  • Wazuh is assumed to be off; only the necessary daemons are started, and they are stopped at the end of the test (see the fixture sketch after this list).
  • The test must leave the machine as it was found.
  • The test must be 100% self-configurable; no user intervention should be needed to run it and obtain a PASS.
  • Do not use simulators of any kind, only basic mocks. If more is needed, it is a system test and is implemented in another way.
  • Prioritize the use of common repository tools.
  • Every issue in Wazuh has a corresponding issue in QA, whether or not tests have to be implemented.
  • We use Test-driven development (TDD) as the SDLC for QA tests + Core implementations.
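
A minimal sketch of these principles in a pytest integration test, assuming a systemd-based machine and a wazuh-logcollector daemon; control_service is a hypothetical helper standing in for whatever the common repository tooling provides:

import subprocess

import pytest

DAEMON = 'wazuh-logcollector'  # hypothetical daemon under test


def control_service(action, daemon):
    """Hypothetical helper: start or stop a single Wazuh daemon through systemd."""
    subprocess.run(['systemctl', action, daemon], check=True)


@pytest.fixture
def required_daemon():
    # Wazuh is assumed off: start only the daemon the test needs...
    control_service('start', DAEMON)
    yield
    # ...and stop it again so the machine is left as it was found.
    control_service('stop', DAEMON)


def test_daemon_is_active(required_daemon):
    result = subprocess.run(['systemctl', 'is-active', DAEMON],
                            capture_output=True, text=True)
    assert result.stdout.strip() == 'active'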

Contribution

  • Every issue must have the corresponding labels, the assigned weight, the target release, and the sprint.
  • We must always have a FULL GREEN. Any PR of ours that prevents a full green must not be merged.
  • Epics are not scored; ZenHub auto-calculates their weight.
  • Labels go to issues, but never to PRs (except draft/hold).
  • The QA issue in Core development should keep a list of candidate tests.
  • Link the PR to its issue using "Connect to issue".
  • Both the reviewer and the developer run the tests and paste the results in the PR.
  • No brackets in issue titles; use natural language describing the task.
  • Style guide for writing commits: https://github.com/emilio1625/guia-estilo-git#commits.
  • Well-formatted Markdown; use Collapse/Details when necessary.
  • PRs and issues must use snippets, not screenshots.
  • Only tests that obtain a full green will be merged, where the color codes are the following:
    • 🟢 : All pass, no error, no fails, no warnings.
    • 🟡 : All pass, some warning.
    • 🔴 : Any fail or error.
  • If changes to the wazuh_testing library are required, they must be made in a separate Issue/PR.

Style

" and '

Use ' and " as follows:

f"{var}string"
'raw string'
"this is my raw 'string' with single quotation marks"

Testing

How to generate a report?

python3 -m pytest test_module --html=r<n_report>-<OS>-<manager/agent>-<pr_id>.html
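
For example, with hypothetical values (report 1, a CentOS 8 manager, PR 12345) the command could look like:

python3 -m pytest tests/integration/test_logtest --html=r1-centos8-manager-12345.html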

Several considerations about report structure:

  • The report must be a compressed file containing the HTML generated by the pytest-html plugin and the assets folder generated in that pytest session.

  • The name of the report (compressed and HTML file) should follow this structure: <n_report>-<OS>-<manager/agent>-<pr_id>

  • It is always necessary to index 3 reports (from the developer and QA) in order to consider approving a development.

  • The required report target is the whole module. However, in the case of demanding modules (FIM or VD), it is possible to:

    • Run only the submodule affected by the target development. For example: python3 -m pytest test_fim/test_files/test_audit ...
    • Run the previous and posterior tests, along with the developed test. For example, in a development in which tests/integration/test_logcollector/test_configuration/test_basic_configuration_exclude.py is changed, all these tests should be run:
      • tests/integration/test_logcollector/test_configuration/test_basic_configuration_command.py (previous test)
      • tests/integration/test_logcollector/test_configuration/test_basic_configuration_exclude.py
      • tests/integration/test_logcollector/test_configuration/test_basic_configuration_frequency.py (posterior test)

    This can be done using the -k option as follows:

python3 -m pytest test_module --html=r<n_report>-<OS>-<manager/agent>-<pr_id>.html -k "test_1 or test_2 or test_3"

Report message template

## day/month/year
### Package
| Version | Revision | Link|
|---|---|---|
|||


## Testing

### Module 1

| OS | Local | Jenkins | Notes |
|---|---|---|---|
| R1 | [:green_circle:](link-to-the-report-r1) | [:green_circle:](link-to-the-report-r1) - [Build]() | |
| R2 | [:green_circle:](link-to-the-report-r2) | [:green_circle:](link-to-the-report-r2) - [Build]() | |
| R3 | [:green_circle:](link-to-the-report-r3) | [:green_circle:](link-to-the-report-r3) - [Build]() | |

* * * 

### Module 2

| OS | Local | Jenkins | Notes |
|---|---|---|---|
| R1 | [:green_circle:](link-to-the-report-r1) | [:green_circle:](link-to-the-report-r1) - [Build]() | |
| R2 | [:green_circle:](link-to-the-report-r2) | [:green_circle:](link-to-the-report-r2) - [Build]() | |
| R3 | [:green_circle:](link-to-the-report-r3) | [:green_circle:](link-to-the-report-r3) - [Build]() | |

* * * 

- :green_circle:: All pass
- :yellow_circle:: Some warnings
- :red_circle:: Some errors/fails
- :large_blue_circle:: In progress 

Tools for report generation

Report alias for Linux systems

  • Add the following to your ~/.bashrc:
report_test(){
    # $1: test module, $2: revision tag (required), $3: number of reports (default 3), $4: optional -k expression

    if [ -z "$3" ]
    then
        n_report=3
    else
        n_report="$3"
    fi

    if [ -z "$2" ]
    then
        echo "Revision is needed"
        return 1    # return instead of exit: this function is sourced from ~/.bashrc
    fi
    revision="-$2"

    for i in $(seq 1 "$n_report")
    do
        # Generate the HTML report, optionally filtering tests with -k
        if [ -z "$4" ]
        then
            python3 -m pytest "$1" --html="R${i}${revision}.html"
        else
            python3 -m pytest "$1" --html="R${i}${revision}.html" -k "$4"
        fi
        # Bundle the report with its assets folder, then clean up
        tar -czvf "R${i}${revision}.tar.gz" ./assets "R${i}${revision}.html"
        rm -rf ./assets
        rm -f "R${i}${revision}.html"
    done
}

alias report=report_test
  • source ~/.bashrc
  • Generate reports as follows:

report <test_module> <revision> <n_reports_to_generate> <k_option_value>

For example:

report tests/integration/test_logtest/ 1507-centos-manager 3

Using this command, 3 reports with the 1507-centos-manager revision will be generated.

Documentation

Docstring