
Document QA and development details #123

Closed
4 tasks done
Potherca opened this issue Nov 24, 2020 · 10 comments · Fixed by #133

Potherca (Member) commented Nov 24, 2020

This issue addresses a point of concern raised in the conversation on #122.

There seems to be a lot of implicit knowledge assumed regarding the GitHub Actions we run.

After having documented the current release process, it might also be good to document how to run all of the QA tools in our current pipeline locally during development.

Any other non-QA tooling should also be documented.

The following tools are used and should be documented:

  • jsonlint
  • yamllint
  • php-codesniffer
  • remark-lint

We use Pipeline-Components, which are basically just Docker images wrapped around a specific tool.

The options, from top to bottom, are (each sketched below):

  • Run GitHub Actions locally with act: `act --job job-name`
  • Run a Pipeline-Component using docker: `docker run -v $PWD:/code pipelinecomponents/some-tool --no-stdout .`
  • Run a tool locally with NodeJS, Ruby, PHP, or whatever the tool is written in.
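For illustration, here is roughly what each option looks like on the command line, using yamllint as the example tool (the job name, image name, and arguments are placeholders and may differ from our actual setup):

```sh
# Option 1: replay a workflow job locally with act
# (the job name is hypothetical; use whatever the workflow defines)
act --job yamllint

# Option 2: run a Pipeline-Components image directly with docker,
# mirroring the pattern above
docker run --rm -v "$PWD:/code" pipelinecomponents/yamllint yamllint .

# Option 3: install and run the tool natively, here yamllint via pip
pip install --user yamllint
yamllint .
```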

Not sure which route is best to take and how much explanation to give or knowledge to assume (i.e. what to explain and where to link to the Remark docs).

Also:

  • Should information be added to the CONTRIBUTING file to document that code is expected to conform to specific rules?
  • If tools are installed locally, are there files which should be added to the .gitignore file?
jrfnl (Member) commented Nov 24, 2020

My two pennies:

I think we also may want to review which tools are used and for what reason.

  • jsonlint - AFAICS the only JSON file is composer.json - would running composer validate be sufficient (see the one-liner below)? If so, this is done automatically on a composer install anyhow. If not: is there a Composer plugin available for the same? Would it be easier/more intuitive to use that instead?
  • yamllint - AFAICS the only yml file is .travis.yml and this file is validated automatically by Travis on each run. As this file is not part of the user-facing code, should we even care about anything other than that it works?
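For reference, Composer's built-in validation is a one-liner, and `--strict` turns its warnings into failures:

```sh
composer validate --strict
```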

For any PHP tools/tools installed via Composer, I'm generally a fan of documenting the "how to use" by adding a command for each to the Composer scripts section.
This documents both the fact that the tool is being used and what the default command line args are (if any), and makes it really easy to explain how to run something (`composer install` -> `composer scriptname`).
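A minimal sketch of what such a scripts section could look like, assuming squizlabs/php_codesniffer as a dev dependency (the script names and args are illustrative, not something we've settled on):

```json
{
    "scripts": {
        "lint-php": "phpcs -p",
        "fix-php": "phpcbf -p"
    }
}
```

After `composer install`, `composer lint-php` would run PHP_CodeSniffer with the documented defaults, since Composer puts vendor/bin on the script PATH.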

> Not sure which route is best to take and how much explanation to give or knowledge to assume (i.e. what to explain and where to link to the Remark docs).

As this is a Composer plugin written in PHP, I think we can safely assume that contributors will have PHP and Composer installed. For everything else, detailed instructions would be needed.

It may be worth investigating whether there is a Composer package available as an alternative to the currently used (non-Composer based) tooling and to use that instead, as it will make it much easier for people to replicate the checks locally.

> Should information be added to the CONTRIBUTING file to document that code is expected to conform to specific rules?

Yes, with a link to the authoritative article listing the rules.

jrfnl (Member) commented Nov 24, 2020

Note: if all tooling used were Composer based, we could even have a checkall script in the composer.json and just suggest running that.
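Assuming the individual checks existed as Composer scripts, a checkall could simply chain them by reference; a sketch with hypothetical script names:

```json
{
    "scripts": {
        "check-json": "@composer validate --strict",
        "check-cs": "phpcs -p",
        "checkall": [
            "@check-json",
            "@check-cs"
        ]
    }
}
```

`composer checkall` would then run each referenced script in turn, stopping at the first failure.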

Potherca (Member Author) commented

I'd rather opt for using docker or act to replicate the checks locally instead of looking for PHP alternatives, especially once we start running integration tests.

I prefer having YAML and JSON checks, but I don't think there is much value in developers running them locally.
My current thinking is to leave the YAML and JSON lint in the GitHub Actions but leave them mostly out of the documentation, other than mentioning that they are there.

That would leave php-codesniffer and remark-lint for "full" documentation.

I've added docker commands as Composer scripts in other projects, as documentation on how things should be run, but I ran into problems that didn't happen when docker was run directly (like timeouts and formatting issues).

Although I'm in favour of adding Composer scripts, I'm not sure they'll offer much benefit if docker or act are used.

I'll think about this some more whilst writing the docs, see if any insights change...

Potherca (Member Author) commented

Also @mjrider, you seem awfully quiet, considering the topic... 😈

jrfnl (Member) commented Nov 26, 2020

> I'd rather opt for using docker or act to replicate the checks locally instead of looking for PHP alternatives, especially once we start running integration tests.

That does presume contributing devs use Docker (I don't).

> Although I'm in favour of adding Composer scripts, I'm not sure they'll offer much benefit if docker or act are used.

Agreed. IMO scripts in the composer file should be able to run after doing composer install without any further assumptions about the environment being made. Scripts which need other setup should only be added to Composer scripts if that setup can also be run from within the script without breaking on different OSes.

So adding a script for phpcs is fine, as that won't need any further setup aside from the composer install.
remark is a different matter.
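To illustrate the contrast (a sketch; the npm package names are the commonly used remark ones, not confirmed for this repo):

```sh
# phpcs: usable straight after `composer install`,
# so a Composer script for it needs no extra setup
composer install
vendor/bin/phpcs -p

# remark: needs a Node.js toolchain first,
# which a Composer script can't portably provide
npm install --save-dev remark-cli remark-preset-lint-recommended
npx remark --use remark-preset-lint-recommended .
```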

jrfnl (Member) commented Nov 26, 2020

> I prefer having YAML and JSON checks, but I don't think there is much value in developers running them locally.

I don't agree with this reasoning.

  1. If we have those checks, devs should be able to run them locally. Nothing is more annoying than fixing something and running the QA checks locally, only to have the build fail on something you can't run locally.
  2. If the YAML and JSON files were part of the actual package - either src or part of the tests - then, yes, I agree they should be checked. However, in this case they are not, and we're just making contributing more difficult by having QA checks on CI files which are not part of the actual package.

jrfnl (Member) commented Nov 26, 2020

Re the JSON check: we could probably use Composer Normalize instead.

Mind: if/when we add tests, those will probably contain Composer config files to test various situations. I'm not sure we should run any validation on them, as it may prevent testing some outlier situations.
(This remark also applies to the current JSON check.)
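For reference, Composer Normalize is the ergebnis/composer-normalize plugin; a typical local run would be:

```sh
composer require --dev ergebnis/composer-normalize
composer normalize --dry-run   # report deviations without rewriting the file
```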

mjrider (Contributor) commented Nov 29, 2020

With respect to the JSON/YAML lint:
composer and Travis only care about the content of the file; jsonlint and yamllint care about the syntax of the file.
If I collapse the composer.json to one line, composer validate will still be happy, but jsonlint will complain.

So the difference is between the content and the well-formedness of the file, and I would advocate for checking both.
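The reverse also holds. A hypothetical composer.json like this is perfectly well-formed JSON, so a syntax check passes, yet composer validate rejects it because the schema requires require to be an object:

```json
{
    "name": "vendor/package",
    "require": "this-should-be-an-object-not-a-string"
}
```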

jrfnl (Member) commented Dec 7, 2020

Looking at the recent PRs, I honestly think we need to turn yamllint off.

  1. It is reporting on PRs which have no changes in the .travis.yml file, giving the impression that those issues should be fixed before a PR can get merged.
  2. It is reporting line length issues which cannot be fixed, as these are commands which can't easily be broken up over multiple lines.

PR #128 fixes those line length issues which can be fixed. Other than that, the remaining warnings are just noise.
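For completeness: an alternative to turning the tool off entirely would be relaxing the noisy rule via a .yamllint config file; a minimal sketch (the limit chosen here is arbitrary):

```yaml
# .yamllint — keep the tool but soften the unfixable rule
extends: default
rules:
  line-length:
    max: 120
    level: warning
```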

Potherca (Member Author) commented

@jrfnl I've made a first draft in #133, your scrutiny is very much welcome!
