
Survey provider-supported features #77

Open
merkys opened this issue Dec 6, 2021 · 2 comments
Labels: enhancement (New feature or request)

Comments

merkys (Member) commented Dec 6, 2021

It would be very nice to have a survey of provider-supported features. The OPTIMADE specification is becoming more and more feature-rich with every merged PR, and it would be enlightening to know how many (or even which?) providers support a given feature. Of course, this should not become a tool to shame providers, so only the numbers of providers could be shown, without the actual names.

In particular, I would be interested in finding out how many of the providers:

  • Support the latest version of OPTIMADE (v1.1.0 as of now);
  • Support (i.e. not only parse, but also return something meaningful for) optional filter language features;
  • Return atom coordinates upon request;
  • Support queries on nsites;
  • Have Calculations or References;
  • Support queries on relationships;
  • ...

This could help the developers observe the adoption of the OPTIMADE standard. Technically, this could be implemented by borrowing some code from the OPTIMADE validator, but I understand that some of the checks might be difficult to implement reliably. A couple of the checks above could also be probed by hand, as sketched below.
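As a rough illustration only (not how the validator does it), the following sketch probes two of the listed checks against a provider; the base URL is hypothetical, and the endpoint paths and response layout follow the OPTIMADE v1 specification:

```python
import requests

BASE_URL = "https://example.org/optimade"  # hypothetical provider base URL

# 1. Which OPTIMADE version does the provider advertise?
info = requests.get(f"{BASE_URL}/v1/info", timeout=10).json()
print("Advertised version:", info["data"]["attributes"]["api_version"])

# 2. Is a filter on nsites actually honoured, rather than merely parsed
#    and silently ignored?
resp = requests.get(
    f"{BASE_URL}/v1/structures",
    params={"filter": "nsites>=100", "page_limit": 5},
    timeout=10,
).json()
honoured = all(entry["attributes"]["nsites"] >= 100 for entry in resp["data"])
print("nsites filter honoured:", honoured)
```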

Pinging top developers of optimade-python-tools, @ml-evs and @CasperWA.

merkys added the enhancement (New feature or request) label on Dec 6, 2021
ml-evs (Member) commented Dec 6, 2021

This would certainly be useful for users (and could encourage people to add optional features, if they see that everyone else has). It could probably be added to the validator in a fairly succinct way: tag particular validator function calls with a decorator that simply provides a string description of what that function checks, e.g. @tests_for("Supports queries on nsites"), which could then be added to the JSON response. Some additional tests would still have to be hard-coded, though.
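A minimal sketch of the decorator idea (tests_for, FEATURE_TESTS and check_nsites_query are made-up names for illustration, not existing optimade-python-tools API):

```python
# Registry mapping check names to human-readable feature descriptions.
FEATURE_TESTS: dict[str, str] = {}

def tests_for(description: str):
    """Tag a validator check with the feature it exercises."""
    def decorator(func):
        FEATURE_TESTS[func.__name__] = description
        return func  # the behaviour of the check itself is unchanged
    return decorator

@tests_for("Supports queries on nsites")
def check_nsites_query(client) -> bool:
    ...  # run the actual filter query against the provider
    return True

def feature_summary(results: dict[str, bool]) -> dict[str, bool]:
    """Translate {check name: passed?} into the JSON-ready feature list."""
    return {
        FEATURE_TESTS[name]: passed
        for name, passed in results.items()
        if name in FEATURE_TESTS
    }

# feature_summary({"check_nsites_query": True})
# -> {"Supports queries on nsites": True}
```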

Another way of framing this is just adding a better human-readable output of the validator's successes, rather than its failures, and promoting some of the warnings/optional tests.

My only concern is how we would define which features to list, it feels quite a lot like the "support level" discussions we had at previous workshops.

Whilst it could be added succinctly in the code, this would be a decent chunk of work, and perhaps too few users are using the dashboard (or even OPTIMADE altogether) to make this a priority right now?

merkys (Member, Author) commented Dec 7, 2021

Happy to hear that this should not be too difficult to implement. Indeed, maybe just collecting the descriptions of the passed tests would be enough.
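For instance (a toy sketch with made-up data, keeping provider names out as suggested above), the per-provider sets of passed-test descriptions could be aggregated into plain counts:

```python
from collections import Counter

# One set of passed feature descriptions per provider (made-up data;
# provider names are deliberately dropped).
results_per_provider = [
    {"Supports queries on nsites", "Returns atom coordinates"},
    {"Supports queries on nsites"},
    {"Supports queries on relationships", "Supports queries on nsites"},
]

counts = Counter()
for features in results_per_provider:
    counts.update(features)

for feature, n in counts.most_common():
    print(f"{n}/{len(results_per_provider)} providers: {feature}")
```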

> My only concern is how we would define which features to list, it feels quite a lot like the "support level" discussions we had at previous workshops.

There is at least one objectively determined bit: whether a provider supports all "MUST"-level features. Beyond that, we will most likely end up with somewhat arbitrary lists of features (this topic was discussed in Materials-Consortia/OPTIMADE#91). I would say the more feature checks we have, the better the overview we get.

> Whilst it could be added succinctly in the code, this would be a decent chunk of work, and perhaps too few users are using the dashboard (or even OPTIMADE altogether) to make this a priority right now?

Sure, this is not a priority now. I came up with this idea while looking for arguments against introducing new OPTIMADE features that require a lot of adaptation from the providers and are quite computationally intensive on the server side. But for the time being I can probably check a lot of things manually.
