
Experiment: Fallible conversion trait #4060

Draft · wants to merge 12 commits into main

Conversation

Icxolu (Contributor) commented Apr 8, 2024

This implements a proof of concept of a new fallible conversion trait.


Design

IntoPyObject

This introduces a new trait IntoPyObject to perform the conversion of the implementing type into Python. Initially the trait was generic over the target, so that the same type could have multiple conversions to different Python types; in that version I used it to provide an additional typed conversion for the chrono types when the unlimited API is enabled, and there might be more situations where this could be useful. I have since changed the generic to an associated type, as outlined in #4060 (comment).

The Error is specified as an associated type, since every conversion should have a single applicable error type (or Infallible if it cannot fail).

The return type is fixed to Bound<'py, Self::Target> to guarantee that we convert into a Python type.

pub trait IntoPyObject<'py>: Sized {
    /// The Python output type of this conversion.
    type Target;
    /// The error type returned if the conversion fails (Infallible if it cannot).
    type Error;

    /// Performs the conversion.
    fn into_pyobject(self, py: Python<'py>) -> Result<Bound<'py, Self::Target>, Self::Error>;
}
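
For illustration, here is what an implementation could look like for a simple type. This is only a sketch, assuming the associated-type form above and the Bound constructors from the 0.21 API; the concrete impls in this PR may differ.

impl<'py> IntoPyObject<'py> for &str {
    // A string slice always converts to a Python `str`.
    type Target = PyString;
    // The conversion cannot fail.
    type Error = std::convert::Infallible;

    fn into_pyobject(self, py: Python<'py>) -> Result<Bound<'py, Self::Target>, Self::Error> {
        Ok(PyString::new_bound(py, self))
    }
}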

IntoPyObjectExt

In addition to IntoPyObject, this also adds IntoPyObjectExt. The sole purpose of this trait is to allow specifying the desired output type using turbofish syntax:

obj.into_pyobject::<PyDateTime, _>(py)

This makes it much nicer to work with types that implement multiple conversions and need a type hint. This would probably be the API most users interact with. Note that this is only beneficial for the generic form of the trait.

Macros

To use the new conversion on the return values of #[pyfunction] and friends without breaking current code, a variant of autoref/deref specialization is implemented. It prefers an implementation of IntoPyObject over IntoPy<PyObject>. (Similar to the current approach, the conversion to PyAny needs to be available.) This means there should be no changes required by users: the macros will pick up the fallible IntoPyObject implementation automatically if it is available. For the migration we could then deprecate IntoPy (and ToPyObject) and advise implementing IntoPyObject instead, removing the old conversions in a future release.

Additionally, there is another layer of specialization for the () return type in #[pymethods] and #[pyfunction]. This allows converting it into PyNone as part of the macro code, because that is the Python equivalent of a function returning "nothing", while still using PyTuple as the IntoPyObject::Target.
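
For readers unfamiliar with the trick, below is a minimal, self-contained sketch of autoref specialization on plain Rust traits. The names (Wrap, PreferredConvert, FallbackConvert) are illustrative stand-ins, not the actual macro internals: the preferred impl lives on the wrapper type itself, the fallback lives on a reference to it, so the same method call resolves to the new trait when it applies and to the old one otherwise.

use std::fmt::{Debug, Display};

struct Wrap<T>(T);

trait PreferredConvert {
    fn convert(&self) -> String;
}
// Stand-in for the new IntoPyObject path: preferred, because it is
// implemented on `Wrap<T>` directly.
impl<T: Display> PreferredConvert for Wrap<T> {
    fn convert(&self) -> String {
        format!("preferred: {}", self.0)
    }
}

trait FallbackConvert {
    fn convert(&self) -> String;
}
// Stand-in for the old IntoPy<PyObject> path: only reached after one extra
// auto-ref, i.e. when the preferred impl does not apply.
impl<T: Debug> FallbackConvert for &Wrap<T> {
    fn convert(&self) -> String {
        format!("fallback: {:?}", self.0)
    }
}

fn main() {
    // String is Display, so the impl on `Wrap<T>` wins.
    println!("{}", (&Wrap("hi".to_string())).convert());
    // Vec<u8> is only Debug, so resolution falls back to the impl on `&Wrap<T>`.
    println!("{}", (&Wrap(vec![1u8, 2, 3])).convert());
}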

Open Questions

  • Do we need IntoPyObject to be generic, or would an associated type also work? Maybe implementing it for some more types will give us more insight into whether the generic is needed.
    From my initial implementation I have the feeling that the generic version introduces a lot of complexity while not really providing a big benefit.
  • What to do with APIs that are generic over IntoPy and ToPyObject? I assume we need to introduce new variants that are generic over IntoPyObject, similar to what we did with the _bound constructors.

There are probably some edge cases that don't work correctly, but I thought I'd gather some feedback first on whether this is going in the right direction before I chase those down.

Xref #4041


codspeed-hq bot commented Apr 8, 2024

CodSpeed Performance Report

Merging #4060 will not alter performance

Comparing Icxolu:into-pyobject (3423b33) with main (a5201c0)

Summary

✅ 69 untouched benchmarks


Icxolu commented Apr 15, 2024

I've played around with this some more and added more conversions for experimentation. (I will remove most of the impls from here again once we reach some consensus, since many of them warrant their own PRs.)

I have the feeling that an associated type is the more appropriate choice:

  • In pretty much all cases there is one clear Python type to turn a Rust type into.
  • If there were a case for multiple different conversions, I think having wrapper types for each conversion would be a much clearer option (a sketch follows below).
  • The API is more ergonomic
    • We don't need a separate implementation for PyAny, which in nearly all cases just forwards to a differently typed impl.
    • Stricter type safety: because there is only one target type, choosing the most appropriate one (instead of falling back to PyAny for the macros) involves less boilerplate.
    • There is no need for an additional extension trait.

For these reasons I switched the generic to an associated type.
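
As a hypothetical illustration of the wrapper-type idea mentioned above (the type, target, and constructor are invented for this sketch, assuming chrono and the 0.21 Bound API):

// A newtype selecting an alternative Python representation for the same data:
// here, a UTC datetime exposed as a float timestamp instead of a datetime object.
struct Timestamp(chrono::DateTime<chrono::Utc>);

impl<'py> IntoPyObject<'py> for Timestamp {
    type Target = PyFloat;
    type Error = std::convert::Infallible;

    fn into_pyobject(self, py: Python<'py>) -> Result<Bound<'py, Self::Target>, Self::Error> {
        // Whole seconds since the Unix epoch, as a Python float.
        Ok(PyFloat::new_bound(py, self.0.timestamp() as f64))
    }
}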

The only type that's a bit special is (), which should have Target = PyTuple, but still turn into None as a return type of #[pyfunction] and #[pymethods]. Since that is all inside macro code, I added another specialization layer to handle that.
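
A sketch of what the trait side of that could look like (again illustrative, assuming the 0.21 Bound API); the macro-side special casing of the return value is separate:

impl<'py> IntoPyObject<'py> for () {
    // As a value, `()` maps to the empty tuple.
    type Target = PyTuple;
    type Error = std::convert::Infallible;

    fn into_pyobject(self, py: Python<'py>) -> Result<Bound<'py, Self::Target>, Self::Error> {
        Ok(PyTuple::empty_bound(py))
    }
}
// The #[pyfunction]/#[pymethods] macros then special-case a bare `()` return
// value to Python None through the extra specialization layer.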
