[Feature Request] Support typing.get_overloads() under Python ≥ 3.11
#54
In Canada, such individuals are commonly referred to as... Canadians. badum ching sound effect But I jest. Hi, masterful Brazilian tech-lead maestro! Let's do everything we can for your crew and this delightfully thought-provoking feature request. 🇧🇷 🌴 🇧🇷 Also, did I mention your English is impeccable? Because it is. I can barely spell-check most words in my native language and you're over here rocking English, Brazilian-Portuguese, Python, and academic jargon (which is like its own horrifying language). Le sigh.
Ah-ha! Type-checking bears everywhere would likewise love to see this feature. As you slyly insinuate,
wut

Thus your runtime-friendly syntax – which, like your English, is impeccable beyond all reasoning. I'm so appreciative of the time you invested in this. It's like a full-blown PEP, but authored just for me. I can feel the love for this topic radiating from here.

Big Issue Is Big

There's only one issue and it is a Big Issue™:
The Big Issue™ is mypy hates redefinition of callables not explicitly decorated by the

There's no way for us to inform mypy that our approach is better than the official approach. So, mypy being mypy, mypy just unconditionally flags everything as bad, throws up a nuclear hellstorm of red error messages, and fails with non-zero exit status. 💢

Mypy Is Not Why I Came Here

You are now wondering: "Bro, why are we even talking about mypy? I ate my pie for breakfast!" Yes, yes. We all hate mypy. ...just kidding, mypy devs? For better or worse, mypy is still the definitive type checker. Its word is the iron law that we all type-check by – even more so than the official PEP standards that mypy implements. If we violate mypy, we violate PEP standards. In this case, we violate both PEP 484 (which standardizes callable overloading) and PEP 561 (which equates mypy conformance with

If we violate PEP standards, the fabric of quality assurance itself comes crashing down around our soothing mechanical keyboards and we stand dumbfounded as the smart money quietly abandons beartype for competing constant-time runtime type-checkers that actually preserve PEP standards. So we don't violate mypy.

Then There Is No Hope for Us

Ah-ha! Fear not, Master Commelli. All is not well – but all is not lost, either. Thanks to the hideous power of monkey-patching, we can preserve compatibility with both mypy and existing PEP standards while still getting our cake and eating it, too. Specifically, we can "improve" our

Can we implement

```python
from typing import Any, Callable, TypeVar

T = TypeVar('T', bound=Callable[..., Any])

def beartype_overload(func: T) -> T:
    # Dictionary mapping from the name to value of all
    # attributes declared in the global scope of the passed
    # callable.
    global_name_to_value = func.__globals__

    # Most recently declared callable overloading the
    # passed callable if any *OR* "None" otherwise.
    func_overloaded = global_name_to_value.get(func.__name__)

    # List of all callables overloading the passed callable
    # if any *OR* "None" otherwise. Note getattr() rather than
    # a dictionary lookup: this is an attribute of a callable,
    # and "func_overloaded" itself may be "None".
    funcs_overloaded = getattr(
        func_overloaded, '__beartype_funcs_overloaded', None)

    # If the passed callable has *not* already been declared in
    # the current scope, attach a new private beartype-specific
    # attribute to this callable recording all overloaded
    # alternatives of this callable to be subsequently declared.
    if funcs_overloaded is None:
        func.__beartype_funcs_overloaded = [func,]
    # Else, the passed callable has already been declared in
    # the current scope. In this case, append the currently
    # overloaded alternative of this callable to the existing
    # private beartype-specific attribute recording overloads.
    else:
        func.__beartype_funcs_overloaded = (
            funcs_overloaded + [func,])

    # Return the currently overloaded alternative as is.
    return func

# Monkey-patch us up the bomb. Suck it, mypy!
import typing
typing.overload = beartype_overload
```

So, we have now (possibly successfully) defined an ad-hoc decorator recording all overloaded alternatives of any arbitrary callable. We store each callable in entirety, as the
Given that, we then refactor the
...I Don't Feel So Good Anymore

Right there with you, bro. On the bright side, none of this is infeasible. It's all feasible and possibly even fun! On the dark side, all of this will consume my precious life force that might better be redirected towards lower-hanging and less dangerous fruit like deep PEP 484 and 585 support. What I'm saying is: "By the power of the Brazilian rain forest gods, someone do this for us by submitting a working PR." I will merge anything that passes tests... anything. Until then, thanks again for your spellbinding dive into callable overloading, Ruan! We'll eventually get this done for everyone with risky behaviour like monkey-patching the core |
Unfortunately that wouldn't work, since the
|
Bwa-hah! Team Spen 2010 is, of course, absolutely right. The above

```python
from typing import Any, Callable, TypeVar

T = TypeVar('T', bound=Callable[..., Any])

def beartype_overload(func: T) -> T:
    # Dictionary mapping from the name to value of all
    # attributes declared in the global scope of the passed
    # callable.
    global_name_to_value = func.__globals__

    # Most recently declared callable overloading the
    # passed callable if any *OR* "None" otherwise.
    func_overloaded = global_name_to_value.get(func.__name__)

    # List of all callables overloading the passed callable
    # if any *OR* "None" otherwise. Note getattr() rather than
    # a dictionary lookup: this is an attribute of a callable,
    # and "func_overloaded" itself may be "None".
    funcs_overloaded = getattr(
        func_overloaded, '__beartype_funcs_overloaded', None)

    # If the passed callable has *not* already been declared in
    # the current scope, attach a new private beartype-specific
    # attribute to this callable recording all overloaded
    # alternatives of this callable to be subsequently declared.
    if funcs_overloaded is None:
        func.__beartype_funcs_overloaded = [func,]
    # Else, the passed callable has already been declared in
    # the current scope. In this case, append the currently
    # overloaded alternative of this callable to the existing
    # private beartype-specific attribute recording overloads.
    else:
        func.__beartype_funcs_overloaded = (
            funcs_overloaded + [func,])

    # Return the currently overloaded alternative as is.
    return func

#FIXME: This is unsafe, thanks to order-of-importation issues.
#Specifically, this fails when the first external user module
#importing "beartype" imports "typing.overload" first. The
#GLaDOS-resembling terror AI @TeamSpen210 has an equally
#clever and frightening solution: do this, but also monkey-patch
#the original @typing.overload callable by replacing its internal
#code object (i.e., "typing.overload.__code__") with that of our
#@beartype_overload callable. In theory, that should work; in
#practice, we tread on thin ice and we just saw a crack widen.

# Monkey-patch us up the bomb. Suck it, mypy!
import typing
typing.overload = beartype_overload
```

Again, exactly as you say, even that generalization fails for nested callables. Thankfully, we can account for that as well by deferring to our previously defined and exhaustively tested

To my knowledge, that getter is the most robust algorithm for retrieving the set of all locals of a possibly deeply nested callable at decoration time. I'm pretty sure no one's done better. It has to be robust, because Python 3.10 unconditionally enabled PEP 563. If it wasn't robust,

So... this is all still feasible in practice, maybe? If even all the above generalizations still fail, I'm afraid

Cue hypothetical sad cat. 😿 |
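That FIXME's order-of-importation hazard is easy to reproduce in isolation. Here's a minimal, self-contained sketch using a made-up stand-in module (`fake_typing` is a hypothetical name, deliberately not the real `typing`), showing that a from-import binding taken before the monkey-patch never sees it:

```python
import sys
import types

# Build a stand-in module (hypothetical "fake_typing", *not* the real
# "typing") exposing an "overload" attribute we can monkey-patch.
fake_typing = types.ModuleType("fake_typing")

def original_overload(func):
    return func

fake_typing.overload = original_overload
sys.modules["fake_typing"] = fake_typing

# A consumer module imports the name *before* we monkey-patch...
from fake_typing import overload as early_binding

# ...then we monkey-patch the module attribute.
def beartype_overload(func):
    return func

fake_typing.overload = beartype_overload

# Later imports see the patch; the earlier binding is stale.
from fake_typing import overload as late_binding
assert early_binding is original_overload   # stale!
assert late_binding is beartype_overload
```

The stale `early_binding` is exactly what breaks when a user module does `from typing import overload` before `beartype` is ever imported.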
Ah yes, you already have to deal with that for stringified annotations. (3.10 actually reverted defaulting to that, so a solution could be found that works better for runtime typing.) Instead of overriding the entire function, it might also be good to transfer the code object across. Then copies of the function imported before |
Wait. Rly? Like, srsly? I mean, that's great. I'll need to immediately revert all of the assumptions Because PEP 563 is fundamentally hostile to runtime type-checking. Sure, it doubles the speed of static type-checking – but who cares about a mere doubling of an edge case? You don't break all Python parsers everywhere for a mere doubling that only benefits a vocal minority while simultaneously reducing everyone else's runtime performance. That said... I now see that PEP 563 was merely delayed until Python 3.11, which remains scheduled for rapid release next year. At most, this buys the runtime type-checking community a brief reprieve. That said... I also now see that PEP 649 supersedes PEP 563 with a slightly saner solution. Because Either way, CPython devs are still breaking all Python parsers everywhere for a mere doubling that only benefits a vocal minority while simultaneously reducing everyone else's runtime performance. They're only waiting another year to do it. thus concludes another vacuous rant of the Type-checking Bear Conclave |
The idea is that it gives time to figure out a better solution, perhaps one of those PEPs or something entirely different - I guess check mailing lists and chime in? |
...I know, I know. I should do that. Gods, why are you always right! This is like participatory democracy all over again: if you don't participate, you don't get the right to complain. And I want that right. I love complaining. The innermost fire that burns with every internet rant is how I warm myself on frigid Canadian nights when the pellet stove has burned its last pellet, the cats are shivering inconsolably in their roosts, and the night wind howls like a thousand forlorn lynxes across the chittering lake of ice outside our crumbling doorstep. My personal read on the situation is that the only reason we're delaying the PEP 563 (and maybe now PEP 649) rollout is that it upsets the huge, highly influential, and highly profitable FastAPI community, which leverages pydantic for its runtime validation. That's a good reason, but it should have never come to that. PEP 563 should have never passed the peer review process, but it did – which means the peer review process seems to be broken here. PEP 649 is now being presented as the "middle ground," but my position is rather more pragmatic: both PEP 563 and 649 are bad and doing nothing is better than doing something bad. Kinda doubt Guido wants to hear that. I guess what I'm saying is... I hate shouting into the wind. 🌬️ |
OMFG. Gods! You are always right. It's like you're Nega-@leycec, my nefarious arch-nemesis rival from a dismal adjacent hyperplane managed by the Aperture Science Innovators-funded GLaDOS. You... you're not GLaDOS, are you? I mean, GLaDOS probably wouldn't admit to that. But you kinda seem omnipotent, which is scaring the cats over here a little bit. 🤖 I never even knew you could monkey-patch a callable by quietly replacing its code object while no one was looking, preserving the superficial shell of the original callable like an alien xenomorph implanted into the turgid belly of an unsuspecting Nostradamus miner. That's wicked, bro. I love it. Pragmatically, I have no idea how to safely do that – but where there is a devious will, there is a devious way. I've updated the last code snippet with a |
I'm not, no, just someone who likes looking at internals. If you do do it though, it will still use

```python
def beartype_overload(func):
    ...

overload.__code__ = beartype_overload.__code__
```

It might be desirable to do some checks to ensure another module hasn't overridden the method, and warn in that case – maybe revert to normal monkey-patching, and call the overridden version at the end so hopefully they chain together. |
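For the curious, here's a toy sketch of why the code-object transplant sidesteps the stale-binding problem: the function object's identity never changes, so every binding taken earlier updates too. The function names below are invented for illustration; this is not the real `typing.overload`.

```python
def overload(func):             # stand-in for the original implementation
    return "original"

def beartype_overload(func):    # our replacement behaviour
    return "patched"

# Binding taken *before* any patching, as an importing module would hold.
early_binding = overload

# Transplant the code object instead of rebinding the module attribute.
# The function object itself is untouched, so stale bindings update too.
overload.__code__ = beartype_overload.__code__

assert early_binding is overload
assert early_binding(None) == "patched"
```

One reason this is thin ice: the transplanted code still executes against the host function's original `__globals__` (and `__code__` assignment requires matching closure-cell counts), so a real patch must avoid free variables and globals that only exist on the beartype side.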
Right-o. As well as patch up
Yikes. Will the caller module still get the previously jitted Disastrous Monkey-patches of the Young & Dangerous.
You even thought of that. I thought we subconsciously agreed not to publicly talk about that, because everything just gets so ugly so fast. But... yes. We can't be the only back-alley cretins contemplating this. Somewhere in a locked vault secure under the Alaska-British Columbian border, someone with Roswell-level security clearance is already doing this. And they're not going to be pleased when they read this issue.
You're right, of course. Replacing the So, we can't just replace code objects in that case. Instead, we reach for the oxygen bag and hope for the best.
an adverb i never hoped to hear again You're right, of course. Let's pretend there's a sane plugin API here and just blindly chain everything together. Nothing could possibly go wrong. cue GLaDOS |
Lastly, I have a painfully stupid alternative to everything clever you devised above. Rather than abuse code objects, just:
Machine Gods, spare us from my dumbness! Please tell me that will never work or only occasionally work but fall down in pernicious edge cases. The most significant edge case I see is conditional deferred imports of either |
It wouldn't work if you import it non-globally, just import |
I didn't even think of the obvious bare Very well. We congenially agree as gentlemen that this is ugly, then. |
Ah-ha! The obvious bare Using it from elsewhere: ditto... probably. I'm mostly trying to cover the obscure corner case of: "What do we do when someone else already monkey-patched If we can nail down 99% of all edge cases, I'm mostly fine with just documenting that:
Happily, importing Oh, and thanks much-ly for hashing this over with me this Saturday evening. You were a tremendous help, AI pal @TeamSpen210. Truly. Thank you. |
Oh wow, the conversation here became too complex for me to follow properly. Sorry for the delay, I was hibernating (and by that I mean being buried in not-so-exciting stuff from my studies). And sorry if the following will sound stupid... it's your fault actually, I didn't even ask for such an ingenious discussion, but can I add my two cents here? My humble opinion is that

```python
from typing import overload
from beartype import beartype, beartype_overload

@overload
@beartype_overload
def stringify(x: int) -> str:
    ...

@overload
@beartype_overload
def stringify(x: float) -> str:
    ...

@beartype
def stringify(x: int | float) -> str:
    return str(x)
```

Mypy seems to be happy with this, and I even get pretty code completion. The most important thing here is that

```python
from typing import Any, Callable, TypeVar

CallableT = TypeVar('CallableT', bound=Callable[..., Any])

def beartype_overload(func: CallableT) -> CallableT:
    ...  # <clever implementation here>
```

If I dare to write something so seemingly innocent as

Here goes a sample implementation:

```python
from collections import defaultdict
from inspect import signature
from pprint import pprint
from typing import Any, Callable, TypeVar, overload

AnyCallable = Callable[..., Any]
CallableT = TypeVar("CallableT", bound=AnyCallable)

RECORD: defaultdict[str, list[AnyCallable]] = defaultdict(list)

def beartype_overload(func: CallableT) -> CallableT:
    RECORD[func.__name__].append(func)
    return func

def beartype(func: AnyCallable) -> AnyCallable:
    pprint([signature(f) for f in RECORD[func.__name__]])
    return func

# ------------------------------------------------ USAGE:

@overload
@beartype_overload
def stringify(x: int) -> str:
    ...

@overload
@beartype_overload
def stringify(x: float) -> str:
    ...

@beartype
def stringify(x: int | float) -> str:
    return str(x)
```

A summary of what is going on here:
The main advantage I see here is that there are no conflicts with Does this make sense to you people? |
...is what I'd say if I was a jerk. Instead, I'm Canadian. Like all patriotic frost-bite victims, I'm required by federal law to be congenial, punctual, and polite. Apparently, this is an alternative to actually having nuclear weapons. i remain skeptical Srsly, tho. You are absolutely right about everything. Decorator composability is the viable third way we shamefully failed to consider – even though it's also the least fragile and most explicit approach. I am now self-flagellating myself with a burlap handbag. I'm deeply indebted for all of the voluminous code, too. I can personally confirm that:
```python
# In "beartype._decor.main":
T = TypeVar('T', bound=Callable[..., Any])

def beartype(func: T) -> T: ...
```
Quick – Get the Toilet Snake, Somebody

The Turing-complete devil is in the details. The most significant blocker is still the

```python
# In the official "typing" module:
def _overload_dummy(*args, **kwds):
    raise NotImplementedError(
        "You should not call an overloaded function. "
        "A series of @overload-decorated functions "
        "outside a stub module should always be followed "
        "by an implementation that is not @overload-ed.")

def overload(func):
    return _overload_dummy
```

Let's take a brief moment to appreciate the anal-retentive banality in the hearts of men.
This means that order is now significant. Specifically, everything will silently blow up if users accidentally reverse the decorator order: e.g.,

```python
from typing import overload
from beartype import beartype, beartype_overload

# This is like "Where's Waldo?" all over again.
# Can you spot the error? We can, because
# we have peered into the dark abyss that is
# the "typing" module. And it has stared back.
@beartype_overload
@overload
def stringify(x: int) -> str:
    ...

@beartype_overload
@overload
def stringify(x: float) -> str:
    ...

@beartype
def stringify(x: int | float) -> str:
    return str(x)
```

The naïve implementation of

The smart solution is for

```python
# In a new hypothetical "beartype._overload" submodule:
from typing import overload

# This is insane. This is typing.
_typing_dummy_function = overload(lambda: ...)

def beartype_overload(func: CallableT) -> CallableT:
    if func is _typing_dummy_function:
        raise BeartypeOverloadOrderException(
            '@beartype_overload erroneously applied before @typing.overload. '
            'Please apply @beartype_overload after @typing.overload instead: e.g.,\n'
            '    @overload\n'
            '    @beartype_overload\n'
            '    def my_overloaded_func(): ...'
        )
    RECORD[func.__name__].append(func)
    return func
```

Great. We have murdered a critically endangered dragon in its own lair. I hope we feel happy with ourselves.

The Old & The Balding: This Is My Story

Similar issues arise with the

Since overloaded callables must now be non-obviously decorated with multiple decorators in a specific order,
```python
from typing import overload
from beartype import beartype

# Oh, you sweet summer child.
@overload
def stringify(x: int) -> str:
    ...

@overload
def stringify(x: float) -> str:
    ...

@beartype
def stringify(x: int | float) -> str:
    return str(x)
```

```python
from typing import overload
from beartype import beartype, beartype_overload

# ...good, good.
@overload
@beartype_overload
def stringify(x: int) -> str:
    ...

# ...what is this horrible thing you have done.
@overload
def stringify(x: float) -> str:
    ...

@beartype
def stringify(x: int | float) -> str:
    return str(x)
```

Of course, we can reliably detect both cases:

```python
def beartype(func: CallableT) -> CallableT:
    # Detect both cases and then...
    if func.__globals__.get(func.__name__) is _typing_dummy_function:
        # If this is the uncommon case, cry @beartype a river.
        if func.__name__ in RECORD:
            raise BeartypeOverloadOrderException(
                f'Last overload of @typing.overload-decorated callable {func.__name__}() '
                f'not also decorated by @beartype.beartype_overload. '
                'Please apply @beartype_overload after @typing.overload: e.g.,\n'
                '    @overload\n'
                '    @beartype_overload\n'
                f'    def {func.__name__}(): ...'
            )
        # Else, this is the common case. Cry until it stops hurting.
        else:
            raise BeartypeOverloadOrderException(
                f'@typing.overload-decorated callable {func.__name__}() '
                f'not also decorated by @beartype.beartype_overload. '
                'Please apply @beartype_overload after @typing.overload: e.g.,\n'
                '    @overload\n'
                '    @beartype_overload\n'
                f'    def {func.__name__}(): ...'
            )
    pprint([signature(f) for f in RECORD[func.__name__]])
    return func
```

Of course, the kinda bigger catastrophe is that we're now breaking backward compatibility. Of course, we're not

Of course, the even bigger catastrophe is that

Now, Stalin preserve us. We've become the self-loathing bureaucratic paper-shuffler in Papers, Please.

Wake Me up When the Pain Has Finally Subsided

There's also a crazy number of wide-eyed gremlins lurking about with tetanus-encrusted rusty nails like little murder hobos, including:
```python
def _get_func_hashname(func: Callable) -> str:
    '''
    Shoot me now, fam.
    '''
    # Fall back to a sortable placeholder when "__module__" is "None"
    # (e.g., for dynamically fabricated in-memory callables).
    return f'{func.__module__ or "00-in_memory"}.{func.__qualname__}'
```
So, we'll also need to declare our own private (but well-tested) I am sighing fitfully into my hipster kombucha. So What You're Saying Is...Happiness remains an elusive dream for @leycec. It's hard to be fully satisfied by any of the solutions on hand. We either:
This is why you don't force-install

Breaking the Fourth Wall

Do not crucify me without a warrant, but there's actually a fourth hidden option that preserves backward compatibility, user expectations, and robustness:
The line that threads through all three of these |
As you mentioned earlier, you'll still need to use |
Right-o. I knew I was forgetting something from the above dissection of angry dragons.

```python
from warnings import warn

def beartype(func: CallableT) -> CallableT:
    # Arbitrary string uniquely identifying this callable.
    func_hashname = _get_func_hashname(func)

    #FIXME: Also handle nested callables by inspecting the
    #call stack. Odin, hear your disciple's plaintive cry!

    # Detect both cases and then...
    if func.__globals__.get(func.__name__) is _typing_dummy_function:
        # If this is the uncommon case, cry @beartype a river.
        if func_hashname in RECORD:
            raise BeartypeOverloadOrderException(
                f'Last overload of @typing.overload-decorated callable {func.__name__}() '
                f'not also decorated by @beartype.beartype_overload. '
                'Please apply @beartype_overload after @typing.overload: e.g.,\n'
                '    @overload\n'
                '    @beartype_overload\n'
                f'    def {func.__name__}(): ...'
            )
        # Else, this is the common case. Cry until it stops hurting.
        else:
            warn(
                f'@typing.overload-decorated callable {func.__name__}() '
                f'not also decorated by @beartype.beartype_overload. '
                'Please apply @beartype_overload after @typing.overload: e.g.,\n'
                '    @overload\n'
                '    @beartype_overload\n'
                f'    def {func.__name__}(): ...',
                BeartypeOverloadOrderWarning,
            )
    pprint([signature(f) for f in RECORD[func_hashname]])
    del RECORD[func_hashname]
    return func
```

So, |
@leycec your fourth option looks awesome to me. Like, really great. After connecting the wires correctly, this should work just perfectly - if users opt in to "automatic" patching, it is their responsibility to deal with the possible obscure errors that may arise in case of incompatibility. Otherwise, it's just a matter of remembering to

Unfortunately, I'm kind of short on time right now, so I will stop commenting on this (for now) and head on to share one more idea (our 5th so far). It will sound crazy, I know, but aren't we all mad here? The idea is... Instead of monkey-patching

Just to be clear, this is the structure I'm using for this example:

```
.
├── beartype
│   ├── __init__.py
│   └── _typing.py
└── main.py
```

beartype/_typing.py

```python
from typing import overload
```

Do you know what mypy & fellow type-checkers will think? That

beartype/__init__.py

```python
from collections import defaultdict
from inspect import signature
from pprint import pprint
from typing import Any, Callable, TypeVar

__all__ = ["beartype_function", "beartype_overload"]

AnyCallable = Callable[..., Any]
CallableT = TypeVar("CallableT", bound=AnyCallable)

# This RECORD dictionary is just part of a very simplified implementation!
# As pointed out in previous comments, it is not necessarily safe, but it works
# well for this example, so let's keep it as is.
RECORD: defaultdict[str, list[AnyCallable]] = defaultdict(list)

# Simple implementation of our brand-new `@beartype_overload` decorator.
# Currently, it just adds functions to the RECORD dictionary based on function
# name. For a more real-world implementation, see previous comments.
def _beartype_overload(func: CallableT) -> CallableT:
    RECORD[func.__name__].append(func)
    return func

# I renamed `beartype` as `beartype_function` to avoid a name conflict,
# but this is our good old friend `@beartype` - simulated here as a simple
# function that prints all signatures seen so far.
def beartype_function(func: AnyCallable) -> AnyCallable:
    pprint([signature(f) for f in RECORD[func.__name__]])
    return func

# ^^^^^^^ YOU HAVE ALREADY SEEN THIS ^^^^^^^
# vvvvvvv NEW STUFF BELOW vvvvvvv

import beartype._typing

# A-HA! Type-checkers are oblivious to this runtime monkey-patching.
# Also, we are not monkey-patching the stdlib `typing` module, so no one gets
# sad here, and there are no conflicts with third-party patches.
beartype._typing.overload = _beartype_overload

# Now we just re-export `beartype._typing.overload` with a cute name like
# `beartype_overload`.
from beartype._typing import overload as beartype_overload
```

Finally, our cute little main.py

```python
from beartype import beartype_function, beartype_overload

@beartype_overload
def stringify(x: int) -> str:
    ...

@beartype_overload
def stringify(x: float) -> str:
    ...

@beartype_function
def stringify(x: int | float) -> str:
    return str(x)
```

Again, mypy is happy even if I execute |
That’s another good solution as well. Mypy will probably flag the assignment as illegal and/or potentially do something in the future, so you’ll probably need to guard with |
That's deviousness beyond all prior clinical understanding of deviousness. You just out-Machiavelli-ed mypy at its own game – and I for one welcome and nervously applaud our new compact

However, as The Spenster observes with his formidable Sherlock-like powers of perception, mypy devs will consider this an attack on their core business model. They should and they will. If you can trivially circumvent

I'm not reporting this upstream, because I don't want them to try resolving this. But someone will, because other people are like that. When this happens, the poorly concealed butterfly knives will come out with a disturbing scritching noise. A cool grindhouse blood vengeance scene choreographed by Tarantino then ensues. I don't want mypy devs to hate us, because life here in the runtime trenches is already hard enough. They will resolve that loophole singularity – but they can't resolve every possible permutation, eh? We are Turing-complete and they aren't.

As a fully foolproof, battle-hardened, last-ditch, End Times-ready defense against the living dead that surely walk amongst us, we might layer all of God-tier Spen's hacks into one mammoth hack. And it shall be known as... Super Turbo Kludge:

```python
# ^^^^^^^ YOU HAVE ALREADY SEEN THIS ^^^^^^^
# vvvvvvv NEW STUFF BELOW vvvvvvv

from typing import TYPE_CHECKING

import beartype._typing

if not TYPE_CHECKING:
    # A-HA! Type-checkers are oblivious to this runtime monkey-patching.
    # Also, we are not monkey-patching the stdlib `typing` module, so no one
    # gets sad here, and there are no conflicts with third-party patches.
    setattr(
        beartype._typing, (
            # Make this hexadecimal for mega bonus points.
            'o' +
            'v' +
            'e' +
            'r' +
            'l' +
            'o' +
            'a' +
            'd'
        ),
        _beartype_overload,
    )

# Now we just re-export `beartype._typing.overload` with a cute name like
# `beartype_overload`.
from beartype._typing import overload as beartype_overload
```

But all this begs the question...

Maybe It Should Just Be Public

PEP 585 is a problem for all type-checkers (both static and runtime), because it deprecates most of PEP 484 and thus the entire standard

Very well. Let's admit I did do that. But if you scan to the end of the prior link, you might notice that the optimal hot fix for PEP 585 is for everyone to define their own private

```python
# In "{your_package}._typing":
from sys import version_info

if version_info >= (3, 9):
    List = list
    Tuple = tuple
    ...
else:
    from typing import List, Tuple, ...
```

Wait. Wait just a minute there, Keanu! The right-brain pattern-matching synapses are firing with a dull ache in my forehead. Above, Ruan cleverly suggested we define our own private

```python
# In our public "beartype.typing" submodule:
from beartype._overload import beartype_overload as _beartype_overload
from sys import version_info as _version_info
from typing import TYPE_CHECKING, overload

if _version_info >= (3, 9):
    List = list
    Tuple = tuple
    ...
else:
    from typing import List, Tuple, ...

if not TYPE_CHECKING:
    # A-HA! Type-checkers are oblivious to this runtime monkey-patching.
    # Also, we are not monkey-patching the stdlib `typing` module, so no one
    # gets sad here, and there are no conflicts with third-party patches.
    globals()[
        # Make this hexadecimal for mega bonus points.
        'o' +
        'v' +
        'e' +
        'r' +
        'l' +
        'o' +
        'a' +
        'd'
    ] = _beartype_overload
```

Everyone then imports

Of course,

We now return to your regularly scheduled Friday night debauchery at the GitHub cantina. Cue "Cantina Song" on a sketchy jukebox. |
You may be interested in https://bugs.python.org/issue46821 where I propose adding runtime introspection support to |
Yes! Thanks so much for generating my excitement on a Monday. It's a hard day to look forward to – but now I do, thanks to @JelleZijlstra. Deferring to a standard

The Risky Plan Until Then

In the meanwhile, @beartype recently added a new mostly undocumented 😓

It's kinda unlikely a runtime-introspectable

Let's see if @leycec makes this happen earlier for everyone in 2022! 🥳 |
This change should not need a PEP, since it only affects the runtime. We already added introspection support for @Final in 3.11 (https://bugs.python.org/issue46342). If the overload change is accepted, I'll also backport it to typing-extensions for the benefit of older versions. |
Oh – you're quite right. Since I'm getting the creeping feeling that you usually are, I applaud everything you are and everything you do. Thanks so much for all your tremendous volunteerism throughout the community, Jelle. Also, my resume will never look like this:
...kinda in awe of that work ethic. I can barely snow-shovel our shed.
👍 👍 👍 Such excitement. Given that, Magic like that only happens once in a generation. |
An update: |
Super-hype. I'm delighted I no longer need to do anything, because 2022 is hard enough. Let's rename this issue accordingly. Thanks so much for inspiring and driving the details behind upstream CPython support, Spence! You're a living phenomenon in the typing community. |
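For readers arriving after the dust settled: under Python ≥ 3.11, the stdlib records each `@overload` definition itself and exposes them via `typing.get_overloads()`. A guarded sketch:

```python
import sys
from typing import overload

@overload
def stringify(x: int) -> str: ...
@overload
def stringify(x: float) -> str: ...
def stringify(x):
    return str(x)

if sys.version_info >= (3, 11):
    # New in 3.11: the stdlib registers each @overload-decorated
    # definition, retrievable at runtime in declaration order.
    from typing import get_overloads
    overloads = get_overloads(stringify)
    assert len(overloads) == 2
```

As promised above, `typing_extensions` also backports `overload`/`get_overloads` for older interpreters.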
Greetings

Hi, dear weird bear aficionado! First of all, thanks for this awesome library! I've just started using it and it looks great, I can't wait to write runtime-type-safe functions everywhere.

What I would like to see

One functionality that I would love to see here is the ability to write overloaded functions, akin to typing.overload, but at runtime. Something like this:

Note that this is not equivalent to writing greet(name_or_age: Union[str, int], age: Optional[int]) since, for instance, greet(10, 10) is invalid.

Basically, beartype should test function calls against all of the overloaded signatures. If any of them matches, it's okay. Otherwise, ROAR!

Also note that here I separated the @overload part from the @implement one. This is because...

What I am not proposing

I am not proposing function dispatching here. So the following is not what I wish to see:

Function dispatching brings a lot of issues such as deciding which implementation to choose for a given signature. And also, there are already some libraries out there that implement single and even multiple dispatching; I don't believe that that is a job for beartype.

So...

That is why I separated the @overload and the @implement parts. The way I see it, beartype should first check for the first overload, and then the second, the third, and so on until one matching overload is found. If this is the case, then beartype should just call the @implement part without any further checks.

Does that make sense to you? Looking forward to seeing your opinion on this!
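For completeness, the "try each overload in order, first match wins" semantics proposed here can be prototyped in a few lines. Everything below (`make_overloaded`, the naive `isinstance` matching, the `_greet_*` helpers) is a hypothetical illustration rather than beartype's actual API; it handles only positional arguments annotated with plain classes:

```python
def make_overloaded(overloads, impl):
    """Call `impl` only when some overload's annotations match the arguments."""
    def wrapper(*args):
        for ov in overloads:
            # Parameter annotations of this overload, in declaration order.
            hints = [t for name, t in ov.__annotations__.items()
                     if name != 'return']
            # Naive match: same arity and every argument is an instance.
            if (len(hints) == len(args) and
                    all(isinstance(arg, hint)
                        for arg, hint in zip(args, hints))):
                return impl(*args)  # first matching overload wins
        raise TypeError('no overload matches the given arguments: ROAR!')
    return wrapper

# Two overload signatures and one implementation, mirroring the request.
def _greet_name_age(name: str, age: int) -> str: ...
def _greet_age(age: int) -> str: ...
def _greet_impl(*args):
    return f'hello {args}'

greet = make_overloaded([_greet_name_age, _greet_age], _greet_impl)

assert greet('Ruan', 30) == "hello ('Ruan', 30)"
assert greet(10) == 'hello (10,)'
# greet(10, 10) matches neither overload, so it raises TypeError.
```

Note this rejects `greet(10, 10)` exactly as requested, something a single `Union`-typed signature could not express.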