[Docos] Add a new FAQ entry on preventing beartype.claw
from type-checking "bad boy" callables and classes
#263
-
Hi! I just tried the amazing new `beartype.claw` feature.

So from my understanding, it looks like the authors of the upstream Flax package have annotated a parameter as `typing.Type[TheType]` instead of `TheType` directly, and that causes everything to fail. Is there a way to pause beartype for that specific call?
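For readers landing here from the FAQ, a minimal sketch of how one might exempt a single callable from beartype's checking. This assumes beartype's documented behavior of honoring the standard `@typing.no_type_check` decorator; `call_flax_layer` is a hypothetical stand-in for whatever callable triggers the violation.

```python
from typing import no_type_check

# Decorating a callable with @no_type_check marks it as exempt from
# type-checking. beartype (including the beartype.claw import hooks)
# skips callables so marked, leaving their annotations unenforced.
@no_type_check
def call_flax_layer(parent: int) -> str:
    # Despite the "-> str" hint, returning an int raises no violation
    # here, because this callable is never wrapped by beartype.
    return parent

print(call_flax_layer(42))  # → 42, no type-check error
```

beartype also documents a configuration-based alternative: `beartype(conf=BeartypeConf(strategy=BeartypeStrategy.O0))` produces a no-op decorator that can be applied to individual callables or classes to the same effect.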
-
Aww. Thanks so much for such kind words that both massage my aching soul and stroke my ego gently. Let's see if @beartype can help set this typing conundrum right for the good of AI in Amsterdam. 🤖 + ☕ = win!

Now, if I'm reading that type violation right... which I'm probably not, because the Canadian heat wave has fried my neocortex like an overcooked pancake on the outdoor skillet... your `GraphTransformer.__init__()` is currently annotated as:

```python
class GraphTransformer(...):
    def __init__(self, parent: typing.Union[
        typing.Type[flax.linen.module.Module],
        typing.Type[flax.core.scope.Scope],
        typing.Type[flax.linen.module._Sentinel],
        NoneType,
    ], ...) -> None: ...
```

Is that all right? I suppose what I'm trying (but failing) to obliquely suggest is that those `typing.Type[...]` subscriptions accept only *classes* rather than *instances*. Therefore, mustn't it be the case that your `GraphTransformer.__init__()` should instead be annotated as:

```python
class GraphTransformer(...):
    def __init__(self, parent: typing.Union[
        flax.linen.module.Module,
        flax.core.scope.Scope,
        flax.linen.module._Sentinel,
        None,
    ], ...) -> None: ...
```

Admittedly, I have no idea what's going on here. Curse you yet again, vicious Ontario heat wave. For example, why does that type hint reference the private Flax type `flax.linen.module._Sentinel`?
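The distinction between the two hints above can be illustrated with a self-contained sketch. Here `Module` is a hypothetical stand-in for `flax.linen.module.Module`; the two helper functions mimic what a runtime checker validates for each style of hint.

```python
from typing import Type, Union

class Module:
    """Hypothetical stand-in for flax.linen.module.Module."""

# What Flax wrote: "parent" must be a *class* (or None).
ClassHint = Union[Type[Module], None]
# What was (probably) intended: "parent" must be an *instance* (or None).
InstanceHint = Union[Module, None]

def satisfies_class_hint(value) -> bool:
    # "Type[Module]" demands a class object subclassing Module.
    return value is None or (
        isinstance(value, type) and issubclass(value, Module))

def satisfies_instance_hint(value) -> bool:
    # "Module" demands an instance of Module.
    return value is None or isinstance(value, Module)

print(satisfies_class_hint(Module))      # True  -- the class itself
print(satisfies_class_hint(Module()))    # False -- instances fail Type[...]
print(satisfies_instance_hint(Module())) # True  -- instances satisfy Module
```

So any code passing a `Module` *instance* to a parameter hinted as `Type[Module]` violates that hint at runtime, which is exactly the sort of mismatch a runtime checker surfaces.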
-
Thank you for your answer! And sorry for not explaining the misleading stack trace better.

```python
class GraphTransformer(nn.Module):
    n_layers: int
    hidden_mlp_dims: Dims = Dims(x=128, e=128, y=128)
    hidden_dims: HiddenDims = HiddenDims(
        x=256, e=64, y=64, n_head=8, dim_ffx=256, dim_ffe=128, dim_ffy=2048
    )

    @classmethod
    def initialize(
        cls,
        key: Key,
        in_node_features: int,
        in_edge_features: int,
        number_of_nodes: int,
        num_layers: int,
    ):
        ...

    @nn.compact
    def __call__(self, g: gd.OneHotGraph, y, deterministic: bool):
        ...
```
Ah-ha! Fantastic. Thanks so much for the minimal reproducible example (MRE). You are totally right about everything. Indeed, this is an upstream Google Flax typing bug. Woooooooah. Never expected Google of all tech giants and Flax of all tech-giant ML frameworks to drop the QA ball, but... we're all AI-augmented humans here, right? Mistakes do happen, and then @beartype raises its claw in anger against those mistakes.

Would you mind submitting an upstream issue against Flax's issue tracker? Interestingly, this further substantiates the need for `beartype.claw`-fueled hybrid runtime-static type-checking. After all, Google (probably) already statically type-checks Flax against their own in-house static type-checker.