Replies: 1 comment
-
I have tried to simplify the solution by duplicating the attribute and using the `create_model` function:

```python
from typing import ClassVar, Optional, Type

from pydantic import BaseModel, create_model

class Event(BaseModel):
    TYPE: ClassVar[str] = 'base_event'
    type: Optional[str] = None
    data: Optional[str] = None  # Example field

    @classmethod
    def create_new_type(cls, name: str, fields: dict, type_identifier: str) -> Type[BaseModel]:
        # Mirror the identifier into the instance-level `type` field...
        fields.update({'type': (str, type_identifier)})
        new_type = create_model(name, __base__=cls, **fields)
        # ...and into the class-level TYPE, so it is reachable without an instance
        new_type.TYPE = type_identifier
        return new_type

# Usage
MySpecialEvent = Event.create_new_type('MySpecialEvent', {'data': (str, None)}, 'my_special_event')
my_event = MySpecialEvent(data='Example data')
json_data = my_event.json()
print(json_data)  # {"type": "my_special_event", "data": "Example data"}

# Deserialization
deserialized_event = MySpecialEvent.parse_raw(json_data)
print(deserialized_event.type)  # "my_special_event"

# EventManager subscription example
# EventManager.subscribe(MySpecialEvent.TYPE, callback_function)
```

A bit less dirty in my opinion, but still not fulfilling the identity clause.
-
Hi everyone, let me start with a huge thanks and compliments for creating this amazing library. I love Pydantic and I am learning how to use it more and more deeply. I have been handling V1 quite well and I am learning to handle V2 by trying and failing 😉
I have a question for you all about what would be the best approach to solve the following scenario:
I have created a multiprocessing, event-driven system, and after several experiments I ended up choosing Pydantic models to act as Events within the infrastructure. Because Events travel over websockets, they need to be serialized into JSON and deserialized, and there are validations to apply, such as the data type that each kind of Event carries. Ideally I would use an Enum-like object to handle this locally, but Enums don't allow extension by design, and serialization would be on me, while Pydantic has it all built in. So I have created a simple model to experiment with:
The idea is that every class which produces Event(s) could extend Event and create custom subtypes with specialized data and type. So I can have:
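The snippet referenced here didn't survive the formatting; a minimal sketch of such a hierarchy (the name MyUniqueEvent and the data field are assumptions taken from the rest of the post, using the v1-style `.json()` API that appears elsewhere in the thread) might look like:

```python
from typing import Optional

from pydantic import BaseModel

class Event(BaseModel):
    # String identifier serialized alongside the payload
    type: str = 'base_event'
    data: Optional[str] = None

class MyUniqueEvent(Event):
    # Specialized subtype: overrides the identifier and the payload default
    type: str = 'my_unique_event'
    data: Optional[str] = 'unique data'

event = MyUniqueEvent()
print(event.json())  # JSON carrying type "my_unique_event"
```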
Now, the issues I am facing: I would love to be able to use the event type as a hashable key with the EventManager, as follows:
event_manager.subscribe(MyUniqueEvent, callback)
The problem is that, with a hierarchy of Event objects all extending Event, once they are serialized into JSON I won't necessarily know which specific event is arriving on the other side of the websocket, so I would have to deserialize the event with the generic class:
Event.model_validate_json(serialized_event)
which would create an Event model instance with the right content, but would fail the type check, because it is now an Event instance and not a MyUniqueEvent. The workaround of subscribing via a throwaway instance:
event_manager.subscribe(MyUniqueEvent().type, callback)
is as ugly as it is inefficient.
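To make the mismatch concrete, here is a small sketch (using the v1-style `.json()`/`.parse_raw()` API from the earlier reply, which still works, though deprecated, in v2) of how deserializing through the base class loses the concrete class:

```python
from pydantic import BaseModel

class Event(BaseModel):
    type: str = 'base_event'
    data: str = ''

class MyUniqueEvent(Event):
    type: str = 'my_unique_event'

serialized = MyUniqueEvent(data='payload').json()

# Reconstructing through the base class yields a plain Event instance...
event = Event.parse_raw(serialized)
print(isinstance(event, MyUniqueEvent))  # False
# ...even though the payload still carries the specialized identifier
print(event.type)  # my_unique_event
```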
What I have half hacked now is the following:
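That snippet also fell out of the post; from the description that follows (a class-level TYPE constant duplicating the instance-level type field), it presumably looks something like this:

```python
from typing import ClassVar, Optional

from pydantic import BaseModel

class Event(BaseModel):
    # Class-level identifier, usable without instantiating the model
    TYPE: ClassVar[str] = 'base_event'
    # Instance-level copy, so the identifier survives JSON round-trips
    type: str = 'base_event'
    data: Optional[str] = None

class MyUniqueEvent(Event):
    TYPE: ClassVar[str] = 'my_unique_event'
    type: str = 'my_unique_event'
```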
So when I subscribe to an event which I know of I can do:
event_manager.subscribe(MyUniqueEvent.TYPE, callback)
and when deserializing I can check the .type of the event instance reconstructed like so:
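The referenced snippet is missing; a sketch of that check, dispatching on the string identifier rather than the Python class (class names are assumptions carried over from the rest of the post), could be:

```python
from typing import ClassVar

from pydantic import BaseModel

class Event(BaseModel):
    TYPE: ClassVar[str] = 'base_event'
    type: str = 'base_event'
    data: str = ''

class MyUniqueEvent(Event):
    TYPE: ClassVar[str] = 'my_unique_event'
    type: str = 'my_unique_event'

# Simulate an event arriving over the websocket
raw = MyUniqueEvent(data='payload').json()
event = Event.parse_raw(raw)  # reconstructed as a generic Event

# Compare the instance's identifier against the subclass's class constant
if event.type == MyUniqueEvent.TYPE:
    print('dispatch to MyUniqueEvent subscribers')
```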
This works, but it's a bit tricky and inconsistent, because Event.TYPE will still be the original base value while .type will be that of the deserialized event. But it allows me to register a callback using a static type definition without having to create an instance, and it allows me to check whether the .type is the right one, even though the Python type of the instance isn't. I am sure there is a better Pydantic way of solving this problem and I am looking forward to your suggestions.
Thanks in advance