Change in behaviour with JSON serialization between v1.8.2 and v1.9.1 #4255
Comments
I think the behavior changed in 458f257. You can use custom JSON serialisation:

```python
from pydantic import BaseModel
from datetime import datetime


def custom_dumps(v, *, default):
    v['datetime_1'] = v['datetime_1'].isoformat()
    v['datetime_2'] = v['datetime_2'].isoweekday()
    return v


class ExportTest(BaseModel):
    datetime_1: datetime = datetime.utcnow()
    datetime_2: datetime = datetime.utcnow()

    def dict(self, **kwargs):
        data = super().dict()
        data['datetime_1'] = data['datetime_1'].isoformat()
        data['datetime_2'] = data['datetime_2'].isoweekday()
        return data

    class Config:
        json_dumps = custom_dumps


e = ExportTest()
print(e.dict())
print(e.json())
```

Output:
@hramezani thank you for pointing that out, I was not aware of that usage of `json_dumps`.
I'm sorry about the breaking change, it was not intentional; we should have thought about it more and been more careful. 🙏 I don't think we can revert. The real solution is to be much stricter about breaking changes after V2 and, consequently, to make more regular major releases.
@samuelcolvin no worries, I greatly appreciate the work you guys are doing on this amazing project!
PR welcome to update the changelog. I'm hoping to work full time on pydantic from next week, once I get the first release of pydantic-core out and give arq some much-needed love.
I'll create the PR and close this issue accordingly.
After further examination, the provided solution is not as straightforward as it seems. Running the example provided by @hramezani returns this in v1.9.1:

Adding `json.dumps` to the custom dumps function fixes that:

```python
from pydantic import BaseModel
from datetime import datetime
import json


def custom_dumps(v, *, default):
    v['datetime_1'] = v['datetime_1'].isoformat()
    v['datetime_2'] = v['datetime_2'].isoweekday()
    return json.dumps(v)


class ExportTest(BaseModel):
    datetime_1: datetime = datetime.utcnow()
    datetime_2: datetime = datetime.utcnow()

    def dict(self, **kwargs):
        data = super().dict()
        data['datetime_1'] = data['datetime_1'].isoformat()
        data['datetime_2'] = data['datetime_2'].isoweekday()
        return data

    class Config:
        json_dumps = custom_dumps


e = ExportTest()
print(e.dict())
print(e.json())
```

BUT, there is a caveat here. While this example is super simplified, in the real world I have a lot of nested models in my code. Consider this:

```python
from pydantic import BaseModel
from datetime import datetime
import json


def custom_dumps(v, *, default):
    v['datetime_1'] = v['datetime_1'].isoformat()
    v['datetime_2'] = v['datetime_2'].isoweekday()
    return json.dumps(v)


class Internal(BaseModel):
    time: datetime = datetime.utcnow()

    class Config:
        json_encoders = {datetime: str}


class ExportTest(BaseModel):
    internal: Internal = Internal()
    datetime_1: datetime = datetime.utcnow()
    datetime_2: datetime = datetime.utcnow()

    def dict(self, **kwargs):
        data = super().dict()
        data['datetime_1'] = data['datetime_1'].isoformat()
        data['datetime_2'] = data['datetime_2'].isoweekday()
        return data

    class Config:
        json_dumps = custom_dumps


e = ExportTest()
print(e.dict())
print(e.json())
```

This does NOT work, as my custom json_dumps method now needs to handle all internal serializations:
Even if I add another custom dumps method to the internal class, apparently it's not called when using the external serialization:

```python
from pydantic import BaseModel
from datetime import datetime
import json


def internal_dumps(v, *, default):
    v['time'] = str(v['time'])
    return v


def custom_dumps(v, *, default):
    v['datetime_1'] = v['datetime_1'].isoformat()
    v['datetime_2'] = v['datetime_2'].isoweekday()
    return json.dumps(v)


class Internal(BaseModel):
    time: datetime = datetime.utcnow()

    class Config:
        json_encoders = {datetime: str}
        json_dumps = internal_dumps


class ExportTest(BaseModel):
    internal: Internal = Internal()
    datetime_1: datetime = datetime.utcnow()
    datetime_2: datetime = datetime.utcnow()

    def dict(self, **kwargs):
        data = super().dict()
        data['datetime_1'] = data['datetime_1'].isoformat()
        data['datetime_2'] = data['datetime_2'].isoweekday()
        return data

    class Config:
        json_dumps = custom_dumps


e = ExportTest()
print(e.dict())
print(e.json())
```

This fails in exactly the same way. For further comparison, this is the code and output when using pydantic 1.8.2:

```python
from pydantic import BaseModel
from datetime import datetime


class Internal(BaseModel):
    time: datetime = datetime.utcnow()


class ExportTest(BaseModel):
    internal: Internal = Internal()
    datetime_1: datetime = datetime.utcnow()
    datetime_2: datetime = datetime.utcnow()

    def dict(self, **kwargs):
        data = super().dict()
        data['datetime_1'] = data['datetime_1'].isoformat()
        data['datetime_2'] = data['datetime_2'].isoweekday()
        return data


e = ExportTest()
print(e.dict())
print(e.json())
```

So it seems there isn't an easy workaround for this issue (other than letting the external dumps method be in charge of also handling all the nested models' serialization, which is not ideal, to say the least). As it stands, this is a blocker for me to upgrade my Python version to 3.10+.
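One stdlib-level escape hatch for the nesting problem (a sketch of my own, not something stated in this thread or in pydantic's docs; all names below are illustrative): rather than rewriting known keys by name, let `json.dumps`'s `default` hook convert every datetime it meets, at any nesting depth.

```python
import json
from datetime import datetime


def dumps_with_default(v):
    """Serialize a dict tree, converting datetimes wherever they appear."""
    def fallback(obj):
        # json.dumps calls this for every object it can't serialize natively,
        # including objects nested arbitrarily deep.
        if isinstance(obj, datetime):
            return obj.isoformat()
        raise TypeError(f"Not serializable: {type(obj)!r}")

    return json.dumps(v, default=fallback)


stamp = datetime(2022, 7, 4, 9, 30)
nested = {"internal": {"time": stamp}, "datetime_1": stamp}
print(dumps_with_default(nested))
# → {"internal": {"time": "2022-07-04T09:30:00"}, "datetime_1": "2022-07-04T09:30:00"}
```

In pydantic v1 terms the same idea could live inside the function assigned to `Config.json_dumps`, since that function already receives a `default` callback it can defer to for non-datetime types; field-specific tweaks (like `isoweekday()` for a single field) would still have to be keyed by name at the top level.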
@PrettyWood do you have any suggestions? I'm really sorry about your pain, but I'd rather not add a flag for one release unless we really have to; the behavior will change completely in V2 (we also need to make sure customisation like this is possible in V2).
I understand @samuelcolvin, and waiting for V2 is a perfectly reasonable solution for me, thanks! If I may make a suggestion though: this entire customization was done in order to allow different serialization of the same type for different fields. I'd like to propose an API suggestion for future releases (if relevant):

```python
from pydantic import BaseModel, Field
from datetime import datetime


class ExportTest(BaseModel):
    datetime_1: datetime = Field(default_factory=datetime.utcnow, json_encoder=lambda x: x.isoformat())
    datetime_2: datetime = datetime.utcnow()

    class Config:
        json_encoders = {datetime: lambda x: x.isoweekday()}
```

Setting the encoder using `Field` would take precedence over the type-level encoder in `Config.json_encoders`.
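As a plain-Python sketch of the semantics this proposal implies (all names here are illustrative, not an existing pydantic API): a per-field encoder table is consulted first, and only fields without an entry fall back to the per-type table.

```python
import json
from datetime import datetime

# Hypothetical encoder tables: per-field entries win over per-type ones.
FIELD_ENCODERS = {
    "datetime_1": lambda v: v.isoformat(),
}
TYPE_ENCODERS = {
    datetime: lambda v: v.isoweekday(),
}


def encode(data):
    out = {}
    for name, value in data.items():
        if name in FIELD_ENCODERS:
            out[name] = FIELD_ENCODERS[name](value)
        elif type(value) in TYPE_ENCODERS:
            out[name] = TYPE_ENCODERS[type(value)](value)
        else:
            out[name] = value
    return out


d = datetime(2022, 7, 4, 9, 30)  # 2022-07-04 was a Monday
print(json.dumps(encode({"datetime_1": d, "datetime_2": d})))
# → {"datetime_1": "2022-07-04T09:30:00", "datetime_2": 1}
```

The lookup order (field first, then type) is exactly the precedence the proposal asks for.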
I think that makes complete sense. I haven't started working (or thinking) on how we do exporting/dumping in v2/pydantic-core, but I imagine it'll be somewhat similar to validation, where we have a linked collection of serialisers; a functional serialiser would accomplish exactly what you need.
That sounds awesome, looking forward to V2. Would you like to keep this issue open as a reference?
Yes, I think best to keep it open. |
Ok, so I was able to come up with a pretty silly workaround:

```python
from pydantic import BaseModel, Field
from datetime import datetime
import json


class Internal(BaseModel):
    d: datetime = Field(default_factory=datetime.utcnow)


class External(BaseModel):
    internal: Internal = Internal()
    datetime_1: datetime = datetime.utcnow()
    datetime_2: datetime = datetime.utcnow()

    class Config:
        json_encoders = {datetime: lambda v: v.isoformat()}

    def json(self, **kwargs):
        json_text = super().json(**kwargs)
        data = json.loads(json_text)
        data['datetime_2'] = datetime.fromisoformat(data['datetime_2']).isoweekday()
        return json.dumps(data)


e = External()
print(e.json())
```
I'll just override the `json()` method directly. Not great, since it round-trips every export through `json.loads`/`json.dumps`, but it works.
Given the workaround (👀), and the fact that we are focusing on V2, I'll be closing this issue. 🙏
Checks

Bug

When converting to JSON via the `.json()` method, I used to override `dict()` in order to override behavior for different fields with the same type. Since v1.9.1 this does not seem to work anymore, as it appears that `.json()` does not call `dict()` internally. This looks to me like it's related to #3542, but I'm not entirely sure if I'm doing something wrong or misunderstanding.

Output of `python -c "import pydantic.utils; print(pydantic.utils.version_info())"`:

Output for v1.9.1:

Output for v1.8.2:

Also, is there any other way to define a different serialization for the same type but on different fields? If so, I was not able to find it and I apologize.