
Denial of service when parsing a JSON object with an unexpected field that has a big number #609

Closed
plokhotnyuk opened this issue Oct 20, 2022 · 32 comments

Comments

@plokhotnyuk

Sub-quadratic decrease of throughput as the length of the JSON object increases

On contemporary CPUs, parsing such a JSON object with an additional field of 1000000 decimal digits (~1 MB) can take ~13 seconds:

[info] Benchmark                               (size)   Mode  Cnt         Score          Error  Units
[info] ExtractFieldsReading.jacksonScala            1  thrpt    3   2194453.617 ±  1329582.812  ops/s
[info] ExtractFieldsReading.jacksonScala           10  thrpt    3   2117455.756 ±  1388472.780  ops/s
[info] ExtractFieldsReading.jacksonScala          100  thrpt    3    692095.913 ±   578645.523  ops/s
[info] ExtractFieldsReading.jacksonScala         1000  thrpt    3     49243.276 ±     4670.093  ops/s
[info] ExtractFieldsReading.jacksonScala        10000  thrpt    3       673.445 ±        6.794  ops/s
[info] ExtractFieldsReading.jacksonScala       100000  thrpt    3         7.024 ±        0.040  ops/s
[info] ExtractFieldsReading.jacksonScala      1000000  thrpt    3         0.070 ±        0.005  ops/s

The root cause is probably in the jackson-core library, but I am reporting it here in the hope that a hot fix to simply skip unwanted values can be applied at the jackson-module-scala level.

Flame graph for CPU cycles at size=1000000


jackson-module-scala version

2.14.0-rc2

Scala version

2.13.10

JDK version

17

Steps to reproduce

To run the benchmark on your JDK:

  1. Install the latest version of sbt and/or ensure that it is already installed properly:
sbt about
  2. Clone the jsoniter-scala repo:
git clone --depth 1 https://github.com/plokhotnyuk/jsoniter-scala.git
  3. Enter the cloned directory and check out the specific branch:
cd jsoniter-scala
git checkout jackson-DoS-by-a-big-number
  4. Run the benchmark that reproduces the issue:
sbt clean 'jsoniter-scala-benchmarkJVM/jmh:run -wi 3 -i 3 ExtractFieldsReading.jacksonScala'
@plokhotnyuk
Author

plokhotnyuk commented Oct 20, 2022

A similar issue can be reproduced using weePickle: rallyhealth/weePickle#118

@pjfanning
Member

@plokhotnyuk jackson-module-scala only really adds the ability to process Scala classes to jackson-databind/jackson-core. I think any changes needed to help here will end up going into jackson-databind or jackson-core.

@cowtowncoder is there any existing support in jackson-databind to skip JSON subtrees that are not needed? The scenario is that a Java or Scala class can have certain fields but the JSON has extra fields that are not needed because they do not appear in the class.

@pjfanning
Member

pjfanning commented Oct 21, 2022

I get these results on my laptop and the results don't seem sub-quadratic (default branch as opposed to git checkout jackson-DoS-by-a-big-number branch). They are not great, given the amount of processing time spent on parsing the unexpected field.

Zulu 11.0.14 JRE

[info] Benchmark                           (size)   Mode  Cnt       Score         Error  Units
[info] ExtractFieldsReading.jacksonScala        1  thrpt    3  806409.694 ± 1014343.406  ops/s
[info] ExtractFieldsReading.jacksonScala       10  thrpt    3  165178.161 ±  412668.195  ops/s
[info] ExtractFieldsReading.jacksonScala      100  thrpt    3   21372.762 ±   39697.040  ops/s
[info] ExtractFieldsReading.jacksonScala     1000  thrpt    3    2060.872 ±    4385.220  ops/s
[info] ExtractFieldsReading.jacksonScala    10000  thrpt    3     161.848 ±     184.437  ops/s
[info] ExtractFieldsReading.jacksonScala   100000  thrpt    3      19.154 ±       6.080  ops/s
[info] ExtractFieldsReading.jacksonScala  1000000  thrpt    3       1.862 ±       1.127  ops/s

@plokhotnyuk
Author

plokhotnyuk commented Oct 21, 2022

> I get these results on my laptop and the results don't seem sub-quadratic (default branch as opposed to git checkout jackson-DoS-by-a-big-number branch). They are not great, given the amount of processing time spent on parsing the unexpected field.

The default branch tests only for hash-collision-handling vulnerabilities. You need to uncomment one of the subsequent lines that set up the input data to test other types of vulnerabilities:
https://github.com/plokhotnyuk/jsoniter-scala/blob/5480a829837a4d58affdca180025055340afbeea/jsoniter-scala-benchmark/shared/src/main/scala/com/github/plokhotnyuk/jsoniter_scala/benchmark/ExtractFieldsBenchmark.scala#L18-L22

@pjfanning
Member

Thanks @plokhotnyuk - I see the sub-quadratic performance in the jackson-DoS-by-a-big-number branch.

From your flame graph, it looks like the new BigInteger(string) implementation in the JRE could be a big part of the issue. I haven't debugged this, but @plokhotnyuk, do you think that would be a good place to start?
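
A quick way to sanity-check that suspicion outside of Jackson is to time BigInteger construction directly. This is a hedged, illustrative snippet (not a proper benchmark); on typical JDKs the decimal-to-binary conversion in new BigInteger(String) costs roughly quadratic time in the number of digits:

import java.math.BigInteger;

public class BigIntegerTiming {
    public static void main(String[] args) {
        // Rough timing only; a real measurement should use JMH, as in the repro steps above.
        for (int digits : new int[] {1_000, 10_000, 100_000, 1_000_000}) {
            String s = "7".repeat(digits); // String.repeat requires Java 11+
            long start = System.nanoTime();
            new BigInteger(s);
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println(digits + " digits -> " + elapsedMs + " ms");
        }
    }
}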

@plokhotnyuk
Author

plokhotnyuk commented Oct 21, 2022

I think an acceptable fix would be to replace all calls of this kind with a more efficient algorithm for parsing BigInteger/BigDecimal values that limits the maximum length of mantissas and scales.

Also, I'm thinking about detecting number types without parsing big numbers.

The best option would be to skip number values without parsing them at all, even small ones, like here.

@pjfanning
Member

@plokhotnyuk the jackson 2.14.0 final release is about to go out and I don't want to delay it.

jackson-core already has a BigDecimalParser that has some handling for very large numbers. This code is not nearly as good as yours but a small change to jackson-core to make it use BigDecimalParser to parse BigIntegers when the string is big does help quite a lot.

My initial testing shows that new BigInteger(string) is faster than BigDecimalParser.parse(string).toBigInteger() for smaller strings.

So a quick and dirty solution for v2.14 could be to use new BigInteger(string) when the string size is less than some magic size and to use BigDecimalParser.parse(string).toBigInteger() otherwise.

I will look to add the FastBigDecimalParser (that is a Java port of your code) in Jackson 2.15. There is currently a bug in my FastBigDecimalParser that stops it parsing very large integers.
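
A minimal sketch of that quick-and-dirty threshold idea (the helper name and the threshold value are illustrative assumptions, not the actual jackson-core change):

import java.math.BigInteger;
import com.fasterxml.jackson.core.io.BigDecimalParser;

public final class BigIntegerHelper {
    // Illustrative cut-off only; the real value would be chosen from benchmarks.
    private static final int SMALL_STRING_LIMIT = 500;

    public static BigInteger parseBigInteger(String s) {
        if (s.length() < SMALL_STRING_LIMIT) {
            // Faster for short inputs, per the testing described above.
            return new BigInteger(s);
        }
        // For long inputs, route through BigDecimalParser and convert.
        return BigDecimalParser.parse(s).toBigInteger();
    }
}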

@cowtowncoder
Member

One minor comment on one of the questions ("does jackson-databind have a way to skip sub-trees"): no and yes -- nothing explicit, but any content that is not bound will be skipped. So fully typed POJOs/Scala objects define the values that are extracted; anything else will be skipped (or an exception will be reported, depending on settings).
Similarly, anything explicitly ignored via annotations etc. is skipped.
In all of these cases, skipping avoids any processing beyond basic lexical conformance validation.

This would not apply to use cases where the whole content is bound to JsonNode or "untyped" java.lang.Object.
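
As a concrete illustration of that behaviour, here is a hedged sketch using a plain Java bean named like the one in this thread (not the actual benchmark code):

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class SkipUnknownDemo {
    public static class ExtractFields {
        public String s;
        public int i;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"s\":\"s\",\"n\":777777777777,\"i\":1}";

        // Default settings: the unbound "n" property causes UnrecognizedPropertyException.
        ObjectMapper strict = new ObjectMapper();
        try {
            strict.readValue(json, ExtractFields.class);
        } catch (Exception e) {
            System.out.println("strict: " + e.getClass().getSimpleName());
        }

        // With the feature disabled, "n" is skipped and binding succeeds.
        ObjectMapper lenient = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);
        ExtractFields ef = lenient.readValue(json, ExtractFields.class);
        System.out.println("lenient: s=" + ef.s + ", i=" + ef.i);
    }
}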

@pjfanning
Member

pjfanning commented Oct 22, 2022

@cowtowncoder in the benchmarks highlighted by @plokhotnyuk, a Scala class is used that does not have the field that contains the big number. In theory, that field can be skipped and not parsed. The evidence from the benchmark is that the field is not skipped: if the data in the ignorable field is made larger, the benchmark slows down. It appears that the JsonParser code in jackson-core is parsing the number using new BigInteger(str) when it could, in theory, be skipped.

@cowtowncoder
Member

@pjfanning Would it be possible to have a snippet of the class definition here? There are several things that could explain it; for example, the Scala module or some specific deserializer reading content into an intermediate representation -- or, maybe, buffering being required due to the use of a constructor/static factory method. Buffering is probably the likeliest case; currently that would force reading of the content into a value, here a Number.

@pjfanning
Member

The class (basically Scala equivalent of a JavaBean with String s and int i)

case class ExtractFields(s: String, i: Int)

The JSON is generated using this (where size is a large number):

s"""{"s":"s","n":${"7" * size},"i":1}"""

which generates something like:

{"s":"s","n":777777777777,"i":1}

@cowtowncoder
Member

cowtowncoder commented Oct 22, 2022

Ok, I am guessing that case classes must be using a constructor, so that would often require buffering, depending on the order in which properties are found in the input.
But in this case the number is included, not ignored?

It could of course also be that case classes use a custom deserializer which could do whatever it wants (and not be using BeanDeserializer)
(although, TBH, I didn't see a specific deserializer with the expected name).

@pjfanning
Member

I'll try to produce a Java benchmark.
The Scala code doesn't do much custom work. There is code for creating the Jackson property descriptors, but otherwise the deserialization should be very similar to the Java equivalent.

@pjfanning
Member

Ok. In the equivalent Java test, I get an exception:

com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "n" (class org.example.jackson.bench.ExtractFields), not marked as ignorable

        at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:61)
        at com.fasterxml.jackson.databind.DeserializationContext.handleUnknownProperty(DeserializationContext.java:1132)
        at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:2202)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1705)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1683)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:320)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:177)
        at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:323)

I'm not sure why this does not kick in when Scala is involved.

@pjfanning
Member

@cowtowncoder I set .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES) and the pure Java benchmark now runs:

pjfanning/jackson-number-parse-bench#1

The slowdown is subquadratic. I haven't debugged this to see where the time is spent.

Benchmark                                  Mode  Cnt       Score      Error  Units
BigIntegerJsonParseBench.bigParse1000     thrpt    5  516460.877 ± 5539.976  ops/s
BigIntegerJsonParseBench.bigParse1000000  thrpt    5     240.166 ±   16.468  ops/s
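
A self-contained JMH skeleton along those lines might look like this (hedged sketch; the names mirror the benchmark output above but this is not the actual code from the linked repository):

import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

@State(Scope.Benchmark)
@BenchmarkMode(Mode.Throughput)
@OutputTimeUnit(TimeUnit.SECONDS)
public class BigIntegerJsonParseBench {
    public static class ExtractFields {
        public String s;
        public int i;
    }

    ObjectMapper mapper;
    String json1000;
    String json1000000;

    @Setup
    public void setup() {
        mapper = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);
        json1000 = "{\"s\":\"s\",\"n\":" + "7".repeat(1_000) + ",\"i\":1}";
        json1000000 = "{\"s\":\"s\",\"n\":" + "7".repeat(1_000_000) + ",\"i\":1}";
    }

    @Benchmark
    public ExtractFields bigParse1000() throws Exception {
        return mapper.readValue(json1000, ExtractFields.class);
    }

    @Benchmark
    public ExtractFields bigParse1000000() throws Exception {
        return mapper.readValue(json1000000, ExtractFields.class);
    }
}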

@pjfanning
Member

> Could it be due to ignoring of unknown fields? https://github.com/plokhotnyuk/jsoniter-scala/blob/master/jsoniter-scala-benchmark/shared/src/main/scala/com/github/plokhotnyuk/jsoniter_scala/benchmark/JacksonSerDesers.scala#L44

Yes @plokhotnyuk, when accepting JSON from untrusted sources you should not use .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES), but it is still interesting to analyse the performance.

@pjfanning
Member

@cowtowncoder since this only happens with .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES), I think we can probably just press on with the v2.14 release. I have a few issues related to this in jackson-core and jackson-databind that I can come back to later.

@plokhotnyuk
Author

It is great that the default settings are safe, but it should be stated somewhere in the docs and tutorials that disabling DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES is unsafe for untrusted input.

Also, in Scala almost all JSON parsers allow unknown fields by default, so users tend to skip unknown fields with jackson-module-scala too, and in Java some frameworks, such as Spring, override the defaults.

@cowtowncoder
Member

@pjfanning Right, I am mostly interested in figuring out why skipping won't work, as a potential optimization. But agreed, not a real blocker, especially if it's not something new.

@cowtowncoder
Member

Is the idea that this might allow an easier DoS somehow, since -- in some cases -- skipping may not work as efficiently as we'd like?

Otherwise, I am not sure that calling out the disabling of FAIL_ON_UNKNOWN_PROPERTIES addresses a common security issue.

But if anyone has an idea for good wording, I would of course be open to adding it to the Javadocs for the annotation (@JsonIgnoreProperties.ignoreUnknown) and the feature (DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES).

@pjfanning
Member

@cowtowncoder I added benchmarks to https://github.com/pjfanning/jackson-number-parse-bench and I found that @JsonIgnoreProperties.ignoreUnknown seemed not to help performance, but that could have been a rogue run. I can try again later.

@cowtowncoder
Member

cowtowncoder commented Oct 23, 2022

@pjfanning Thank you for digging into this. Could you file a separate issue in jackson-databind to outline the problem here, so perhaps I can find time to look into it a bit more from the plain Java/databind side? Improvements to lazier parsing are good, but I feel I am missing something obvious, resulting in you having to dig into things I should be able to point out already. So I'd be happy to see if I can understand why there is some eager handling beyond (I think) the use of getNumberType() (which databind should only use if there is an actual target... including JsonNode / Object target types).

One thing to note on skipping unknown properties: this setting just prevents an exception from being thrown.
It is possible to ignore properties using other methods too (enumerating them with @JsonIgnoreProperties etc.), but fundamentally they all work the same way and call JsonParser.skipChildren(), which skips the current token (if scalar) or the current token and its "contents" (JSON Object or Array, if pointing to a START_xxx token).
This should be as efficient as it gets, BUT if any access has occurred before this -- like full decoding and construction of a BigInteger -- that cost would already have been incurred.
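
At the streaming level, that skipping path looks roughly like this (a hedged sketch of the mechanism, not databind's actual code):

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

public class SkipChildrenDemo {
    public static void main(String[] args) throws Exception {
        String json = "{\"s\":\"s\",\"n\":" + "7".repeat(100_000) + ",\"i\":1}";
        try (JsonParser p = new JsonFactory().createParser(json)) {
            p.nextToken(); // START_OBJECT
            while (p.nextToken() != JsonToken.END_OBJECT) {
                String field = p.currentName();
                p.nextToken(); // advance to the value token
                if ("s".equals(field)) {
                    System.out.println("s = " + p.getText());
                } else if ("i".equals(field)) {
                    System.out.println("i = " + p.getIntValue());
                } else {
                    // Skips the whole subtree for START_OBJECT/START_ARRAY; for a scalar
                    // it is a no-op and the loop simply moves past the value without
                    // ever calling getBigIntegerValue() on the huge number.
                    p.skipChildren();
                }
            }
        }
    }
}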

@cowtowncoder
Member

cowtowncoder commented Oct 23, 2022

@pjfanning Ok, looking at the test, I wonder if this is only due to the required buffering of the number value itself -- an integer with 1M digits is 1000x the length of one with 1k digits, and the performance difference seems to be a factor of 2000x. So the longer value is 50% slower than might be expected, given that the document content in both cases is 99.9%+ just one biggish integer.

Now, the overhead for the 1M-digit number is due to TextBuffer having to keep a copy (unless the input is all in a shareable buffer, which I don't think is the case with String input, nor in most real usages); and although buffer recycling could help, the biggest buffer retained is, I think, 256k characters (or something like that). So it may be that the copying of characters and the allocation of char[] buffer segments is where the overhead comes from.

Of course it'd be possible to implement fully lazy decoding (Woodstox does this, I think)... but that adds complexity.
And I am actually not sure that this case is problematic per se: performance is sub-linear but not, as far as I can see, exponentially bad or anything like that? Or did I misread the numbers?

EDIT: actually, lazy decoding is a bit tricky since we can have both rather long BigDecimals AND BigIntegers, reported as different tokens (JsonToken.VALUE_NUMBER_INT vs JsonToken.VALUE_NUMBER_FLOAT) -- and that detection must be done eagerly. And in the case of a big integer value, it does mean that the whole number must be read and retained. Meaning that for a really big integer, avoiding buffering would be difficult if not impossible with the current JsonParser API: there would need to be something to skip not just the current token but the next token -- `JsonParser.skipNextToken()`.

@pjfanning
Member

pjfanning commented Oct 24, 2022

The lazy parsing of BigInt/BigDec doesn't appear to help this jackson-module-scala use case but should help in the weePickle use case.

I had expected @JsonIgnoreProperties(ignoreUnknown = true) to help performance. jackson-databind's BeanDeserializer parses the numbers even for unknown props (this line). That line is skipped if you have @JsonIgnoreProperties(ignoreUnknown = true), but the benchmark seems no better.

Investigating the performance is not my priority because supporting unexpected fields is not the default config.

The next item I would like to get done is to limit the size of numbers. The bad perf only happens for giant numbers (10000+ chars) and most users would be better off banning JSON inputs like that.
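
For reference, a hedged sketch of what such a limit could look like; this assumes the StreamReadConstraints API that jackson-core later added in 2.15, which was not available when this comment was written:

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.StreamReadConstraints;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.json.JsonMapper;

public class NumberLimitDemo {
    public static void main(String[] args) {
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        .maxNumberLength(1000) // reject numbers longer than 1000 characters
                        .build())
                .build();
        ObjectMapper mapper = JsonMapper.builder(factory).build();

        String json = "{\"s\":\"s\",\"n\":" + "7".repeat(100_000) + ",\"i\":1}";
        try {
            mapper.readTree(json);
        } catch (Exception e) {
            // Expected: a constraints violation instead of seconds of BigInteger work
            System.out.println(e.getMessage());
        }
    }
}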

@cowtowncoder
Member

cowtowncoder commented Oct 24, 2022

Good investigation @pjfanning. Some notes:

First: buffering in this case is done if a "property-based" constructor is used. This is the case for (no pun intended) case classes. For "dumb" value classes with just fields and/or setters/getters, no buffering would be needed. So anyone wanting maximum performance may consider using such dumb value classes (and just for those).

Second, as you mentioned, there's an earlier check:

if (_ignoreAllUnknown) {
   // skips value
}

which should skip handling in some cases, and it sounds like the annotation does cause this to be true.

Features that would necessitate buffering (aside from matching one of the constructor parameters) include:

  • Use of "any setter"
  • Use of "unwrapping" (@JsonUnwrapped)

@cowtowncoder
Member

@pjfanning Ah ha! So that flag is ONLY set if the MapperConfigBase method

public final JsonIgnoreProperties.Value getDefaultPropertyIgnorals(Class<?> type) { ... }

returns a setting to indicate it. It uses:

  1. Annotation @JsonIgnoreProperties(ignoreUnknown = true) OR
  2. Equivalent ConfigOverride for class

but as far as I can see, the logic does NOT (unfortunately!) consider the global DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES.
From a performance perspective, it definitely should.

Now... I think I'll file an issue for this, but I am not sure it can be resolved for 2.14.0: it seems slightly risky to add a short-cut. There also needs to be a way to test this, somehow, to verify the detection logic.

Still, it is kind of odd that even when seemingly skipping content there is no real difference in performance.

@pjfanning
Member

pjfanning commented Dec 13, 2022

@plokhotnyuk @cowtowncoder if you add this annotation to the ExtractFields class, then the BigInteger is ignored and not parsed.

@JsonIgnoreProperties(ignoreUnknown = true)

jackson-databind skips the unexpected fields in vanilla mode, but the Scala classes are not treated as vanilla because there is no no-arg constructor to use. In non-vanilla mode, jackson-databind insists on parsing the unexpected fields unless you set the annotation above.

@plokhotnyuk
Author

plokhotnyuk commented Dec 13, 2022

@JsonIgnoreProperties(ignoreUnknown = true)

Unfortunately, it doesn't compile in the jsoniter-scala benchmark codebase:
plokhotnyuk/jsoniter-scala@0e218ce

The Scala 3 compiler fails to generate bytecode with the following error (which definitely should be reported at https://github.com/lampepfl/dotty/issues):

[error] scala.MatchError: NoType (of class dotty.tools.dotc.core.Types$NoType$)
[error] 	at dotty.tools.backend.jvm.BCodeHelpers.dotty$tools$backend$jvm$BCodeHelpers$$typeToTypeKind(BCodeHelpers.scala:844)
[error] 	at dotty.tools.backend.jvm.BCodeHelpers$BCInnerClassGen.toTypeKind(BCodeHelpers.scala:285)
[error] 	at dotty.tools.backend.jvm.BCodeHelpers$BCInnerClassGen.toTypeKind$(BCodeHelpers.scala:213)
[error] 	at dotty.tools.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.toTypeKind(BCodeSkelBuilder.scala:74)
[error] 	at dotty.tools.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.symInfoTK(BCodeSkelBuilder.scala:103)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:415)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:290)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1166)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genApply(BCodeBodyBuilder.scala:768)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:369)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:290)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genArray(BCodeBodyBuilder.scala:865)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genArrayValue(BCodeBodyBuilder.scala:850)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:457)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:290)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1166)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genApply(BCodeBodyBuilder.scala:802)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:369)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:290)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1166)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genApply(BCodeBodyBuilder.scala:802)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:369)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:290)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:285)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadQualifier(BCodeBodyBuilder.scala:1150)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genTypeApply(BCodeBodyBuilder.scala:668)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:467)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:449)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:290)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1166)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genApply(BCodeBodyBuilder.scala:802)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:369)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:290)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1166)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1167)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1167)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1167)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:1167)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genApply(BCodeBodyBuilder.scala:746)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:369)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:290)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genStat(BCodeBodyBuilder.scala:111)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genBlockTo$$anonfun$1(BCodeBodyBuilder.scala:1052)
[error] 	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
[error] 	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
[error] 	at scala.collection.immutable.List.foreach(List.scala:333)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genBlockTo(BCodeBodyBuilder.scala:1052)
[error] 	at dotty.tools.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadTo(BCodeBodyBuilder.scala:441)
[error] 	at dotty.tools.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.emitNormalMethodBody$1(BCodeSkelBuilder.scala:809)
[error] 	at dotty.tools.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.genDefDef(BCodeSkelBuilder.scala:832)
[error] 	at dotty.tools.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.gen(BCodeSkelBuilder.scala:615)
[error] 	at dotty.tools.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.gen$$anonfun$1(BCodeSkelBuilder.scala:621)
[error] 	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
[error] 	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
[error] 	at scala.collection.immutable.List.foreach(List.scala:333)
[error] 	at dotty.tools.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.gen(BCodeSkelBuilder.scala:621)
[error] 	at dotty.tools.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.genPlainClass(BCodeSkelBuilder.scala:233)
[error] 	at dotty.tools.backend.jvm.GenBCodePipeline$Worker1.visit(GenBCode.scala:266)
[error] 	at dotty.tools.backend.jvm.GenBCodePipeline$Worker1.run(GenBCode.scala:231)
[error] 	at dotty.tools.backend.jvm.GenBCodePipeline.buildAndSendToDisk(GenBCode.scala:598)
[error] 	at dotty.tools.backend.jvm.GenBCodePipeline.run(GenBCode.scala:564)
[error] 	at dotty.tools.backend.jvm.GenBCode.run(GenBCode.scala:69)
[error] 	at dotty.tools.dotc.core.Phases$Phase.runOn$$anonfun$1(Phases.scala:316)
[error] 	at scala.collection.immutable.List.map(List.scala:250)
[error] 	at dotty.tools.dotc.core.Phases$Phase.runOn(Phases.scala:320)
[error] 	at dotty.tools.backend.jvm.GenBCode.runOn(GenBCode.scala:77)
[error] 	at dotty.tools.dotc.Run.runPhases$1$$anonfun$1(Run.scala:233)
[error] 	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
[error] 	at scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
[error] 	at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1321)
[error] 	at dotty.tools.dotc.Run.runPhases$1(Run.scala:244)
[error] 	at dotty.tools.dotc.Run.compileUnits$$anonfun$1(Run.scala:252)
[error] 	at dotty.tools.dotc.Run.compileUnits$$anonfun$adapted$1(Run.scala:261)
[error] 	at dotty.tools.dotc.util.Stats$.maybeMonitored(Stats.scala:68)
[error] 	at dotty.tools.dotc.Run.compileUnits(Run.scala:261)
[error] 	at dotty.tools.dotc.Run.compileSources(Run.scala:185)
[error] 	at dotty.tools.dotc.Run.compile(Run.scala:169)
[error] 	at dotty.tools.dotc.Driver.doCompile(Driver.scala:35)
[error] 	at dotty.tools.xsbt.CompilerBridgeDriver.run(CompilerBridgeDriver.java:88)
[error] 	at dotty.tools.xsbt.CompilerBridge.run(CompilerBridge.java:22)
[error] 	at sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:91)
[error] 	at sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$7(MixedAnalyzingCompiler.scala:193)
[error] 	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] 	at sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:248)
[error] 	at sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4(MixedAnalyzingCompiler.scala:183)
[error] 	at sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4$adapted(MixedAnalyzingCompiler.scala:163)
[error] 	at sbt.internal.inc.JarUtils$.withPreviousJar(JarUtils.scala:239)
[error] 	at sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:163)
[error] 	at sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:209)
[error] 	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:534)
[error] 	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:534)
[error] 	at sbt.internal.inc.Incremental$.$anonfun$apply$5(Incremental.scala:179)
[error] 	at sbt.internal.inc.Incremental$.$anonfun$apply$5$adapted(Incremental.scala:177)
[error] 	at sbt.internal.inc.Incremental$$anon$2.run(Incremental.scala:463)
[error] 	at sbt.internal.inc.IncrementalCommon$CycleState.next(IncrementalCommon.scala:116)
[error] 	at sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:56)
[error] 	at sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:52)
[error] 	at sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:263)
[error] 	at sbt.internal.inc.Incremental$.$anonfun$incrementalCompile$8(Incremental.scala:418)
[error] 	at sbt.internal.inc.Incremental$.withClassfileManager(Incremental.scala:506)
[error] 	at sbt.internal.inc.Incremental$.incrementalCompile(Incremental.scala:405)
[error] 	at sbt.internal.inc.Incremental$.apply(Incremental.scala:171)
[error] 	at sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:534)
[error] 	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:488)
[error] 	at sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:332)
[error] 	at sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:425)
[error] 	at sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:137)
[error] 	at sbt.Defaults$.compileIncrementalTaskImpl(Defaults.scala:2363)
[error] 	at sbt.Defaults$.$anonfun$compileIncrementalTask$2(Defaults.scala:2313)
[error] 	at sbt.internal.server.BspCompileTask$.$anonfun$compute$1(BspCompileTask.scala:30)
[error] 	at sbt.internal.io.Retry$.apply(Retry.scala:46)
[error] 	at sbt.internal.io.Retry$.apply(Retry.scala:28)
[error] 	at sbt.internal.io.Retry$.apply(Retry.scala:23)
[error] 	at sbt.internal.server.BspCompileTask$.compute(BspCompileTask.scala:30)
[error] 	at sbt.Defaults$.$anonfun$compileIncrementalTask$1(Defaults.scala:2311)
[error] 	at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] 	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] 	at sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error] 	at sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error] 	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error] 	at sbt.Execute.work(Execute.scala:291)
[error] 	at sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error] 	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] 	at sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error] 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] 	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
[error] 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[error] 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[error] 	at java.base/java.lang.Thread.run(Thread.java:833)

The Scala 2 compiler starts to entangle the magnolia macros of zio-json, which leads to the generation of invalid codecs that fail tests (that is why I don't use Java annotations for the benchmarked data types at all; this is probably worth reporting here: https://github.com/scala/bug/issues).

@pjfanning
Member

@plokhotnyuk I have https://github.com/pjfanning/jackson-scala-bench and this seems to run with Scala 2.13 and Scala 3.2.1.

@pjfanning
Member

@cowtowncoder is there another way to ignore unknown properties for cases where users disable failure on unknown properties? @JsonIgnoreProperties(ignoreUnknown = true) is useful, but with Scala the generated Java classes for the Scala code can be complicated - there can be multiple Java class files created.
Something like a DeserializationFeature that can be set on the ObjectMapper would also be useful.
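
One existing alternative, mentioned earlier in this thread, is a per-class config override set on the ObjectMapper, which avoids annotating the (possibly compiler-generated) classes; a hedged sketch:

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ConfigOverrideDemo {
    public static class ExtractFields {
        public String s;
        public int i;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Equivalent of @JsonIgnoreProperties(ignoreUnknown = true), applied per class
        // through configuration rather than an annotation.
        mapper.configOverride(ExtractFields.class)
              .setIgnorals(JsonIgnoreProperties.Value.forIgnoreUnknown(true));

        String json = "{\"s\":\"s\",\"n\":777,\"i\":1}";
        ExtractFields ef = mapper.readValue(json, ExtractFields.class);
        System.out.println(ef.s + " " + ef.i); // the unknown "n" field is ignored
    }
}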

@pjfanning
Member

Closing in favour of FasterXML/jackson-databind#3721.
