
Gamut mapping algorithm seems to be incorrect #91

Closed
facelessuser opened this issue May 4, 2021 · 11 comments

@facelessuser
Collaborator

This is really more of a question.

I know color.js defaults to gamut mapping instead of naive clipping, which is cool, and I understand that this helps give more accurate colors when converting from a large color space to a smaller one. color.js also generally recommends against simple clipping, but from what I've observed, gamut mapping seems to give choppy results in interpolation.

You can kind of see the solid bands and such instead of a smooth transition.

[Screenshot: gradient interpolation with gamut mapping, showing banding]

We can see here that clipping gives much smoother results (it was easier to show both in this fashion). The second gradient is the clipped one:

[Screenshot: both gradients; the second, clipped one is smoother]

It seems that gamut mapping isn't always the best choice. For instance, if you are in a smaller gamut and want to interpolate in a larger space, clipping may actually be better. In the above scenario, the interpolation ramps up to the gamut limit, so clipping simply holds the color at that limit until it drops back into gamut. Gamut mapping in these situations can create a discontinuity, because the chroma compression can give a noticeably different color than the one you were transitioning through.

I guess my question is: am I missing something? Is there some kind of flaw in my thinking?

@facelessuser
Collaborator Author

facelessuser commented May 7, 2021

I'm now convinced that there is a flaw in the gamut mapping algorithm. I spent some time actually going over it, and the issue, it seems to me, is related to the second condition in this loop:

while ((high - low > ε) && (error < base_error)) {

When I removed that, I found that I get a smooth transition:

[Screenshot: smooth gradient after removing the second condition]

I think the algorithm is kicking out too soon sometimes.

@facelessuser
Collaborator Author

It cleans up gradients a lot in the color.js docs as well:

Before:

[Screenshot: docs gradients before the change]

After:

[Screenshot: docs gradients after the change]

I'm happy to create a pull request, but I'll wait for a response to see if maybe I'm missing something.

facelessuser changed the title from "Why should one gamut map in interpolation instead of clipping?" to "Gamut mapping algorithm seems to be incorrect" on May 7, 2021
@svgeesus
Member

> I'm now convinced that there is a flaw in the gamut mapping algorithm. I spent some time actually going over it, and the issue, it seems to me, is related to the second condition in this loop:

@danburzo also noticed this: Gamut mapping: clarification of chroma reduction algorithm

I agree, and need to overhaul this. The idea is, on each chroma-reduction iteration, to check whether one is

  1. just outside the gamut boundary, and
  2. close enough that per-component clipping will not be visually noticeable (low deltaE2000),

and, if so, return the clipped value (roughly as sketched below).
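For illustration, a minimal Python sketch of that loop; the helpers in_gamut(), clip(), and delta_e_2000() are hypothetical stand-ins, and this is not the actual color.js or coloraide implementation:

    # Hypothetical helpers assumed here (not real library APIs):
    #   in_gamut(lch, space)  -> True if the LCH color fits in `space`
    #   clip(lch, space)      -> the color with each component clipped to `space`
    #   delta_e_2000(c1, c2)  -> perceptual distance between two colors
    JND = 2          # deltaE2000 "just noticeable difference"
    EPSILON = 0.001  # resolution of the bisection on the chroma axis

    def fit_lch_chroma(lch, space):
        l, c, h = lch
        low, high = 0.0, c
        while high - low > EPSILON:
            mid = (low + high) / 2
            candidate = (l, mid, h)
            if in_gamut(candidate, space):
                low = mid  # still inside the gamut: try keeping more chroma
            else:
                clipped = clip(candidate, space)
                if delta_e_2000(clipped, candidate) < JND:
                    # Just outside the boundary and clipping is not noticeable.
                    return clipped
                high = mid  # too far outside: reduce chroma further
        return clip((l, low, h), space)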

@facelessuser
Collaborator Author

If it helps, I refactored the algorithm and removed some obviously dead code: https://github.com/facelessuser/coloraide/blob/master/coloraide/color/gamut/lch_chroma.py. It performs much better for me.

I have a number of examples here: https://facelessuser.github.io/coloraide/interpolation/. Obviously, some of the grayscale interpolations could use a higher resolution of stops, since grayscale is more sensitive and requires a lower Delta E between stops, but keeping the resolution a little lower gives better responsiveness when people are goofing around and editing live, as Python in the browser isn't going to be super fast 🙃.

@facelessuser
Collaborator Author

I did notice in the linked issue (#63) the argument about the check below and how it doesn't actually trigger. That is also what I found, which is why I removed it:

            if abs(delta - 2) < EPSILON:
                # We've found the boundary
                break

TBH, it only triggers if it is moved up to sit under if (delta - 2) < EPSILON: (i.e., without the abs()). Then the code triggers, but the difference seems minimal; it's more slight differences in rounding. It is also possible it is tailored to very specific cases that I have not tested.

I can see maybe playing with the thresholds to smooth some things further (a lower Delta E target), but generally things seem much smoother than they did before.

I have not compared results with the displayable(color) alternative yet.

@facelessuser
Collaborator Author

I am starting to question whether the enhancement of adding all the Delta E 2000 comparisons is worth the cost:

Here, I lowered the Delta E threshold to a target of 1 to get "closer", since I wanted a kind of best-case scenario (even if it is expensive). Whether the difference is easily seen in the picture below is debatable, but the results do seem a bit smoother. But then, when just checking whether the result is "displayable/in gamut", the results are just as good, or maybe even a little smoother.

  • First: Current revised lch-chroma that I referenced earlier
  • Second: Same as first, but lowered Delta E target to 1
  • Third: Just checking if the color is in gamut (we do a single Delta E check just to see if we can shortcut, clip, and kick out before bisecting).

[Screenshot: the three gradients described above]

Even if this is just a trick of the eye, I'm wondering if the cost of all the Delta E checks is worth it as those checks are expensive. I also acknowledge that there may be very specific cases where the Delta E checks give much better results, but this is just a first, quick evaluation.

Anyways, maybe some of this information is useful. It is making me question whether the Delta E checks really give a big enough improvement to warrant the cost (at least in reference to using them while bisecting; using one as a shortcut doesn't seem too bad). I would definitely be interested in knowing if there are certain known worst-case scenarios.
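For comparison, a Python sketch of the cheaper variant from the third bullet above, reusing the same hypothetical helpers and constants as the earlier sketch: a single deltaE2000 check up front as a clip shortcut, then a plain in-gamut bisection with no per-step deltaE:

    def fit_in_gamut_only(lch, space):
        l, c, h = lch
        clipped = clip(lch, space)
        # One deltaE2000 check up front: if clipping is already close enough,
        # shortcut, return the clipped color, and skip the bisection entirely.
        if delta_e_2000(clipped, lch) < JND:
            return clipped
        low, high = 0.0, c
        while high - low > EPSILON:
            mid = (low + high) / 2
            if in_gamut((l, mid, h), space):
                low = mid   # candidate chroma still fits: keep more of it
            else:
                high = mid  # candidate chroma is out of gamut: reduce it
        return clip((l, low, h), space)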

@svgeesus
Member

Is this for the case where, say in LCH, L is > 100 so always out of gamut regardless of chroma?

    # If flooring chroma doesn't work, just clip the floored color
    # because there is no optimal compression.
    floor = color.clone().set('lch.chroma', 0)
    if not floor.in_gamut():
        return floor.fit(method="clip").coords()

@svgeesus
Member

DeltaE 2000 is indeed expensive; it is optimized for giving comparable, consistent values for small differences regardless of where you are in the colorspace. So basically it is compensating for known deficiencies of Lab like the blue hue nonlinearity, the exaggeration of deltaE76 for high chroma colors, etc.

Using a better colorspace, such as OKLab, also gives you a less expensive deltaE because it is just euclidean distance. I'm examining that now.
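For example, a sketch of that distance, assuming both colors have already been converted to OKLab and are given as (L, a, b) tuples:

    import math

    def delta_e_ok(oklab1, oklab2):
        """Euclidean distance between two OKLab colors given as (L, a, b) tuples."""
        dl = oklab1[0] - oklab2[0]
        da = oklab1[1] - oklab2[1]
        db = oklab1[2] - oklab2[2]
        return math.sqrt(dl * dl + da * da + db * db)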

@facelessuser
Collaborator Author

> Is this for the case where, say in LCH, L is > 100 so always out of gamut regardless of chroma?

I guess so, if we are strictly talking about colors that are already kind of in the acceptable CSS range, though any color past that range would qualify as well.

I think I've expressed this before, but generally fitting a color from one space into the gamut of another isn't always perfect, especially if the conversion isn't "perfect". Even though this color is clipped perfectly in sRGB, the trip back to LCH (even with the much better calculated matrices I'm using now) can still give you something a little off, which can be exaggerated in other spaces:

>>> Color('lch(100% 0 0)').fit('srgb').convert('srgb').coords()
[0.9999999999999994, 1.0000000000000002, 0.9999999999999997]
>>> Color('lch(100% 0 0)').fit('srgb').convert('hsl').coords()
[137.14285714285714, 200.0, 99.99999999999997]
>>> Color('lch(100% 0 0)').fit('hsl').convert('hsl').coords()
[140.0, 75.0, 99.99999999999996]

Anyways, that's another topic, and you can probably only solve that by rounding things off. My real concern was more about not wasting time going through the bisecting if we already know it isn't worth optimizing.

> DeltaE 2000 is indeed expensive; it is optimized for giving comparable, consistent values for small differences regardless of where you are in the colorspace. So basically it is compensating for known deficiencies of Lab like the blue hue nonlinearity, the exaggeration of deltaE76 for high chroma colors, etc.
>
> Using a better colorspace, such as OKLab, also gives you a less expensive deltaE because it is just euclidean distance. I'm examining that now.

Very cool. For usability, if the choice is between DeltaE 2000 and just checking if the color is in gamut, then unless there are specific cases where things are just way off, I'm thinking the in-gamut check may be "good enough". But if there are scenarios where Delta E does much, much better, and there were a far better, less expensive DeltaE available, that would definitely be really good.

@facelessuser
Collaborator Author

I see that color.js has been updated. Gamut mapping looks much better now 🙂.

I also did more testing, and I did find some cases where using simple gamut checks was not ideal, so I think I will stick with Delta E as well 🙂. It may be more expensive, but I agree it gives the overall best results.

@svgeesus
Member

I just updated it again: if the gamut mapping space is OKLCH, it now uses the (better, also cheaper) deltaEOK rather than deltaE2000. (It also adjusts the JND value, in that case, from 2 to 0.02.)

The default space for gamut mapping is also OKLCH now.
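Roughly, as a sketch of that idea (not the actual color.js code; delta_e_ok is the OKLab distance sketched earlier and delta_e_2000 is the hypothetical helper used above), the metric and the JND get picked together based on the mapping space:

    def pick_metric(mapping_space):
        if mapping_space == 'oklch':
            # OKLab lightness runs 0-1 rather than 0-100, so the JND scales down too
            return delta_e_ok, 0.02
        return delta_e_2000, 2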
