
Strange (bad) pixel accuracy when resolution being used is 640 x 480 (and other "big" values) #144

Closed
friguron opened this issue Mar 4, 2021 · 4 comments
Labels: invalid (This doesn't seem right), question (Usability question)

Comments


friguron commented Mar 4, 2021

Hi all,

I'm a real newbie in Rust (with a background in many other languages over the years; C is the one I started with, decades ago). These days my main everyday language is Swift. Anyway, after this short introduction, let's get to the point:

The thing is, I have some 2D/retro ideas to develop in Rust, and I've come across "pixels" (among the rest of the pixel-related troupe of crates).

What is the problem I've detected? Something strange: some pixels are painted at "double width/height" at multiples of a certain value (the value depends on the selected window resolution). As it's kind of hard to explain, I've set up a demo project for people to test.

I've simplified my (aesthetically) faulty program into this small git project: test project. (Please ignore the warnings about unused variables, bad Rust skills, etc... I know they're there.)

By now I've pared it down to a main loop and a silly mouse painter... The mouse-drawing part was optional, but I left it in because of the debugging potential it adds.
I've also drawn this pattern onto the top-left part of my pixel buffer to test it without mouse interference, just cold putpixels:

1.1.1.1.1.1.1.1.1.1
.1.1.1.1.1.1.1.1.1.
1.1.1.1.1.1.1.1.1.1
.1.1.1.1.1.1.1.1.1.
1.1.1.1.1.1.1.1.1.1
.1.1.1.1.1.1.1.1.1.
1.1.1.1.1.1.1.1.1.1
.1.1.1.1.1.1.1.1.1.
1.1.1.1.1.1.1.1.1.1
.1.1.1.1.1.1.1.1.1.
1.1.1.1.1.1.1.1.1.1
.1.1.1.1.1.1.1.1.1.

What am I typically getting? Three groups of situations:

a) Resolutions that show faulty pixel accuracy as soon as they are set up, with no need to resize the window to trigger the faulty render
(of course, when you resize these windows, you also get the faulty effect, just at a bigger size).
Typically, 640 x 480 (and bigger sizes) trigger this mode.

b) Resolutions that show pixel-perfect accuracy when set up, but show faulty behaviour when the window is resized (no matter whether bigger or smaller).
320 x 200 falls into this category.

c) Resolutions that show pixel-perfect accuracy when set up AND after resizing the window (160 x 100, for example).
No matter how big you make the window, you get proportional pixels, unlike in the other cases.

In the initial lines of my main function you can find my resolution-setup code... I hope everything is properly set up there...

All this started to smell fishy (and behave not as intended) with the "simple" 640 x 480 mode, where every 4 pixels a column (or row, it doesn't matter) becomes a "doubled pixel" when lines are drawn... (or when putpixels are shown, as in the top-left corner).

If you enlarge a 640 x 480 window you can see the effect in more detail (especially if you've also drawn with the mouse).

OTOH, if you boot with 320 x 200, you'll find everything works PIXEL PERFECT at boot time (while the window is not resized; I made screen captures to double-check it), but the pixel accuracy becomes faulty on the X axis if you enlarge the window (the Y axis shows perfectly uniform pixel heights, no matter the window size).

Finally, if you play with 160 x 100 (or similarly small values), you get pixel-perfect results at boot time AND after resizing the window, so that's the mystery I'm dealing with...


So:

1.- Is it me? I admit I'm a newbie not only with Rust but also with all the crates involved (winit, pixels, wgpu, etc...)
Maybe I've forgotten something in the init part of my code that's needed for the expected pixel-perfect behaviour?

2.- Is it my Hackintosh/Nvidia combo? (I can't test it on Windows/Linux as of now.) I suppose the wgpu crate is using Metal in the low-level layer... How can I know? Is wgpu using OpenGL, maybe?
In any case, no other program shows this behaviour whatsoever. I should add that I've also checked my program on my job's actual Mac mini, and the Iris driver also shows the faulty behaviour, so...

3.- Is it Pixels itself? Can it be some underlying crate instead? (wgpu being the main candidate?)

I hope everything is clear (I'm aware some explanations can be a bit obscure), but I hope people can play with my program and see my point easily.

I'm open to further testing/questions if someone can shed some light and/or insightful ideas for me to test.

Thanks for your time and greetings!!

PS: As I was checking my test project's code I came across the following fact: whenever the faulty resolution/axis appears, that axis has an original resolution value > 256... Could it be something related to crossing 1-byte boundaries somewhere? (Crazy idea.)

parasyte added the question label Mar 4, 2021
parasyte (Owner) commented Mar 4, 2021

Hi! Thanks for providing the git repo for troubleshooting. It has made my job very easy, I appreciate it.

The issue you are running into is the window manager stretching the surface texture to fit the window's inner size. By default, the window is created with 1600x1200 inner resolution on macOS (at least on my laptop). You try to correct this by calling window.set_inner_size() ... but this function resizes the window asynchronously. So calling window.inner_size() immediately afterward returns the old size, which is not an integer scale ratio of your pixel buffer size. This non-integer ratio is what leads to the blurriness in the presented image.
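To make the scale-ratio point concrete, here is a minimal sketch (hypothetical helper functions, not from the repo) of checking whether a surface size is an integer multiple of the pixel buffer size, using the widths from this issue:

```rust
// Hypothetical helpers illustrating the integer-scale check described above.
fn scale_ratio(surface: u32, buffer: u32) -> f64 {
    surface as f64 / buffer as f64
}

fn is_integer_scale(surface: u32, buffer: u32) -> bool {
    surface % buffer == 0
}

fn main() {
    // A 1600-wide surface vs. a 640-wide pixel buffer: ratio 2.5, not an
    // integer, so the stretched image cannot keep every pixel the same size.
    assert_eq!(scale_ratio(1600, 640), 2.5);
    assert!(!is_integer_scale(1600, 640));

    // The same surface vs. a 160-wide buffer: ratio 10, a clean integer
    // scale, which is consistent with 160 x 100 looking pixel perfect.
    assert_eq!(scale_ratio(1600, 160), 10.0);
    assert!(is_integer_scale(1600, 160));
}
```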

Yeah, those weird pixel artifacts are just from bilinear scaling... it's blurriness. :)

I fixed this by building the window with the appropriate inner size (the same way the winit example does it).

The other thing I fixed is the resize handler. It has two problems:

  1. It isn't resizing the surface texture renderer maintained by pixels, so the same stretching will happen when you resize the window.
  2. Even if the surface texture is properly resized, it will not be redrawn unless needs_redraw is set.

Altogether, here are the changes that improve your integration with winit:

diff --git a/src/main.rs b/src/main.rs
index 2bcacf7..ea19450 100644
--- a/src/main.rs
+++ b/src/main.rs
@@ -281,16 +281,15 @@ fn main() -> Result<(), Error>   {

     let event_loop = EventLoop::new();

+    let default_size = LogicalSize::new(config.selected_resolution.x, config.selected_resolution.y);

     let window = WindowBuilder::new()
         .with_title(format!("RUST WGPU PIXELS TEST") )
+        .with_inner_size(default_size)
         .build(&event_loop)
         .unwrap();


-    let default_size = LogicalSize::new(config.selected_resolution.x , config.selected_resolution.y);
-    window.set_inner_size(default_size);
-
     let window_size = window.inner_size();
     let surface_texture = SurfaceTexture::new(window_size.width, window_size.height, &window);

@@ -406,10 +405,11 @@ fn main() -> Result<(), Error>   {
             }


-            if input.window_resized() != None
+            if let Some(window_size) = input.window_resized()
             {
-                let window_size = window.inner_size();
                 window.set_title(format!("orig:[{},{}], resized:[{} x {}]", original_x, original_y, window_size.width, window_size.height).as_str());
+                pixels.resize(window_size.width, window_size.height);
+                needs_redraw = true;
             }

             if(needs_redraw)

And now it looks pretty dang good, and resizing works properly.

parasyte (Owner) commented Mar 4, 2021

Also, if you want to correctly map the mouse cursor coordinates when the window is resized (black border around the image), you can use pixels.window_pos_to_pixel(). Below is an example:

            if input.mouse_pressed(0) {
                if let Some(pos) = input.mouse() {
                    let pos: (usize, usize) = pixels.window_pos_to_pixel(pos)
                        .unwrap_or_else(|pos| pixels.clamp_pixel_pos(pos));

                    previous_mouse_x = pos.0 as u16;
                    previous_mouse_y = pos.1 as u16;
                }
            }

            if input.mouse_held(0) {
                if let Some(pos) = input.mouse() {
                    let pos: (usize, usize) = pixels.window_pos_to_pixel(pos)
                        .unwrap_or_else(|pos| pixels.clamp_pixel_pos(pos));

                    let corr_x = pos.0 as u16;
                    let corr_y = pos.1 as u16;

                    Canvas::draw_line( corr_x, corr_y, previous_mouse_x, previous_mouse_y, config.fg_index_color,&mut indexed_pixels, &config);
                    needs_redraw=true;
                    previous_mouse_x = corr_x;
                    previous_mouse_y = corr_y;
                }
            }

friguron (Author) commented Mar 4, 2021

Fantastic answer, thanks a lot. You hit the nail on the head.

First of all, and as was to be expected, the problem was me not knowing the nuances of the "default size" and "inner size" setup in the initialization lifecycle... Now that you've explained it, my problems went away with just the first part of the fix.

BTW, I would say my pixel artifact problems weren't blurring artifacts... They were "nearest-neighbour rescaling" ones, or something similar, as they already appeared without rescaling anything...
Not exactly the same thing as proper blurring artifacts, which appear when I enlarge my window. In fact that behaviour is somewhat expected...

The thing is, I remember I arrived at my solution after some creative (?) simplification of the conway example... Apparently I got lost somewhere in between :)

OTOH, my window-resize code was there just to change the window title and nothing more. My original behaviour is the intended one, so I won't add your code, as it adds all that extra black space which I don't need (at least for now).
I already expected the blurriness of a scaled bitmap when enlarging mine, so I won't act against it either... Maybe, in a future big refactor, I'll change the render logic to add some crystal-clear 2x or 3x pixel rendering mode, etc... But again, that's on a really distant timeline in my plan, not now. All good.

OTOH 2: as I still haven't studied the pixels crate's API in depth, I didn't realize it already offered coordinate translation for me to use. I'll adapt my code to use it, as the resulting code is really cleaner.

Again, thanks for your support.

Now, if some other user comes here looking for help, they will find all this insight into properly using your crate.

Greetings!!

parasyte (Owner) commented Mar 5, 2021

Is it my Hackintosh/Nvidia combo? (I can't test it on Windows/Linux as of now.) I suppose the wgpu crate is using Metal in the low-level layer... How can I know? Is wgpu using OpenGL, maybe?

You had this other question that I neglected to answer, sorry about that! I think I covered everything else, though.

The short answer is that wgpu only supports the Metal backend on macOS (that includes Hackintosh).

The long answer is that wgpu supports selecting a backend based on a user-definable preference (see PixelsBuilder::wgpu_backend), and it also has its own builtin precedence for backend selection. However, this is really only used on Linux or Windows, because wgpu only supports the Metal backend on macOS. You can see this in the build script: https://github.com/gfx-rs/wgpu/blob/ad2e1b6ff8e7375d640b4c78b5a495dfa3dcf0a1/wgpu-core/build.rs#L13-L18

This is fine for most users because Metal is very well-supported and OpenGL has been deprecated on macOS since 10.14. It appears that the highest version of OpenGL ever supported on macOS was 4.1, which is about 10 years old. The gfx_backend_gl crate documentation also explicitly calls out that it doesn't support Apple platforms... Although that is a semi-recent change since they switched the surface manager to EGL. (The tracking ticket also has some additional context on the state of OpenGL support in wgpu, including no OpenGL support on Windows.)

Anyway, yeah. That should be a fairly thorough answer.


BTW, I would say my pixel artifact problems weren't blurring artifacts... They were "closest neighbour rescaling" ones, or something similar... As they already appeared without rescaling anything...
Not exactly the same thing as proper blurring artifacts when enlarging, which appear if I resize my window. In fact this behaviour is somewhat expected...

These artifacts come from a process that is roughly equivalent to a Gaussian blur, because scaling an image distorts its frequency information by definition. That is admittedly a technicality, but it's why I called it blurring.

Let's test a theory: this may be a difference in driver implementation or something else, but what I see from the provided example code is definitely not nearest neighbor. Here's a screen capture of the checkerboard pattern in the window, scaled up 5x (with nearest neighbor) for detail:

[screenshot: bad-scaling]

What would it look like with nearest neighbor scaling? Well, let's find the original ratio of the surface to the pixel buffer. The window is created with a 1600x1200 inner size, this is in physical pixel units. Divide the pixel buffer size into this, and we get a ratio of 2.5. If I create a 20x20 black and white checkerboard pattern in GIMP and scale it to 250% with no interpolation (i.e. nearest neighbor) then scale that result 4x, we get this simulation of the scaled checkerboard pattern:

[screenshot: nearest-neighbor]

The difference is visually pretty clear. A more accurate description is that nearest neighbor will not "imagine" new colors by interpolating between pixels. It simply chooses one source color or the other, whichever neighbor is nearest to the sampling position. This causes distortion of the shape of the image, e.g. some black squares are big and some are small, but not distortion of colors. The only colors in the simulation are black and white; there are several shades of gray in the original screenshot.
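The no-new-colors property of nearest neighbor can be demonstrated in one dimension (a hypothetical sketch, not the actual wgpu scaler):

```rust
// Nearest-neighbor scaling of a 1D black/white checkerboard row.
// Each output sample just picks the closest source pixel, so no new
// colors can appear -- only 0 (black) and 255 (white) survive.
fn scale_nearest(src: &[u8], factor: f64) -> Vec<u8> {
    let out_len = (src.len() as f64 * factor).round() as usize;
    (0..out_len)
        .map(|i| {
            let src_i = (i as f64 / factor).floor() as usize;
            src[src_i.min(src.len() - 1)]
        })
        .collect()
}

fn main() {
    // 20-pixel alternating black/white row, scaled by the 2.5 ratio above.
    let row: Vec<u8> = (0..20).map(|i| if i % 2 == 0 { 0 } else { 255 }).collect();
    let scaled = scale_nearest(&row, 2.5);

    // Squares distort in width (runs of 2 and 3 alternate), but every
    // output pixel is still pure black or pure white -- no grays.
    assert_eq!(scaled.len(), 50);
    assert!(scaled.iter().all(|&p| p == 0 || p == 255));
}
```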


So that's not the end of the story. What is shown in the screenshot is actually a result of multiple scaling steps. First, pixels scales with nearest neighbor (to retain sharp pixel edges) but it does so by carefully adding a border if necessary to maintain sharpness even when the scaling ratio is not a whole number. There was a bug fixed recently that addressed an issue with the creation of a surface that was not an integer scale size of the pixel buffer. This bug would lead to the resulting image looking like the simulated scaling. And indeed, if I remove the asynchronous window resize from the code, this is produced:

[screenshot: bad-scaling-2]

It looks identical to the simulation! This shouldn't be surprising... But what about all of the gray pixels from the first screenshot? Well, that's caused by the linear scaling that the macOS window manager does. The asynchronous window resize reduced the image size from 1600x1200 to 1280x960, which is a ratio of 0.8. Here's that wonky simulated image again, where the checkerboard pattern was scaled up 250% with nearest neighbor, but this time I've scaled it down by 80% using linear interpolation, then scaled the result up by 5x (nearest neighbor) so we can see all the pixels:

[screenshot: nearest-neighbor-2]

Looks like a close enough approximation, to me. So that confirms what I was seeing, at least in part; bilinear scaling is absolutely involved, and it is blurring pixels. The blur makes all kinds of shades of gray.
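The gray-producing step can be sketched the same way: linear interpolation (a 1D stand-in for the window manager's bilinear scaling) invents intermediate shades when downscaling a hard-edged checkerboard by the 0.8 ratio from above (again a hypothetical sketch, not the real compositor code):

```rust
// 1D linear-interpolation downscale: each output sample blends its two
// nearest source pixels, so black/white edges produce in-between grays.
fn scale_linear(src: &[u8], factor: f64) -> Vec<u8> {
    let out_len = (src.len() as f64 * factor).round() as usize;
    (0..out_len)
        .map(|i| {
            let pos = i as f64 / factor;
            let lo = pos.floor() as usize;
            let hi = (lo + 1).min(src.len() - 1);
            let t = pos - lo as f64;
            (src[lo] as f64 * (1.0 - t) + src[hi] as f64 * t).round() as u8
        })
        .collect()
}

fn main() {
    // A nearest-neighbor-upscaled checkerboard row: hard black/white runs.
    let upscaled: Vec<u8> = (0..50).map(|i| if (i / 3) % 2 == 0 { 0 } else { 255 }).collect();

    // Shrinking it by 0.8 (the 1600 -> 1280 ratio) invents gray pixels.
    let shrunk = scale_linear(&upscaled, 0.8);
    assert!(shrunk.iter().any(|&p| p != 0 && p != 255));
}
```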

This may actually be very different from what you were seeing. For all I know, maybe your Hackintosh was doing nearest neighbor in the window manager scaling step, too.

The most likely scenario, though, is that we are both right. 😅

parasyte closed this as completed Mar 9, 2021
parasyte added the invalid label and removed the question label Apr 17, 2021