
pixels panics when using egui #351

Open
tseli0s opened this issue Mar 25, 2023 · 6 comments
Labels
info needed More information is required

Comments


tseli0s commented Mar 25, 2023

For some reason, when I simply try to create a UI using egui, I get this panic:

[2023-03-25T22:10:29Z ERROR wgpu::backend::direct] Handling wgpu errors as fatal by default
thread 'main' panicked at 'wgpu error: Validation Error

Caused by:
    In a RenderPass
      note: encoder = `pixels_command_encoder`
    In a set_scissor_rect command
    Scissor Rect { x: 0, y: 0, w: 1024, h: 768 } is not contained in the render target Extent3d { width: 256, height: 192, depth_or_array_layers: 1 }
' (...)

The examples work fine, but this doesn't, although I basically copied half of the code from the examples.

If I reduce the size of the window in the screen descriptor to 256x192 (as the error seems to want me to do), it works, but then only that part of the window is actually used; the rest is left unused.


parasyte commented Mar 26, 2023

The code in question might need to be shared to get a proper diagnosis. But in general, this kind of thing can happen when your pixel buffer is larger than the window surface. (I'm not certain if this requirement is documented properly? We could also add assertions in various places to help users troubleshoot these issues.)

The pixel buffer size is set with Pixels::resize_buffer(), but it is usually expected to be a fixed size. You would set it once when calling the constructor, then never change it again.

The surface size is set with Pixels::resize_surface(), and this needs to be done every time the window is resized. The example does that, for instance, here:

// Resize the window
if let Some(size) = input.window_resized() {
    if let Err(err) = pixels.resize_surface(size.width, size.height) {
        log_error("pixels.resize_surface", err);
        *control_flow = ControlFlow::Exit;
        return;
    }
    framework.resize(size.width, size.height);
}

Also note that the initial window size is passed to the constructor via the SurfaceTexture type:

let window_size = window.inner_size();
let scale_factor = window.scale_factor() as f32;
let surface_texture = SurfaceTexture::new(window_size.width, window_size.height, &window);
let pixels = Pixels::new(WIDTH, HEIGHT, surface_texture)?;

Together, these keep the wgpu surface in sync with the window. That just leaves the other requirement: the surface cannot be made smaller than the pixel buffer. We handle that by setting a minimum size on the window. In other words, the user is not allowed to resize the window to anything smaller than the pixel buffer (which, recall, is a fixed size):

WindowBuilder::new()
    .with_title("Hello Pixels + egui")
    .with_inner_size(size)
    .with_min_inner_size(size)
    .build(&event_loop)
    .unwrap()


Changing the library to allow a surface size smaller than the pixel buffer is probably possible, but as you described yourself, it would be undesirable!

parasyte added the "info needed" (More information is required) label Mar 26, 2023

tseli0s commented Mar 26, 2023

Both the window and the screen descriptor are initially created with 1024x768; I don't know why this would go wrong.

Here's me creating the UI handler (basically where all the UI stuff is done):

pub fn new(
        event_loop: &EventLoopWindowTarget<()>,
        scale_factor: f32,
        pixels: &pixels::Pixels,
    ) -> Self {
        let max_texture_size = pixels.device().limits().max_texture_dimension_2d as usize;

        let ctx = Context::default();
        let mut state = egui_winit::State::new(event_loop);
        state.set_max_texture_side(max_texture_size);
        state.set_pixels_per_point(scale_factor);
        let screen_descriptor = ScreenDescriptor {
            size_in_pixels: [WINDOW_WIDTH as u32, WINDOW_HEIGHT as u32],
            pixels_per_point: scale_factor,
        };
        let renderer = Renderer::new(pixels.device(), pixels.render_texture_format(), None, 1);
        let textures = TexturesDelta::default();

        Self {
            ctx,
            ui_state: state,
            screen_descriptor,
            renderer,
            paint_jobs: Vec::new(),
            textures,
            panels_shown: true,
        }
    }

And here's the resize event being handled:

WindowEvent::Resized(p) => {
    window.set_title(&format!("({}x{})", p.width, p.height));

    // Resize pixels basically
    graphics_mgr.resize(p.width, p.height);
    // Then resize the UI
    ui.resize(p.width, p.height);
},

@parasyte

The egui screen descriptor size is being initialized from constants in the sample code, but it should be initialized to match the default window size. Are you sure the constants are correct?

The minimal-egui example passes the size dimensions explicitly to the Framework constructor.
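For illustration, building the descriptor from the queried physical size rather than constants could look like the sketch below. This is not code from the example: the struct is defined locally to mirror the shape of egui-wgpu's ScreenDescriptor, and descriptor_from_window is a hypothetical helper.

```rust
// Sketch only: a local stand-in mirroring egui-wgpu's ScreenDescriptor shape.
struct ScreenDescriptor {
    size_in_pixels: [u32; 2],
    pixels_per_point: f32,
}

// Build the descriptor from the window's *queried* physical size
// (what window.inner_size() reports), not from compile-time constants.
fn descriptor_from_window(physical: (u32, u32), scale_factor: f32) -> ScreenDescriptor {
    ScreenDescriptor {
        size_in_pixels: [physical.0, physical.1],
        pixels_per_point: scale_factor,
    }
}

fn main() {
    // On a 150% display, a 1024x768 logical window reports 1536x1152 physically.
    let d = descriptor_from_window((1536, 1152), 1.5);
    assert_eq!(d.size_in_pixels, [1536, 1152]);
    assert_eq!(d.pixels_per_point, 1.5);
    println!("ok");
}
```

This way the descriptor always matches the actual render-target extent, even when the DPI scale factor changes the physical window size.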


tseli0s commented Mar 26, 2023

but it should be initialized to match the default window size.

But isn't this the size I create the window with?
For reference, here's me creating the window:

let window = WindowBuilder::new()
            .with_inner_size(LogicalSize::new(WINDOW_WIDTH, WINDOW_HEIGHT))
            .with_max_inner_size(LogicalSize::new(8192, 8192))
            .with_min_inner_size(LogicalSize::new(320, 240))
            .with_window_icon(Some(
                Icon::from_rgba(img.into_vec(), icon_width, icon_height).unwrap(),
            ))
            .with_transparent(true)
            .build(&event_loop)
            .unwrap_or_else(|e| {
                log::error!("Unable to create window: {}", e.to_string());
                abort()
            });


tseli0s commented Mar 26, 2023

I ended up switching to ImGui, seems to work fine for now. I'll leave the issue open in case someone else runs into the same issue.

@parasyte

But isn't this the size I create the window with?

Not necessarily! The LogicalSize is internally scaled by the DPI scale factor. It is recommended that you query the window's physical inner size and use that. The minimal-egui example does that here:

let window_size = window.inner_size();
let scale_factor = window.scale_factor() as f32;
let surface_texture = SurfaceTexture::new(window_size.width, window_size.height, &window);
let pixels = Pixels::new(WIDTH, HEIGHT, surface_texture)?;
let framework = Framework::new(
    &event_loop,
    window_size.width,
    window_size.height,
    scale_factor,
    &pixels,
);
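To make the scaling concrete, here is a small self-contained sketch (illustrative names, not winit's actual LogicalSize/PhysicalSize API) of the arithmetic the DPI factor applies:

```rust
// Illustrative only: reproduces the conversion winit performs internally,
// physical = round(logical * scale_factor), component-wise.
fn logical_to_physical(logical: (u32, u32), scale_factor: f64) -> (u32, u32) {
    (
        (logical.0 as f64 * scale_factor).round() as u32,
        (logical.1 as f64 * scale_factor).round() as u32,
    )
}

fn main() {
    // At scale factor 1.0 the logical and physical sizes agree...
    assert_eq!(logical_to_physical((1024, 768), 1.0), (1024, 768));
    // ...but at 1.5 the physical surface is larger than the 1024x768
    // constants, which is exactly the kind of mismatch that trips the
    // scissor-rect validation in the original panic.
    assert_eq!(logical_to_physical((1024, 768), 1.5), (1536, 1152));
    println!("ok");
}
```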

I think your code has probably diverged too much from the example reference, which is why it doesn't work.
