
Dispose tester iFrame in browser mode (#5595) should be configurable. #5655

Closed
4 tasks done
xenobytezero opened this issue May 2, 2024 · 8 comments
Labels
feat: browser Issues and PRs related to the browser runner p3-significant High priority enhancement (priority)

Comments

@xenobytezero

Clear and concise description of the problem

After updating Vitest to 1.5.2, I found that it was no longer possible to use DevTools in the browser to look at the test files, add breakpoints, etc., since #5595 now destroys the iframe after the tests have been run.

Suggested solution

Make this behaviour configurable as discussed in the ticket.

We might not want to always dispose the iframe, for example, if users are expecting to view the actual browser render (cf. #5568), but I think that can be implemented as a dedicated option (like browser.retainWindow or something) and disposing them by default for stability seems right.
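As a sketch, a hypothetical option of that shape might look like the following in a Vitest config. Note that `retainWindow` is just the name floated above, not an existing Vitest setting, and the browser options around it are only illustrative:

```typescript
// vitest.config.ts — illustrative sketch only; `retainWindow` is the
// hypothetical option suggested above and does not exist in Vitest.
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    browser: {
      enabled: true,
      name: 'chromium',
      headless: false,
      // Hypothetical: keep tester iframes alive after a run so they
      // remain inspectable in DevTools.
      // retainWindow: true,
    },
  },
})
```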

Alternative

No response

Additional context

After reading the documentation and multiple tickets raised here, I believe the only way to debug tests in Browser mode is via DevTools on the controlled browser instance? If that is true, then the change in #5595 is slightly confusing.

What is the current "correct" way to debug/examine/step through tests in Browser mode, assuming #5595 is correct?


@sheremet-va sheremet-va added feat: browser Issues and PRs related to the browser runner p3-significant High priority enhancement (priority) and removed enhancement: pending triage labels May 2, 2024
@hi-ogawa
Contributor

hi-ogawa commented May 3, 2024

Thanks for the feedback and suggestions. Can you elaborate on what your debugging workflow was like? (I don't know what is/was the "correct" way, as I don't use the debugger much.)

From what I can guess, I suppose you were using "Source > Page" in DevTools to pick up source files (where the test files no longer appear since v1.5.1):

(Screenshots: the DevTools "Source > Page" panel in v1.5.0, where the test files are listed, and in v1.5.1, where they are no longer present.)

And the overall steps would be something like:

  • run vitest (maybe with a file filter) with browser.headless: false (the default)
  • wait for the first test run to finish
  • in the browser, open DevTools, find the source file, and set a breakpoint
  • in the terminal, press "r" to rerun
  • the browser hits the breakpoint

Assuming this is what you were doing, it looks like it's still possible to put a debugger statement directly in the test files to replace the 3rd step, without going through the "Source > Page" tabs.

import { expect, it } from 'vitest'

it('basic', async () => {
  debugger
  expect(typeof window).toBe('object')
})

Just for reference, this is what I tested: https://github.com/hi-ogawa/reproductions/tree/main/vitest-browser-debugger

@xenobytezero
Author

> From what I can guess, I suppose you were using "Source > Page" in devtools to pick up source files (which doesn't exist since v1.5.1):

Generally I was using Ctrl + P in Chrome DevTools. When the iFrames are destroyed, it removes those files from the list.

> And overall steps would be something like:

Yes, these are the steps I am using.

I had found the workaround of adding debugger statements, but it seems unusual to me; in general, debugger statements feel like a code smell, even in this debugging workflow.

Ideally I would have the behaviour configurable, as described in the original ticket, so that it can be driven by development/production modes in the same way as the rest of my config file.
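For illustration, Vite-style configs can already branch on the mode, so a sketch of what is being asked for here might look like the following. The `keepIframe` name is hypothetical and not a real Vitest option; only the mode-based branching is an existing pattern:

```typescript
// vitest.config.ts — sketch of mode-dependent configuration; the
// `keepIframe` option is hypothetical and not part of Vitest.
import { defineConfig } from 'vitest/config'

export default defineConfig(({ mode }) => ({
  test: {
    browser: {
      enabled: true,
      name: 'chromium',
      // Run headed during development so DevTools is available.
      headless: mode !== 'development',
      // Hypothetical: dispose tester iframes only outside development.
      // keepIframe: mode === 'development',
    },
  },
}))
```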

@sheremet-va
Member

> Ideally I would have the behaviour configurable, like it describes in the original ticket, that can be configured using development/production modes, the same way as the rest of my config file.

I don't think it makes sense to base it on dev/prod, because the reason we remove the iframes has nothing to do with that. We remove them to free memory, because leftover iframes can affect your other tests and exhaust WebSocket connections, which browsers limit.

I think debugger is a very good workaround for this at the moment. Ideally, I would love for us to display "snapshots" of different test stages. For now, I think the solution would be to keep the last iframe open until we can provide a better UX.

@xenobytezero
Author

> I think debugger is a very good workaround for this at the moment. Ideally, I would love for us to display "snapshots" of different test stages. For now, I think the solution would be to keep the last iframe open until we can provide a better UX.

Can I propose leaving the iframes of failed tests open? That would at least shorten the iteration time for finding the cause of failures.

@sheremet-va
Member

> Can I propose leaving iFrames with failed tests open? That would at least shorten some of the iteration time to finding the cause of issues?

This doesn't solve the problem if all tests have failed.

@xenobytezero
Author

Fair enough, since failures caused by memory/WebSocket issues would not be obvious.

I will use the workaround for now. Is there a reason why having the option as part of the config was dropped? There is nothing on the original ticket describing what happened there, even though it's part of the feature description.

@sheremet-va
Member

Because removing the iframe is currently required for Vitest to function properly.

@xenobytezero
Author

No problem; I'm obviously missing some context here, so I will close the ticket.

Thanks for the responses.
