
Allow TEST_ENV_NUMBER to support duplicate file groups #943

Merged · 6 commits · Apr 3, 2024

Conversation

jaydorsey (Contributor) commented Mar 30, 2024

This is a follow up to #940

After that PR was merged, I was writing a little helper script that let me run `<script> <file>`: it duplicates that file name once per CPU (via `sysctl -n hw.ncpu`) and then runs parallel_rspec with the --allow-duplicates flag.

What I discovered was that TEST_ENV_NUMBER was set to the same value for each file. I traced the root cause back to the small change I made in this PR.

Checklist

  • Feature branch is up-to-date with master (if not - rebase it).
  • Added tests
  • Added an entry to the Changelog if the new
    code introduces user-observable changes.
  • Update Readme.md when cli options are changed

if options[:allow_duplicates]
  proc { start += 1 }
else
  ->(groups, group) { groups.index(group) }
jaydorsey (Contributor, author) commented:

This is the original behavior. As I was writing this, I was wondering:

  1. Could we use the proc/counter as a default for everything?
  2. Is there something else I'm missing, like spec-order randomization, where it's not as simple as an incrementing counter? (If it is that simple, maybe we don't even need the proc; we could just increment.)

If it's not as simple as this proc (or a basic counter) I'm open to other suggestions; the only other idea I had was to track the index in a hash with a counter so I could see when I've re-used a group already.

If none of these ideas would work, then maybe --allow-duplicates should only work with a single file? This would be OK for my primary use case (and still useful I think)
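The hash-with-counter idea mentioned above could look something like the following. This is a hypothetical sketch, not code from this PR; the variable names are invented for illustration.

```ruby
# Hypothetical sketch (not code from this PR) of the "track reuse in a
# hash" idea: count how many times each group has been handed out, so a
# reused group is detectable and can be given an offset.
seen = Hash.new(0)
groups = [["a_spec.rb"], ["a_spec.rb"], ["b_spec.rb"]]

indexed = groups.map do |group|
  offset = seen[group] # 0 the first time this exact group appears, then 1, ...
  seen[group] += 1
  [group, offset]
end
# duplicate groups now carry distinct offsets: 0, 1, ... per group
```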

Review thread on spec/parallel_tests/cli_spec.rb (resolved)
@jaydorsey jaydorsey marked this pull request as ready for review March 30, 2024 22:13
Review thread on lib/parallel_tests/cli.rb (outdated, resolved)
grosser (Owner) left a comment:

Hmm, I think the issue is that the groups all look the same to the code, since we use index.
So maybe, instead of the current code, we could iterate the groups with each_with_index and get our index that way?
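A quick illustration of why `groups.index(group)` breaks down with duplicates (a standalone sketch, with invented file names):

```ruby
# Array#index returns the position of the *first* equal element, so three
# identical groups all resolve to index 0 (and thus the same TEST_ENV_NUMBER).
groups = [["user_spec.rb"], ["user_spec.rb"], ["user_spec.rb"]]
by_index = groups.map { |g| groups.index(g) }

# each_with_index hands every copy its own position instead
by_position = groups.each_with_index.map { |_g, i| i }
```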

jaydorsey (Contributor, author) replied:

> hmm so I think the issue is that the groups all look the same to the code since we use index so maybe instead of the current code we could iterate the groups with each_with_index and get our index that way?

Yeah that's a good point. I wasn't 100% confident that I understood what was going on inside of the execute_in_parallel block so I went down the rabbit hole of trying to shim in some new behavior, conditionally.

I just pushed up another branch that uses Parallel#map_with_index that looks right in my head, but I'm also having trouble visualizing/tracing/wrapping my head around the block yields.

I need to test it on my other machine to make triple-sure it's doing what I think it's doing (will do that later today)

@@ -57,8 +57,8 @@ def execute_in_parallel(items, num_processes, options)
   Tempfile.open 'parallel_tests-lock' do |lock|
     ParallelTests.with_pid_file do
       simulate_output_for_ci options[:serialize_stdout] do
-        Parallel.map(items, in_threads: num_processes) do |item|
-          result = yield(item)
+        Parallel.map_with_index(items, in_threads: num_processes) do |item, index|
+          result = yield(item, index)
jaydorsey (Contributor, author) commented:

If I understand all of this correctly, the item that gets yielded on the next line is the group in the execute_in_parallel call on line 84

If that's the case, I wonder if naming these to groups/group instead of items/item makes it easier to understand?

grosser (Owner) replied:

execute_in_parallel is a generic method that does not always get groups passed (for example, execute_in_parallel(runs, runs.size, options)), so afaik this should stay items.

-Parallel.map(items, in_threads: num_processes) do |item|
-  result = yield(item)
+Parallel.map_with_index(items, in_threads: num_processes) do |item, index|
+  result = yield(item, index)
grosser (Owner) commented:

cli.rb:376 will need a change to accept the index

grosser (Owner) followed up:

... hmm I guess no it does not since it ignores the extra arg 👍
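This works because Ruby blocks (unlike lambdas) silently discard extra yielded arguments, so existing callers that only declare |item| keep working unchanged. A minimal illustration with a hypothetical method name:

```ruby
# Ruby blocks silently discard extra yielded arguments (a lambda would raise
# ArgumentError), so yielding an index to a block that only declares |item|
# is harmless. `with_index` is a hypothetical name for illustration.
def with_index
  yield(:item, 42)
end

one_arg = with_index { |item| item }          # the extra index is ignored
two_arg = with_index { |item, i| [item, i] }  # the index is there when wanted
```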

-test_results = execute_in_parallel(groups, groups.size, options) do |group|
-  run_tests(group, groups.index(group), num_processes, options)
+test_results = execute_in_parallel(groups, groups.size, options) do |group, index|
+  test_env_number = options[:first_is_1] ? index + 1 : index
grosser (Owner) commented:

The first_is_1 logic was not here before, so it should either be removed here or be removed from some other place?

jaydorsey (Contributor, author) replied:

You're right, good catch. I didn't trace the test_env_number down far enough; I see it in test/runner.rb now.

I've updated the PR to remove line 85 and just use index.
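For context, the logic in question derives TEST_ENV_NUMBER from the process index, which is why applying first_is_1 again in cli.rb would double the adjustment. A simplified sketch, assumed from the thread's description of test/runner.rb rather than copied from the gem:

```ruby
# Simplified sketch (assumed, not copied from the gem) of how test/runner.rb
# maps a process index to TEST_ENV_NUMBER: the first process gets "" unless
# first_is_1 is set, and every other process gets its 1-based position.
def test_env_number(index, first_is_1: false)
  index.zero? && !first_is_1 ? '' : (index + 1).to_s
end
```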

jaydorsey (Contributor, author) commented Apr 1, 2024:

I've been testing this with a little helper script I wrote to help me run 1 test multiple times in parallel:

(Edit: I've made this part of my dotfiles, with some tweaks here)

#!/usr/bin/env ruby
# frozen_string_literal: true

# Useful for running a single test multiple times, locally
#
# 1. Save this file locally as something like `bin/runit`, or put it in your path
# 2. chmod +x bin/runit
# 3. Add it to your gitignore and/or gitignore_global file
#
# Usage:
#
#    bin/runit spec/models/user_spec.rb
cpus  = (`sysctl -n hw.ncpu`.to_i - 1)
files = "#{ARGV[0]} " * cpus
exec("bin/parallel_rspec --allow-duplicates --verbose-command --prefix-output-with-test-env-number -- #{files}")
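As an aside, `sysctl -n hw.ncpu` is macOS-specific; a portable variant of the CPU-count line could use Ruby's stdlib instead (a sketch, keeping the rest of the script unchanged):

```ruby
# Portable alternative (a sketch) to the `sysctl -n hw.ncpu` line above:
# sysctl is macOS-specific, while Etc.nprocessors from Ruby's stdlib reports
# the CPU count on Linux and macOS alike.
require 'etc'

cpus  = Etc.nprocessors - 1
files = "#{ARGV[0]} " * cpus
```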

I've tested it with and without the first-is-1 flag and it all runs exactly like I expect:

  1. Test numbers appear correct w/ and w/out the flag
  2. Zero database conflicts (which I hit before), because it's now using correct, distinct envs for the same file

(Previously I got a lot of ActiveRecord errors, because every process was using the same test env number/database.)

Before

All the same test env (not intended/desired behavior):

[screenshot]

These would happen pretty often:

[screenshot]

After

With this PR, test envs all look correct

[screenshot]

@@ -7,6 +7,7 @@
 ### Added

 ### Fixed
+- The `--allow-attributes` flag now runs duplicate tests in different groups
grosser (Owner) commented Apr 3, 2024:

Suggested change:

-- The `--allow-attributes` flag now runs duplicate tests in different groups
+- The `--allow-duplicates ` flag now runs duplicate tests in different groups

grosser (Owner) left a review comment:

❤️ ended up being a great small bugfix :D

@grosser grosser merged commit 2ae6606 into grosser:master Apr 3, 2024
13 checks passed
grosser (Owner) commented Apr 3, 2024:

4.6.1

@jaydorsey jaydorsey deleted the jaydorsey/groups branch April 4, 2024 13:38