I am not requesting this for my own benefit, but if faster CI performance is of interest, I would suggest using the official cache action or, more aggressively, zero installs.
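Not to presume the repo layout, but a minimal sketch of the cache-action route, assuming a standard npm setup with a lockfile at the repo root (the path and key are illustrative):

```yaml
# Hypothetical step in an existing workflow: cache node_modules,
# keyed on the lockfile so the cache is invalidated when deps change.
- name: Cache node_modules
  uses: actions/cache@v3
  with:
    path: node_modules
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
```

Zero installs go further by committing the package manager's cache itself (e.g. Yarn's `.yarn/cache`), so CI can skip the install step entirely.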
Most of the time goes towards building the dictionaries.
I have thought about keeping the {dictionary}.txt.gz files and only rebuilding them when their sources change. This is how the en_US dictionary is handled.
There are a few steps necessary to do this:
1. Create a `.js` equivalent to `shasum`, because `shasum` is not available on all platforms (rough sketch at the end of this comment).
2. Add checksum files and scripts to all dictionaries.
3. Create a workflow action that builds any dirty dictionaries, to ensure the `*.txt.gz` files stay up to date (example workflow below).
This is needed because I think it is important to enable people to add words from the GitHub UI without needing to clone and build locally.
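If it helps, here is a hypothetical shape for that workflow; the trigger paths, build script name, and commit details are all placeholders:

```yaml
name: Build dirty dictionaries
on:
  push:
    paths:
      - 'dictionaries/**'
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - run: npm ci
      # Placeholder for a script that rebuilds only dictionaries
      # whose source checksums no longer match.
      - run: npm run build-dictionaries
      # Commit any regenerated .txt.gz files back to the branch,
      # so words added via the GitHub UI get built automatically.
      - name: Commit rebuilt dictionaries
        run: |
          git config user.name 'github-actions[bot]'
          git config user.email 'github-actions[bot]@users.noreply.github.com'
          git add '*.txt.gz'
          if ! git diff --staged --quiet; then
            git commit -m 'chore: rebuild dictionaries'
            git push
          fi
```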
I have been moving towards this model because it makes publishing easier and quicker.
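For step 1, a rough sketch of what a Node replacement for `shasum` might look like; the file name `checksum.js`, the choice of SHA-1, and the `<hash>  <file>` output format are assumptions made to mirror `shasum`'s defaults:

```js
// Hypothetical checksum.js: a cross-platform stand-in for `shasum`.
const crypto = require('crypto');
const fs = require('fs');

// Stream the file through a SHA-1 hash so large dictionaries
// do not need to be read into memory all at once.
function checksumFile(filename) {
  return new Promise((resolve, reject) => {
    const hash = crypto.createHash('sha1');
    fs.createReadStream(filename)
      .on('error', reject)
      .on('data', (chunk) => hash.update(chunk))
      .on('end', () => resolve(hash.digest('hex')));
  });
}

async function main() {
  // Usage: node checksum.js <file> [more files...]
  for (const file of process.argv.slice(2)) {
    console.log(`${await checksumFile(file)}  ${file}`);
  }
}

main().catch((err) => {
  console.error(err.message);
  process.exit(1);
});
```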
Oh, that makes sense that the dictionaries are the bottleneck, given their size. What does building them entail, exactly? For step 1, my instinct would be to use `hashFiles`. Manually compressing the dictionary files seems, at face value, like more work for the same result as zero installs. I obviously have vastly less context on this code base than you do, so it won't surprise me to learn I am missing some things.
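Concretely, something like this might let the cache action skip the dictionary build entirely, keying the built files on a `hashFiles` digest of their sources (the paths here are guesses at the repo layout):

```yaml
# Restore built dictionaries when the sources are unchanged;
# the build step then only needs to run on a cache miss.
- name: Cache built dictionaries
  uses: actions/cache@v3
  with:
    path: dictionaries/**/*.txt.gz
    key: dicts-${{ hashFiles('dictionaries/**/src/**') }}
```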