While working with millions (up to billions) of structs in a big slice, I found myself in the position of wanting to remove duplicates (due to database limitations related to transactions and duplicate updates/inserts).
At current speeds, this would take hours to process. Before building more complex structures using hashes and the like, I'd prefer to just run the task in parallel; given that I don't care about order, this isn't that big of a task.
Therefore my proposal is to implement this:

func UniqByParallel[T comparable](slice []T, numThreads int, comparator func(item T, other T) bool) []T
I would also propose my (very simplistic and not optimised) version of this, but first I would like to know what others think and what problems they might see that I don't.
PS: I searched for other libraries and solutions, but didn't find easy alternatives - maybe someone knows a thing :)