Fake names based on ethnicity #1974
Comments
There currently is any way to do this, and TBH I don't think we should add such a feature. But you're free to implement your own provider from your own data.
so?

```python
from collections import OrderedDict

from faker import Faker


def test_random_elements():
    faker = Faker(["zh-CN"])
    # Weighted categories (each 0.25): Asian, African, European, American
    items = OrderedDict([("亚洲人", 0.25), ("非洲人", 0.25), ("欧洲人", 0.25), ("美洲人", 0.25)])
    print(faker.random_elements(items)[0])
```
Interesting question and answer. Looks like you made a typo and meant "There currently isn't any way to do this" @fcurella? In the current AI day and age, it's inevitable that people will implement this (whether we like it or not). It's all Fake(r) anyway!
Besides the fact that it would be considered borderline racist in some cultures, where would you even get the data? I think this would be better served by implementing your own provider.
This issue is stale because it has been open for 30 days with no activity.
I am performing bias evaluation tests for my LLMs (large language models). While I was able to perform gender-bias tests by generating fake male/female data using Faker, I couldn't find a way to generate names based on ethnicity (Asian, white, Black, etc.) to perform ethnicity-based bias tests. Is there a way to do this in Faker?