Add mysql queries #195
Comments
I would love to take a crack at this as an opportunity to do more with SQL & Go. What would be the primary use case of fake SQL queries? |
We seed our local databases for testing, and to have data to pass to the front end so front-end developers can visualize what the website looks like with actual data in it. |
So, for example, we would have a struct like, say:

```go
type PersonEntry struct {
	Fname string `fake:"{firstname}"`
	Lname string `fake:"{lastname}"`
	Age   int    `fake:"{number}"`
}

var p PersonEntry
gofakeit.Struct(&p)
```

and say the generated query was `INSERT INTO PEOPLE VALUES('<p.Fname>','<p.Lname>',<p.Age>);`. Is this what you're looking for? Basically generating various queries based on structs populated with fake data? |
So I would probably think about it in terms of batch inserts:

```sql
INSERT INTO yourtable VALUES (1,2), (5,5), ...;
```

So just like the json function, they would pass in a MysqlOptions and one of the fields would be an array of fields. https://github.com/brianvoe/gofakeit/blob/master/json.go#L58 |
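To make the batch-insert shape concrete, here is a minimal, self-contained sketch (plain Go, no gofakeit types; `buildBatchInsert` and the hard-coded row values are illustrative stand-ins for generated data, not part of the library):

```go
package main

import (
	"fmt"
	"strings"
)

// buildBatchInsert joins pre-rendered value tuples into a single
// multi-row INSERT statement, e.g. INSERT INTO t VALUES (1,2), (5,5);
func buildBatchInsert(table string, rows [][]string) string {
	var sb strings.Builder
	sb.WriteString("INSERT INTO " + table + " VALUES")
	tuples := make([]string, 0, len(rows))
	for _, row := range rows {
		tuples = append(tuples, " ("+strings.Join(row, ",")+")")
	}
	sb.WriteString(strings.Join(tuples, ","))
	sb.WriteString(";")
	return sb.String()
}

func main() {
	fmt.Println(buildBatchInsert("yourtable", [][]string{{"1", "2"}, {"5", "5"}}))
	// INSERT INTO yourtable VALUES (1,2), (5,5);
}
```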
I get it! That makes sense. I will give this a shot as soon as I have some free time this week. |
This is what I've got so far:

```go
type SQLOptions struct {
	Table      string  `json:"table" xml:"table"`
	EntryCount int     `json:"entry_count" xml:"entry_count"`
	Fields     []Field `json:"fields" xml:"fields"`
}

func SQLInsert(so *SQLOptions) ([]byte, error) {
	return sqlInsertFunc(globalFaker.Rand, so)
}

func sqlInsertFunc(r *rand.Rand, so *SQLOptions) ([]byte, error) {
	if so.Table == "" {
		return nil, errors.New("must provide table name to generate SQL")
	}
	if len(so.Fields) == 0 {
		return nil, errors.New("must pass fields in order to generate SQL queries")
	}
	if so.EntryCount <= 0 {
		return nil, errors.New("must have entry count")
	}

	var sb strings.Builder
	sb.WriteString("INSERT INTO " + so.Table + " VALUES")
	for i := 0; i < so.EntryCount; i++ {
		sb.WriteString(" (")
		// Now, we need to add all of our fields
		for _, field := range so.Fields {
			if field.Function == "autoincrement" {
				// TODO: We need to do something here still...
				continue
			}
			// Get the function info for the field
			funcInfo := GetFuncLookup(field.Function)
			if funcInfo == nil {
				return nil, errors.New("invalid function, " + field.Function + " does not exist")
			}
			// Generate the value
			val, err := funcInfo.Generate(r, &field.Params, funcInfo)
			if err != nil {
				return nil, err
			}
			sb.WriteString(fmt.Sprintf("%v", val))
		}
	}
	return []byte(sb.String()), nil
}
```

What I am not understanding here is how we would do batch inserts when given an array of fields; to me this looks like it would just end up doing one tuple, no?

EDIT: I totally get it now, I found the implementation for the |
Haha these are all great questions! It's also probably why I haven't tackled this one yet. That being said, as long as you can generate the value, you can use the funcInfo to know what the output type is and have a conversion function that looks at the type and converts it to a type that SQL supports. That's where I would start; see how far you get and hit me up if you need any more help. |
Is the EDIT: It seems to be |
I'm thinking, I would say 95% of the use cases are putting either. Would something as simple as this suffice for now? I'm missing the DATE stuff right now, but I can probably put something together. Basically, we just add:

```go
func ConvertType(t string, val interface{}) string {
	switch t {
	case "string":
		return `'` + fmt.Sprintf("%v", val) + `'`
	default:
		return fmt.Sprintf("%v", val)
	}
}
```

EDIT: Actually, you're already returning dates as strings, which I believe is what you do for SQL anyway to insert it into a table. This is what I have so far:

```go
type SQLOptions struct {
	Table      string  `json:"table" xml:"table"`
	EntryCount int     `json:"entry_count" xml:"entry_count"`
	Fields     []Field `json:"fields" xml:"fields"`
}

func SQLInsert(so *SQLOptions) ([]byte, error) {
	return sqlInsertFunc(globalFaker.Rand, so)
}

func sqlInsertFunc(r *rand.Rand, so *SQLOptions) ([]byte, error) {
	if so.Table == "" {
		return nil, errors.New("must provide table name to generate SQL")
	}
	if len(so.Fields) == 0 {
		return nil, errors.New("must pass fields in order to generate SQL queries")
	}
	if so.EntryCount <= 0 {
		return nil, errors.New("must have entry count")
	}

	var sb strings.Builder
	sb.WriteString("INSERT INTO " + so.Table + " VALUES")
	for i := 0; i < so.EntryCount; i++ {
		sb.WriteString(" (")
		// Now, we need to add all of our fields
		for j, field := range so.Fields {
			if field.Function == "autoincrement" {
				// TODO: We need to do something here still...
				continue
			}
			// Get the function info for the field
			funcInfo := GetFuncLookup(field.Function)
			if funcInfo == nil {
				return nil, errors.New("invalid function, " + field.Function + " does not exist")
			}
			// Generate the value
			val, err := funcInfo.Generate(r, &field.Params, funcInfo)
			if err != nil {
				return nil, err
			}
			converted := ConvertType(funcInfo.Output, val)
			if j == len(so.Fields)-1 { // Last field
				sb.WriteString(converted)
			} else {
				sb.WriteString(converted + ", ")
			}
		}
		if i == so.EntryCount-1 { // Last tuple
			sb.WriteString(");")
		} else {
			sb.WriteString("),")
		}
	}
	return []byte(sb.String()), nil
}

func ConvertType(t string, val interface{}) string {
	switch t {
	case "string":
		return `'` + fmt.Sprintf("%v", val) + `'`
	default:
		return fmt.Sprintf("%v", val)
	}
}
```
|
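One caveat worth flagging on the quoting approach above (my observation, not part of the patch under discussion): generated string values can themselves contain single quotes, which would break the emitted SQL. A minimal sketch of a guard, assuming the standard SQL convention of doubling embedded quotes:

```go
package main

import (
	"fmt"
	"strings"
)

// quoteSQLString wraps a value in single quotes, doubling any
// embedded single quotes per the standard SQL escaping convention.
func quoteSQLString(s string) string {
	return "'" + strings.ReplaceAll(s, "'", "''") + "'"
}

func main() {
	fmt.Println(quoteSQLString("O'Brien")) // 'O''Brien'
}
```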
Here's what I have so far, what do you think? |
I think those look great. One last thing you may want to do is throw more variations at it to make sure you're covering all your bases, things like floats. Also, JSON can be a column type in MySQL, so you might want to try that as well. If those look good and tests run I'll get it merged in. Nice job! |
Floats/Decimal values are handled by my Looking at the MySQL documentation, JSON is inserted into a table by just stringifying it, so I can just add the logic to do the same thing we do for the

EDIT: I see that JSON's "type" (

```go
func ConvertType(t string, val interface{}) string {
	switch t {
	case "string":
		return `'` + fmt.Sprintf("%v", val) + `'`
	case "[]byte":
		return `'` + fmt.Sprintf("%s", val) + `'`
	default:
		return fmt.Sprintf("%v", val)
	}
}
```
|
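To make the three branches concrete, here is the conversion function above dropped into a standalone snippet with sample inputs (the values are illustrative stand-ins, not gofakeit output):

```go
package main

import "fmt"

// ConvertType renders a generated value as a SQL literal based on
// the generator's declared output type.
func ConvertType(t string, val interface{}) string {
	switch t {
	case "string":
		return `'` + fmt.Sprintf("%v", val) + `'`
	case "[]byte":
		return `'` + fmt.Sprintf("%s", val) + `'`
	default:
		return fmt.Sprintf("%v", val)
	}
}

func main() {
	fmt.Println(ConvertType("string", "Billy"))           // 'Billy'
	fmt.Println(ConvertType("[]byte", []byte(`{"a":1}`))) // '{"a":1}'
	fmt.Println(ConvertType("int", 42))                   // 42
}
```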
That's a good question. I think that should be the case, but you'll have to test it out. |
```go
res, _ := SQLInsert(&SQLOptions{
	Table:      "People",
	EntryCount: 3,
	Fields: []Field{
		{Name: "first_name", Function: "firstname"},
		{Name: "last_name", Function: "lastname"},
		{Name: "age", Function: "number", Params: MapParams{"min": {"1"}, "max": {"99"}}},
	},
})
```

I know that my |
Ya, it can get a little tricky putting together a sub-JSON field. The cleanest way, without large amounts of param strings, is to just create a simple lookup function with some fields in it, and then in your SQL test field add the custom lookup function you added. Here is an example of adding a custom function: https://github.com/brianvoe/gofakeit/blob/master/lookup_test.go#L11 Let me know if that makes sense to you. We are really close. If you want to start a PR submission, I can start looking at it more closely after you add the JSON test. |
Can you submit a PR and I'll pull it down and take a look at it? |
#201 is the PR! Let me know what I can do! |
Just merged your commit into v6.16.0. I made some updates and cleanups, and added the JSON usage and lookup usage. Once again, nice job on this integration! |
Need to add the ability to generate MySQL queries for mass imports.