Automatic schema generation? #49
Hi, so far I don't have any plan to implement automatic schema migration. The reason is that our schema is always evolving, and sometimes that includes manual data migration, so having more control is better than doing it automatically. Tools like this suit my use case better:
I'm closing this issue for now. If you think you need this feature and want to propose an implementation for it, please let me know. Thanks!
Just found out about this project and really like how it is implemented interface-wise, but missing database schema migrations like gorm or xorm have is really something that would be needed.
Hi @lafriks, it might be good to know the use cases where auto schema migration is very useful, so I can give this feature better consideration :)
In my case it would be to avoid having to write the same schema multiple times (once in a Go struct and once in an SQL file); having it done automatically helps me build things faster.
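The duplication being described can be sketched like this: with auto migration, the SQL DDL is derived from the struct instead of being maintained by hand. Below is a toy generator using reflection; the `Product` model and the type mapping are illustrative only, not how rel or any specific ORM actually does it.

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// Product is a hypothetical model; without auto migration the same
// shape must be repeated by hand in a separate SQL file.
type Product struct {
	ID          int
	Name        string
	Description string
}

// createTableSQL derives a CREATE TABLE statement from a struct's
// fields, which is essentially what auto-migration tools do for you.
func createTableSQL(table string, model interface{}) string {
	t := reflect.TypeOf(model)
	cols := make([]string, 0, t.NumField())
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		sqlType := "TEXT"
		if f.Type.Kind() == reflect.Int {
			sqlType = "INTEGER"
		}
		cols = append(cols, strings.ToLower(f.Name)+" "+sqlType)
	}
	return fmt.Sprintf("CREATE TABLE %s (%s);", table, strings.Join(cols, ", "))
}

func main() {
	fmt.Println(createTableSQL("products", Product{}))
	// CREATE TABLE products (id INTEGER, name TEXT, description TEXT);
}
```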
Just curious, is the auto migration also run in production? How has the experience been so far? CMIIW, so far I can only see this feature excelling in embedded/standalone environments.
When using libraries such as TypeORM in Node.js, that feature is only enabled for development.
Hmm, so I believe it's mainly useful only for development?
So mostly:
You can check out how we are using it in the Gitea project to support multiple databases using xorm, for example https://github.com/go-gitea/gitea/blob/master/models/migrations/v70.go. That includes both struct autosync to the database structure and, if needed, data migrations. Many projects also use this in production. Gitea does not support scaling to multiple servers, so that is not that important for us, but if implementing this it would be smart to do it correctly and support multiple parallel instances. The easiest way is to have a special table for migrations with this structure:

```go
type Migrations struct {
	ID          string `db:",primary"`
	Description string
	StartedAt   time.Time
	FinishedAt  time.Time
}
```

This would prevent multiple instances from starting migrations in parallel, as the database server's PK index would reject duplicate records even if the transaction is not yet finished, and other instances restarted at the same time would wait for migrations to complete on the first instance. As for the interface, this would need two methods: one for initial database initialization (creating all tables in the order provided in code and inserting all migrations as completed without running them), and another to run the migrations that have not yet been run.
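The PK-as-lock idea above can be sketched as follows. This is a toy in-memory stand-in for the database's unique primary-key index; real code would `INSERT` into the migrations table and treat a duplicate-key error as "another instance already owns this migration".

```go
package main

import (
	"errors"
	"fmt"
)

// migrationLog stands in for the Migrations table; the map key plays
// the role of the primary-key index that rejects duplicate IDs.
type migrationLog struct {
	rows map[string]bool
}

var errDuplicateKey = errors.New("duplicate key value violates primary key")

// claim tries to insert a row for the given migration ID. Only the
// first instance succeeds; later callers get a duplicate-key error
// and know they should wait instead of running the migration.
func (l *migrationLog) claim(id string) error {
	if l.rows[id] {
		return errDuplicateKey
	}
	l.rows[id] = true
	return nil
}

func main() {
	mlog := &migrationLog{rows: map[string]bool{}}

	// First instance claims the migration and runs it.
	fmt.Println(mlog.claim("20200506191500"))

	// A second instance starting in parallel fails to claim it.
	fmt.Println(mlog.claim("20200506191500"))
}
```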
Some ideas can be taken from:
Also, it would be great if it supported automatic FK generation, as no other Go ORM/DAL currently supports that to my knowledge.
I see. As I understand it, it's not a fully automatic migration where we leave the engine to do the migration magic, but more like traditional version-based migration: the developer has control over what needs to be migrated in each new version. It is similar to Active Record, but instead of using a DSL, xorm or gorm rely on built-in struct definitions. If that's what you mean, then yes, it's something that I want to support, hopefully before version 1 (if I have time or some help 😆). My design is quite different from xormigrate or gomigrate, because I prefer to use a DSL instead of an automigrate function, since it's more explicit. This is what I have in mind right now:

Migration file:

```go
// db/migrations/20200506191500_create_products.go
type CreateProducts struct{}

func (cp CreateProducts) Version() string {
	return "20200506191500"
}

func (cp CreateProducts) Up(m *rel.Migration) {
	m.CreateTable("products", func(t rel.Table) {
		t.String("name")
		t.Text("description")
	})
}

func (cp CreateProducts) Down(m *rel.Migration) {
	m.DropTable("products")
}
```

Custom API to invoke migrations from code:

```go
migrator := rel.NewMigrator(adapter)
migrator.Add(CreateProducts{})
migrator.Sync()
// migrator.Rollback()
```
I like that API! Could you give an example for foreign keys?
I haven't thought about that much; probably either of:
and more inspiration. I think there's a lot to be considered; so far the inspiration for the DSL comes mostly from Active Record, but there are other similar projects in other languages that have a simpler DSL (for example, Ecto). Let me know if you have any ideas?
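For foreign keys, the two shapes typically weighed in such DSLs are column-level and table-level declarations. The snippet below is a self-contained mock of a hypothetical table builder, not rel's actual API, just to make the two options concrete:

```go
package main

import "fmt"

// Table is a mock of a hypothetical migration DSL that records the
// DDL fragments it would emit. Both FK styles are illustrative only.
type Table struct {
	defs []string
}

// Style 1: declare the FK inline on the column definition.
func (t *Table) IntRef(name, refTable, refColumn string) {
	t.defs = append(t.defs,
		fmt.Sprintf("%s INTEGER REFERENCES %s(%s)", name, refTable, refColumn))
}

// Style 2: declare the column and the FK constraint separately.
func (t *Table) Int(name string) {
	t.defs = append(t.defs, name+" INTEGER")
}

func (t *Table) ForeignKey(column, refTable, refColumn string) {
	t.defs = append(t.defs,
		fmt.Sprintf("FOREIGN KEY (%s) REFERENCES %s(%s)", column, refTable, refColumn))
}

func main() {
	inline := &Table{}
	inline.IntRef("user_id", "users", "id")
	fmt.Println(inline.defs)

	separate := &Table{}
	separate.Int("user_id")
	separate.ForeignKey("user_id", "users", "id")
	fmt.Println(separate.defs)
}
```

The inline style is terser; the separate style maps more directly onto SQL and makes composite or deferred constraints easier to express later.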
Hi all, I have an update for this issue: a POC version of schema migration is almost done, and you can find example usage here: Fs02/go-todo-backend#6. Let me know what you think 😄
I like it! I've been looking for similar tools for a while now. Is it tightly coupled with rel, btw? I'd like to use it independently if possible ^^ Unrelated note: keep up the great work on rel, I've been excited about its progress!
You can use the migration functionality independently 😄 |
but should it be in a separate package / repo? |
Did you mean? I personally prefer to put my migrations into a separate package that's not imported by other runtime packages, just to keep it cleaner and safer from accidental migrations.
I meant: should the migrations component/functionality (the migrator package) of rel be in a separate GitHub repo/project? That way you don't have to clone/depend on the rest of rel.
(I'm suggesting this because I know for a fact some of my friends would like to stick to a more vanilla approach that isn't "ORM-ish", but would still like the migrations part.)
My first plan was actually to have a separate package for this, but I decided to merge the DDL for the following reasons:
I'm curious, what are they currently using? And why do they want to move to something that's not vanilla at the migration level?
upper.io or sqlx! |
What tool/library do they use to manage db migrations when they are using sqlx?
Are you planning to implement automatic schema generation / migration?