Description
I would love to have the option to check, in my CI pipeline, whether the migrations are up to date with the implementation of the models in the code.
We are almost there
I feel like we can get quite close to this. We can spin up an SQLite database and apply the migrations to it; we can even call `command.revision(alembic_cfg, autogenerate=True)`. However, that of course creates a migration file instead of returning a concise list of changes to the Python runtime.
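For concreteness, a rough sketch of that attempt might look like the following (the `alembic.ini` location and the throwaway SQLite URL are assumptions; the exact setup varies per project):

```python
from alembic import command
from alembic.config import Config

# assumptions: a standard alembic.ini next to the code, and an env.py that
# reads sqlalchemy.url from the config and has target_metadata set up
alembic_cfg = Config("alembic.ini")
alembic_cfg.set_main_option("sqlalchemy.url", "sqlite:///ci_check.db")

# apply all migrations to a throwaway database
command.upgrade(alembic_cfg, "head")

# this detects drift between models and migrations, but it writes a new
# migration file rather than returning the changes to the Python runtime
command.revision(alembic_cfg, autogenerate=True)
```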
My search for solutions
It seems like the generation of changes happens somewhere in `alembic.autogenerate.compare._populate_migration_script`, but since this is a protected method, I am hesitant to develop against it.
What I would love
I would absolutely love it if there were some documentation on how to properly test Alembic:
- Assert that upgrading works
- Assert that downgrading works
- Assert that migrations are up to date with the current codebase. (IMO the most important one)
Of course, such documentation would require a good interface to do those things with. But I do believe this would eliminate a significant class of production database issues.
zzzeek commented on Aug 11, 2020
hi there -
The third part of this is accomplished using the `compare_metadata()` API function, documented at https://alembic.sqlalchemy.org/en/latest/api/autogenerate.html#alembic.autogenerate.compare_metadata
This approach is already in widespread use for the use case you describe; for example, here's how many OpenStack components use it, based on a library function in the "oslo.db" library: https://github.com/openstack/oslo.db/blob/master/oslo_db/sqlalchemy/test_migrations.py#L567
it's not too hard to mention the compare_metadata() function in the recipes section in some way.
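For reference, a minimal use of that function looks roughly like this (a sketch based on the linked docs; the `myapp.models` import and the database URL are assumptions):

```python
from sqlalchemy import create_engine
from alembic.migration import MigrationContext
from alembic.autogenerate import compare_metadata

from myapp.models import Base  # assumption: your declarative base lives here

engine = create_engine("sqlite:///ci_check.db")  # placeholder URL
with engine.connect() as conn:
    migration_context = MigrationContext.configure(conn)
    # returns a list of diff tuples, e.g. ("add_table", Table(...));
    # an empty list means the schema matches the model metadata
    diffs = compare_metadata(migration_context, Base.metadata)

print(diffs)
```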
For the "assert upgrading / downgrading" works case, openstack has a stepwise utility here as well that does what they call a "Snakewalk": https://github.com/openstack/oslo.db/blob/master/oslo_db/sqlalchemy/test_migrations.py#L39 , however this is implemented in an abstract way in oslo.db as there are projects that use it either with Alembic or SQLAlchemy-migrate. the idea of snakewalk includes that every single migration is tested individually. openstack Nova does this though they are using sqlalchemy-migrate, for every new migration they have to write a new test as you can see here : https://github.com/openstack/nova/blob/master/nova/tests/unit/db/test_migrations.py#L296 those numbers like "_check_231" "_check_232" etc. are actual version numbers.
Within Alembic itself I'm not comfortable releasing and maintaining the "snakewalk" utility, as it is quite complicated and Alembic does not currently have a lot of development resources, but maybe this can be of use. I'm not really sure how to approach documenting this from Alembic, as I don't have a simple standalone example that can illustrate what these test suites do.
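Not the oslo.db utility itself, but a bare-bones sketch of the stepwise idea using only public Alembic APIs (the `alembic.ini` path is an assumption) might look like:

```python
from alembic import command
from alembic.config import Config
from alembic.script import ScriptDirectory

cfg = Config("alembic.ini")  # assumption: standard config location
script = ScriptDirectory.from_config(cfg)

# walk_revisions() yields newest-first, so reverse to start from the oldest
revisions = list(script.walk_revisions("base", "heads"))
revisions.reverse()

for rev in revisions:
    command.upgrade(cfg, rev.revision)  # step up one revision at a time
    command.downgrade(cfg, "-1")        # step back down...
    command.upgrade(cfg, rev.revision)  # ...and up again
```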
Luttik commented on Aug 13, 2020
Hey @zzzeek, thanks for the extensive answer, although it is not yet completely clear to me. What I want is something like this. Again, I feel like the presented code snippets get me 90% of the way, but the last 10% is obscured.
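Hypothetically, something of this shape (the `alembic_cfg` fixture and everything in the test bodies are assumptions, not an existing Alembic API):

```python
import pytest
from alembic import command
from alembic.config import Config

@pytest.fixture
def alembic_cfg():
    # assumption: the config points at a throwaway test database
    cfg = Config("alembic.ini")
    cfg.set_main_option("sqlalchemy.url", "sqlite:///test_migrations.db")
    return cfg

def test_upgrade_and_downgrade(alembic_cfg):
    command.upgrade(alembic_cfg, "head")    # assert that upgrading works
    command.downgrade(alembic_cfg, "base")  # assert that downgrading works

def test_migrations_match_models(alembic_cfg):
    command.upgrade(alembic_cfg, "head")
    # ...and then, somehow, obtain the autogenerate diff and assert it is
    # empty; this last part is what remains obscured
```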
zzzeek commented on Aug 13, 2020
OK here's how to run a command, like upgrade:
https://alembic.sqlalchemy.org/en/latest/api/commands.html
so I can paste these together:
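A sketch of that combination, assuming a throwaway file-based SQLite database and a `myapp.models` declarative base (both names are placeholders):

```python
from sqlalchemy import create_engine
from alembic import command
from alembic.config import Config
from alembic.migration import MigrationContext
from alembic.autogenerate import compare_metadata

from myapp.models import Base  # assumption: your declarative base

db_url = "sqlite:///check_migrations.db"  # throwaway file-based database
cfg = Config("alembic.ini")
cfg.set_main_option("sqlalchemy.url", db_url)

# run all migrations...
command.upgrade(cfg, "head")

# ...then diff the migrated schema against the current models
engine = create_engine(db_url)
with engine.connect() as conn:
    diffs = compare_metadata(MigrationContext.configure(conn), Base.metadata)

assert not diffs, f"migrations are not up to date with the models: {diffs}"
```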
try that.
Luttik commented on Aug 13, 2020
@zzzeek Thanks, that was enough info to get me there.
One thing that I had to change from my pretty default `migrations/env.py` (or for many, `alembic/env.py`) was replacing the (I believe standard) `run_migrations_online()` method so that it can pick up an externally supplied connection.

@zzzeek Or am I wrong again, and should I have utilized the `run_migrations_offline()` method in some way?

tl;dr: it works for me now, but I needed to do a small workaround.
zzzeek commented on Aug 14, 2020
The purpose of the `env.py` script is explicitly so that it can be customized for the application, so the fact that you were able to dive in there and do whatever you had to in order to "make it work" is a feature. In this case I forgot to finish reading my own docs, which is that yes, when that "connection" is stuck in there, we need to use the recipe at https://alembic.sqlalchemy.org/en/latest/cookbook.html#connection-sharing.

The `run_migrations_offline()` function is not used unless you use --sql mode, which is less common.
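That recipe amounts to roughly the following in `env.py` (a sketch, not a verbatim copy; see the cookbook link for the authoritative version):

```python
from sqlalchemy import engine_from_config, pool
from sqlalchemy.engine import Connection
from alembic import context

config = context.config
target_metadata = None  # assumption: set this to your models' MetaData

def do_run_migrations(connection):
    context.configure(connection=connection, target_metadata=target_metadata)
    with context.begin_transaction():
        context.run_migrations()

def run_migrations_online():
    # use a connection handed in from the outside, if one was provided
    connectable = config.attributes.get("connection", None)

    if connectable is None:
        # no outside connection: build an Engine from alembic.ini as usual
        connectable = engine_from_config(
            config.get_section(config.config_ini_section),
            prefix="sqlalchemy.",
            poolclass=pool.NullPool,
        )

    if isinstance(connectable, Connection):
        do_run_migrations(connectable)
    else:
        with connectable.connect() as connection:
            do_run_migrations(connection)
```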
Luttik commented on Aug 14, 2020
Yes, the `env.py` file was a stroke of genius.
ziima commented on Oct 19, 2020
Would it be possible to add a `--check` option to the `revision` command that returns a non-zero exit code if a non-empty migration is generated?

I admit that it's inspired by Django's `makemigrations --check`, see https://docs.djangoproject.com/en/3.1/ref/django-admin/#django-admin-makemigrations.

zzzeek commented on Oct 19, 2020
that sounds more like a fixture you want to have in your test suite. these are commonplace and are constructed using the approach at https://alembic.sqlalchemy.org/en/latest/api/autogenerate.html#alembic.autogenerate.compare_metadata . basically when you run your test suite, include a test that does compare_metadata and fails if any diffs are generated.
ziima commented on Oct 20, 2020
From my point of view, Alembic is an external tool for the application, so migration checks shouldn't be part of the application's unit tests. I place them on the same level as static analysis tools such as `mypy` or `flake8`. Hence I'd like to have a simple command to run the checks.

I found this tool, https://pypi.org/project/alembic-autogen-check/, which seems to do the job. But in my opinion it would still be worth having as part of the Alembic CLI.
zzzeek commented on Oct 20, 2020
oh, well I'd say we are both correct, in that flake8 and mypy are certainly run as part of test suites, but they are typically top-level targets in a tox.ini file, so you're right, this would be nice to have....
and... we have it! someone's already done this, so use that.
why ship it in Alembic itself? that only means it won't be maintained as well; Alembic is not actually keeping up with our existing issue load very well right now. it's just a line in your requirements file, and using plugins and extensions for testing things, so that the work of tool maintenance is spread among more people, is common.
it looks like you've put a few issues on there so let's see if @4Catalyzer is responsive.