Use alembic to manage SQLite schema changes? #378

Comments
Can you share any of these discussions with users? That would give me a more concrete idea of the problem this solves. I imagine it's the same issue I had between the |
I've been playing with this, and there are a few things I don't seem to have figured out, yet.
BUT, I think I have a basic migration script written for the old/new versions of the database (pre-/post- fastANI addition). |
I have set up the basic requirements for a new I was going to add a tag in the repo, prior to the merge of |
Actually, I have found an issue with the downgrade. The function I am running is written in accordance with examples I've seen:

def downgrade_old_db():
    with op.batch_alter_table("comparisons") as batch_op:
        batch_op.drop_column('kmersize')
        batch_op.drop_column('minmatch')

but nothing happens—note the lack of a 'Running downgrade ...' message here:

(pyani_dev) ! alembic downgrade base
INFO [alembic.env] Migrating database old_db
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.

Many sources on the internet claim that this is because SQLite does not support dropping columns. I have, however, confirmed that I can drop columns with my sqlite3 installation:

sqlite> .schema comparisons
CREATE TABLE comparisons (
comparison_id INTEGER NOT NULL,
query_id INTEGER NOT NULL,
subject_id INTEGER NOT NULL,
aln_length INTEGER,
sim_errs INTEGER,
identity FLOAT,
cov_query FLOAT,
cov_subject FLOAT,
program VARCHAR,
version VARCHAR,
fragsize INTEGER,
maxmatch BOOLEAN, kmersize INTEGER, minmatch FLOAT,
PRIMARY KEY (comparison_id),
UNIQUE (query_id, subject_id, program, version, fragsize, maxmatch),
FOREIGN KEY(query_id) REFERENCES genomes (genome_id),
FOREIGN KEY(subject_id) REFERENCES genomes (genome_id)
);
sqlite> alter table comparisons drop kmersize;
sqlite> .schema comparisons
CREATE TABLE comparisons (
comparison_id INTEGER NOT NULL,
query_id INTEGER NOT NULL,
subject_id INTEGER NOT NULL,
aln_length INTEGER,
sim_errs INTEGER,
identity FLOAT,
cov_query FLOAT,
cov_subject FLOAT,
program VARCHAR,
version VARCHAR,
fragsize INTEGER,
maxmatch BOOLEAN, minmatch FLOAT,
PRIMARY KEY (comparison_id),
UNIQUE (query_id, subject_id, program, version, fragsize, maxmatch),
FOREIGN KEY(query_id) REFERENCES genomes (genome_id),
FOREIGN KEY(subject_id) REFERENCES genomes (genome_id)
);
sqlite> |
IIRC, |
Does alembic provide no error message or other useful information (and appear to have "worked") without

A quick search turned up at least one workaround someone implemented, but we shouldn't have to do that.

I've seen two mentions of a

Miguel Grinberg has a blog post outlining a possible solution.

I expect you'll have seen and tried all of these, mind. |
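For reference, I believe the workaround in that blog post amounts to switching on alembic's batch mode when the migration context is configured in `env.py`. A minimal sketch, assuming an otherwise stock `env.py` like the one posted later in this thread (note that `render_as_batch` mainly affects what `--autogenerate` renders, so it may not explain the silent downgrade by itself):

```python
# Sketch of the batch-mode workaround for SQLite's limited ALTER TABLE support.
from sqlalchemy import engine_from_config, pool
from alembic import context

config = context.config
target_metadata = None


def run_migrations_online():
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            render_as_batch=True,  # emit "move and copy" batch operations for SQLite
        )
        with context.begin_transaction():
            context.run_migrations()
```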
By default it does not produce any error message, and I have tried including the

I was attempting to verify that the issue was with drop table, and not just downgrading specifically, by creating a second revision so I had more things to step through. However, now I am stuck in a loop where

I think this issue started when I manually deleted columns, to prove I could, though I've since recreated the database. Perhaps something is just not hooked up quite right.... |
I seem to have hacked my way through a series of arcane errors I could find little information about, to something that works. I now have two migration scripts that I am able to use to both upgrade and downgrade a database reliably. The difference from what I had before is that I did not use the batch_alter_table() context—I switched from adding the columns inside one:

with op.batch_alter_table("comparisons") as batch_op:
    batch_op.add_column(sa.Column('kmersize', sa.Integer))
    batch_op.add_column(sa.Column('minmatch', sa.Float))

to doing so outside of one:

op.add_column('comparisons', sa.Column('kmersize', sa.Integer))
op.add_column('comparisons', sa.Column('minmatch', sa.Float))

This change was necessary because I got the following error, and nothing else both prevents the error and still performs the changes. I have not been able to figure out why this caused an error when, for instance, I successfully used the former format yesterday.

Traceback (most recent call last):
File "/Users/baileythegreen/Software/miniconda3/bin/alembic", line 8, in <module>
sys.exit(main())
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/config.py", line 588, in main
CommandLine(prog=prog).main(argv=argv)
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/config.py", line 582, in main
self.run_cmd(cfg, options)
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/config.py", line 559, in run_cmd
fn(
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/command.py", line 320, in upgrade
script.run_env()
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/script/base.py", line 563, in run_env
util.load_python_file(self.dir, "env.py")
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 92, in load_python_file
module = load_module_py(module_id, path)
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 108, in load_module_py
spec.loader.exec_module(module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 848, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/Users/baileythegreen/Software/pyani/scratch/alembic/env.py", line 72, in <module>
run_migrations_online()
File "/Users/baileythegreen/Software/pyani/scratch/alembic/env.py", line 66, in run_migrations_online
context.run_migrations()
File "<string>", line 8, in run_migrations
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/runtime/environment.py", line 851, in run_migrations
self.get_context().run_migrations(**kw)
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/runtime/migration.py", line 620, in run_migrations
step.migration_fn(**kw)
File "/Users/baileythegreen/Software/pyani/scratch/alembic/versions/65538af5a5e1_add_nonsense_column.py", line 21, in upgrade
batch_op.add_column(sa.Column('nonsense', sa.Integer))
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/contextlib.py", line 120, in __exit__
next(self.gen)
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/operations/base.py", line 374, in batch_alter_table
impl.flush()
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/operations/batch.py", line 101, in flush
should_recreate = self._should_recreate()
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/operations/batch.py", line 94, in _should_recreate
return self.operations.impl.requires_recreate_in_batch(self)
File "/Users/baileythegreen/Software/miniconda3/lib/python3.8/site-packages/alembic/ddl/sqlite.py", line 56, in requires_recreate_in_batch
and col.server_default.persisted
AttributeError: 'NoneType' object has no attribute 'persisted' |
I have no idea how this manages to get around the issue I saw before when nothing seemed to happen on downgrade. |
Does this seem like the right way to incorporate |
I'm not familiar enough with alembic to say. What I get from the thread linked above seems sensible. If it does what we need, doesn't break anything else, and doesn't impose any outrageous requirements, that's good enough for me. |
I meant more from a Python packaging standpoint. All this seems to do is create two empty

I think the files that need to be included/given to users, and the correct structure for them is:
I have a subcommand working that will run upgrades and downgrades, and capture the output/errors from those. The main questions now are:
|
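For context, one way a subcommand could drive the migrations without shelling out to the alembic CLI is through alembic's Python API; the sketch below is illustrative only—the helper name, script location, and URL are placeholders rather than pyani's actual layout:

```python
from alembic import command
from alembic.config import Config


def run_migration(direction: str, db_url: str, revision: str) -> None:
    """Run an alembic upgrade or downgrade against db_url.

    Hypothetical helper: the script_location below is an assumed path.
    """
    cfg = Config()  # no alembic.ini needed if the options are set programmatically
    cfg.set_main_option("script_location", "alembic")  # assumed migrations directory
    cfg.set_main_option("sqlalchemy.url", db_url)

    if direction == "upgrade":
        command.upgrade(cfg, revision)    # e.g. revision="head"
    elif direction == "downgrade":
        command.downgrade(cfg, revision)  # e.g. revision="base" or "-1"
    else:
        raise ValueError(f"unknown direction: {direction!r}")


# Example (paths are placeholders):
# run_migration("upgrade", "sqlite:////absolute/path/to/pyanidb", "head")
```

Because alembic reports progress through the logging module, the subcommand could attach its own handler to capture that output alongside any exceptions.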
That doesn't seem unreasonable but, as I understand it, the |
I imagined that there would be a way to present only the |
I don't really know what this refers to, or should point to. I would have assumed the up/downgrade could run with a default.

Is this issue relevant? sqlalchemy/alembic#606 |
For
Right now,

This is what I mean by specifying the driver & database location:
The
This looks like they're using an environmental variable, so maybe, if you think that's an acceptable way forward. I still need to do some digging to see where the script location stuff is used. |
Examples of my current
and

from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from alembic import context
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_offline():
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online():
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
connectable = engine_from_config(
config.get_section(config.config_ini_section),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(
connection=connection, target_metadata=target_metadata
)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online() |
Environmental variables are fair game. When populating, |
Okay, good to know. I'll see if I can get that working. I have been waiting to push stuff because technically right now it won't run; without those values, it's going to throw an error. |
I'm looking forward to seeing how you manage the tests for this, too. |
That's what I'll be looking at after I try this stuff with environmental variables. I have ideas for what I'll do with tests. |
The issue described here has been resolved (and I felt really silly once it was). Skip to here if you want to bypass irrelevant questions/discussion.

I have thus far failed to get the environment variable plan working. I have managed to set them, and I have the syntax for retrieving them correct. However, I am unable to get their values assigned to the sqlalchemy.url option. An example of how I'm trying to reset them, where $DATABASE holds the database path:

dbpath = os.environ.get('DATABASE')
url = "sqlite:////Users/baileythegreen/Software/pyani/" + dbpath
config.set_main_option("sqlalchemy.url", url)

From what I understand, this is the correct API method for what I want to do (eliminate the |
It's hard to comment on the first part without seeing the error message or practically taking a look.

In the second part, this line:

url = "sqlite:////Users/baileythegreen/Software/pyani/" + dbpath

looks suspect, to me. However, it's not clear what's in the dbpath variable. The required format, described in an earlier post, is essentially:

It's not clear to me that your string, concatenated with dbpath, gives that format. Have you tried:

and what was/is the result? |
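In case it helps, here is a small sketch of one defensive way `env.py` could build the URL from that environment variable; the `DATABASE` name matches the snippet above, and resolving to an absolute path is an assumption about what is wanted:

```python
import os
from pathlib import Path

from alembic import context

config = context.config  # only available when run by alembic, as in env.py

dbpath = os.environ.get("DATABASE")
if dbpath is None:
    raise RuntimeError("DATABASE environment variable is not set")

# SQLite URLs are "sqlite:///" followed by the path; with an absolute POSIX path
# this yields four slashes in total (sqlite:////Users/...). Resolving first avoids
# depending on the directory alembic happens to be run from.
url = "sqlite:///" + str(Path(dbpath).resolve())
config.set_main_option("sqlalchemy.url", url)
```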
FEATURE REQUEST: the database could be copied to a backup before the in-place modification starts. |
To resummarise where this is (will potentially be expanded as I think of other things to add): (Branch with code found here.)
CLI help information for
|
I have chased down the issue underlying something weird I was seeing, which somewhat derailed my writing tests for this.

Weird thing I noticed

If I took a copy of an old database (pre-fastANI addition) and ran my migration scripts against it, the downgrade step failed with:

Traceback (most recent call last):
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1802, in _execute_context
self.dialect.do_execute(
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 732, in do_execute
cursor.execute(statement, parameters)
sqlite3.OperationalError: error in table comparisons after drop column: no such column: kmersize
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/bin/alembic", line 8, in <module>
sys.exit(main())
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/config.py", line 588, in main
CommandLine(prog=prog).main(argv=argv)
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/config.py", line 582, in main
self.run_cmd(cfg, options)
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/config.py", line 559, in run_cmd
fn(
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/command.py", line 366, in downgrade
script.run_env()
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/script/base.py", line 563, in run_env
util.load_python_file(self.dir, "env.py")
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 92, in load_python_file
module = load_module_py(module_id, path)
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 108, in load_module_py
spec.loader.exec_module(module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 843, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "alembic/env.py", line 104, in <module>
run_migrations_online()
File "alembic/env.py", line 79, in run_migrations_online
context.run_migrations()
File "<string>", line 8, in run_migrations
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/runtime/environment.py", line 851, in run_migrations
self.get_context().run_migrations(**kw)
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/runtime/migration.py", line 620, in run_migrations
step.migration_fn(**kw)
File "/Users/baileythegreen/Software/pyani/alembic/versions/92f7f6b1626e_add_fastani_columns.py", line 28, in downgrade
op.drop_column("comparisons", "kmersize")
File "<string>", line 8, in drop_column
File "<string>", line 3, in drop_column
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/operations/ops.py", line 2189, in drop_column
return operations.invoke(op)
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/operations/base.py", line 392, in invoke
return fn(self, operation)
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/operations/toimpl.py", line 89, in drop_column
operations.impl.drop_column(
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/ddl/impl.py", line 333, in drop_column
self._exec(base.DropColumn(table_name, column, schema=schema))
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/alembic/ddl/impl.py", line 197, in _exec
return conn.execute(construct, multiparams)
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1289, in execute
return meth(self, multiparams, params, _EMPTY_EXECUTION_OPTS)
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/sql/ddl.py", line 80, in _execute_on_connection
return connection._execute_ddl(
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1381, in _execute_ddl
ret = self._execute_context(
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1845, in _execute_context
self._handle_dbapi_exception(
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 2026, in _handle_dbapi_exception
util.raise_(
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
raise exception
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1802, in _execute_context
self.dialect.do_execute(
File "/Users/baileythegreen/Software/miniconda3/envs/pyani_dev/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 732, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) error in table comparisons after drop column: no such column: kmersize
[SQL: ALTER TABLE comparisons DROP COLUMN kmersize]
(Background on this error at: https://sqlalche.me/e/14/e3q8)

This puzzled me, because the column is clearly present in the database. Upon further inspection, it seems to be a difference in the unique constraints specified in the updated ORM, versus those achieved by my migration script to update the old database. Schema for
|
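One way to keep the two in step—just a sketch, not what pyani currently does—is to give the UNIQUE constraint an explicit name in the ORM so a migration can drop and recreate it by that name; the model below is abbreviated, and the constraint name simply follows the names used later in this thread:

```python
from sqlalchemy import Boolean, Column, Float, Integer, String, UniqueConstraint
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Comparison(Base):
    """Illustrative model only; columns not relevant to the constraint are omitted."""

    __tablename__ = "comparisons"

    comparison_id = Column(Integer, primary_key=True)
    query_id = Column(Integer, nullable=False)
    subject_id = Column(Integer, nullable=False)
    program = Column(String)
    version = Column(String)
    fragsize = Column(Integer)
    maxmatch = Column(Boolean)
    kmersize = Column(Integer)
    minmatch = Column(Float)

    __table_args__ = (
        # Naming the constraint lets a migration target it unambiguously with
        # batch_op.drop_constraint() / batch_op.create_unique_constraint().
        UniqueConstraint(
            "query_id",
            "subject_id",
            "program",
            "version",
            "fragsize",
            "maxmatch",
            "kmersize",
            "minmatch",
            name="fastani_reqs",  # name taken from the migration snippets below
        ),
    )
```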
sigh Apparently dropping unique constraints is one of the things SQLite doesn't like to do, so this migration just got a lot harder.

I have also been thinking about the utility of the downgrade option, and have been led to question it. If someone has a new database that contains any fastANI results, there are several ways the downgrade could handle those rows. We're creating a backup, so none of these would be catastrophic, but one may be preferable. |
The above issue (re: constraints) seems to be solved using the contextual version of the batch operations (op.batch_alter_table()).

tl;dr: I have the database successfully upgrading and downgrading, new and old databases alike. Now I just need to decide whether warnings are needed for downgrading, et cetera, and to finish my tests. |
My first thought is that we should aim to preserve the necessary metadata for a run, which includes - for

I think we are justified as we backup, and do not destroy, the original, late-format database. I would see downgrading the database from |
I agree it's likely to be an unusual case, but if I'm supplying code that will perform such a downgrade (and I currently am), then I need to think about such consequences for the user. Will write code to remove the rows, and do so loudly, with specific messages pointing to the backup. |
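A sketch of what that removal might look like inside the downgrade, assuming fastANI rows can be identified by the program column; the message wording and logger are placeholders:

```python
import logging

import sqlalchemy as sa
from alembic import op

logger = logging.getLogger(__name__)


def downgrade():
    # Remove rows that only make sense in the fastANI-aware schema, and say so
    # loudly, pointing the user at the backup copy of the database.
    conn = op.get_bind()
    nrows = conn.execute(
        sa.text("SELECT COUNT(*) FROM comparisons WHERE program = 'fastani'")
    ).scalar()
    if nrows:
        logger.warning(
            "Downgrade will delete %d fastANI comparisons; "
            "they remain available in the backed-up copy of the database.",
            nrows,
        )
        conn.execute(sa.text("DELETE FROM comparisons WHERE program = 'fastani'"))

    # ...followed by the batch_alter_table() block that drops the fastANI columns.
```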
Ran into a new issue when working on the offline mode. It seems the modification of constraints requires reflection, which is fine when there is a connection to the database, but requires some extra steps to do offline. The error I got related to this (which is sufficiently arcane) was:
The useful solution I have found is from the docs here (https://hellowac.github.io/alembic_doc/en/batch.html#working-in-offline-mode):

meta = MetaData()
some_table = Table(
'some_table', meta,
Column('id', Integer, primary_key=True),
Column('bar', String(50))
)
with op.batch_alter_table("some_table", copy_from=some_table) as batch_op:
batch_op.add_column(Column('foo', Integer))
batch_op.drop_column('bar') This basically requires the table schema to be included in the migration file (or somewhere, anyway). A bit annoying, but easy enough to do, once you figure out you need to do it. And I got a second error when dealing with the first:
The deal with this one seems to be that I hadn't yet created two dummy tables in the migration script—one containing the added columns and updated constraints, and one modelling the base settings—for the different manipulations to be done on, respectively. |
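For the record, a sketch of what those two dummy tables might look like in a migration script; the column lists and constraints are abbreviated, and only the copy_from mechanics for offline (batch) mode are the point here:

```python
import sqlalchemy as sa
from alembic import op

# Dummy metadata describing the table as it exists *before* each operation, so
# that batch mode can recreate the table offline, without reflecting it from a
# live connection. Column lists are abbreviated; constraints are omitted here.
base_meta = sa.MetaData()
base_comparisons = sa.Table(
    "comparisons",
    base_meta,
    sa.Column("comparison_id", sa.Integer, primary_key=True),
    sa.Column("query_id", sa.Integer, nullable=False),
    sa.Column("subject_id", sa.Integer, nullable=False),
    sa.Column("fragsize", sa.Integer),
    sa.Column("maxmatch", sa.Boolean),
    # ... remaining pre-fastANI columns and the "base_reqs" constraint ...
)

fastani_meta = sa.MetaData()
fastani_comparisons = sa.Table(
    "comparisons",
    fastani_meta,
    sa.Column("comparison_id", sa.Integer, primary_key=True),
    sa.Column("query_id", sa.Integer, nullable=False),
    sa.Column("subject_id", sa.Integer, nullable=False),
    sa.Column("fragsize", sa.Integer),
    sa.Column("maxmatch", sa.Boolean),
    sa.Column("kmersize", sa.Integer),
    sa.Column("minmatch", sa.Float),
    # ... remaining post-fastANI columns and the "fastani_reqs" constraint ...
)


def upgrade():
    # Upgrading starts from the base schema, so copy from the base table.
    with op.batch_alter_table("comparisons", copy_from=base_comparisons) as batch_op:
        batch_op.add_column(sa.Column("kmersize", sa.Integer))
        batch_op.add_column(sa.Column("minmatch", sa.Float))


def downgrade():
    # Downgrading starts from the fastANI schema, so copy from that table.
    with op.batch_alter_table("comparisons", copy_from=fastani_comparisons) as batch_op:
        batch_op.drop_column("kmersize")
        batch_op.drop_column("minmatch")
```

Using two separate MetaData objects avoids describing the same table name twice in one collection, which is, I think, the clash described above.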
Saw the weird error mentioned in #378 (comment) again. This time I seem to have managed to resolve it by swapping the order of some lines, e.g.:

batch_op.drop_constraint("base_reqs")
batch_op.add_column(sa.Column("kmersize", sa.Integer, default=None))
batch_op.add_column(sa.Column("minmatch", sa.Float, default=None))
batch_op.create_unique_constraint(
"fastani_reqs",
[
"query_id",
"subject_id",
"program",
"version",
"fragsize",
"maxmatch",
"kmersize",
"minmatch",
],
)

instead of:

batch_op.add_column(sa.Column("kmersize", sa.Integer, default=None))
batch_op.add_column(sa.Column("minmatch", sa.Float, default=None))
batch_op.drop_constraint("base_reqs")
batch_op.create_unique_constraint(
"fastani_reqs",
[
"query_id",
"subject_id",
"program",
"version",
"fragsize",
"maxmatch",
"kmersize",
"minmatch",
],
)

I still do not know why this makes sense. |
Tests are passing locally for me—exclusive of

Context

I completely scrapped the tests from before, rewriting them based on the meeting last week. They now work by first creating a dump from an input database, using that dump to initialise a new database, which is then upgraded or downgraded, and then comparing a dump of the up/downgraded database against a dump of a target. This is done with |
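For review purposes, a simplified sketch of the shape of such a test; the fixture paths, the DATABASE environment variable, and the direct alembic invocation are stand-ins rather than the actual test code:

```python
import os
import shutil
import subprocess
from pathlib import Path


def dump_db(dbfile: Path) -> str:
    """Return the output of `sqlite3 <dbfile> .dump` for comparison."""
    result = subprocess.run(
        ["sqlite3", str(dbfile), ".dump"],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout


def test_upgrade(tmp_path: Path):
    # Hypothetical fixture files: a pre-fastANI input database and the expected
    # post-upgrade target; the real names and locations will differ.
    indb = tmp_path / "old_pyanidb"
    shutil.copy("tests/fixtures/old_pyanidb", indb)

    # Run the upgrade against the throwaway copy. This assumes env.py reads the
    # database location from the DATABASE environment variable, as discussed above.
    subprocess.run(
        ["alembic", "upgrade", "head"],
        check=True,
        env={**os.environ, "DATABASE": str(indb)},
    )

    assert dump_db(indb) == dump_db(Path("tests/fixtures/upgraded_pyanidb"))
```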
When I've had similar CircleCI issues in the past, I've had to pay particular attention to $PATH.

It can be useful for diagnosis to insert a step in the CircleCI config that checks for the presence of the executable you want in the $PATH.

On the whole your new approach to the tests sounds good. |
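As a small in-Python counterpart to that presence check—only a sketch; the CircleCI-side check would live in the config itself—the test suite could skip cleanly when the CLI is absent:

```python
import shutil

import pytest

# Skip the dump-comparison tests cleanly if the sqlite3 CLI is not on $PATH,
# rather than letting them fail part-way through with exit code 127.
requires_sqlite3 = pytest.mark.skipif(
    shutil.which("sqlite3") is None,
    reason="sqlite3 executable not found on $PATH",
)


@requires_sqlite3
def test_downgrade_dump_matches_target():
    ...  # dump/compare logic as described above
```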
My less sophisticated approach thus far has been to put extra shell commands in the Makefile action:

wget https://www.sqlite.org/2022/sqlite-tools-linux-x86-3380100.zip
unzip sqlite-tools-linux-x86-3380100.zip
ls -l sqlite-tools-linux-x86-3380100/sqlite3
echo $PWD
echo 'export PATH=$PWD/sqlite-tools-linux-x86-3380100/sqlite3:$PATH' >> $BASH_ENV
source $BASH_ENV
echo $PATH
/home/circleci/repo/sqlite-tools-linux-x86-3380100/sqlite3 -help
sqlite3 -help

which gives:

--2022-03-14 22:15:44-- https://www.sqlite.org/2022/sqlite-tools-linux-x86-3380100.zip
Resolving www.sqlite.org (www.sqlite.org)... 45.33.6.223, 2600:3c00::f03c:91ff:fe96:b959
Connecting to www.sqlite.org (www.sqlite.org)|45.33.6.223|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2232504 (2.1M) [application/zip]
Saving to: ‘sqlite-tools-linux-x86-3380100.zip’
sqlite-tools-linux- 100%[===================>] 2.13M 9.51MB/s in 0.2s
2022-03-14 22:15:44 (9.51 MB/s) - ‘sqlite-tools-linux-x86-3380100.zip’ saved [2232504/2232504]
Archive: sqlite-tools-linux-x86-3380100.zip
creating: sqlite-tools-linux-x86-3380100/
inflating: sqlite-tools-linux-x86-3380100/sqlite3
inflating: sqlite-tools-linux-x86-3380100/sqlite3_analyzer
inflating: sqlite-tools-linux-x86-3380100/sqldiff
-rwxrwxr-x 1 circleci circleci 1202884 Mar 12 14:05 sqlite-tools-linux-x86-3380100/sqlite3
/home/circleci/repo
/home/circleci/repo/sqlite-tools-linux-x86-3380100/sqlite3:/home/circleci/repo:/home/circleci/repo/blast-2.2.26/bin:/home/circleci/repo:/home/circleci/repo/blast-2.2.26/bin:/home/circleci/.local/bin:/home/circleci/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
/bin/bash: line 7: /home/circleci/repo/sqlite-tools-linux-x86-3380100/sqlite3: No such file or directory
Exited with code exit status 127
CircleCI received exit code 127

The failure has occurred whenever I try to invoke sqlite3. I don't know if any of this obviates your suggestion with the artefacts, but I'll start looking into that, as I haven't used them before. |
If you're modifying the Makefile, note that changes specific to CircleCI integration should be made in the CircleCI configuration instead. Examples of how to handle a specific installation for CircleCI can be seen in that file, e.g.:

- run:
    name: install legacy BLAST
    command: |
      curl -o legacy_blast.tar.gz ftp://ftp.ncbi.nlm.nih.gov/blast/executables/legacy.NOTSUPPORTED/2.2.26/blast-2.2.26-x64-linux.tar.gz
      tar -zxf legacy_blast.tar.gz
      echo 'export PATH=$PWD/blast-2.2.26/bin:$PATH' >> $BASH_ENV
      source $BASH_ENV

Note that the

In the same section you should be able to

This kind of workflow keeps everything involved in debugging/fixing the CircleCI runs neatly in one file, and easily removable when it is no longer required.

Since

If you need to install the executable, it may be simpler to use the built-in package management, which should place the executable somewhere visible, e.g.:

- run:
    name: install third-party tools
    command: |
      sudo apt-get install csh mummer ncbi-blast+ sqlite3 |
I described what I did poorly/inaccurately. I have been editing the |
Have you tried the |
No, I don't tend to think about that because I've never really been able to use it. You might be right about that being the way forward, though. I did look for a

Would it be |
Try appending |
Summary:

Use alembic or similar to manage database migration.

Description:

In discussions with users, modifications to the SQLite database schema have caused issues where updating their v0.3 installation has led to incompatibility with an existing database, and resulting errors.

We could potentially avoid this by carefully versioning the SQLite/SQLAlchemy schema and using alembic or similar to aid migration between versions in most cases.

If users maintain large analyses in their databases, we should avoid discouraging them from trusting the package by breaking backwards-compatibility.

Having used this before with Flask applications, I'm confident we could usually achieve seamless in-place upgrades of users' existing local databases.