
Mock database connection #31

Open
gzt5142 opened this issue Feb 8, 2023 · 4 comments


gzt5142 commented Feb 8, 2023

Current testing depends too heavily on the environment. Configure a better mock for the db for the unit-tests.

@gzt5142 gzt5142 self-assigned this Feb 8, 2023

gzt5142 commented Feb 8, 2023

I think this will be straightforward... currently, I create a fixture, dal, that all tests use as the data access layer:

@pytest.fixture(scope="session")
def dal():
    url = URL.create(
        "postgresql+psycopg",
        username="nldi_schema_owner",
        password="changeMe",
        host="172.18.0.1",
        port="5432",
        database="nldi",
    )
    _dal = DataAccessLayer(url)
    yield _dal
    _dal.disconnect()

I think that I could instead create a SQLAlchemy engine and session connected to a CSV or SQLite database stored in the test folder. It only needs to hold the sources table.


gzt5142 commented Feb 15, 2023

Well... it's almost easy.

@pytest.fixture(scope="session")
def dummy_dal():
    class DummyDataAccessLayer(DataAccessLayer):
        def connect(self):
            if self.engine is None:
                self.engine = create_engine(self.uri, echo=False)

    _dal = DummyDataAccessLayer('sqlite:///:memory:')
    _dal.connect()
    _crawlers = pd.read_csv(r"./notebooks/crawler_source.tsv", delimiter="\t").set_index('crawler_source_id')
    _crawlers.to_sql("nldi_data.crawler_source", _dal.engine, if_exists='replace')

    yield _dal
    _dal.disconnect()

That does everything I want, except name the table correctly.

The naming convention within PostgreSQL (the back end for nldi-db) uses a 'schema' to separate groups of tables. The nldi_data schema is where the crawler source table lives.

I cannot seem to get an in-memory sqlite database to create this structure. sqlite does not have 'schema' semantics in the PostgreSQL sense. The crawler has to specify the schema in order to find the crawler source table (and all the other nldi tables), but I can't get that naming hierarchy to work with a dummy sqlite database stored in memory.
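For the record, sqlite can fake a schema-like namespace at the connection level with ATTACH DATABASE, which makes `nldi_data.crawler_source` a valid qualified name. A minimal sketch with the stdlib sqlite3 module (the table columns here are abbreviated, not the real crawler_source layout); hooking this into SQLAlchemy would additionally require issuing the ATTACH on every new connection, e.g. from a "connect" event listener:

```python
import sqlite3

# Emulate the PostgreSQL "nldi_data" schema: ATTACH a second in-memory
# database under that name, so tables can be addressed as nldi_data.<table>.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS nldi_data")
conn.execute(
    "CREATE TABLE nldi_data.crawler_source ("
    "  crawler_source_id INTEGER PRIMARY KEY,"
    "  source_name TEXT)"
)
conn.execute("INSERT INTO nldi_data.crawler_source VALUES (1, 'demo source')")

# Schema-qualified reads now work the way the crawler expects.
rows = conn.execute("SELECT source_name FROM nldi_data.crawler_source").fetchall()
print(rows)  # [('demo source',)]
```

The catch is that the ATTACH is per-connection state, which is part of why this gets awkward behind a connection pool.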


gzt5142 commented Feb 15, 2023

As temptingly simple as the in-memory sqlite database is, the schema structure is critical to this app and its testing harness, so the mock will have to get more complicated.

Will shift gears to this: https://github.com/ClearcodeHQ/pytest-postgresql
A stable pytest plugin for mocking PostgreSQL connections.
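Roughly what the conftest.py could look like with that plugin. This is a sketch under assumptions: pytest-postgresql is installed, a local PostgreSQL install is on PATH for the process factory, and the fixture names (`pg_proc`, `pg_conn`) are hypothetical, not from this repo:

```python
# conftest.py sketch -- ASSUMPTIONS: pytest-postgresql installed, local
# PostgreSQL binaries available. Fixture names here are hypothetical.
import pytest
from pytest_postgresql import factories
from sqlalchemy.engine import URL

from nldi_crawler.db import DataAccessLayer  # assumed project import path

# Throwaway server process for the test session (random free port).
pg_proc = factories.postgresql_proc(port=None)

# Per-test client connection to a fresh database on that server.
pg_conn = factories.postgresql("pg_proc", dbname="nldi")


@pytest.fixture
def dal(pg_conn):
    """Build the same URL the real fixture builds, but pointed at the mock."""
    info = pg_conn.info  # connection parameters chosen by the plugin
    url = URL.create(
        "postgresql+psycopg",
        username=info.user,
        host=info.host,
        port=info.port,
        database=info.dbname,
    )
    _dal = DataAccessLayer(url)
    yield _dal
    _dal.disconnect()
```

Tests would then populate nldi_data.crawler_source per-test, since each test gets a clean database.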


gzt5142 commented Apr 20, 2023

The issue of mocking a db connection is still relevant for the ingest workflows...

but for the crawler-source parsing, I have refactored to use a "repository" pattern, where it is easier to fake the backing store for the table of sources. For such a small table that we only read from, this is a simple (and more powerful) pattern that makes testing easier.
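The repository idea can be sketched as follows. The class and field names are illustrative, not the actual refactor in this repo; the point is that tests construct a FakeSourceRepository instead of standing up any database at all:

```python
# Sketch of the repository pattern for crawler sources (names hypothetical).
from dataclasses import dataclass
from typing import Protocol


@dataclass(frozen=True)
class CrawlerSource:
    crawler_source_id: int
    source_name: str
    source_uri: str


class SourceRepository(Protocol):
    """What the crawler needs from any backing store of sources."""

    def get(self, source_id: int) -> CrawlerSource: ...
    def list_all(self) -> list[CrawlerSource]: ...


class FakeSourceRepository:
    """In-memory stand-in for the database-backed repository; used in tests."""

    def __init__(self, sources: list[CrawlerSource]):
        self._sources = {s.crawler_source_id: s for s in sources}

    def get(self, source_id: int) -> CrawlerSource:
        return self._sources[source_id]

    def list_all(self) -> list[CrawlerSource]:
        return list(self._sources.values())
```

A real implementation with the same two methods would wrap the SQLAlchemy session; the crawler code only sees the protocol, so tests never touch a connection.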
