@shmax (Contributor) commented Aug 20, 2023

Let's fix the tests!

  • added a docker-compose.yml file to easily install MySQL, PostgreSQL, and Memcache (just run docker-compose up -d; a sketch of such a file follows this list)
  • fixed hundreds of little modern PHP issues in the tests
  • got rid of the bizarre SnakeCaseTestCase thing.
  • set up GitHub Actions CI (not totally working yet)
  • replaced pear/log with modern monolog/monolog
  • fixed lots of little typos in the comments
  • finally threw out the OciAdapter; it never really worked, and as far as I can tell the OCI driver for PDO has been abandoned
  • updated to modern version of PHPUnit
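
For concreteness, a compose file along those lines might look like the sketch below; the image tags, credentials, and ports are illustrative assumptions, not necessarily what this PR ships:

```yaml
# Hedged sketch of a docker-compose.yml for the test services.
# Image tags, credentials, and ports are assumptions for illustration.
version: "3.8"
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: test
      MYSQL_DATABASE: activerecord_test
    ports:
      - "3306:3306"
  postgres:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: test
      POSTGRES_DB: activerecord_test
    ports:
      - "5432:5432"
  memcached:
    image: memcached:alpine
    ports:
      - "11211:11211"
```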

@shmax changed the title from "update phpunit" to "Fix tests" Aug 20, 2023
@ipundit (Contributor) commented Aug 21, 2023

I don't use SQLite myself, but isn't it just one of the optional database adapters? I see three areas that need to be tested:

  1. The ActiveRecord business logic, which eventually generates a SQL statement. This doesn't need a database to test, since you can just compare the resulting SQL statement (which is a string) to the expected string (see the sketch after this list).
  2. The adapters. These are cumbersome to test because you have to install their respective databases. SQLite is one of those database adapters.
  3. Integration testing between 1) and 2)
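
To illustrate point 1, a test can assert directly on the generated SQL string with no database connection at all. This is a minimal sketch: `QueryBuilder` is a hypothetical stand-in for whatever class in the library assembles SQL, and the real class and method names may differ.

```php
<?php
use PHPUnit\Framework\TestCase;

// `QueryBuilder` is a hypothetical stand-in, not the library's actual
// API. The point is that asserting on the generated SQL string
// requires no database connection.
final class SqlGenerationTest extends TestCase
{
    public function testWhereClauseGeneration(): void
    {
        $sql = (new QueryBuilder('books'))
            ->where(['author_id' => 1])
            ->toSql();

        $this->assertSame('SELECT * FROM books WHERE author_id = ?', $sql);
    }
}
```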

I run MySQL, so I know that adapter works. If the test suite were working, I'd be willing to run all the tests for MySQL only, whereas another developer who runs another database, say Postgres, would be willing to run the tests for only Postgres.

Detecting which database is installed can be automated in PHPUnit:
https://docs.phpunit.de/en/10.3/writing-tests-for-phpunit.html#skipping-tests
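
For example, an adapter test class could skip itself when the matching PDO driver isn't loaded. A minimal sketch, with illustrative class and extension names:

```php
<?php
use PHPUnit\Framework\TestCase;

class MysqlAdapterTest extends TestCase
{
    protected function setUp(): void
    {
        // Skip every test in this class when the MySQL PDO driver is absent.
        if (!extension_loaded('pdo_mysql')) {
            $this->markTestSkipped('The pdo_mysql extension is not available.');
        }
    }

    // ... adapter tests ...
}
```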

Why not skip all the SQLite tests if you don't have that database installed (or can't detect it), and then write integration tests for the database you do have installed? If the code coverage report shows that you've covered your database adapter, then you have a working test suite that can be checked in. Other developers who run the other databases can run code coverage for their setups and verify that their adapters work too, provided the integration tests pass on their setups.

@shmax (Contributor, Author) commented Aug 21, 2023

Well, if I'm going to be maintaining this project in good faith, I figure I may as well sort this stuff out once and for all. There are actually three different databases that the adapter tests exercise:

  • MySQL
  • PostgreSQL
  • SQLite

It is indeed super painful to have to set all that stuff up locally just to run the tests, so my plan is to jam them all into a Docker container so it can all be set up automatically in a cross-platform way.

The only problem is that I'm not super comfortable with Docker, so it's very slow going. If you have experience with it and would like to participate, I can push what I have so far.

@ipundit (Contributor) commented Aug 21, 2023

Sorry, I don't have experience with Docker. But even if you were to set up a container, would developers be willing to install such a huge setup? The database binaries alone are huge, and I wouldn't be willing to have that setup on my dev machine just for the occasional bug fix for databases I never use, let alone know or understand. But if I can use the database I already have installed (MySQL), then the cost of contributing to this project is manageable for me, because I have a personal interest in keeping this project current for the database I use. I'm sure other busy developers would do a similar cost/benefit analysis for the databases they use.

Someone still has to run code coverage tests for all the databases, and if you're committed to that, then all power to you :). Do you need a Docker setup if you're the only one who would practically be willing to do 100% code coverage testing in the near future? A Docker setup is nice, but is it a gating requirement for releasing what you have?

If you push a test suite that supports MySQL, I can take a look at running the tests and contributing code that I've already written to get my projects going, including support for MySQL stored procedures, if you're interested in that.

@shmax (Contributor, Author) commented Aug 21, 2023

> Sorry, I don't have experience with Docker. But even if you were to set up a container, would developers be willing to install such a huge setup? The database binaries alone are huge, and I wouldn't be willing to have that setup on my dev machine just for the occasional bug fix for databases I never use, let alone know or understand.

Well, you would be no worse off than if you decided to install each of the databases individually (minus the overhead of Docker itself, if you don't have it installed), and the general idea is to make it all an invisible part of the test CLI. Docker is available by default in GitHub Actions, but before I go too far with it I'll also take some time to figure out how the old repo ran the tests in CI.

> But if I can use the database I already have installed (MySQL), then the cost of contributing to this project is manageable for me, because I have a personal interest in keeping this project current for the database I use. I'm sure other busy developers would do a similar cost/benefit analysis for the databases they use.

It should be easy enough to pass an argument to the Docker startup command to install only the databases you're interested in, I guess. I'll keep it in mind.

> Someone still has to run code coverage tests for all the databases, and if you're committed to that, then all power to you :). Do you need a Docker setup if you're the only one who would practically be willing to do 100% code coverage testing in the near future? A Docker setup is nice, but is it a gating requirement for releasing what you have?

I'm just trying to modernize things so that people who want to go all-in on development don't have to start from zero with all the different databases; that includes myself, if I ever change machines or come back to this after another long absence.

@shmax (Contributor, Author) commented Aug 21, 2023

> figure out how the old repo ran the tests in CI

Looks like it all happens in .travis.yml. Travis evidently has MySQL and Postgres built in (not sure about SQLite), and it creates the databases in the before_script hook.
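
For context, the relevant section would have looked roughly like this (a reconstruction for illustration, not the exact contents of the old file; the database names are assumptions):

```yaml
# Hedged reconstruction of the relevant .travis.yml section.
before_script:
  - mysql -e 'CREATE DATABASE activerecord_test;'
  - psql -c 'CREATE DATABASE activerecord_test;' -U postgres
```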

@ipundit (Contributor) commented Aug 21, 2023

From https://glauth.github.io/docs/running-travis-ci-locally.html

> Travis CI is now a paid-only product, even for open source projects.

A replacement for local testing is to use PHPUnit 10's test runner wrapper to write setup/teardown code around the tests:
https://docs.phpunit.de/en/10.3/extending-phpunit.html#wrapping-the-test-runner

So before the tests are run, create the databases; after the tests are done, drop them. This would be a platform-independent way of replacing Travis. If developers have those databases installed on their machines, then those databases are created and those tests are run.
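
A minimal sketch of such an extension, assuming PHPUnit 10's event-subscriber API; the class name and the SQL in the comments are illustrative, not code from this repository:

```php
<?php
// Hedged sketch, assuming PHPUnit 10's extension/event API.
use PHPUnit\Event\TestRunner\ExecutionFinished;
use PHPUnit\Event\TestRunner\ExecutionFinishedSubscriber;
use PHPUnit\Event\TestRunner\ExecutionStarted;
use PHPUnit\Event\TestRunner\ExecutionStartedSubscriber;
use PHPUnit\Runner\Extension\Extension;
use PHPUnit\Runner\Extension\Facade;
use PHPUnit\Runner\Extension\ParameterCollection;
use PHPUnit\TextUI\Configuration\Configuration;

final class DatabaseLifecycleExtension implements Extension
{
    public function bootstrap(
        Configuration $configuration,
        Facade $facade,
        ParameterCollection $parameters
    ): void {
        $facade->registerSubscribers(
            new class implements ExecutionStartedSubscriber {
                public function notify(ExecutionStarted $event): void
                {
                    // e.g. CREATE DATABASE IF NOT EXISTS activerecord_test
                    // for each adapter whose PDO driver is available.
                }
            },
            new class implements ExecutionFinishedSubscriber {
                public function notify(ExecutionFinished $event): void
                {
                    // e.g. DROP DATABASE activerecord_test, leaving the
                    // developer's machine as it was found.
                }
            },
        );
    }
}
```

The extension would then be enabled from phpunit.xml via `<extensions><bootstrap class="DatabaseLifecycleExtension"/></extensions>`.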

For the developers who don't have all of PHP-ActiveRecord's databases installed on their machines (which I assume would be nearly all of us), we could use GitHub Actions to do CI for all database adapters. Your Docker package of the different databases would live there.
https://docs.github.com/en/actions/using-github-hosted-runners/about-github-hosted-runners
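
As a sketch, such a workflow could declare the databases as service containers on a GitHub-hosted runner; everything here (image tags, credentials, the setup-php step) is an assumption for illustration, not an existing workflow file:

```yaml
# Hedged sketch of a CI workflow with databases as service containers.
name: CI
on: [push, pull_request]
jobs:
  tests:
    runs-on: ubuntu-latest
    services:
      mysql:
        image: mysql:8.0
        env:
          MYSQL_ROOT_PASSWORD: test
          MYSQL_DATABASE: activerecord_test
        ports:
          - 3306:3306
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: test
          POSTGRES_DB: activerecord_test
        ports:
          - 5432:5432
    steps:
      - uses: actions/checkout@v3
      - uses: shivammathur/setup-php@v2
        with:
          php-version: '8.1'
      - run: composer install
      - run: vendor/bin/phpunit
```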

Developers like me who aren't willing to host such a huge overhead on their own machines would test against the databases that they have installed. Testing against all the database adapters would happen at each PR on GitHub's servers, as part of CI, before being merged in. This is a workable model for me.

@shmax (Contributor, Author) commented Aug 21, 2023

> Travis CI is now a paid-only product, even for open source projects.

Yep, I don't have any plans to go back to Travis. GitHub Actions is free for OSS.

> use PHPUnit 10's test runner wrapper to write setup/teardown code around the tests

Yes, that's already how it works.

> Developers like me who aren't willing to host such a huge overhead on their own machines would test against the databases that they have installed. Testing against all the database adapters would happen at each PR on GitHub's servers, as part of CI, before being merged in. This is a workable model for me.

I don't know how big the overhead is, but if possible I'll see if I can get all this to happen in memory and clean up after itself, so there's no persistent consumption of resources (I'm working with Alpine Linux, which is tiny, and the Docker build and startup processes are nearly instant). And, of course, running the tests locally will be optional, and I can make the Docker stuff opt-out if it really bothers you.

But this is all sort of moot until I can figure out how to connect to the databases inside the container, which is where I'm stuck at the moment.

@shmax (Contributor, Author) commented Aug 27, 2023

@ipundit the tests are green! Well, locally, anyway. I'll try to get the CI working before merging this, but in the meantime I'd welcome any comments you have.

Regarding the Dockerfile: I wound up going with a simple docker-compose.yml instead. In this PR I remove OCI from the equation entirely, because it apparently never worked even back in the heyday of this project, and I was unable to find any modern drivers online. That just leaves images for MySQL, PostgreSQL, and Memcache, which total about 1.1 GB in size. If you are worried about the resource consumption, there's still nothing stopping you from doing it your own way, but if you want to give it a try, just run this in the root:

docker-compose up -d
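
To tear everything down again afterwards, the standard `docker-compose down -v` removes the containers and their volumes, so nothing lingers on the machine.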

@shmax marked this pull request as ready for review August 27, 2023 00:35
@shmax merged commit cc36507 into master Aug 30, 2023
@shmax deleted the fix-tests branch August 30, 2023 21:56
@ipundit mentioned this pull request Aug 31, 2023