Fix tests #25
Conversation
I don't use SQLite myself, but isn't it just one of the optional database adapters? I see three areas that need to be tested:
I run MySQL, so I know that adapter works. If the test suite were working, I'd be willing to run all the tests for MySQL only, whereas another developer who runs another database, say Postgres, would run the tests for only Postgres.

Detecting which database is installed can be automated in PHPUnit (see the sketch below). Why not skip all the SQLite tests if you don't have that database installed or can't detect it, and write integration tests for the database you do have installed? If the code coverage report shows that you've covered your database adapter, then you have a working test suite that can be checked in. Other developers who run the other databases can run code coverage for their setups and verify that their adapters work too, provided the integration tests pass on their setup.
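A minimal sketch of that detection idea, assuming PDO-based adapters; the class and method names here are illustrative, not taken from php-activerecord's actual test suite:

```php
<?php
use PHPUnit\Framework\TestCase;

// Hypothetical base class for adapter tests: skip the whole test case
// when the matching PDO driver isn't available on this machine.
abstract class AdapterTestCase extends TestCase
{
    /** PDO driver name this test case needs, e.g. 'mysql', 'pgsql', 'sqlite'. */
    abstract protected function driver(): string;

    protected function setUp(): void
    {
        // PDO::getAvailableDrivers() lists the drivers compiled into this PHP,
        // so tests for databases the developer doesn't have are skipped, not failed.
        if (!in_array($this->driver(), \PDO::getAvailableDrivers(), true)) {
            $this->markTestSkipped("PDO driver '{$this->driver()}' not installed");
        }
    }
}

class SqliteAdapterTest extends AdapterTestCase
{
    protected function driver(): string
    {
        return 'sqlite';
    }

    public function testConnect(): void
    {
        // Runs only when pdo_sqlite is available on this machine.
        $pdo = new \PDO('sqlite::memory:');
        $this->assertInstanceOf(\PDO::class, $pdo);
    }
}
```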
Well, if I'm going to maintain this project in good faith, I figure I may as well sort this stuff out once and for all. There are actually three different databases that the adapter tests exercise:
It is indeed super painful to have to set all that stuff up locally just to run the tests, so my plan is to jam them all into a Docker container so it can all be set up automatically in a cross-platform way. The only problem is that I'm not super comfortable with Docker, so it's very slow going. If you have experience with it and would like to participate, I can push what I have so far.
Sorry, I don't have experience with Docker, but even if you were to set up a container, would developers be willing to install such a huge setup? The database binaries alone are huge, and I wouldn't be willing to keep that setup on my dev machine just for the occasional bug fix for databases I never use, let alone know or understand. But if I can use the database I already have installed (MySQL), then the cost of contributing to this project is manageable for me, because I have a personal interest in keeping this project current for the database I use. I'm sure other busy developers would do a similar cost/benefit analysis for the database they use.

Someone still has to run code coverage for all the databases, and if you're committed to that, then all power to you :). But do you need a Docker setup if you're the only one who would practically be willing to do 100% code coverage testing in the near future? A Docker setup is nice, but is it a gating requirement for releasing what you have?

If you push a test suite that supports MySQL, I can take a look at running the tests and contributing code that I've already written to get my projects going, including support for MySQL stored procedures, if you're interested in that.
Well, you would be no worse off than if you had decided to install each of the databases individually, minus the overhead of Docker itself if you don't have it installed, but the general idea is to make it all an invisible part of the test CLI. Docker is available by default in GitHub Actions, but before I go too far with it I'll also take some time to figure out how the old repo ran the tests in CI.
It should be easy enough to pass an argument to the Docker startup command to install only the databases you're interested in, I guess (see the note below). I'll keep it in mind.
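For what it's worth, if this ends up as a docker-compose setup, selecting databases may come for free, since compose can start a named subset of services; the service name here is an assumption:

```sh
# Start only the MySQL service defined in docker-compose.yml
docker-compose up -d mysql
```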
I'm just trying to modernize things so that people who want to go all-in on development don't have to start from zero with all the different databases; that includes myself if I ever change machines or come back to this after another long absence.
Looks like it all happens in .travis.yml. Travis evidently has MySQL and Postgres built in (not sure about SQLite), and it creates the databases in the before_script hook.
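Those before_script lines would have looked something like this; this is paraphrased from Travis's standard database setup docs, not copied from the repo:

```yaml
# .travis.yml (illustrative sketch of the usual Travis database setup)
before_script:
  - mysql -e 'CREATE DATABASE IF NOT EXISTS activerecord_test;'
  - psql -c 'CREATE DATABASE activerecord_test;' -U postgres
```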
From https://glauth.github.io/docs/running-travis-ci-locally.html
A replacement for local testing is to use PHPUnit 10's test runner wrapper to run setup/teardown code around the tests: before the tests run, create the databases; after the tests are done, drop them (a rough sketch follows below). This would be a platform-independent way of replacing Travis. If developers have those databases installed on their machines, then those databases are created and those tests are run.

For the developers who don't have all of PHP-ActiveRecord's databases installed on their machines (which I assume would be nearly all of us), we could use GitHub Actions to run CI for all database adapters. Your Docker package of the different databases would live there. Developers like me who aren't willing to host such a huge overhead on their own machines would test against the databases they have installed, and testing against all the database adapters would happen at each PR on GitHub's servers as part of CI, before merging. This is a workable model for me.
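A minimal sketch of that create-then-drop idea, using a plain PHPUnit bootstrap file rather than the full PHPUnit 10 extension API; the host, credentials, and database name are placeholders:

```php
<?php
// tests/bootstrap.php -- wired up via the bootstrap attribute in phpunit.xml.

// Before any test runs: create the test database.
$pdo = new \PDO('mysql:host=127.0.0.1', 'root', 'secret'); // placeholder credentials
$pdo->exec('CREATE DATABASE IF NOT EXISTS activerecord_test');

// After the whole test run: drop it again. register_shutdown_function fires
// when the PHP process that ran the tests exits.
register_shutdown_function(function (): void {
    $pdo = new \PDO('mysql:host=127.0.0.1', 'root', 'secret');
    $pdo->exec('DROP DATABASE IF EXISTS activerecord_test');
});
```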
Yep, I don't have any plans to go back to Travis. GitHub Actions is free for OSS.
Yes, that's already how it works.
I don't know how big the overhead is, but if possible I'll see if I can get all this to happen in memory and clean up after itself so there's no persistent consumption of resources (I'm working with Alpine Linux, which is tiny, and the Docker build and startup processes are nearly instant). And, of course, running the tests locally will be optional, and I can make the Docker stuff opt-out if it really bothers you. But this is all sort of moot until I can figure out how to connect to the databases inside the container, which is where I'm stuck at the moment.
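One common way out of that, in case it's useful: publish each database's port in the compose file and connect from the host via 127.0.0.1. The port and credentials below are assumptions:

```php
<?php
// Assumes docker-compose publishes MySQL's port ("3306:3306") and sets
// MYSQL_ROOT_PASSWORD=secret; the container is then reachable from the host.
$pdo = new \PDO(
    'mysql:host=127.0.0.1;port=3306',
    'root',
    'secret',
    [\PDO::ATTR_ERRMODE => \PDO::ERRMODE_EXCEPTION]
);
```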
@ipundit the tests are green! Well, locally, anyway. I'll try to get the CI working before merging this, but in the meantime I'd welcome any comments you have. Regarding the Dockerfile: I wound up going with a simple docker-compose.yml. In this PR I remove OCI from the equation entirely, because it apparently never worked even back in the heyday of this project, and I was unable to find any modern drivers online. That just leaves images for MySQL, PostgreSQL, and Memcache, which total about 1.1 GB. If you're worried about the resource consumption, there's still nothing stopping you from doing it your own way, but if you want to give it a try, just run this in the root:
docker-compose up -d
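For readers following along, a stripped-down guess at the shape of such a docker-compose.yml; the image tags, passwords, and ports are illustrative, not the PR's actual file:

```yaml
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: secret
      MYSQL_DATABASE: activerecord_test
    ports:
      - "3306:3306"
  postgres:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: activerecord_test
    ports:
      - "5432:5432"
  memcached:
    image: memcached:alpine
    ports:
      - "11211:11211"
```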
Let's fix the tests!
- Remove the SnakeCaseTestCase thing.
- Replace pear\log with modern monolog/monolog.
- Remove OciAdapter; it never really worked, and as far as I can tell OCI driver support for PDO has been abandoned.