
Move to pipenv #411

Merged
merged 6 commits into from
Nov 19, 2018

Conversation

@pixelastic (Contributor) commented Nov 7, 2018

I replaced the use of `requirements.txt` with a `Pipfile` from `pipenv`.

Now that we have `pipenv` allowing us to isolate a specific Python
version and its dependencies, we no longer need Docker for running
the tests.

I added `pytest` to the `Pipfile` and updated the `./docsearch test`
command to run `./pytest scraper/src` instead of building and running
a Docker image.

I also updated the Travis script to install `pipenv` and then run
`pipenv run ./docsearch test` instead of using Docker once again.

⚠ Note that this PR is still a WIP. I'll need to push a few times to test Travis.
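For readers following along, the Travis side of the change described above might look roughly like this. This is an illustrative sketch only, not the actual `.travis.yml` from this PR; the Python version and exact install commands are assumptions:

```yaml
# Illustrative .travis.yml sketch, NOT the actual file from this PR.
# Assumes Python 3.6 and that `./docsearch test` wraps pytest.
language: python
python:
  - "3.6"
install:
  - pip install pipenv           # install pipenv itself
  - pipenv install --dev         # create the virtualenv from the Pipfile
script:
  - pipenv run ./docsearch test  # run the test suite inside the virtualenv
```

The key point is that `pipenv install --dev` replaces both the `pip install -r requirements.txt` step and the Docker image build that the tests previously relied on.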

@pixelastic changed the title from “Run tests without docker” to “Move to pipenv” on Nov 7, 2018
@s-pace (Contributor) commented Nov 12, 2018

We will need to update and sync our content to promote only the use of `pipenv`.

cc feedback from user:

> Now that we have `pipenv` allowing us to isolate a specific Python
> version and its dependencies, we no longer need Docker for running
> the tests.
>
> I added `pytest` to the `Pipfile` and updated the `./docsearch test`
> command to run `./pytest scraper/src` instead of building and running
> a Docker image.
>
> I also updated the Travis script to install `pipenv` and then run
> `pipenv run ./docsearch test` instead of using Docker once again.
@pixelastic (Contributor, Author)

@s-pace All checks are OK; I think this can be merged if it looks good to you. Once merged, we'll be able to update the docs in the other repo.

@s-pace (Contributor) commented Nov 19, 2018

@pixelastic Did you check that everything was ready with Travis?

@pixelastic (Contributor, Author)

Yep, all checks on this PR are passing, so it should also pass when we merge to master.

(screenshot: passing CI checks)

@s-pace (Contributor) commented Nov 19, 2018

Great, thanks. Merging it then.

@s-pace (Contributor) left a comment

🎉

@s-pace s-pace merged commit 386396f into master Nov 19, 2018
@pixelastic pixelastic deleted the feat/pipenv branch November 20, 2018 13:10
@borekb commented Nov 20, 2018

Hi, is there updated documentation on how to work with this repo now? I'm not a Python user, and running `pip install -r requirements.txt` as recommended here fails after this PR. Thanks.

@borekb commented Nov 20, 2018
Also, running `./docsearch docker:build` fails with:

```
Setting up python-wheel (0.24.0-1~ubuntu1.1) ...
Setting up libffi-dev:amd64 (3.1~rc1+r3.0.13-12ubuntu0.2) ...
Processing triggers for libc-bin (2.19-0ubuntu6.14) ...
Removing intermediate container 8f020e30042a
 ---> c2f6af560902
Step 18/20 : COPY requirements.txt /root/
COPY failed: stat /var/lib/docker/tmp/docker-builder178861263/requirements.txt: no such file or directory
```
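For context, this error means the `Dockerfile` still copies the `requirements.txt` that this PR deleted. A hedged sketch of the kind of fix needed, assuming the image installs dependencies with `pipenv` (the paths and flags are assumptions, not the project's actual `Dockerfile`):

```dockerfile
# Illustrative sketch only, NOT the project's actual Dockerfile.
# The old image did `COPY requirements.txt /root/`; after this PR the
# equivalent step would copy the Pipfile and install with pipenv:
COPY Pipfile Pipfile.lock /root/
RUN pip install pipenv \
 && cd /root \
 && pipenv install --system --deploy  # install locked deps into the image
```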

@s-pace (Contributor) commented Nov 21, 2018

Hi @borekb,

We will update the whole documentation very soon: we want to unify the way we promote installing the scraper, mainly thanks to `pipenv`, which eases this process.

Sorry about the Docker issue; we forgot to update that part and are currently updating the `Dockerfile`.

In the meantime, you can use the following `requirements.txt`. Copy and paste this text into a file at the root of the scraper, and it should then work as expected:

```
algoliasearch==1.13.0
asn1crypto==0.22.0
attrs==17.2.0
Automat==0.6.0
certifi==2017.7.27.1
cffi==1.11.0
chardet==3.0.4
click==6.7
constantly==15.1.0
cryptography==2.0.3
cssselect==1.0.1
enum34==1.1.6
future==0.16.0
hyperlink==17.3.1
idna==2.6
incremental==17.5.0
ipaddress==1.0.18
lxml==4.0.0
ndg-httpsclient==0.4.3
ordereddict==1.1
parsel==1.2.0
pyasn1==0.3.5
pyasn1-modules==0.1.4
pycparser==2.18
PyDispatcher==2.0.5
pyOpenSSL==17.3.0
pyperclip==1.5.27
python-dotenv==0.7.1
queuelib==1.4.2
ratelimit==1.4.1
requests==2.18.4
requests-file==1.4.2
Scrapy==1.5.0
selenium==2.53.6
service-identity==17.0.0
six==1.11.0
slacker==0.9.60
tldextract==2.1.0
Twisted==17.5.0
urllib3==1.22
w3lib==1.18.0
zope.interface==4.4.2
```

@borekb commented Nov 21, 2018

Thanks, @s-pace! I worked around it with `git checkout a60def1`, where the file still exists and the Docker build works.

By the way, I'm not sure if it's possible on your end, but as a user I would strongly prefer being able to run something like `docker run --rm algolia/docsearch some-command args` instead of having to worry about Python on my machine. I've historically had a very bad experience using Python as a runtime: the AWS CLI used to fail for me because of versioning issues, the same for installing native Node modules, etc. I've just been burnt enough times :)
