
Problems downloading surrogates with fresh install #18

Open
moble opened this issue Jan 19, 2021 · 3 comments

moble (Member) commented Jan 19, 2021

I installed from conda-forge and tried to run a few lines from @vijayvarma392, but got a couple of errors when the code tried to download the surrogate data. The first error occurred because the output directory didn't exist. So I suggest following this line

sdir = os.path.abspath(sdir)

with

  if not os.path.isdir(sdir):
    os.makedirs(sdir)
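Put together, the two steps above amount to something like the following sketch (`ensure_download_dir` is a hypothetical name for illustration, not part of gwsurrogate):

```python
import os

def ensure_download_dir(sdir):
    """Resolve sdir to an absolute path, creating it if it is missing."""
    sdir = os.path.abspath(sdir)
    # exist_ok=True also tolerates the directory appearing between the
    # existence check and the creation (available since Python 3.2).
    os.makedirs(sdir, exist_ok=True)
    return sdir
```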

The second error was because, although I have wget installed on my machine, the conda environment the code was running in doesn't. And because wget was called with os.system, the shell's failure was silently swallowed and gwsurrogate continued as if the download had succeeded, which made this bug pretty hard to track down. The Python docs now recommend the subprocess module over os.system, so I suggest replacing this line

os.system('wget -q --directory-prefix='+sdir+' '+surr_url)

with

    print(subprocess.check_output('wget -q --directory-prefix='+sdir+' '+surr_url, shell=True, stderr=subprocess.STDOUT))

(along with import subprocess somewhere up above). This should error out if wget fails, and it will show what happened.
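For reference, the same idea with the arguments passed as a list, which sidesteps shell=True and its quoting pitfalls (`download_with_wget` is a hypothetical helper for illustration, not the actual gwsurrogate call site):

```python
import subprocess

def download_with_wget(surr_url, sdir):
    """Fetch surr_url into sdir with wget, raising if wget fails."""
    # check_output raises CalledProcessError on a nonzero exit status,
    # and folding stderr into the output captures wget's explanation.
    output = subprocess.check_output(
        ["wget", "-q", "--directory-prefix=" + sdir, surr_url],
        stderr=subprocess.STDOUT,
    )
    print(output.decode())
```

Passing a list also means a wget binary that is missing entirely surfaces as FileNotFoundError rather than a silent shell error.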

duetosymmetry (Member)

Maybe we should move away from relying on wget and use the built-in urllib.urlretrieve (in py2) or urllib.request.urlretrieve (in py3) instead?
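A minimal sketch of that idea for py3 (`fetch_with_urlretrieve` is a hypothetical name; the py2 spelling would just swap the import):

```python
import os
from urllib.request import urlretrieve

def fetch_with_urlretrieve(surr_url, sdir):
    """Download surr_url into sdir without shelling out to wget."""
    # Mimic wget's --directory-prefix by joining sdir with the
    # URL's basename.
    dest = os.path.join(sdir, os.path.basename(surr_url))
    # Unlike os.system('wget ...'), a failed download raises
    # URLError/HTTPError here rather than passing silently.
    urlretrieve(surr_url, dest)
    return dest
```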

moble (Member, Author) commented Jan 20, 2021

The docs describe urllib.request.urlretrieve as a legacy interface that may be deprecated in the future; urlopen is presumably preferred. But all of these urllib* functions are pretty finicky: they don't deal nicely with encodings, don't detect subtle errors, and so on. If you want to stay in Python, requests will be the closer match to wget.
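For comparison, a requests-based sketch (requests is a third-party dependency; `fetch_with_requests` is a hypothetical name for illustration):

```python
import os
import requests  # third-party: pip install requests

def fetch_with_requests(surr_url, sdir):
    """Stream surr_url into sdir, raising on any HTTP error."""
    dest = os.path.join(sdir, os.path.basename(surr_url))
    with requests.get(surr_url, stream=True, timeout=30) as resp:
        # Unlike os.system('wget ...'), a 404 or 500 raises here.
        resp.raise_for_status()
        with open(dest, "wb") as f:
            # Stream in chunks so large surrogate files don't have to
            # fit in memory.
            for chunk in resp.iter_content(chunk_size=1 << 20):
                f.write(chunk)
    return dest
```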

sfield17 (Contributor) commented Apr 6, 2021

Proposed fix:

  • This will be fixed when we address issue 20.
