
Bazel Python can't do namespaces with google.cloud #65

Closed
fortuna opened this issue Jan 30, 2018 · 10 comments

Comments

fortuna commented Jan 30, 2018

I'm trying to use google-cloud-bigquery, but my code fails to find bigquery in google.cloud.

I am able to import google.cloud, but that loads only pypi__google_cloud_core_0_28_0, not the other packages that contribute to google.cloud, even though pypi__google_cloud_bigquery_0_29_0 and all of the other transitive dependencies are present in the runfiles.

Here is a debugging session:

In [22]: import google.cloud

In [23]: from  google.cloud import bigquery
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
~/firehook/cwm/bazel-bin/jupyter.runfiles/__main__/jupyter.py in <module>()
----> 1 from  google.cloud import bigquery

ImportError: cannot import name 'bigquery'

In [24]: google.__spec__
Out[24]: ModuleSpec(name='google', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f2db22a6ef0>, origin='/home/fortuna/firehook/cwm/bazel-bin/jupyter.runfiles/pypi__google_cloud_core_0_28_0/google/__init__.py', submodule_search_locations=['/home/fortuna/firehook/cwm/bazel-bin/jupyter.runfiles/pypi__google_cloud_core_0_28_0/google'])

In [25]: google.cloud.__spec__
Out[25]: ModuleSpec(name='google.cloud', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f2db22a6eb8>, origin='/home/fortuna/firehook/cwm/bazel-bin/jupyter.runfiles/pypi__google_cloud_core_0_28_0/google/cloud/__init__.py', submodule_search_locations=['/home/fortuna/firehook/cwm/bazel-bin/jupyter.runfiles/pypi__google_cloud_core_0_28_0/google/cloud'])

In [26]: ! ls /home/fortuna/firehook/cwm/bazel-bin/jupyter.runfiles
__init__.py		 pypi__futures_3_2_0		       pypi__google_resumable_media_0_3_1  pypi__requests_2_18_4
__main__		 pypi__google_api_core_0_1_4	       pypi__idna_2_6			   pypi__rsa_3_4_2
MANIFEST		 pypi__googleapis_common_protos_1_5_3  pypi__protobuf_3_5_1		   pypi__setuptools_38_4_0
pypi__cachetools_2_0_1	 pypi__google_auth_1_3_0	       pypi__pyasn1_0_4_2		   pypi__six_1_11_0
pypi__certifi_2018_1_18  pypi__google_cloud_bigquery_0_29_0    pypi__pyasn1_modules_0_2_1	   pypi__urllib3_1_22
pypi__chardet_3_0_4	 pypi__google_cloud_core_0_28_0        pypi__pytz_2017_3

My BUILD rule is

py_binary(
    name = "jupyter",
    srcs = ["jupyter.py"],
    data = [
        requirement("google-cloud-bigquery"),
    ],
)

Where jupyter.py starts an IPython REPL.
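
The ModuleSpec output above shows the root of the problem: google.cloud's submodule_search_locations lists only the google-cloud-core wheel's directory, so Python never searches the bigquery wheel when resolving the submodule. A diagnostic sketch of the mechanism (not a fix; the appended path is taken from the runfiles listing above and assumes the bigquery wheel unpacks a matching google/cloud directory, and the other google-* wheels would need the same treatment before the import works end to end):

import google.cloud

# Submodules of google.cloud are searched for only in the parent package's
# __path__, which here points at the google-cloud-core wheel alone:
print(google.cloud.__path__)
# ['/home/fortuna/firehook/cwm/bazel-bin/jupyter.runfiles/pypi__google_cloud_core_0_28_0/google/cloud']

# The bigquery wheel's matching directory is never consulted unless it is
# added to that search path by hand (or merged by the namespace machinery):
google.cloud.__path__.append(
    '/home/fortuna/firehook/cwm/bazel-bin/jupyter.runfiles/'
    'pypi__google_cloud_bigquery_0_29_0/google/cloud')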

fortuna commented Jan 30, 2018

It may be related to #55

duggelz commented Feb 15, 2018

Bug #55 actually diagnoses this better and more completely than my previous comment, so I'm removing it.

@joshburkart

Getting the same issue! Any thoughts on a workaround?

One idea is to make pip_import generate a single whl_libraries repository rule instead of one whl_library per wheel. Then everything would get extracted into the right directory structure. However, this would mean that all PyPI packages have to be downloaded even to build a target that needs only a small subset.

To get around this disadvantage, I suppose we could preserve the existing whl_library behavior but also offer this merged mode as an alternative, so that users could, e.g., put all their google-cloud-* packages into one requirements.txt and everything else in another (a sketch of the split follows below).

Thoughts?
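
For illustration, the requirements split could be declared with the existing pip_import/pip_install API roughly as follows; the repository and file names are hypothetical, and the merged extraction itself would still be new behavior that pip_import does not support today:

load("@io_bazel_rules_python//python:pip.bzl", "pip_import")

# All google-cloud-* packages pinned together, so they could be extracted
# into one repository and share the google.cloud namespace directory.
pip_import(
    name = "gcp_deps",
    requirements = "//third_party:requirements-gcp.txt",
)

# Everything else keeps the one-repository-per-wheel behavior.
pip_import(
    name = "pip_deps",
    requirements = "//third_party:requirements.txt",
)

load("@gcp_deps//:requirements.bzl", gcp_install = "pip_install")
load("@pip_deps//:requirements.bzl", pip_install = "pip_install")

gcp_install()
pip_install()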

evanj commented Mar 8, 2018

There is a lot of discussion happening about a ground-up rewrite of these rules: https://groups.google.com/forum/#!forum/bazel-sig-python

The workaround I'm using at the moment, which I don't necessarily recommend, is a set of rules we wrote ourselves: https://github.com/TriggerMail/rules_pyz

@joshburkart

Thanks Evan! Playing around with various options; I appreciate the pointer to your solution, which I will definitely look into...

@jdelfino

Just want to note that this issue is blocking me from adopting Bazel for our Python codebase.

@tcwalther

@brandjon is this somewhere on the roadmap?

@tcwalther

Actually, this seems to work now. I just had to declare all the transitive dependencies, so for google-cloud-storage, for example, this resulted in:

py_library(
  name = "google-cloud-storage",
  deps = [
    requirement("cachetools"),
    requirement("certifi"),
    requirement("chardet"),
    requirement("google-api-core"),
    requirement("google-auth"),
    requirement("google-cloud-core"),
    requirement("google-resumable-media"),
    requirement("google-cloud-storage"),
    requirement("googleapis-common-protos"),
    requirement("idna"),
    requirement("protobuf"),
    requirement("pyasn1"),
    requirement("pyasn1-modules"),
    requirement("pytz"),
    requirement("requests"),
    requirement("rsa"),
    requirement("setuptools"),
    requirement("six"),
    requirement("urllib3"),
  ]
)
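
A consuming target can then depend on this aggregate library instead of on requirement("google-cloud-storage") directly; the target and file names below are illustrative:

py_binary(
    name = "uploader",
    srcs = ["uploader.py"],
    deps = [":google-cloud-storage"],
)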

jonbuffington commented Dec 13, 2019

@tcwalther The bug still exists.

To reproduce, run bazel run on a target that depends on a .py file containing from google.cloud import storage. You will see an error message similar to:

    from google.cloud import storage
ImportError: cannot import name 'storage' from 'google.cloud' (/private/var/tmp/_bazel_jcb/6b822ab9db83c06cf3104c3b5c927ad9/execroot/__main__/bazel-out/darwin-fastbuild/bin/py/scapy_crawl_bin.runfiles/py_pypi__google_cloud_core_1_1_0/google/cloud/__init__.py)

It should be importing from the py_pypi__google_cloud_storage runfiles directory instead.
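
A minimal reproduction, assuming the requirement() macro from the pip-generated requirements.bzl (the @py_deps repository and target names here are hypothetical):

# BUILD
load("@py_deps//:requirements.bzl", "requirement")

py_binary(
    name = "repro",
    srcs = ["repro.py"],
    deps = [requirement("google-cloud-storage")],
)

# repro.py -- `bazel run //:repro` fails with the ImportError shown above
from google.cloud import storage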

My guess is that each requirement is sandboxed into its own directory, while the various google-* packages assume their modules will be merged into a common google.cloud namespace.

See #93

@thundergolfer

I think we can close this as a duplicate of #93.
