
BigQuery table.import() complains about not having a project ID. #1984

Closed
AKPWebDesign opened this issue Feb 10, 2017 · 6 comments
Labels
api: bigquery Issues related to the BigQuery API.

@AKPWebDesign
Contributor

Despite having specified the project ID when creating the BigQuery object, I'm getting this error about there not being one specified when trying to use table.import().

Environment details

  • OS: Win 10 Pro
  • Node.js version: v7.5.0
  • npm version: v4.1.2
  • google-cloud-node version: bigquery v0.7.0

Steps to reproduce

  1. require @google-cloud/bigquery
  2. create BigQuery object specifying projectId in options
  3. create dataset object from BigQuery object, then table object from dataset
  4. table.import(jsonFile).then(...);

Error

Sorry, we cannot connect to Google Cloud Services without a project ID. You may specify one with an environment variable named "GCLOUD_PROJECT". See https://googlecloudplatform.github.io/google-cloud-node/#/docs/guides/authentication for a detailed guide on creating an authenticated connection.
    at Object.<anonymous> (%project%\node_modules\@google-cloud\common\src\util.js:55:29)
    at Module._compile (module.js:571:32)
    at Object.Module._extensions..js (module.js:580:10)
    at Module.load (module.js:488:32)
    at tryModuleLoad (module.js:447:12)
    at Function.Module._load (module.js:439:3)
    at Module.require (module.js:498:17)
    at require (internal/module.js:20:19)
    at Object.<anonymous> (%project%\node_modules\@google-cloud\common\src\service-object.js:32:12)
    at Module._compile (module.js:571:32)
    at Object.Module._extensions..js (module.js:580:10)
    at Module.load (module.js:488:32)
    at tryModuleLoad (module.js:447:12)
    at Function.Module._load (module.js:439:3)
    at Module.require (module.js:498:17)
    at require (internal/module.js:20:19)
@stephenplusplus
Contributor

Can you show the code that reproduces it as well? I just tried this, and it went through okay:

var bigquery = require('@google-cloud/bigquery')({
  projectId: 'my-project-id'
})

var dataset = bigquery.dataset('myDataset')
var table = dataset.table('myTable')

table.getMetadata().then(console.log)

(getMetadata and import should both error if there's not a project ID available)

@stephenplusplus added the "api: bigquery" and "auth" labels Feb 10, 2017
@AKPWebDesign
Contributor Author

Alright, I've been playing around with my code for a bit. If I call the code I'm using to send data directly, it seems to work fine, but when I call it from my application, it doesn't, even though I'm calling it exactly the same way in both cases, so apparently there's something I'm missing. Either way, it seems like something on my side, since it works in my stripped-down tests. Feel free to close this, unless you've got any ideas of things I could be doing that would mess this up.

@stephenplusplus
Contributor

Not offhand, but walking step by step through what your code does will likely highlight any oddities. If you're still stumped, share any code that reproduces the problem and I'll take a look.

@AKPWebDesign
Contributor Author

AKPWebDesign commented Feb 13, 2017

Alright, I've done some investigating, and found that the error is being thrown here: https://github.com/GoogleCloudPlatform/google-cloud-node/blob/master/packages/common/src/util.js#L356-L358

The problem is that it's catching a RangeError ("Maximum call stack size exceeded") and then assuming that a project ID is missing, which leads to the incorrect error message. Additionally, the cause of the stack overflow appears to be that a DestroyableTransform object is being passed in as authenticatedReqOpts.multipart[1].body, which causes quite a few levels of recursion once execution reaches replaceProjectIdToken (https://github.com/GoogleCloudPlatform/google-cloud-node/blob/master/packages/common/src/util.js#L512).
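For context, the failure mode described above can be reproduced in isolation. This is a hedged sketch: replaceTokens is a simplified stand-in for the library's replaceProjectIdToken, and all names and sample objects are illustrative, not the library's actual code.

```javascript
// Simplified stand-in for replaceProjectIdToken: recursively walks every
// enumerable property (own AND inherited, because it uses for...in) and
// substitutes the {{projectId}} token in any string it finds.
function replaceTokens(value, projectId) {
  if (typeof value === 'string') {
    return value.replace(/{{projectId}}/g, projectId);
  }
  if (value !== null && typeof value === 'object') {
    for (var key in value) { // visits inherited enumerable properties too
      value[key] = replaceTokens(value[key], projectId);
    }
  }
  return value;
}

// A plain request body terminates normally:
var reqOpts = { uri: 'https://host/{{projectId}}/tables' };
replaceTokens(reqOpts, 'my-project');
console.log(reqOpts.uri); // https://host/my-project/tables

// But an object that, like a DestroyableTransform running inside a domain,
// reaches itself through an inherited property recurses without bound:
var domain = { members: [] };
var streamLike = Object.create({ domain: domain });
domain.members.push(streamLike);

try {
  replaceTokens({ multipart: [{}, { body: streamLike }] }, 'my-project');
} catch (e) {
  console.log(e.name); // RangeError ("Maximum call stack size exceeded")
}
```

Because the RangeError surfaces from the same call path that fails when the project ID really is absent, the library misreports it as a missing-project-ID error.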

I'm not entirely sure what I could be doing differently between my application code and my testing code, so I'll see if I can come up with a way that I can give you the problematic code (without making my employer angry).

@AKPWebDesign
Contributor Author

Upon further investigation, the same DestroyableTransform object is being passed in with my test code, making things a bit more confusing for me.

@AKPWebDesign
Contributor Author

Alright, so after even more investigation: when run in my application, the DestroyableTransform object includes some other objects in its domain. Those objects are then looped over as well, leading to the call stack overflow.

AKPWebDesign added a commit to DataAnalyticsServices/google-cloud-node that referenced this issue Feb 13, 2017
This is to stop the for...in loop from looping over properties that
aren't actually part of the object, which can lead to the call stack
overflowing under certain circumstances.
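The guard described in the commit message can be sketched as follows. This is an illustrative version, not the actual patch; replaceTokensSafe and the test objects are hypothetical names.

```javascript
// Same recursive token replacement as before, but the for...in loop skips
// properties that aren't the object's own (e.g. a stream's inherited domain
// link), so cycles reachable only through inherited properties are never
// entered and the call stack cannot overflow that way.
function replaceTokensSafe(value, projectId) {
  if (typeof value === 'string') {
    return value.replace(/{{projectId}}/g, projectId);
  }
  if (value !== null && typeof value === 'object') {
    for (var key in value) {
      if (!Object.prototype.hasOwnProperty.call(value, key)) {
        continue; // not actually part of this object; don't recurse into it
      }
      value[key] = replaceTokensSafe(value[key], projectId);
    }
  }
  return value;
}

// The cyclic shape from the issue is now handled without a stack overflow,
// because the cycle is only reachable through an inherited property:
var domain = { members: [] };
var streamLike = Object.create({ domain: domain });
domain.members.push(streamLike);
replaceTokensSafe({ multipart: [{}, { body: streamLike }] }, 'my-project');
console.log(replaceTokensSafe('a/{{projectId}}/b', 'my-project')); // a/my-project/b
```

Note this guards against inherited-property cycles like the one in this issue; a cycle built entirely from own properties would need a separate seen-set check.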