
Not able to load the file data in table #209

Open
rahul-poralwar opened this issue Jun 16, 2020 · 7 comments

Comments

@rahul-poralwar

Hi There,

I am getting the error message below in the errorMessage column of the LambdaRedshiftBatches table while trying to load the file into the Redshift table:
{"message":"This copy request is illegal because it is trying to copy an object to itself without changing the object's metadata, storage class, website redirect location or encryption attributes.","code":"InvalidRequest","region":null,"time":"2020-06-16T14:00:43.027Z","requestId":"4E16FF73E0D32EAF","extendedRequestId":"8aM2KiSb6Zck30cMjXa/xX7yYfaVRQd2Ko6r1ZbTwzZ1GAieEIoeZEhBySK1dGO+OjReQy3G8wI=","statusCode":400,"retryable":false,"retryDelay":47.27240989136392,"level":"error"}

@IanMeyers
Contributor

Can you please set up debug logging and then retrieve the log output from a failed load in Lambda?
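
In case it is useful, enabling verbose output is usually just a Lambda configuration change. A minimal boto3 sketch is below; the DEBUG environment variable name is an assumption about how this loader switches on debug logging, and the function name is a placeholder.

import boto3

lambda_client = boto3.client("lambda", region_name="ap-southeast-1")

# Set the assumed debug switch on the loader function. Note that this call
# replaces the function's existing environment variables, so merge in any
# variables that are already configured.
lambda_client.update_function_configuration(
    FunctionName="LambdaRedshiftLoader",           # placeholder function name
    Environment={"Variables": {"DEBUG": "true"}},  # assumed debug flag
)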

@rahul-poralwar
Author

Hi There,

The above was resolved by increasing the timeout of the Lambda function to 5 minutes. However, I am now getting the error below in the errorMessage column of the LambdaRedshiftBatches table:
{"kaizen-cluster.cihwavvmfjhc.ap-southeast-1.redshift.amazonaws.com:5439/kaizen-db":{"status":-1,"error":{"errno":"ETIMEDOUT","code":"ETIMEDOUT","syscall":"connect","address":"172.31.7.48","port":5439,"level":"error"}}}

Thanks in advance !

@IanMeyers
Contributor

IanMeyers commented Jun 18, 2020

This is very likely due to networking. Please review this link in detail; it covers how to configure Lambda→Redshift connections.
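
In practical terms that usually means attaching the function to subnets that can reach the cluster and allowing inbound traffic on the Redshift port from the function's security group. A boto3 sketch under those assumptions is below; every ID and name is a placeholder.

import boto3

lambda_client = boto3.client("lambda", region_name="ap-southeast-1")
ec2 = boto3.client("ec2", region_name="ap-southeast-1")

# Place the loader function inside the VPC, in subnets with a route to the
# cluster, using its own security group.
lambda_client.update_function_configuration(
    FunctionName="LambdaRedshiftLoader",  # placeholder function name
    VpcConfig={
        "SubnetIds": ["subnet-0example1", "subnet-0example2"],
        "SecurityGroupIds": ["sg-0lambdaexample"],
    },
)

# Allow the function's security group to connect to the cluster's security
# group on the Redshift port (5439 in this case).
ec2.authorize_security_group_ingress(
    GroupId="sg-0redshiftexample",  # the cluster's security group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 5439,
        "ToPort": 5439,
        "UserIdGroupPairs": [{"GroupId": "sg-0lambdaexample"}],
    }],
)

Keep in mind that once the function runs inside a VPC it also needs a path to S3 and DynamoDB (a NAT gateway or VPC endpoints), otherwise the load will fail for a different reason.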

@rahul-poralwar
Author

Thank you, Ian, for the update. Somehow it worked; however, I am now getting an error elsewhere. Below are the details of the error in the LambdaRedshiftBatches table:
{
  "kaizen-cluster.cihwavvmfjhc.ap-southeast-1.redshift.amazonaws.com:5439/kaizen-db": {
    "status": -1,
    "error": {
      "name": "error",
      "length": 154,
      "severity": "FATAL",
      "code": "3D000",
      "file": "/home/ec2-user/padb/src/pg/src/backend/utils/init/postinit.c",
      "line": "385",
      "routine": "LcInitPostgres",
      "level": "error"
    }
  }
}

Below are the contents of the loadClusters column of the LambdaRedshiftBatchLoadConfig table.

[
  {
    "M": {
      "clusterDB": {
        "S": "kaizen-db"
      },
      "clusterEndpoint": {
        "S": "kaizen-cluster.cihwavvmfjhc.ap-southeast-1.redshift.amazonaws.com:5439/kaizen-db"
      },
      "clusterPort": {
        "N": "5439"
      },
      "connectPassword": {
        "S": "XXX"
      },
      "connectUser": {
        "S": "kaizen-user"
      },
      "targetTable": {
        "S": "kaizen_schema.kaizen_csv"
      },
      "truncateTarget": {
        "BOOL": false
      },
      "useSSL": {
        "BOOL": false
      }
    }
  }
]

Am I missing anything here?

@rahul-poralwar
Author

Hi Ian,

Just to update you, I tried to set up the Redshift Loader in a different AWS account and got the same error.

@rahul-poralwar
Author

Hi Ian,

There was a configuration issue that was preventing the file data from loading into the table; it is sorted now.

Thanks a lot for your timely assistance :)

Now we also want to load JSON files into the table. I just wanted to check with you whether anything special needs to be taken care of when setting up this process for JSON files?

@IanMeyers
Contributor

Nothing particularly special. You will just need to set the load type to JSON and then add any COPY options that help you import your particular JSON structure. Glad to hear it's working!
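
A rough sketch of what that change could look like is below, assuming the configuration item is keyed by s3Prefix and exposes dataFormat and copyOptions attributes (the key and attribute names are assumptions about the loader's schema, and all values are placeholders):

import boto3

dynamodb = boto3.client("dynamodb", region_name="ap-southeast-1")

# Switch the assumed dataFormat attribute to JSON and set an example COPY
# option; a jsonpaths mapping or other options would go into copyOptions
# if the automatic field mapping is not enough for the file layout.
dynamodb.update_item(
    TableName="LambdaRedshiftBatchLoadConfig",
    Key={"s3Prefix": {"S": "kaizen-bucket/json-input"}},  # placeholder prefix
    UpdateExpression="SET dataFormat = :fmt, copyOptions = :opts",
    ExpressionAttributeValues={
        ":fmt": {"S": "JSON"},
        ":opts": {"S": "TRUNCATECOLUMNS"},  # example COPY option only
    },
)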
