code-pipeline: Existing S3 buckets are not given the correct permissions #14165
Comments
Thanks for opening the issue, but I think you're making a logical mistake here. From your description, it seems like the Buckets to be deployed to are in the target env, not in the tooling account. But look how you are importing them:

`const testAccountBucket = s3.Bucket.fromBucketName(toolingAccountStack, bucketName, bucketName);`

You are importing it into the tooling account's Stack. The solution you mentioned (importing it into the target env) is the correct one. Yes, it adds additional Stacks, but that's because you need additional resources in the target env to be able to deploy to them from the tooling account.

Alternatively, you can look into the CDK Pipelines module. When you bootstrap your target envs using it, it will create Roles that can be used for the deployment (using something like …).

Thanks,
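The scoping distinction above can be sketched as a small toy model (this is not CDK code; the account IDs, stack names, and `importBucketByName` helper are made up for illustration):

```typescript
// Toy model (not actual CDK code) of how Bucket.fromBucketName scopes an import.
// An imported bucket inherits the environment of the stack it is imported into.
interface Stack { account: string; }
interface ImportedBucket { bucketName: string; account: string; }

function importBucketByName(scope: Stack, bucketName: string): ImportedBucket {
  // fromBucketName takes no account parameter, so the construct tree can only
  // assume the bucket lives in the scope's account.
  return { bucketName, account: scope.account };
}

// Hypothetical account IDs.
const toolingAccountStack: Stack = { account: '111111111111' };
const targetEnvStack: Stack = { account: '222222222222' };

// Imported into the tooling stack: CDK believes the bucket is in the tooling account.
const wrongScope = importBucketByName(toolingAccountStack, 'deploy-bucket');
// Imported into a stack in the target env: CDK knows the bucket's real account.
const rightScope = importBucketByName(targetEnvStack, 'deploy-bucket');
```

This is why importing the bucket into a stack in the target env changes which account the roles and policies end up in.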
The problem is we ARE using CDK Pipelines, and it DOES create roles, but they don't work (because the tooling account can't grant permissions in another account). If you compare this to the CloudFormation deploy action (where, admittedly, you can supply an account), that action creates the roles in the cross-account-support stack, so they work. I understand that this difference comes from the fact that one is "resource"-backed and the other is "account"-backed, but this is rather unintuitive. The documentation on the Action properties isn't actually consistent.

We see that the account documentation clearly states that it will use …. I did try using `fromBucketAttributes`, which lets you specify what account the bucket is in, but Pipelines doesn't pay attention to this. The issue is that the synth works, and there are roles and policies that make sense, but the pipeline fails saying there are not enough permissions (I did try adding the `putObjectAcl` permissions it suggested, but obviously tooling can't create a policy that grants access to a different account).
Sorry, I have problems following what you're saying. Let's establish some basic facts:

1. If you import the Bucket with `const testAccountBucket = s3.Bucket.fromBucketName(toolingAccountStack, bucketName, bucketName);`, then this will not work, and cannot work. That's because you're telling CDK the target Bucket lives in the tooling account, which is not the case.
2. About importing it with `const testAccountBucket = s3.Bucket.fromBucketName(targetEnvStack, bucketName, bucketName);`, you wrote: …

So, does doing 2. work?
Yes. 1. doesn't work; 2. does. 1 is frustrating because it appears to work and gives no useful error, but I guess this is a limitation of CDK in general: there is no way for it to know where that bucket lives during synth. But using `fromBucketAttributes` (with the account specified) now tells CDK that the target bucket lives in the target account. However, the result is the same as for 1. My point is that there is NO way to tell the `S3DeployAction` what account the bucket is in. Looking further into `getOtherStackIfActionIsCrossAccount` …, so why does …?

I have raised #14217 as a draft, as this is what I would expect the behaviour to be.
…resource's account, not its Stack's account

In CodePipeline, Actions can have a resource backing them (like an S3 Bucket, or an ECS Service). In the CodePipeline construct, however, the account a given action was in was incorrectly determined by checking the Stack of the backing resource, instead of the resource itself. This value can be different for imported resources.

Fixes aws#14165
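The change described in the commit message can be sketched as a toy model (this is not the actual CDK source; the interfaces, function names, and account IDs are hypothetical):

```typescript
// Toy model of the bug described above (not actual CDK source).
// For an imported resource, the resource's own environment can differ
// from the environment of the Stack it was imported into.
interface Resource {
  env: { account?: string };   // the account the resource really lives in, if known
  stack: { account: string };  // the account of the Stack it was imported into
}

// Buggy behaviour: the action's account is taken from the Stack, which for a
// bucket imported into the tooling account is the tooling account.
function actionAccountBuggy(resource: Resource): string {
  return resource.stack.account;
}

// Fixed behaviour: prefer the resource's own account when it is known,
// falling back to the Stack's account otherwise.
function actionAccountFixed(resource: Resource): string {
  return resource.env.account ?? resource.stack.account;
}

// Hypothetical IDs: tooling = 111111111111, target = 222222222222.
const importedBucket: Resource = {
  env: { account: '222222222222' },
  stack: { account: '111111111111' },
};

console.log(actionAccountBuggy(importedBucket)); // → 111111111111 (wrong)
console.log(actionAccountFixed(importedBucket)); // → 222222222222 (right)
```

With the fixed lookup, a bucket imported via `fromBucketAttributes` with an explicit account is correctly treated as cross-account, so the support resources land in the right place.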
Thanks for the explanation @alastair-watts-avrios! I finally understood what the problem was. I've submitted #14224 fixing this bug. Thanks,
Thanks for your help. I will close my PR as it was far from production ready anyway. Hopefully your change will be released soon. Glad I was able to clarify the issue.
…resource's account, not its Stack's account (#14224)

In CodePipeline, Actions can have a resource backing them (like an S3 Bucket, or an ECS Service). In the CodePipeline construct, however, the account a given action was in was incorrectly determined by checking the Stack of the backing resource, instead of the resource itself. This value can be different for imported resources.

Fixes #14165

----
*By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license*
When an S3 bucket is imported into a cross-account pipeline for use in an `S3DeployAction`, it is hard to give it the correct permissions. We use an approach of having a tooling account and three different environment accounts. We have a final stage where we want to deploy some files from a pipeline in the tooling account to an existing S3 bucket in each env.

The bucket can actually be referenced using the pipeline as the scope in `Bucket.fromBucketAttributes` or `Bucket.fromBucketName`, and a role and policy are created. However, these are created entirely in the tooling account, so they have no effect once the pipeline step is actually triggered. To get around this we can create a new stack in the correct account and use that as the scope. This adds 7 extra CloudFormation templates to our build output (since we have multiple pipelines). Currently we have gone with an approach of mimicking some of the code from `getOtherStackIfActionIsCrossAccount` to get the cross-account support stack. It would be useful if this could somehow be spotted and the role created in the cross-account stack when needed.

Reproduction Steps
What did you expect to happen?
A role to be created in the cross-account stack, since the bucket exists in the test account. Or a build-time error.
What actually happened?
The step failed to run, and it was hard to debug since all the permissions seemed to be present.
Environment
Other
I can see two ways this may be solvable:

- …
- Update `getOtherStackIfActionIsCrossAccount` to use the second half of the method when the resource is an existing S3 bucket.

Happy to contribute, but I am unsure what solution(s) fit best. I am aware this is an edge case with multiple workarounds.
This is a 🐛 Bug Report.