Get s3 info cf template #3050
Changes from all commits
```diff
@@ -35,7 +35,7 @@
     ResourceZip,
 )
 from samcli.lib.package.s3_uploader import S3Uploader
-from samcli.lib.package.uploaders import Uploaders
+from samcli.lib.package.uploaders import Uploaders, Destination
 from samcli.lib.package.utils import (
     is_local_folder,
     make_abs_path,
@@ -47,7 +47,6 @@
 from samcli.lib.utils.packagetype import ZIP
 from samcli.yamlhelper import yaml_parse, yaml_dump

 # NOTE: sriram-mv, A cyclic dependency on `Template` needs to be broken.
@@ -265,3 +264,48 @@ def delete(self):
                 # Delete code resources
                 exporter = exporter_class(self.uploaders, None)
                 exporter.delete(resource_id, resource_dict)
+
+    def get_s3_info(self):
+        """
+        Iterates the template_dict resources with S3 EXPORT_DESTINATION to get the
+        s3_bucket and s3_prefix information for the purpose of deletion.
+        Method finds the first resource with s3 information, extracts the information
+        and then terminates. It is safe to assume that all the packaged files using the
+        commands package and deploy are in the same s3 bucket with the same s3 prefix.
+        """
```
Contributor: I think we should also mention it is safe to assume all packaged files are in the same S3 bucket and with the same prefix, because this is how …

Contributor: This condition only applies if the stack was created by SAM; if not, the assumption is not correct, for example if the stack template originally contained a Lambda function whose code was already in an S3 bucket. My understanding is that we need this information to be able to delete the template file, so if, for any reason, the S3 bucket information we get from the template is not the correct one, what will SAM's behaviour be?

Contributor (Author): That is a good point; this logic would only work for stacks created by SAM. In the case where we get bucket information that is not correct, SAM will not be able to tell that it is wrong and the template file won't be deleted.
```diff
+        result = {"s3_bucket": None, "s3_prefix": None}
+        if "Resources" not in self.template_dict:
+            return result
+
+        self._apply_global_values()
+
+        for _, resource in self.template_dict["Resources"].items():
+
+            resource_type = resource.get("Type", None)
+            resource_dict = resource.get("Properties", {})
+
+            for exporter_class in self.resources_to_export:
+                # Skip the resources which don't give s3 information
+                if exporter_class.EXPORT_DESTINATION != Destination.S3:
+                    continue
+                if exporter_class.RESOURCE_TYPE != resource_type:
+                    continue
+                if resource_dict.get("PackageType", ZIP) != exporter_class.ARTIFACT_TYPE:
+                    continue
+
+                exporter = exporter_class(self.uploaders, None)
+                s3_info = exporter.get_property_value(resource_dict)
+
+                result["s3_bucket"] = s3_info["Bucket"]
+                s3_key = s3_info["Key"]
+
+                # Extract the prefix from the key
+                if s3_key:
+                    key_split = s3_key.rsplit("/", 1)
+                    if len(key_split) > 1:
+                        result["s3_prefix"] = key_split[0]
+                break
+            if result["s3_bucket"]:
+                break
+
+        return result
```
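To make the behaviour of the new method concrete, the sketch below replays the same bucket/prefix extraction on a hand-written resource dictionary. It is a minimal illustration only: the function name `extract_s3_info` and the sample `CodeUri` value are hypothetical, and nothing here uses the samcli `Template` or exporter classes.

```python
# Illustrative sketch only: mirrors the bucket/prefix extraction idea of
# get_s3_info without the samcli Template/exporter machinery.
from typing import Dict, Optional


def extract_s3_info(resources: Dict[str, dict]) -> Dict[str, Optional[str]]:
    """Return the first S3 bucket/prefix found in packaged CodeUri values."""
    result: Dict[str, Optional[str]] = {"s3_bucket": None, "s3_prefix": None}
    for resource in resources.values():
        code_uri = resource.get("Properties", {}).get("CodeUri", "")
        if not isinstance(code_uri, str) or not code_uri.startswith("s3://"):
            continue
        bucket, _, key = code_uri[len("s3://"):].partition("/")
        result["s3_bucket"] = bucket
        # The prefix is everything before the last "/" in the key, if any,
        # matching the rsplit("/", 1) logic in the diff above.
        key_split = key.rsplit("/", 1)
        if len(key_split) > 1:
            result["s3_prefix"] = key_split[0]
        break  # the first match is enough, per the docstring's assumption
    return result


resources = {
    "HelloWorldFunction": {
        "Type": "AWS::Serverless::Function",
        "Properties": {"CodeUri": "s3://my-deploy-bucket/my-stack/abc123"},
    }
}
print(extract_s3_info(resources))
# {'s3_bucket': 'my-deploy-bucket', 's3_prefix': 'my-stack'}
```

As the review comments above note, this only recovers the actual deployment bucket when the template was produced by `sam package` / `sam deploy`; a `CodeUri` that already pointed at a user-managed bucket would be reported instead.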
Reviewer: Do we really get the S3 info from CloudFormation, or from the template itself? The S3 info can only be given through the toml config file, right?

Author: Yes, correct: we get the S3 info from the CloudFormation stack template, not from CloudFormation itself.
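For context on the reviewer's question, the snippet below shows the other common source of this information: the `samconfig.toml` written by `sam deploy --guided`. The file contents are an assumed typical layout, not something taken from this PR, and `get_s3_info` itself reads the stack template, as the author confirms above.

```python
# Hypothetical samconfig.toml contents (assumed typical layout produced by
# `sam deploy --guided`); shown only to illustrate the alternative source of
# s3_bucket/s3_prefix that the reviewer mentions.
import tomllib  # Python 3.11+; older versions can use the third-party `toml` package

SAMCONFIG = """
version = 0.1
[default.deploy.parameters]
stack_name = "my-stack"
s3_bucket = "aws-sam-cli-managed-default-samclisourcebucket-abc123"
s3_prefix = "my-stack"
region = "us-east-1"
"""

params = tomllib.loads(SAMCONFIG)["default"]["deploy"]["parameters"]
print(params["s3_bucket"], params["s3_prefix"])
```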