Python: Add S3 Batch scenario with CloudFormation integration #7531
Conversation
Looks like a good addition. I added some clarifications, references to applicable standards, and suggestions.
import boto3
from botocore.exceptions import ClientError, WaiterError


class CloudFormationHelper:
These classes would be more consistent with the other examples if they were separate files. See the coding standards for more information.
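A minimal sketch of that layout (the file names here are hypothetical, following the one-class-per-file pattern of other examples):

# cloudformation_helper.py (hypothetical file name) would hold CloudFormationHelper,
# and the scenario module would import it instead of defining both classes inline:
from cloudformation_helper import CloudFormationHelper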
""" | ||
try: | ||
# Define the CloudFormation template | ||
template = { |
Could this stack be defined in a separate file? See some of our other examples, such as directory buckets, for a sample of what this might look like.
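One possible shape, assuming the template is stored as a JSON file alongside the helper (the file name and helper function are placeholders):

from pathlib import Path

def _read_template(path='s3_batch_stack.template.json'):  # placeholder file name
    # Keep the template in its own reviewable file instead of building the
    # dict inline in Python.
    return Path(path).read_text()

# When deploying, pass the file contents as the template body:
# self.cfn_client.create_stack(
#     StackName=stack_name,
#     TemplateBody=_read_template(),
#     Capabilities=['CAPABILITY_NAMED_IAM'],
# )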
ClientError: If bucket creation fails
"""
try:
    if self.region_name != 'us-east-1':
You might include some comments as to why the region specification changes the parameters.
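For example, comments along these lines would explain the branch (self.s3_client, bucket_name, and the surrounding error handling are assumed from the PR's helper):

if self.region_name != 'us-east-1':
    # Outside us-east-1, S3 requires an explicit LocationConstraint that
    # matches the client's region.
    self.s3_client.create_bucket(
        Bucket=bucket_name,
        CreateBucketConfiguration={'LocationConstraint': self.region_name}
    )
else:
    # us-east-1 is the default S3 location and rejects an explicit
    # LocationConstraint, so no CreateBucketConfiguration is passed.
    self.s3_client.create_bucket(Bucket=bucket_name)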
# Create a CloudFormation client for the specified region
self.cfn_client = boto3.client('cloudformation', region_name=region_name)


def deploy_cloudformation_stack(self, stack_name):
You can include type hints in the function declarations. This is a newer change, but it's part of our coding standards also.
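For example (the return types are assumptions based on how the functions are used in the scenario):

def deploy_cloudformation_stack(self, stack_name: str) -> None:
    ...

def create_s3_batch_job(self, account_id: str, role_arn: str,
                        manifest_location: str,
                        report_bucket_name: str) -> str:
    # Returns the ID of the created job.
    ...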
DASHES = "-" * 80
STACK_NAME = "MyS3Stack"


def __init__(self, region_name='us-west-2'):
Unless there's a specific reason, we don't usually hard-code regions.
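A common alternative is to fall back to whatever region the caller has configured; this default-handling is only a sketch:

def __init__(self, region_name=None):
    # Use the region from the caller's AWS configuration or environment unless
    # one is passed explicitly, rather than hard-coding a default.
    self.region_name = region_name or boto3.session.Session().region_name
    self.cfn_client = boto3.client('cloudformation', region_name=self.region_name)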
None
"""
while True:
    user_input = input("\nEnter 'c' followed by <ENTER> to continue: ")
The "enter 'c'" prompt is a Java convention and isn't needed in Python. See our other examples for use of demo_tools, which has functionality for handling user input.
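A rough sketch of what that could look like with the shared demo_tools.question helpers; the exact helper names may differ from what's shown here:

from demo_tools import question as q  # shared helper package in this repo

def confirm_continue():
    # Replace the "enter 'c'" loop with a yes/no prompt from demo_tools.
    if not q.ask('Ready to continue with the next step? (y/n) ', q.is_yesno):
        raise SystemExit("Exiting the scenario at the user's request.")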


print("\n1. Creating S3 Batch Job...")
job_id = scenario.create_s3_batch_job(
    account_id,
Should the job be created in a suspended state so the rest of the scenario can run? When I run it, the job completes so quickly that the other parts of the code are unable to run.
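One way to do that with the s3control API is to create the job with ConfirmationRequired=True so it stays in the Suspended state, then activate it after the intermediate steps have run. The operation, report, and manifest values below are placeholders rather than the PR's exact settings:

def create_suspended_batch_job(s3control_client, account_id, role_arn,
                               manifest_arn, manifest_etag, report_bucket_arn):
    # ConfirmationRequired=True keeps the job Suspended until it is activated,
    # which gives the describe/tag/priority steps something to work with.
    response = s3control_client.create_job(
        AccountId=account_id,
        ConfirmationRequired=True,
        Operation={'S3PutObjectTagging': {
            'TagSet': [{'Key': 'BatchTag', 'Value': 'BatchValue'}]}},
        Report={'Bucket': report_bucket_arn,
                'Format': 'Report_CSV_20180820',
                'Enabled': True,
                'Prefix': 'batch-op-reports',
                'ReportScope': 'AllTasks'},
        Manifest={'Spec': {'Format': 'S3BatchOperations_CSV_20180820',
                           'Fields': ['Bucket', 'Key']},
                  'Location': {'ObjectArn': manifest_arn, 'ETag': manifest_etag}},
        Priority=10,
        RoleArn=role_arn,
    )
    return response['JobId']

# Once the rest of the scenario has run, activate the job explicitly:
# s3control_client.update_job_status(
#     AccountId=account_id, JobId=job_id, RequestedJobStatus='Ready')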


class TestCloudFormationHelper:
    """Test cases for CloudFormationHelper class."""
Have a look at some of the existing scenario tests, like ECR, to see how we use stubbers for our tests. Unit tests are used for logic testing; otherwise we use a stubber (with an option to use a real client) to run the scenario with mocked user input.
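A sketch of the stubbed style using botocore's Stubber directly; the stack name, stubbed responses, and helper wiring are assumptions that would need to match this PR's code (the existing examples wrap this pattern in shared test tooling):

import datetime
import boto3
from botocore.stub import Stubber

def test_deploy_cloudformation_stack():
    cfn_client = boto3.client('cloudformation', region_name='us-west-2')
    stubber = Stubber(cfn_client)
    stack_id = ('arn:aws:cloudformation:us-west-2:123456789012:'
                'stack/MyS3Stack/example')
    stubber.add_response('create_stack', {'StackId': stack_id})
    # The stack_create_complete waiter polls describe_stacks under the hood.
    stubber.add_response(
        'describe_stacks',
        {'Stacks': [{'StackName': 'MyS3Stack', 'StackId': stack_id,
                     'CreationTime': datetime.datetime(2024, 1, 1),
                     'StackStatus': 'CREATE_COMPLETE'}]},
        expected_params={'StackName': 'MyS3Stack'})
    with stubber:
        helper = CloudFormationHelper(region_name='us-west-2')  # the PR's class
        helper.cfn_client = cfn_client  # inject the stubbed client
        helper.deploy_cloudformation_stack('MyS3Stack')
    stubber.assert_no_pending_responses()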
def create_s3_batch_job(self, account_id, role_arn, manifest_location,
                        report_bucket_name):
    """
    Create an S3 batch operation job.
You can tag these discrete actions for inclusion in the actions part of the code library. See the Java example for the snippet blocks.
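In Python that's done with snippet comment markers around each action; the tag name below is only a placeholder and should match whatever is listed in the metadata file:

# snippet-start:[python.example_code.s3control.create_job]
def create_s3_batch_job(self, account_id, role_arn, manifest_location,
                        report_bucket_name):
    ...
# snippet-end:[python.example_code.s3control.create_job]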
@@ -132,5 +132,24 @@ s3-control_Basics:
            - description: An action class that wraps operations.
              snippet_tags:
                - s3control.java2.job.actions.main
    Python:
This scenario includes some additional actions such as tagging, listing, and updating the priority - if you want to add it here, it should include that functionality as well. I believe the Java example has those functions, and you can see the full specification here. You'll also want to add snippet tags around the individual action code so that it can be shown in the actions blocks above.
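A rough sketch of what the Python entry could look like once those actions and snippet tags exist (the GitHub path and tag names are placeholders):

Python:
  versions:
    - sdk_version: 3
      github: python/example_code/s3/scenarios/batch
      excerpts:
        - description: Run an interactive scenario that creates, tags, lists, and updates the priority of an S3 Batch Operations job.
          snippet_tags:
            - python.example_code.s3control.Batch.scenario
        - description: An action class that wraps operations.
          snippet_tags:
            - python.example_code.s3control.job.actions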
This PR adds a code example that demonstrates how to use the AWS SDK for Python (Boto3) to work with S3 Batch Operations. The scenario deploys supporting resources with AWS CloudFormation, then creates an S3 Batch Operations job from a manifest of objects and manages it with actions such as tagging, listing, and updating the job priority.
The example includes:
- A CloudFormationHelper class that deploys the stack of supporting resources
- A scenario class that creates and manages the S3 Batch Operations job
- Unit tests for the new classes
- Metadata updates for the s3-control_Basics example
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.