Step 1: Signing in to your AWS Account
Begin by opening your web browser and navigating to https://aws.amazon.com/. Sign in to your AWS account by entering your email address and password.
Once signed in successfully, you will be directed to the AWS Management Console dashboard.
Choose the appropriate region for your project from the top-right corner of the console.
You are now able to access and oversee your AWS resources, including EC2 instances, S3 buckets, and databases, via the console.
Don't forget to log out of your AWS account when you have completed your tasks.
Step 2: Setting Up an IAM Role
Navigate to the IAM (Identity and Access Management) service:
Click on "Services" located at the top of the page.
Enter "IAM" in the search bar and choose "IAM" from the search results.
Create a new IAM role by following these steps:
Go to the IAM dashboard and select "Roles" from the menu on the left.
Click on the "Create role" button.
Choose "AWS service" as the trusted entity type and select "Lambda" as the service that will use this role.
Proceed by clicking on the "Next: Permissions" button.
Add permissions to the role:
Search for "AWSLambdaBasicExecutionRole" and select it. This gives your Lambda function basic permissions to write logs.
Search for "AmazonS3ReadOnlyAccess" and select it. This gives your Lambda function read-only access to Amazon S3.
Click on the "Next: Tags" button (you can skip adding tags).
Click on the "Next: Review" button.
Name and create the role:
Give your role a name (e.g., LambdaS3Role) and optionally add a description.
Click on the "Create role" button.
Your IAM role is now set up: your Lambda function can read from Amazon S3 and write its logs to CloudWatch.
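Behind the console wizard, choosing Lambda as the trusted entity attaches a trust policy to the role. A minimal sketch of that policy document, built and printed in Python (the exact JSON the console generates may differ in formatting):

```python
import json

# Trust policy the console attaches when Lambda is the trusted entity:
# it allows the Lambda service to assume this role on your function's behalf.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```

The two managed policies you attached (AWSLambdaBasicExecutionRole and AmazonS3ReadOnlyAccess) grant the permissions; this trust policy only controls who may assume the role.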
Step 3: Creating a Lambda Function
Navigate to the Lambda service in the AWS Management Console.
Click on the "Create function" button.
Choose "Author from scratch," give the function a name, select a Python runtime, and under the execution role settings choose "Use an existing role" and pick the role from Step 2 (e.g., LambdaS3Role).
Click on the "Create function" button to initiate the creation of your Lambda function.
In the code editor, replace the default code with the sample Python code below.
Save your changes by clicking the "Deploy" button so the new code takes effect.
Sample code:

import json

def lambda_handler(event, context):
    # Log the incoming S3 event for debugging
    print("Received S3 event:", json.dumps(event, indent=2))

    # Add your custom logic here

    return {
        'statusCode': 200,
        'body': json.dumps('Lambda function executed successfully!')
    }
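You can exercise the handler locally before deploying. A sketch that mirrors the handler's logic and invokes it with a hand-made, minimal S3-style event (the bucket and key names are placeholders):

```python
import json

def lambda_handler(event, context):
    # Mirrors the sample handler: log the incoming event and return 200.
    print("Received S3 event:", json.dumps(event, indent=2))
    return {
        'statusCode': 200,
        'body': json.dumps('Lambda function executed successfully!')
    }

# Minimal S3 object-created event shape (placeholder bucket/key).
sample_event = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "example-bucket"},
                "object": {"key": "uploads/report.csv"},
            },
        }
    ]
}

response = lambda_handler(sample_event, None)
print(response["statusCode"])  # → 200
```

The same event JSON can be pasted into the Lambda console's "Test" tab to run it in AWS without uploading anything to S3.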
Step 4: Adding Event Notification to S3 Bucket
Navigate to the S3 Management Console.
Select the S3 bucket you wish to set up (if it's not already created, you can create one).
Click the bucket name, open the "Properties" tab, and scroll down to "Event notifications." Click "Create event notification."
Configure the event notification:
For the "Event name," pick a clear and descriptive name.
Under "Events," opt for "All object create events."
In the "Destination" section, select "Lambda function" and choose the function you created in Step 3 from the dropdown menu.
Review your configuration to ensure it's correct.
Click "Save changes" to add the event notification to the S3 bucket.
Now we have successfully created the event notification! 🚀
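Under the hood, the console saves a notification configuration on the bucket. A sketch of the equivalent structure (the function ARN, account ID, and the "Id" value are placeholders for whatever name and function you chose):

```python
import json

# Shape of the bucket notification configuration the console writes.
# The ARN below is a placeholder; AWS records your real region/account/function.
notification_config = {
    "LambdaFunctionConfigurations": [
        {
            "Id": "MyObjectCreateNotification",  # the event name you chose
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
            "Events": ["s3:ObjectCreated:*"],  # "All object create events"
        }
    ]
}

print(json.dumps(notification_config, indent=2))
```

Note that "s3:ObjectCreated:*" covers puts, copies, and multipart uploads; it does not fire for deletes.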
Step 5: Testing the Configuration
Upload a File to the S3 Bucket:
Navigate to the S3 Console.
Select the bucket you previously created.
Click on the "Upload" button and choose a file from your computer to upload it to the bucket.
Check Lambda Logs:
Navigate to the Lambda Console.
Select the Lambda function you created.
Click on the "Monitoring" tab.
Click "View CloudWatch logs," open the most recent log stream, and confirm that the printed event appears and that there are no error messages. A logged event matching your upload confirms the trigger fired and the function executed successfully.
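Once the trigger fires, the "custom logic" placeholder in the handler usually starts by pulling the bucket and object key out of each record. A sketch using a hand-made event (in a real invocation, S3 supplies the event; note that object keys arrive URL-encoded, so spaces appear as "+"):

```python
from urllib.parse import unquote_plus

def extract_uploads(event):
    # Collect (bucket, key) pairs from an S3 event; keys are URL-encoded.
    uploads = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])
        uploads.append((bucket, key))
    return uploads

# Hand-made event mimicking an upload of "my file.txt" (encoded space).
event = {
    "Records": [
        {"s3": {"bucket": {"name": "example-bucket"},
                "object": {"key": "my+file.txt"}}}
    ]
}

print(extract_uploads(event))  # → [('example-bucket', 'my file.txt')]
```

From there you could, for example, call s3.get_object with the decoded key to read the newly uploaded file, since the role from Step 2 grants read-only S3 access.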
In summary, triggering AWS Lambda from S3 object-creation events streamlines file-processing workflows: every file uploaded to the bucket automatically invokes your function, with no polling or manual steps. This reduces manual work and makes the pipeline easy to extend with your own processing logic.
Let me know if you have any other queries!