AWS S3 Tutorial for Beginners
In this post, we are going to learn about AWS S3, Amazon's object storage service. It is also one of the simplest ways to host a static website.
S3 organizes data into buckets. A bucket is a container in which objects are stored, and every object must live inside a bucket.
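To make the bucket/object relationship concrete, here is a toy in-memory sketch of the data model (not the real AWS API; the bucket name and key are made up):

```python
# A toy model of S3's data model: a bucket is a named container,
# and every object lives in exactly one bucket under a string key.
class Bucket:
    def __init__(self, name):
        self.name = name
        self.objects = {}  # key -> object body (bytes)

    def put_object(self, key, body):
        self.objects[key] = body

    def get_object(self, key):
        return self.objects[key]

bucket = Bucket("my-demo-bucket")       # hypothetical bucket name
bucket.put_object("photo.jpg", b"...")  # the object is stored under a key
print(bucket.get_object("photo.jpg"))
```

The real service works the same way conceptually: you never upload an object on its own, you always put it into a bucket under a key.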
To start with AWS, we need to log in to the console.
If you don’t have an AWS account, create one with your email address. Also note that it is mandatory to provide a debit or credit card during registration. After registration is done, we can opt for the Free Tier.
You will get the screen below once registration is complete. Here, simply click the Go to the AWS Management Console button.
For a first-time user, the console will look like below.
We clicked on S3 in the above screenshot, which opens a new screen. There, click the Create bucket button.
Now, on the next screen, enter the Bucket name and choose a Region close to your location.
Scroll down a bit and keep all the default options. After that, click the Create bucket button.
Now, the bucket will be created and shown in the console.
Clicking the bucket link opens a new screen. Here, click the Upload button to upload files or folders.
Now, on the next screen, click the Add files button to add a file.
We added a simple image file from our local computer and then clicked the Upload button.
Once the upload is complete, it will show a Succeeded status. Click the Close button in the upper-right corner.
Back to the bucket, we can see the basic properties in the Properties tab now.
Amazon S3 charges differently for different storage classes. By default, it will put everything in the costlier S3 Standard class. All the storage classes are shown in the image below.
We can move our data to cheaper storage classes over time, from top to bottom in that list, by creating lifecycle rules.
So, click the Management tab and then the Create lifecycle rule button.
Now, we give the rule a name and limit its scope with a filter. In the Prefix field, we add the prefix to which this rule will apply; in our case it is the logs folder.
After scrolling down a bit, we need to check the checkbox to move objects between storage classes. We also need to add the storage classes our objects will be moved to after a certain number of days.
We also need to acknowledge a checkbox about transitioning small objects.
In the previous screen, if we click on Amazon S3 pricing, we can see the pricing of each storage class.
Back on the previous screen, we scrolled down and clicked the Create rule button.
We got an error for Glacier Deep Archive, as we need to increase the number of days. We increased it to 180 and clicked the Create rule button again.
This time our rule was created, and it shows as Enabled.
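The rule we just built in the console corresponds roughly to a JSON lifecycle configuration like the one below. The `logs/` prefix and the 180-day Deep Archive transition match our example; the rule name and the earlier transition days (30 and 90) are illustrative assumptions, not values from the screenshots:

```python
import json

# Roughly the lifecycle configuration behind the rule created in the console.
lifecycle_rule = {
    "ID": "move-logs-to-cheaper-storage",  # hypothetical rule name
    "Filter": {"Prefix": "logs/"},         # rule applies only to the logs folder
    "Status": "Enabled",
    "Transitions": [
        {"Days": 30, "StorageClass": "STANDARD_IA"},    # illustrative
        {"Days": 90, "StorageClass": "GLACIER"},        # illustrative
        {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},  # matches our fix above
    ],
}

print(json.dumps({"Rules": [lifecycle_rule]}, indent=2))
```

Each transition must come later than the previous one, which is why the console rejected our first attempt for Glacier Deep Archive until we raised it to 180 days.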
Now, we will create the logs folder, whose contents will move through the different storage classes. So, back in Buckets, click the Create folder button.
Now, give the folder name as logs and click on the Create folder button.
Now, the logs folder will be created in our bucket.
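One thing worth knowing: S3 has no real directories. A "folder" is just a shared key prefix, which is why our lifecycle rule could target the folder with the prefix `logs/`. A quick sketch (the key names are made up):

```python
# In S3, a "folder" is just a key prefix. Creating the logs folder in the
# console simply means we can upload objects whose keys start with "logs/".
keys = [
    "logs/access.log",  # appears "inside" the logs folder
    "logs/error.log",
    "index.html",       # sits at the bucket root
]

# This is effectively what a prefix filter does:
logs_objects = [k for k in keys if k.startswith("logs/")]
print(logs_objects)  # → ['logs/access.log', 'logs/error.log']
```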
Now, click on the logs folder and upload any file. Here, we have uploaded a sample Apache log file. This file will now move to a different storage class after the specified number of days.
Now, we will look into Bucket Policy. A bucket policy allows or denies access to a bucket for specific users or services. It is an IAM resource policy attached to the bucket, written as a JSON document.
For this, first go to the bucket and click the Permissions tab. After that, click the Edit button in the Bucket policy section.
Now, on the next screen, click on the Policy generator to generate the JSON. But first copy the Bucket ARN, which we will need on the next screen.
On the next screen, click Select Type of Policy and set it to S3 Bucket Policy. The Effect will be Allow and the Principal will be *.
In Actions, check the checkbox for All Actions. And in the ARN field, paste the ARN copied earlier. After that, click the Add Statement button.
Now, click the Generate Policy button and it will open a pop-up with the policy JSON. Copy its content.
Now, we paste the policy into the Edit bucket policy screen. Also, notice that we appended /* to the Resource ARN to cover all objects.
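The generated policy looks roughly like the following (the bucket name and statement id are hypothetical placeholders; your generator output will carry your own ARN):

```python
import json

# Roughly the policy the generator produces for our choices:
# Effect Allow, Principal *, All Actions, and the bucket ARN with a
# trailing /* so the statement covers every object in the bucket.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1",        # generator-assigned id (placeholder here)
            "Effect": "Allow",
            "Principal": "*",      # anyone on the internet
            "Action": "s3:*",      # "All Actions" in the generator
            "Resource": "arn:aws:s3:::my-demo-bucket/*",  # note the /*
        }
    ],
}

print(json.dumps(bucket_policy, indent=2))
```

For a real public website you would normally allow only `s3:GetObject` rather than `s3:*`, since `s3:*` with Principal `*` also lets anyone write to or delete from the bucket; we use All Actions here only to follow the tutorial.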
Now, scroll down a bit and click the Save changes button. But we get a big error, because this policy makes everything in the bucket public and public access is blocked by default.
To solve the issue, we need to go back to the Permissions tab and click the Edit button in Block public access section.
Now, uncheck the checkbox for Block all public access and click on the Save changes button.
Since this allows all public access to the bucket, it will ask us to confirm.
Back on the Edit bucket policy screen, we clicked Save changes again and this time it was saved. But our bucket now shows a red Publicly accessible tag.
Now, we will learn to add versioning to our bucket. For this, in the bucket's Properties tab, click Edit in the Bucket Versioning section.
In the next screen click on Enable and then Save changes.
Now, we will upload a simple HTML project in our bucket.
Once the three files are uploaded, click on the index.html file.
On the next screen we will see an Object URL. Copy it.
Now, opening the URL shows our simple project, which is working fine.
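The Object URL the console shows follows a predictable virtual-hosted-style pattern: `https://<bucket>.s3.<region>.amazonaws.com/<key>`. A small sketch with hypothetical bucket and region values:

```python
# Virtual-hosted-style S3 object URL: bucket name, then region, then key.
def object_url(bucket, region, key):
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

# Hypothetical bucket and region, just to show the shape of the URL.
print(object_url("my-demo-bucket", "us-east-1", "index.html"))
# → https://my-demo-bucket.s3.us-east-1.amazonaws.com/index.html
```

This URL only works in a browser because we made the bucket public with the policy above; on a private bucket it would return an Access Denied error.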
Now, back in S3, click the Versions tab and we will see the versioning in place.
To see the versioning in action, we have updated our small project and uploaded the files again.
Now, if we go to Versions tab, we will see the new version.
And revisiting the public URL will show our updated site.
This completes our AWS S3 tutorial, where we also learned to host a static site.