How to Upload Files to an Amazon S3 Bucket

What is AWS S3?

Amazon S3 (Simple Storage Service) is a scalable, high-speed, easy-to-use cloud-based object storage service provided by Amazon Web Services (AWS). It allows you to store and retrieve data of any size from anywhere on the web. S3 is commonly used for data backup, hosting static websites, storing media files, application data, and more.

Today, most software companies (small or large) prefer AWS S3 to store and protect massive amounts of data for a range of use cases such as websites, mobile applications, archives, enterprise applications, data lakes, backup and restore, IoT devices, and big data analytics.

 

Key Features of Amazon S3:

  • Scalability: S3 lets you store an unlimited amount of data and easily scale your storage up or down as your needs change.
  • Durability and Availability: S3 is designed for 99.999999999% (11 nines) durability and provides high availability for stored objects across multiple locations and Availability Zones.
  • Security: It offers robust security features, including encryption, access control, and identity management to control access to your data.
  • Flexible Storage Classes: S3 provides various storage classes optimized for different use cases, including Standard, Infrequent Access (IA), Intelligent-Tiering, Glacier for archiving, and more.
  • Versioning and Lifecycle Policies: You can enable versioning to maintain multiple versions of an object and define lifecycle policies to automate data transitions to different storage classes or delete expired data.
  • Integration and Compatibility: S3 integrates seamlessly with other AWS services and supports a wide range of applications, frameworks, and SDKs across different platforms.
  • RESTful API: Access to S3 is provided through a simple, RESTful API, enabling easy integration and interaction with the service.

 

Use Cases of Amazon S3:

  • Backup and Recovery: Storing backups of critical data and enabling disaster recovery solutions.
  • Static Website Hosting: Hosting static websites by directly serving HTML, CSS, JS, and media files.
  • Data Archiving: Storing rarely accessed data for long-term retention using lower-cost storage classes like Glacier.
  • Content Distribution: Serving media files, large downloads, and streaming content through Amazon CloudFront CDN (Content Delivery Network) that integrates with S3.

Amazon S3 is a highly reliable, secure, and versatile storage service suitable for a wide range of applications, from small-scale projects to enterprise-level solutions.

 

To store data in S3, we work with two resources: buckets and objects. A bucket is a container for objects, and an object is a file together with any metadata that describes it.

To store an object in AWS S3, we first create a bucket and then upload the object into it. We can open, download, and copy that object whenever needed, and we can clean up the bucket or its objects when they are no longer required.

AWS S3 allows you to upload, store, and download any type of file or data up to 5 TB in size.

 

Uploading Objects in S3

Every file is stored in S3 as an object. Before uploading an object to S3, we need write permission on the bucket. You can upload any type of file to an S3 bucket - images, backups, data, movies, etc. The maximum file size that can be uploaded using the Amazon S3 console is 160 GB. To upload a file larger than 160 GB, use the AWS Command Line Interface (AWS CLI), the AWS SDKs, or the Amazon S3 REST API.

 

If you upload an object with a key that already exists in a versioning-enabled bucket, Amazon S3 creates another version of the object instead of overwriting the existing one.

  • Upload an object in a single operation using the AWS SDKs, REST API, or AWS CLI - You can upload a single object up to 5 GB in size with a single PUT operation.
  • Upload a single object using the Amazon S3 console - The Amazon S3 console allows you to upload a single object up to 160 GB in size.
  • Upload an object in parts using the AWS SDKs, REST API, or AWS CLI - The multipart upload API allows you to upload a single large object up to 5 TB in size.

When you upload an object, it is automatically encrypted by default using server-side encryption with Amazon S3 managed keys (SSE-S3), and it is decrypted when you download it.

 

Upload Using the AWS SDKs

This example guides you through using classes from the AWS SDK for PHP to upload an object of up to 5 GB in size. The following PHP example creates an object in a specified bucket by uploading data with the putObject() method.
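Here is a minimal sketch, assuming a placeholder bucket name, object key, and local file path, and that your AWS credentials and region are already configured for your environment:

<?php

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

// Create an S3 client (region is a placeholder; credentials come from your environment).
$s3Client = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

try {
    // Upload a local file as an object with a single PUT (up to 5 GB).
    $result = $s3Client->putObject([
        'Bucket'     => 'my-example-bucket',   // placeholder bucket name
        'Key'        => 'uploads/sample.txt',  // object key (path) in the bucket
        'SourceFile' => '/path/to/sample.txt', // local file to upload
    ]);

    echo 'Object uploaded. URL: ' . $result['ObjectURL'] . PHP_EOL;
} catch (AwsException $e) {
    echo 'Upload failed: ' . $e->getMessage() . PHP_EOL;
}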

 

 For larger files, you must use the multipart upload API operation. For more information, see Uploading and copying objects using multipart upload.

The multipart upload API is designed to improve the upload experience for larger objects. You can upload an object in parts, and those parts can be uploaded independently, in any order, and in parallel. You can use multipart upload for objects between 5 MB and 5 TB in size.

Multipart upload breaks the file into smaller parts and uploads them in parallel, improving efficiency, resiliency, and the ability to upload large files.

Here's a general guide to uploading a single large object using multipart upload in AWS S3 with the AWS SDK for PHP:
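The sketch below uses the SDK's high-level MultipartUploader class, which splits the file into parts and uploads them in parallel; the bucket name, key, and file path are placeholders:

<?php

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$s3Client = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

// The MultipartUploader handles splitting the file and uploading the parts.
$uploader = new MultipartUploader($s3Client, '/path/to/large-file.zip', [
    'bucket' => 'my-example-bucket',      // placeholder bucket name
    'key'    => 'uploads/large-file.zip', // object key in the bucket
]);

try {
    $result = $uploader->upload();
    echo 'Upload complete: ' . $result['ObjectURL'] . PHP_EOL;
} catch (MultipartUploadException $e) {
    echo 'Upload failed: ' . $e->getMessage() . PHP_EOL;
}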

Downloading objects from S3

Downloading an object (file) from an Amazon S3 bucket using the AWS SDK for PHP is relatively straightforward. Here's an example of how you can download an object from an S3 bucket:
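A minimal sketch, assuming a placeholder bucket name, object key, and local destination path:

<?php

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3Client = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

try {
    // Fetch the object and save its contents to a local file.
    $result = $s3Client->getObject([
        'Bucket' => 'my-example-bucket',       // placeholder bucket name
        'Key'    => 'uploads/sample.txt',      // object key to download
        'SaveAs' => '/path/to/downloaded.txt', // local destination file
    ]);

    echo 'Downloaded ' . $result['ContentLength'] . ' bytes.' . PHP_EOL;
} catch (S3Exception $e) {
    echo 'Download failed: ' . $e->getMessage() . PHP_EOL;
}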

Deleting objects from S3

The following example shows how you can remove an object from a bucket using the AWS SDK for PHP.
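A minimal sketch, again with placeholder bucket and key names:

<?php

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3Client = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

try {
    // Permanently remove the object (or add a delete marker if versioning is enabled).
    $s3Client->deleteObject([
        'Bucket' => 'my-example-bucket',  // placeholder bucket name
        'Key'    => 'uploads/sample.txt', // object key to delete
    ]);

    echo 'Object deleted.' . PHP_EOL;
} catch (S3Exception $e) {
    echo 'Delete failed: ' . $e->getMessage() . PHP_EOL;
}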

 

How to Use AWS S3 in a Laravel Application

If you haven't installed the Amazon S3 filesystem package yet, install it with the following Composer command.
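For example (the exact version constraint depends on your Laravel version; Laravel 9 and later use Flysystem v3):

composer require league/flysystem-aws-s3-v3 "^3.0"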

Configuration of AWS S3 Credentials

Now you need to add your AWS credentials to your .env file like this:
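The values shown below are placeholders; Laravel's default config/filesystems.php reads these keys for the s3 disk:

AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_DEFAULT_REGION=us-east-1
AWS_BUCKET=your-bucket-name
AWS_USE_PATH_STYLE_ENDPOINT=false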
 

The next step is to add two new routes to the web.php file: one to display the upload form and another to handle the POST request. Let's create both routes as listed below:
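A minimal sketch for routes/web.php (the URI and route names are assumptions; the controller methods match the ones created in the next step):

<?php

use App\Http\Controllers\ImageUploadController;
use Illuminate\Support\Facades\Route;

// Show the upload form.
Route::get('image-upload', [ImageUploadController::class, 'imageUpload'])->name('image.upload');

// Handle the submitted form.
Route::post('image-upload', [ImageUploadController::class, 'imageUploadPost'])->name('image.upload.post');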


 

In this step, we create a new ImageUploadController with two methods, imageUpload() and imageUploadPost(): one renders the upload form and the other handles the POST request. Add the following code:
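A minimal sketch of the controller (the view name, upload field name, validation rules, and target path are assumptions you can adapt):

<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;

class ImageUploadController extends Controller
{
    // Render the upload form (the "imageUpload" view name is an assumption).
    public function imageUpload()
    {
        return view('imageUpload');
    }

    // Validate the submitted file and upload it to the s3 disk.
    public function imageUploadPost(Request $request)
    {
        $request->validate([
            'image' => 'required|image|mimes:jpg,jpeg,png,gif|max:2048',
        ]);

        // store() uploads the file to the s3 disk and returns the object key,
        // e.g. "images/aB3dE....jpg".
        $path = $request->file('image')->store('images', 's3');

        return back()->with('success', 'Image uploaded successfully: ' . $path);
    }
}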

 

Downloading an Object from S3 in Laravel:
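A minimal sketch using the Storage facade inside a route closure (the route URI and object key are placeholders; the same call works inside a controller method):

<?php

use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

// Stream an object from the s3 disk back to the browser as a file download.
Route::get('image-download', function () {
    return Storage::disk('s3')->download('images/sample.jpg', 'sample.jpg');
});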

Deleting an Object from S3 in Laravel:
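A minimal sketch, again with a placeholder route URI and object key:

<?php

use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

// Delete an object from the s3 disk and redirect back.
Route::delete('image-delete', function () {
    Storage::disk('s3')->delete('images/sample.jpg');

    return back()->with('success', 'Image deleted from S3.');
});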

 

 

Displaying an Image from S3 in Laravel View:
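A minimal sketch for a Blade view (the object key is a placeholder; for a private bucket, Storage::disk('s3')->temporaryUrl() can generate a time-limited signed link instead):

{{-- Build the image URL from the object key stored for the user --}}
<img src="{{ Storage::disk('s3')->url('images/sample.jpg') }}" alt="Uploaded image">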

 

N.B.: How are folders created in S3 when uploading an image to a path like image/users/gallery?

In Amazon S3, you don't create actual folders; rather, the folder structure is simulated using the object key (file path). You can use a delimiter (like /) in the object key to organize and mimic a folder structure.

For example, if you want to organize images for users in a gallery, you can structure the object keys as follows:

  • To upload an image for a user with ID 123 to their gallery: Object Key: image/users/123/gallery/image1.jpg

Here, image/users/123/gallery/ appears like a folder structure, but in reality, it's part of the object's key.

When using the AWS SDK or any S3 API to upload an image, you'll specify the entire object key (image/users/123/gallery/image1.jpg). If the "folders" don't exist yet, S3 will create them implicitly as part of the object key path when you upload the image.

To illustrate using the AWS SDK for PHP:
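A minimal sketch with a placeholder bucket name and local file path; the folder-like prefix lives entirely in the object key:

<?php

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3Client = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

// The "folders" exist only as part of the object key; S3 itself stores a flat namespace.
$s3Client->putObject([
    'Bucket'     => 'my-example-bucket',                  // placeholder bucket name
    'Key'        => 'image/users/123/gallery/image1.jpg', // key with folder-like prefix
    'SourceFile' => '/path/to/image1.jpg',                // local file to upload
]);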

In this example, when putObject is called, the image/users/123/gallery/ prefix does not need to exist beforehand; it simply becomes part of the object key when image1.jpg is uploaded, and the S3 console displays it as a folder.

This logical structuring makes it convenient to organize and manage objects within S3 without explicit folder creation.


Conclusion: Amazon S3, a cloud-based storage service by AWS, excels in scalability, durability, and security. It accommodates diverse storage needs, from backups to static website hosting, with seamless integration into AWS services. Offering a RESTful API and supporting various storage classes, S3 ensures flexibility and efficiency. The blog covers key features, use cases, and the process of uploading, downloading, and deleting objects in S3. It highlights the multipart upload API for large files and provides a practical guide for implementing AWS S3 in Laravel applications. S3's versatility and reliability make it an ideal choice for data management, catering to both small-scale and enterprise-level solutions.
