Control and management of data can be a strenuous task. These AWS S3 commands will help you quickly and efficiently manage your AWS S3 buckets and data.

AWS S3 is the object storage service provided by AWS. It is the most widely used storage service from AWS that can virtually hold an infinite amount of data. It is highly available, durable, and easy to integrate with several other AWS Services.  

AWS S3 suits almost any storage requirement: mobile/web application storage, big data storage, machine learning data storage, hosting static websites, and many more.

If you have been using S3 in your project, you would know that given the vast amount of storage capacity, managing hundreds of buckets and terabytes of data in those buckets can be a demanding job. Here is a list of AWS S3 commands with examples that you can use to manage your AWS S3 buckets and data efficiently.

AWS CLI Setup

After you have successfully downloaded and installed the AWS CLI, you need to configure AWS Credentials to be able to access your AWS Account and services. Let us quickly run through how you can configure AWS CLI.

The first step is to create a user with programmatic access to your AWS account. Remember to enable the programmatic access option when you create the user for the AWS CLI.

Assign the permissions and create the user. On the final screen, after the user has been created, copy the Access key ID and Secret access key. We will use these credentials to log in via the AWS CLI.

Now go to the terminal of your choice and run the following command.

aws configure 

Enter the Access key ID and Secret access key when prompted. Select the AWS region of your choice and the command output format. I personally prefer the JSON format. These choices are no big deal; you can always change the values later.
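A typical aws configure session looks roughly like this (the key values shown are AWS's documented placeholder examples, not real credentials):

```shell
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: eu-west-1
Default output format [None]: json
```

The answers are saved to ~/.aws/credentials and ~/.aws/config, which is why you can edit them at any time later.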

You can now run any AWS CLI Command in the console. Let us now go through the AWS S3 Commands.

cp

The cp command simply copies the data to and from S3 buckets. It can be used to copy files from local to S3, from S3 to local, and between two S3 buckets. There are a lot of other parameters that you can supply with the commands.

For example, the --dryrun parameter to test a command without making any changes, the --storage-class parameter to specify the storage class of your data in S3, other parameters to set encryption, and much more. The cp command gives you complete control over how you configure your data security in S3.
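As a rough sketch of how these flags combine (bucket and file names here are placeholders; --sse AES256 requests server-side encryption):

```shell
# Preview the operation without actually copying anything
aws s3 cp file_name.txt s3://bucket_name/file_name.txt --dryrun

# Copy with server-side encryption and the Infrequent Access storage class
aws s3 cp file_name.txt s3://bucket_name/file_name.txt --sse AES256 --storage-class STANDARD_IA
```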

Usage

aws s3 cp <SOURCE> <DESTINATION> [--options]

Examples

Copy data from local to S3

aws s3 cp file_name.txt s3://bucket_name/file_name_2.txt

Copy data from S3 to local

aws s3 cp s3://bucket_name/file_name_2.txt file_name.txt

Copy data between S3 buckets

aws s3 cp s3://bucket_name/file_name.txt s3://bucket_name_2/file_name_2.txt

Copy data from local to S3 – IA

aws s3 cp file_name.txt s3://bucket_name/file_name_2.txt --storage-class STANDARD_IA 

Copy all the data from a local folder to S3

aws s3 cp ./local_folder s3://bucket_name --recursive

ls

The ls command is used to list the buckets or the contents of the buckets. So, if you simply want to view information about your buckets or the data in them, use the ls command.

Usage

aws s3 ls [<BUCKET_NAME>] [--options]

Examples

List all buckets in the account

aws s3 ls

Output:
2022-02-02 18:20:14 BUCKET_NAME_1
2022-03-20 13:12:43 BUCKET_NAME_2
2022-03-29 10:52:33 BUCKET_NAME_3

This command lists all the buckets in your account along with each bucket's creation date.

List all top-level objects in a bucket

aws s3 ls s3://BUCKET_NAME_1

Output:
                           PRE samplePrefix/
2021-12-09 12:23:20       8754 file_1.png
2021-12-09 12:23:21       1290 file_2.json
2021-12-09 12:23:21       3088 file_3.html

This command lists all the top-level objects in an S3 bucket (the bare bucket name without the s3:// prefix also works). Note that objects under the samplePrefix/ prefix are not shown here; only the top-level objects are.

List all the objects in a bucket

aws s3 ls s3://BUCKET_NAME_1 --recursive

Output:
2021-12-09 12:23:20       8754 file_1.png
2021-12-09 12:23:21       1290 file_2.json
2021-12-09 12:23:21       3088 file_3.html
2021-12-09 12:23:20      16328 samplePrefix/file_1.txt
2021-12-09 12:23:20      29325 samplePrefix/sampleSubPrefix/file_1.css

This command lists all the objects in an S3 bucket. Note that objects under the samplePrefix/ prefix, and all sub-prefixes, are also displayed.
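When the raw byte counts get hard to read, ls also accepts the --human-readable and --summarize flags, which print friendlier sizes plus totals:

```shell
aws s3 ls s3://BUCKET_NAME_1 --recursive --human-readable --summarize
```

The output then shows sizes in units like KiB or MiB and ends with Total Objects and Total Size summary lines.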

mb

The mb command is simply used to create new S3 buckets. This is a fairly simple command, but the name of the new bucket must be unique across all S3 buckets globally.

Usage

aws s3 mb s3://<BUCKET_NAME>

Example

Create a new bucket in a specific region

aws s3 mb s3://my-unique-bucket-name --region eu-west-1

mv

The mv command simply moves data to and from S3 buckets. Just like the cp command, the mv command moves data from local to S3, from S3 to local, or between two S3 buckets.

The only difference between mv and cp is that with mv the file is deleted from the source after it is copied to the destination. There are a lot of options that you can specify with the command.

Usage

aws s3 mv <SOURCE> <DESTINATION> [--options]

Examples

Move data from local to S3

aws s3 mv file_name.txt s3://bucket_name/file_name_2.txt

Move data from S3 to local

aws s3 mv s3://bucket_name/file_name_2.txt file_name.txt

Move data between S3 buckets

aws s3 mv s3://bucket_name/file_name.txt s3://bucket_name_2/file_name_2.txt

Move data from local to S3 – IA

aws s3 mv file_name.txt s3://bucket_name/file_name_2.txt --storage-class STANDARD_IA 

Move all the data from a prefix in S3 to a local folder.

aws s3 mv s3://bucket_name/somePrefix ./localFolder --recursive

presign

The presign command generates a pre-signed URL for a key in an S3 bucket. You can use this command to generate URLs that others can use to access the object at the specified S3 key without needing AWS credentials of their own.

Usage

aws s3 presign <S3_URI> --expires-in <TIME_IN_SECONDS>

Example

Generate a pre-signed URL that is valid for 1 hour for an object in the bucket.

aws s3 presign s3://bucket_name/samplePrefix/file_name.png --expires-in 3600

Output:
https://s3.ap-south-1.amazonaws.com/bucket_name/samplePrefix/file_name.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIA4MCZT73PAX7ZMVFW%2F20220314%2Fap-south-1%2Fs3%2Faws4_request&X-Amz-Date=20220314T054113Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=f14608bbf3e1f9f8d215eb5b439b87e167b1055bcd7a45c13a33debd3db1be96
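Whoever receives the URL can then download the object with any HTTP client until the URL expires; for example with curl (URL truncated here for readability):

```shell
curl -o file_name.png "https://s3.ap-south-1.amazonaws.com/bucket_name/samplePrefix/file_name.png?X-Amz-Algorithm=...&X-Amz-Signature=..."
```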

rb

The rb command is simply used to delete S3 buckets.

Usage

aws s3 rb s3://<BUCKET_NAME>

Example

Delete an S3 bucket.

aws s3 rb s3://bucket_name
# This command fails if there is any data in the bucket.

Delete an S3 bucket along with the data in the S3 bucket.

aws s3 rb s3://bucket_name --force

rm

The rm command is simply used to delete the objects in S3 buckets.

Usage

aws s3 rm <S3Uri_To_The_File>

Examples

Delete one file from the S3 bucket.

aws s3 rm s3://bucket_name/sample_prefix/file_name_2.txt

Delete all files with a specific prefix in an S3 bucket.

aws s3 rm s3://bucket_name/sample_prefix --recursive

Delete all files in an S3 bucket.

aws s3 rm s3://bucket_name --recursive
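The rm command also supports --exclude and --include filters. For example, to delete only objects matching a pattern (a hypothetical *.log pattern here), you can exclude everything and then include just that pattern, previewing with --dryrun first:

```shell
# Preview which .log objects would be deleted
aws s3 rm s3://bucket_name --recursive --exclude "*" --include "*.log" --dryrun

# Run the actual deletion
aws s3 rm s3://bucket_name --recursive --exclude "*" --include "*.log"
```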

sync

The sync command copies and updates files from the source to the destination, just like the cp command, so it is important to understand the difference between the two. When you use cp, data is copied from source to destination even if it already exists at the destination.

The sync command, on the other hand, looks at the destination before copying and only copies new and updated files. By default, neither command deletes files from the destination when they are deleted from the source. The sync command is similar to pushing changes to a remote branch in git, and it offers a lot of options to customize its behavior.
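One caveat: sync does not delete anything at the destination unless you ask it to. If you want a true mirror, the --delete option removes destination files that no longer exist at the source; a cautious sketch (bucket and folder names are placeholders):

```shell
# Preview what would change, including deletions
aws s3 sync ./local_folder s3://bucket_name --delete --dryrun

# Mirror the local folder to S3, deleting remote files removed locally
aws s3 sync ./local_folder s3://bucket_name --delete
```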

Usage

aws s3 sync <SOURCE> <DESTINATION> [--options]

Examples

Sync local folder to S3

aws s3 sync ./local_folder s3://bucket_name

Sync S3 data to a local folder

aws s3 sync s3://bucket_name ./local_folder

Sync data between two S3 buckets

aws s3 sync s3://bucket_name s3://bucket_name_2

Sync data between two S3 buckets, excluding all .txt files

aws s3 sync s3://bucket_name s3://bucket_name_2 --exclude "*.txt"

website

You can use S3 buckets to host static websites. The website command configures static website hosting for your bucket.

You specify the index and error documents, and S3 gives you a website endpoint URL where visitors can view your site.

Usage

aws s3 website <S3_URI> [--options]

Example:

Configure static hosting for an S3 bucket and specify the index and error files

aws s3 website s3://bucket_name --index-document index.html --error-document error.html
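Assuming the bucket also allows public reads (via a bucket policy, which this command does not set up), the site is served from the region-specific website endpoint. It generally follows the pattern below, though the exact format varies slightly between regions:

```shell
# Hypothetical endpoint for a bucket in eu-west-1:
curl http://bucket_name.s3-website-eu-west-1.amazonaws.com/index.html
```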

Conclusion

I hope the above gives you an idea about some of the frequently used AWS S3 commands to manage buckets. If you are interested in learning more, you may check out AWS certification details.