Now that your bucket is configured, let's upload some files. First, create a sample text file:

```
echo "Hello, Amazon S3! This is a sample file for the getting started tutorial." > sample.txt
```

Upload this file to your bucket:

```
aws s3api put-object \
    --bucket "$BUCKET_NAME" \
    --key "sample.txt" \
    --body "sample.txt"
```

The response includes an ETag (entity tag) that uniquely identifies the content of the object:

```
{
    "ETag": "\"abcd1234abcd1234abcd1234abcd1234\""
}
```
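
Incidentally, for a single-part upload that does not use SSE-KMS, the ETag is the hex MD5 digest of the object's bytes, so you can sanity-check an upload against the local file. A quick sketch (assumes `md5sum` is available; compare the printed digest to the ETag, minus its quotes):

```
# Recreate the sample file locally (same content as uploaded above)
echo "Hello, Amazon S3! This is a sample file for the getting started tutorial." > sample.txt

# For a single-part upload without SSE-KMS, this digest should
# match the ETag value returned by put-object
local_md5=$(md5sum sample.txt | cut -d' ' -f1)
echo "$local_md5"
```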
## Download and verify objects

To download an object from your bucket to your local machine:

```
aws s3api get-object \
    --bucket "$BUCKET_NAME" \
    --key "sample.txt" \
    "downloaded-sample.txt"
```

The command downloads the object and saves it as `downloaded-sample.txt` in your current directory. The output provides metadata about the object:

```
{
    "AcceptRanges": "bytes",
    "LastModified": "2026-01-13T20:39:53+00:00",
    "ContentLength": 75,
    "ETag": "\"abcd1234abcd1234abcd1234abcd1234\"",
    "ContentType": "binary/octet-stream",
    "Metadata": {}
}
```
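
A quick way to confirm the download is byte-identical to the original is `cmp`. The sketch below stands in for the downloaded file with `cp` so it runs standalone; in the tutorial flow you would compare against the file written by `get-object`:

```
# Create the original and a stand-in for the downloaded object
echo "Hello, Amazon S3! This is a sample file for the getting started tutorial." > sample.txt
cp sample.txt downloaded-sample.txt

# cmp exits 0 (silently, with -s) when the files are byte-identical
if cmp -s sample.txt downloaded-sample.txt; then
    echo "files match"
fi
```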

## Copy an object to a folder prefix

Although S3 is a flat object store, you can simulate folders by using key name prefixes. Let's copy the sample file into a `backup/` prefix:

```
aws s3api copy-object \
    --bucket "$BUCKET_NAME" \
    --copy-source "$BUCKET_NAME/sample.txt" \
    --key "backup/sample.txt"
```

The response includes information about the copy operation:

```
{
    "CopyObjectResult": {
        "ETag": "\"abcd1234abcd1234abcd1234abcd1234\"",
        "LastModified": "2026-01-13T20:39:59+00:00"
    }
}
```

## Enable versioning

Versioning helps protect against accidental deletion by keeping multiple variants of an object in the same bucket.

```
aws s3api put-bucket-versioning \
    --bucket "$BUCKET_NAME" \
    --versioning-configuration Status=Enabled
```

With versioning enabled, uploading a file with the same key creates a new version instead of overwriting the original. Let's upload a second version of the sample file:

```
echo "Hello, Amazon S3! This is version 2 of the sample file." > sample.txt

aws s3api put-object \
    --bucket "$BUCKET_NAME" \
    --key "sample.txt" \
    --body "sample.txt"
```

## Configure default encryption

Default encryption ensures that all objects stored in the bucket are encrypted at rest using server-side encryption with Amazon S3 managed keys (SSE-S3):

```
aws s3api put-bucket-encryption \
    --bucket "$BUCKET_NAME" \
    --server-side-encryption-configuration '{
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "AES256"
                },
                "BucketKeyEnabled": true
            }
        ]
    }'
```
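
Inline JSON like this is easy to mistype. One option (a sketch, assuming `python3` is available) is to keep the rules in a file, lint it locally, and pass the file to the CLI with the `file://` prefix:

```
# Keep the encryption rules in a file so they can be linted and reused
cat > encryption.json <<'EOF'
{
  "Rules": [
    {
      "ApplyServerSideEncryptionByDefault": {
        "SSEAlgorithm": "AES256"
      },
      "BucketKeyEnabled": true
    }
  ]
}
EOF

# Fails with a parse error if the JSON is malformed
python3 -m json.tool encryption.json > /dev/null && echo "valid JSON"

# The file can then replace the inline string:
# aws s3api put-bucket-encryption --bucket "$BUCKET_NAME" \
#     --server-side-encryption-configuration file://encryption.json
```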

## Block public access

Blocking public access is a security best practice that prevents objects in your bucket from being made public:

```
aws s3api put-public-access-block \
    --bucket "$BUCKET_NAME" \
    --public-access-block-configuration '{
        "BlockPublicAcls": true,
        "IgnorePublicAcls": true,
        "BlockPublicPolicy": true,
        "RestrictPublicBuckets": true
    }'
```

## Add tags to your bucket

Tags help you categorize your AWS resources for cost allocation, access control, and organization:

```
aws s3api put-bucket-tagging \
    --bucket "$BUCKET_NAME" \
    --tagging '{
        "TagSet": [
            {
                "Key": "Environment",
                "Value": "Tutorial"
            },
            {
                "Key": "Project",
                "Value": "S3-GettingStarted"
            }
        ]
    }'
```

Verify the tags were applied:

```
aws s3api get-bucket-tagging \
    --bucket "$BUCKET_NAME"
```

```
{
    "TagSet": [
        {
            "Key": "Environment",
            "Value": "Tutorial"
        },
        {
            "Key": "Project",
            "Value": "S3-GettingStarted"
        }
    ]
}
```
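
To extract a single tag value from a response like the one above, you can filter it yourself or use the CLI's built-in JMESPath `--query` option. A standalone sketch against a local copy of the response:

```
# Local copy of the get-bucket-tagging response shown above
cat > tags.json <<'EOF'
{
  "TagSet": [
    { "Key": "Environment", "Value": "Tutorial" },
    { "Key": "Project", "Value": "S3-GettingStarted" }
  ]
}
EOF

# Select the value of the Environment tag
python3 -c '
import json
tags = json.load(open("tags.json"))["TagSet"]
print(next(t["Value"] for t in tags if t["Key"] == "Environment"))
'

# The same filter inline against the live bucket (JMESPath):
# aws s3api get-bucket-tagging --bucket "$BUCKET_NAME" \
#     --query "TagSet[?Key=='Environment'].Value" --output text
```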

## List objects and versions

List all objects in the bucket to see your folder structure:

```
aws s3api list-objects-v2 \
    --bucket "$BUCKET_NAME"
```

Since versioning is enabled, you can also list all versions of objects in the bucket. This shows both the current and previous versions of `sample.txt`:

```
aws s3api list-object-versions \
    --bucket "$BUCKET_NAME"
```

## Clean up resources

For buckets with versioning enabled, you need to delete all object versions before you can delete the bucket.