Check If a Key Exists in a Bucket in S3 Using Boto3

check if a key exists in a bucket in s3 using boto3

Boto 2's boto.s3.key.Key object used to have an exists method that checked whether the key existed on S3 by doing a HEAD request and looking at the result, but that method no longer exists in Boto3. You have to do it yourself:

import boto3
import botocore

s3 = boto3.resource('s3')

try:
    s3.Object('my-bucket', 'dootdoot.jpg').load()
except botocore.exceptions.ClientError as e:
    if e.response['Error']['Code'] == "404":
        # The object does not exist.
        ...
    else:
        # Something else has gone wrong.
        raise
else:
    # The object does exist.
    ...

load() does a HEAD request for a single key, which is fast, even if the object in question is large or you have many objects in your bucket.

Of course, you might be checking if the object exists because you're planning on using it. If that is the case, you can just forget about the load() and do a get() or download_file() directly, then handle the error case there.

how to check if a particular directory exists in S3 bucket using python and boto3

Please try the following code.

Get subdirectory info folder

import boto3

s3client = boto3.client('s3')
result = s3client.list_objects_v2(Bucket='my-bucket', Delimiter='/')
for folder in result.get('CommonPrefixes', []):
    print(folder['Prefix'])

(The original snippet used Boto 2's bucket.list("", "/"); in Boto3 the top-level "folders" come back as CommonPrefixes from list_objects_v2 when you pass a Delimiter.)

P.S. Reference: "How to use python script to copy files from one bucket to another bucket at the Amazon S3 with boto".

How to check if an object exists or filename matches in s3 bucket?

If you know the exact key

import boto3

client = boto3.client('s3')
response = client.head_object(
    Bucket='examplebucket',
    Key='HappyFace.jpg',
)

If no such object matches the key, head_object raises botocore.exceptions.ClientError with a 404 status code. (Note that S3.Client.exceptions.NoSuchKey is raised by get_object, not head_object.) The object data is not transferred to your machine, if that's your concern; a HEAD request returns only the metadata.

How can I get the key of a downloaded S3 object using boto3 in Python?

The response of get_object(...) does not return the key ("filename") in the response object.

It returns the below properties, none of which is the key.

Unfortunately, you'll have to pass the key/filename that you used to get the object in the first place to any other function that needs it.

{
    'Body': StreamingBody(),
    'DeleteMarker': True|False,
    'AcceptRanges': 'string',
    'Expiration': 'string',
    'Restore': 'string',
    'LastModified': datetime(2015, 1, 1),
    'ContentLength': 123,
    'ETag': 'string',
    'MissingMeta': 123,
    'VersionId': 'string',
    'CacheControl': 'string',
    'ContentDisposition': 'string',
    'ContentEncoding': 'string',
    'ContentLanguage': 'string',
    'ContentRange': 'string',
    'ContentType': 'string',
    'Expires': datetime(2015, 1, 1),
    'WebsiteRedirectLocation': 'string',
    'ServerSideEncryption': 'AES256'|'aws:kms',
    'Metadata': {
        'string': 'string'
    },
    'SSECustomerAlgorithm': 'string',
    'SSECustomerKeyMD5': 'string',
    'SSEKMSKeyId': 'string',
    'BucketKeyEnabled': True|False,
    'StorageClass': 'STANDARD'|'REDUCED_REDUNDANCY'|'STANDARD_IA'|'ONEZONE_IA'|'INTELLIGENT_TIERING'|'GLACIER'|'DEEP_ARCHIVE'|'OUTPOSTS',
    'RequestCharged': 'requester',
    'ReplicationStatus': 'COMPLETE'|'PENDING'|'FAILED'|'REPLICA',
    'PartsCount': 123,
    'TagCount': 123,
    'ObjectLockMode': 'GOVERNANCE'|'COMPLIANCE',
    'ObjectLockRetainUntilDate': datetime(2015, 1, 1),
    'ObjectLockLegalHoldStatus': 'ON'|'OFF'
}

AWS S3 check if file exists based on a conditional path

I ended up using the following, which gave slightly cleaner code:

import boto3

s3client = boto3.client('s3')

def all_file_exist(bucket, prefix, fileN):
    fileFound = False
    fileConditionFound = False
    theObjs = s3client.list_objects_v2(Bucket=bucket, Prefix=prefix)
    for obj in theObjs.get('Contents', []):  # 'Contents' is absent when nothing matches
        if obj['Key'].endswith(fileN + '_condition.jpg'):
            fileConditionFound = True
        if obj['Key'].endswith(fileN + '.jpg'):
            fileFound = True
        if fileFound and fileConditionFound:
            return True
    return False

all_file_exist("bucket", "folder1", "test")


