Head in the Clouds: Amazon Web Services

The Amazon Web Services Edition

Christopher Maddalena

--

Introduction

This article is a companion piece for this primer:

IP Address Ranges

Amazon provides the simplest option for fetching their IP addresses by offering a webpage with easily digested JSON:

https://ip-ranges.amazonaws.com/ip-ranges.json

That is all there is to it. The format makes it simple to write a script that makes a single web request and parses the returned JSON.
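As a minimal sketch of that parsing step, the function below pulls the CIDR blocks out of the JSON, optionally filtered by service. A trimmed, made-up sample of the file's structure stands in for the live document; in practice the raw JSON would come from a single web request (e.g. with urllib.request) to the URL above.

```python
import json

# A trimmed, illustrative sample of the structure of ip-ranges.json
# (the specific prefixes and dates here are made up).
sample = """
{
  "syncToken": "1589917992",
  "createDate": "2020-05-19-19-53-12",
  "prefixes": [
    {"ip_prefix": "52.94.76.0/22", "region": "us-west-2", "service": "AMAZON"},
    {"ip_prefix": "52.219.0.0/20", "region": "us-west-1", "service": "S3"}
  ]
}
"""

def extract_prefixes(raw_json, service=None):
    """Return the IPv4 CIDR blocks, optionally filtered by service name."""
    data = json.loads(raw_json)
    return [p["ip_prefix"] for p in data["prefixes"]
            if service is None or p["service"] == service]

all_ranges = extract_prefixes(sample)       # every prefix in the file
s3_ranges = extract_prefixes(sample, "S3")  # only the S3 ranges
```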

Making Use of the IP Addresses

An up-to-date list of these IP addresses is useful for identifying assets hosted in an AWS environment. Any domain or subdomain that points back to an IP address in the list is likely an AWS-hosted asset, such as an S3 bucket or an EC2 host.

Maintaining an Updated Master List

The collection of these IP addresses has been automated in the following script:

The script fetches the latest IP address ranges used by each provider and then outputs one combined list in a CloudIPs.txt file. Each range is on a new line following a header naming the service, e.g. “# Amazon Web Services IPs.”
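The output format described above can be sketched as follows. The function name and sample ranges here are illustrative, not taken from the script itself:

```python
def format_provider_block(provider_name, cidr_ranges):
    """Render one provider's ranges under a comment header,
    matching the '# Amazon Web Services IPs' style described above."""
    lines = ["# {} IPs".format(provider_name)]
    lines.extend(cidr_ranges)
    return "\n".join(lines)

# The full script would build one block per provider and write the
# concatenated result to CloudIPs.txt.
block = format_provider_block("Amazon Web Services",
                              ["52.94.76.0/22", "52.219.0.0/20"])
```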

Storage: Amazon Simple Storage Service Buckets

The Amazon Simple Storage Service (S3) bucket is the quintessential bucket. Buckets can be addressed with either of these URL schemes for web requests:

https://s3-us-west-1.amazonaws.com/cmaddy/

https://cmaddy.s3.amazonaws.com/

Both of the above options work and point to the same “cmaddy” bucket, which makes searching for AWS buckets a relatively simple affair. A basic approach is brute force: request candidate bucket names and check the XML returned by AWS.
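The response-checking half of that brute-force approach can be sketched without any network code. The status/XML pairs below reflect S3's behavior: a 404 with a NoSuchBucket error means the name is unclaimed, a 403 AccessDenied means the bucket exists but is private, and a 200 ListBucketResult means the bucket exists and is listable. The function name is illustrative:

```python
def classify_bucket(status_code, xml_body):
    """Interpret an S3 HTTP response during brute-force bucket enumeration."""
    if status_code == 404 and "NoSuchBucket" in xml_body:
        return "unclaimed"
    if status_code == 403 and "AccessDenied" in xml_body:
        return "exists (private)"
    if status_code == 200 and "ListBucketResult" in xml_body:
        return "exists (listable)"
    return "unknown"
```

A real scanner would issue a request to https://CANDIDATE.s3.amazonaws.com/ for each candidate name and feed the status code and body into a check like this.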

The ip-ranges.json file, discussed above, is actually hosted in a bucket. Visiting https://ip-ranges.amazonaws.com/ returns an XML “AccessDenied” response. However, ip-ranges.amazonaws.com is not the bucket’s name; the subdomain points to an Amazon CloudFront distribution sitting in front of the bucket.

Bucket Names

AWS allows users to name their buckets using any DNS-compliant name. That includes names like “cmaddy.test” and “cmaddy-test.”

The DNS-compliant names are important to remember when dealing with a bucket being used for web hosting. It is possible to host pieces of a website (e.g. images or scripts) or an entire website in an S3 bucket. If a domain or subdomain resolves to an IP address in the AWS ranges, and that IP in turn resolves to a hostname like s3-website-us-west-2.amazonaws.com, then the files are hosted in a bucket.
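That hostname check can be sketched with a small pattern match. The regex is an assumption based on the endpoint form shown above; S3 website endpoints appear as both s3-website-REGION.amazonaws.com and s3-website.REGION.amazonaws.com, so both separators are accepted:

```python
import re

# Matches S3 static-website endpoints such as
# s3-website-us-west-2.amazonaws.com or s3-website.eu-central-1.amazonaws.com
S3_WEBSITE_RE = re.compile(r"s3-website[.-][a-z0-9-]+\.amazonaws\.com$")

def is_s3_website_host(hostname):
    """Return True if a resolved hostname looks like an S3 website endpoint."""
    return bool(S3_WEBSITE_RE.search(hostname.lower().rstrip(".")))
```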

A bucket owner can set up a CNAME record to alias the bucket name to match their website. Amazon requires the bucket name to match the alias, so an alias for images.whatever.com requires the bucket name images.whatever.com and the bucket web address images.whatever.com.s3.amazonaws.com. The CNAME can point directly to that bucket address or to s3.amazonaws.com, though Amazon warns that pointing a CNAME at s3.amazonaws.com will cause additional redirects.
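Because the bucket name must equal the alias, the bucket's web address can be derived mechanically from any CNAME alias found during reconnaissance. A one-line sketch (the function name is illustrative):

```python
def bucket_endpoint_for_alias(alias):
    """Given a CNAME alias like images.whatever.com, return the matching
    S3 bucket endpoint; Amazon requires the bucket name to equal the alias."""
    return "{}.s3.amazonaws.com".format(alias)

endpoint = bucket_endpoint_for_alias("images.whatever.com")
```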

Further details on S3 bucket web hosting can be found here in the S3 documentation:

Virtual Servers: Elastic Compute Cloud

AWS’ virtual machine offering is called Elastic Compute Cloud, or EC2. Amazon’s documentation for the EC2 metadata service can be found here:

The metadata service is covered in the main primer article.

Authentication: Identity and Access Management

AWS’s user account control is called Identity and Access Management, or IAM. A root account can create new users and grant them programmatic access and/or web console access. If programmatic access is selected, IAM access keys are generated for the user and these can be used with scripts and command line tools, e.g. awscli.

If web console access is selected, a password is set for the user and the user can use a username and password to log into the AWS management console in a web browser.

Programmatic access is arguably the more interesting of the two for the purposes of this article. These keys may get accidentally committed to git repositories, can be found in S3 buckets, or can be pulled from EC2 hosts via the metadata service or files on disk. If the keys are stolen, anyone can use them to access the account in the context of the key’s owner.

A Note on Privileges

Even if the keys have very few privileges on the account, they can be used to enumerate a lot of information. For example, while the keys may not allow access to the contents of S3 buckets, AWS does not make it possible to restrict which buckets a user can list. In other words, even low-privileged keys could be stolen and then used to pull a list of all of the account’s S3 buckets and other AWS resources.

Credential Files

When the awscli command line tool is set up, the aws configure command must be run and the user needs to provide access keys. This creates a “credentials” file to store the keys. The file stores the keys for the default awscli profile and any additional profiles configured by the user. These keys are stored in plaintext and can be easily copied and reused, so this file should be considered sensitive. The file is stored in the user’s home directory:

Linux and macOS: ~/.aws/credentials

Windows: C:\Users\USERNAME\.aws\credentials
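The credentials file uses INI syntax, so Python's standard configparser module can read it directly. A sketch of parsing the profiles out of a file with this layout (the profile names and key values below are made up):

```python
import configparser

# A sample credentials file in the INI format awscli writes;
# every value here is fabricated for illustration.
sample = """
[default]
aws_access_key_id = AKIAEXAMPLEEXAMPLE
aws_secret_access_key = abcd1234examplesecret

[secondary]
aws_access_key_id = AKIAEXAMPLE2
aws_secret_access_key = wxyz9876examplesecret
"""

parser = configparser.ConfigParser()
parser.read_string(sample)          # for a real file: parser.read(path)

profiles = parser.sections()        # one section per awscli profile
default_key = parser["default"]["aws_access_key_id"]
```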

Stealing Credentials

If access keys are found, the simplest way to make use of them is via awscli profiles. A new profile can be set up using this command:

aws configure --profile PROFILE_NAME

The command will prompt for the Access Key ID and AWS Secret Access Key strings. The profile can be tested by running any awscli command with --profile PROFILE_NAME added to the command.

In some cases, a session token may also be needed. If the token is available, it can be easily added to the profile by editing the ~/.aws/credentials file and adding a line for aws_session_token.

A sample profile:

[SomeStolenCreds]
aws_access_key_id = ACCESS_KEY_STRING
aws_secret_access_key = SECRET_KEY_STRING
aws_session_token = SESSION_TOKEN_THAT_LOOKS_LIKE_BASE64
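Because the file is plain INI, a profile like the one above can also be generated programmatically with configparser rather than edited by hand. A sketch, reusing the placeholder values from the sample profile:

```python
import configparser
import io

parser = configparser.ConfigParser()
parser["SomeStolenCreds"] = {
    "aws_access_key_id": "ACCESS_KEY_STRING",
    "aws_secret_access_key": "SECRET_KEY_STRING",
    "aws_session_token": "SESSION_TOKEN_THAT_LOOKS_LIKE_BASE64",
}

# Render the profile; in practice this would be appended to
# ~/.aws/credentials instead of a string buffer.
buf = io.StringIO()
parser.write(buf)
rendered = buf.getvalue()
```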

Automating Credential Theft

As a proof of concept, this tool was created to search for and collect credential files associated with AWS, Google Compute, and Azure:

SharpCloud is a simple C# console application that checks for the credential and config files associated with each cloud provider. If found, the contents of each file are dumped for collection and potential reuse.

Command Line: AWSCLI

AWS uses Python for command line access. The official awscli toolkit is written in Python, and the boto and boto3 Python libraries are available for scripting.

The awscli tool can be installed on any operating system with pip (pip install awscli). On macOS, brew install awscli can be used as an alternative.

Amazon has a command line reference available here:

Command Structure

The command structure for awscli is:

aws <command> <subcommand> [options and parameters]

Useful Commands by Service

View the gist on GitHub for easy bookmarking and copy/pasting

Useful Documentation

A review of the high-level S3 commands is provided here:
