Using the AWS Cost and Usage Report in PowerBI to Analyze Costs: Part 1

Navigating the AWS Cost and Usage Report can be challenging or costly if you use the native AWS services. This is where the free PowerBI Desktop comes into play, allowing you to quickly and easily create visuals and analyze your AWS costs. In this section, I’ll guide you through setting up the AWS Cost & Usage Report and using it in PowerBI.


Step 1: Setup Cost and Usage Report in the AWS Billing Portal

Log into the AWS Console, navigate to Billing through the search bar, and select “Cost and Usage Reports” in the sidebar. If you’re already logged in, you can go straight to the Cost and Usage Reports page.

Now select the blue “Create report” button.

Fill in the report name and check “Include resource IDs.” This will increase the report’s size but is mandatory if you’re analyzing resource-specific costs. Remember to click Next!

On this page, you’ll be required to fill in the delivery options. Let’s first configure the S3 bucket: click Configure. You’ll be given two options: select an existing bucket or create a new one. For this tutorial, we will create a new bucket. Fill in your bucket name, select a region, then select Next. You’ll be asked to verify the policy, which gives the Cost & Usage Report service permission to save the report to the S3 bucket. Check “I have confirmed that this policy is correct” and click Save. Remember to disable public access on the S3 bucket that was created.
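For reference, the policy you’re confirming grants the AWS billing service write access to the bucket. It looks roughly like the following sketch (the bucket name is a placeholder, and your generated policy may include extra conditions):

```json
{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "billingreports.amazonaws.com" },
      "Action": ["s3:GetBucketAcl", "s3:GetBucketPolicy"],
      "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME"
    },
    {
      "Effect": "Allow",
      "Principal": { "Service": "billingreports.amazonaws.com" },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
    }
  ]
}
```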

Fill in the report path prefix; this can be anything as long as it doesn’t already exist. For the time granularity, we will select hourly in this tutorial. Keep in mind this increases the size (and therefore the cost) of the dataset, but I still highly recommend hourly for detailed analysis. For the compression type, it’s best to leave it at GZIP, as Parquet is not supported in PowerBI as of this writing. Click Next.

Review the settings, verify they’re correct, and click Review and Complete. The Cost and Usage Report is now set up. We will come back in about 24 hours to give AWS enough time to generate the first report.

Step 2: Downloading the report

Log back into the AWS console, go to S3, and select your S3 bucket. Remember the report path prefix you created in the previous step? Click on that folder, select your report name, and you’ll be greeted with multiple folders, each named with a date range. Click the one for the current month.

You’ll be shown a set of folders named with UUIDs and a JSON object named “REPORTNAME-Manifest.json”. The manifest tells you which folder is the newest one. Enter the latest folder and download the .csv.gz file inside. There will also be a JSON object there that describes each header name.
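If you’d rather script the download than click through the console, the manifest’s `reportKeys` field lists the S3 keys of the newest assembly’s files, so you never have to guess which UUID folder is current. A minimal Python sketch (the bucket/prefix names are placeholders, and the download helper assumes boto3 plus configured AWS credentials):

```python
import json


def latest_report_keys(manifest: dict) -> list:
    """Return the S3 keys of the newest report files listed in a
    Cost and Usage Report *-Manifest.json document."""
    # The "reportKeys" array points at the current assembly's .csv.gz files.
    return list(manifest.get("reportKeys", []))


def download_latest(bucket: str, manifest_key: str, dest: str = ".") -> list:
    """Fetch the manifest from S3 and download every file it lists.
    Requires boto3 and AWS credentials; not run in this demo."""
    import os
    import boto3  # assumed installed: pip install boto3

    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=manifest_key)["Body"]
    keys = latest_report_keys(json.load(body))
    for key in keys:
        s3.download_file(bucket, key, os.path.join(dest, os.path.basename(key)))
    return keys


# Illustrative manifest shape (abridged, placeholder names):
sample = {"reportKeys": ["myprefix/myreport/20210101-20210201/abcd1234/myreport-1.csv.gz"]}
print(latest_report_keys(sample))
```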

Step 3: Loading the Cost and Usage Report into PowerBI

You have two options with this report: you can extract the CSV and load it through the built-in CSV importer, or you can use a blank query that decompresses the GZIP directly in PowerBI. The latter will be very useful in the next tutorial, when we use Data Factory to automatically land the report in Azure Data Lake Storage and use the built-in connector in PowerBI.

The following PowerQuery will allow you to read the GZip file directly and import it without extracting the file.

    let
        Source = Binary.Decompress(File.Contents("GZFilePath"), Compression.GZip),
        #"Imported CSV" = Csv.Document(Source,[Delimiter=",", Columns=228, Encoding=1252, QuoteStyle=QuoteStyle.None]),
        #"Promoted Headers" = Table.PromoteHeaders(#"Imported CSV", [PromoteAllScalars=true])
    in
        #"Promoted Headers"
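If you want to sanity-check the .csv.gz outside PowerBI, the same decompress-then-parse step is easy to reproduce with the Python standard library. A small sketch (the file path handling and the demo column names are illustrative, not the full 200+ column report):

```python
import csv
import gzip
import io
from itertools import islice


def read_cur_preview(source, limit=5):
    """Decompress a Cost and Usage Report .csv.gz (path or file object)
    and return the header row plus the first few data rows."""
    with gzip.open(source, mode="rt", encoding="utf-8", newline="") as fh:
        reader = csv.reader(fh)
        headers = next(reader)          # first row holds the column names
        rows = list(islice(reader, limit))
    return headers, rows


# Tiny self-contained demo: build an in-memory gzip CSV and preview it.
raw = "lineItem/UsageStartDate,lineItem/UnblendedCost\n2021-01-01T00:00:00Z,0.42\n"
buf = io.BytesIO(gzip.compress(raw.encode("utf-8")))
headers, rows = read_cur_preview(buf)
print(headers)  # ['lineItem/UsageStartDate', 'lineItem/UnblendedCost']
```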

After you’re done with the query, select all columns, go to Transform, and pick Detect Data Type. This will convert the columns from Text to Date, Whole Number, Decimal Number, etc.

There will be over 100 columns of data here, including any cost allocation tags you have defined. So remember to ensure all the cost allocation tags you want are activated in the Billing console.

After you get familiar with the columns, you’re ready to click Close & Apply. Depending on the report’s size, the import should be quick, and you’re ready to start creating visuals.

Step 4: Creating Basic Visuals in PowerBI

So now that we have all the data imported, let’s get started building some visuals. Since there are many columns, I highly recommend using the built-in search field to quickly find the column you’re looking for.

For this example, we will compare the daily cost based on the Environment resource tag, using the Stacked Bar Chart under Visualizations. Then we will find “lineItem/UsageStartDate” in the fields and drag it onto the Axis under Visualizations. Now that we have the date on the X-axis, we need the costs on the Y-axis. There are quite a few options, such as “lineItem/UnblendedCost”, “lineItem/BlendedCost”, etc. These will give you different numbers depending on your organization, what you have in your environment, reservations, etc.

For what we are trying to accomplish, we will select the unblended cost. Next, search for your tag in the fields, then drag and drop the appropriate tag onto the Legend. Now we can clearly visualize how much each environment is costing us per day.
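Under the hood, the stacked bar chart is just summing unblended cost per day per environment. A plain-Python sketch of that aggregation, useful for spot-checking a chart value against the raw file (the tag column name `resourceTags/user:Environment` is an assumption; yours depends on which cost allocation tags you activated):

```python
import csv
import io
from collections import defaultdict


def daily_cost_by_environment(rows, tag_col="resourceTags/user:Environment"):
    """Sum lineItem/UnblendedCost per (day, environment) pair,
    mirroring what the stacked bar chart displays."""
    totals = defaultdict(float)
    for row in rows:
        day = row["lineItem/UsageStartDate"][:10]  # keep only the date part
        env = row.get(tag_col) or "untagged"       # group untagged spend together
        totals[(day, env)] += float(row["lineItem/UnblendedCost"])
    return dict(totals)


# Illustrative sample resembling CUR line items (not real billing data):
sample_csv = """lineItem/UsageStartDate,lineItem/UnblendedCost,resourceTags/user:Environment
2021-01-01T00:00:00Z,0.10,prod
2021-01-01T01:00:00Z,0.20,prod
2021-01-01T00:00:00Z,0.05,dev
"""
totals = daily_cost_by_environment(csv.DictReader(io.StringIO(sample_csv)))
print(totals)
```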

If you like this content, please share this post and let me know in the comment section if you want more!
