How to perform Multi-Cloud tasks in AWS Lambda with Hashicorp Vault

Written by Carl Li

Introduction

Imagine you are given the task of developing a script that queries the three major cloud providers (AWS, Azure and GCP). The first thing that comes to mind is to reach out to the administrator of each cloud platform team and ask for a set of credentials, so you can authenticate and make the required API calls to each platform. You decide to use an AWS Lambda function (next to no infrastructure, cost-effective and easy to implement). The next step is to store those credentials safely, and preferably retrieve them at runtime so they stay secure; for AWS Lambda, the default go-to secret store is AWS Secrets Manager. But is that the best solution?

You search the internet and most of the results reach a similar conclusion: storing static cloud credentials in AWS Secrets Manager poses several risks, because the credentials are:

  • Often not rotated regularly

  • Prone to mismanagement or lax access controls

  • A larger attack surface for attackers to target

For the above use case, the AWS API calls can be covered by the Lambda execution role, which grants only the Lambda function access to the required AWS services and resources. But what about the Azure and GCP credentials? Is there a way to retrieve them securely, preferably as dynamic, short-lived credentials, without storing them in AWS Secrets Manager?

In this post, we’ll use HashiCorp Vault, an identity-based secrets and encryption management system, to have a Lambda function dynamically retrieve credentials for all three public cloud platforms and query each platform’s spending for the last calendar month.

HashiCorp Vault offers several key advantages, making it an essential tool for secure secret management. It provides centralised control over sensitive data like API keys, passwords, and certificates, reducing the risk of leaks and unauthorised access. Vault dynamically generates secrets on demand, ensuring that credentials are short-lived and automatically revoked when no longer needed, improving security and minimising exposure. Its fine-grained access control allows organisations to define precise permissions for who can access specific secrets. Vault's ability to encrypt data both at rest and in transit adds an additional layer of protection, while its audit logging feature provides a detailed trail of access and activity, aiding in compliance and security monitoring.

What does the architecture look like?

The development/proof of concept architecture comprises:

  • A HashiCorp Vault instance in a public AWS VPC; this simplifies local development on developers’ PCs.

  • AWS Lambda functions with an execution role that Vault can authenticate; in this example the role is arn:aws:iam::054xxxx399:role/VaultLambdaRole (referenced in the Vault Configurations section below)

The recommended production architecture should comprise the following:

  • A HashiCorp Vault cluster in a private AWS subnet; this restricts access to within your organisation

  • AWS Lambda functions running in the same VPC as the Vault cluster, in a private subnet with egress internet access to reach the GCP and Azure endpoints.

Below is an example architecture of a production HashiCorp Vault cluster.

Vault Configurations:

To get started with a development HashiCorp Vault cluster, the recommendation is to use a new HashiCorp Cloud Platform (HCP) account, which gives you a $50 credit on sign-up. The Create a Vault Cluster on HCP guide should get a small development Vault cluster up and running quickly.

Some essential configuration is required on HashiCorp Vault, as outlined below: setting up the AWS authentication method, then configuring the secrets engines for AWS, Google Cloud Platform (GCP) and Microsoft Azure. This enables authenticated AWS role(s) to generate dynamic cloud credentials across these major providers.

AWS Auth Method:

The first step is to configure the AWS auth method on Vault so the Lambda function can authenticate to Vault with its execution role. The AWS auth method in HashiCorp Vault allows Vault to authenticate AWS identities, in this case the Lambda execution role, and grant them access to secrets. When an AWS entity attempts to authenticate, Vault verifies its identity by checking the associated IAM role and principal ID. Once authenticated, the entity is assigned specific Vault policies that determine which secrets it can access.
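
For reference, a role like the one read back below can be created by enabling the auth method and binding it to the Lambda execution role. A minimal sketch (the ARN and the multi-cloud policy come from this example; adjust them to your environment):

➜ vault auth enable aws

➜ vault write auth/aws/role/VaultLambdaRole \
    auth_type=iam \
    bound_iam_principal_arn=arn:aws:iam::054xxxx399:role/VaultLambdaRole \
    policies=multi-cloud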

➜ vault read auth/aws/role/VaultLambdaRole
Key                               Value
---                               -----
allow_instance_migration          false
auth_type                         iam
bound_ec2_instance_id             <nil>
bound_iam_principal_arn           [arn:aws:iam::054xxxx399:role/VaultLambdaRole]
bound_iam_principal_id            [AROAQZOVxxxxxxxT5DTLO]
disallow_reauthentication         false
policies                          [multi-cloud]
resolve_aws_unique_ids            true
role_id                           d4f73c17-xxxx-d60e-c93c-3e7f476b9a9c
token_explicit_max_ttl            0s
token_max_ttl                     0s
token_no_default_policy           false
token_num_uses                    0
token_period                      0s
token_policies                    [multi-cloud]
token_ttl                         0s
token_type                        default

You will then need to enable and configure the respective secrets engine for each cloud provider. Secrets engines are enabled at a path in Vault and dynamically generate secrets for that provider. When a request comes into Vault, the router automatically routes anything with that path prefix to the corresponding secrets engine. It is recommended to adopt a naming convention that separates and identifies AWS accounts, GCP projects and Azure subscriptions.

In the example below, we enable the AWS, Azure and GCP secrets engines for the sandpit1 account of each cloud provider.

➜ vault secrets enable -path=aws/sandpit1 aws
Success! Enabled the aws secrets engine at: aws/sandpit1/

➜ vault secrets enable -path=gcp/sandpit1 gcp
Success! Enabled the gcp secrets engine at: gcp/sandpit1/

➜ vault secrets enable -path=azure/sandpit1 azure
Success! Enabled the azure secrets engine at: azure/sandpit1/
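
Optionally, the default and maximum lease TTLs can be tuned per mount so the generated credentials stay short-lived. A sketch with illustrative values:

➜ vault secrets tune -default-lease-ttl=10m -max-lease-ttl=1h aws/sandpit1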

AWS Secret Engine:

➜ vault read aws/sandpit1/creds/lambda_role
Key                Value
---                -----
lease_id           aws/sandpit1/creds/lambda_role/RXRMfFEciWbOrDRZAPXgNlyR
lease_duration     10m
lease_renewable    false
access_key         ASIAQXXXXXZH3IOBQHF6
secret_key         QD2GKqPhuvXXXxxxiKSHsvcwygtWTfsT5igJoWz6
session_token      FwoGZXIvYXdzEIP//////////wEaDDowkVA7fEUe4EJYFyKBAfrZt2f+grJfS60ZSfPZfx2OoL9GmrAVDLLWQW4H3hCInG++gIwkP7fnjmXJ+iiC2oHIc52rw3Dgh4O1Ii6fiUZ3s195v5VkUt5Xvvz/...
ttl                59m59s
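
The lambda_role credentials above are produced by a Vault role defined in the engine beforehand, together with privileged root credentials for the engine itself. A minimal sketch of that configuration, noting that the root access key, secret key and the CostExplorerReadRole ARN are placeholders (credential_type could also be federation_token depending on your setup):

➜ vault write aws/sandpit1/config/root \
    access_key=<ROOT_ACCESS_KEY> \
    secret_key=<ROOT_SECRET_KEY> \
    region=ap-southeast-2

➜ vault write aws/sandpit1/roles/lambda_role \
    credential_type=assumed_role \
    role_arns=arn:aws:iam::054xxxx399:role/CostExplorerReadRole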

Azure Secret Engine:

➜ vault read azure/sandpit1/creds/lambda_role
Key                Value
---                -----
lease_id           azure/sandpit1/creds/lambda_role/3hcsH4bemnGjrm4BEjVSF2e0
lease_duration     10m
lease_renewable    true
client_id          9576b0f2-19eb-XXXX-b136-a62aad9d6946
client_secret      Qkr8Q~O03cF5lh49XXXXd1sVvpvLc_2yWwDfbuP
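
Behind the scenes, the Azure engine needs its own root configuration (a service principal allowed to create app registrations and role assignments) and a Vault role. A sketch of that setup; the Cost Management Reader assignment is an assumption made for this cost-query use case:

➜ vault write azure/sandpit1/config \
    subscription_id=$AZURE_SUBSCRIPTION_ID \
    tenant_id=$AZURE_TENANT_ID \
    client_id=$AZURE_CLIENT_ID \
    client_secret=$AZURE_CLIENT_SECRET

➜ vault write azure/sandpit1/roles/lambda_role ttl=1h azure_roles=-<<EOF
[
  {
    "role_name": "Cost Management Reader",
    "scope": "/subscriptions/<SUBSCRIPTION_ID>"
  }
]
EOF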

GCP Secret Engine:

➜ vault read gcp/sandpit1/roleset/lambda_role/key
Key                 Value
---                 -----
lease_id            gcp/sandpit1/roleset/lambda_role/key/afErxz8S67H3MfAAYVoqvJUH
lease_duration      5m
lease_renewable     true
key_algorithm       KEY_ALG_RSA_2048
key_type            TYPE_GOOGLE_CREDENTIALS_FILE
private_key_data    ewogICJ0eXBlIjogInNlcnZpY2VfYWNjb3VudCIsCiAgInByb2plY3RfaWQiOiAid2VpZ2h0eS13b3Jrcy00MzAwMjItdTgiLAogICJwcml2YXRlX2tleV9pZCI6ICI1NzY0ZmMzZjViMDM2ZGFmNDc4MjkyODM2YzkxOTQwZmJlYmNjYjFlIiwKICAicHJpdmF0ZV9rZXkiOiAiLS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tXG5NSUlFdmdJQkFEQU5CZ2txaGtpRzl3MEJBUUVGQUFTQ0JLZ3dnZ1NrQWdFQUFvSUJBUUNYSlc2WVZXamhTNG4vXG4wMWtCOFVad...
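
Similarly, the GCP engine is configured with a privileged service account key and a roleset that defines which IAM roles the generated keys receive. A sketch, where the project ID and the roles listed are assumptions chosen to match the BigQuery billing query later in this post:

➜ vault write gcp/sandpit1/config credentials=@/path/to/privileged-sa.json

➜ vault write gcp/sandpit1/roleset/lambda_role \
    project="<PROJECT_ID>" \
    secret_type="service_account_key" \
    bindings=-<<EOF
resource "//cloudresourcemanager.googleapis.com/projects/<PROJECT_ID>" {
  roles = ["roles/bigquery.jobUser", "roles/bigquery.dataViewer", "roles/browser"]
}
EOF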

How the Lambda function authenticates to Vault and performs multi-cloud tasks

There is a wide range of programming libraries that can be used to consume the Vault API more conveniently. The Go libraries are officially maintained, while others are provided by the community.

In this example, I have written an AWS Lambda function in Python that authenticates to Vault, gathers credentials for each public cloud provider, queries last month’s spending for the AWS account, GCP project and Azure subscription, and then sends out an email summary of the queried spend via SNS.

The example code is available at https://github.com/carl-sl-li/lambda-no-secret. Please note the code is intended for demonstration only and is not fit for production usage.

Environment variables to pass to the Lambda function (an example of setting them follows this list):

  • SNS_ARN: used to send an email summary of last month's spending

  • REGION: AWS region the function executes in (e.g. ap-southeast-2)

  • VAULTAWSPATHS: Vault path for AWS credentials (e.g. aws/sandpit1/roles/lambda_role)

  • VAULTGCPPATHS: Vault path for GCP credentials (e.g. gcp/sandpit1/roleset/lambda_role)

  • GCP_BILL_TABLE: GCP BigQuery billing data table ID

    (e.g. <DATASET_ID>.gcp_billing_export_v1_<BILLING_ACCOUNT_ID>)

  • VAULTAZUREPATHS: Vault path for Azure credentials (e.g. azure/sandpit1/roles/lambda_role)

  • VAULTURL: endpoint URL of Vault (e.g. http://vault.example.com.au:8200)
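
These can be supplied as Lambda environment variables, for example with the AWS CLI (the function name multi-cloud-bill and the SNS topic ARN are placeholders):

➜ aws lambda update-function-configuration --function-name multi-cloud-bill \
    --environment "Variables={VAULTURL=http://vault.example.com.au:8200,REGION=ap-southeast-2,SNS_ARN=<SNS_TOPIC_ARN>,VAULTAWSPATHS=aws/sandpit1/roles/lambda_role,VAULTGCPPATHS=gcp/sandpit1/roleset/lambda_role,VAULTAZUREPATHS=azure/sandpit1/roles/lambda_role,GCP_BILL_TABLE=<DATASET_ID>.gcp_billing_export_v1_<BILLING_ACCOUNT_ID>}"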

Solution breakdown

  1. Authenticate to Vault using the HashiCorp Vault API client for Python 3.x (hvac)

AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN are environment variables holding the temporary credentials of the function’s execution role, in this case the VaultLambdaRole role we configured in the previous section.

import os
import hvac

# Define Vault python client
vault_client = hvac.Client(url=os.environ['VAULTURL'])

# Authenticate to HashiCorp Vault with the execution role's temporary credentials
def auth_to_vault():
    vault_client.auth.aws.iam_login(os.environ['AWS_ACCESS_KEY_ID'],
                                    os.environ['AWS_SECRET_ACCESS_KEY'],
                                    os.environ['AWS_SESSION_TOKEN'])

Once authenticated, a Vault token is issued to the Lambda function, which it uses to read credentials in the steps below.

  2. Generate AWS credentials and query last month’s AWS account spending

# AWS Functions
def get_aws_creds(path:str):
    response = vault_client.secrets.aws.generate_credentials(
        name=path.split('/')[-1],
        mount_point=path.rsplit('/', 2)[0],
        ttl=900
    )
    data = response['data']
    return(data)

def aws_last_mth_bill(start:str, end:str, vaultcreds:dict):
    client = boto3.client(
        'ce',
        aws_access_key_id=vaultcreds['access_key'],
        aws_secret_access_key=vaultcreds['secret_key'],
        aws_session_token=vaultcreds['session_token']
    )
    expression = { "Dimensions": { "Key": "RECORD_TYPE", "Values": [ "Usage" ] } }
    try:
        response = client.get_cost_and_usage(
            TimePeriod={
                'Start': start,
                'End': end
            },
            Metrics=[
                'BlendedCost',
            ],
            Granularity='MONTHLY',
            Filter=expression
        )
        amount=response['ResultsByTime'][0]['Total']['BlendedCost']['Amount']
        return(round(Decimal(amount), 2))
    except Exception as e:
        print(e)
        raise e

  3. Generate GCP credentials and query last month’s GCP project spending

# GCP Functions
def get_gcp_creds(path:str):
    response = vault_client.secrets.gcp.generate_service_account_key(
        roleset=path.split('/')[-1],
        mount_point=path.rsplit('/', 2)[0]
    )
    data = response['data']['private_key_data']
    credfile = json.loads(base64.b64decode(data))
    return(credfile)

def gcp_last_mth_bill(start:str, end:str, vaultcreds:dict):
    # Set up credentials and build the service
    credentials = service_account.Credentials.from_service_account_info(
        vaultcreds,
        scopes=['https://www.googleapis.com/auth/cloud-platform']
    )

    # Get Project ID
    cloudresourcemanager = build('cloudresourcemanager', 'v1', credentials=credentials)
    project_data = cloudresourcemanager.projects().list().execute()
    project_id = project_data['projects'][0]['projectId']

    gcp_table = os.environ['GCP_BILL_TABLE']

    # Initialize the BigQuery client with the credentials
    client = bigquery.Client(credentials=credentials, project=project_id)

    # Construct the query
    query = f"""
    SELECT
        SUM(cost) AS total_cost,
    FROM
        `{gcp_table}`
    WHERE
    usage_start_time >= TIMESTAMP('{start}')
    AND usage_start_time < TIMESTAMP('{end}')
    """

    # Execute the query
    query_job = client.query(query)
    results = query_job.result()

    # Process and display the results
    for row in results:
        amount = row.total_cost
    return(round(Decimal(amount), 2))

  4. Read the Azure subscription ID, generate Azure credentials and query last month’s Azure subscription spending

# Azure Functions
def read_azure_config(path):
    response = vault_client.secrets.azure.read_config(
        mount_point=path.rsplit('/', 2)[0]
    )
    data = response
    return(data)

def get_azure_creds(path):
    response = vault_client.secrets.azure.generate_credentials(
        name=path.split('/')[-1],
        mount_point=path.rsplit('/', 2)[0]
    )
    data = response
    return(data)

def azure_last_mth_bill(start:datetime, end:datetime, vaultcreds, config):

    subscription_id = config['subscription_id']
    # Authenticate using ClientSecretCredential
    credential = ClientSecretCredential(
        client_id=vaultcreds['client_id'],
        client_secret=vaultcreds['client_secret'],
        tenant_id=config['tenant_id']
    )
    
    # Create a CostManagementClient
    client = CostManagementClient(credential)  

    # Create the query
    query = {
        "type": "Usage",
        "timeframe": "Custom",
        "timePeriod": {
            "from": start.strftime('%Y-%m-%dT%H:%M:%SZ'),
            "to": end.strftime('%Y-%m-%dT%H:%M:%SZ')
        },
        "dataset": {
            "granularity": "None",
            "aggregation": {
                "totalCost": {
                    "name": "Cost",
                    "function": "Sum"
                }
            }
        }
    }

    # Execute the query
    result = client.query.usage(
        scope=f"/subscriptions/{subscription_id}",
        parameters=query
    )

    # Print the total cost
    if result and result.rows:
        total_cost = result.rows[0][0]
        return(round(Decimal(total_cost), 2))
    else:
        return(round(Decimal(0), 2))

  5. A function to send the summary via SNS

def send_sns(message, subject, topic, vaultcreds):
    client = boto3.client(
        "sns",
        aws_access_key_id=vaultcreds['access_key'],
        aws_secret_access_key=vaultcreds['secret_key'],
        aws_session_token=vaultcreds['session_token']        
    )
    client.publish(TopicArn=topic, Message=message, Subject=subject)

  6. Putting it all together

end_time = datetime.now(timezone.utc).replace(day=1)
start_time = (end_time - timedelta(days=1)).replace(day=1)
startdate = start_time.strftime('%Y-%m')+'-01'
enddate = end_time.strftime('%Y-%m')+'-01'

def lambda_handler(event, context):
    print("Received event: " + json.dumps(event, indent=2))
    # Authenticate to Vault
    auth_to_vault()
    # Process AWS Bill
    aws_creds=get_aws_creds(os.environ['VAULTAWSPATHS'])
    aws_bill=aws_last_mth_bill(startdate, enddate, aws_creds)
    # Process GCP Bill
    gcp_creds=get_gcp_creds(os.environ['VAULTGCPPATHS'])
    gcp_bill=gcp_last_mth_bill(startdate, enddate, gcp_creds)
    # Process Azure Bill
    azure_config = read_azure_config(os.environ['VAULTAZUREPATHS'])
    azure_cred = get_azure_creds(os.environ['VAULTAZUREPATHS'])
    azure_bill = azure_last_mth_bill(start_time, end_time, azure_cred, azure_config)

    # Prepare and send SNS subject and message
    subject = 'Last Month Cloud Bills'
    message= (
        f"AWS Bill for last month is ${aws_bill}\n"
        + f"GCP Bill for last month is: ${gcp_bill}\n"
        + f"Azure Bill for last month is ${azure_bill}\n"
    )
    send_sns(message, subject, os.environ["SNS_ARN"], aws_creds)
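
A quick way to exercise the end-to-end flow is to invoke the deployed function manually, for example (the function name multi-cloud-bill is a placeholder):

➜ aws lambda invoke --function-name multi-cloud-bill response.json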

Running the Lambda function will send you an email like the one below, with last month’s spend for each cloud provider.

Security and Operational Considerations for Hashicorp Vault

HashiCorp Vault is better known for dynamically generating database credentials; this blog covers its less-utilised cloud secrets engines. One common theme across Vault’s dynamic secrets engines (whether for databases or clouds) is that privileged access keys and credentials are required for their root configuration in order to generate dynamic credentials (essentially creating short-lived secrets on demand). With this requirement in mind, make sure it is communicated clearly to your organisation’s cyber security team. Once these root credentials/keys are configured in Vault, they should be rotated immediately to avoid potential breaches. The same applies to Vault’s root token, which should be securely stored or, better still, revoked after the initial base configuration. When deploying HashiCorp Vault in an organisation, it is also crucial to establish clear policies and communication channels between teams to avoid misconfigurations and unauthorised access. Collaboration between development, operations and security teams is essential to ensure that Vault is properly configured and maintained. Additionally, audit logging should be enabled to monitor access and changes to the Vault configuration.
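
As a concrete example of the rotation mentioned above, the AWS secrets engine exposes a rotate-root endpoint, and the initial root token can be revoked once the base configuration is complete (the other engines offer equivalent rotation mechanisms where supported):

➜ vault write -f aws/sandpit1/config/rotate-root

➜ vault token revoke <initial-root-token>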

HashiCorp Vault in a production environment should be treated as critical business infrastructure, so regular reviews, failover tests and updates of the Vault setup are necessary to satisfy cyber/security requirements and business contingency plans.

Closing

In conclusion, using HashiCorp Vault to dynamically retrieve credentials for AWS, Azure, and GCP enhances the security of your cloud operations while simplifying the management of multi-cloud environments and providing a better developer experience. By avoiding the storage of static credentials and leveraging Vault’s capabilities for secure access and secrets management, you can reduce the risks associated with hardcoded or improperly managed secrets. As you implement this solution, remember to enforce strict access controls, regularly audit your Vault configuration, and promote collaboration between your security, development, and operations teams. This approach ensures that your cloud infrastructure remains both secure and efficient, allowing your organisation to confidently scale its operations across multiple cloud platforms.

If you’re looking for help with multi-cloud workloads or want to conduct a HashiCorp Vault assessment/feasibility program, reach out to us! We’ve helped multiple customers adopt a multi-cloud strategy and strengthen their security posture with great success. We look forward to helping you set up a scalable and cost-efficient cloud environment with our expertise.

06/24/2025