Stackdriver Logging makes it easy to export Admin Activity logs to BigQuery, Cloud Storage, or Cloud Pub/Sub.
Cloud Pub/Sub is typically used to export logs as messages to an external system such as Splunk. Configuring this can be done using the GCP Console.
The data flow of that architecture looks like the following:
Cloud IAM changes (such as adding a User Account or Service Account)
|
v
Stackdriver Logging
|
v
Stackdriver Logging Export (with an Advanced Filter)
|
v
Cloud Pub/Sub Topic 1 (Name: iam-logs-sink)
|
v
Cloud Pub/Sub Topic 1 Subscription (Delivery Type: Pull)
^
|
An external system (such as Splunk) pulls from
Cloud Pub/Sub Topic 1 Subscription using a
Service Account to authenticate
A single message contains a single log entry, and each log entry contains a lot of information. If you only want to export the information you need, a Cloud Function can transform each log entry and push the transformed result to a second Cloud Pub/Sub Topic (an example of a transformed log appears after the diagram below).
The data flow of this more advanced architecture - which you will set up in this post - looks like the following:
Cloud IAM changes (such as adding a User Account or Service Account)
|
v
Stackdriver Logging
|
v
Stackdriver Logging Export (with an Advanced Filter)
|
v
Cloud Pub/Sub Topic 1 (Name: iam-logs-sink)
|
v
Cloud Function (Uses Python 3.7 to do the transformations)
|
v
Cloud Pub/Sub Topic 2 (Name: from-gcf-iam-log-transform)
|
v
Cloud Pub/Sub Topic 2 Subscription (Delivery Type: Pull)
^
|
An external system (such as Splunk) pulls from
Cloud Pub/Sub Topic 2 Subscription using a
Service Account to authenticate
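After the transformation, a single message on the second Topic might look something like this (illustrative values only; the fields are whatever the log's authenticationInfo and policyDelta sections contain):
{"principalEmail": "admin@example.com", "bindingDeltas": [{"action": "ADD", "role": "roles/viewer", "member": "user:test@example.com"}]}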
Prerequisites
This post assumes the following:
- You are familiar with Google Cloud Platform (GCP)
- You have an existing GCP Project to create GCP resources in, or you will create a new one
Create a GCP Project
If you have not yet created a GCP Project, follow these steps:
- Open a web browser, and create or log in to a Google Account
- Navigate to the GCP Console
- If this is your first GCP Project, you will be prompted to create a GCP Project. Each Google Account gets $300 in credit to use toward GCP within 12 months. You are required to enter a credit card to create a GCP Project, but it will not be charged until the $300 credit is consumed or the 12 months expire, whichever comes first.
Enable GCP Services
If you just created a new GCP Project, you will need to enable the APIs for the GCP services you will be using in this post.
In the GCP Console, go to Cloud Functions and enable the Cloud Functions API.
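If you prefer the command line, the same API can be enabled from Cloud Shell (an equivalent sketch of the Console step above):
gcloud services enable cloudfunctions.googleapis.com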
Use Cloud Shell
The entirety of this post will be done using Cloud Shell: GCP’s built-in terminal environment.
Cloud Shell allows you to operate GCP using command-line tools from any computer with a modern web browser. Every Google Account gets its own Cloud Shell instance, which is simply an f1-micro Google Compute Engine (GCE) virtual machine with 5 GB of Persistent Disk attached to your home directory. Cloud Shell instances are terminated after 20 minutes of inactivity, but anything you store in your home directory will persist across Cloud Shell instances.
Open Cloud Shell by going to the GCP Console and clicking the terminal prompt icon (which looks like this: [>_]) in the upper-right part of the web browser window.
Create Environment Variables
Start by creating some environment variables:
PROJECT_ID="$(gcloud config list project --format 'value(core.project)')"
FIRST_TOPIC_NAME="iam-logs-sink"
SECOND_TOPIC_NAME="from-gcf-iam-log-transform"
SECOND_SUBSCRIPTION_NAME="external-system-pull"
Create Cloud Pub/Sub Topics and Subscriptions
Create the first Cloud Pub/Sub Topic. This is the Topic the Stackdriver Logging Export will stream logs to:
gcloud pubsub topics create $FIRST_TOPIC_NAME
You do not need to create a Cloud Pub/Sub Subscription attached to the first Cloud Pub/Sub Topic because deploying the Cloud Function with a Pub/Sub trigger will create one for you.
Create the second Cloud Pub/Sub Topic. This is the Topic the Cloud Function will publish a transformed log entry to:
gcloud pubsub topics create $SECOND_TOPIC_NAME
Create the Cloud Pub/Sub Subscription that will be attached to the second Cloud Pub/Sub Topic:
gcloud pubsub subscriptions create $SECOND_SUBSCRIPTION_NAME --topic $SECOND_TOPIC_NAME
Create the Stackdriver Logging Export
Create the Stackdriver Logging Export with an Advanced Filter that only matches Admin Activity audit logs whose method is SetIamPolicy - that is, Cloud IAM changes:
gcloud logging sinks create iam-logs-sink \
pubsub.googleapis.com/projects/$PROJECT_ID/topics/iam-logs-sink \
--log-filter='resource.type="project" logName="projects/'"$PROJECT_ID"'/logs/cloudaudit.googleapis.com%2Factivity" protoPayload.methodName="SetIamPolicy"'
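If you want to sanity-check the Advanced Filter, the same filter can be passed to gcloud logging read (an optional check, not required for the setup; it will only return results once matching IAM changes exist):
gcloud logging read 'resource.type="project" logName="projects/'"$PROJECT_ID"'/logs/cloudaudit.googleapis.com%2Factivity" protoPayload.methodName="SetIamPolicy"' --limit=5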
Once the Stackdriver Logging Export is created, a Service Account (the sink's writer identity) is created in a GCP Project managed by Google. Grant that Service Account permission to publish messages to the first Cloud Pub/Sub Topic:
WRITER_IDENTITY="$(gcloud logging sinks describe iam-logs-sink --format 'value(writerIdentity)')"
gcloud beta pubsub topics add-iam-policy-binding $FIRST_TOPIC_NAME \
--member=$WRITER_IDENTITY \
--role='roles/pubsub.publisher'
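To confirm the binding took effect, you can inspect the Topic's IAM policy (an optional check):
gcloud beta pubsub topics get-iam-policy $FIRST_TOPIC_NAME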
Deploy the Cloud Function
Finally, it is time to deploy the Cloud Function.
Create directory gcf-process-iam-log and change into it:
mkdir gcf-process-iam-log
cd gcf-process-iam-log
Create file requirements.txt with the following contents:
google-cloud-pubsub
Create file main.py with the following Python script:
import base64
import json
import os

from google.cloud import pubsub


def processIAMLog(event, context):
    # Read the destination Project ID and Topic name from environment
    # variables set at deploy time.
    PROJECT_ID = os.environ.get('PROJECT_ID', False)
    TOPIC_NAME = os.environ.get('TOPIC_NAME', False)

    # Pub/Sub delivers the log entry as a base64-encoded JSON string.
    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    data = json.loads(pubsub_message)

    # Keep only who made the change (authenticationInfo) and what the
    # change was (policyDelta).
    authenticationInfo = data["protoPayload"]["authenticationInfo"]
    policyDelta = data["protoPayload"]["serviceData"]["policyDelta"]

    # Merge the two dictionaries into a single, smaller log entry.
    dataCombined = dict(list(authenticationInfo.items()) + list(policyDelta.items()))
    dataFinal = json.dumps(dataCombined).encode()

    # Publish the transformed log entry to the second Cloud Pub/Sub Topic.
    publisher = pubsub.PublisherClient()
    topic_name = 'projects/{project_id}/topics/{topic_name}'.format(
        project_id=PROJECT_ID,
        topic_name=TOPIC_NAME,
    )
    publisher.publish(topic_name, dataFinal)
Change back to the parent directory, then deploy the Cloud Function:
cd ..
gcloud beta functions deploy iam-log-transform \
--region=us-central1 \
--memory=128MB \
--runtime=python37 \
--trigger-topic=$FIRST_TOPIC_NAME \
--source=gcf-process-iam-log \
--entry-point=processIAMLog \
--set-env-vars PROJECT_ID=$PROJECT_ID,TOPIC_NAME=$SECOND_TOPIC_NAME
The Cloud Function runs as the default App Engine Service Account (PROJECT_ID@appspot.gserviceaccount.com), which has the Editor role on the project by default, so it can publish messages to the second Cloud Pub/Sub Topic without further configuration.
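If you want to exercise the Cloud Function before making real IAM changes, you can publish a synthetic log entry to the first Topic (a sketch with made-up values; real logs contain many more fields):
gcloud pubsub topics publish $FIRST_TOPIC_NAME \
--message='{"protoPayload": {"authenticationInfo": {"principalEmail": "admin@example.com"}, "serviceData": {"policyDelta": {"bindingDeltas": [{"action": "ADD", "role": "roles/viewer", "member": "user:test@example.com"}]}}}}'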
Test
Once the Cloud Function is deployed, navigate to Cloud IAM in the GCP Console, and perform an action that generates an IAM log, such as granting the Viewer role to a new User Account or Service Account.
It might take a few seconds, but verify the data flow is working properly with the following command:
gcloud pubsub subscriptions pull projects/$PROJECT_ID/subscriptions/$SECOND_SUBSCRIPTION_NAME
The second Cloud Pub/Sub Subscription is the last component in the data flow. If the command returns a single row with a DATA column that contains the JSON output of the transformed log, then everything has been set up properly.
Pull From an External System
Configuring whatever external system you are using is out of scope for this post, but, at a minimum, it will need to use a Service Account private key to authenticate against the Cloud Pub/Sub API so it can pull messages from the second Cloud Pub/Sub Subscription.
It is very important to configure the Service Account so that it can only pull messages from the second Cloud Pub/Sub Subscription and cannot do anything else in GCP.
Start by creating an environment variable:
SUBSCRIPTION_SERVICE_ACCOUNT_NAME="external-system-pull"
Create the Service Account:
gcloud iam service-accounts create $SUBSCRIPTION_SERVICE_ACCOUNT_NAME
Grant that Service Account permission to pull messages from the second Cloud Pub/Sub Subscription:
gcloud beta pubsub subscriptions add-iam-policy-binding $SECOND_SUBSCRIPTION_NAME \
--member=serviceAccount:$SUBSCRIPTION_SERVICE_ACCOUNT_NAME@$PROJECT_ID.iam.gserviceaccount.com \
--role='roles/pubsub.subscriber'
To obtain the Service Account's private key, navigate to the Service accounts section of the GCP Console, find the row for your Service Account, click the vertical three-dot menu in the Actions column, and click Create key. Choose the JSON key type.
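Configuring the external system itself varies by product, but you can verify the key works from any machine with the Cloud SDK installed (a sketch; key.json stands in for the key file you just downloaded, and you will need to substitute your Project ID if $PROJECT_ID is not set):
gcloud auth activate-service-account --key-file=key.json
gcloud pubsub subscriptions pull projects/$PROJECT_ID/subscriptions/$SECOND_SUBSCRIPTION_NAME --auto-ack
If you run this in Cloud Shell, remember to switch back to your own account afterwards with gcloud config set account.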