Google Cloud Platform (GCP) plugin

Visit our website to see the data that you can access if you use this plugin to add the data source to SquaredUp:

Google Cloud Platform

Monitor your GCP environment, including GKE, Hosts and more.

How to add a Google Cloud Platform data source

Prerequisites

In the GCP console, select your project and check that the following are enabled:

To view cost information in SquaredUp using the Cost data stream, check the following:

GCP Service Account Configuration

  1. Create a new Service Account, or edit an existing account.

    See GCP - Creating and managing service accounts

  2. Ensure the account has the role Viewer by adding the role Basic > Viewer.

To view cost information, the account also needs the BigQuery Data Viewer role.

    See GCP - Grant a single role

3. Create a new key for the Service Account using the key type JSON, and download the JSON file. You will need to copy values from this file when adding the data source in the next section.

    See GCP - Creating service account keys

    Make sure to store the key file securely, because it can be used to authenticate as your service account.
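The downloaded key file is a JSON document along the following lines (all values here are placeholders; your file contains real identifiers and a full private key):

```json
{
  "type": "service_account",
  "project_id": "my-project-id",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "my-service-account@my-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

The client_email, private_key, and project_id values are the ones you will copy into the data source form in the next section.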

Add a Google Cloud Platform (GCP) data source to SquaredUp

1. To add a data source, click the + next to Data Sources in the left-hand menu in SquaredUp. Search for the data source and click it to open the Configure data source page.

  2. Open the JSON file that you downloaded when creating the key.

3. Copy and paste the client_email from the JSON file into the data source form.

  4. Copy and paste the private_key from the JSON file into the data source form (everything between the quotes).

5. Copy and paste the project_id from the JSON file into the data source form.

6. Optionally, to use the Cost data stream, copy and paste in the billingProjectId, billingDataSetName, and billingTableName.

  7. Optionally, select whether you would like to restrict access to this data source instance. By default, restricted access is set to off.

  8. Click Test and add to validate the data source configuration.

    You can also add a data source from Settings > Data Sources > Add data source, but sample dashboards are not added when using this method.
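As an illustration, the three optional billing fields in step 6 might look like the following (all values hypothetical; for a standard billing export, the table name is gcp_billing_export_v1_ followed by your billing account ID):

```
billingProjectId:    my-billing-project
billingDataSetName:  billing_export
billingTableName:    gcp_billing_export_v1_01A2B3_C4D5E6_F7G8H9
```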

Using the Google Cloud Platform (GCP) data streams

For the data streams installed with the data source, see Tips for using the GCP Cost data stream.

To write a custom data stream (advanced use), see Writing a custom data stream (advanced users).

Google Cloud Platform - BigQuery

The GCP data source allows you to set up configurable data streams for any Google BigQuery query.

  1. In the tile editor, filter by the GCP data source, select BigQuery from the data stream list and then click Next.

  2. Query:

    Enter a BigQuery query.

  3. Project ID:

Specify the project that you would like to run the BigQuery query against.

  4. Click Save.
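As an example, a hypothetical query against a billing export table might total cost per service. The backtick-quoted placeholders stand for your own project, dataset, and table names, as in the custom data stream example later in this article:

```sql
SELECT
  service.description AS service,
  SUM(cost) AS total_cost
FROM `<project_id>.<data_set_name>.<table_name>`
GROUP BY service
ORDER BY total_cost DESC
LIMIT 10
```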

Google Cloud Platform - MQL Query

The GCP data source allows you to set up configurable data streams for any Monitoring Query Language (MQL) query. Use the + MQL Query data stream to enter your own MQL queries into SquaredUp. You can query any object in your Google Cloud environment, even objects that are not indexed in the SquaredUp graph. See Google Cloud: Using Monitoring Query Language for more information about MQL.

  1. In the tile editor, filter by the GCP data source, select MQL Query from the data stream list and then click Next.

    You can either select the scoped MQL Query data stream or the global MQL Query data stream.

  2. If you selected the scoped MQL Query data stream, select the objects you want to use and then click Next.

    You do not need to do this if you have selected the global MQL Query data stream.

  3. Project:

    Select the project that you would like to run the MQL against from the dropdown.

    You will not need to select a project if you have already selected one in the scope.

  4. MQL Query:

Enter an MQL query.

    Mustache parameters are only supported if you have scoped to a GCP object and selected the + MQL Query. Mustache parameters are not supported if you have scoped to a project or the data source instance itself.

See MQL query examples below for more information about queries.

    Supports mustache parameters supplied in an array

    A mustache parameter is a dynamic value, the actual value will be inserted to replace the field in curly braces. For example, {{timeframe.start}} will insert the start time based on the timeframe configured within the tile, or {{name}} will insert the name of the object(s) in scope.

    This data stream supplies scoped objects in an array for mustache parameters. When there are multiple objects in scope this data source will send the query once with all the objects in an array.

    When the scoped objects are supplied in an array the normal mustache syntax, for example {{name}}, must be contained between {{#.}} and {{/.}} (the full-stop indicates that the whole object should be used, in this case the array of objects in scope).

    For example, a query where clause might look like:

| where ComputerName in (
  {{#.}}
  '{{name}}',
  {{/.}}
  ''
)
    • The {{#.}} and {{/.}} indicate that what is contained within is expanded for each element in the array of objects.

• You can use properties of objects as mustache parameters by writing them between curly braces, e.g. {{name}}. For example, if objects of type "host" have a property called name, {{name}} resolves to the value of the name property for each "host" object in the scope.

    • '{{name}}', means that the name property is expanded inside single-quotes with a trailing comma.

    • The trailing single quotes '' are necessary to stop the query being rejected because a trailing comma is disallowed.

    • Whenever you use mustache parameters, you need to use a scope of objects that contain the property you're referencing.
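To illustrate the expansion, if the scope contains two hypothetical hosts named web-01 and web-02, the where clause above resolves to:

```
| where ComputerName in ( 'web-01', 'web-02', '' )
```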

  5. Automatically apply dashboard timeframe:

    By default, Automatically apply dashboard timeframe will be selected.

    The dashboard timeframe is the current timeframe setting for a dashboard. Users can change the dashboard timeframe to see data for a different time span, for example, instead of showing data from "the last 12 hours" it can be changed to show data from "the last 7 days".

    Tiles can be configured to:

    • Use dashboard timeframe (default). For these tiles the data shown will change when the user changes the dashboard timeframe.

    • Use a fixed timeframe from the options available. These tiles show a clock icon and hovering shows the fixed timeframe configured. The data will not change when the dashboard timeframe is changed.

    Tip: Indicate with the name of a tile if the tile's timeframe can be changed. For example, naming a tile "Performance during the last week" tells users that this tile always shows data for the last week. Naming a tile just "Performance" indicates to users that changing the dashboard timeframe will change the data.

    If you do not wish to use a dashboard timeframe, you can set the timeframe in the MQL query by adding within to the query. If you deselect Automatically apply dashboard timeframe and do not specify a timeframe in the MQL query, the query will fail. To add a timeframe to a query, simply type | within (timeframe) at the end of your query. For example, to add a timeframe of 24 hours you enter: | within (24h)

  6. Click Save.

MQL query examples:

MQL query not using dashboard timeframe - CPU usage time in the last 20 minutes in the selected project:

fetch gce_instance | metric 'compute.googleapis.com/instance/cpu/usage_time' | within (20m)

MQL query not using dashboard timeframe - authentication events count in the last 24 hours:

fetch iam_service_account
| metric 'iam.googleapis.com/service_account/authn_events_count'
| align rate(1m)
| every 1m | within (24h)

MQL query using dashboard timeframe - uploaded bytes in BigQuery data set:

fetch bigquery_dataset
| metric 'bigquery.googleapis.com/storage/uploaded_bytes'
| align rate(1m)
| every 1m

MQL query scoped to Hosts and using mustache syntax - CPU usage time using dashboard timeframe:

fetch gce_instance
| metric 'compute.googleapis.com/instance/cpu/usage_time'
| filter (metadata.system_labels.name == '{{#.}}{{name}}{{/.}}')
| align rate(1m)
| every 1m

Monitor Metric

  1. In the tile editor, filter by the GCP data source, select Monitor Metric from the data stream list and then click Next.

  2. Select the objects that you want to use and then click Next.

    If you are selecting multiple objects, we suggest you only scope to objects of the same type. For example, the data will probably not be displayed in a helpful way if you were to select both Host objects and SQL objects in the same scope.

  3. Metric Name:

    Select the desired metric from the dropdown. The dropdown options will automatically update depending on the object that you have scoped to. For example, scope to a Host and select CPU usage from the Metric Name dropdown.

  4. Click Save.

Workflow

Just like other services, Workflow is also supported by the GCP data source. You can monitor Workflow executions, execution times, and finished execution counts.

To set up a data stream for Workflow:

  1. In the tile editor, filter by the GCP data source or select Workflow from the scope, then select your desired metrics (Workflow executions, Execution times, Finished execution count) from the list of data streams, and then click Next.

  2. Select the Workflow you wish to scope to, then click Next.

  3. Optionally, select a Timeframe or data shaping.

  4. Click Save.

Build

The GCP data source also supports monitoring Builds from the Build Trigger service.

To set up a data stream for Build:

  1. In the tile editor, filter by the GCP data source or select Build Trigger from the scope, then select Builds from the list of data streams, and then click Next.

  2. Select the Build Trigger you wish to scope to, then click Next.

  3. Optionally, select a Timeframe or data shaping.

  4. Click Save.

PubSub

For the PubSub service, you can monitor Publish requests, Publish message size, Oldest unacked message age, and Unacked messages.

To set up a data stream for PubSub:

  1. In the tile editor, filter by the GCP data source or select Topic or Subscription from the scope (depending on what you want to monitor), then select your desired metrics from the list of data streams, and then click Next.

  2. Select the Topic or Subscription you wish to scope to, then click Next.

  3. Optionally, select a Timeframe or data shaping.

  4. Click Save.

Firestore

The GCP data source also supports the Firestore Database service. You can set up configurable data streams for Structured query and Document.

To set up a data stream for Firestore:

  1. In the tile editor, filter by the GCP data source or select Database from the scope, then select Structured query or Document from the list of data streams, and then click Next.

  2. Select the Database you wish to scope to, then click Next.

  3. Optionally, select a Timeframe or data shaping.

  4. Click Save.
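For the Structured query data stream, the query follows the shape of Firestore's REST structuredQuery. A minimal sketch, assuming a hypothetical collection named orders (check the data stream's query field for the exact format it expects):

```json
{
  "from": [{ "collectionId": "orders" }],
  "limit": 10
}
```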

 

Writing a custom data stream (advanced users)

Add one or more custom data streams in SquaredUp.

  1. Go to Settings > Advanced > Data Streams.

  2. Click Add new Data Stream.

  3. Enter a display name for your Data Stream.

    Note: The display name is the name that you use to identify your Data Stream in SquaredUp. It has no technical impact and doesn't need to be referenced in the Data Stream's code.

  4. Choose the Data Source this Data Stream is for.

After you've chosen the data source, a new field, Entry Point, appears.

  5. Entry point and code:

    To find out which entry point to select and get code examples for the Code field, see the help below.

  6. Click Save to save your Data Stream.

Create generic scoped custom data stream for BigQuery

Which entry point do I have to select from the dropdown?

BigQuery (Scoped)

 

Code example:

{
  "name": "exampleScopedBigQuery",
  "dataSourceConfig": {
    "query": "SELECT project.name, service.description FROM `<project_id>.<data_set_name>.<table_name>` WHERE project.id in ('{{sourceId}}') GROUP BY 1,2 ORDER BY 1,2;"
  },
  "rowPath": [],
  "matches": {
    "sourceType.0": {
      "type": "oneOf",
      "values": [
        "GCP Project"
      ]
    }
  },
      "metadata": [
        { "name": "name", "displayName": "Project","shape": "string", "role": "label" },
        { "name": "description", "displayName": "Service", "shape": "string" },
      ]
}

Tips for using the GCP Cost data stream

The Cost data stream is part of the GCP data source and uses the GCP BigQuery API. The GCP data source needs some additional configuration to enable the Cost data stream; see How to add a Google Cloud Platform data source.

  • The GCP Cost data stream shows daily data.

• By default, the Cost data stream splits costs by service.

• We recommend that scopes used with the GCP Cost data stream contain only one type of object, for example, only accounts, projects, or hosts. Mixed scopes may include duplicated data in the results, because a query is sent for each type of object in the scope and the results are combined. For example, the resulting data would show the cost of a host as well as the cost of a project, even if the host is part of that project. It is useful to create a scope for your account(s), a scope for your project(s), and a scope for your hosts.

Use Cases for the GCP Cost data stream

