JavaScript Mutators & The Programmable Observability Pipeline

JavaScript mutators

JavaScript mutators are one of the standout improvements in Sensu Go 6.5: they are both more effective and more efficient at transforming Sensu event data than pipe mutators. This post explores the advantages of JavaScript mutators and walks through an example, but first, a brief review.

In the Sensu observability pipeline, checks generate events, which Sensu then filters, transforms, and processes. A mutator is a component that transforms the event data. For example, a mutator can transform a Sensu event into a different JSON structure that you can send to a data platform’s API via a Sensu handler.

Traditionally, all Sensu mutators have been pipe mutators. A pipe mutator specifies an executable command, usually provided by a Sensu plugin, that runs on a Sensu backend. When the backend processes an event, the pipe mutator runs as an external process, with the event data passed via STDIN. The pipe mutator then transforms the event and returns the output to the backend via STDOUT.
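
For reference, a pipe mutator definition looks something like the sketch below. The transform-events.rb command is a hypothetical plugin executable; any program that reads an event from STDIN and writes the transformed event to STDOUT would work the same way:

---
type: Mutator
api_version: core/v2
metadata:
  name: example_pipe_mutator
spec:
  command: transform-events.rb
  timeout: 10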

Sensu’s JavaScript mutators are an efficient, flexible alternative to pipe mutators that remove barriers as you scale up your observability efforts.

Performance and flexibility as you scale

Pipe mutators fork a process on each invocation to run their commands, which makes them expensive in terms of system resources. JavaScript mutators, on the other hand, let you write your transformation as an ECMAScript 5 expression instead of an external command. These expressions run within the Sensu backend’s process, where the embedded Otto JavaScript VM evaluates them as JavaScript programs. As a result, JavaScript mutators offer substantial benefits in performance and scaling: they require fewer system resources and provide greater throughput, which is especially useful in high-speed, scaled-up environments.
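
As a quick illustration, here is a minimal sketch of a JavaScript mutator that does its work inline, with no external executable involved (the region label it adds is purely an example):

---
type: Mutator
api_version: core/v2
metadata:
  name: example_js_mutator
spec:
  eval: >-
    data = JSON.parse(JSON.stringify(event));
    data.check.metadata.labels = data.check.metadata.labels || {};
    data.check.metadata.labels['region'] = 'us-west-1';
    return JSON.stringify(data)
  type: javascript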

The processing bottleneck inherent in pipe mutators makes them poorly suited for metrics processing pipelines. Collecting and processing metrics often requires the mutator to be invoked for every event, and the overhead of launching that many processes adds up quickly in larger systems with many nodes, each producing hundreds of different metrics every second. JavaScript mutators eliminate this overhead by relying on the Sensu backend’s embedded JavaScript runtime, which makes them an especially good choice for metrics pipelines.

JavaScript mutators also streamline mutator authoring and maintenance. Pipe mutators require separate executables (usually plugins), which must be maintained separately from the rest of your Sensu resource configurations. A JavaScript mutator only requires a valid JavaScript expression, which can be included directly in the mutator’s YAML definition. If you’re using Sensu’s monitoring as code approach to manage your observability configuration, JavaScript mutators give you one less resource to worry about because there’s no need to write or maintain the external plugins that traditionally would provide the command executables for pipe mutators.

JavaScript mutators in action

Now that we’ve covered the benefits of JavaScript mutators, let’s take a look at how you might use them in your own Sensu environment.

You can use any valid ECMAScript 5 expression in your JavaScript mutators to directly mutate the events that are passed to the mutator.

NOTE: JavaScript mutators do not require a return value, but if you do return a value, it must be a string. Non-string return values will result in a Sensu backend error!

In this example, we’ll use a JavaScript mutator to add a new attribute to an event based on the event.check.status value.

To demonstrate how the mutator works, you’ll also need a handler, a pipeline that includes your mutator and handler, and a check to produce the events. We’ll walk through the entire workflow below.

Create the JavaScript mutator

First, configure the JavaScript mutator. The behavior we want is for the system to add an attribute to all events, with the attribute value based on the event status. To do that, we’ll create a mutator that adds a datastore attribute to the event, which is derived from the event.check.status value:

  • If the event status is OK (0), add the datastore attribute with the DB_archive value.
  • If the event status is non-OK [warning (1), critical (2), or any other non-zero status], add the datastore attribute with the DB_exceptions value.

This JavaScript mutator definition implements the behavior we want:

---
type: Mutator
api_version: core/v2
metadata:
  name: assign_datastore
spec:
  eval: >-
    data = JSON.parse(JSON.stringify(event));
    if (data.check.status == 0) {
      data['datastore'] = 'DB_archive';
    } else {
      data['datastore'] = 'DB_exceptions';
    };
    return JSON.stringify(data)
  type: javascript
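
If you manage your configuration with sensuctl, you can save this definition to a file (the filename below is arbitrary) and apply it; the same command works for the handler, pipeline, and check definitions later in this post:

sensuctl create --file assign_datastore.yml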

Create a handler

Now you’ll need a handler to process mutated events. In a real-life setting, the handler might look at the new datastore attribute to select the appropriate datastore folder for long-term event storage. For this example, however, all we want to do is look at the processed events to understand how the mutator transformed their data, so we’ll create a debug handler that simply writes the most recent event to a JSON file.

Add the following handler definition:

---
type: Handler
api_version: core/v2
metadata:
  name: debug
spec:
  type: pipe
  command: cat > /var/log/sensu/debug-event.json
  timeout: 2

Configure a pipeline

Configure a pipeline that includes your new assign_datastore JavaScript mutator and the debug handler in a single workflow. Add this pipeline definition:

---
type: Pipeline
api_version: core/v2
metadata:
  name: jsmutator_test
spec:
  workflows:
  - name: datastores
    mutator:
      name: assign_datastore
      type: Mutator
      api_version: core/v2
    handler:
      name: debug
      type: Handler
      api_version: core/v2
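
Once the pipeline is created, you can confirm that Sensu registered it. Assuming your sensuctl version includes a pipeline subcommand, you can run:

sensuctl pipeline info jsmutator_test --format yaml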

Reference a pipeline in a check

To start sending events to your new pipeline, you’ll need a check to supply the events.

If you already have a check you want to use, update your existing check definition with the following stanza, which will add the jsmutator_test pipeline reference:

pipelines:
  - api_version: core/v2
    name: jsmutator_test
    type: Pipeline

If you do not have a check, follow these steps to add one:

  1. Add the Sensu CPU usage check dynamic runtime asset so it is available for your check to use:
sensuctl asset add sensu/check-cpu-usage:0.2.2 -r check-cpu-usage
  2. Add the system subscription to an entity:
sensuctl entity update <entity_name>
  • For Entity Class: press enter.
  • For Subscriptions: type system and press enter.
  3. Create a check definition that uses the check-cpu-usage dynamic runtime asset and references your pipeline:
type: CheckConfig
api_version: core/v2
metadata:
  name: check_cpu
spec:
  command: check-cpu-usage -w 1 -c 2
  interval: 15
  pipelines:
  - api_version: core/v2
    name: jsmutator_test
    type: Pipeline
  publish: true
  round_robin: false
  runtime_assets:
  - check-cpu-usage
  subscriptions:
  - system

NOTE: This check is based on the check_cpu check in Monitor server resources with checks.
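
As with the other resources in this post, you can apply the check with sensuctl (the filename is arbitrary) and confirm that it was created:

sensuctl create --file check_cpu.yml
sensuctl check info check_cpu --format yaml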

Explore the results

As soon as your check starts sending events, you can print the contents of the file your debug handler creates to explore the output of your mutator. To do that, run the following command:

cat /var/log/sensu/debug-event.json

If the most recent check execution produced an event with an OK status, the event data should include datastore: DB_archive. If the most recent event had a warning or critical status, the event data should include datastore: DB_exceptions instead.
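
For example, an abbreviated, illustrative excerpt of debug-event.json for an OK check execution might look something like this (real events contain many more fields than shown here):

{
  "check": {
    "metadata": {
      "name": "check_cpu"
    },
    "status": 0
  },
  "datastore": "DB_archive"
}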

Next steps

JavaScript mutators accept any valid ECMAScript 5 expression, so they offer a vast range of options for transforming event data. For example, you can remove unneeded event data or add a new event attribute that combines the values of several existing attributes into a single string. Read the mutators reference for JavaScript mutator examples that remove, add, and combine event attributes.
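
To give you a flavor, here is a hedged sketch of a mutator that removes an attribute and adds a combined one. The names (trim_and_combine, source_summary) and the attributes chosen are illustrative rather than taken from the reference:

---
type: Mutator
api_version: core/v2
metadata:
  name: trim_and_combine
spec:
  eval: >-
    data = JSON.parse(JSON.stringify(event));
    delete data.check.output;
    data['source_summary'] = data.entity.metadata.name + '/' + data.check.metadata.name;
    return JSON.stringify(data)
  type: javascript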

In regular usage, you probably won’t send all of your events to a debug file. Instead, you could send your mutated events to a service like Sumo Logic or InfluxDB for storage, analysis, and visualization. Read Sensu Plus to transmit your Sensu observability data to Sumo Logic or Populate metrics in InfluxDB with handlers to send data to InfluxDB.

Make sure to join our Discourse community, where you can share with and learn from other Sensu users and get updates on the latest Sensu product releases.