Overview

Writing a new connector is meant to be easy. You may write your connector in any language that has a Fluvio client library. Simply follow a few conventions and we will gladly add and maintain your connector in our official catalog.

To do so, you’ll need to follow these steps:

  1. Add a new connector to the repository
  2. Add common CLI arguments to your connector
  3. Add metadata to the connector
  4. Add integration tests
  5. Add custom build steps (if needed)
 

1. Adding a new connector to the repository

A new connector should work as a standalone program during development, but once merged into the fluvio-connectors repo it will be built into a Docker image. We recommend writing your connector in Rust and adding a new package to the workspace under the rust-connectors/sources or rust-connectors/sinks directory.
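For example, scaffolding a new Rust source connector could look roughly like this (the package name my-connector is only a placeholder):

$ cd rust-connectors/sources
$ cargo new my-connector --bin

If the workspace's top-level Cargo.toml lists its members explicitly, remember to add the new package there as well.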

 

2. Add commonly used arguments

Our Rust connectors share a set of common options that we use across all connectors.

The base requirement is a --fluvio-topic command-line argument.

We also encourage supporting RUST_LOG-style log filtering (or something similar) and SmartModules.
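As a sketch, assuming the clap derive API, the base option and RUST_LOG-based logging could look like this (the struct and field names here are illustrative, not the ones our connectors actually use):

use clap::Parser;

#[derive(Debug, Parser)]
struct ConnectorOpts {
    /// Exposed on the command line as --fluvio-topic.
    #[clap(long)]
    fluvio_topic: String,
}

fn main() {
    // Initialize logging from the RUST_LOG environment variable.
    env_logger::init();
    let opts = ConnectorOpts::parse();
    log::info!("starting connector for topic {}", opts.fluvio_topic);
}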

 

Using SmartModules

One of the cool features of Fluvio is that you can apply a SmartModule to a stream before sending it to the Fluvio cluster. To take advantage of this, your connector should accept the following command-line arguments:

  • --map
  • --filter
  • --arraymap

It’s recommended to take advantage of the common utilities in the fluvio-connectors-common crate.

The fluvio-connectors-common crate also has a produce function that looks for a SmartModule argument and creates a Fluvio producer that applies it.
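As a rough sketch, again assuming the clap derive API, the SmartModule flags could be declared like this (these declarations are illustrative, not the actual types from fluvio-connectors-common):

use clap::Parser;

#[derive(Debug, Parser)]
struct SmartModuleOpts {
    /// Name of a map-type SmartModule to apply to each record before producing.
    #[clap(long)]
    map: Option<String>,

    /// Name of a filter-type SmartModule to apply before producing.
    #[clap(long)]
    filter: Option<String>,

    /// Name of an array-map-type SmartModule to apply before producing.
    #[clap(long)]
    arraymap: Option<String>,
}

A connector would normally hand these options to the common produce helper rather than building the SmartModule-aware producer itself.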

 

3. Connector Metadata

A connector should have a metadata command that prints a JSON schema of its command-line arguments. This is the command we use to build our connector catalog and to validate the arguments passed to a connector.

The metadata subcommand should print something like the following to stdout:

$ test-connector metadata | jq
{
    "name": "test-connector",
    "direction": "Source",
    "schema": {
        "$schema": "http://json-schema.org/draft-07/schema#",
        "title": "TestConnectorOpts",
        "type": "object",
        "properties": {
            "count": {
                "description": "The maximum number of records to send.",
                "type": [
                    "integer",
                "null"
                ],
                "format": "int64"
            },
        }
    },
    "version": "0.1.0",
    "description": "This is a description of our new connector"
}

For our connectors written in Rust, the fluvio-connectors-common crate provides these commonly used options.

The fields in the metadata JSON object should all be generated from attributes of the project, such as the Cargo package name, version, and description.

In the case of our mqtt Rust connector, we do the following:

// Generate a JSON schema for the connector's CLI options.
let schema = schema_for!(MqttOpts);
// Fill in the metadata fields from the Cargo package attributes.
let mqtt_schema = MySchema {
    name: env!("CARGO_PKG_NAME"),
    version: env!("CARGO_PKG_VERSION"),
    description: env!("CARGO_PKG_DESCRIPTION"),
    direction: "source",
    schema,
};
println!("{}", serde_json::to_string(&mqtt_schema).unwrap());

Our CI will run this metadata command, verify that its output fits the schema, and, once merged, generate a catalog of the connectors.

If you’d like to do something other than a metadata subcommand in the executable, having a metadata make rule in the connector directory is also fine.

 

4. Integration tests

A given connector must have a Makefile with at least a test rule in it. How integration tests are implemented is up to the author of the connector; for example, we use bats in our http connector.

Our continuous integration will run make -C ./path-to-your-connector test on each pull request.
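A minimal Makefile satisfying this contract might look like the following; the bats invocation and test file path are purely illustrative:

test:
	bats ./tests/integration.bats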

 

5. Custom build steps

Should your connector require special build steps, such as depending on a C static library, we ask that you provide a build rule in your Makefile that handles these steps.
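For example, a connector that needs to compile or link a native dependency might wrap those steps in a build rule along these lines (the script path and commands are purely illustrative):

build:
	./scripts/build-native-deps.sh
	cargo build --release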