Eventador Elements™


Eventador Elements, a library of curated, fully managed plug-ins, provides the connectors, sources, sinks, and other add-ons that unlock the potential of your streaming data infrastructure.

Elements removes the burden of integrating with internal and external data sources: customers deploy fully managed, discrete plug-ins that turn a streaming infrastructure into a true hub for democratizing data across the enterprise. There is no underlying infrastructure, configuration, or management to worry about; it is all handled by Eventador in a true cloud-native fashion.

Elements are deployed completely transparently to the end user. Depending on the type of element, some are stand-alone containers, others are clusters of containers, and some are KafkaConnect applications. Elements are deployment-specific and reside in the same VPC as the deployment itself.

If you don’t already have an enterprise account, follow these steps to get set up:

  • Fully managed Apache Flink and Elements are only available in the enterprise environment. Contact sales for a trial account and/or a demo; they will give you a dedicated URL, control plane, and environment for your use.
  • Log in to the URL given to you by sales/support to create Flink deployments.


Adding elements to your deployment

Elements are deployment-specific. To add an element:

  • From the deployments list - Click the add element button to add an element to your deployment.
  • Select an element from the list and click add.
  • The element will be listed under the deployment with its name and version.
  • Each element may have a set of unique configuration options.

Schema Manager configuration and usage

The Eventador Elements Schema Manager is a user interface for managing Avro schemas in Kafka, currently using Confluent Schema Registry 5.0.1 as the backend. It allows simple creation, editing, versioning, and validation of Avro schemas. The Eventador Console displays the Schema Registry endpoint for use in applications for schema validation.

Create a schema

  • From the deployments list - Click on the element named Schema Manager.
  • To create a new schema, click the new editor button. A basic starter schema is shown in the editor window and can be altered as desired:

  {
      "type": "record",
      "name": "mynewschema",
      "namespace": "com.eventador.users",
      "doc": "some documentation about this schema",
      "fields": [
          {"name": "firstName", "type": "string"},
          {"name": "lastName", "type": "string"},
          {"name": "birthDate", "type": "long"}
      ]
  }
  • Alter the schema to match your requirements. Ensure that the type is record: "type": "record"
  • Give the schema a logical name in the Schema Editor: box and select Save Version.
  • A new schema with version 1 will be created in the schemas list.
  • You may now reference this schema in your code.
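Before saving, it can be useful to sanity-check the schema locally. A minimal sketch using only the standard library, assuming the example schema shown in the steps above (the field names are from that example):

```python
import json

# The example schema from the steps above.
schema_str = """
{
  "type": "record",
  "name": "mynewschema",
  "namespace": "com.eventador.users",
  "doc": "some documentation about this schema",
  "fields": [
    {"name": "firstName", "type": "string"},
    {"name": "lastName", "type": "string"},
    {"name": "birthDate", "type": "long"}
  ]
}
"""

schema = json.loads(schema_str)  # raises ValueError if the JSON is malformed

# A record schema must have type "record" and a list of fields.
assert schema["type"] == "record"
field_names = [f["name"] for f in schema["fields"]]
print(field_names)  # ['firstName', 'lastName', 'birthDate']
```

This catches malformed JSON and a missing "type": "record" before you paste the schema into the editor; full Avro validation is what the Validate button is for.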

Schema Registry connection string

When Schema Manager is deployed, a URI is created for the Schema Registry backend. This URI is used in your code to reference a schema. To find the URI:

  • From the deployments list - Click on the element named Schema Manager.
  • Select the Connections tab; the URI for Schema Registry is listed in the pane as Plain Text Endpoint.
  • Use this URI in your code to reference a schema. For more information on using Schema Registry see the docs here.

Validating a schema using Schema Manager

To validate a schema:

  • From the deployments list - Click on the element named Schema Manager.
  • Click on a schema in the schemas list.
  • Select the Validate button; the status message indicates whether the schema is valid.

Editing a schema and incrementing the version

To edit a schema:

  • From the deployments list - Click on the element named Schema Manager.
  • Click on a schema in the schemas list.
  • Edit the schema as needed, then select the Validate button.
  • Select the Save Version button. The version select box will increment to the next higher version.
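Before saving a new version, the Schema Registry backend can also check that an edited schema is compatible with the latest stored version via its REST compatibility endpoint. A sketch of building that request, assuming a hypothetical endpoint and an edited schema that adds an optional email field:

```python
import json

SCHEMA_REGISTRY_URL = "http://schema-registry.example.com:8081"  # hypothetical
subject = "mynewschema"

# Edited schema: the original fields plus a hypothetical optional field.
edited_schema = {
    "type": "record",
    "name": "mynewschema",
    "namespace": "com.eventador.users",
    "fields": [
        {"name": "firstName", "type": "string"},
        {"name": "lastName", "type": "string"},
        {"name": "birthDate", "type": "long"},
        {"name": "email", "type": ["null", "string"], "default": None},
    ],
}

# Confluent Schema Registry compatibility-check endpoint.
url = f"{SCHEMA_REGISTRY_URL}/compatibility/subjects/{subject}/versions/latest"

# The schema is sent as a JSON string inside a JSON body.
payload = json.dumps({"schema": json.dumps(edited_schema)})
# POSTing `payload` to `url` returns a body like {"is_compatible": true}.
```

Adding new fields with defaults, as above, keeps a schema backward compatible; the Save Version button then bumps the version for you.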


Kafka Connect: Debezium configuration and usage

Debezium is a distributed platform that captures changes in databases (CDC) and publishes them to Kafka using the Kafka Connect interface. Debezium sends data changes to Kafka, making them available to a wide variety of use cases. Debezium is integrated into Eventador as an Eventador Element, making provisioning a CDC source a one-click endeavor. More specifics on how Debezium works are located in the Debezium documentation.

  • From the deployments list - Click on the element name to configure it.
  • Under the Active Connectors tab, click Add Connector.

Available connectors are:

io.debezium.connector.mongodb.MongoDbConnector                 0.9.2.Final
io.debezium.connector.mysql.MySqlConnector                     0.9.2.Final
io.debezium.connector.oracle.OracleConnector                   0.9.2.Final
io.debezium.connector.postgresql.PostgresConnector             0.9.2.Final
  • Specify a KafkaConnect JSON configuration in the window.
  • Some handy examples are shown in the Examples tab.

For example for the MongoDB connector - io.debezium.connector.mongodb.MongoDbConnector:

  {
    "name": "my-mongodb-connector",
    "config": {
      "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
      "mongodb.hosts": "mongodb_testing/my.mongo.host:27017",
      "mongodb.name": "mymongodb",
      "collection.whitelist": "mymongodb.mycollection"
    }
  }

Or the MySQL connector - io.debezium.connector.mysql.MySqlConnector:

  {
    "name": "my-mysql-connector",
    "config": {
      "connector.class": "io.debezium.connector.mysql.MySqlConnector",
      "database.hostname": "my.mysql.host.com",
      "database.port": "3306",
      "database.user": "dbz",
      "database.password": "dbz",
      "database.server.id": "1",
      "database.server.name": "my_mysql_server",
      "database.whitelist": "my_mysql_db",
      "database.history.kafka.bootstrap.servers": "your.kafka.broker:9092",
      "database.history.kafka.topic": "dbz.mysql.history",
      "include.schema.changes": "true"
    }
  }
  • Click on Create Connector.
  • The console application will display basic information about the status, and allow you to stop, restart, and delete the Element.
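Under the hood, creating a connector is a POST of the JSON configuration to the Kafka Connect REST API, which is what the Add Connector flow above does for you. A sketch of building that request, assuming a hypothetical Connect endpoint and reusing the MySQL example configuration:

```python
import json

# Hypothetical Kafka Connect REST endpoint for the Debezium Element.
CONNECT_URL = "http://connect.example.com:8083"

# The MySQL connector configuration from the example above.
connector = {
    "name": "my-mysql-connector",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "my.mysql.host.com",
        "database.port": "3306",
        "database.user": "dbz",
        "database.password": "dbz",
        "database.server.id": "1",
        "database.server.name": "my_mysql_server",
        "database.whitelist": "my_mysql_db",
        "database.history.kafka.bootstrap.servers": "your.kafka.broker:9092",
        "database.history.kafka.topic": "dbz.mysql.history",
        "include.schema.changes": "true",
    },
}

# Kafka Connect REST API: POST /connectors creates a new connector.
url = f"{CONNECT_URL}/connectors"
body = json.dumps(connector)
# POSTing `body` to `url` with Content-Type: application/json
# creates the connector; GET /connectors/<name>/status reports its state.
```

The same REST API backs the status, restart, and delete actions shown in the console.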



PrestoDB configuration and usage

Presto is a distributed SQL query engine for big data. The Eventador Element for Presto includes a couple of Eventador-only Kafka enhancements that make it well suited for writing SQL against Kafka topics: a simple web-based query interface, automatic topic discovery, and an updated driver connection facility that allows SASL auth.

Using Eventador Console to query via the Kafka connector

  • From the deployments list - Click on the element named PrestoDB
  • In the SQL Terminal tab you can issue SQL against your Kafka cluster. Your topics are auto-discovered and can be queried just like tables; there are no configuration files to edit.
  • A full syntax guide is available in the Presto/Kafka documentation.

Note: for color-coded JSON output, terminate a SQL statement with \g instead of ;


Getting the REST endpoint for Presto

Presto has a REST endpoint for using the Presto client or even home-grown tooling. To get the endpoint for your Presto cluster:

  • From the deployments list - Click on the element named PrestoDB
  • Select the Connecting tab
  • The REST endpoint is listed in the PrestoDB Endpoint box.
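Presto's REST protocol accepts a SQL statement as the body of a POST to /v1/statement, with the user (and optionally catalog/schema) passed as headers. A minimal sketch of building such a request against the endpoint from the steps above, assuming a hypothetical endpoint URL, user name, and topic name:

```python
# Hypothetical endpoint; copy the real one from the PrestoDB Endpoint box.
PRESTO_ENDPOINT = "http://presto.example.com:8080"

# Kafka topics appear as tables under the kafka catalog;
# "mytopic" is a hypothetical topic name.
sql = "SELECT * FROM mytopic LIMIT 10"

url = f"{PRESTO_ENDPOINT}/v1/statement"
headers = {
    "X-Presto-User": "eventador",   # hypothetical user
    "X-Presto-Catalog": "kafka",
    "X-Presto-Schema": "default",
}
# POSTing `sql` to `url` with these headers returns JSON containing
# a `nextUri`; polling it pages through the query results.
```

Home-grown tooling just repeats the POST-then-poll loop; the official Presto clients handle it for you.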

Other Presto Connectors

Eventador.io supports all of the Presto connectors, so you can join and query other data sources alongside Apache Kafka. Configuring VPC peering, security groups, and Presto for other connectors is done via Support.