Examples

These examples show how to use the connectors in various ways:

  • ping-pong: A very simple example showcasing the interaction between Zeebe and Kafka using Kafka Connect with the Zeebe source and sink connectors.
  • microservices-orchestration: Shows how Zeebe can orchestrate a payment microservice from within an order fulfillment microservice when Kafka is used as the transport.

Setup

All examples require you to build the project and run Kafka Connect via Docker. While Docker is not the only way to run the examples, it provides the quickest getting-started experience and is therefore the only option described here. Refer to Kafka Connect Installation for other ways to run Kafka Connect.

You will use Camunda Platform 8 - SaaS for a managed Zeebe instance. For Kafka, you can use Confluent Cloud for a managed installation (recommended), or start Kafka via Docker as described below.

You need the following tools on your system:

  1. docker-compose to run Kafka
  2. Java and Maven to build the connector
  3. Camunda Modeler to visually inspect the process models
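
A quick way to confirm the tools are available is to check their versions:

docker-compose --version
java -version
mvn -version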

Create Zeebe Cluster on Camunda Platform SaaS

  • Log in to https://camunda.io/
  • Create a new Zeebe cluster
  • When the new cluster appears in the console, create a new set of client credentials to be used in the connector properties.
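
The credentials you download include a cluster ID, a client ID, a client secret, and the cluster region/address; the example connector properties will need these values. A minimal sketch for keeping them at hand in environment variables (the variable names are only illustrative placeholders, not something the connector requires):

export CAMUNDA_CLUSTER_ID="..."      # values from the Camunda console
export CAMUNDA_CLIENT_ID="..."
export CAMUNDA_CLIENT_SECRET="..."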

Create Kafka Cluster on Confluent Cloud

  • Log in to https://login.confluent.io/login
  • Create a new Kafka cluster
  • When the new cluster appears in the console, create a new set of client credentials
  • Enter these credentials in docker/docker-compose-confluent-cloud.yml, in the three places that look like this:
      CONNECT_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required \
      username=\"7XLMBPRBYSG4PTMS\" password=\"hmS0wgnCc2gzQcPVOtcH78kIhJjbR4qzuilzLbmBgQeDJwR/YURUeZl3ocgZrgLS\";"
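
A quick grep, run from the project root, helps locate the places that need your credentials:

grep -n "SASL_JAAS_CONFIG" docker/docker-compose-confluent-cloud.yml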

Build the connector

To build the connector, simply run the following from the root project directory:

mvn clean install -DskipTests

The resulting artifact is an uber JAR, e.g. target/kafka-connect-zeebe-*-uber.jar, where the asterisk is replaced by the current project version. For example, for version 1.0.0-SNAPSHOT, the artifact is located at target/kafka-connect-zeebe-1.0.0-SNAPSHOT-uber.jar.

Copy this JAR to docker/connectors/, e.g. docker/connectors/kafka-connect-zeebe-1.0.0-SNAPSHOT-uber.jar.
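
For example, from the project root (adjust the version if needed):

cp target/kafka-connect-zeebe-*-uber.jar docker/connectors/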

Start Kafka Connect via Docker Compose

cd docker
docker-compose -f docker-compose-confluent-cloud.yml up
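
Once the worker is up, you can verify that the Zeebe connector plugins were loaded via the Kafka Connect REST API (assuming the compose file exposes the default port 8083):

curl -s http://localhost:8083/connector-plugins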

Alternative to Confluent Cloud: Start Kafka via Docker Compose

You need at least 6.5 GB of RAM dedicated to Docker, otherwise Kafka might not come up. If you experience problems, try increasing the memory first, as Docker's default allocation is relatively small.
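
You can check how much memory Docker currently has available like this (the value is printed in bytes), and increase it in your Docker settings if necessary:

docker info --format '{{.MemTotal}}'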

If you don't use Confluent Cloud, you can start Kafka Connect alongside a complete local Kafka cluster by running:

cd docker
docker-compose -f docker-compose-local-kafka.yml up

This will start a local Kafka cluster along with Kafka Connect.

Of course, you can customize the Docker Compose file to your needs; it is based on the examples provided by Confluent.
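
To verify that everything came up, you can list the containers (from the docker directory) and, as above, query the Kafka Connect REST API (again assuming port 8083 is mapped):

docker-compose -f docker-compose-local-kafka.yml ps
curl -s http://localhost:8083/connector-plugins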

Running without Docker

Of course, you can also run without Docker. For development purposes, or just to try it out, you can grab the uber JAR after the Maven build and place it on your Kafka Connect plugin path.
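
A rough sketch of what that could look like with a plain Apache Kafka installation (the plugin directory below is just a placeholder; adjust paths to your setup):

# copy the uber JAR into a plugin directory of your choice
mkdir -p /opt/kafka-connect/plugins
cp target/kafka-connect-zeebe-*-uber.jar /opt/kafka-connect/plugins/

# set plugin.path=/opt/kafka-connect/plugins in your Connect worker config,
# then start a worker from your Kafka installation directory
bin/connect-distributed.sh config/connect-distributed.properties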