Kafka Integration

Alooma can read and replicate all of the events in your Kafka cluster topics in near real time. This allows you to, for example, merge arbitrary data from your Kafka topics with client usage data in your data destination.

Connecting to Kafka

  1. How do you want to connect your Kafka cluster to Alooma? If it's via an SSH server, check out how to connect via SSH. Otherwise, you'll need to whitelist access to Alooma's IP addresses.

  2. From the Plumbing page, click Add new input and select the Kafka option.

  3. Give your input a name and then supply the following information:

    • Hostname or IP address of one of the servers in your Kafka cluster. If there are multiple servers in the cluster, provide the hostname or IP of any of them, and we'll discover the rest.

    • Port of your Kafka server (default is 9092)
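Before filling in these details, it can help to confirm that the host and port you plan to supply are actually reachable from outside your network. This quick sanity check is not part of Alooma's setup; it's just a minimal sketch using Python's standard library, with the hostname and port as placeholders you'd replace with your own:

```python
import socket

def broker_reachable(host: str, port: int = 9092, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical broker address):
# broker_reachable("kafka.example.com", 9092)
```

If this returns False from a machine outside your network, double-check your firewall whitelisting before continuing.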

  4. If your Kafka server is behind an SSH server, you can connect to Kafka via SSH.

  5. Indicate whether your connection should be authenticated with SSL and provide the content of the following files:

    • CRT file: The SSL certificate the server is using for authentication.

    • Client Key file: The SSL key you want to use to authenticate the client.

    • Client CRT file: The SSL certificate you want to use to authenticate the client.
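To see how these three files typically map onto a Kafka client's SSL settings, here is a hedged sketch of the equivalent configuration for the kafka-python client (the file paths are hypothetical placeholders, not values Alooma provides):

```python
# Hypothetical local paths; substitute your actual certificate and key files.
ssl_config = {
    "security_protocol": "SSL",
    "ssl_cafile": "server-ca.crt",   # CRT file: certificate the server authenticates with
    "ssl_certfile": "client.crt",    # Client CRT file: certificate authenticating the client
    "ssl_keyfile": "client.key",     # Client Key file: private key for the client certificate
}
```

In Alooma's form you paste the *contents* of these files rather than paths, but the roles of the three files are the same.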

  6. Provide a space- or comma-separated list of the topics you'd like to replicate.
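Because the topic list accepts either spaces or commas as separators, a mixed list like `orders, clicks signups` is valid. As a minimal sketch (not Alooma's actual parser), splitting such a list works like this:

```python
import re

def parse_topics(raw: str) -> list:
    """Split a space- or comma-separated topic list into individual topic names."""
    return [t for t in re.split(r"[,\s]+", raw.strip()) if t]

parse_topics("orders, clicks signups")  # → ["orders", "clicks", "signups"]
```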

Alooma will create an event type for each of your Kafka topics.

Keep the mapping mode at the default, OneClick, if you'd like Alooma to automatically map all Kafka topics exactly to your data destination. Otherwise, each event type will have to be mapped manually from the Mapper screen.

That's it! You're done integrating Kafka and Alooma.
