Is Apache Kafka becoming a de facto standard for data streaming?

By Eli Oxman
Updated May 24, 2018

Apache Kafka is a wonderful tool to serve as the basis for a stream processing pipeline.

It is battle-tested and used by many large companies, including LinkedIn, Twitter, Uber, and Netflix.

Additionally, the maintainers of Apache Kafka, its community, and Confluent (the company that is currently the major driving force behind Kafka's development) have clearly identified it as a central piece of stream processing, and they are constantly introducing features that make it easier to build data streaming pipelines around Kafka, such as transactions and stream processing support.

I think the reason Kafka became so popular is that it is simple to use, yet very good at what it does. It also scales very well to support the ever-growing load you can expect going into your data streams.
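One reason Kafka scales so well is that a topic is split into partitions, and records with the same key are routed to the same partition, so partitions (and the load) can be spread across brokers while per-key ordering is preserved. Below is a minimal, self-contained sketch of that key-partitioning idea; it does not use the real Kafka client, and the partition count and keys are hypothetical, chosen only for illustration.

```python
# Sketch of Kafka-style key partitioning (illustration only, not the
# real client library or Kafka's exact default partitioner).
import hashlib

NUM_PARTITIONS = 6  # hypothetical partition count for a topic


def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Deterministically map a record key to a partition.

    The same key always maps to the same partition, which is what
    preserves per-key ordering while letting different keys spread
    across partitions (and therefore across brokers).
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions


# Events for the same customer land in one partition (kept in order),
# while different customers are distributed across partitions.
events = [("customer-42", "signup"),
          ("customer-7", "click"),
          ("customer-42", "purchase")]
for key, event in events:
    print(f"{key}/{event} -> partition {partition_for(key)}")
```

Because the mapping is a pure function of the key, adding consumers is easy: each consumer in a group simply takes ownership of a subset of partitions.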

Kafka is a central piece of our architecture here at Alooma, and we are using it to process billions of events per day for our customers.

We’ve even created a nice way to visualize your data in Kafka: Alooma Live - Kafka Real-Time Visualization.


Originally published at Quora.

