What is Data Streaming?

by Garrett Alley  
4 min read  • 31 Oct 2018

Data streaming defined

Visualize a river. Where does the river begin? Where does the river end? Intrinsic to our understanding of a river is the idea of flow. The river has no beginning and no end. Data streaming is ideally suited to data that likewise has no discrete beginning or end. For example, data from a traffic light is continuous and has no "start" or "finish." Data streaming is the process of sending data records continuously rather than in batches. Generally, data streaming is useful for sources that send data in small sizes (often in kilobytes) in a continuous flow as the data is generated. This may include a wide variety of data sources such as telemetry from connected devices, log files generated by customers using your web applications, e-commerce transactions, or information from social networks or geospatial services.
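
To make that concrete, here is a minimal sketch of a source that just keeps emitting small records. The `send_record` function is an invented stand-in for whatever transport your pipeline actually uses (HTTP, Kafka, Kinesis, and so on), and the field names are illustrative:

```python
import json
import random
import time

def send_record(record: dict) -> None:
    """Hypothetical stand-in for whatever transport your pipeline uses."""
    print(json.dumps(record))

# A traffic light has no "start" or "finish": it just keeps emitting records.
while True:
    send_record({
        "sensor_id": "light-42",
        "state": random.choice(["red", "yellow", "green"]),
        "ts": time.time(),
    })
    time.sleep(1)  # small records, sent continuously as they are generated
```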

Traditionally, data is moved in batches. Batch processing often handles large volumes of data at once, with long periods of latency between runs; a job might run only once every 24 hours, for example. While batching can be an efficient way to move high volumes, it doesn't work for data that is meant to be streamed, because that data can be stale by the time it is processed.
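
A rough sketch of the contrast, with hypothetical function names (`run_nightly_batch`, `on_record`, `handle`) invented for illustration:

```python
# Batch: a scheduled job processes everything accumulated since the last run.
def run_nightly_batch(path: str) -> None:
    with open(path) as f:
        for line in f:      # the oldest record here may be ~24 hours stale
            handle(line)

# Streaming: each record is handled the moment it arrives.
def on_record(line: str) -> None:
    handle(line)            # latency is per-record, not per-run

def handle(line: str) -> None:
    """Hypothetical processing step shared by both styles."""
    print(line.strip())
```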

Data streaming is optimal for time series data and for detecting patterns over time, such as tracking the length of a web session. Most IoT data is well-suited to data streaming: traffic sensors, health sensors, transaction logs, and activity logs are all good candidates.
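
As one illustrative take on the session-length example (the timeout, state layout, and function names are assumptions, not a prescribed design), a tracker can keep per-session state and emit a duration once a session goes quiet:

```python
SESSION_TIMEOUT = 30 * 60   # assume 30 minutes of silence ends a session
first_seen = {}             # session_id -> time of first event
last_seen = {}              # session_id -> time of most recent event

def on_event(session_id, ts):
    first_seen.setdefault(session_id, ts)
    last_seen[session_id] = ts

def expire_sessions(now):
    """Emit the length of any session that has gone quiet, then forget it."""
    expired = [s for s, ts in last_seen.items() if now - ts > SESSION_TIMEOUT]
    for sid in expired:
        print(f"session {sid} lasted {last_seen[sid] - first_seen[sid]:.0f}s")
        del first_seen[sid], last_seen[sid]
```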

This streamed data is often used for real-time aggregation, correlation, filtering, or sampling. Data streaming allows you to analyze data in real time and gives you insights into a wide range of activities, such as metering, server activity, geolocation of devices, or website clicks.
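
Here is a minimal sketch of real-time aggregation, counting website clicks per page in a tumbling window. The one-minute window and the `on_click` signature are arbitrary choices for illustration:

```python
from collections import Counter

WINDOW_SECONDS = 60         # an arbitrary one-minute tumbling window

current_window = None
counts = Counter()

def on_click(ts, page):
    """Count clicks per page, emitting each window's totals as it closes."""
    global current_window, counts
    window = int(ts // WINDOW_SECONDS)
    if current_window is not None and window != current_window:
        print(f"window {current_window}: {dict(counts)}")
        counts = Counter()
    current_window = window
    counts[page] += 1
```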

Consider the following scenarios:

  • A financial institution tracks market changes and adjusts customer portfolios based on configured constraints (such as selling when a certain stock value is reached).
  • A power grid monitors throughput and generates alerts when certain thresholds are reached (a minimal sketch of this kind of check follows this list).
  • A news source streams clickstream records from its various platforms and enriches the data with demographic information so that it can serve articles that are relevant to the audience demographic.
  • An e-commerce site streams clickstream records to detect anomalous behavior and generates a security alert when the clickstream deviates from normal patterns.
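
For the power-grid scenario above, a per-reading threshold check is the core of the logic. The threshold value and the `raise_alert` function here are placeholders:

```python
THRESHOLD_MW = 950.0  # illustrative capacity limit

def raise_alert(substation: str, load_mw: float) -> None:
    """Placeholder: in practice this would page an operator or call an API."""
    print(f"ALERT: {substation} at {load_mw} MW exceeds {THRESHOLD_MW} MW")

def on_reading(substation: str, load_mw: float) -> None:
    if load_mw >= THRESHOLD_MW:
        raise_alert(substation, load_mw)
```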

Data streaming challenges

Data streaming is a powerful tool, but a few challenges are common when working with streaming data sources. The following list shows a few things to plan for when building a streaming pipeline:

  • Plan for scalability
  • Plan for data durability
  • Incorporate fault tolerance in both the storage and processing layers (see the checkpointing sketch after this list)
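
As a sketch of the fault-tolerance point, one common pattern is to persist a checkpoint only after a record has been durably processed, so a crash replays records rather than losing them (at-least-once delivery). The file-based checkpoint and function names here are illustrative, not any particular product's API:

```python
import os

CHECKPOINT_FILE = "offset.ckpt"     # illustrative durable checkpoint location

def load_offset():
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return int(f.read())
    return 0

def save_offset(offset):
    with open(CHECKPOINT_FILE, "w") as f:
        f.write(str(offset))

def process(record):
    """Hypothetical processing step."""
    print(record)

def consume(records):
    for i in range(load_offset(), len(records)):
        process(records[i])
        save_offset(i + 1)  # checkpoint after processing: on a crash,
                            # records are replayed, not lost (at-least-once)
```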

Data streaming tools

With the growth of streaming data has come a number of solutions geared toward working with it. The following list shows a few popular tools for working with streaming data:

  • Amazon Kinesis. Amazon Kinesis is a managed, scalable, cloud-based service that allows real-time processing of large data streams.
  • Apache Kafka. Apache Kafka is a distributed publish-subscribe messaging system that integrates applications and data streams (a minimal producer/consumer sketch follows this list).
  • Apache Flink. Apache Flink is a streaming dataflow engine that provides facilities for distributed computation over data streams.
  • Apache Storm. Apache Storm is a distributed real-time computation system. Storm is used for distributed machine learning, real-time analytics, and many other cases, especially those involving high data velocity.
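
To give a feel for what working with one of these tools looks like, here is a minimal Kafka producer and consumer using the third-party kafka-python client; the topic name and broker address are assumptions:

```python
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

# Produce: publish small records to a topic as they are generated.
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("clickstream", b'{"page": "/home", "user": "u1"}')
producer.flush()

# Consume: subscribe and handle each record as it arrives.
consumer = KafkaConsumer("clickstream", bootstrap_servers="localhost:9092")
for message in consumer:
    print(message.value)  # process each record in real time
```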

The Alooma difference

Streaming data is a powerful source of information, but what if it’s not your only source of data?

How do you integrate data from streaming sources with your existing structured and unstructured data? To get the most out of your data, regardless of its source, you need to be able to put the information together so that you can analyze it. Alooma is designed to handle data from streaming sources as well as structured and unstructured data from other sources, perform transformations on the data, and load it into your data warehouse or data store so you can analyze it in real time. Just as many tributary streams merge to create a powerful river, Alooma can help you pull together all of your tributary data streams into one powerful source of information.

Integration. Alooma can integrate with hundreds of sources and SDKs, allowing you to capture and export your streaming data.

Scalability. Alooma can scale to meet your company's changing needs. Maybe you are tracking a few devices today, but how many will you track tomorrow? Alooma is ready to scale quickly and painlessly.

Data integrity. With Alooma, you no longer have to worry about schema inconsistencies, type mismatches, or issues with formatting.

Security. Alooma is 100% SOC 2 Type II, GDPR, HIPAA, and ISO 27001 compliant. Alooma encrypts your data, both in motion and at rest.

Are you ready to see how Alooma can help you manage your streaming data? Contact us today!
