Zev Isert

Streaming data use cases

Updated: Mar 3

What is streaming data?


“IDC, a market-research firm, predicts that the ‘digital universe’ (the data created and copied every year) will reach 180 zettabytes (180 followed by 21 zeros) in 2025. Pumping it all through a broadband internet connection would take over 450m years.” (The Economist, 2017) Much of this data creation will be streaming data: continuous streams of data analyzed in real time, as opposed to batch data, which is collected over a period of time and then processed all at once.


Streaming data is becoming increasingly prevalent in the world of information science because of its place in the Internet of Things (IoT). Each day more devices are connected to the internet, collecting data to be analyzed, usually in real time. Think smart homes, industrial sensors, fitness trackers, in-store sensors that track customer activity, webpage tracking and more.


Use Cases


Real-time data analytics is ideal for accurate, data-driven decision making in your business, because you can act on the most recent information available. The use cases below show how streaming data can be put to work for better business outcomes.


IT Monitoring


If a company has a large IT department, it can benefit from using streaming data to monitor all of its devices in real time. Computers, sensors, phones and other company devices can send a continuous stream of data to be analyzed, flagging problems such as device failure, viruses and hacking attempts as they occur.
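
To make this concrete, here is a minimal sketch of what a monitoring loop over a stream of device readings might look like. The device names, thresholds and the simulated metric_stream generator are illustrative assumptions, not a real monitoring API.

```python
import random
import time

def metric_stream():
    """Simulate a continuous stream of device readings."""
    devices = ["laptop-01", "sensor-07", "phone-23"]
    while True:
        yield {
            "device": random.choice(devices),
            "cpu_percent": random.uniform(0, 100),
            "responding": random.random() > 0.05,
        }
        time.sleep(0.1)

# Analyze each reading as it arrives, rather than waiting for a batch.
for reading in metric_stream():
    if not reading["responding"]:
        print(f"ALERT: {reading['device']} is not responding")
    elif reading["cpu_percent"] > 95:
        print(f"WARNING: {reading['device']} CPU at {reading['cpu_percent']:.0f}%")
```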


Social Media Marketing


Companies may use streaming data to monitor social media sentiment about their brand. Posts and comments mentioning the brand or a specific marketing campaign can be collected from social media sites and analyzed in real time as they appear, showing how the brand and its marketing efforts resonate with customers.
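
As a toy illustration only, the sketch below scores each incoming post with a simple keyword list and keeps a running sentiment total. The sample posts, brand name and word lists are made up; a real pipeline would pull posts from a social media API and use a proper sentiment model.

```python
# Hypothetical word lists; a real system would use a trained sentiment model.
POSITIVE = {"love", "great", "amazing"}
NEGATIVE = {"hate", "broken", "terrible"}

def sentiment(text: str) -> int:
    """Score a post: +1 per positive keyword, -1 per negative keyword."""
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

# Stand-in for posts arriving from a social media feed.
incoming_posts = [
    "I love the new AcmeBrand campaign",
    "AcmeBrand support is terrible and my order arrived broken",
]

running_score = 0
for post in incoming_posts:
    score = sentiment(post)
    running_score += score
    print(f"{score:+d} | running total {running_score:+d} | {post}")
```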


Fraud Detection


Businesses can use streaming data for real-time fraud detection. For example, a bank may monitor its customers’ credit card transactions in real time, run detection algorithms on the live data stream, and notify affected customers as soon as suspicious activity appears.
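
A highly simplified sketch of one such streaming rule is shown below: flag a card that is charged in two different cities within a few minutes. The field names, sample transactions and threshold are hypothetical.

```python
from datetime import datetime, timedelta

# Last known (time, city) per card; in production this would be a shared store.
last_seen = {}

def is_suspicious(txn):
    """Flag a card charged in two different cities within five minutes."""
    prev = last_seen.get(txn["card_id"])
    last_seen[txn["card_id"]] = (txn["time"], txn["city"])
    if prev is None:
        return False
    prev_time, prev_city = prev
    return txn["city"] != prev_city and txn["time"] - prev_time < timedelta(minutes=5)

# Stand-in for transactions arriving on a live stream.
transactions = [
    {"card_id": "4111", "time": datetime(2023, 3, 1, 12, 0), "city": "Vancouver"},
    {"card_id": "4111", "time": datetime(2023, 3, 1, 12, 2), "city": "Toronto"},
]

for txn in transactions:
    if is_suspicious(txn):
        print(f"Possible fraud on card {txn['card_id']} in {txn['city']}: notify the customer")
```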


While streaming data can be extremely useful, as the use cases above demonstrate, it can be a struggle to integrate into your business. Below we describe the three most significant challenges associated with streaming data.


Challenges


How can your organization overcome the challenges of real-time data analytics and use streaming data to generate better, more profitable outcomes? Below are streaming data’s major difficulties, along with possible solutions.


1. Continuous generation from different sources and in different formats


Problem: Streaming data may come from a variety of sources, for example log data, social media likes, banking transactions and more. Unfortunately, each source will most likely produce data in a different format (CSV, XML, etc.). The sheer volume that results from continuous generation, combined with the variety of sources and formats, can make streaming data very difficult to combine and analyze quickly.


Solution: The most effective way to homogenize and integrate the data is data mapping: the process of matching fields from multiple datasets onto a single schema, or centralized database. Mapping your streaming data as it arrives helps ensure consistency, accuracy and integrity.
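
As a minimal sketch of this idea, the snippet below maps records from two source formats (CSV and XML) onto one shared schema before analysis. The field names and sample records are illustrative assumptions.

```python
import csv
import io
import xml.etree.ElementTree as ET

def from_csv(row):
    """Map a CSV row onto the shared schema."""
    return {"user_id": row["id"], "event": row["action"], "amount": float(row["amt"])}

def from_xml(element):
    """Map an XML <event> element onto the shared schema."""
    return {
        "user_id": element.findtext("userId"),
        "event": element.findtext("type"),
        "amount": float(element.findtext("value")),
    }

# Stand-ins for two incoming sources with different formats.
csv_source = io.StringIO("id,action,amt\n42,purchase,19.99\n")
xml_source = ET.fromstring(
    "<events><event><userId>7</userId><type>refund</type><value>5.00</value></event></events>"
)

records = [from_csv(row) for row in csv.DictReader(csv_source)]
records += [from_xml(event) for event in xml_source.findall("event")]
print(records)  # every record now follows the same schema
```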


2. Lack of developer expertise


Problem: Streaming-data technology has developed at an extremely fast pace. Many data scientists and IT personnel are unable to keep up, and do not have the resources to build data pipelines for streaming data.


Solution: Instead of having your organization’s IT personnel spend time learning to build a custom data pipeline for your company’s streaming data, you can hire a third party to help build the pipeline, or use pipeline-building software.


3. Large processing power requirements


Problem: The volume of data flowing through a dynamic stream demands a large amount of processing power. Streaming data is also inherently more difficult to handle than batch data: because only one piece of data is processed at a time, it is hard to detect anomalies, clean the data and ascertain broader patterns.


Solution: The Apache Software Foundation maintains a large number of open-source tools dedicated to processing streaming data and keeping you from falling behind. These tools include Apache Kafka, Apache NiFi and Apache Storm.
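
As one possible starting point, the sketch below reads records from a stream using the kafka-python client for Apache Kafka. The topic name, broker address and JSON message format are assumptions for illustration, and a Kafka broker must already be running.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "transactions",                      # placeholder topic name
    bootstrap_servers="localhost:9092",  # replace with your broker address
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Each message is handled as it arrives, rather than in periodic batches.
for message in consumer:
    record = message.value
    print(f"offset {message.offset}: {record}")
```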