As data plays an ever larger role in retail, more and more companies are switching to a data-driven approach.
So, what exactly does streaming data in IoT mean? Data streaming refers to the continuous transmission and processing of raw data records. Tools like Apache Kafka allow users to build data streams that feed real-time processing, downstream analytics in data lakes, and integrations with other systems.
This approach marks a shift from traditional batch processing, which is increasingly outdated in a competitive market where insights are most valuable the moment the data arrives.
IoT data – the volume processing challenge
Real-time insights derived from IoT devices and connected sensors can help revolutionize operations and establish a competitive advantage – all thanks to data stream processing systems.
A single IoT device generates only small units of data. However, even a small shop may have dozens of sensors, each producing a continuous stream of real-time data (one or more readings per second).
Assuming 50 sensors are installed and each emits one value per second, that gives 50 times 60 (seconds per minute) times 60 (minutes per hour) times 12 (opening hours). That equals 2,160,000 values for one store PER DAY.
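The back-of-the-envelope arithmetic above can be sketched in a few lines (the sensor count, sampling rate, and opening hours are the assumptions from the example, not measured figures):

```python
# Daily data volume for one store, assuming each sensor
# emits one value per second during opening hours.
SENSORS = 50
SECONDS_PER_MINUTE = 60
MINUTES_PER_HOUR = 60
OPENING_HOURS = 12

values_per_day = SENSORS * SECONDS_PER_MINUTE * MINUTES_PER_HOUR * OPENING_HOURS
print(f"{values_per_day:,} values per store per day")  # 2,160,000

# Scaling to a chain of stores quickly becomes overwhelming:
for stores in (10, 100, 1000):
    print(f"{stores} stores -> {stores * values_per_day:,} values/day")
```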
Say, for instance, your company owns 10, 100, or 1,000+ stores and you want to retain historical logs to tune up your stock management – that sounds overwhelming, doesn’t it? When such a huge quantity of data is being generated, cloud data processing, combined with IoT analytics, is the solution for most companies.
Retailers use IoT data streaming to establish a direct connection with their customers: acting on IoT data at every stage of the customer journey helps optimize the customer experience, ultimately serving customers better and developing a lasting relationship with them.
Some of the use cases already adopted in the retail industry are highlighted below.
Cloud-based data streaming pipelines
With sensor data as a source, data engineering teams can set up robust, scalable, cloud computing-based pipelines responsible for streaming data. These pipelines immediately send measured values to a centralized data repository (i.e. a part of a company’s data lake) based on cloud object storage solutions. These storage solutions are offered by the digital giants:
- Amazon S3
- Google Cloud Storage
- Microsoft Azure Blob Storage
Once delivered to a centralized repository, such as the mentioned data lake, information can be used as a reference for automation models and analytics. The algorithms can deal with a vast amount of data using, for example, the Apache Spark processing framework. Cloud service providers offer some ready-to-use tools like AWS Glue to build Spark transformations.
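A minimal sketch of one stage of such a pipeline might look as follows – incoming sensor readings are grouped into hourly partitions and serialized as JSON lines, ready to be uploaded as one object per partition to a store like S3 (the field names and hourly batching policy are illustrative assumptions, not a specific vendor API):

```python
import json
from collections import defaultdict
from datetime import datetime, timezone

def partition_key(reading: dict) -> str:
    """Build an hourly partition path, e.g. 'sensors/2024/01/15/13/'."""
    ts = datetime.fromtimestamp(reading["ts"], tz=timezone.utc)
    return ts.strftime("sensors/%Y/%m/%d/%H/")

def batch_readings(readings: list[dict]) -> dict[str, str]:
    """Group readings by partition and serialize each batch to JSON lines,
    one object per partition, ready for upload to object storage."""
    batches = defaultdict(list)
    for r in readings:
        batches[partition_key(r)].append(json.dumps(r))
    return {key: "\n".join(lines) for key, lines in batches.items()}

readings = [
    {"sensor": "fridge-1", "ts": 1700000000, "temp_c": 4.2},
    {"sensor": "fridge-1", "ts": 1700000001, "temp_c": 4.3},
]
objects = batch_readings(readings)
```

In a real pipeline the serialized batches would then be pushed with the cloud provider's SDK; time-based partitioning like this keeps later analytical scans cheap.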
If your BI team or analysts want convenient SQL data manipulation, data marts, or even custom solutions, the data can be loaded into a data warehouse like Snowflake, Amazon Redshift, or Google’s BigQuery. If you require immediate analysis of data streams, Apache Kafka steps in with ksqlDB, a convenient feature for building streaming data applications with SQL statements.
Apache Kafka enables transforming, filtering, aggregating, and joining data sets to derive new collections or materialized views that are incrementally updated in real-time as new event data streams arrive.
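The incremental-update idea can be illustrated with a small Python sketch – a stand-in for what Kafka Streams or ksqlDB do at scale, not their actual API. A materialized view of per-product sales counts is updated event by event as the stream arrives, with no batch recomputation:

```python
from collections import Counter

class MaterializedView:
    """Toy materialized view: per-product sales counts kept
    incrementally up to date as events stream in."""

    def __init__(self):
        self.sales_by_product = Counter()

    def apply(self, event: dict) -> None:
        # Each new event updates the view in place.
        if event["type"] == "sale":
            self.sales_by_product[event["product"]] += event["qty"]

view = MaterializedView()
stream = [
    {"type": "sale", "product": "milk", "qty": 2},
    {"type": "sale", "product": "bread", "qty": 1},
    {"type": "sale", "product": "milk", "qty": 1},
]
for event in stream:
    view.apply(event)

print(view.sales_by_product)  # Counter({'milk': 3, 'bread': 1})
```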
Real-time stream data processing in retail inventory management
The fast-paced retail sector is one where there is an abundance of instantaneous new data. Implementing real-time analytics and processing technologies could be incredibly beneficial and can guide retail businesses towards making decisions based on valuable insights to amplify profits.
One of the simplest examples of streaming data architecture is a retail inventory management system. This kind of solution monitors the number of products currently available in every store and warehouse. With such a data processing system, retailers get real-time, global insight into out-of-stock situations across the whole company.
It sounds simple, but reliable real-time and historical data analytics are crucial for the supply chain automation model across your whole company. Streaming data architecture reduces the required workforce, simplifies processes, and provides monitoring that can manage inventory at a global scale (for all offline and online channels).
Fixed thresholds that trigger automatic stock orders are a reasonable starting point for optimization, but there is no limit on going deeper. Data science capabilities can enrich the algorithms with a key feature – future demand prediction, which better detects upcoming stock shortages and also optimizes storage and logistics costs.
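The two approaches can be contrasted in a short sketch (the threshold value, lead time, and moving-average forecast are illustrative assumptions, not a production model):

```python
def needs_reorder_fixed(stock: int, threshold: int = 20) -> bool:
    """Simplest rule: reorder whenever stock drops below a fixed threshold."""
    return stock < threshold

def needs_reorder_predictive(stock: int, daily_sales: list[int],
                             lead_time_days: int = 3) -> bool:
    """Refinement: forecast demand over the supplier lead time with a
    moving average of recent daily sales, and reorder if stock would
    run out before the next delivery arrives."""
    forecast_per_day = sum(daily_sales) / len(daily_sales)
    expected_demand = forecast_per_day * lead_time_days
    return stock < expected_demand

# 30 units on hand, selling ~12 per day, 3-day supplier lead time:
print(needs_reorder_fixed(30))                     # False: above threshold
print(needs_reorder_predictive(30, [11, 12, 13]))  # True: ~36 units needed
```

The fixed rule misses the shortage that the demand forecast catches, which is exactly the gap predictive analytics closes.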
Having effective stock management in place is essential.
Taking a real-life example, imagine a scenario in which a client wants to buy a product that should be available in-store, but the shelf is empty. Such incidents create a bad customer experience. This customer might then decide to go to a different supplier to buy the missing product or even leave the shop without making any purchase.
However, there are solutions that prevent a mismatch between the products actually on the shelves and the inventory records.
IoT for smart shelf management
Data streams can be generated by many sources, including physical IoT devices such as sensors that transmit monitored values – motion, temperature, and so on – in real-time for further processing.
One of the top examples of retail IoT is the smart shelf management system. Using weight sensors, it continuously sends information about the amount of every product available in a store. Through a dedicated application, the shop assistant is immediately notified when the quantity of a given product on the shelves falls below a specified level. There is also room for other streaming data applications, such as a web or mobile app providing real-time inventory analytics with single-item accuracy.
Inventory management and smart shelves are not the only examples of leveraging data in retail. Collected historical sensor data feeds further analytics and forecasting, which can reveal which shelves have the best conversion rate, which products sell well, and which ones customers pass over.
Data streaming in video analytics
Most stores and warehouses use CCTV cameras to monitor shop traffic. However, CCTV cameras can also be used to manage queues and predict in-store wait times.
As explored in one of our previous articles on video analytics, surveillance technology can improve a future sales strategy to be aligned with customer behavior. Stream processing of large video data volumes using computer software provides intelligence and insight on products, displays, customer patterns, and other metrics to establish optimal product placement and an efficient floor plan.
Let us assume that a retailer keeps streaming data from the IoT sensors that monitor their inventory in various data warehouses. The first thing they want to do is ensure that certain physical quantities stay at optimum levels.
Shoppers will not buy food that isn’t fresh. Customers want to save money, but they also prefer to have a choice. Being assured that they are buying food that is fresh and of value is paramount. That's why Kroger, a grocery retailer, benefits from IoT solutions on a daily basis by preventing temperature spikes and keeping food from spoiling.
In Kroger’s case, the installation of temperature sensors in fridges and freezers keeps their products fresh by constantly monitoring their condition and streaming data about current temperature levels. Managers and facility engineers are immediately alerted when appliances don’t adhere to set conditions. Therefore, inventory losses are reduced and the seller is assured they offer fresh and safe food, providing greater customer satisfaction.
Clickstream for online business
The current standard business model comprises both physical and online stores. To track online customers’ digital footprints, it is smart to install software that provides continuously generated clickstream data. Clickstream data is the roadmap of a website visitor’s online activity: which websites a user has visited, the pages viewed on a particular site, the time spent on each page, and where the user clicked next.
This kind of data streaming pipeline can feed retail data analytics systems. Data streams deliver information about viewed products, device location, time spent in a virtual store, and any 404 errors that may have occurred. Such data streaming systems can also be incorporated into the marketing strategy for a better user experience.
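At their core, clickstream events are just timestamped page views per session; a metric like time-on-page falls out of ordering them, as in this sketch (the event fields are illustrative assumptions):

```python
def time_on_pages(events: list[dict]) -> dict[str, int]:
    """Compute seconds spent on each page from a session's page-view
    events; the last page has no successor, so it is skipped."""
    ordered = sorted(events, key=lambda e: e["ts"])
    durations = {}
    for current, nxt in zip(ordered, ordered[1:]):
        durations[current["page"]] = nxt["ts"] - current["ts"]
    return durations

# One visitor session with timestamps in seconds:
session = [
    {"page": "/home", "ts": 0},
    {"page": "/product/42", "ts": 35},
    {"page": "/checkout", "ts": 95},
]
print(time_on_pages(session))  # {'/home': 35, '/product/42': 60}
```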
The potential of streaming data
Each of the streaming data applications above is a component of modern retail data systems, supporting your business with increased store and stock management efficiency, as well as cost optimization and shrink reduction.
Data streams unlock numerous possibilities when combined with IoT and data engineering. Cloud providers offer ready-to-use services, such as reliable data warehouses, that deliver immediate business value, scale easily, and consequently simplify IT infrastructure management.