In-Stream Processing | @CloudExpo @robinAKAroblimo #BigData #AI #BI #DX

We are still getting sales reports and other information we need to run our business long after the fact

Most of us have moved our web and e-commerce operations to the cloud, but we are still getting sales reports and other information we need to run our business long after the fact. We sell a hamburger on Tuesday, you might say, but don't know whether we made money selling it until Friday. That's because we still rely on batch processing, where we generate orders, reports, and other management-useful pieces of data when it's most convenient for the IT department to process them, rather than in real time. That was fine when horse-drawn wagons made our deliveries, but it is far too slow for today's world, where stock prices and other bits of information circle the world (literally) at the speed of light. It's time to move to In-Stream Processing. You can't - and shouldn't - keep putting it off.

[Figure 1, courtesy of the Grid Dynamics Blog]

This diagram may look complicated at first, but if you trace the lines it will soon become clear. The only thing that might throw some managers for a loop is the "Data Science" box at the bottom. This term may seem intimidating, but in real life it's just a method of deciding what data is most important to extract from the data stream as it flows by, and how it should best be displayed to the business people who are its end users. Some say, "Data science is just a sexed-up term for statistics." Perhaps. But title aside, displaying the dynamic information needed to run a business - and only that information - in real-time is what In-Stream Processing is all about.

Data Nuggets in the Stream
A retailer may do 100,000 POS credit card transactions per day. That's nice to know, and it's nice to have a dashboard-type display that shows them in real-time, and shows upward and downward trends minute by minute so that cashiers can be assigned with maximum efficiency. In-Stream Processing can do this, and with the right output devices, can even trigger a storewide PA announcement that automatically says, "All cashiers to the front, please." Opening more registers as soon as checkout volume starts to trend upwards makes customers happy. Even the announcement makes them happy, because it shows that management cares about them.
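
The "open more registers" trigger described above boils down to a sliding-window count over the transaction stream. Here is a minimal Python sketch of that idea; the class name, window size, and threshold are all hypothetical, and a real system would feed this from an actual event stream rather than a method call:

```python
from collections import deque

class CheckoutMonitor:
    """Sliding-window count of POS transactions; flags when checkout
    volume trends high enough to call more cashiers to the front."""

    def __init__(self, window_seconds=60, threshold=50):
        self.window_seconds = window_seconds  # how far back we look
        self.threshold = threshold            # txns per window before we act
        self.times = deque()                  # timestamps inside the window

    def record(self, timestamp):
        """Record one transaction; return True if it's time to trigger
        the 'all cashiers to the front' announcement."""
        self.times.append(timestamp)
        # Drop transactions that have aged out of the window.
        while self.times and self.times[0] <= timestamp - self.window_seconds:
            self.times.popleft()
        return len(self.times) > self.threshold
```

In practice the return value would drive a dashboard widget or the PA system, and the threshold would be tuned per store and time of day.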

And buried in those 100,000 credit card transactions, there is one that is an attempt to use a stolen card. Obviously, that transaction will be declined. But then what? In the era of batch processing, an automated alert might get sent to management or loss control hours or days after the miscreant was gone. Perhaps the cashier called security, perhaps he didn't. After all, he had other customers in line, and when he told the person with the hot credit card that it was declined, the criminal probably bolted, and the cashier didn't give chase because it was probably against store policy - and he had other things to worry about, anyway.

Now let's fast-forward to tomorrow, when we have In-Stream Processing up and running. The second - literally the second - the hot credit card transaction takes place, both the store's loss control people and management at HQ learn about it. Store security can head for the appropriate POS before the bogus customer is informed that "his" credit card is no good. Indeed, notifying the cashier, and therefore the bogus customer, that the card has been declined can be automatically delayed until store security personnel acknowledge receipt of the alert, which gives them a far greater chance to detain the criminal than they had in the bad old days before they had In-Stream Processing as an anti-theft weapon.
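
The "delay the decline until security acknowledges" logic can be sketched in a few lines. This is an illustration only - the function names (`process_transaction`, `notify_security`) and the acknowledgment mechanism are hypothetical, and a real POS integration would be far more involved:

```python
import threading

def process_transaction(txn, notify_security, hold_seconds=10):
    """Decide what the POS terminal should display for one transaction.

    If the card is flagged as stolen, alert security first and hold the
    'declined' message until security acknowledges (or a timeout passes),
    so the customer isn't tipped off before anyone can respond.
    """
    if not txn.get("stolen_card"):
        return "approved" if txn.get("funds_ok", True) else "declined"

    ack = threading.Event()
    notify_security(txn, ack)       # fire the real-time alert
    ack.wait(timeout=hold_seconds)  # hold the decline until ack or timeout
    return "declined"

# Toy stand-in for the alerting channel: security acks immediately.
def notify_security(txn, ack):
    print(f"ALERT: stolen card at register {txn['register']}")
    ack.set()

result = process_transaction(
    {"register": 5, "stolen_card": True}, notify_security, hold_seconds=1)
```

The timeout matters: if security never responds, the terminal still declines the card rather than stalling the checkout line indefinitely.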

Predicting production failures with In-Stream Processing
Now, let's turn our sights from the POS system to the conveyors in our packing and shipping facility. Whoops! It seems the drive motor for belt number five is suddenly running at 40°C instead of its typical 30-32°C. If we don't learn about this for a day or two, chances are we'll have a burnt-out motor to replace and a packing line down for a number of hours.

With In-Stream Processing, we can see temperature sensor output changes instantly, and even have alerts set to go off if they move more than X degrees outside their normal operating range. We can do the same thing with strain gauges and many other measurement devices. We can even check the number of boxes getting packed on each line, with alerts set for changes beyond our expectations so that a manager can check that line to make sure everything is okay both with the workers and with their equipment.
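
The "more than X degrees outside the normal operating range" rule is simple to express. A minimal sketch, assuming a hypothetical `check_reading` function and made-up sensor names and thresholds:

```python
def check_reading(sensor_id, value, normal_range, margin=2.0):
    """Return an alert string if a reading falls outside its normal
    range by more than the allowed margin; otherwise return None."""
    lo, hi = normal_range
    if value < lo - margin or value > hi + margin:
        return f"ALERT: {sensor_id} at {value}C (normal {lo}-{hi}C)"
    return None

# Belt five's motor normally runs 30-32C; 40C should trip the alert.
readings = [31.0, 31.5, 35.0, 40.0]
alerts = [a for a in (check_reading("belt-5-motor", t, (30.0, 32.0))
                      for t in readings) if a]
```

The same pattern applies to strain gauges or boxes-per-hour counts: pick a normal band, pick a margin, and alert the moment a reading leaves it.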

The sooner a potential problem is detected, the better our chance of solving it before it becomes a major, costly problem. This applies to almost every aspect of our business, including a sudden spate of customer calls about problems with a particular product.

Keeping irritated customers from becoming angry customers
An exciting recent development in call center management - one that couldn't have happened before big data and the cloud made immense data processing power and data storage available at low (and ever-dropping) prices - is Emotion Analysis. It's no great trick for a human to tell whether a caller is happy, inquisitive, upset or downright angry. But this is a recent trick for computers, and it is just now starting to become a practical business application.

The idea is that as soon as the system detects unhappiness beyond a preset level in a customer's voice, the call is automatically diverted to a supervisor or 2nd-level support person. In theory, a first-tier support person should be able to detect that unhappiness and call for help, but as you know from your own experience calling businesses for help, people in the first support tier may not be capable of recognizing unhappiness even if - it often seems - you bluntly say, "I'm unhappy. Please get your supervisor."
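
The routing rule itself is straightforward once you assume an upstream emotion-analysis model has scored each utterance. In this sketch, `route_call` and the 0-to-1 anger scores are hypothetical stand-ins for whatever your emotion-analysis vendor actually emits:

```python
def route_call(call, anger_threshold=0.7):
    """Escalate to a supervisor as soon as any utterance's scored
    unhappiness crosses the preset threshold; otherwise stay in tier 1."""
    for score in call["anger_scores"]:  # one score per utterance, 0..1
        if score >= anger_threshold:
            return "supervisor"
    return "tier-1"
```

In a live system this would run per utterance as the call proceeds, so the hand-off happens mid-call, not after the fact.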

Think what a pleasant surprise it would be if a fresh voice came on the line and said, "This is Ron, in support management. It sounds like you're upset. I don't blame you. Let's see what we can do to solve your problem!" We're not using a chatbot in this scenario - yet. It may not be long until we have good enough AI (or at least pseudo-AI), and good enough voice recognition to replace human customer service workers, but right now nothing beats a knowledgeable employee with the authority to actually make things right for a customer who has gotten a defective product or poor service of some sort.

But In-Stream Processing, running Emotion Analysis, can certainly help us hook our unhappy customer up with the person who can make her problems go away - and do it right now, not a week from Sunday.

New uses for In-Stream Processing are cropping up all the time
Indeed, Emotion Analysis and dozens of other applications that are new or still in development all depend on In-Stream Processing, because they all rely on real-time or near-real-time processing, not processing that happens someday. Program trading is a great example. If you're doing high-frequency stock buying and selling, making hundreds or thousands of trades per minute (or even per second in some cases), you must be able to process data and make decisions - or have a program that makes them - so fast that even the length of your connection to the stock exchange makes a noticeable difference in your profits. That's pretty darn fast.

-------------------------

With a little thought, you can almost certainly come up with at least a few ways In-Stream Processing can benefit your business. If not, throw the idea out to your fellow executives. The chances are 100%, more or less, that within a week they'll think of at least a few ways In-Stream Processing can increase your profits.

More Stories By Robin Miller

Robin “Roblimo” Miller is a long-time IT journalist known for his work on Slashdot, Linux.com, and other sites covering software so new that its edges haven’t even started to bleed. Nowadays, he writes for FOSSForce.com and works as an editorial consultant and blog editor for Grid Dynamics, “the engineering IT services company known for transformative, mission-critical cloud solutions for retail, finance and technology sectors.”
