Analytics has taken the world by storm and is the powerhouse behind the digital transformation happening in every industry.
In the recent Cricket World Cup, every team had its own team of data analysts. They used various technologies, such as cloud platforms and visualizations, to predict scores, player performance, player profiles and more. Around 40 years' worth of Cricket World Cup data is being mined to produce insights that enhance the viewer's experience.
- What would be the most profitable food served at the concession stand?
- What would be the best prices to sell game day tickets?
- Which player on the team is the most productive?
- Which players in the draft will become all-stars, and which ones will be considered role players?
- Understand fans' behavior at the stadium via their app and push relevant information accordingly.
Now is the time to move beyond just collecting, storing and managing data to taking rapid action on continuously streaming data – in real time!
- Digital makes customer self-service easy.
- Digitally engaged customers trust their utilities.
- Customer care, provided through digital technology, offers utilities both cost-to-serve efficiencies and improved customer intimacy.
- Digital technology brings the capability to provide more accurate billing and payment processing, as well as faster response times for changing addresses and bills, removing and adding services, and many other functions.
- Using Mobile as a primary customer engagement channel for tips and alerts
- Predictive maintenance, with outage maps and real-time alerts to service engineers, helps reduce downtime and costs
- Smart meters allow utility organizations to inform their customers about energy consumption and tailor products and services to them, while achieving significant operational efficiencies at the same time
- Minimize maintenance costs - Don’t waste money through over-cautious time bound maintenance. Only repair equipment when repairs are actually needed.
- Reduce unplanned downtime - Implement predictive maintenance to predict future equipment malfunctions and failures, and minimize the chance of unplanned disasters putting your business at risk.
- Root cause analysis - Find the causes of equipment malfunctions and work with suppliers to eliminate the sources of high failure rates. Increase the return on your assets.
- Efficient labor planning — no time wasted replacing/fixing equipment that doesn’t need it
- Avoid warranty costs for failure recovery – for automakers this can mean thousands of recalls, while on an assembly line it means lost production
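As a rough illustration of the predictive-maintenance idea above, the sketch below flags sensor readings that deviate sharply from their recent history, so a repair is triggered only when the data warrants it. The sensor values, window size and threshold are all invented for the example:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate strongly from the recent rolling window.

    A reading is anomalous when it lies more than `threshold` standard
    deviations from the mean of the preceding `window` readings.
    """
    flags = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        # Guard against a flat window (zero variance)
        is_anomaly = sigma > 0 and abs(readings[i] - mu) > threshold * sigma
        flags.append((i, readings[i], is_anomaly))
    return flags

# Vibration readings from a hypothetical pump sensor; the spike at the
# end would trigger an inspection instead of a fixed maintenance schedule.
vibration = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 2.40]
for idx, value, anomalous in flag_anomalies(vibration):
    if anomalous:
        print(f"reading {idx}: {value} -> schedule inspection")
```

Real deployments would use richer models, but the principle is the same: maintain only when the data says so, not on a fixed calendar.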
TrainItalia has invested 50M euros in an Internet of Things project that it expects to cut maintenance costs by up to 130M euros, while increasing train availability and customer satisfaction.
- Identity: name, location, gender, age and other demographic data
- Relationships: their influence, connections, associations with others
- Current activity: orders, complaints, deliveries, returns
- History: contacts, campaigns, processes, cases across all lines of business and channels
- Value: which products or services they are associated with, including history
- Flags: prompts to give context, e.g. churn propensity, up-sell options, fraud risk, mood of last interactions, complaint record, frequency of contact
- Actions: expected, likely or essential steps based on who they are and the fact they are calling now
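One possible way to hold these attribute groups together in code is a single unified record type. The sketch below is illustrative only; all field names, flag names and threshold values are assumptions, not any particular product's schema:

```python
from dataclasses import dataclass, field

@dataclass
class Customer360:
    """A unified customer record mirroring the attribute groups above."""
    identity: dict          # name, location, gender, age, demographics
    relationships: list     # influence, connections, associations
    current_activity: list  # orders, complaints, deliveries, returns
    history: list           # contacts, campaigns, cases across channels
    value: list             # associated products/services, with history
    flags: dict = field(default_factory=dict)    # churn propensity, fraud risk...
    actions: list = field(default_factory=list)  # next best actions

profile = Customer360(
    identity={"name": "A. Kumar", "location": "Pune", "age": 34},
    relationships=["influencer: tech forum"],
    current_activity=["order #1042 in transit"],
    history=["email campaign 2023-Q4"],
    value=["broadband 100Mbps"],
    flags={"churn_propensity": 0.18, "fraud_risk": "low"},
)
# A flag can drive the next best action presented to an agent:
if profile.flags.get("churn_propensity", 0) > 0.1:
    profile.actions.append("offer loyalty discount")
print(profile.actions)  # prints ['offer loyalty discount']
```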
- All customer touch point data in a single repository for fast queries
- Next best actions or recommendations for customers
- All key metrics in a single location for business users to know and advise customers
- Intuitive and customizable dashboards for quick insights
- Real time hyper personalized customer interaction
- Enhanced customer loyalty
Customer 360º helps achieve a Single View of Customer across channels (online, stores, marketplaces), devices (wearables, mobile, tablets, laptops) and interactions (purchases, posts, likes, feedback, service).
- Increased need and desire among businesses to gain greater value from their data
- Over 80% of the data/information that businesses generate and collect is unstructured or semi-structured and needs special treatment
- Typically requires mix of skills - mathematics, statistics, computer science, machine learning and most importantly business knowledge
- They need to employ the R or Python programming languages to clean the data and remove irrelevant records
- Create algorithms to solve the business problems
- Finally, effectively communicate the findings to management
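The clean-then-analyze loop in the bullets above can be sketched in a few lines of Python. The record layout and the "algorithm" (a simple average) are invented stand-ins for whatever a real project would use:

```python
# Hypothetical raw records: some rows are incomplete or unparseable.
raw_records = [
    {"customer": "c1", "spend": "120.5"},
    {"customer": "c2", "spend": None},            # incomplete row
    {"customer": "c3", "spend": "not-a-number"},  # corrupt row
    {"customer": "c4", "spend": "87.0"},
]

def clean(records):
    """Drop rows whose spend cannot be parsed as a number."""
    cleaned = []
    for row in records:
        try:
            cleaned.append({"customer": row["customer"],
                            "spend": float(row["spend"])})
        except (TypeError, ValueError):
            continue
    return cleaned

def summarize(records):
    """A stand-in for the modeling step: average spend per customer."""
    return sum(r["spend"] for r in records) / len(records)

data = clean(raw_records)
print(f"Kept {len(data)} of {len(raw_records)} rows; "
      f"average spend {summarize(data):.2f}")
```

The communication step is then a matter of turning numbers like these into a story management can act on.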
Any company, in any industry, that crunches large volumes of numbers, possesses lots of operational and customer data, or can benefit from social media streams, credit data, consumer research or third-party data sets can benefit from having a data scientist or a data science team.
- Kirk D. Borne of Booz Allen Hamilton
- D. J. Patil, Chief Data Scientist at the White House
- Gregory Piatetsky of KDnuggets
- Vincent Granville of AnalyticBridge
- Jonathan Goldman of LinkedIn
- Ronald Van Loon
One of the main attractions of automated analytics appears to be the perception that it can learn automatically from data without the need to program any rules. Furthermore, it is perceived that the IoT will allow organisations to apply analytics to data generated by any physical asset or business process, and thereafter to use automated analytics to monitor asset performance, detect anomalies and generate problem-resolution / trouble-shooting advice; all without any programming of rules!
In reality, automated analytics is a powerful technology for turning data into actionable insight and knowledge, and thereby represents a key enabling technology for automation in the Industrial IoT. However, automated analytics alone cannot deliver complete solutions, for the following reasons:
i- In order for analytics to learn effectively, it needs data that spans the spectrum of normal, sub-normal and anomalous asset/process behaviour. Such data can become available relatively quickly in scenarios where there are tens or hundreds of thousands of similar assets (central heating boilers, mobile phones, etc.). However, this is not the case for more complex equipment, plants and processes, where the volume of available fault or anomalous-behaviour data is simply not large enough to support effective analytics learning/modelling. As a result, any automated analytics generated will be very restricted in scope and will flag a large number of anomalies for operating conditions that simply do not exist in the data.
ii- By focussing on data analytics alone, we ignore the most important asset of any organisation: the expertise of its people in operating its plants and processes. This expertise covers condition/risk assessment, planning, configuration, diagnostics, trouble-shooting and other skills that involve decision-making tasks. Automating decision making and applying it to streaming real-time IoT data offers huge business benefits, and it is very complementary to automated analytics in that it addresses exactly the areas in point (i) above where data coverage is incomplete but human expertise exists.
Capturing expertise in an automated decision-making system does require the programming of rules and decisions, but that need not be lengthy or cumbersome with a modern rules/decision automation technology such as XpertRule. Decision-making tasks can be represented graphically, so that a subject matter expert can easily author and maintain them without the involvement of a programmer, using easy-to-edit decision flows, decision trees, decision tables and rules. From my experience with this approach, a substantial decision-making task of tens of decision trees can be captured and deployed within a few weeks.
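XpertRule itself is graphical, but as a rough plain-code analogue, an expert-authored fault-diagnosis tree boils down to nested conditions like the sketch below. All symptom names and thresholds are invented for illustration:

```python
def diagnose_mill(symptoms):
    """A hand-authored decision tree, analogous to what a subject matter
    expert might capture graphically. All thresholds are hypothetical."""
    if symptoms["output_particle_size"] > symptoms["target_size"] * 1.1:
        if symptoms["classifier_speed_rpm"] < 1200:
            return "Increase classifier speed"
        if symptoms["screen_wear_pct"] > 60:
            return "Replace worn screen"
        return "Check feed material hardness"
    if symptoms["motor_current_amps"] > 90:
        return "Possible blockage: inspect feed chute"
    return "No fault detected"

print(diagnose_mill({
    "output_particle_size": 130, "target_size": 100,
    "classifier_speed_rpm": 1500, "screen_wear_pct": 75,
    "motor_current_amps": 60,
}))  # -> Replace worn screen
```

The point of a graphical authoring tool is that the expert edits this tree directly, without writing code like the above.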
Given the complementary nature of automated analytics and automated decisions, I would recommend the use of symbolic-learning data analytics techniques. Symbolic analytics generates rule/tree structures from data which are interpretable and understandable to domain experts. Whilst rule/tree analytics models are marginally less accurate than deep learning or other 'black box' models, the transparency of symbolic data models offers a number of advantages:
i- The analytics models can be validated by the domain experts
ii- The domain experts can add additional decision knowledge to the analytics models
iii- The transparency of the data models gives the experts insights into the root causes of problems and highlights opportunities for performance improvement.
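As a toy illustration of symbolic learning, the sketch below induces a single human-readable rule (a one-level "decision stump") from labeled data; real tree induction recurses to build deeper trees, but the transparency is the same. The sensor history and feature names are invented:

```python
def best_stump(rows, features, label="fail"):
    """Induce a one-level decision rule ('stump') that best separates the
    labels - a toy stand-in for symbolic tree induction. Each candidate
    split 'feature > threshold' is scored by classification accuracy."""
    best = None
    for feat in features:
        for threshold in sorted({r[feat] for r in rows}):
            # Predict a failure when the feature exceeds the threshold.
            correct = sum((r[feat] > threshold) == r[label] for r in rows)
            acc = correct / len(rows)
            if best is None or acc > best[0]:
                best = (acc, feat, threshold)
    acc, feat, threshold = best
    return f"IF {feat} > {threshold} THEN fail (accuracy {acc:.0%})"

# Invented sensor history, labeled by whether the asset later failed.
history = [
    {"temp": 70, "vibration": 0.2, "fail": False},
    {"temp": 72, "vibration": 0.3, "fail": False},
    {"temp": 95, "vibration": 0.9, "fail": True},
    {"temp": 98, "vibration": 0.8, "fail": True},
]
print(best_stump(history, ["temp", "vibration"]))
# -> IF temp > 72 THEN fail (accuracy 100%)
```

Unlike a black-box model, the output is a rule a domain expert can read, validate, and extend with their own knowledge.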
Combining automated knowledge from data analytics with automated decisions from domain experts can deliver a paradigm shift in the way organisations use IOT to manage their assets / processes. It allows organisations to deploy their best practice expertise 24/7 real time throughout the organisation and rapidly turn newly acquired data into new and improved knowledge.
Below are examples of decision and analytics knowledge from an industrial IoT solution that we developed for a major manufacturer of powder-processing mills. The solution monitors the performance of the mills to diagnose problems and to detect anomalous behaviour:
The fault diagnosis tree below is part of the knowledge captured from the subject matter experts within the company.
The tree below is generated by automated data analytics and relates the output particle size to other process parameters and environmental variables. The tree is one of many analytics models used to monitor anomalous behaviour of the process.
The above example demonstrates both the complementary nature of rules and analytics automation and the interpretability of symbolic analytics. In my next posting I will cover the subject of the rapid capture of decision making expertise using decision structuring and the induction of decision trees from decision examples provided by subject matter experts.
The original article is published at Forbes: link
Have you heard about the magic pill? Not sure how it works, but it helps you lose 20 pounds in a week while consuming the same calories as before. And you’ve probably also heard about the scary side effects of that pill. The need for magic pills is appearing in the IoT market as well. Thanks to the explosion of sensors to measure everything imaginable within the Internet of Things, enterprises are confronted with a never-ending buffet of tempting data.
Typically data has been consumed like food: first it is grown, harvested, and prepared. Then this enjoyable meal is ingested into a data warehouse and digested through analytics. Finally we extract the nutritional value and put it to work to improve some part of our operations. Enterprises have evolved to consume data from CRM, ERP, and even the Web that is high in signal nutrition in this genteel, managed manner from which they can project trends or derive useful BI.
The IoT and its superabundance of sensors completely changes that paradigm and we need to give serious consideration to our data dietary habits if we want to succeed in this new data food chain. Rather than being served nicely prepared data meals, sensor data is the equivalent of opening your mouth in front of some kind of cartoon food fire hose. Data comes in real-time, completely raw, and in such sustained volume that all you can do is keep stuffing it down.
And, as you would expect, your digestion will be compromised. You won’t benefit from that overload of raw IoT data. In fact, we’ll need to change our internal plumbing, our data pipelines, to get the full nutritional benefit of IoT sensor data.
That will require work, but if you can process the data and extract the value, that’s where the real power comes in. In fact, you can attain something like superpowers. You can have the eyesight of eagles (self-driving cars), the sonar wave perception of dolphins (for detecting objects in the water), and the night vision of owls (for surveillance cameras). If we can digest all this sensor data and use it in creative ways, the potential is enormous. But how can we adapt to handle this sort of data? Doing so demands a new infrastructure with massive storage, real-time ingestion, and multi-genre analytics.
Massive storage. More than five years ago, Stephen Brobst predicted that the volume of sensor data would soon crush the amount of unstructured data generated by social media (remember when that seemed like a lot?). Sensor data demands extreme scalability.
Real-time ingestion. The infrastructure needs to be able to ingest raw data and determine moment by moment where to land it. Some data demands immediate reaction and should move into memory. Other data is needed in the data warehouse for operational reporting and analytics. Still other data will add benefit as part of a greater aggregation using Hadoop. Instant decisions will help parse where cloud resources are appropriate versus other assets.
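The moment-by-moment routing decision described above can be sketched as a simple dispatcher. The tier names and rules are hypothetical stand-ins for an in-memory store, a data warehouse, and a Hadoop cluster:

```python
def route(event):
    """Decide, per event, where the data should land. Thresholds and
    destination names are invented for illustration."""
    if event.get("latency_ms_budget", 1000) < 10:
        return "in-memory"       # demands immediate reaction
    if event.get("kind") == "operational":
        return "data-warehouse"  # operational reporting and analytics
    return "hadoop"              # bulk aggregation over time

events = [
    {"kind": "obstacle", "latency_ms_budget": 2},
    {"kind": "operational", "latency_ms_budget": 500},
    {"kind": "battery-trend", "latency_ms_budget": 60000},
]
for e in events:
    print(e["kind"], "->", route(e))
```

In production this decision would run inside a streaming platform, but the shape of the logic is the same: classify each event as it arrives and land it in the right tier.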
Multi-genre analytics. When you have data that you’ve never seen before, you need to transform data and apply different types of algorithms. Some may require advanced analytics and some may just require a standard deviation. Multi-genre analytics allows you to apply multiple analytics models in various forms so that you can quickly discern the value of the data.
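As a minimal illustration of multi-genre analytics, the sketch below applies two different "genres" to the same series: a standard-deviation screen for anomalies and a crude least-squares trend model. The battery-voltage data is invented:

```python
from statistics import mean, stdev

def zscore_outliers(series, k=2.0):
    """Genre 1: a simple standard-deviation screen."""
    mu, sigma = mean(series), stdev(series)
    return [x for x in series if abs(x - mu) > k * sigma]

def linear_trend(series):
    """Genre 2: least-squares slope per time step (a crude trend model)."""
    n = len(series)
    xs = range(n)
    x_mean, y_mean = mean(xs), mean(series)
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

# Hypothetical battery voltage over time: a slow decline, then a drop.
battery_voltage = [12.6, 12.5, 12.5, 12.4, 12.3, 12.2, 12.1, 9.0]
print("outliers:", zscore_outliers(battery_voltage))
print("trend per step:", round(linear_trend(battery_voltage), 3))
```

The anomaly screen catches the sudden drop, while the trend model captures the slow decline; each genre answers a different question about the same data.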
The self-driving car is a helpful metaphor. I’ve heard estimates that each vehicle has 60,000 sensors generating terabytes of data per hour. Consider the variety of that data. Data for obstacle detection requires millisecond response and must be recognized as such if it is to be useful. A sensor on the battery to predict replacement requires aggregation to predict a trend over time and does not require real-time responsiveness. Nevertheless, both types of data are being created constantly and must be directed appropriately based on the use case.
How does this work at scale? Consider video games. Real-time data is critical to everything from in-game advertising, which depends on near-instant delivery of the right ad at a contextually appropriate moment, to recommendations and game features that are critical to the user experience and highly specific to moments within the game. At the same time, analyzing patterns at scale is critical to understanding and controlling churn and appeal. This is a lot of data to parse on the fly in order to operate effectively.
From a data perspective, we’re going to need a new digestive system if we are to make the most of the data coming in from the IoT. We’ll need vision and creativity as well. It’s an exciting time to be in analytics.