


IIoT protocols for beginners

We all know HTTP (Hypertext Transfer Protocol). These are the first four letters you see in the URL of any website you open in your browser. In simple terms, it is a set of rules that defines the do’s and don’ts of communication between a web browser and a web server. It is like you (the web browser) going to an ATM (the web server) to get some cash (the request). HTTP describes the complete procedure – enter your PIN, the amount, and so on. You get your cash (the result) once you follow the steps. Quite simple.

The World Wide Web (WWW) runs on HTTP; it is essentially the only protocol used there for data transfer. This is not the case in the Industrial IoT (IIoT) world, where we have a range of protocols to choose from depending on the type of application, or so-called “use case”. The most common among them are MQTT, CoAP and, of course, HTTP. Before we discuss them, let us first look at a few networking terms and definitions.


Transport layer protocols (TCP, UDP)

A transport layer protocol, as the name implies, is responsible for transporting a message or information from one computer to another. This transport can be done in two ways:

  1. Connectionless protocol (UDP): This kind of protocol is preferred where speed and efficiency matter more than reliability. Data is sent without establishing or waiting for a connection, which means a segment of data can get lost in transit. A typical example is live video streaming, where a bad connection sometimes results in fragmented video. Imagine bringing a bunch of letters to a postbox and dropping them inside: you have no idea whether they will reach their recipients. That is the connectionless case. Bringing the same letters to the post office and ordering a return receipt for each, thus ensuring delivery, is comparable to a connection-oriented protocol.
  2. Connection-oriented protocol (TCP): Here the protocol ensures that the message is received at the other end without data loss along the way, providing reliable transport. A connection-oriented protocol needs extra overhead (discussed later) compared to a connectionless one, just as it takes extra resources (time, money) to send a registered letter with a return receipt. The socket sketch after this list illustrates both cases.
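As a minimal sketch of the difference, here is how the two transports look from Python's standard socket module; the host and port are placeholders, and a real TCP connect would of course require a listening server on the other end.

```python
import socket

HOST, PORT = "192.168.1.50", 9000  # placeholder address, for illustration only

# Connectionless (UDP): drop the letter in the postbox. No handshake,
# no delivery guarantee, minimal overhead.
udp_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_sock.sendto(b"temperature=21.5", (HOST, PORT))
udp_sock.close()

# Connection-oriented (TCP): order the return receipt. A connection is
# established first, and the stack acknowledges and retransmits as needed.
tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_sock.connect((HOST, PORT))          # the handshake happens here
tcp_sock.sendall(b"temperature=21.5")   # delivery is acknowledged under the hood
tcp_sock.close()
```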

Packet and packet size

A packet contains data (the payload) along with information (the header) such as source, destination and size, just like a DHL parcel contains the items to be shipped along with the address, weight and dimensions. Packet size, in networking, is the amount of data (in bytes) carried over the transport layer protocol.

Overhead

Overhead is the extra information (in bytes) or features attached to a packet to ensure reliable delivery of the data. In other words, it is the bubble wrap around your shipment: not strictly necessary, but it adds an extra layer of safety and reliability to the delivery of your parcel.

The amount of overhead associated with a packet depends on the transport protocol used; UDP has a smaller overhead than TCP.

Bandwidth

Bandwidth is the rate (bits, megabytes or gigabytes per second) at which data transfer takes place. The larger the bandwidth, the more data can be transferred in a given time.

So much for the crash course on networking. Now let us try to understand the IIoT protocols mentioned above using these terms.

Message Queue Telemetry Transport, or simply MQTT, is a lightweight messaging protocol for industrial and mobile applications. It is best suited for applications where network bandwidth and power are limited, for example small sensors, remote locations and machine-to-machine communication. MQTT communicates with a server over TCP and, unlike HTTP, works on a publish-subscribe model (see figure below).

 

Fig. Example of the publish-subscribe model used in MQTT

To understand the concept behind MQTT, one should first understand its underlying architecture, the publish-subscribe model. Here a client publishes a message on a topic (temperature, humidity) to a broker, which in turn forwards the message to every client that has subscribed to that topic.
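As a minimal sketch of that flow, assuming the widely used paho-mqtt Python client (1.x API) and a placeholder broker address and topic:

```python
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"          # placeholder broker address
TOPIC = "plant1/boiler/temperature"    # placeholder topic name

def on_message(client, userdata, msg):
    # Every client subscribed to the topic receives the published reading.
    print(f"{msg.topic}: {msg.payload.decode()}")

subscriber = mqtt.Client()
subscriber.on_message = on_message
subscriber.connect(BROKER, 1883)
subscriber.subscribe(TOPIC)
subscriber.loop_start()                # handle network traffic in the background

publisher = mqtt.Client()
publisher.connect(BROKER, 1883)
publisher.publish(TOPIC, "72.4")       # the broker fans this out to all subscribers
```

Note that publisher and subscriber never talk to each other directly; both only know the broker and the topic name.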

 

The publish-subscribe model used in MQTT offers several advantages over the standard client-server model used in HTTP; multicast, scalability and low power consumption are the top three. These advantages come from the fact that the publish-subscribe model overcomes some of the structural drawbacks of the traditional client-server model: one-to-one communication, tight coupling and sensitivity to faults.

Let’s look at an analogy to understand the difference. Assume that MQTT and HTTP are two publishing companies. MQTT publishes magazines on various topics (sports, politics, cars, etc.) and hands them to a broker, who in turn distributes them to subscribers interested in one or more topics. This way MQTT can cater to many (multicast) subscribers at a given time, so it is scalable. And since it only deals with the broker, whom it contacts once a day, its investment (power consumption) in maintaining the business is low.

 

HTTP, the other publisher, prefers to deal with one customer at a time. It relies heavily on its customer and on its value chain (server to server). This, however, comes at the cost of a relatively high business investment (power consumption), since it has to visit the customer each time for a handshake.

 

MQTT, in contrast to HTTP, is best suited for applications where bandwidth, packet size and power are at a premium. An industrial generator with a battery-powered temperature and humidity sensor cannot afford to establish a fresh connection with the server every time it has to push measured values (events or messages) into the cloud. MQTT is designed precisely for such constraints: the connection is maintained using very little power, and commands and events can be received with as little as 2 bytes of overhead (the extra resources needed for operation).

 

Constrained Application Protocol, or simply CoAP, is a UDP-based protocol that is often described as a lightweight version of HTTP (except that HTTP runs over TCP). It is specifically designed for constrained environments with limited bandwidth and power, where communication has to be fast and ongoing. Unlike HTTP, CoAP supports one-to-many (multicast) communication and is faster than TCP-based protocols, which makes it a good choice for M2M.
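As a minimal sketch, here is what a single CoAP request might look like with the aiocoap Python library (one of several CoAP implementations; the sensor URI is a placeholder):

```python
import asyncio
from aiocoap import Context, Message, GET

async def read_sensor():
    # CoAP runs over UDP, so there is no connection handshake before the request.
    protocol = await Context.create_client_context()
    request = Message(code=GET, uri="coap://sensor.local/temperature")  # placeholder URI
    response = await protocol.request(request).response
    print("Result:", response.code, response.payload.decode())

asyncio.run(read_sensor())
```

The request/response pattern mirrors HTTP (GET, POST, response codes), which is part of what makes the CoAP-to-HTTP mapping at a gateway so straightforward.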

 

It is quite common to see device-to-device (D2D) or device-to-gateway (D2G) communication done over CoAP, while the communication between gateway and cloud is HTTP’s job. This works well because there is a well-defined mapping between the two protocols.

So, if both MQTT and CoAP are good for constrained environments, what makes one better than the other? The answer lies in the underlying transport layer each uses. MQTT is better suited for event-based communication in a constrained environment where data needs to be sent in batches (for instance, temperature and humidity values) at regular intervals over a reliable channel.

CoAP is a better choice for continuous condition monitoring scenarios in a constrained environment. Since it runs over UDP, CoAP offers faster communication among devices, which makes it a better option for M2M/D2D/D2G communication. CoAP is also well suited for web-based IIoT applications where it has to work alongside HTTP. In such a setup, CoAP runs on the sensor side and HTTP runs between the proxy/gateway and the cloud.

What about HTTP? It is the protocol of choice whenever you want to push a big chunk of data from a gateway, industrial modem or computer into the cloud or a web-based application without compromising on security. Regardless of how the data is collected and sent to the gateway (CoAP vs MQTT), when it comes to reliable delivery of large payloads, HTTP takes the front seat. Moreover, HTTP is still the standard protocol for devices that do not support any other protocol.
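As a minimal sketch of that gateway-to-cloud step, assuming the Python requests library and a placeholder endpoint and token:

```python
import requests

API_URL = "https://cloud.example.com/api/v1/readings"   # placeholder endpoint
API_TOKEN = "replace-with-real-token"                    # placeholder credential

batch = [
    {"sensor": "temp-01", "ts": "2017-05-01T10:00:00Z", "value": 71.8},
    {"sensor": "temp-01", "ts": "2017-05-01T10:05:00Z", "value": 72.4},
]

# One TCP + TLS connection carries the whole batch, reliably and securely.
response = requests.post(
    API_URL,
    json=batch,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
response.raise_for_status()  # fail loudly if the cloud side rejected the upload
```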

MQTT, CoAP or HTTP: it is a matter of speed vs. reliability vs. security, whichever suits your use case best.

I hope you enjoyed reading the article and that it helped you to get at least a basic understanding of the major IIoT protocols. Your feedback, comments or suggestions are always welcome.

Read more…

Rise of the Intelligent Revenue Machines

An early theme of digital transformation was the notion of selling services rather than products. A contract with the “thing maker” to circulate cooling fluid throughout my factory rather than a purchase order for me to buy the pumps and filters needed to do it myself, for example. The contract lets me focus on creating products for my customers rather than maintaining the machines making this possible. I don’t want to spend time on the process (pumps and filters), I just need the outcome (properly cooled machines) in the least distracting way possible to my core business of producing goods, medicine, energy, etc. The contract lets you, purveyor of the connected pumps and filters, build a closer relationship with me, streamline your business, and avoid competing in an increasingly commoditized space.

The fundamental shift happening today goes beyond providing guaranteed services rather than just hardware. Ensuring my lights stay on rather than selling me light bulbs solved your commodity hardware problem, but over time service offerings will face similar pressure as your competitors follow your connected product path and undergo digital transformations of their own. Your long term return on investment in IoT depends on more than keeping my lights on and water flowing. The value your IoT system creates for you depends on your IoT system’s ability to generate more business for me. There’s no such thing as a cheaper “good enough” replacement part when it comes to generating new revenue.

In healthcare for example, when your IoT system enables me to perform procedures in 24% less time, my clinics can perform 24% more procedures each day, increasing my revenue by 24% and delivering a 24% better patient experience. That’s what I’m looking for when I’m buying medical equipment. Depending on my corporate agility, the adoption and rollout of your connected machines may be a phased approach, following a progression of business outcomes. Asset Management means knowing the status of each device at all times and controlling them accordingly. This first step helps me see the potential value of incoming data and better understand my current utilization. Workflow Integration is connecting this information with my enterprise systems, which enables Predictive Maintenance and automatically alerts service technicians when a machine shows signs of impending failure. Where everything comes together and bonds me securely to your connected product service is Yield Optimization.

At this point your IoT system is collecting data from machines in my facilities as well as external data like weather and information from my other enterprise systems, correlating this information and uncovering patterns and ways for me to achieve more with less. Your “things” are now more than hardware installed in my facility performing physical tasks. They’re active components in a new System of Intelligence engaged in a loop of continuous learning and improvement.

This is true digital transformation, the creation of business value out of data collected and processed by your IoT solution.

Read more…

To paraphrase Geoffrey Moore, smart “thing makers” are investing in IoT solutions for their customers today in order to generate more revenue for themselves tomorrow. Traditional hardware vendors are being commoditized and replaced whenever a cheaper “good enough” option comes along. To thrive in the long run, your value must be “sticky”, embedded in your customer’s business, providing benefit to their customers as well. The “things” you sell now simply enable your customers to run their basic operations. Whenever a part breaks, customers make a decision to order a new one either from you or a competitor. How differentiated is your equipment from the rest of the market? Your business is constantly at risk.

What we’re seeing as a result are “thing makers” creating smart systems that empower their customers to not just operate, but to *optimize* their operations. These devices still perform their physical functions as before, but also collect and share a stream of data about their status and conditions in the world around them. It’s the data they produce, and the insights your system derives from this data, that enable your organization to offer far more valuable products and services to your customers that are not so easily replaced.

If you know the state of your machines at all times, you can build predictive maintenance and service models enabling guaranteed uptime and automatic replenishment. If your equipment never breaks or runs empty, your customer is unlikely to replace it with a competitor’s version.

If your products provide not just lighting and temperature control but also insights correlating usage patterns with time, weather, and utility data that reduce your customer’s costs, you can sell them this information for a percentage of these savings.

It’s the future. Your connected product system is part of your customer’s operating procedures, continuously generating insights for maximizing productivity. Improved asset utilization, faster turnarounds, synchronized workflows, and more. Smoother operations and reliable performance deliver better experiences for their customers, further expanding your customer’s business, because of your IoT solution. You don’t just sell “things.” You sell outcomes, which is what your customers really wanted in the first place.

That’s pretty smart.

Read more…

There is an ongoing transition from a world where having an internet connection was sufficient, to a world where ubiquitous connectivity is quickly becoming the norm. The ability to gather and transport data at high speeds from anywhere is leading to increased automation, smart-everything (vehicles, homes, appliances – you name it), and a standardization of languages and protocols that make the possibilities nearly endless.

Recently, IEEE and the Eclipse Foundation completed surveys that provide a snapshot of the tools, platforms and solutions being used by engineers and programmers alike to build the Internet of Things. According to Joe McKendrick for RTInsights.com, there were several notable conclusions to be drawn from the results, including the revelation that, of the 713 tech professionals surveyed, nearly 42 percent said their companies currently deploy an IoT solution, and 32 percent said they will be deploying or working with an IoT solution over the next 18 months. Additionally, RT Insights writes:

“In terms of areas of concentration, 42% report they are working with IoT-ready middleware, while 41% are concentrating on home automation solutions. Another 36% are working with industrial automation as part of their IoT efforts. One-third are working on IoT for smart cities, and the same number are building smart energy solutions.”

An interesting note from those conclusions is that 36 percent are working with industrial automation as part of their IoT efforts. Earlier this year, we predicted that Industrial IoT (IIoT) app development would outpace consumer IoT apps, and although this sample size is somewhat limited, it still bodes well for the development of the IIoT sector that is just starting to come into its own.

Among IoT developers, there has been a bit of debate over the programming languages that best suit IoT apps. There are situationally appropriate uses for the main languages, but currently the majority of developers prefer Java and the C language. For developers, being able to build IoT apps that work across platforms is a giant step toward standardization. Specifically in the Industrial IoT, being able to build apps that can function at the Edge to enable smart data collection is becoming an unofficial mandate for any company hoping to transition legacy OT operations into the IT/OT convergence movement taking place across critical industries.

Of course, building apps is a meaningless task if the hardware being deployed can’t host those apps, a finding that was demonstrated by the survey:

Hardware associated with IoT implementations include sensors, used at 87% of sites, along with actuators (51%), gateways and hub devices (50%), and edge node devices (36%).

This Edge functionality and sensor deployment are two pieces driving the adoption of IoT technology across industries that have traditionally relied on data as the main tool for decision making. With smarter hardware, these industries now have the opportunity to improve the efficiency of that decision making – a transformative capability in the industrial realm.

Join FreeWave’s ZumLink IPR Pilot Program!

IIoT App Development with Java, Python and C++ languages

What if you could…

  • Collect, analyze and react to data in real-time at the sensor edge?
  • Reduce BIG DATA that clogs data pipelines?
  • Minimize the cost of expensive PLCs?
  • Control your sensor at the closest touchpoint?

The ZumLink IPR App Server Radio combines 900 MHz wireless telemetry with the ability to program and host 3rd party Apps for intelligent control and automation of remote sensors and devices. To participate in the pilot program, visit: http://www.freewave.com/zumlink-ipr-pilot-program/.

Pilot Program participants:

  • Receive a complimentary hardware/software Dev Kit
  • Get support from FreeWave software engineers
  • Should have App developer’s skills

Let’s discuss:

  • Use cases that would help you or your organization solve a problem
  • Problems you would like to solve
  • Developers that could build this App
Read more…

What features do IIoT and SCADA/HMI have in common, and how do they differ? And what advantages do Internet of Things platforms have over SCADA systems? Find the answers in our new presentation.

Read more…

We've updated our IoT platform presentation to tell you more about the platform and its derived products and solutions.

Pleasant viewing!

                           

Read more…

Our software offers fully functional monitoring solutions for healthcare organizations. Fast deployment, easy integration, and great usability guarantee quick troubleshooting for your healthcare IT teams.

AggreGate IoT Platform enables centralized monitoring and data aggregation for various wearable medical devices and mobile e-health applications. Intelligent Big Data processing algorithms allow detecting negative trends proactively, providing a strong foundation for building customized predictive medicine solutions.

In addition, by choosing AggreGate solutions for your medical infrastructure monitoring, you get all types of industry-specific management.

Find out what AggreGate can offer for your health, medical devices, and facilities in the IoT Solutions for Healthcare and Social Institutions section of our website.

Read more…

Tibbo has announced release 5.4 of the AggreGate IoT Integration Platform.

We've achieved great results in optimizing AggreGate server performance, especially event and value update storage performance. From now on, a single server can process and persistently store up to a hundred thousand events/updates per second, which is almost equal to 10 billion events per day. Such performance figures don't even require any high-end server hardware.

A new chapter has been opened by this release, presenting AggreGate's graphical and textual programming languages inspired by IEC 61131-3 standard, also known as "SoftPLC". Millions of engineers are now able to use AggreGate as a process control logic development environment.

One innovative feature of AggreGate's automation languages is the tight integration of the runtime with Tibbo Project System hardware. Your programmed logic can access and control all Tibbit modules of a Linux-based TPS board/box. The currently available languages are Function Block Diagram (graphical), Structured Text (textual) and Sequential Function Chart (graphical).

Widget capabilities are no longer limited to the standard set of components; they can now be easily extended. The new Widget Component SDK allows you to implement custom visual components in Java and use them in AggreGate widgets. Extend AggreGate's wide component palette with the UI controls best suited to your needs!

We continue to make our user interface clearer and more user-friendly. The next step is lightweight icons, redesigned to match the modern flat paradigm. New color coding helps users navigate the available toolbar actions.

Other major improvements include:

  • Built-in timestamps and quality for data tables.
  • Component connectors that allow you to visually link UI components with each other.
  • Secure and reliable Agent communications. Agent-server communications can now be SSL-protected, and when the amount of transferred data is critical, data compression can be enabled alongside encryption.
  • Granulation, a brand-new, highly customizable data aggregation and consolidation tool. The granulation engine combines datasets into a compact representation that retains all important aspects of the original information, in virtually any form suitable for later processing. This reduces memory and storage consumption and boosts data processing performance.
  • Server remote upgrading. To reduce your company's expenses, remote AggreGate server upgrades are now supported. You can use our Unified Console application to connect to a remote server, upload a server upgrade bundle file and wait for the upgrade to finish. That's it! All operations, including database backup, stopping the server, upgrading and restarting, are performed on the server side automatically.

We are bringing our IT & Network Management solution (AggreGate Network Manager) to a new level by turning it into a full-fledged IT Service Management system. In this release, we introduce several essential instruments for that: a Configuration Management Database (CMDB), a metrics engine and topology-based root-cause analysis tools. Another piece of ITSM functionality, the IP address management module, is now available out of the box.

AggreGate 5.4 includes new device drivers: CoAP, MQTT, IEC 104, DLMS/COSEM, SMI-S.

You can get detailed information on the new 5.4 release, download and try the updated AggreGate IoT Platform on our website: http://aggregate.tibbo.com/news/release-54.html

Read more…

Top 5 Industrial IoT use cases

The industrial IoT has already proven its versatility with deployments going live in a number of enterprises, showing off dozens of different use cases. But a few key uses consistently present themselves within the same trade, and even throughout different industries.

Top 5 industrial IoT use cases

It’s important to note that IoT use cases will likely expand in the next few years. That being said, we have compiled the top five industrial IoT use cases of today:

Predictive maintenance

Keeping assets up and running has the potential to significantly decrease operational expenditures (opex) and save companies millions of dollars. With the use of sensors, cameras and data analytics, managers in a range of industries are able to determine when a piece of equipment will fail before it ever does. These IoT-enabled systems can sense warning signs, use data to create a maintenance timeline and preemptively service equipment before problems occur.

By leveraging streaming data from sensors and devices to quickly assess current conditions, recognize warning signs, deliver alerts and automatically trigger appropriate maintenance processes, IoT turns maintenance into a dynamic, rapid and automated task.

This approach promises cost savings over routine or time-based preventive maintenance, because tasks are performed only when they are needed. The key is to get the right information at the right time: managers know which equipment needs maintenance, maintenance work can be better planned, and systems stay online while workers stay on task. Other potential advantages include longer equipment lifetime, increased plant safety and fewer accidents with a negative environmental impact.
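As a rough sketch of the "recognize warning signs and trigger maintenance" step, here is a deliberately simple rolling-average check in Python; the asset name, window size and threshold are illustrative, not taken from any real system:

```python
from collections import deque
from statistics import mean

WINDOW = 50            # number of recent readings to average (illustrative)
ALERT_THRESHOLD = 7.5  # vibration level in mm/s that signals trouble (illustrative)

recent = deque(maxlen=WINDOW)

def schedule_maintenance(asset_id: str, level: float) -> None:
    # Stand-in for creating a work order in the maintenance system.
    print(f"ALERT: {asset_id} average vibration {level:.1f} mm/s -- schedule service")

def on_reading(value: float) -> None:
    """Feed each streamed vibration reading; raise an alert on a sustained bad trend."""
    recent.append(value)
    if len(recent) == WINDOW and mean(recent) > ALERT_THRESHOLD:
        schedule_maintenance("pump-07", mean(recent))
```

Real deployments use far richer models (trend analysis, machine learning on historical failures), but the flow is the same: stream readings, detect the warning sign, trigger the maintenance process automatically.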

Smart metering

A smart meter is an internet-capable device that measures energy, water or natural gas consumption of a building or home, according to Silicon Labs.

Traditional meters only measure total consumption, whereas smart meters record when and how much of a resource is consumed. Power companies are deploying smart meters to monitor consumer usage and adjust prices according to the time of day and season.
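As a rough illustration of that difference, the sketch below prices the same day of consumption two ways; the hourly readings, tariff window and rates are made-up values, not real utility figures:

```python
# Hourly consumption from a smart meter (kWh), keyed by hour of day (illustrative).
interval_readings = {7: 1.2, 8: 1.5, 13: 0.9, 19: 2.4, 20: 2.1}

PEAK_HOURS = range(17, 21)              # illustrative evening peak window
PEAK_RATE, OFF_PEAK_RATE = 0.30, 0.12   # $/kWh, illustrative tariffs

# A traditional meter only knows the total.
total_kwh = sum(interval_readings.values())

# A smart meter knows when each kWh was used, so it can be priced by time of day.
bill = sum(
    kwh * (PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE)
    for hour, kwh in interval_readings.items()
)
print(f"Total: {total_kwh:.1f} kWh, time-of-use bill: ${bill:.2f}")
```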

Smart metering benefits utilities by improving customer satisfaction with faster interaction, giving consumers more control of their energy usage to save money and reduce carbon emissions. Smart meters also give visibility of power consumption all the way to the meter so utilities can optimize energy distribution and take action to shift demand loads.

According to Sierra Wireless, smart metering helps utilities to:

  • Reduce operating expenses by managing manual operations remotely
  • Improve forecasting and streamline power-consumption
  • Improve customer service through profiling and segmentation
  • Reduce energy theft
  • Simplify micro-generation monitoring and track renewable power

Asset tracking

A study on the maturity of asset efficiency practices from Infosys and the Institute for Industrial Management (FIR) at Aachen University revealed that 85% of manufacturing companies globally are aware of asset efficiency, but only 15% of the surveyed firms have implemented it at a systematic level.


Infosys and other supporting companies including Bosch, GE, IBM, Intel, National Instruments and PTC have launched a testbed with the main goal of collecting asset information efficiently and accurately in real-time and running analytics to allow the firms to make the best decisions.

The goal of asset tracking is to allow an enterprise to easily locate and monitor key assets (e.g. raw materials, final products, and containers) and to optimize logistics, maintain inventory levels, prevent quality issues and detect theft.

One industry that heavily relies on asset tracking is maritime shipping. On a large scale, sensors help track the location of a ship at sea, and on a smaller scale they are able to provide the status and temperature of individual cargo containers. One benefit is real-time metrics on refrigerated containers. These containers must be stored at constant temperatures so that perishable goods remain fresh.

Each refrigerated container needs to be equipped with temperature sensors, a processing unit and a mobile transmitter.

To continue reading, please visit the full article on Industrial IoT & 5G

 

Read more…

Originally Posted and Written by: Michelle Canaan, John Lucker, & Bram Spector

Connectivity is changing the way people engage with their cars, homes, and bodies—and insurers are looking to keep pace. Even at an early stage, IoT technology may reshape the way insurance companies assess, price, and limit risks, with a wide range of potential implications for the industry.

Insurers’ path to growth: Embrace the future

In 1997, Progressive Insurance pioneered the use of the Internet to purchase auto insurance online, in real time.1 In a conservative industry, Progressive’s innovative approach broke several long-established trade-offs, shaking up traditional distribution channels and empowering consumers with price transparency.

This experiment in distribution ended up transforming the industry as a whole. Online sales quickly forced insurers to evolve their customer segmentation capabilities and, eventually, to refine pricing. These modifications propelled growth by allowing insurers to serve previously uninsurable market segments. And as segmentation became table stakes for carriers, a new cottage industry of tools, such as online rate comparison capabilities, emerged to capture customer attention. Insurers fought to maintain their competitive edge through innovation, but widespread transparency in product pricing over time created greater price competition and ultimately led to product commoditization. The tools and techniques that put the insurer in the driver’s seat slowly tipped the balance of power to the customer.

This case study of insurance innovation and its unintended consequences may be a precursor to the next generation of digital connectivity in the industry. Today, the availability of unlimited new sources of data that can be exploited in real time is radically altering how consumers and businesses interact. And the suite of technologies known as the Internet of Things (IoT) is accelerating the experimentation of Progressive and other financial services companies. With the IoT’s exponential growth, the ways in which citizens engage with their cars, homes, and bodies are getting smarter each day, and they expect the businesses they patronize to keep up with this evolution. Insurance, an industry generally recognized for its conservatism, is no exception.

IoT technology may still be in its infancy, but its potential to reshape the way insurers assess, price, and limit risks is already quite promising. Nevertheless, since innovation inevitably generates unintended possibilities and consequences, insurers will need to examine strategies from all angles in the earliest planning stages.

To better understand potential IoT applications in insurance, the Deloitte Center for Financial Services (DCFS), in conjunction with Wikistrat, performed a crowdsourcing simulation to explore the technology’s implications for the future of the financial services industry. Researchers probed participants (13 doctorate holders, 24 cyber and tech experts, 20 finance experts, and 6 entrepreneurs) from 20 countries and asked them to imagine how IoT technology might be applied in a financial services context. The results (figure 1) are not an exhaustive compilation of scenarios already in play or forthcoming but, rather, an illustration of several examples of how these analysts believe the IoT may reshape the industry.2

Figure 1

CONNECTIVITY AND OPPORTUNITY

Even this small sample of possible IoT applications shows how increased connectivity can generate tremendous new opportunities for insurers, beyond personalizing premium rates. Indeed, if harnessed effectively, IoT technology could potentially boost the industry’s traditionally low organic growth rates by creating new types of coverage opportunities. It offers carriers a chance to break free from the product commoditization trend that has left many personal and commercial lines to compete primarily on price rather than coverage differentiation or customer service.

For example, an insurer might use IoT technology to directly augment profitability by transforming the income statement’s loss component. IoT-based data, carefully gathered and analyzed, might help insurers evolve from a defensive posture—spreading risk among policyholders and compensating them for losses—to an offensive posture: helping policyholders prevent losses and insurers avoid claims in the first place. And by avoiding claims, insurers could not only reap the rewards of increased profitability, but also reduce premiums and aim to improve customer retention rates. Several examples, both speculative and real-life, include:

  • Sensors embedded in commercial infrastructure can monitor safety breaches such as smoke, mold, or toxic fumes, allowing for adjustments to the environment to head off or at least mitigate a potentially hazardous event.
  • Wearable sensors could monitor employee movements in high-risk areas and transmit data to employers in real time to warn the wearer of potential danger as well as decrease fraud related to workplace accidents.
  • Smart home sensors could detect moisture in a wall from pipe leakage and alert a homeowner to the issue prior to the pipe bursting. This might save the insurer from a large claim and the homeowner from both considerable inconvenience and losing irreplaceable valuables. The same can be said for placing IoT sensors in business properties and commercial machinery, mitigating property damage and injuries to workers and customers, as well as business interruption losses.
  • Socks and shoes that can alert diabetics early on to potential foot ulcers, odd joint angles, excessive pressure, and how well blood is pumping through capillaries are now entering the market, helping to avoid costly medical and disability claims as well as potentially life-altering amputations.3

Beyond minimizing losses, IoT applications could also potentially help insurers resolve the dilemma with which many have long wrestled: how to improve the customer experience, and therefore loyalty and retention, while still satisfying the unrelenting market demand for lower pricing. Until now, insurers have generally struggled to cultivate strong client relationships, both personal and commercial, given the infrequency of interactions throughout the insurance life cycle from policy sale to renewal—and the fact that most of those interactions entail unpleasant circumstances: either deductible payments or, worse, claims. This dynamic is even more pronounced in the independent agency model, in which the intermediary, not the carrier, usually dominates the relationship with the client.

The emerging technology intrinsic to the IoT that can potentially monitor and measure each insured’s behavioral and property footprint across an array of activities could turn out to be an insurer’s holy grail, as IoT applications can offer tangible benefits for value-conscious consumers while allowing carriers to remain connected to their policyholders’ everyday lives. While currently, people likely want as few associations with their insurers as possible, the IoT can potentially make insurers a desirable point of contact. The IoT’s true staying power will be manifested in the technology’s ability to create value for both the insurer and the policyholder, thereby strengthening their bond. And while the frequency of engagement shifts to the carrier, the independent agency channel will still likely remain relevant through the traditional client touchpoints.

By harnessing continuously streaming “quantified self” data, using advanced sensor connectivity devices, insurers could theoretically capture a vast variety of personal data and use it to analyze a policyholder’s movement, environment, location, health, and psychological and physical state. This could provide innovative opportunities for insurers to better understand, serve, and connect with policyholders—as well as insulate companies against client attrition to lower-priced competitors. Indeed, if an insurer can demonstrate how repurposing data collected for insurance considerations might help a carrier offer valuable ancillary non-insurance services, customers may be more likely to opt in to share further data, more closely binding insurer and customer.

Leveraging IoT technologies may also have the peripheral advantage of resuscitating the industry’s brand, making insurance more enticing to the relatively small pool of skilled professionals needed to put these strategies in play. And such a shift would be welcome, considering that Deloitte’s Talent in Insurance Survey revealed that the tech-savvy Millennial generation generally considers a career in the insurance industry “boring.”4 Such a reputational challenge clearly creates a daunting obstacle for insurance executives and HR professionals, particularly given the dearth of employees with necessary skill sets to successfully enable and systematize IoT strategies, set against a backdrop of intense competition from many other industries. Implementing cutting-edge IoT strategies could boost the “hip factor” that the industry currently lacks.

With change comes challenges

While most stakeholders might see attractive possibilities in the opportunity for behavior monitoring across the insurance ecosystem, inevitable hurdles stand in the way of wholesale adoption. How insurers surmount each potential barrier is central to successful evolution.

For instance, the industry’s historically conservative approach to innovation may impede the speed and flexibility required for carriers to implement enhanced consumer strategies based on IoT technology. Execution may require more nimble data management and data warehousing than currently in place, as engineers will need to design ways to quickly aggregate, analyze, and act upon disparate data streams. To achieve this speed, executives may need to spearhead adjustments to corporate culture grounded in more centralized location of data control. Capabilities to discern which data are truly predictive versus just noise in the system are also critical. Therefore, along with standardized formats for IoT technology,5 insurers may see an increasing need for data scientists to mine, organize, and make sense of mountains of raw information.

Perhaps most importantly, insurers would need to overcome the privacy concerns that could hinder consumers’ willingness to make available the data on which the IoT runs. Further, increased volume, velocity, and variety of data propagate a heightened need for appropriate security oversight and controls.

For insurers, efforts to capitalize on IoT technology may also require patience and long-term investments. Indeed, while bolstering market share, such efforts could put a short-term squeeze on revenues and profitability. To convince wary customers to opt in to monitoring programs, insurers may need to offer discounted pricing, at least at the start, on top of investments to finance infrastructure and staff supporting the new strategic initiative. This has essentially been the entry strategy for auto carriers in the usage-based insurance market, with discounts provided to convince drivers to allow their performance behind the wheel to be monitored, whether by a device installed in their vehicles or an application on their mobile device.

Results from the Wikistrat crowdsourcing simulation reveal several other IoT-related challenges that respondents put forward. (See figure 2.)6

Figure 2

Each scenario implies some measure of material impact to the insurance industry. In fact, together they suggest that the same technology that could potentially help improve loss ratios and strengthen policyholder bonds over the long haul may also make some of the most traditionally lucrative insurance lines obsolete.

For example, if embedding sensors in cars and homes to prevent hazardous incidents increasingly becomes the norm, and these sensors are perfected to the point where accidents are drastically reduced, this development may minimize or eliminate the need for personal auto and home liability coverage, given the lower frequency and severity of losses that result from such monitoring. Insurers need to stay ahead of this, perhaps even eventually shifting books of business from personal to product liability as claims evolve from human error to product failure.

Examining the IoT through an insurance lens

Analyzing the intrinsic value of adopting an IoT strategy is fundamental in the development of a business plan, as executives must carefully consider each of the various dimensions to assess the potential value and imminent challenges associated with every stage of operationalization. Using Deloitte’s Information Value Loop can help capture the stages (create, communicate, aggregate, analyze, act) through which information passes in order to create value.7

The value loop framework is designed to evaluate the components of IoT implementation as well as potential bottlenecks in the process, by capturing the series and sequence of activities by which organizations create value from information (figure 3).

Figure 3

To complete the loop and create value, information passes through the value loop’s stages, each enabled by specific technologies. An act is monitored by a sensor that creates information. That information passes through a network so that it can be communicated, and standards—be they technical, legal, regulatory, or social—allow that information to be aggregated across time and space. Augmented intelligence is a generic term meant to capture all manner of analytical support, collectively used to analyze information. The loop is completed via augmented behavior technologies that either enable automated, autonomous action or shape human decisions in a manner leading to improved action.8

For a look at the value loop through an insurance lens, we will examine an IoT capability already at play in the industry: automobile telematics. By working through the stages of the framework, we can scrutinize how monitoring driving behavior is poised to eventually transform the auto insurance market with a vast infusion of value to both consumers and insurers.

Auto insurance and the value loop

Telematic sensors in the vehicle monitor an individual’s driving to create personalized data collection. The connected car, via in-vehicle telecommunication sensors, has been available in some form for over a decade.9 The key value for insurers is that sensors can closely monitor individual driving behavior, which directly corresponds to risk, for more accuracy in underwriting and pricing.

Originally, sensor manufacturers made devices available to install on vehicles; today, some carmakers are already integrating sensors into showroom models, available to drivers—and, potentially, their insurers—via smartphone apps. The sensors collect data (figure 4) which, if properly analyzed, might more accurately predict the unique level of risk associated with a specific individual’s driving and behavior. Once the data is created, an IoT-based system could quantify and transform it into “personalized” pricing.

Figure 4

Sensors’ increasing availability, affordability, and ease of use break what could potentially be a bottleneck at this stage of the Information Value Loop for other IoT capabilities in their early stages.

IoT technology aggregates and communicates information to the carrier to be evaluated. To identify potential correlations and create predictive models that produce reliable underwriting and pricing decisions, auto insurers need massive volumes of statistically and actuarially credible telematics data.

In the hierarchy of auto telematics monitoring, large insurers currently lead the pack when it comes to usage-based insurance market share, given the amount of data they have already accumulated or might potentially amass through their substantial client bases. In contrast, small and midsized insurers—with less comprehensive proprietary sources—will likely need more time to collect sufficient data on their own.

To break this bottleneck, smaller players could pool their telematics data with peers either independently or through a third-party vendor to create and share the broad insights necessary to allow a more level playing field throughout the industry.

Insurers analyze data and use it to encourage drivers to act by improving driver behavior/loss costs. By analyzing the collected data, insurers can now replace or augment proxy variables (age, car type, driving violations, education, gender, and credit score) correlated with the likelihood of having a loss with those factors directly contributing to the probability of loss for an individual driver (braking, acceleration, cornering, and average speed, as figure 4 shows). This is an inherently more equitable method to structure premiums: Rather than paying for something that might be true about a risk, a customer pays for what is true based on his own driving performance.

But even armed with all the data necessary to improve underwriting for “personalized” pricing, insurers need a way to convince millions of reluctant customers to opt in. To date, insurers have used the incentive of potential premium discounts to engage consumers in auto telematics monitoring.10 However, this model is not necessarily attractive enough to convince the majority of drivers to relinquish a measure of privacy and agree to usage-based insurance. It is also unsustainable for insurers that will eventually have to charge rates actually based on risk assessment rather than marketing initiatives.

Substantiating the point about consumer adoption is a recent survey by the Deloitte Center for Financial Services of 2,193 respondents representing a wide variety of demographic groups, aiming to understand consumer interest in mobile technology in financial services delivery, including the use of auto telematics monitoring. The survey identified three distinct groups among respondents when asked whether they would agree to allow an insurer to track their driving experience, if it meant they would be eligible for premium discounts based on their performance (figure 5).11 While one-quarter of respondents were amenable to being monitored, just as many said they would require a substantial discount to make it worth their while (figure 5), and nearly half would not consent.

Figure 5

While the Deloitte survey was prospective (asking how many respondents would be willing to have their driving monitored telematically), actual recruits have been proven to be difficult to bring on board. Indeed, a 2015 Lexis-Nexis study on the consumer market for telematics showed that usage-based insurance enrollment has remained at only 5 percent of households from 2014 to 2015 (figure 6).12

Figure 6

Both of these survey results suggest that premium discounts alone have not and likely will not induce many consumers to opt in to telematics monitoring going forward, and would likely be an unsustainable model for insurers to pursue. The good news: Research suggests that, while protective of their personal information, most consumers are willing to trade access to that data for valuable services from a reputable brand.13 Therefore, insurers will likely have to differentiate their telematics-based product offerings beyond any initial early-adopter premium savings by offering value-added services to encourage uptake, as well as to protect market share from other players moving into the telematics space.

In other words, insurers—by offering mutually beneficial, ongoing value-added services—can use IoT-based data to become an integral daily influence for connected policyholders. Companies can incentivize consumers to opt in by offering real-time, behavior-related services, such as individualized marketing and advertising, travel recommendations based on location, alerts about potentially hazardous road conditions or traffic, and even diagnostics and alerts about a vehicle’s potential issues (figure 7).14 More broadly, insurers could aim to serve as trusted advisers to help drivers realize the benefits of tomorrow’s connected car.15

Many IoT applications offer real value to both insurers and policyholders: Consider GPS-enabled geo-fencing, which can monitor and send alerts about driving behavior of teens or elderly parents. For example, Ford’s MyKey technology includes tools such as letting parents limit top speeds, mute the radio until seat belts are buckled, and keep the radio at a certain volume while the vehicle is moving.16 Other customers may be attracted to “green” monitoring, in which they receive feedback on how environmentally friendly their driving behavior is.

Insurers can also look to offer IoT-related services exclusive of risk transfer—for example, co-marketing location-based services with other providers, such as roadside assistance, auto repairs, and car washes may strengthen loyalty to a carrier. They can also include various nonvehicle-related service options such as alerts about nearby restaurants and shopping, perhaps in conjunction with points earned by good driving behavior in loyalty programs or through gamification, which could be redeemed at participating vendors. Indeed, consumers may be reluctant to switch carriers based solely on pricing, knowing they would be abandoning accumulated loyalty points as well as a host of personalized apps and settings.

For all types of insurance—not just auto—the objective is for insurers to identify the expectations that different types of policyholders may have, and then adapt those insights into practical applications through customized telematic monitoring to elevate the customer experience.

Telematics monitoring has demonstrated benefits even beyond better customer experience for policyholders. Insurers can use telematics tools to expose an individual’s risky driving behavior and encourage adjustments. Indeed, people being monitored by behavior sensors will likely improve their driving habits and reduce crash rates—a result to everyone’s benefit. This “nudge effect” indicates that the motivation to change driving behavior is likely linked to the actual surveillance facilitated by IoT technology.

The power of peer pressure is another galvanizing influence that can provoke beneficial consumer behavior. Take fitness wearables, which incentivize individuals to do as much or more exercise than the peers with whom they compete.17 In fact, research done in several industries points to an individual’s tendency to be influenced by peer behavior above most other factors. For example, researchers asked four separate groups of utility consumers to cut energy consumption: one for the good of the planet, a second for the well-being of future generations, a third for financial savings, and a fourth because their neighbors were doing it. The only group that elicited any drop in consumption (at 10 percent) was the fourth—the peer comparison group.18

Insurers equipped with not only specific policyholder information but aggregated data that puts a user’s experience in a community context have a real opportunity to influence customer behavior. Since people generally resist violating social norms, if a trusted adviser offers data that compares customer behavior to “the ideal driver”—or, better, to a group of friends, family, colleagues, or peers—they will, one hopes, adapt to safer habits.

Figure 7

The future ain’t what it used to be—what should insurers do?

After decades of adherence to traditional business models, the insurance industry, pushed and guided by connected technology, is taking a road less traveled. Analysts expect some 38.5 billion IoT devices to be deployed globally by 2020, nearly three times as many as today,19 and insurers will no doubt install their fair share of sensors, data banks, and apps. In an otherwise static operating environment, IoT applications present insurers with an opportunity to benefit from technology that aims to improve profits, enable growth, strengthen the consumer experience, build new market relevance, and avoid disruption from more forward-looking traditional and nontraditional competitors.

Incorporating IoT technology into insurer business models will entail transformation to elicit the benefits offered by each strategy.

  • Carriers must confront the barriers associated with conflicting standards—data must be harvested and harnessed in a way that makes the information valid and able to generate valuable insights. This could include making in-house legacy systems more modernized and flexible, building or buying new systems, or collaborating with third-party sources to develop more standardized technology for harmonious connectivity.
  • Corporate culture will need a facelift—or, likely, something more dramatic—to overcome longstanding conventions on how information is managed and consumed across the organization. In line with industry practices around broader data management initiatives,20 successfully implementing IoT technology will require supportive “tone at the top,” change management initiatives, and enterprisewide training.
  • With premium savings already proving insufficient to entice most customers to allow insurers access to their personal usage data, companies will need to strategize how to convince or incentivize customers to opt in—after all, without that data, IoT applications are of limited use. To promote IoT-aided connectivity, insurers should look to market value-added services, loyalty points, and rewards for reducing risk. Insurers need to design these services in conjunction with their insurance offerings, to ensure that both make best use of the data being collected.
  • Insurers will need to carefully consider how an interconnected world might shift products from focusing on cleaning up after disruptions to forestalling those disruptions before they happen. IoT technology will likely upend certain lines of businesses, potentially even making some obsolete. Therefore, companies must consider how to heighten flexibility in their models, systems, and culture to counterbalance changing insurance needs related to greater connectivity.
  • IoT connectivity may also potentially level the playing field among insurers. Since a number of the broad capabilities that technology is introducing do not necessarily require large data sets to participate (such as measuring whether containers in a refrigerated truck are at optimal temperatures to prevent spoilage21 or whether soil has the right mix of nutrients for a particular crop22), small to midsized players or even new entrants may be able to seize competitive advantages from currently dominant players.
  • And finally, to test the efficacy of each IoT-related strategy prior to implementation, a framework such as the Information Value Loop may become an invaluable tool, helping forge a path forward and identify potential bottlenecks or barriers that may need to be resolved to get the greatest value out of investments in connectivity.

The bottom line: IoT is here to stay, and insurers need look beyond business as usual to remain competitive.

The IoT is here to stay, the rate of change is unlikely to slow anytime soon, and the conservative insurance industry is hardly impervious to connectivity-fueled disruption—both positive and negative. The bottom line: Insurers need to look beyond business as usual. In the long term, no company can afford to engage in premium price wars over commoditized products. A business model informed by IoT applications might emphasize differentiating offerings, strengthening customer bonds, energizing the industry brand, and curtailing risk either at or prior to its initiation.

IoT-related disruptors should also be considered through a long-term lens, and responses will likely need to be forward-looking and flexible to incorporate the increasingly connected, constantly evolving environment. With global connectivity reaching a fever pitch amid increasing rates of consumer uptake, embedding these neoteric schemes into the insurance industry’s DNA is no longer a matter of if but, rather, of when and how.

You can view the original post in its entirety Here

Read more…

What options do you have for remotely monitoring Water and Fluids with Industrial IoT sensor telemetry?

IIoT, or the Industrial Internet of Things, is everywhere. It spans all industries, from high-tech transport to natural resources and government. IIoT software and hardware are deployed for numerous, varying applications, and it is critical to understand just what the customer needs, especially since the customer can’t always articulate exactly what the remote monitoring and sensor telemetry should do. According to a study performed by Verizon, worldwide Internet of Things market spend will grow from $591.7 billion in 2014 to $1.3 trillion in 2019. That’s tremendous.

One of the areas that we’ve seen recent growth is water and fluid monitoring. Water comes to us as a life sustaining asset and also as a force of destruction. The utility of water needs to be measured and monitored in order to effectively and efficiently use our greatest natural resource. Similarly, monitoring the destructive force of water can be just as important. Let’s talk about the different ways that you can measure and monitor water!

 

Flow Meters

Flow meters calculate the amount of water that flows through them. They are everywhere, from your house to your office to anywhere water is used. Measuring water flow is a need recognized across industries, from agriculture to commercial, pharmaceuticals, and oil and gas. Flow meters in an IIoT solution provide not only a total flow amount but also real-time data you can use to predict and adjust consumption. Further still, real-time analysis allows immediate recognition of catastrophic events such as a burst pipe. The analysis can be extended to establish predictive failure behavior and potentially prevent massive water loss events like the ones that happened in Los Angeles and the Hollywood Hills.
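As a rough sketch of how that looks in software, assuming a pulse-output flow meter with a known pulses-per-litre factor (all values here are illustrative, not from any particular meter):

```python
PULSES_PER_LITRE = 450.0     # meter K-factor, normally from the datasheet (illustrative)
BURST_THRESHOLD_LPM = 120.0  # flow rate suggesting a burst pipe (illustrative)

total_litres = 0.0

def on_pulse_count(pulses: int, interval_seconds: float) -> None:
    """Convert pulses counted over a reporting interval into flow rate and running total."""
    global total_litres
    litres = pulses / PULSES_PER_LITRE
    total_litres += litres
    flow_lpm = litres * 60.0 / interval_seconds
    if flow_lpm > BURST_THRESHOLD_LPM:
        print(f"ALERT: flow of {flow_lpm:.0f} L/min exceeds burst threshold")
    print(f"Flow: {flow_lpm:.1f} L/min, total consumption: {total_litres:.1f} L")
```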

 

Water Detection

Almost certainly this one is all about protecting assets. There are essentially four methods we have used to detect the presence, quantity, volume, and level of water. Each fits a particular purpose quite well, and they also complement each other nicely!

 

Presence of Water: The Rope Sensor

Rope sensors are great and they come in a variety of lengths. A rope sensor will tell you if you have water present at any point along the sensor. Imagine a large trailer with rope sensors running along the bottom of the trailer. If you have a spill in that trailer, truck, or vehicle and any fluid reaches the rope sensor, then you’ll receive an alert and immediately know there’s a problem.

Rope sensors are also great for flood detection. Because you can purchase these sensors in practically any length, you can lay them across a flood channel. If any portion of that rope sensor gets wet then you know you have water present. However, in terms of flood detection rope sensors will tell you if there is water, but they won’t tell you how much.

 

Presence of Water: Yes or No

If your rope sensor went off on a flood channel you might want to know how much water is flowing through. Depending on the lay of the land there are a number of different applications that we use to provide this information.

 

Ultrasonic, Ultrasound, Pulse, and Radar Sensors

If you have a fixed structure next to or going over a flood channel, then a great solution is an ultrasonic sensor. Essentially, once the sensor is fixed in place, it continuously pings the ground below. When the measured distance between the sensor and the surface shortens, you can use that distance to determine how much water is flowing through the channel and what the flood level is. Also note that radar and ultrasonic fluid level sensors are quite useful for remotely monitoring the levels and volumes of liquids in assets like tanks!
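As a minimal sketch of the arithmetic involved, the water level is simply the sensor's fixed mounting height minus the distance it reports to the surface; the mounting height and channel width below are illustrative:

```python
MOUNT_HEIGHT_M = 4.0    # sensor height above the dry channel bed (illustrative)
CHANNEL_WIDTH_M = 6.0   # width of a simple rectangular channel (illustrative)

def water_level(measured_distance_m: float) -> float:
    """Depth of water: fixed mounting height minus the distance the sensor reports."""
    return max(0.0, MOUNT_HEIGHT_M - measured_distance_m)

def flow_cross_section(measured_distance_m: float) -> float:
    """Wetted cross-sectional area for a rectangular channel, in square metres."""
    return water_level(measured_distance_m) * CHANNEL_WIDTH_M

print(water_level(2.8))   # sensor reads 2.8 m to the surface -> 1.2 m of water
```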

 

Pressure Transducers

Another way we have measured the quantity of water is with a pressure transducer. A sensor with a membrane sits at the bottom of a water well, lake, reservoir, or flood channel. As the water rises above the sensor, so does the pressure on the sensor's membrane. The higher the pressure, the more water there is above the sensor!
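The relationship behind this is hydrostatic pressure: depth equals gauge pressure divided by water density times gravity. A minimal sketch, assuming fresh-water density:

```python
WATER_DENSITY = 1000.0  # kg/m^3, fresh water
GRAVITY = 9.81          # m/s^2

def depth_from_pressure(gauge_pressure_kpa: float) -> float:
    """Height of the water column above the transducer's membrane, in metres."""
    return (gauge_pressure_kpa * 1000.0) / (WATER_DENSITY * GRAVITY)

print(round(depth_from_pressure(24.5), 2))  # roughly 2.5 m of water above the sensor
```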

 

Making things Digital

Water metering and water detection are now full IIoT solutions. All of these meters and sensors connect to sensor-hub hardware that sends data over the internet into a cloud data analysis solution. Whether you're monitoring agriculture/viticulture, oil/gas/mining, municipal water treatment facilities or other water plants, nowadays you can obtain a cost-effective, rapidly deployable monitoring solution.

 

 

Read more…
