Once a shepherd had two horses. One was strong and fast, the other slow and weak. Instead of looking after the weak one and trying to make him fit, the shepherd preferred riding the healthy one and taking him almost everywhere. He also fed him better, which in the long term made the healthy horse heavy and lethargic. Eventually the healthy horse fell sick too, and instead of having one healthy horse, the shepherd had to deal with two sick and weak horses.
What does this story have to do with digitalization? Well, this is our approach to the whole idea of digitalization and its implementation worldwide. Instead of investing time and resources to bring digitalization to the underdeveloped nations, the “weak horse”, we are feeding the already “healthy horse”, the developed nations, with fascinating ideas and projects, which might result in sickness in the long term. The early symptoms of this sickness can already be witnessed. A World Economic Forum report (ref: weforum.org) says that labour markets will witness a net loss of over 5 million jobs in 15 major developed and emerging economies. Is this in line with the idea of digitalization?
The world witnessed more than 4,000 ransomware attacks per day in 2016 (ref: justice.gov), approximately 300% more than in 2015. This is because we have made the whole IT infrastructure more vulnerable to such attacks by simply connecting every electronic device to the internet. Are we simply neglecting the side effects of digitalization?
Not being able to read your power meter values on your smartphone is not a problem compared to not having access to water and electricity due to scarcity. Not being able to sit in a self-driving car is not a problem compared to being stuck in a traffic jam in peak summer for hours, breathing exhaust gases all the time. These are some of the typical problems of underdeveloped or developing nations. The question which the gurus of digitalization should address is this: if the idea of digitalization/IoT for industry/IT giants is about creating business and thus pushing up turnover, then why not boost digitalization across such nations by diverting resources there? This is far better than making one’s own factories intelligent and then laying off hundreds or thousands of hardworking and loyal employees.
Here are some of the typical problems of underdeveloped or emerging nations, to which well-implemented digitalization could be the answer.
Corruption: According to an IMF report, more than $1 trillion is paid in bribes each year around the world, with underdeveloped and developing countries topping the list of the most corrupt nations. People in these nations pay up to 13 percent of their income in bribes, which discourages them from using services made available by the government. Corruption is the root cause of crime in many countries and acts as a fuel to poverty and social inequality. Institutions worldwide are trying their best to strictly monitor and thus eliminate corruption, unfortunately without much success so far. This, however, might change in the future. Experts nowadays are betting on blockchain to fight corruption. Blockchain is a decentralized technology which offers full transaction transparency, thus leaving no room for fraud or capital manipulation. Blockchain implementation, however, demands a solid digital infrastructure, which in my opinion is an area that IT and communication network providers should look into.
Image courtesy: Wikicommons
Commodity wastage or theft: Water and electricity are two important needs of every society. Their scarcity or theft leads to a major human rights problem. The figures about water scarcity worldwide are alarming, with some 780 million people (ref: water.org) having no access to clean and safe water. One of the major reasons for water scarcity is wastage or theft in emerging/underdeveloped nations. Electricity theft worldwide touched $89 billion annually in 2015 (ref: Northeast Group LLC), with India, Brazil and Russia being the top three nations with the highest losses. With the introduction of smart water and electricity meters with built-in sensors, certain startups are trying to monitor the overall water and electricity supply and consumption. Based on this, a customer profile can be generated so that any irregularities can be immediately reported to the consumer as well as the respective authorities. This again needs support from government and industry, without which it will take ages to tackle the mentioned problem.
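To make the idea of profile-based irregularity detection concrete, here is a minimal sketch of the kind of rule such a smart meter backend might run. The threshold rule and the sample readings are hypothetical, purely for illustration:

```python
from statistics import mean, stdev

def flag_irregularities(daily_kwh, z=2.0):
    """Flag days whose consumption deviates sharply from the
    customer's own usage profile (hypothetical z-score rule)."""
    mu, sigma = mean(daily_kwh), stdev(daily_kwh)
    return [i for i, v in enumerate(daily_kwh) if abs(v - mu) > z * sigma]

# A week of readings with one suspicious spike (possible leak or theft):
readings = [10, 11, 10, 12, 11, 10, 55]
print(flag_irregularities(readings))  # [6] -> the last day is flagged for review
```

In a real deployment the profile would of course be built over months of data and account for seasonality, but the principle, comparing each reading against the customer's own baseline, stays the same.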
Image courtesy: Wikicommons
Landfills: This seems like the never-ending problem of nations with poor or insufficient infrastructure. People in some of these nations spend their lives surrounded by heaps of waste or landfills. This is because there exists no proper waste management plan, due to a lack of manpower or resources. This bottleneck can, however, be eliminated by daily tracking and monitoring of locations with landfills (e.g. via webcams) and adjusting the waste management plan accordingly. Here, for instance, resources can be diverted to a location with a frequent build-up of waste. This, however, demands a strong digital infrastructure, which can only be established if government and industry work together.
Image courtesy: Wikicommons
Street crimes: Street crimes are on the rise in nations with higher social inequality. Authorities in these nations feel helpless given the degree and frequency of crime happening every day versus the available manpower. Interestingly, the biggest problem is that many of these crimes go unreported, since people in these nations have lost their faith in government, authorities and police. The legal structure in these countries needs a face-lift, which can be achieved by digitalizing the complete process of monitoring, documenting and prosecuting crime. The street-light camera or public surveillance camera project in the US is a good example of crime monitoring here. The public surveillance cameras installed in the Baltimore and Chicago regions (ref: urban.org) not only resulted in reduced crime but also proved to be more cost-effective than the conventional approach. A cloud-based complaint lodging system can be established, allowing a verified victim to lodge a complaint straight from a smartphone. A digital platform managing all these complaints, based on the degree or severity of the crime as well as the date of occurrence, can then be created.
Healthcare: Proper healthcare is still considered a luxury in many underdeveloped/developing nations. Approximately 80 percent of people in these nations (ref: factsanddetails.com) rely on public hospitals for treatment. These hospitals often run over capacity and are ill-equipped. A digital healthcare platform which integrates the existing databases of all the hospitals in a region, along with a list of their respective treatment capabilities and services, could ensure an even distribution of the patient load across these hospitals, allowing treatment for each and every needy individual without delay. A healthcare app coupled with the platform could let a patient see which nearby hospital has an available bed and a doctor, and provide him/her with an option of online booking.
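The matching logic behind such a platform can be surprisingly simple. Here is a hypothetical sketch (the hospital names and numbers are invented) of how the app could pick a hospital for a patient:

```python
# Route a patient to the nearest hospital that still has both
# a free bed and an available doctor.
hospitals = [
    {"name": "City General",    "free_beds": 0, "doctors_free": 3, "km": 2.1},
    {"name": "Eastside Clinic", "free_beds": 4, "doctors_free": 1, "km": 5.7},
    {"name": "North Hospital",  "free_beds": 2, "doctors_free": 0, "km": 3.0},
]

def assign(hospitals):
    # Filter out hospitals that cannot take the patient right now...
    available = [h for h in hospitals if h["free_beds"] > 0 and h["doctors_free"] > 0]
    # ...then pick the closest of the rest.
    return min(available, key=lambda h: h["km"])["name"] if available else None

print(assign(hospitals))  # 'Eastside Clinic' (closest with bed AND doctor)
```

The hard part in practice is not this query but keeping the bed and staffing data of every hospital current, which is exactly why the integrated database matters.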
Uncontrolled traffic: Interestingly, an ongoing problem of developing countries. With four-wheelers getting cheaper and infrastructure narrower day by day, the traffic situation in these countries is on the verge of a breakdown. Traffic jams and road accidents are on the rise, with pollution levels reaching new peaks due to the increased number of vehicles on the road. Equipping traffic lights with infrared sensors or webcams can help the authorities divert traffic in case of congestion. Moreover, long-term monitoring (analytics) can be beneficial in planning infrastructural changes and road build-outs in regions where traffic jams are frequent. Car sharing/renting/hiring apps should be promoted. By combining the complete transportation system with consumer profiles, one can identify the user segment preferring public transportation over private. This segment can then be rewarded in the form of discounted bus/train/metro tickets or an annual grant.
Image courtesy: Wikicommons
The list of problems of these nations is long, but properly implemented digitalization, along with synergy between government and industry, could be the answer to all of them. We should not repeat the mistake of the shepherd; we should help the weak horse become as fit as the healthy one. Only in this way can we achieve social and economic balance across the world.
Connected devices are becoming essential components for enterprises, as they drive significant connectivity and integration between systems and data. The increasing number of devices connected to each other generates a huge amount of data.
However, when it comes to leveraging the full potential of these connected devices and data, it is necessary to have a scalable and robust environment which allows faster processing of data between systems.
The fundamental concern is how to efficiently manage this data, as any data loss or delay in processing data from a connected ecosystem can cause critical damage to an enterprise’s workflow.
Role of IoT gateway edge analytics in data processing & management
The IoT gateway is the key to any IoT deployment. It is a bridge between IoT devices and the cloud that enables remote control of the devices and machines. The increasing number of devices propels the requirement for IoT gateways to solve data management issues with edge analytics.
Edge analytics with an IoT gateway allows data to be processed before it is transmitted to the cloud. The gateway collects all the data from the connected devices, executes the necessary algorithms or rule engine on it, and sends actionable commands back to the connected devices. This allows responses in real time and also enables a self-healing mechanism during faults/errors.
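The gateway-side loop described above can be sketched in a few lines. This is a minimal, hypothetical rule engine (the device names, threshold and "shutdown" command are illustrative, not any vendor's API): readings are evaluated locally, actionable commands go straight back to the devices, and only a summary travels to the cloud:

```python
# Hypothetical gateway-side rule engine: evaluate readings locally,
# command devices directly, and forward only a summary to the cloud.
def run_rules(readings, threshold=80.0):
    commands, values = [], []
    for device_id, value in readings:
        if value > threshold:
            # Actionable command sent straight back to the device
            # without a cloud round trip (the self-healing path).
            commands.append((device_id, "shutdown"))
        values.append(value)
    # One summarised batch replaces per-reading cloud uploads.
    summary = {"count": len(values), "max": max(values)}
    return commands, summary

commands, summary = run_rules([("pump-1", 72.0), ("pump-2", 91.5)])
print(commands)  # [('pump-2', 'shutdown')]
print(summary)   # {'count': 2, 'max': 91.5}
```

The point of the sketch is the split: the latency-critical decision happens at the edge, while the cloud receives only the aggregate it needs for long-term analytics.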
In large enterprises with operations spread across multiple geographies, there are a huge number of connected devices and a correspondingly huge amount of generated data. This heterogeneous data, distributed at different levels (devices and machines), suffers high latency when transferred to the cloud due to uncontrolled data flow. Here, distributed edge analytics is the solution, as it allows faster data transfer and processing, resulting in reduced latency.
AWS Greengrass is a good example of an edge analytics setup. It allows enterprises to run local compute, messaging, data caching, sync, and ML inference capabilities for connected devices in a secure way. Greengrass ensures a quick response from IoT devices to local events, which reduces the cost of transmitting IoT data to the cloud.
How distributed edge analytics works in larger geographical areas
Let’s take the example of smart grids to understand the concept in detail.
Smart grids are combinations of smart meters, smart appliances, renewable energy resources, energy-efficient resources, and substations. In a particular city area, the number of smart meters is equivalent to the number of households in that area. This AMI (Advanced Metering Infrastructure) continuously collects energy consumption data and routes it to the IoT gateways. The gateway performs edge analytics, and the processed data is then rerouted to the cloud by the gateway.
As the number of AMI is high in a particular area, the number of gateways will be proportionately higher.
Merits of distributed edge analytics:
- Reduced data transfer latency
- Fast access to the faulty areas
- Quick functional recovery and self-healing capabilities that bring resilience to the system
Distributed edge analytics, combined with fog computing, also enables a fast response to the cloud in case of faults and failures, so that recovery time can be minimal. Let us understand how.
How fog computing works with smart grids for faster data processing
Fog computing is the combination of two key components of data processing: edge and cloud. Combining edge computing with the more complex computing of the cloud results in more reliable and faster data processing.
As smart grid technology is advancing rapidly, fog computing is an excellent tool for data and information processing between consumers, grid operators, and energy providers.
In the edge analytics concept, the gateways form a mesh network. The individual mesh network of a designated area creates a fog node. The fog nodes are connected to each other, resulting in a fog network of smart meters and IoT gateways in larger setups. The combination of these fog nodes then allows distributed fog computing, which provides fast, real-time data analysis across any large geographical area and further enables faster fault response times.
Use case of smart grids in distributed edge analytics
eInfochips developed a solution in which gateways are connected into a mesh network with peer-to-peer communication. The mesh and cluster of gateways enable high availability and reliability of the IoT deployment in smart grids. Clustering enables distributed edge analytics: these distributed edge nodes process data at the edge before transferring it to the cloud.
According to market research, the fog computing market is growing at a compound annual growth rate (CAGR) of 55.6% between 2017 and 2022 (MarketsandMarkets).
With our edge and fog computing expertise, we help IoT solution providers optimise their computing infrastructure by intelligently distributing load between the cloud and edge devices through our ready-to-use dynamic rule engine or custom solutions.
When retail machines talk to each other directly or collaborate through edge gateways, customers are more likely to find what they’re looking for. Why lose a sale due to a lack of inventory when a customer can be redirected to a nearby location where their product preferences can be met?
Lumada is Hitachi's Internet of Things platform, and you'll get a high-level view of its components and capabilities in under a minute. I'll walk you through Assets, Edge Gateways, Core, Asset Avatars, Analytics and Studio.
The phrase, “the future is here,” is overused and has evolved into a catchphrase for companies struggling to position themselves in times of technological or digital transformations. Still, the sentiment is understood, especially in times like today, where the Internet of Things is quite literally changing the way we think about hardware and software. We’d like to offer an addendum to the phrase: “The future is here more quickly than we thought it would be.”
Digital transformation, increased computing ability, smart hardware and the growth of connectivity capabilities created a perfect storm of accelerated industry, and many were left scrambling to sift through the large amounts of information and solutions available. With that in mind, we wanted to provide some advice for companies across the industrial sector for the best ways to optimize operations for the Industrial IoT.
1) Upgrade your network and throughput capabilities.
Nothing kills the ROI of automated processes more quickly than the literal inability to function. It’s important to understand that as you upgrade machinery and invest in the software to run it all, those systems demand greater bandwidth in order to effectively use big data and analytics capabilities. Several options exist, but for most companies some combination of industrial-strength broadband (WiFi), narrowband, cellular and RF communications will create the most effective network for their needs.
2) Invest in smart hardware.
This may seem like a no-brainer, and really, in the not-too-distant future, you may not even have a choice, but the shift toward Fog Computing is gaining momentum, and being able to run decentralized computing between hardware and the Cloud not only creates greater operational efficiency but also allows your data transmission to run more smoothly. The beauty of a Fog Computing system is that it allows a greater number of devices to transmit smaller data packets, which frees up bandwidth and speeds real-time data analytics. The core of this lies in the smart hardware.
3) Be proactive about application development.
Smart hardware has the ability to host applications designed specifically for your needs. Previously, many companies shied away from app development because it required highly skilled developers and devices capable of hosting those apps – a combination that wasn’t readily available. Today, the scene has changed. With the rise of Node-RED, it is much easier to create proprietary applications without a computer engineering degree, and any company serious about leveraging IIoT technology needs to be able to use the full scope of its data.
4) Secure your communications.
There isn’t much more to be said about the importance of cybersecurity. If the last few years of massive data breaches haven’t rung alarm bells, then you aren’t paying attention. Cybersecurity today is a multi-layered need. Most companies building smart hardware are beginning to build encryption directly into the devices. But, since many companies use Cloud applications for computing and analytics, it is important to invest in strong security measures at that level as well. Unfortunately, the sophistication of cyber-attacks is only going to increase, along with the importance of the data needing to be protected. It pays to be paranoid and act accordingly.
Going beyond connected buildings
Connecting smart buildings to the smart grid, smart transportation, and other smart services is the need of the hour to truly realise the potential of IoT. However, communicating with numerous systems built on different protocols is a major challenge faced by integrators. Protocol converters are widely used to convert protocol A to protocol B, but such devices do not offer the ease of configuration and flexibility demanded by IoT. The solution is a universal gateway – a device that transacts data between two or more data sources using communication protocols specific to each of them. The universal gateway is also termed a universal protocol gateway. Such products combine hardware and software, and are used to connect data from one automation system, like building automation, to another, like a smart grid.
Role of universal gateway in building automation
A typical building automation system comprises five key components:
- I/O modules with multi-protocol implementation including open source, proprietary and wireless
- Controllers with multi-control loop implementation such as PID, Adaptive, Rule-Based and software based on multiple platforms like TI, Freescale, Qualcomm, NVidia
- Data storage & analytics with diverse DBMS like SQL, Mongo DB, Oracle along with varied Data Analytics through Sensor Data, Statistical modelling, Predictive analytics, and Real-time Analytics
- Dashboards & Apps with web-based or mobile-based Intuitive dashboards for data monitoring and apps for various OS and devices
- Gateways that enable communication of data between the above four data sources using communication protocols specific to each
In building automation, connectivity technologies have propelled the adoption of connected smart devices for remote sensing, actuation and intelligent monitoring. Industry bodies like BACnet International and Echelon Corporation are extending or adapting different communication protocols to devices used in building automation, smart grids, etc. to make disparate solutions work seamlessly together. BACnet, LonWorks and other similar protocols have enabled standardization in building automation. Different systems, like HVAC, surveillance cameras, access control, BMS, fire protection, audio-visual and lighting, are integrated, monitored and controlled on a single system.
The universal gateway, or universal protocol gateway, is an external, high-performance, multiprotocol gateway for integrating HVAC, surveillance cameras, access control and fire protection controls into building management systems (BMS), and in turn integrating the BMS into the Internet of Things (IoT). These gateways also offer bidirectional data flow between devices on selected points.
Universal gateways support various standard protocols like Profibus FMS, DP-Master, DP-Slave, LonTalk, BACnet Ethernet, IP and PTP (RS232), Modbus serial, MODBUS/IP, M-Bus, EIB (European Installation bus), OPC, and many other proprietary protocols.
Benefits of custom-built universal gateways
Universal gateways are generally designed and developed to cater to the needs of the mass market. Typically they cater to a limited set of protocol combinations, including a serial bus, a fieldbus or real-time Ethernet protocols. Custom-built universal gateways provide a flexible platform for transparent conversion of building automation / industrial automation protocols, thereby enabling the connection of networks from different I/O, controller and OEM brands. Such gateways are quite flexible, with hundreds of protocol combinations possible.
In addition, custom-built universal gateways are software-focused and offer ease of configuration for protocols like CAN (Controller Area Network), DeviceNet, PROFIBUS, BACnet, LonTalk, EtherNet/IP, Modbus TCP, POWERLINK, CC-Link, EtherCAT, SERCOS III, Modbus RTU (RS422/RS485), Modbus ASCII (RS232), FOUNDATION Fieldbus, HART, C-Bus, Z-Wave, Zigbee and the like. In summary, the use of universal gateways helps in developing M2M communication and can aid in enabling IoT efficiently.
Originally Published on eInfochips Blogs
We all know HTTP (Hypertext Transfer Protocol). These are the first four letters you see in the URL of any website you open in your browser. In simple terms, it is a list of rules that define the do’s and don’ts of communication between a web browser and a web server. It is like you (the web browser) going to an ATM (the web server) to get some cash (the request). HTTP describes the complete procedure – enter PIN, amount, etc. You get your cash (the result) once you follow the mentioned steps. Quite simple.
The World Wide Web (WWW) works on HTTP, as it is the only protocol used there for data transfer. However, this is not the case in the Industrial (I)IoT world. Here we have a bunch of protocols to choose from, depending on the type of application or so-called “use case”. The most common among them are MQTT, CoAP and, of course, HTTP. Before we discuss them, let us first have a look at certain networking terminologies and definitions.
Transport layer protocols (TCP, UDP)
The transport layer protocol, as the name implies, is responsible for transporting a message or information from one computer to another. This can be done in two ways:
- Connectionless protocol (UDP): This kind of protocol is preferred in cases where speed and efficiency are more important than reliability. Here the data is sent without establishing or waiting for a connection, which means that a bit or segment of data can get lost during transportation. A typical example of such a protocol is live video streaming, where a bad connection sometimes results in fragmented video. For example, imagine yourself bringing a bunch of letters to the postbox and dropping them inside. You are just dropping the letters into the box without knowing whether they will be delivered to their recipients. This is the case with connectionless protocols. On the other hand, bringing all these letters to the post office and ordering a return receipt for them, thus ensuring their delivery, can be compared to a connection-oriented protocol.
- Connection-oriented protocol (TCP): Here the protocol ensures the receipt of a message at the other end without any data loss on the way, thus ensuring reliable transport. A connection-oriented protocol needs extra overhead (discussed later) compared to a connectionless protocol, just as it takes extra resources (time, money) to send a registered letter with a return receipt.
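The difference is easy to see with Python's standard socket API. A sketch: the UDP sender fires its datagram off with no connection at all, while the TCP client must complete a handshake (`connect`) before a single payload byte flows. Both halves run against localhost, so delivery happens to succeed here; on a real network only TCP would guarantee it:

```python
import socket

# Connectionless (UDP): the datagram is sent with no handshake;
# the protocol itself gives no delivery guarantee.
udp_rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_rx.bind(("127.0.0.1", 0))                            # OS picks a free port
udp_tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_tx.sendto(b"reading: 21.5 C", udp_rx.getsockname())  # fire and forget
datagram, _ = udp_rx.recvfrom(1024)

# Connection-oriented (TCP): a handshake must complete before any
# payload flows; delivery and ordering are then guaranteed.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(server.getsockname())                     # the "return receipt" setup
conn, _ = server.accept()
client.sendall(b"reading: 21.5 C")
segment = conn.recv(1024)

print(datagram, segment)  # both carried the same payload
```

Notice that the TCP half needs four lines of setup before any data moves; that setup cost is exactly the overhead the next section talks about.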
Packet and Packet size
A packet contains data (the payload) along with information (the header) like source, destination, size, etc. – just like a DHL parcel contains the goods to be shipped along with information like address, weight and dimensions. Packet size, in networking, is the amount of data (in bytes) carried over the transport layer protocol.
Overhead

Overhead is the extra information (in bytes) or features associated with the packet which ensure reliable delivery of the data. In other words, it is the bubble wrap around your shipment: not strictly necessary, but providing an extra layer of safety and reliability for your parcel.
The amount of overhead associated with the packet depends on the type of transport protocol used. UDP in comparison to TCP has smaller overhead.
Bandwidth

Bandwidth is the rate (bits/MB/GB per second) at which data transfer takes place. The larger the bandwidth, the more data can be transferred in a given time.
So that was a crash course on networking. Now let us try to understand the mentioned IIoT protocols using these terminologies.
Message Queue Telemetry Transport, or simply MQTT, is a lightweight messaging protocol for industrial and mobile applications. It is best suited for applications where network bandwidth and power usage are limited, for example small sensors, remote-location applications and machine-to-machine communication. MQTT communicates with a server over TCP and, unlike HTTP, works on a publish-subscribe model (see figure below).
Fig.: Example of the publish-subscribe model used in MQTT
In order to understand the concept behind MQTT, one should understand the underlying architecture: the publish-subscribe model. Here a client publishes a message on a topic (temperature, humidity) to a broker, which in turn sends these messages out to the clients that have subscribed to that topic.
The publish-subscribe model used in MQTT offers a couple of advantages over the standard client-server model used in HTTP. Multicast, scalability and low power consumption are the top three. These advantages come from the fact that the publish-subscribe model overcomes some of the structural drawbacks (one-to-one communication, tight coupling, fault sensitivity) of the traditional client-server model.
Let’s look at an analogy to understand the difference. Assume that MQTT and HTTP are two publishing companies: MQTT publishes magazines on various topics (sports, politics, cars, etc.) and provides them to a broker, who in turn distributes them to subscribers interested in one or more topics. This way MQTT can cater to many subscribers at a given time (multicast), and is thus scalable. Since MQTT only has to deal with a broker, whom it contacts once a day, its investment (power consumption) in maintaining the business is not high.
HTTP, the other publisher, likes to deal with one customer at a time. It relies heavily on its customer and on its value chain (server to server). This, however, comes at the cost of a relatively high business investment (power consumption), since it has to visit its customer each time for a handshake.
MQTT, in contrast to HTTP, is best suited for applications where bandwidth, packet size and power are at a premium. An industrial generator with a battery-powered temperature and humidity sensor cannot afford to maintain a connection with a server each time it has to push the measured values (an event or message) into the cloud. MQTT is designed precisely to overcome such constraints: the connection is maintained using very little power, and commands and events can be received with as little as 2 bytes of overhead.
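The routing behaviour of the broker is the heart of the model, and it can be demonstrated without any network at all. Below is a minimal in-memory stand-in for an MQTT broker (a sketch of the pattern, not the MQTT wire protocol; topic names and payloads are invented). A real client would use a library such as Eclipse Paho against a broker like Mosquitto:

```python
from collections import defaultdict

class Broker:
    """Toy in-memory broker: routes each published message to every
    callback subscribed to that topic (the fan-out/multicast step)."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(topic, payload)

broker = Broker()
received = []

# Two independent clients subscribe to the same topic...
broker.subscribe("plant/generator/temperature", lambda t, p: received.append(("dashboard", p)))
broker.subscribe("plant/generator/temperature", lambda t, p: received.append(("alerting", p)))

# ...the sensor publishes ONCE, and the broker fans the value out.
broker.publish("plant/generator/temperature", 78.4)
print(received)  # [('dashboard', 78.4), ('alerting', 78.4)]
```

The key property is visible in the output: publisher and subscribers never know about each other (loose coupling), and one publish reaches any number of subscribers, which is exactly why the model scales where HTTP's one-to-one request/response does not.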
Constrained Application Protocol, or simply CoAP, is a UDP-based protocol which is often described as a lightweight version of HTTP (except that HTTP works over TCP). It is specially designed to work in constrained environments with limited bandwidth and power, where communication has to be fast and ongoing. Unlike HTTP, CoAP supports one-to-many (multicast) requirements and is faster than TCP-based protocols, which makes it a good choice for M2M.
It is quite common to see device-to-device (D2D) or device-to-gateway (D2G) communication done over CoAP, while the communication between gateway and cloud is HTTP’s job. This is because there is a well-defined mapping between the two protocols.
So, if both MQTT and CoAP are good for constrained environments, what makes one better than the other? The answer lies in the underlying transport layer they use. MQTT is better suited for event-based communication in a constrained environment where data needs to be sent in batches (for instance, temperature and humidity values) at regular intervals over a reliable channel.
CoAP is a better choice for continuous condition-monitoring scenarios in a constrained environment. Since it runs over UDP, CoAP offers faster communication among devices, which makes it a better option for M2M/D2D/D2G communication. CoAP is also well suited for web-based IIoT applications where it has to work alongside HTTP. In such a setup, you have CoAP on the sensor side and HTTP running between the proxy/gateway and the cloud.
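How "light" is CoAP compared to HTTP? Its fixed header (RFC 7252) is just 4 bytes: version, type and token length packed into one byte, a one-byte request/response code, and a two-byte message ID. The sketch below builds that header by hand and compares it to the size of a bare-bones HTTP GET request (the `/temperature` path and `sensor.local` host are invented for illustration):

```python
import struct

# CoAP fixed header (RFC 7252): Ver(2 bits) | Type(2) | TKL(4), Code(8), Message ID(16)
ver, msg_type, tkl = 1, 0, 0          # version 1, CON (confirmable), no token
code = (0 << 5) | 1                   # class 0, detail 01 -> 0.01 = GET
message_id = 0x1234
coap_header = struct.pack("!BBH", (ver << 6) | (msg_type << 4) | tkl, code, message_id)

# The equivalent minimal HTTP request, before any payload:
http_request = b"GET /temperature HTTP/1.1\r\nHost: sensor.local\r\n\r\n"

print(len(coap_header))   # 4 bytes of protocol overhead
print(len(http_request))  # roughly an order of magnitude more
```

Those few dozen bytes per request (plus TCP's handshake and acknowledgements underneath) are precisely what a battery-powered sensor on a lossy link cannot afford, and why CoAP keeps its header binary and fixed-size.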
What about HTTP? It is in demand whenever you want to push a big chunk of data from a gateway/industrial modem/computer into the cloud or a web-based application without compromising on security. Regardless of how data is collected and sent to a gateway (CoAP vs MQTT), when it comes to reliable delivery of big packages, HTTP takes the front seat. Moreover, HTTP is still used as the standard protocol for devices that do not support any other protocol.
MQTT, CoAP or HTTP: it is a matter of speed vs reliability vs security, whichever suits your use case best.
I hope you enjoyed reading the article and that it helped you to get at least a basic understanding of the major IIoT protocols. Your feedback, comments or suggestions are always welcome.
The deployment of the Internet of Things (IoT) has disrupted niche organizations across multiple industries like financial services, technology, and agricultural equipment. Organizations are shifting from traditional products to smart offerings and outcome-based deliverables.
Evolution of technology
We have progressed from reading a physical copy of a book (paperback or hardcover) to reading it on a device such as a Kindle. We have furthermore extended the traditional way of reading a book with its audio version. We can now not only download a book and read it anywhere on our smartphones, but also just plug in earphones and listen to it. This is captured by Amazon’s progressive model:
Amazon →Kindle →Echo →Audible
We need to evolve in a similar manner in the industrial world by thinking of the offer from the consumer’s perspective. Efficiently creating and monetizing value shared by IoT solutions will lead us to profitable outcomes.
Impact of IoT on businesses
Machina Research has expanded the scope of its IoT forecasts, highlighting a USD 4 trillion revenue opportunity in 2025.
Business models require thinking through the consumption side of the offer (the demand side). This explores the demand for the product and its usage. Your customers don’t merely purchase your product; what they want is what your product can do for them. Hence, it is important to understand the consumer aspects of your offerings. Moreover, business models also require thinking about the production side of the offer (the supply side). This helps you better understand how the IoT product is created and delivered for symbiotic growth.
Let’s further explore various IoT-based business models.
Product Business Model
This model enables you to provide your customers with a physical IoT product and its software. You can gradually upgrade the software, notifying customers of its cost, and the results will be mirrored directly at the consumer end.
Let’s take the example of self-driving cars. The company doesn’t have to provide any additional product to users; instead it can continuously improve the car by updating its onboard model and applications. The firm can introduce a new feature and notify its users; the users just have to update the system to leverage that feature.
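As a rough illustration of the update flow just described (not any particular vendor's implementation), a version check like the following could decide when to notify users; the version strings are hypothetical:

```python
# Minimal sketch of an over-the-air update check: compare the installed
# software version with the latest release and notify the user when a
# newer version is available. Version strings are illustrative.

def parse_version(v: str):
    """Turn "2.4.1" into (2, 4, 1) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def update_available(installed: str, latest: str) -> bool:
    """True when the latest release is newer than what is installed."""
    return parse_version(latest) > parse_version(installed)

if __name__ == "__main__":
    print(update_available("2.4.1", "2.5.0"))  # True -> send notification
    print(update_available("2.5.0", "2.5.0"))  # False -> nothing to do
```

Tuple comparison handles the ordering correctly (so "2.10.0" counts as newer than "2.9.0"), which a plain string comparison would get wrong.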
You can also use this model to collect data to create information service products to eventually sell with the product-service model.
Product-Service Business Model
This is a hybrid of the traditional product business model and the newer service business models. It enables organizations to offer a physical IoT product along with an information service. Adding an information service to a product, based on the data it collects, ensures incremental revenue and provides a competitive advantage. Providing continual information that monetizes the analyzed data and enhances the customer’s processes is the key to this business model.
Let’s take the example of ‘connected vehicles’. With an attached On-Board Diagnostics II (OBD-II) device, users can monitor temperature, RPM, pressure, engine load, vehicle location, and fuel level. This information-based model revolves around vehicle safety, fuel savings and, ultimately, reduced maintenance costs. This data is the key to predictive maintenance, which lets customers know the health of their vehicles and avoid mishaps.
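As an illustration of the kind of data involved, the sketch below decodes two standard OBD-II (SAE J1979) parameters from raw response bytes; the PID formulas for engine RPM and coolant temperature come from the standard, while the transport layer and the example byte values are assumed:

```python
# Minimal sketch: decoding two standard OBD-II (SAE J1979) parameters
# from raw response bytes. The formulas for PID 0x0C (engine RPM) and
# PID 0x05 (coolant temperature) are standard; how the bytes arrive
# from the in-vehicle device is omitted here.

def decode_rpm(a: int, b: int) -> float:
    """PID 0x0C: engine RPM = (256*A + B) / 4."""
    return (256 * a + b) / 4

def decode_coolant_temp(a: int) -> int:
    """PID 0x05: coolant temperature in degrees C = A - 40."""
    return a - 40

if __name__ == "__main__":
    # Example response bytes: A=0x1A, B=0xF8 and A=0x7B
    print(decode_rpm(0x1A, 0xF8))     # 1726.0 RPM
    print(decode_coolant_temp(0x7B))  # 83 degrees C
```

A real product-service offering would stream such decoded values to a backend, where trends feed the predictive-maintenance analytics described above.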
Service Business Model
This is an XaaS (Anything as a Service) business model. The customer rents a physical product with an IoT solution and pays only for the period during which it is running or working. The service business model is not exclusively related to software or physical products; it can also include information products. This model gives organizations a predictable, recurring revenue stream by providing their IoT services to customers for a certain time period. However, your IoT solution must not only have value as a service; it must also be aligned with how the customer expects to receive, consume and pay for the offering.
Let’s take the example of jet engines. Customers that don’t want to own and maintain engines themselves lease IoT-enabled jet engines from the seller, who also provides maintenance by applying predictive analytics. The customer pays only for the time during which the engines were running and producing outcomes.
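The pay-for-use mechanics described above can be sketched in a few lines; the telemetry records and hourly rate here are hypothetical:

```python
# Illustrative sketch of pay-per-use ("power by the hour") billing in
# the service business model: the customer is invoiced only for hours
# in which the leased asset was actually producing output. The rate
# and the telemetry records below are made up for the example.

def usage_invoice(telemetry, rate_per_hour):
    """Sum billable hours from (hours, was_producing) records."""
    billable = sum(hours for hours, producing in telemetry if producing)
    return billable * rate_per_hour

# (hours logged, was the engine producing output?)
records = [(8.0, True), (3.5, False), (4.5, True)]
print(usage_invoice(records, 250.0))  # 12.5 billable hours -> 3125.0
```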
Service-Outcome Business Model
In the service-outcome business model, the seller becomes a business partner. There are two aspects to the service-outcome model. The first is similar to the service model, but instead of a single solution, entire product lines are monetized. The second consists of monetizing based on the outcome, or performance, of the offered solution.
The service-outcome business model has an add-on payment based on savings, which incentivizes the vendor to improve its customer’s business. The add-on payment in this situation would be related to the reduction in human operating expenses.
Let’s take the example of the mining industry. Instead of providing the mining equipment itself, the company provides an IoT solution for its customers’ equipment. Based on the data that solution collects, the company can then adjust the equipment in line with the parameters of the better-performing units. By adopting the service-outcome business model, both the buyer and the seller receive incremental value. This enables them to establish a baseline and generate a percentage of the incremental revenue or incremental savings based on phases or milestones.
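The baseline-and-percentage arrangement described above can be sketched as a simple gain-share calculation; the baseline cost, per-phase costs and the 20% vendor share are all illustrative assumptions:

```python
# Hedged sketch of gain-share mechanics: establish a cost baseline,
# measure incremental savings in each phase or milestone, and pay the
# vendor a percentage of those savings. All figures are hypothetical.

def gain_share_payment(baseline_cost, actual_costs, vendor_share=0.20):
    """Vendor earns a share of savings below baseline in each phase."""
    payments = []
    for actual in actual_costs:
        savings = max(0.0, baseline_cost - actual)  # no penalty clause
        payments.append(savings * vendor_share)
    return payments

# Three phases: two beat the baseline, one does not
print(gain_share_payment(100_000, [95_000, 88_000, 102_000]))
# phase savings 5000, 12000, 0 -> payments [1000.0, 2400.0, 0.0]
```

The `max(0.0, ...)` clamp reflects one possible contract choice (the vendor shares upside but is not penalized for a bad phase); real agreements vary.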
Outcome Business Model
The final business model comprises an entire IoT ecosystem. It brings together the producers (vendors) and the consumers (customers) of the IoT technology in order to monetize the solution. Instead of partnering with multiple vendors, the customer becomes part of an ecosystem that delivers the desired outcome.
This goes beyond the service-outcome business model where payment now is completely based on performance. This allows the alignment of the business models of the vendors with that of the customers.
Let’s take the example of smart farming. The outcome business model focuses on providing a bundled solution for effective agriculture. Instead of providing separate solutions for monitoring the soil’s moisture level, the sunlight, and CO2 emissions, this model offers a set of solutions combined in one package. Each of these product categories provides value on its own, but when combined, dependencies are organized, creating greater value than the sum of the parts. This enhances the monetization aspect for customers as well as vendors, as payment takes place according to the outcome that each solution provides.
Analyzing business needs through IoT
The greatest challenge in implementing IoT isn’t technical; it lies in the business aspect, followed by the lack of standardization and of strong security. However, these are not challenges that organizations cannot address and solve. The key focus should be on using IoT technology to deliver and monetize outcomes.
Business issues may be more challenging than technology. Incorporating IoT solutions into your business practice does not mean you must change your business model too. It is equally important to align your sales and distribution goals while implementing IoT initiatives. You can then continue along the IoT business model continuum as time progresses.
Outcomes effectively revamp industries. They impact the business in a way that enables you to identify your competitors and partners. This reduces competitive risk and prepares you to decide when to spend the time and the resources needed to develop your IoT business and product line.
Many of us have yet to see an autonomous vehicle driving down the road, but it will be here faster than we can imagine. The car of tomorrow is connected, data-rich and autonomous. As 5G networks come online, sensors improve, and compute and memory become faster and cheaper, the amount of data a vehicle generates is expected to reach 40 terabytes every day. This will make the autonomous vehicle the ultimate edge computing device.
Last week at Mobile World Congress Americas in San Francisco, Micron Technology hosted a panel discussion with automotive industry experts where they discussed the future of the connected car and the role of both the cloud and the edge in delivering the full promise of autonomous driving (FYI – Cars are now big at wireless trade shows. See Connected Vehicle Summit at MWC).
Experts from Micron, NVIDIA, Microsoft and Qualcomm discussed what 5G, cloud, IoT and edge analytics will mean for next-generation compute models and the automobile.
Micron claims to be the #1 memory supplier to the automotive industry and notes that its technology will be required to access the massive streams of data from vehicles. This data must be processed and analyzed, both in the car and in the cloud. Think about going down the road at 70 MPH in an autonomous vehicle: you need safe, secure and highly responsive solutions, relying on split-second decisions powered by enormous amounts of data. To quickly analyze the data necessary for future autonomous vehicles, higher-bandwidth memory and storage solutions are required.
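A quick back-of-the-envelope check shows why the 40-terabytes-a-day figure pushes computing to the edge; decimal units are assumed here:

```python
# Back-of-the-envelope check on the 40 TB/day figure: what sustained
# data rate does a vehicle generating that volume imply?

TB = 10**12                      # decimal terabyte
daily_bytes = 40 * TB
seconds_per_day = 24 * 60 * 60   # 86,400

rate_mb_s = daily_bytes / seconds_per_day / 10**6        # MB per second
rate_gbit_s = daily_bytes * 8 / seconds_per_day / 10**9  # Gbit per second

print(round(rate_mb_s))       # ~463 MB/s sustained
print(round(rate_gbit_s, 1))  # ~3.7 Gbit/s
```

Sustaining several gigabits per second per vehicle over a wireless link is impractical, which is why most of this data must be processed in the car itself rather than shipped to the cloud.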
Smart, connected vehicles are the poster child for edge computing and IoT.
Some intriguing quotes from the discussion:
- “In the last seven years, 5,839 patents have been granted for autonomous vehicle technology.” – Steve Brown, Moderator and Futurist
- “There is a proactive side of autonomous driving that can’t be fulfilled at the edge.” Doug Seven, Head of Connected Vehicle Platform, Microsoft
- “The thin client model won’t work for automobiles. You won’t have connectivity all the time.” – Steve Pawlowski, Vice President Advanced Computing Solutions, Micron
- “Once you have enough autonomous vehicles, the humans are the danger.” – Tim Wong, Director of Technical Program Management for Autonomous Vehicles, NVIDIA
The entire panel discussion can be found in the video below.
Disclaimer: The author of this post has a paid consulting relationship with Micron Technology.
Why do IoT Architects need to think about value, not just data?
Several years ago I was pitching what would now be called an Industrial Internet of Things (IIoT) solution to the Production Manager of a large manufacturing plant. After describing all the data we could collect, and the metrics we could turn it into, I thought I had done pretty well. What Production Manager wouldn't want our system to get his finger on the pulse of his operation?
Instead, his next question floored me:
"If I don't do anything with the data your system collects, then it doesn't create any value for me, does it?"
I had never imagined that someone presented with real-time, detailed information wouldn't immediately grab it and use it to improve their business. I was so taken aback I could not think of an intelligent response, and needless to say, we didn't win that deal.
I'm not going to suggest that he was right, since "doing something with the data" was implicit in his job description, but there is a germ of wisdom for the IoT community in what he said:
Merely delivering data does not deliver value.
Even lots of accurate data, even in real time. Many IoT systems -- still -- have clearly been designed under the assumption that their responsibility ends at collecting, storing and presenting data: systems where data is collected and put in a data repository or historian; systems where data is collected and put on online graphs.
A real-world ACTION that benefits a group of stakeholders is still the only way that any IT system delivers value. For an IoT system to deliver that value, it must construct a chain from data to action. I suggest we call this chain:
The Information Value Chain.
The Information Value Chain is only just starting when you collect the data. Turning that data into information and ultimately into ACTION is harder, and if anything your "data only" Internet of Things (IoT) system has made the problem worse, not better: understanding a small amount of data to turn it into action is extremely taxing, and takes many different skills. Doing that with a torrent of data is overwhelming.
What is the Information Value Chain?
Very simply, the Information Value Chain is the insight that data only creates value if it goes through a series of steps, steps which eventually result in action back in the real world.
If we focus primarily on collecting data, then we will create Data Lakes, which are impressive Information Technology constructs, but on their own are passive entities that deliver no inherent value to the organisation.
If we focus primarily on action, then we will make decisions based on inaccurate information and misleading data, resulting in the wrong action, wasted money and lost opportunity. A great example is this Case Study.
How to solve this conundrum? Before we get into the mechanics of building a robust Information Value Chain, the starting point is human, not technological.
To succeed you must start with the right goal
The starting point is this: What is the motivation for your project?
If it is to build an "IoT System," then I suggest that you are heading down the road to failure. An IoT System is a means, not an end, and has as many different embodiments as the word vehicle - Ferrari; Ford Focus; Mack truck; oil tanker.
Here is what you should be setting as your goal:
"To build a system that creates value in
[this] way; by enabling
[these] actions; using the best methods; with the minimal required human intervention; based on the best possible information; in as close to real time as possible."
There is a lot in this statement. Let's unpack it.
The central message of the Information Value Chain is to see our information systems as part of a sequence whose end result is action that delivers value.
- When I approach systems analysis for a Customer, the first thing I write on the right hand side of the whiteboard is a "$" sign.
- To the left I have the Customer help me develop an ROI model:
- Before: X1 action by X2 participant creates X3 value at X4 cost;
- After: Y1 action by Y2 participant creates Y3 value at Y4 cost.
- Then we step left again to describe the decisions that lead to those actions. Now we can write:
- Who (or what!) will make those decisions; on
- What timescale;
- Based on what algorithm.
- Now we can ask what information they will need to make these decisions and
- How to extract this information from the data available.
- Then, and only then, do we know what data to collect; how to process it, how -- or whether -- to present it; and how much of it and how to store it.
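The whiteboard exercise above boils down to comparing net value before and after; here is a minimal sketch, with placeholder figures of the kind a customer would supply:

```python
# Sketch of the whiteboard ROI model described above: compare net
# value (value created minus cost) before and after the IoT system.
# All figures are placeholders a customer would fill in.

def net_value(value, cost):
    return value - cost

# Before: manual inspection rounds; After: automated monitoring
before = net_value(value=500_000, cost=420_000)
after = net_value(value=540_000, cost=380_000)

roi_delta = after - before
print(roi_delta)  # 80000: the "$" on the right of the whiteboard
```

Working right to left from that "$" figure is what turns the rest of the analysis (decisions, information, data) into requirements rather than wish lists.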
We have found this approach moves IoT from a vague concept the Client thinks "maybe" they should pursue, without being clear on how it will impact their business, to a compelling business tool with clear purpose and value. That's what this is all about!
What do the links in The Information Value Chain mean?
The terms data, information and decision, as well as knowledge and intelligence get thrown around a lot, often interchangeably, yet these are distinct concepts. It is important to understand what we are talking about so that we can define and deliver each link in the chain successfully. Let's start from right to left, as we have just described in our systems analysis process so that we always keep our end goal in mind:
- Action: something that results in a change in the real-world which has a $ measurable value to a key stakeholder;
- Decision: a choice between possible Actions made according to a set of rules that maximize the value of the action taken;
- Information: Data interpreted in a specific context to best support the Decisions the User needs to be able to make;
- Data: individual facts collected from the Real World environment, as accurately and as timely as possible, not all of which will be relevant to the Decisions to be made;
- Real World: The totality of systems, machines, people and environmental factors that can affect the right Action to take in any given circumstance.
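To make the chain concrete, here is a toy end-to-end pass from Data to Action; the pump-temperature scenario, the threshold and the action names are invented for illustration:

```python
# A toy pass through the Information Value Chain defined above:
# Real World -> Data -> Information -> Decision -> Action.
# The pump-temperature scenario and the 80 C limit are invented.

raw_data = [61.2, 62.0, 88.5, 63.1]  # sensor readings in degrees C

def to_information(readings):
    """Interpret data in context: has the pump exceeded its limit?"""
    recent_max = max(readings)
    return {"recent_max": recent_max, "over_limit": recent_max > 80.0}

def decide(info):
    """A choice between possible actions, made by a simple rule."""
    return "schedule_maintenance" if info["over_limit"] else "no_action"

def act(decision):
    """The real-world step that actually delivers value."""
    if decision == "no_action":
        return "ok"
    return f"work order raised: {decision}"

print(act(decide(to_information(raw_data))))
# -> work order raised: schedule_maintenance
```

Every link is trivial here on purpose: the point is that a system which stops after `raw_data` has produced none of the value, because value only appears at `act`.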
How do we turn The Information Value Chain into practice?
The Information Value Chain is a great conceptual framework to think about how to get from Data to Value, but as IoT system architects, we are concerned with the practical question of how to deliver Value from Data. This is the purpose of the 5D IoT Architecture, which maps the links in The Information Value Chain to 4 specific architectural components, suggests core requirements for each of those components, and adds a 5th component to continuously improve the solution itself.
This paper is the development of a series on concepts in Big Data, IoT and systems architecture originally published on Fraysen Systems.
Gartner recently released their 2017 Emerging Technologies Hype Cycle. Where do IoT Platforms stand? At the peak of inflated expectations!
Do you agree?
Gartner says that the hype cycle reveals three distinct megatrends that will enable businesses to survive and thrive in the digital economy over the next five to 10 years. (See graphic below).
Artificial intelligence (AI) everywhere, transparently immersive experiences and digital platforms are the trends that will provide unrivaled intelligence, create profoundly new experiences and offer platforms that allow organizations to connect with new business ecosystems.
The Emerging Technologies Hype Cycle is unique among most Gartner Hype Cycles because it garners insights from more than 2,000 technologies into a succinct set of compelling emerging technologies and trends. This Hype Cycle specifically focuses on the set of technologies that is showing promise in delivering a high degree of competitive advantage over the next five to 10 years.
"Enterprise architects who are focused on technology innovation must evaluate these high-level trends and the featured technologies, as well as the potential impact on their businesses," said Mike J. Walker, research director at Gartner. "In addition to the potential impact on businesses, these trends provide a significant opportunity for enterprise architecture leaders to help senior business and IT leaders respond to digital business opportunities and threats by creating signature-ready actionable and diagnostic deliverables that guide investment decisions."
Among the words, phrases and acronyms of the tech world, “platform” is one that seems to grab the headlines. Listen to any pitch from a start-up venture and it would not be uncommon to get the “platform pitch” in at least one out of two proposals. A lazy Google search for the “Top 20 tech-weary words” told me that “platform” was third on the list (https://www.businessinsider.com.au/the-worlds-top-20-tech-weary-words-for-2014-2014-5).
Phrases like “being platformed” have been coined as well, and a host of books written on the significance of platforms in the technology world. I will not go into the virtues of platforms. Instead, I will dwell on how each segment has only a few leaders (a maximum of three), while in the IoT world we seem to have, by some counts, anywhere from 170 platforms (McKinsey) to 400 (Beecham Research). This is definitely a bewildering array to go through and investigate.
What is a platform, and why are there only a few platform leaders?
Stepping back – different people have different views and meanings of the word “platform”. To get a view of the diversity of platforms we have:
Browsers (Chrome and Firefox), smartphone operating systems (iOS and Android), blogging services (WordPress, Medium), social media titans (YouTube, Facebook) and even Instagram are described as platforms. Uber, Airbnb and their ilk are widely described as ‘marketplaces’, ‘platforms’ or ‘marketplace-platforms’, as are web services (Google Payments, Amazon Elastic Compute Cloud) and gaming consoles (Xbox, Apple’s iPod Touch, Sony PlayStation). One interesting point to note is that in each category the market is mostly duopolistic.
To accommodate this diversity, the safest definition of a platform would be:
- An extensible codebase of a software-based system that provides core functionality, plus the modules that interoperate with it and the interfaces (Application Programming Interfaces, or APIs) through which they interoperate. In effect, the system abstracts a number of common functions for its users without exposing the complexity of building and managing them.
- The goal is to enable interactions between producers and consumers.
- This is enabled through three layers: the Network (connecting participants to the platform), Technology Infrastructure (helping create and exchange value) and Workflow and Data (matching participants with content, goods and services).
This definition brings in the two dimensions of a platform: one for internal use and the other for external use.
- The internal dimension of building platforms is to ensure all necessary modules interoperate, and
- The external dimension of building platforms is to enable interaction with the outside world and make the platform as accessible and usable as possible.
Internal-dimension-led platforms focus on internal productivity and efficiencies and on their users; here the development is internally sourced and the platform is essentially built for internal use. External-dimension-led platforms address both the supply (developer) side and the demand (user) side, which is why they are sometimes termed “two-sided” platforms. Beyond a point their development is crowd-sourced: developers enrich the platform, and the platform reaches out to them through APIs.
In most cases, if the external dimension is well evolved, the internal efficiencies come by default: design quality, a selection of interfaces that leads to interoperability, robustness of infrastructure, and seamlessness in workflow and data streaming.
External-dimension platforms compete for both users and developers.
One important aspect to remember here is that a platform may not, by itself, provide solutions to contextual and domain-specific problem statements. Applications built around the platform do that, and these applications help generate the return on investment (ROI) from the platform.
In any segment, you will have seen that the winners are few (at most two or three; aspirants may be many, but they progressively wither away). The reasons have been presented above: design quality, interoperability, infrastructure robustness, seamlessness in workflow and data flow, and, last but not least, an excellent and friendly user interface. Not many can master all of these aspects. Mastering them helps acquire a critical mass of customers that keeps growing, and a duopoly of sorts is created in the market.
Successful platforms are able to support a variety of business use cases today, and strive to build designs that evolve over time and are, to an extent, future-ready.
The Bazaar of IoT Platforms – the reasons, and who will win wading through the maze?
Coming now to the Internet of Things (IoT): the IoT movement repeatedly talks about platforms, but those definitions don’t align with any of Uber, Medium or Android. The first issue is interoperability; none of these platforms align with each other either.
Now let us address the question of why there is a “plethora of platforms” in IoT.
It can be seen clearly that the typical architecture of an IoT solution is multilayered. Put simply, the layers would be device-to-device (involving hardware and firmware with short-range communication), device-to-server (again involving hardware and communication) and server-to-server (where cloud-based applications and long-range communication hold the key, along with networking, data storage and data visualisation).
So we see protocols and standards driven from their origins in communication technologies (telecom companies like AT&T and Verizon lead here), in the data storage area (Amazon and Google lead the way) and on the application side (Azure from Microsoft and ThingWorx from PTC being the prominent ones). Companies which have a library of business use cases, given their dominance in their respective businesses (namely Bosch, GE, Honeywell), have the ambition to build their own community-based platforms. Then we have a host of start-ups, each running a platform for the particular business use case it addresses.
This, then, is the genesis of the “plethora of platforms”: the multilayered solution stack of IoT. It adds complexity, and hence no one player can yet be a leader across all the layers.
In the coming years, it can be reckoned that there will be a shakeout in the market, and the platforms will converge around key broad-based use cases: remote monitoring and environment conditioning, predictive maintenance and process automation.
The ones which win the battle for supremacy will have cracked the code on:
- Open interfaces,
- Carrier-grade reliability,
- Service levels,
- Scalability,
- Seamless integration into the back-office environment that is essential to the enterprise’s business operations, and
- Impressive usability and user interface.
Given the multitier architecture and the attendant complexity, it will be a while before a small group of winners starts to bubble to the top. Some of the also-ran aspirants may focus on particular domains, addressing a specific part of the ecosystem or industry segments like home or industrial, to justify their presence.
For all the value and disruptive potential that Internet of Things (IoT) solutions provide, corporate buyers face a dilemma. Today’s IoT technologies are still immature point solutions that address emerging use cases with evolving technology standards. Buyers are concerned that what they buy today may become functionally or technologically obsolete tomorrow. Faced with this dilemma, many defer buying even if the IoT solutions they buy today offer tremendous value to their organizations.
This post describes a planning strategy called “future-proofing” that helps managers, buyers, and planners deal with obsolescence.
What causes IoT solution obsolescence?
An IoT solution, whether you buy it now or in the future, can become functionally obsolete for several reasons, as described in Figure One. Unlike more established technologies, the immature and fast-evolving nature of today’s IoT solutions amplifies the risk of early obsolescence.
For example, today there are multiple Low Power Wide Area Network (LPWAN) connectivity options – SigFox, LoRa, RPMA (by Ingenu), Symphony Link (by Link Labs), NB-IoT and LTE-M. While each option has advantages and disadvantages, a subset of these will eventually “win” out as technology standards, business models and use cases emerge.
Similarly, there are 350+ IoT platforms in the marketplace today (source: “Current state of the 360+ platforms”, IoT Analytics, June 9, 2016). While many of these platforms target specific applications and industry segments, consolidation is inevitable as there are more vendors than the market can eventually support. The major IoT platform vendors (Amazon, Microsoft, Google, IBM, GE, et al), currently on a market share land grab, will drive consolidation when they begin to acquire select vertical platforms to gain rapid access to those markets.
What is Future-Proofing?
According to Collins English Dictionary (10th edition), “future-proof” is defined as:
“protected from consequences in the future, esp. pertaining to a technology that protects it from early obsolescence”
Because of the high cost of enterprise technologies, many buyers perceive obsolescence as bad. To them, future-proofing means keeping the technology as long as possible in order to minimize costs and maximize return on investment (ROI). Their companies have standardized their business processes, policies and even their technical support on the technologies that they have bought. When a solution goes End of Life (EOL) and transitions to a newer version, it means that managers will have to recertify and retrain everyone on the “new” solution all over again. In general, transitions happen over a period of months (and sometimes years) in large global companies. During this time, multiple generations of the solution will co-exist, with each requiring different processes and policies.
In today’s fast moving IoT market, planned and unplanned obsolescence will be the norm for the foreseeable future. The traditional concept of “future-proofing” doesn’t apply, and can lead to significant, adverse business disruption.
In the era of cloud based solutions and IoT, future-proofing is not about outguessing the future, and choosing the “right” solution so as to never have to “buy” again. Nor is it overbuying technology now to avoid buying in the future. Finally, future-proofing is not about avoiding change. Future-proofing is a solution lifecycle management strategy. It is a continuous process to maximize solution flexibility and options, while making deliberate choices and managing risk.
What does a future-proof IoT infrastructure look like?
In planning the future-proofed IoT infrastructure, managers must first understand its key characteristics, and then define specific requirements for each of those characteristics. At a high level, these characteristics include:
Usable – the infrastructure and solutions achieve all functional needs with no loss in performance, security, or service level agreements (SLAs) over the desired time period
Scalable – supports future needs, applications, devices
Supportable – resolves technical, performance, reliability, SLA issues
Changeable – addresses “lock-in” and facilitates migration to updated solutions on your schedule based on your needs
Economical – the total cost of ownership of the solution stays within forecasted ranges
A framework for future-proofing your IoT infrastructure
Change is constant and cannot be avoided. The driving principle behind future-proofing is managing change, not avoiding or preventing it. This principle recognizes that every solution has a useful functional life, and that what is functionally useful today may be obsolete and discarded tomorrow.
A properly designed future-proof plan provides the organization with options and flexibility, rather than lock-in and risk. It prevents suboptimal decision-making by managing the infrastructure on a system level, rather than at the individual component level.
Future-proofing your IoT infrastructure is a three step process (Figure Two). It is not a “once and done” exercise but must be done annually to remain relevant.
Plan and Design
The first step of the future-proofing process is to identify and place the various IoT infrastructure, systems and solutions into one of nine actionable categories. These categories are shown in Figure Three. The horizontal rows represent the “change” category, while the vertical columns represent the decision timeframe.
The actual classification of the IoT infrastructure solutions into one of the categories is determined in conjunction with IT, operations and the business units. Key considerations for determining the “future-proof category” include:
Usability/functionality – functional utility, compliance with standards, performance against needs, and SLAs
Scalability – ability to meet current and future needs, anticipated change in standards
Support – resources, expertise, reliability
Ease of transition – contractual agreements, technology interdependence/dependence, specialized skills
Economics – maintenance costs, licensing/content/subscription fees, utilities, new replacement costs, transition costs
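The classification step can be sketched as placing each solution into one of the nine cells (change category by decision timeframe); since Figure Three is not reproduced here, the row and column labels below are placeholders, not the paper's actual categories:

```python
# Sketch of the plan-and-design classification: place each piece of
# IoT infrastructure into one of nine cells (change category x
# decision timeframe). The labels are placeholders, as Figure Three
# is not reproduced in the text.

CHANGE = ["keep", "upgrade", "replace"]       # hypothetical rows
TIMEFRAME = ["now", "1-2 years", "3+ years"]  # hypothetical columns

def categorize(solution, change, timeframe):
    """Validate and record one cell assignment for a solution."""
    assert change in CHANGE and timeframe in TIMEFRAME
    return (solution, change, timeframe)

portfolio = [
    categorize("LPWAN gateway", "replace", "1-2 years"),
    categorize("edge device fleet", "keep", "3+ years"),
    categorize("vertical IoT platform", "upgrade", "now"),
]
for name, change, when in portfolio:
    print(f"{name}: {change} ({when})")
```

The value of the exercise is less in the data structure than in forcing IT, operations and the business units to agree on a cell for every solution before any sourcing decision is made.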
Source and Build
Once the proper categorization is completed, the second step is to procure the necessary solutions, whether they are hardware or software. This requires that a sourcing strategy be put into place over the desired time period. The terms sourcing and buying are sometimes used interchangeably, but they are not the same. Sourcing is about ensuring strategic access to supply while buying is more transactional. In executing the future-proofing plan, procurement managers must understand the supplier product lifecycle, and develop specific tactics.
As an example, a large global company decides to standardize around a specific IoT edge device (and specific generation) and technology for the next five years. In order to maintain access to this supply during this time period, it employs a number of tactics, including:
Stocking of spare units to be deployed in the future
Placing large “Last time” orders before that version of the solution is discontinued
Sourcing refurbished versions of the technology
Incorporating leasing as sourcing strategy
Negotiating contractual arrangements with the vendor to continue the solution line
Support and Monitor
The third step in the future-proofing strategy is to keep the IoT infrastructure and solutions operational over the desired time period. This is relatively easy when the solutions and technologies are being serviced and supported by the vendors. However, as vendors transition to newer technology and solution versions, buyers may find limited support and expertise. This problem is amplified the further you are from the original end-of-life date.
To keep the infrastructure and solutions fully operational during this time, companies must employ various reactive and proactive tactics. Some of these include:
Incorporating and installing vendor firmware updates to maximize functionality, apply bug fixes and extend useful life. Vendors may issue firmware updates for both end-of-life and current-generation solutions.
Purchasing warranty, extended warranty and maintenance service contracts to assure access to support
Developing in-house maintenance and repair capability
Negotiating special one-off engineering support services with the vendor or its designated contractors
Benson Chan is an innovation catalyst at Strategy of Things, helping companies transform the Internet of Things into the Innovation of Things through its innovation laboratory, research analyst, consulting and acceleration (execution) services. He has over 25 years of experience scaling innovative businesses and bringing innovations to market for Fortune 500 and start-up companies. Benson draws on his deep experience in strategy, business development, marketing, product management, engineering and operations management to help IoTCentral readers address strategic and practical IoT issues.
By Joe Barkai
Rapid Growth in Times of Uncertainty
The industrial Internet of Things (IoT) is enabling and accelerating the convergence of three key technology and business model shifts that are fueling the digital transformation of every industrial enterprise:
- Connectivity. The number of connected and mobile devices is growing at an ever-faster pace, emitting massive amounts of real-time information that enables deep insight into the devices themselves and the environment around them.
- Cloud Computing. After years of hesitation, cloud technology is finally becoming a mainstream business platform and a growth engine. New information systems and business operation constructs can be deployed and scaled quickly and cost-effectively, as connected assets and mobile devices deliver decision-making power to all ranks in the organization.
- New Business Models. Cloud-connected assets and customers, coupled with real-time information and decision-making capabilities, form the foundation for new ways to engage the business and its customers. Businesses can deploy innovative customer-centric, outcome-based engagement models and respond to changing market conditions with greater agility and flexibility.
Industry is making strides in developing Internet of Things technologies and articulating the potential business value of industrial IoT and Industry 4.0 solutions. The upcoming years of the IoT evolution will be characterized by rapid technology acceleration, as the vision of an always-connected world, in which everything and everybody is connected, is becoming an everyday reality.
And company leadership is under pressure to seize the opportunity. Eager technology vendors, enthusiastic investors and analysts, and a deluge of breathless headlines all entice corporate management to jump on the IoT bandwagon before it’s too late.
However, as technology forecaster Paul Saffo aptly observed, one should not mistake a clear view for a short distance.
Early rosy projections about growth in the number of connected devices and the economic impact of the industrial Internet of Things are proving overly optimistic, particularly about the ability of industrial companies to pursue the vision effectively. A survey by the Boston Consulting Group found that while US companies consider digital technologies critical, many lack a holistic adoption strategy and a sense of urgency. A report by KPMG reaches a similar conclusion, highlighting a growing gap between executive ambitions and the corresponding transformative action plans.
To a great extent, the excitement and promise of growth are tempered by lingering concerns about IoT network security and data privacy. Others are still uncertain how to go about articulating comprehensive business models and return on investment.
What Does the Industrial IoT Mean for Product Designers?
What does the industrial Internet of Things mean for the designers of connected products that enable new customer engagement models? Are IoT “things” just like any other industrial equipment, only connected to the Internet? Or are there certain design and technical elements that business planners and design engineers should consider?
To understand the relationships between the Internet of Things and product design, we need to consider three layers of responses:
Design for IoT
At its most fundamental level, designing products for IoT concentrates on incorporating basic telemetry features such as sensor electronics and Internet connectivity, and, rather obviously, the necessary mechanisms to secure these devices from rogue access and malicious hacking.
Design for the Business of IoT
A less obvious observation, often missed by IoT enthusiasts, is that the product architecture and features must be aligned with business operations. Designers should adopt a business-centric point of view and optimize features and capabilities specifically to achieve the intended business outcome.
For instance, a design to maximize system uptime requires not only remote monitoring capabilities, but could also include optimizing field-replaceable unit (FRU) granularity to streamline field service operations, spare parts inventory, and workforce availability and training.
Design by IoT
But there is much more to the question about the relationship between the Internet and the “things.”
Most engineering organizations lose sight of their products once they are sold or installed in the field. Always-connected products and customers provide a nonstop stream of structured and unstructured information about products, services, and user interactions. This rich feedback from diverse connected ecosystems, including social media, enables faster and more precise design iterations and effective continuous improvement. In essence, the IoT is driving product design!
Seven Industrial IoT Predictions for 2017 and Beyond
With its growing prevalence, the Internet of Things is ushering in a new form of ecommerce – the Commerce of Things, where everyday objects are internet connected and capable of initiating a series of purchases on their own. This new way of buying and selling online is radically changing traditional ecommerce rules and creating a new set of challenges for companies. In this new world of commerce, the product sale is no longer just a transaction; it’s the beginning of an ongoing relationship between brands and customers. Successful online brands are focused on nurturing this relationship – and taking deliberate steps to turn transactional customers into loyal members.
There is a subtle but critical difference between a repeat customer and a member. Understanding this difference is the key to succeeding in an environment that is swiftly becoming a hyper-connected network of consumers who value the access and amenities that come with membership.
How do you build these relationships?
1.) Create lasting relationships to make members out of customers. Members share the experience and the story of the brand, rather than just execute a basic business transaction or product purchase. For years, Disney, where everything is a show and employees are cast members, has stood by the adage “Be Our Guest,” calling to their customers in a more intimate, personable way. Cable companies refer to their customers as “subscribers;” LinkedIn has always called users “members.”
To move customers from “transaction to membership” on a relationship continuum, companies must provide extra, incremental value that replaces pure monetary benefits with more intangible rewards of being, in Disney’s case, a guest.
2.) Use data and metrics to strengthen relationships. Once a company starts to grow its base of members, a whole new set of metrics becomes the benchmark for evaluating the customer relationship.
Asking one simple question, “What is a subscriber’s actual usage?” can yield revelations regarding whether someone is a transactional customer or an invested member. For example, January is the peak season for signing new members at fitness centers around the country. Are those who sign up then really members? If they are not actually getting personal value out of their membership, then the relationship remains transactional and fleeting at best.
Good data is powerful. If the data shows customers are not acting like members, then a company can follow up to discern the true nature of the relationship and figure out how it can become more valuable to the customer. This creates a win for both the customer and the company.
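The fitness-center example above can be made concrete with a simple usage metric. A minimal sketch of classifying customers as invested members versus transactional customers from visit data (the four-visits-per-month threshold and the data shape are illustrative assumptions, not metrics from any real loyalty program):

```python
# Classify customers as "member" vs "transactional" based on actual usage.
# The threshold of 4 visits/month is an illustrative assumption.
def classify(visits_per_customer, threshold=4):
    """Map each customer name to a relationship label based on visit count."""
    return {name: ("member" if visits >= threshold else "transactional")
            for name, visits in visits_per_customer.items()}

# Example: January sign-ups and their visit counts for the month.
january_signups = {"alice": 12, "bob": 1, "carol": 5}
print(classify(january_signups))
# -> {'alice': 'member', 'bob': 'transactional', 'carol': 'member'}
```

A real program would combine several signals (visit frequency, recency, engagement with communications) rather than a single count, but the principle is the same: let observed behavior, not the sign-up transaction, define the relationship.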
Delta Airlines’ SkyMiles program, for example, makes great use of data to cut through barriers that could otherwise prevent strong relationships from developing. When members call in, the automated phone system quickly recognizes callers based on their phone numbers, addresses them by name and asks about recent or upcoming trips.
Personalizing interactions, continually making improvements and utilizing customer insights are key in this new, Commerce of Things world. Taking these steps can help transform transactional customers into loyal members – and take an online business to the next level.
There is an ongoing transition from a world where having an internet connection was sufficient, to a world where ubiquitous connectivity is quickly becoming the norm. The ability to gather and transport data at high speeds from anywhere is leading to increased automation, smart-everything (vehicles, homes, appliances – you name it), and a standardization of languages and protocols that make the possibilities nearly endless.
Recently, IEEE and Eclipse Foundation completed surveys that provided a snapshot of the tools, platforms and solutions being used by engineers and programmers alike to build the Internet of Things. According to Joe McKendrick for RTInsights.com, there were several notable conclusions to be drawn from the results, including the revelation that, of the 713 tech professionals surveyed, nearly 42 percent said their companies currently deploy an IoT solution, and 32 percent said they will be deploying/working with an IoT solution over the next 18 months. Additionally, RT Insights writes:
“In terms of areas of concentration, 42% report they are working with IoT-ready middleware, while 41% are concentrating on home automation solutions. Another 36% are working with industrial automation as part of their IoT efforts. One-third are working on IoT for smart cities, and the same number are building smart energy solutions.”
An interesting note from those conclusions is that 36 percent are working with industrial automation as part of their IoT efforts. Earlier this year, we predicted that Industrial IoT (IIoT) app development would outpace consumer IoT apps, and although this sample size is somewhat limited, it still bodes well for the development of the IIoT sector that is just starting to come into its own.
Among IoT developers, there has been a bit of debate over the programming languages that best suit IoT apps. There are situationally appropriate uses for the main languages, but currently, the majority of developers prefer Java and the C language. For developers, being able to build out IoT apps that can work across platforms is a giant step toward standardization. Specifically, in the Industrial IoT, being able to build apps that can function at the Edge to enable smart data collection is becoming an unofficial mandate for any companies hoping to transition legacy OT operations into the IT/OT convergence movement taking place across critical industries.
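"Smart data collection" at the edge usually means filtering sensor readings locally so that only meaningful changes travel upstream. One common approach is a deadband filter; a minimal sketch (the tolerance value and sample data are illustrative assumptions):

```python
# Deadband filter: forward a sensor reading only when it differs from the
# last forwarded value by more than a tolerance, reducing upstream traffic.
def deadband(readings, tolerance=0.5):
    """Yield only readings that move more than `tolerance` from the last sent value."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > tolerance:
            last_sent = value
            yield value

# Example: temperature samples; only significant changes are forwarded.
samples = [20.0, 20.1, 20.2, 21.0, 21.1, 19.9]
print(list(deadband(samples)))  # -> [20.0, 21.0, 19.9]
```

On a constrained edge device this same logic would typically be written in C or Java, per the language preferences the survey reports, but the filtering principle is identical.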
Of course, building apps is a meaningless task if the hardware being deployed can’t host those apps, a finding that was demonstrated by the survey:
Hardware associated with IoT implementations includes sensors, used at 87% of sites, along with actuators (51%), gateways and hub devices (50%), and edge node devices (36%).
This Edge functionality and sensor deployment are two pieces that are driving the adoption of IoT technology across industries that have traditionally relied on data as the main tool for decision making. However, with smarter hardware, these industries now have the opportunity to improve the efficiency of that decision making – a transformative capability in the industrial realm.
Join FreeWave’s ZumLink IPR Pilot Program!
What if you could…
- Collect, analyze and react to data in real-time at the sensor edge?
- Reduce BIG DATA that clogs data pipelines?
- Minimize the cost of expensive PLCs?
- Control your sensor at the closest touchpoint?
The ZumLink IPR App Server Radio combines 900 MHz wireless telemetry with the ability to program and host 3rd party Apps for intelligent control and automation of remote sensors and devices. To participate in the pilot program, visit: http://www.freewave.com/zumlink-ipr-pilot-program/.
Pilot Program participants:
- Receive a complimentary hardware/software Dev Kit
- Get support from FreeWave software engineers
- Should have app development skills

When applying, be prepared to describe:
- Use cases that would help you or your organization solve a problem
- Problems you would like to solve
- Developers who could build this App