


As if the Internet of Things (IoT) were not complicated enough, Cisco's marketing team introduced its Fog Computing vision in January 2014, a concept that other, more purist vendors call Edge Computing.

Given Cisco's frantic activity in its Internet of Everything (IoE) marketing campaigns, it is not surprising that many bloggers have resorted to shocking headlines on the subject, taking advantage of the hype around the IoT.

I hope this post helps you better understand the role of Fog Computing in the IoT Reference Model, and how companies are using IoT intelligent gateways in the fog to connect the "Things" to the cloud, through some application areas and examples of Fog Computing.

The problem with the cloud

As the Internet of Things proliferates, businesses face a growing need to analyze data from sources at the edge of a network, whether mobile phones, gateways, or IoT sensors. Cloud computing has a disadvantage: It can’t process data quickly enough for modern business applications.

The IoT owes its explosive growth to the connection of physical things and operational technology (OT) to analytics and machine learning applications, which can help glean insights from device-generated data and enable devices to make "smart" decisions without human intervention. Currently, such resources are mostly provided by cloud service providers, where the computation and storage capacity exists.

However, despite its power, the cloud model is not applicable in environments where operations are time-critical or internet connectivity is poor. This is especially true in scenarios such as telemedicine and patient care, where milliseconds can have fatal consequences. The same can be said of vehicle-to-vehicle communications, where collision and accident prevention cannot afford the latency of a round trip to a cloud server.

“The cloud paradigm is like having your brain command your limbs from miles away — it won’t help you where you need quick reflexes.”

Moreover, having every device connected to the cloud and sending raw data over the internet can have privacy, security and legal implications, especially when dealing with sensitive data that is subject to separate regulations in different countries.

IoT nodes are closer to the action, but for the moment, they do not have the computing and storage resources to perform analytics and machine learning tasks. Cloud servers, on the other hand, have the horsepower, but are too far away to process data and respond in time.

The fog layer is the perfect junction where there are enough compute, storage and networking resources to mimic cloud capabilities at the edge and support the local ingestion of data and the quick turnaround of results.
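To make the idea concrete, here is a minimal sketch of a fog node's local decision loop. All names and the threshold are illustrative assumptions, not any vendor's API: time-critical readings trigger an immediate local action, while everything else is queued for later upload to the cloud.

```python
# Hypothetical fog-node routing: act locally on urgent readings, defer the rest.

TEMP_LIMIT = 90.0  # assumed critical threshold, degrees C

def handle_reading(reading, local_actions, cloud_queue):
    """Route one sensor reading: act at the edge if urgent, else send to cloud."""
    if reading["temp"] > TEMP_LIMIT:
        # Quick turnaround: no round trip to a remote server.
        local_actions.append(("shutdown_valve", reading["sensor_id"]))
    else:
        cloud_queue.append(reading)

local_actions, cloud_queue = [], []
for r in [{"sensor_id": "s1", "temp": 72.4},
          {"sensor_id": "s2", "temp": 95.1},
          {"sensor_id": "s3", "temp": 68.0}]:
    handle_reading(r, local_actions, cloud_queue)

print(local_actions)     # the urgent reading is acted on at the edge
print(len(cloud_queue))  # non-urgent readings are deferred
```

The point is the routing decision itself: the reflex happens where the data is produced, and only the non-urgent remainder travels upstream.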

The variety of IoT systems and the need for flexible solutions that respond to real-time events quickly make Fog Computing a compelling option.

Fog Computing: oh my God, another layer in IoT!

A study by IDC estimates that by 2020, 10 percent of the world’s data will be produced by edge devices. This will further drive the need for more efficient fog computing solutions that provide low latency and holistic intelligence simultaneously.

“Computing at the edge of the network is, of course, not new -- we've been doing it for years to solve the same issue with other kinds of computing.”

Fog Computing, or Edge Computing, is a paradigm championed by some of the biggest IoT technology players, including Cisco, IBM, and Dell. It represents an architectural shift in which intelligence is pushed from the cloud to the edge, localizing certain kinds of analysis and decision-making.

Fog Computing enables quicker response times, unencumbered by network latency, and reduces traffic by selectively relaying only the appropriate data to the cloud.

The concept of Fog Computing attempts to transcend some of these physical limitations: instead of sending vast amounts of IoT data to the cloud, processing happens on nodes physically closer to where the data is originally collected.

Photo source: electronicdesign.com (Cisco Fog Computing diagram)

The OpenFog Consortium

The OpenFog Consortium was founded on the premise that open architectures and standards are essential for the success of a ubiquitous Fog Computing ecosystem.

The collaboration among tech giants such as ARM, Cisco, Dell, GE, Intel, Microsoft and Schneider Electric to define an open, interoperable Fog Computing architecture is, without any doubt, good news for a vibrant supplier ecosystem.

The OpenFog Reference Architecture is an architectural evolution from traditional closed systems and the burgeoning cloud-only models to an approach that moves computation nearest the edge of the network when dictated by business concerns or by the functional requirements of critical applications.

The OpenFog Reference Architecture consists of putting micro data centers or even small, purpose-built high-performance data analytics machines in remote offices and locations in order to gain real-time insights from the data collected, or to promote data thinning at the edge, by dramatically reducing the amount of data that needs to be transmitted to a central data center. Without having to move unnecessary data to a central data center, analytics at the edge can simplify and drastically speed analysis while also cutting costs.
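The "data thinning" idea above can be sketched in a few lines. This is an illustrative reduction, not part of the OpenFog specification: a window of raw samples is collapsed into one summary record before transmission, so the central data center receives a fraction of the raw volume.

```python
# Illustrative data thinning at the edge: summarize a window of raw samples.

def thin(samples):
    """Reduce a window of raw sensor samples to one compact summary record."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }

raw_window = [20.1, 20.3, 19.8, 35.6, 20.0]  # five raw readings
summary = thin(raw_window)
print(summary)  # one record transmitted instead of five raw samples
```

In practice the summary would also keep whatever the application deems "relevant" (e.g. the outlier 35.6 above survives as the max), while the bulk of unremarkable samples never leaves the edge.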

Benefits of Fog Computing

  • Frees up network capacity - Fog computing uses much less bandwidth, which means it doesn't cause bottlenecks and similar congestion. Less data movement on the network frees up capacity that can then be used for other things.
  • It is truly real-time - Fog computing responds much faster than any cloud computing architecture we know today. Since data analysis is done on the spot, it represents a true real-time concept, making it a perfect match for the needs of the Internet of Things.
  • Boosts data security - Collected data is more secure when it doesn't travel. Keeping data in its country of origin also simplifies storage, since sending data abroad might violate certain laws.
  • Analytics is done locally - Fog computing enables developers to access the most important IoT data from other locations, while keeping piles of less important information in local storage.

Disadvantages of Fog Computing

  • Some companies don't like their data being off-premises - With Fog Computing, lots of data is stored on the devices themselves (often located outside company offices), which part of the developer community perceives as a risk.
  • The whole system sounds a little confusing - A concept that includes a huge number of devices, located all around the world, each storing, analyzing and sending its own data, can sound utterly confusing.

Read more: http://bigdata.sys-con.com/node/3809885

Examples of Fog Computing

The applications of fog computing are many, and it is powering crucial parts of IoT ecosystems, especially in industrial environments. See below some use cases and examples.

  • Thanks to the power of fog computing, New York-based renewable energy company Envision has been able to obtain a 15 percent productivity improvement from the vast network of wind turbines it operates. The company is processing as much as 20 terabytes of data at a time, generated by 3 million sensors installed on the 20,000 turbines it manages. Moving computation to the edge has enabled Envision to cut down data analysis time from 10 minutes to mere seconds, providing them with actionable insights and significant business benefits.
  • Plat One is another firm using fog computing to improve data processing for the more than 1 million sensors it manages. The company uses the Cisco-ParStream platform to publish real-time sensor measurements for hundreds of thousands of devices, including smart lighting and parking, port and transportation management and a network of 50,000 coffee machines.
  • In Palo Alto, California, a $3 million project will enable traffic lights to integrate with connected vehicles, hopefully creating a future in which people won’t be waiting in their cars at empty intersections for no reason.
  • In transportation, it’s helping semi-autonomous cars assist drivers in avoiding distraction and veering off the road by providing real-time analytics and decisions on driving patterns.
  • It also can help reduce the transfer of gigantic volumes of audio and video recordings generated by police dashboard and video cameras. Cameras equipped with edge computing capabilities could analyze video feeds in real time and only send relevant data to the cloud when necessary.
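The dashboard-camera case can be illustrated with a toy filter. This is not any vendor's implementation: an edge device compares each frame to the previous one and only "uploads" frames whose change exceeds a threshold, so hours of quiet footage never leave the device. Frames here are simplified to short rows of grayscale pixel values.

```python
# Toy edge video filter: only frames with significant change are uploaded.

def frame_diff(a, b):
    """Mean absolute pixel difference between two equal-length frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

THRESHOLD = 10  # assumed motion threshold

frames = [
    [10, 10, 10, 10],   # static scene
    [10, 11, 10, 10],   # sensor noise only
    [90, 95, 88, 92],   # something moved
]

uploaded = []
prev = frames[0]
for frame in frames[1:]:
    if frame_diff(prev, frame) > THRESHOLD:
        uploaded.append(frame)  # relevant: send to the cloud
    prev = frame

print(len(uploaded))  # only the frame with real motion is sent
```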

See more at: Why Edge Computing Is Here to Stay: Five Use Cases, by Patrick McGarry

What is the future of fog computing?

The current trend shows that fog computing will continue to grow in usage and importance as the Internet of Things expands and conquers new grounds. With inexpensive, low-power processing and storage becoming more available, we can expect computation to move even closer to the edge and become ingrained in the same devices that are generating the data, creating even greater possibilities for inter-device intelligence and interactions. Sensors that only log data might one day become a thing of the past.

Janakiram MSV wondered whether Fog Computing will be the next big thing in the Internet of Things. It seems obvious that while the cloud is a perfect match for the Internet of Things, there are other scenarios and IoT solutions that demand low-latency ingestion and immediate processing of data, and there Fog Computing is the answer.

Does the fog eliminate the cloud?

Fog computing improves efficiency and reduces the amount of data that needs to be sent to the cloud for processing. But it’s here to complement the cloud, not replace it.

The cloud will continue to have a pertinent role in the IoT cycle. In fact, with fog computing shouldering the burden of short-term analytics at the edge, cloud resources will be freed to take on the heavier tasks, especially where the analysis of historical data and large datasets is concerned. Insights obtained by the cloud can help update and tweak policies and functionality at the fog layer.

And there are still many cases where the centralized, highly efficient computing infrastructure of the cloud will outperform decentralized systems in performance, scalability and costs. This includes environments where data needs to be analyzed from largely dispersed sources.

“It is the combination of fog and cloud computing that will accelerate the adoption of IoT, especially for the enterprise.”

In essence, Fog Computing allows for big data to be processed locally, or at least in closer proximity to the systems that rely on it. Newer machines could incorporate more powerful microprocessors, and interact more fluidly with other machines on the edge of the network. While fog isn’t a replacement for cloud architecture, it is a necessary step forward that will facilitate the advancement of IoT, as more industries and businesses adopt emerging technologies.

'The Cloud' is not Over

Fog computing is far from a panacea. One of the immediate costs associated with this method is equipping end devices with the hardware needed to perform calculations remotely and independently of centralized data centers. Some vendors, however, are in the process of perfecting technologies for that purpose. The tradeoff is that by investing in such solutions now, organizations will avoid having to frequently update their infrastructure and networks to cope with ever-increasing data volumes as the IoT expands.

There are certain data types and use cases that actually benefit from centralized models. Data that carries the utmost security concerns, for example, will require the secure advantages of a centralized approach or one that continues to rely solely on physical infrastructure.

Though the benefits of Fog Computing are undeniable, the Cloud has a secure future in IoT for most companies with less time-sensitive computing needs and for analysing all the data gathered by IoT sensors.

 


Using Data Science for Predictive Maintenance

Remember a few years ago, when the National Highway Traffic Safety Administration announced two recalls, for GM and Tesla, both related to problems that could cause fires? These cost enormous sums to resolve.
Aerospace, rail, equipment manufacturers and automakers often face the challenge of ensuring maximum availability of critical assembly-line systems and keeping those assets in good working order, while simultaneously minimizing the cost of maintenance and of time- or count-based repairs.
Identification of the root causes of faults and failures must also happen without the need for a lab or testing. As more vehicles, industrial equipment and assembly robots begin to communicate their current status to a central server, detection of faults becomes easier and more practical.
Early identification of these potential issues helps organizations deploy maintenance teams more cost-effectively and maximize parts and equipment uptime. The critical factors that help predict failure may be deeply buried in structured data such as equipment year, make, model and warranty details, and in unstructured data covering millions of log entries, sensor data, error messages, odometer readings, speed, engine temperature, engine torque, acceleration, and repair and maintenance reports.
Predictive maintenance, a technique to predict when an in-service machine will fail so that maintenance can be planned in advance, encompasses failure prediction, failure diagnosis, failure type classification, and recommendation of maintenance actions after failure.
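A real predictive maintenance system would use trained models over the data sources listed above; as a minimal stand-in, here is an illustrative k-sigma drift rule. The window, threshold and sensor values are assumptions for the sketch: a machine is flagged when a reading deviates far from its recent healthy baseline, before outright failure.

```python
# Minimal failure-prediction rule (illustrative, not a production model):
# flag a reading that drifts beyond k standard deviations of recent history.

from statistics import mean, stdev

def needs_maintenance(history, latest, k=3.0):
    """True if `latest` deviates more than k sigma from the rolling window."""
    mu, sigma = mean(history), stdev(history)
    return abs(latest - mu) > k * sigma

vibration = [0.51, 0.49, 0.50, 0.52, 0.48]  # healthy baseline, mm/s
print(needs_maintenance(vibration, 0.50))   # normal reading
print(needs_maintenance(vibration, 0.95))   # drifting: schedule a repair
```

The same shape generalizes: replace the threshold rule with a classifier trained on labeled failure histories, and the output becomes a failure-type prediction rather than a binary flag.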
Business benefits of Data Science with predictive maintenance:
  • Minimize maintenance costs - Don’t waste money through over-cautious time bound maintenance. Only repair equipment when repairs are actually needed.
  • Reduce unplanned downtime - Implement predictive maintenance to predict future equipment malfunctioning and failures and minimize the risk for unplanned disasters putting your business at risk.
  • Root cause analysis - Find causes for equipment malfunctions and work with suppliers to switch-off reasons for high failure rates. Increase return on your assets.
  • Efficient labor planning — no time wasted replacing/fixing equipment that doesn’t need it
  • Avoid warranty cost for failure recovery – thousands of recalls in case of automakers while production loss in assembly line

Trenitalia has invested 50 million euros in an Internet of Things project expected to cut maintenance costs by up to 130 million euros while increasing train availability and customer satisfaction.

Rolls-Royce is teaming up with Microsoft, using Azure cloud-based streaming analytics to predict engine failures and ensure the right maintenance.
Sudden machine failures can ruin the reputation of a business, resulting in potential contract penalties and lost revenue. Data Science can help in real time, and ahead of time, to avoid all this trouble.

Top 5 Industrial IoT use cases

The industrial IoT has already proven its versatility with deployments going live in a number of enterprises, showing off dozens of different use cases. But a few key uses consistently present themselves within the same trade, and even throughout different industries.

Top 5 industrial IoT use cases

It’s important to note that IoT use cases will likely expand in the next few years. That being said, we have compiled the top five industrial IoT use cases of today:

Predictive maintenance

Keeping assets up and running has the potential to significantly decrease operational expenditures (opex) and save companies millions of dollars. With the use of sensors, cameras and data analytics, managers in a range of industries can determine when a piece of equipment will fail before it ever does. These IoT-enabled systems can sense warning signs, use data to create a maintenance timeline and preemptively service equipment before problems occur.

By leveraging streaming data from sensors and devices to quickly assess current conditions, recognize warning signs, deliver alerts and automatically trigger appropriate maintenance processes, IoT turns maintenance into a dynamic, rapid and automated task.

This approach promises cost savings over routine or time-based preventive maintenance, because tasks are performed only when they are needed. The key is to get the right information at the right time: managers know which equipment needs maintenance, maintenance work can be better planned, and systems remain online while workers stay on task. Other potential advantages include increased equipment lifetime, increased plant safety and fewer accidents with a negative impact on the environment.

Smart metering

A smart meter is an internet-capable device that measures energy, water or natural gas consumption of a building or home, according to Silicon Labs.

Traditional meters only measure total consumption, whereas smart meters record when and how much of a resource is consumed. Power companies are deploying smart meters to monitor consumer usage and adjust prices according to the time of day and season.
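Because a smart meter records when each unit is consumed, time-of-day pricing reduces to billing each reading at the tariff for its hour. The sketch below is illustrative only: the peak window and rates are made-up values, not any utility's actual tariff.

```python
# Illustrative time-of-use billing over hourly smart-meter readings.

PEAK_HOURS = range(17, 21)               # assumed peak window, 5pm-9pm
RATES = {"peak": 0.30, "offpeak": 0.12}  # $/kWh, made-up tariffs

def bill(hourly_kwh):
    """hourly_kwh: list of (hour_of_day, kwh) readings; returns total charge."""
    total = 0.0
    for hour, kwh in hourly_kwh:
        rate = RATES["peak"] if hour in PEAK_HOURS else RATES["offpeak"]
        total += kwh * rate
    return round(total, 2)

readings = [(8, 1.0), (13, 2.0), (18, 3.0), (23, 1.0)]
print(bill(readings))  # the 6pm usage is billed at the higher peak rate
```

A traditional meter would only report the 7 kWh total; the per-hour detail is exactly what makes differentiated pricing, and demand shifting, possible.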

Smart metering benefits utilities by improving customer satisfaction through faster interaction, and it gives consumers more control over their energy usage, helping them save money and reduce carbon emissions. Smart meters also give visibility of power consumption all the way to the meter, so utilities can optimize energy distribution and take action to shift demand loads.

According to Sierra Wireless, smart metering helps utilities to:

  • Reduce operating expenses by managing manual operations remotely
  • Improve forecasting and streamline power consumption
  • Improve customer service through profiling and segmentation
  • Reduce energy theft
  • Simplify micro-generation monitoring and track renewable power

Asset tracking

A study on the maturity of asset efficiency practices from Infosys and the Institute for Industrial Management (FIR) at Aachen University revealed that 85% of manufacturing companies globally are aware of asset efficiency, but only 15% of the surveyed firms have implemented it at a systematic level.

Source: Actsoft

Infosys and other supporting companies including Bosch, GE, IBM, Intel, National Instruments and PTC have launched a testbed with the main goal of collecting asset information efficiently and accurately in real-time and running analytics to allow the firms to make the best decisions.

The goal of asset tracking is to allow an enterprise to easily locate and monitor key assets (e.g. raw materials, final products, and containers) and to optimize logistics, maintain inventory levels, prevent quality issues and detect theft.

One industry that heavily relies on asset tracking is maritime shipping. On a large scale, sensors help track the location of a ship at sea, and on a smaller scale they are able to provide the status and temperature of individual cargo containers. One benefit is real-time metrics on refrigerated containers. These containers must be stored at constant temperatures so that perishable goods remain fresh.

Each refrigerated container needs to be equipped with temperature sensors, a processing unit and a mobile transmitter.
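A hedged sketch of what that per-container monitoring amounts to: each temperature reading is checked against the cargo's allowed band, and an excursion raises an alert. The container ID and temperature band are hypothetical.

```python
# Illustrative cold-chain check: flag readings outside the cargo's safe band.

SAFE_BAND = (-1.0, 4.0)  # assumed allowed range in degrees C

def check_container(container_id, temps, band=SAFE_BAND):
    """Return alert messages for any readings outside the safe band."""
    low, high = band
    return [f"{container_id}: {t} C out of range"
            for t in temps if not (low <= t <= high)]

alerts = check_container("REEF-042", [2.5, 3.9, 6.2, 3.0])
print(alerts)  # the 6.2 C excursion triggers an alert
```

The processing unit on the container would run a check like this locally and use the mobile transmitter only for the alerts, rather than streaming every reading ashore.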

To continue reading, please visit the full article on Industrial IoT & 5G

 

The greatest advantage we have today is our ability to communicate with one another.
The Internet of Things, also known as IoT, allows machines, computers, mobile and other smart devices to communicate with each other, thanks to tags and sensors that collect data, which can be used to our advantage in numerous ways.
IoT has taken the Digital Transformation by storm. It is estimated that 50 billion devices will be connected to the Internet worldwide by 2020.
Let us have the Good news first:
  • Smart cars will communicate with traffic lights to improve traffic flow, find a parking spot, and lower insurance rates based on telematics data
  • Smart homes will have connected controls for temperature and electricity, plus cameras for safety and for watching over your kids
  • Smart healthcare devices will remind patients to take their medication, tell doctors when a refill is needed, help curb diabetic attacks, and monitor symptoms to help prevent disease in real time, including in remote areas
  • Smart cities and smart industries are the buzzwords in the IT policies of many governments
  • With sensors and IoT-enabled robots used in manufacturing, new products could cost less in the future, promoting better standards of living across all household income levels
  • Hyper-personalization - with Bluetooth, NFC and Wi-Fi, all the connected devices can serve specifically tailored advertising based on the preferences of the individual
  • Real-time alerts in daily life - the Egg Minder tray holds 14 eggs in your refrigerator and sends a wireless signal to your phone to let you know how many eggs are in it and which ones are going bad

Now here are the Bad things:

  • There are no international compatibility standards at the macro level for the Internet of Things
  • There is no cross-industry technology reference architecture that would allow for true interoperability and ease of deployment
  • Much mundane work can be transferred to robots, with a potential loss of jobs
  • Smart connected devices are expensive - the Nest learning thermostat costs about $250, versus $25 for a standard one that gets the job done, and a Philips wirelessly controlled light costs $60, so remotely controlling your household will be a huge expense

And the Ugly part:

  • Remember the "fire sale" in the Die Hard movie: a cyber-attack on the nation's computer infrastructure, shutting down transportation systems, disabling financial systems and turning off public utilities. Cyber-attacks can become common when devices are sold without properly updated software for connectivity
  • Your life is open to hackers, who can intercept your communications with individual devices and invade your privacy. Imagine a criminal who can hack your smart metering utility system, identify when usage drops, and assume that means nobody is home
  • Imagine getting into your fully connected self-driving car when, after some hacking, a stalker's voice comes over the speaker saying "you have been taken", and Liam Neeson is nowhere nearby to rescue you

All consumer digital footprints can be mined, aggregated and analyzed via Big Data to predict your presence, intent, sentiment and behavior, which can be used in good ways and bad.
We just need to manage the safety and privacy concerns to make sure we can receive the full benefits of this technology without assuming unnecessary risks.