


The other day we were discussing and debating a solution to meet the sensing needs for access, temperature and humidity for some devices which form part of a networking infrastructure ecosystem. The idea was to build an IoT-based system for monitoring and control.

The design discussions veered around the ability to collect data from the sensors and the types of short-range communication protocols which could be deployed. Questions and clarifications were raised about whether we were compliant in using short-range communication protocols in sensitive areas such as customer data centres, whose operators may themselves be custodians of data belonging to their own end customers.

The hidden perils of data acquisition and data ownership reared their head and needed to be addressed as we moved forward.

The data which is acquired by sensors is essentially Machine Generated Data (MGD). This post dwells on the subject of data ownership of MGD as follows:

  1. Sensors (Data Acquisition and Communication)
  2. Machine Generated Data
  3. The Lifecycle of the MGD and the Ownership Paradigm
  4. Who should be the owner of the MGD?

  1. Sensors (Data Acquisition and Communication):

In the IoT ecosystem, the physical computing frontier is managed by the sensors. Sensors essentially perform three fundamental functions:

  • Sensing and acquiring the data
  • Communicating the data through appropriate protocols to internet cloud services for further aggregation and trend analysis
  • Powering the above through a power supply

Additional functions would include processing/system management and a user interface.

The Digital Computing part comprises the IoT application. This is determined by the types of sensors, cloud connectivity, power sources, and (optionally) the user interface used in an IoT sensor device. The following diagram showcases the primacy of sensors in a typical IoT ecosystem.

When making physical measurements such as temperature, strain, or pressure, we need a sensor to convert the physical properties into an electrical signal, usually voltage. Then, the signal must be converted to the proper amplitude and filtered for noise before being digitized, displayed, stored, or used to make a decision. Data-acquisition systems use ADCs (analog-to-digital converters) to digitize the signals with adequate signal conditioning.
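To make that signal chain concrete, here is a minimal sketch in Python. The constants are hypothetical (a 10-bit ADC, a 3.3 V reference, and a TMP36-style linear temperature sensor with a 500 mV offset and 10 mV per degree C); substitute the values from your own hardware's datasheet.

```python
# Illustrative only: turn raw ADC counts into filtered temperature readings.
ADC_BITS = 10        # assumed 10-bit converter
V_REF = 3.3          # assumed ADC reference voltage in volts

def counts_to_volts(raw_counts: int) -> float:
    """Map a raw ADC reading (0 .. 2^bits - 1) to a voltage."""
    return raw_counts * V_REF / (2 ** ADC_BITS - 1)

def volts_to_celsius(volts: float) -> float:
    """TMP36-style transfer function: 0.5 V offset, 10 mV per degree C."""
    return (volts - 0.5) * 100.0

def smooth(samples, window=5):
    """Crude moving-average filter to knock down noise before use."""
    return [sum(samples[max(0, i - window + 1):i + 1]) /
            len(samples[max(0, i - window + 1):i + 1])
            for i in range(len(samples))]

raw_readings = [310, 312, 309, 340, 311]      # hypothetical samples
temps = [volts_to_celsius(counts_to_volts(r)) for r in raw_readings]
print(smooth(temps))
```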

Sensor data communication to the cloud can be done in multiple ways, from wireline to wireless communication of varying complexity. While wireline communication has some important benefits (such as reliability, privacy, and power delivery over the same wires), wireless communication is the key catalyst in the majority of IoT applications that were not previously practical with wired systems. Reliability, channel security, long range, low power consumption, ease of use, and low cost are now reaching levels previously thought infeasible.

Some examples of recently popular IoT wireless communication types: Wi-Fi, Bluetooth Low Energy (aka Bluetooth Smart), Zigbee (and other 802.15.4 mesh variants), cellular, LPWA (Low-Power Wide-Area network variants: Ingenu, LoRaWAN, Sigfox, NB-LTE, Weightless), and Iridium satellite.
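As an illustration of the communication leg, here is a minimal sketch of publishing one sensor reading to a cloud MQTT broker using the paho-mqtt library. The broker host, topic and payload fields are placeholders, not a real deployment.

```python
# Minimal telemetry publish sketch (pip install paho-mqtt).
import json
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.com"   # hypothetical broker address
TOPIC = "site1/rack42/environment"   # hypothetical topic

# paho-mqtt 1.x constructor; version 2.x also expects a
# mqtt.CallbackAPIVersion as the first argument.
client = mqtt.Client()
client.connect(BROKER_HOST, 1883, keepalive=60)
client.loop_start()                  # network I/O in a background thread

reading = {
    "device_id": "sensor-007",
    "timestamp": time.time(),
    "temperature_c": 23.4,
    "humidity_pct": 41.0,
}
# QoS 1 means the broker must acknowledge receipt -- a common
# choice for telemetry that should not be silently lost.
client.publish(TOPIC, json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()
```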

  2. Machine Generated Data (MGD):

Sensor data is an integral component of the increasing reality of the Internet of Things (IoT) environment. With IPv6, anything can be outfitted with a unique IP address and the capacity to transfer data over a network. Sensor data is essentially Machine Generated Data: MGD is data produced entirely by devices or machines, through an event or observation.

Human-generated data, by contrast, records the direct result of human choices. Examples are buying on the web, making an inquiry, filling in a form, or making payments, with corresponding updates to a database. We will not consider the ownership of such data here and will limit this post to MGD.

  3. The Journey of the MGD and the Ownership Paradigm:

The typical journey of Machine Generated Data passes through distinct phases (a short sketch after the list ties them together):

Capture and Acquisition of Data – a machine- or device-based function, through signal reception.

Processing and Synthesis of Data – the function which ensures enrichment and integration of the data.

Publication of the Data – done by expert systems and analysts who work on exception management, triggers and trends.

Usage of Data – the end user acts on the processed and reported information.

Archival and Purging of Data – essentially done by the data maintenance team, with supervision.
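As promised above, here is a toy sketch (my own illustration, not any standard) of tracking provenance across these phases. Note how many distinct parties, and hence potential ownership claimants, accumulate on a single record; the party names are invented.

```python
# Toy provenance tracking for one MGD record across its lifecycle.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MGDRecord:
    device_id: str                 # capture: the machine that sensed it
    payload: dict
    provenance: List[str] = field(default_factory=list)

    def touched_by(self, party: str, action: str) -> "MGDRecord":
        """Record every party that handles the data at any phase."""
        self.provenance.append(f"{party}: {action}")
        return self

record = MGDRecord("thermo-12", {"temp_c": 21.7})
record.touched_by("DeviceOwnerCo", "capture and acquisition")
record.touched_by("PlatformCo", "processing and synthesis")
record.touched_by("AnalyticsCo", "publication (trend report)")
record.touched_by("EndUserCo", "usage (action taken)")
print(record.provenance)   # four parties, four potential claimants
```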

Now let us dwell on the ownership paradigms. They range from the origination of data, to adding value to the data through transformation, to monetising the data through the insights generated. Interestingly, let us explore whether there is any conclusive method for determining how ownership should be assigned. A number of players may be involved in the journey of the data (the user, the hardware manufacturer, the application developer, the provider of the database architecture and the purchaser of the data, each with a potentially equal claim at different stages of this journey).

  4. Who should be the owner of the MGD?

Let me share the multiple and conflicting views:

The owner of the device which records the data: in essence, the owner of machine-generated data (MGD) is the entity who holds title to the device that records the data. In other words, the entity that owns the IoT device also owns the data produced by that device.

But there is a lack of clarity if the device is leased rather than owned. When real-world constructs such as leaseholds (of, say, servers) come into play, it indeed gets complex and even murky.


The user of the data is the owner: another dimension is that data may be owned by one party and controlled by another. Possession of data does not necessarily equate to title. Through possession there is control; title is ownership. Under what are referred to as usage rights, each time data sets are copied, recopied and transmitted, control of the data follows. There could also be cases where the owner of the device is the user of the data.

The maker of the database, who invests in aggregating, processing and making the data usable, is the owner: this paradigm has a number of takers. The owner of a smart thermostat does not, for example, own the data about how he uses it. The only thing that is 'ownable' is an aggregation or collection of such data, provided there has been a relevant investment in carrying out that aggregation or collection (the individual user is very unlikely to have made that investment). The owner here could be the home automation company. The value generated through this investment could include producing market intelligence and exploiting the insights from the data to build market presence and differentiation.

The purchaser of the data could be the owner: an auto insurance company could buy vehicle-generated data (from the makers of automobiles) and design targeted offerings for specific market segments based on, say, driving behaviour patterns and demographics. This may not be as easy as it seems; see http://joebarkai.com/who-owns-car-data/, which states that the owner of the vehicle, and not the maker of the car, owns the data collected from the electronic data recorder.

The value chain of who owns the data can be a complex one with multiple claimants, and as one aggregates more sources it just gets more complicated. A good example is the making of smart cities, where data can come from multiple layers and operational areas. City authorities make the effort to use the data in areas such as waste management, traffic congestion and air pollution. So does the city authority own the data?

My personal take: if someone in the MGD value chain is making the data usable for a larger good, and in the process may monetize the data to cover the investments, that entity deserves to be the owner of the data, as that is where the value is generated.


Posted on August 14, 2017

Read more…

Are you drowning in Data Lake?

Today more than ever, every business is focusing on collecting data and applying analytics to stay competitive. Big Data analytics has passed the hype stage and has become an essential part of business plans.

Data Lake is the latest buzzword for dumping every element of data you can find, internally or externally. If you Google the term data lake, you will get more than 14 million results. With the entry of Hadoop, everyone wants to dump their silos of data warehouses and data marts and create a data lake.
The idea behind a data lake is to have one central platform to store and analyze every kind of data relevant to the enterprise. With digital transformation, the data generated every day has multiplied several times over, and businesses are collecting this consumer data, Internet of Things data and other data for further analysis.
As storage has become cheaper, more data is being stored in its raw format in the hope of finding nuggets of information, but eventually this becomes difficult. It is like using your smartphone to click photographs left, right and center: when you want to show a specific photograph to someone, it's very difficult to find.
Data lakes, if not maintained properly, have the potential to grow aimlessly, consuming the entire budget. Some companies have their data lakes overflowing from on-premise systems into the cloud.
Most data lakes lack governance, lack the tools and skills to handle large volumes of disparate data, and many lack a compelling business case. The water (the data) in your data lake has to be crystal clear and drinkable, else it will become a swamp.
Before jumping on the bandwagon of creating a data lake that may cost thousands of dollars and take months to implement, you should start by asking these questions:
  • What data do we want to store in the Data Lake?
  • How much data will be stored?
  • How will we access these massive amounts of data and get value from them easily?
Here are some guidelines to avoid drowning in your data lake:
  • First and foremost, create one or more business use cases that lay out exactly what will be done with the data that gets collected. With that exercise you will avoid dumping data that is meaningless.
  • Determine the returns you want to get out of the Data Lake. Developing a data lake is not a casual thing; you need good business benefits coming out of it.
  • Make sure your overall big data and analytics initiatives are designed to exploit the data lake fully and help achieve business goals.
  • Instead of getting caught in vendor traps and their buzzwords, focus on your needs and determine the best way to get there.
  • Deliver the data to a wide audience and gather their feedback while creating value.
There are many cloud vendors to help you build data lakes, such as Microsoft Azure and Amazon S3.
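As a hedged sketch of the "organize before you dump" advice above, here is how raw events might land in an S3-backed lake under date-partitioned keys. The bucket name and prefix are hypothetical, and it assumes boto3 with AWS credentials already configured.

```python
# Land raw IoT readings under raw/<source>/<yyyy>/<mm>/<dd>/ so the
# lake stays navigable instead of becoming a swamp.
import datetime
import json

import boto3

s3 = boto3.client("s3")
BUCKET = "acme-data-lake"                      # hypothetical bucket

def land_raw_event(source: str, event: dict) -> str:
    """Write one raw event as JSON under a date-partitioned key."""
    now = datetime.datetime.utcnow()
    key = f"raw/{source}/{now:%Y/%m/%d}/{now:%H%M%S%f}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event))
    return key

print(land_raw_event("thermostats", {"device": "t-9", "temp_c": 20.5}))
```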
By making data available to data scientists and anyone else who needs it, for as long as they need it, data lakes are a powerful lever for innovation and disruption across industries.
Read more…

20 Job Interview Questions for IoT Professionals

Bill McCabe knows everyone. He has to. He’s a thought leader in IoT, with a particular focus on recruiting. He’s authored dozens of articles on all things IoT and recruitment, and has placed a number of IoT professionals at organizations big and small. We wanted to know in particular, for the IoT job seeker, what are the top 20 questions they should be prepared to answer in their interview. Below is what Bill shared.

  1. What changes in the IoT do you feel are the most groundbreaking?

  2. How would you assess a security concern in our software?  

  3. What was the last training course you took?

  4. What is the most overlooked thing with the IoT during development and deployment?

  5. How will you take our technology to the next level?

  6. What effect will the Internet of Things have on your daily life?

  7. Do you think IoT will be a job killer or a job creator?

  8. What concerns do you have about IoT privacy and security?

  9. What are the differences between the Industrial Internet of Things and the Internet of Things?

  10. What do you think will be the impact of IoT on Smart Cities?

We have 10 more important questions for you to consider in your IoT interview. To see the rest of the questions, become a member of IoT Central (it’s free!) and click here.

Did you get a great, interesting or hard IoT related question during your interview? If so, let us know and we’ll add it to this list. Leave your question in the comments section or email us

 

Read more…

IoT Platforms: The Peak of Inflated Expectations

Gartner recently released their 2017 Emerging Technologies Hype Cycle. Where do IoT Platforms stand? At the peak of inflated expectations!

Do you agree?

Gartner says that the hype cycle reveals three distinct megatrends that will enable businesses to survive and thrive in the digital economy over the next five to 10 years. (See graphic below).

Artificial intelligence (AI) everywhere, transparently immersive experiences and digital platforms are the trends that will provide unrivaled intelligence, create profoundly new experiences and offer platforms that allow organizations to connect with new business ecosystems.

The Emerging Technologies Hype Cycle is unique among Gartner Hype Cycles because it distills insights from more than 2,000 technologies into a succinct set of compelling emerging technologies and trends. This Hype Cycle specifically focuses on the set of technologies that shows promise in delivering a high degree of competitive advantage over the next five to 10 years.

"Enterprise architects who are focused on technology innovation must evaluate these high-level trends and the featured technologies, as well as the potential impact on their businesses," said Mike J. Walker, research director at Gartner. "In addition to the potential impact on businesses, these trends provide a significant opportunity for enterprise architecture leaders to help senior business and IT leaders respond to digital business opportunities and threats by creating signature-ready actionable and diagnostic deliverables that guide investment decisions."

Read more…
Building a system to get value from the Internet of Things (IoT) or Industrial Internet of Things (IIoT) is a complex process involving a number of very distinct components. We need an Architecture for IoT to define the components which are necessary and sufficient for the system to deliver value. An information system only delivers value if it completes the Information Value Chain, causing real-world action to take place in response to the data it collects. This is what the 5D Architecture does. Luckily, every IoT or IIoT system needs to perform the same 5 core functions in order to deliver value, and therefore the architecture of all these systems is — pleasingly — the same!
Read more…

Save IoT, Save The World

When looking for a title for this article, I remembered the famous phrase from the TV series Heroes, "Save the cheerleader, save the world". Sorry if, one more time, I abuse shocking headlines to attract more readers.

Is the Internet of Things (IoT) in danger? In light of the latest events I have attended in Berlin and London, and news like "Intel To Amputate Three Modules For Internet Of Things, Including Joule", I really believe IoT is falling into Gartner's Trough of Disillusionment phase, and we need IoT heroes to push it faster towards the Plateau of Productivity.

The other part of the article's title, "Save the World," may sound pretentious, but the world needs to be saved. This year's hot spring and summer are confirming to even the most disbelieving that global warming is very real (read more at "Global Warming, Christmas and the Internet of Things"), and although I do not believe IoT alone can save our blue planet, after recent events like the Portugal forest fires, IoT can help, and a lot.

If we cannot control runaway pollution of our air and water, the world will end

Source: Ron Lake, Technology Evangelist at Galdos Systems Inc.

Let's go by parts.

Has the Interest in IoT Slowed Down?  Some Symptoms

The IoT no longer fills entire events on its own. Now events like Internet of Things World Europe 2017 or IoT Tech Expo Europe in Berlin need help from other technologies like AR/VR, AI, Blockchain, or 5G to attract exhibitors and visitors.

The heroes of IoT have lost their past evangelizing enthusiasm. What do IoT heroes need to do?

  • The IoT industry heroes need to focus on customer value. It is important that IoT heroes address real pain points rather than creating something gimmicky.
  • IoT heroes cannot do it alone; partnerships with other heroes are absolutely essential for success in the Internet of Things.
  • IoT heroes need to be more creative with new use cases. As sensors continue to decrease in cost and IoT-specific networks get rolled out, everybody expects the number of use cases to increase exponentially.
  • Raise awareness about the major concern: IoT security.
  • IoT heroes should follow the trends by pairing connectivity with AI/Blockchain/AR/VR heroes.

How can IoT save us from world challenges?

Gary Atkinson, Director of Emerging Technologies at ARM, identifies five main challenges that the planet is heading towards:

1. We're running out of agricultural land

2. Water is our rarest commodity

3. Energy needs to be cheaper to be efficient

4. Healthcare is a growing problem

5. Transport: everyone will be able to afford cars, but won't be able to afford to pay for fuel

Save IoT, Save Agricultural land

If IoT agricultural solutions become cheap, have long-lasting batteries (10+ years), and transmit signals over at least 5 miles, then smart farming will become a reality and we will have no excuse not to save agricultural land.


Save IoT, Save the Water

Water is currently the most precious natural resource on planet Earth.

On the occasion of World Water Day, tech giant IBM entered into a pact with Ireland’s Dublin City University for a collaborative R&D pilot to leverage the internet of things (IoT) technology to help protect water.

The IoT could, for instance, help desalination reach cost-effectiveness. India mostly uses pivot irrigation, which means 30% of land is lost and 50 to 60% of water is lost to evaporation. The switch to tape-based irrigation could save two-thirds of the water used.

Back in 2014, HydroPoint Data Systems utilised the Internet of Things (IoT) to help with water conservation efforts. According to the company and its partners, this system saved local people some $137m in expenses and 15 billion gallons of water in the first year alone.


Save IoT, Make Energy renewable and cheaper

Smarter, more efficient energy consumption has been the dream of environmentalists for decades. Now it's possible through the power of Internet of Things devices. Because of their connectivity, energy consumption, such as the power in a commercial building or even a smart home, can be constantly monitored and adjusted.

Energy consumption could be reduced through smarter consumption and the implementation of micro-generation storage. Knowing that lighting is the second biggest consumer of energy (after motors), and that there are about 1 billion streetlights in the world, upgrading streetlight infrastructure would strongly impact world consumption.

Experts say that thanks to the Internet of Things, we can move from about 13 percent aggregate energy efficiency to 40 percent in the next 25 to 30 years.

Creating a new connected economy powered by renewable energy will cause a temporary surge in productivity worldwide as grids are modernized and infrastructure is rolled out. Installing wind and solar is labor intensive, for example, so for two generations, people will have plenty of work to do.

Additional info:

IoT company SkyGrid, based in Melbourne and Sydney, is developing a smart hot-water system in partnership with hot-water company Quantum Energy. The aim is to intelligently control when a building's hot-water systems are switched on, so that energy isn't wasted heating water when no one is around to use it, something that currently wastes as much as 50% of a system's power.

  • EnLight works on streetlight efficiency
  • Freestyle has partnered with engineering firm PowerTec on an intelligent energy grid for Kangaroo Island in South Australia. Sensors and controllers in the grid intelligently manage energy sources to sway energy consumption towards renewables without sacrificing the reliability of the supply.
  • Top 10 Internet of Things companies disrupting the energy industry:
    • PingThings is combining big data and machine learning to change the way that state utility grids operate.
    • Actility employs IoT and machine-to-machine (M2M) communication to reinvent the way the energy sector operates.
    • Tibber is a personal assistant that can regulate a house's energy consumption and buy more energy if the need arises.
    • Wattz is implementing solar power solutions that rely not on the sun's light, but on capturing ambient light from LED and compact fluorescent bulbs to recharge the batteries in IoT devices.
    • Positive Energy uses IoT devices and software to optimize the functional efficiency of industrial buildings and smart homes alike.
    • Smappee allows users to turn devices on and off remotely. It can also monitor solar panel output values and gas and water usage in real time.
    • GasZen allows customers to convert their traditional "dumb", or non-networked, propane tanks into smart tanks that can be monitored remotely by both the gas provider and the user.
    • 75Farenheit, beyond its ability to predict and adapt to changing climates, offers analytics and suggestions on how to make the operation of a building more efficient.
    • Inspire Energy is giving citizens the power to become part of the growing clean energy movement.
    • Verdigris Technologies' primary target is energy consumption and waste.

Save IoT, Save Healthcare

Despite incredible improvements in health since 1950, there are still a number of challenges, which should have been easy to solve.

In a 2016 report by Deloitte we can read “Change is the new normal for the global health care sector. As providers, payers, governments, and other stakeholders strive to deliver effective, efficient, and equitable care, they do so in an ecosystem that is undergoing a dramatic and fundamental shift in business, clinical, and operating models. This shift is being fueled by aging and growing populations; the proliferation of chronic diseases; heightened focus on care quality and value; evolving financial and quality regulations; informed and empowered consumers; and innovative treatments and technologies — all of which are leading to rising costs and an increase in spending levels for care provision, infrastructure improvements, and technology innovations.”

The IoT has brought many exciting advances to healthcare, improving patient experiences, increasing the quality of care provided, as well as updating and streamlining healthcare operations. From digital assistants to ‘smart’ medicine bottles, a new wave of connected devices could help people live independently for longer.

According to Goldman Sachs, IoT functions could produce an estimated $32.4 billion in annual revenue (45% from remote patient monitoring, 37% from telehealth, and 18% from behavior modification). But healthcare IoT not only increases revenue; it also reduces cost by offering a more cost-effective method of managing chronic illness. The $305 billion in estimated savings comes from a combination of chronic disease management and telehealth.


Save IoT, Save Transportation 

I leave this topic for a special post in the coming months.

Key Takeaway: Save IoT, and IoT will enable us to Save the World

As I have commented many times, the IoT is a journey. Those who have been in the race longer know that some stages are easier and others more difficult, but that is no reason to abandon the hard climb.

If we have not yet managed to give the IoT a single definition, it is not surprising that the term could disappear for reasons of business marketing. Nor does it matter that technologies such as AI, VR/AR, robots and blockchain join IoT to solve world problems. We could call it all "Unified Information Technology".

The world of 2017 has some immense problems, but it is scary to think about the challenges of the next 10, 20, 50 years. As we have seen, IoT must be an important enabler to save the world.

IoT heroes, save the IoT, Save the World.

 Thanks in advance for your Likes and Shares

Thoughts? Comments?

Read more…

Here is the latest round-up of articles from IoT Central. Remember: get your friends and enemies to join IoT Central here.

Navigating the Critical Database Decision While Building Our IoT Application

Posted by Gena Minevich

The promise of IoT solutions comes from their tremendous ability to harness data on a scale that has never before been possible. This data, wrangled by countless transmitters and sensors, offers us a wealth of insights about everything from the homes we live in to the products we buy to the health of our own bodies – all while IoT applications provide the power to act upon this data in real-time. Delivering these remarkable capabilities calls for a similarly capable database, one that can match IoT applications’ stringent requirements around performance, scalability, and availability.

Ongoing trends in IoT device lifecycle management

Posted by Mohit Bhardwaj 

IoT device lifecycle management is the key element for industries to have complete insight into and control of their device infrastructure. Today, device lifecycle management enables many industries to transition to 'smart' ecosystems, like smart energy (a.k.a. the Internet of Energy or smart grid), smart buildings, smart retail, smart transportation, smart cities, smart factories, and smart agriculture. As more and more devices get connected, the challenges with data security, control, and management become critical. IoT remote device lifecycle management plays a key role in enabling a 360-degree data view of the device infrastructure.

Interview: Bringing Machine Learning to The Edge

Posted by David Oro

A couple of weeks ago, I spent a few hours at GE Digital's headquarters in San Ramon, CA. It was a great overview by several executives of how GE is using their Predix platform to create software to design, build, operate, and manage the entire asset lifecycle for the Industrial IoT. A big part of this transformation for GE involves hiring tons of software developers, acquisitions, and partnerships. One of those partnerships is with Silicon Valley based FogHorn Systems (GE Ventures, Dell Ventures, March Capital and a few others are investors). FogHorn is a developer of "edge intelligence" software for industrial and commercial IoT applications. FogHorn and GE are working very closely on many IIoT customer use cases, across verticals, bolstered by the integration of FogHorn with Predix. I turned to FogHorn Systems CEO David C. King to learn more about edge intelligence software for the Industrial IoT.

The Buzz of Platforms and the Bazaar of IoT Platforms

Posted by Somjit Amrit

Among the words, phrases and acronyms in the tech world, "platform" seems to grab the headlines. If one listens to any pitch from a start-up venture, it would not be uncommon to get the "platform pitch" in at least one out of two proposals. A lazy Google search on the "top 20 tech-weary words" tells me that "platform" was 3rd on the list. Words like "being platformed" have been coined, and there is a host of books on the significance of the platform in the technology world. I will not go into the virtues of platforms. Instead, I dwell on how the leaders in each segment are few (a maximum of 3), while in the IoT world we seem to have by some counts 170 platforms (McKinsey) to 400 (Beecham Research). This is definitely a bewildering array to go through and investigate. What is a platform, and why are there only a few platform leaders?

Infographic: Securing Connected Cars

Posted by David Oro 

In my recent interview with Sam Shawki, the founder and chief executive officer of MagicCube, I wrote about getting a new Ram Truck and noted that it was a beast not just in size and towing power, but a beast of electronics and connectivity. According to Intertrust Technologies, the percentage of new cars shipped with Internet connectivity will rise from 13% in 2015 to 75% in 2020, and that in 2020, connected cars will account for 22% of all vehicles on the road. That number is sure to grow. More stats in the infographic below. 

AggreGate Server on Nanopi NEO

Posted by Victor Polyakov

We've tested AggreGate Server on NanoPi NEO, one of the smallest Linux-based single-board PCs. Despite its small size, this device simply rules! It has 512 MB of RAM on board, a 1.2 GHz quad-core CPU, a 10/100M Ethernet network interface, and many other interfaces to connect to the world. AggreGate's possibilities on the NEO board are similar to the Linux-based Tibbo Project System. It can act as a simple close-knit protocol gateway with intermediate data processing.



 

 

Read more…

Why Data Visualization Matters Now?

Data visualization is not new; it has been around in various forms for thousands of years.
Ancient Egyptians used symbolic paintings, drawn on walls & pottery, to tell timeless stories of their culture for generations to come.
The human brain understands information via pictures more easily than through sentences, essays or spreadsheets. You must have seen traffic signs while driving. Why do they carry a single picture instead of a whole sentence like "school ahead", "deer crossing" or "narrow bridge"? Because as a driver you can grasp the image faster while keeping your eyes on the road.
Over the last 25 years, technology has given us popular methods like line, bar, and pie charts showing company progress in different forms, and these still dominate boardrooms.
Data visualization has become a fundamental discipline as it enables more and more businesses and decision makers to see big data and analytics presented visually. It helps identify the exact area that needs attention or improvement, rather than leaving it to leaders to interpret as they want.
Until recently, making sense of all of that raw data was too daunting for most, but recent computing developments have created new tools like Tableau and Qlik with striking visual techniques, especially for use online, including the use of animations.
There is a wealth of information hiding in your database that is just waiting to be discovered. Even complicated historical data collected from disparate sources starts to make sense when shown pictorially. Data scientists do a fantastic job of analyzing this data using machine learning and finding relationships, but communicating the story to others is the last milestone.
In today's digital age, we as consumers generate tons of data every day, and businesses want to use it for hyper-personalization, sending us the right offers by collecting, storing and analyzing this data. Data visualization is the necessary ingredient for bringing the power of this big data to the mainstream.
It is hard to tell how data behaves from a data table. Only when we apply visualization via graphs or charts do we get a clear picture of how the data behaves.
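A small illustration of the point, with made-up numbers and assuming matplotlib is installed: the same six figures that are easy to skim past in a table immediately reveal a dip when plotted.

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
revenue = [120, 125, 131, 94, 138, 142]   # fictional figures; note the April dip

plt.plot(months, revenue, marker="o")
plt.title("Monthly revenue: the dip hides in a table, jumps out of a chart")
plt.ylabel("$k")
plt.grid(True)
plt.show()
```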
Data visualization allows us to quickly interpret the data and adjust different variables to see their effect and technology is increasingly making it easier for us to do so. 
The best data visualizations are ones that expose something new about the underlying patterns and relationships contained within the data. Data Visualization brings multiple advantages such as showing the big picture quickly with simplicity for further action.
Finally, as they say, "a picture is worth a thousand words", and this matters all the more when you are trying to show the relationships within the data.
Data is the new oil, but it is crude, and cannot really be used unless it is refined with visualization to bring out the new gold nuggets.
Read more…

Among the words, phrases and acronyms in the tech world, "platform" seems to grab the headlines. If one listens to any pitch from a start-up venture, it would not be uncommon to get the "platform pitch" in at least one out of two proposals. A lazy Google search on the "top 20 tech-weary words" tells me that "platform" was 3rd on the list (https://www.businessinsider.com.au/the-worlds-top-20-tech-weary-words-for-2014-2014-5).

Words like "being platformed" have been coined, and there is a host of books on the significance of the platform in the technology world. I will not go into the virtues of platforms. Instead, I will dwell on how the leaders in each segment are few (a maximum of 3), while in the IoT world we seem to have by some counts 170 platforms (McKinsey) to 400 (Beecham Research). This is definitely a bewildering array to go through and investigate.

What is a platform, and why are there only a few platform leaders?

Stepping back: different people have different views of what the word "platform" means. To get a sense of the diversity of platforms, consider:

Browsers (Chrome and Firefox), smartphone operating systems (iOS and Android), blogging platforms (WordPress, Medium), social media titans (YouTube, Facebook) and even Instagram are all described as platforms. Uber, Airbnb and their ilk are widely described as 'marketplaces', 'platforms' or 'marketplace-platforms'. Then there are web services (Google Payments, Amazon Elastic Cloud) and gaming consoles (Xbox, Apple's iPod Touch, Sony PlayStation). One interesting point to note is that in each category the market is mostly duopolistic.

To accommodate this diversity, the safest definition of a platform would be:

  1. An extensible codebase of a software-based system that provides core functionality, the modules that interoperate with it, and the interfaces (Application Programming Interfaces, or APIs) through which they interoperate. In effect, this system abstracts a number of common functions for its users, without exposing the complexity of building and managing them.
  2. The goal is to enable interactions between producers and consumers.
  3. This is enabled through three layers: the Network (connecting participants to the platform), the Technology Infrastructure (helping create and exchange value), and Workflow and Data (matching participants with content, goods and services).

This definition brings in the two dimensions of a platform: one for internal use and the other for external use.

  1. The internal dimension of building platforms is to ensure all necessary modules interoperate, and
  2. The external dimension is to enable interaction with the outside world and make the platform as accessible and usable as possible.

Internally-led platforms focus on internal productivity and efficiency; development is internally sourced and essentially built for internal use. Externally-led platforms address both the supply (developer) side and the demand (user) side, which is why they are sometimes termed "two-sided" platforms. Beyond a point, development is crowd-sourced: outside developers enrich the platform, and the platform reaches out to them through APIs (the toy sketch below illustrates this module-and-API structure).
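Here is that toy sketch of the "extensible codebase" idea from the definition above, in Python: a tiny core exposes an interface (the API), and outside modules register against it. All names are invented for illustration.

```python
from abc import ABC, abstractmethod

class Module(ABC):
    """The contract every module must honour -- the platform's API."""
    @abstractmethod
    def handle(self, event: dict) -> dict: ...

_REGISTRY = {}  # name -> module instance

def register(name: str):
    """Decorator the 'developer side' uses to plug into the platform."""
    def wrap(cls):
        _REGISTRY[name] = cls()
        return cls
    return wrap

@register("enrich")
class EnrichModule(Module):
    """A hypothetical third-party module that adds value to events."""
    def handle(self, event):
        return {**event, "enriched": True}

def platform_dispatch(event: dict) -> dict:
    """The core: routes work through whatever modules are registered."""
    for module in _REGISTRY.values():
        event = module.handle(event)
    return event

print(platform_dispatch({"user": "alice"}))
```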

In most cases, if the external dimension is well evolved, the internal efficiencies come by default: design quality, selection of interfaces leading to interoperability, robustness of infrastructure, and seamlessness in workflow and data streaming.

External-dimension platforms compete for both users and developers.

One important aspect to remember is that a platform by itself may not solve contextual, domain-specific problem statements. Applications built around the platform do that, and these applications are what deliver the Return on Investment (RoI) from the platform.

In any segment, the winners are few (at most 2 or 3; aspirants may be many, and they progressively wither away). The reasons are the ones presented above: design quality, interoperability, infrastructure robustness, seamlessness in workflow and data flow, and, last but not least, an excellent and friendly user interface. Not many can master all four aspects. Those who do acquire a critical mass of customers which keeps growing, and a duopoly of sorts is created in the market.

Successful platforms support a variety of business use cases in the present and strive for a design that can evolve over time and be, to an extent, future-ready.

The Bazaar of IoT Platforms: the reasons, and who will win wading through the maze?

Coming now to the Internet of Things (IoT): the IoT movement repeatedly talks about platforms, but those platforms don't align with the definitions behind Uber, Medium or Android. The first issue is interoperability. And none of the IoT platforms align with each other either.

Now let us address the question of why there is a "plethora of platforms" in IoT.

It can be seen clearly that the typical architecture of an IoT solution is multilayered. Put simplistically, the layers are Device to Device (involving hardware and firmware with short-range communication), Device to Server (again involving hardware and communication), and Server to Server (where cloud-based applications and long-range communication hold the key, along with networking, data storage and data visualisation).

So we see protocols and standards driven from their origins in communication technologies (telecom companies like AT&T and Verizon lead here), in data storage (Amazon and Google lead the way), and on the application side (Azure from Microsoft and ThingWorx from PTC being prominent). Companies which have a library of business use cases, given the dominance they enjoy in their respective businesses (namely Bosch, GE, Honeywell), have the ambition to build their own community-based platforms. Then we have a host of start-ups who run a platform per business use case they address.

This is the genesis of the "plethora of platforms": the multilayered solution stack of IoT. It adds complexity, and hence no one player is a leader across the layers as of today.

In the coming years it can be reckoned that there will be a shakeout in the market, and the platforms will converge around key broad-based use cases: remote monitoring and environment conditioning, predictive maintenance, and process automation.

The ones which win the battle for supremacy will have cracked the code on:

  1. Security,
  2. Open interfaces,
  3. Carrier-grade reliability,
  4. Service levels,
  5. Scalability,
  6. Seamless integration into the back-office environment that is essential to the enterprise's business operations, and
  7. Impressive usability and user interface.

Given the multitier architecture and the attendant complexity, it will be a while before a small group of winners starts to bubble to the top. Some of the also-ran aspirants may focus on specific domains, addressing one part of the ecosystem or industry segments like home or industrial, to justify their presence.

 

 

Read more…

Interview: Bringing Machine Learning to The Edge

A couple of weeks ago, I spent a few hours at GE Digital’s headquarters in San Ramon, CA. It was a great overview by several executives of how GE is using their Predix platform to create software to design, build, operate, and manage the entire asset lifecycle for the Industrial IoT.  A big part of this transformation for GE involves hiring tons of software developers, acquisitions, and partnerships.

One of those partnerships is with Silicon Valley based FogHorn Systems (GE Ventures, Dell Ventures, March Capital and a few others are investors). FogHorn is a developer of “edge intelligence” software for industrial and commercial IoT applications. FogHorn and GE are working very closely on many IIoT customer use cases, across verticals, bolstered by the integration of FogHorn with Predix.

I turned to FogHorn Systems CEO David C. King to learn more about edge intelligence software for the Industrial IoT. David has been at the helm of FogHorn since 2015, a year after its founding. Prior to FogHorn, David co-founded AirTight Networks, Inc., a technology leader in secure cloud-managed Wi-Fi. Before AirTight, he served as Chairman, President and Chief Executive Officer of Proxim Inc., a pioneer in WLANs and the first publicly traded Wi-Fi company, from 1993-2002.

Lots of talk about the edge in IoT. It’s my smartphone and my doorbell, as well as the sensor on a traffic light or a wind turbine. What exactly is the edge of the network and how do you define it?

We define edge as the closest compute point that can process real time streaming data. So in your case, all three -- phone, doorbell, sensors -- are edges because you can bring compute to the data on any of these platforms. The question is what compute is possible? The single variable filtering that you can do on a sensor is very simple when compared to the complex Machine Learning models that can execute on your phone.   

Analytics is done in the data center or cloud. You claim to do this at the edge now.  Please describe your offering.  

FogHorn has developed a tiny-footprint complex event processor (CEP) that provides advanced streaming analytics and machine learning capabilities at the edge. This powerful combination of being able to pre-process and cleanse the data and execute ML models, all in real time, brings the power of big data analytics to the edge. The FogHorn software platform is highly flexible and can easily be scaled to optimize for footprint and/or feature needs.
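To make "pre-process, cleanse and act at the edge" concrete, here is a generic illustration in Python. This is emphatically not FogHorn's API, just a sketch of the pattern: a sliding window smooths raw readings, glitches are cleansed, and only a meaningful event leaves the edge.

```python
from collections import deque

WINDOW = deque(maxlen=10)    # sliding window of recent readings
THRESHOLD_C = 80.0           # hypothetical alarm level

def on_reading(value: float):
    """Process one raw reading; return an event only when it matters."""
    if not (-40.0 <= value <= 150.0):   # cleanse obvious sensor glitches
        return None
    WINDOW.append(value)
    smoothed = sum(WINDOW) / len(WINDOW)
    if smoothed > THRESHOLD_C:
        return {"event": "overheat", "avg_c": round(smoothed, 1)}
    return None

for raw in [75.0, 78.5, 999.0, 82.0, 84.5, 86.0]:
    event = on_reading(raw)
    if event:
        print(event)        # only the alert is sent upstream
```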

Tell us about a customer you’re working with and how they are applying your technology.

FogHorn Lightning is an extensible platform currently used by customers in Manufacturing, Oil & Gas, Power & Water, Renewable Energy, Mining, Transportation, Smart Buildings/Cities and other industrial verticals. The deployment patterns range across gateways, PLCs, and ruggedized servers, in production at Fortune 100 sites. Common implementations of FogHorn Lightning include product quality inspection, predictive maintenance and real-time health monitoring. Customers are seeing immediate business value; e.g. identifying defects in the early stages of manufacturing reduces scrap and increases yield. Additionally, there is a trend of customers using FogHorn to generate new streams of revenue by providing real-time smart maintenance for their end customers.

When compared to software-defined IIoT smart gateways, there are still millions more hardware-defined M2M gateways out there. At what point do we cross the chasm to smarter gateways, and where are we now in this cycle?

We are still very early in the adoption of IIoT technologies. Understandably, typical industrial sectors are conservative and have much longer adoption curves. However, we are beginning to observe that the ROI from edge intelligence is accelerating customer demand for FogHorn. We will cross the chasm once industries identify key use cases that generate new revenue streams, which is still about 3-5 years away.

You can’t talk about IoT without talking about security, and it’s even more important in the industrial sector. How do you address security concerns for your customers and what does the industry need to do to make IoT more secure?

Yes, you are right. When you think of IoT, especially IIoT, security is a top concern. Hacks such as “Devil’s Ivy” will become everyday events with increasingly connected devices. At FogHorn, our edge intelligence software runs very close to the data source, and is local to the asset. This implies that we are secure (like the assets) behind firewalls, and in a DMZ layer. And because most of our processing is done locally, we are less vulnerable to malicious hacks that occur when connected.

Because IIoT is still such a nascent set of technologies, we caution users to deploy solutions after thoroughly weighing the business value, and convenience versus security risk factors. My guiding question before any deployment: “Can I do this locally, without connecting to an external network?”. The answer is usually yes, and if otherwise, you should probably talk to us.

How can companies make their industrial processes better?

We understand that today’s industrial processes are highly complex and advanced, with many moving parts. While it may seem humanly impossible to optimize it any more without help from technology, we believe that a key asset is still untapped: your operator! Companies will start seeing incredible improvements once they translate the tribal knowledge on the plant floor into actionable insights. This can be further supplemented by techniques from machine learning, and artificial intelligence, to tease out the known unknowns, and also, the unknown unknowns.

Anything else you’d like to add?

FogHorn is redefining edge intelligence for IIoT. A year ago, we started our journey as a company that did analytics on tiny-footprint devices. Today, we have accelerated the transition to machine learning at the edge, and are very excited about the market validation. With our Operational Technology focus, we are looking forward to defining new business models and delivering transformational value for our industrial customers.

Read more…

Do you want to hire a Data Scientist?

As Tom Davenport noted a few years back, data scientist is still the hottest job of the century.
Data scientists are those elite people who solve business problems by analyzing tons of data, communicate the results in a very compelling way to senior leadership, and persuade them to take action.
They have the critical responsibility to understand the data and help the business get more knowledgeable about its customers.
The importance of data scientists has risen to the top due to two key issues:
  • An increased need and desire among businesses to gain greater value from their data to stay competitive
  • Over 80% of the data/information that businesses generate and collect is unstructured or semi-structured data that needs special treatment
So it is extremely important to hire the right person for the job. Requirements for being a data scientist are pretty rigorous, and truly qualified candidates are few and far between.
Data scientists are in very high demand, hard to attract, and come at a very high cost, so a wrong hire is all the more frustrating.
Here are some guidelines for checking them:
  • Check their logical reasoning ability
  • Problem-solving skills
  • Ability to collaborate and communicate with business folks
  • Practical experience with Big Data tools
  • Statistical and machine learning experience
  • They should be able to describe, very clearly, projects where they have solved business problems
  • They should be able to tell a story from the data
  • They should know the latest in cognitive computing and deep learning
I have seen the smartest data scientists in my career do the best job but fail to communicate the results to senior leaders effectively. Ideally they should know the data in depth and be able to explain its significance properly. Data visualization comes in very handy at this stage.
Today, with digital disrupting every field, data science is affected too.
Gartner calls this new breed citizen data scientists. Their primary job function is outside analytics; they don't know much about statistics, but they can work with ready-to-use algorithms available through APIs like Watson, TensorFlow, Azure and other well-known tools.
Good data scientists can make use of them to spread awareness and expand their influence.
It has become ever more important to hire the right data scientist, as the results they show you may make or break the company.
Read more…

Infographic: Securing Connected Cars

In my recent interview with Sam Shawki, the founder and chief executive officer of MagicCube, I wrote about getting a new Ram Truck and noted that it was a beast not just in size and towing power, but a beast of electronics and connectivity. According to Intertrust Technologies, the percentage of new cars shipped with Internet connectivity will rise from 13% in 2015 to 75% in 2020, and that in 2020, connected cars will account for 22% of all vehicles on the road. That number is sure to grow. More stats in the infographic below. 


Connected Cars

Read more…

The promise of IoT solutions comes from their tremendous ability to harness data on a scale that has never before been possible. This data, wrangled by countless transmitters and sensors, offers us a wealth of insights about everything from the homes we live in to the products we buy to the health of our own bodies – all while IoT applications provide the power to act upon this data in real-time. 

Delivering these remarkable capabilities calls for a similarly capable database, one that can match IoT applications’ stringent requirements around performance, scalability, and availability. While these are the three core database concerns behind just about any application, I’d argue the IoT offers an even more mission-critical use case. Consider the magnitude of the data that IoT applications must process, along with the fact that many of these applications aren’t even functional if they cannot perform with near real-time fidelity – let alone become wholly unavailable – and it’s clear that IoT applications are introducing a new level of intensity when it comes to the demands placed on the database.

We know a thing or two about vetting and working with database solutions for IoT applications. Our project is the creation of an IoT smart kitchen commerce solution, embedded within connected kitchen appliances, which links to grocery retailers and makes it easy to purchase needed items from within the customer's own kitchen. Enabling this system's simple-to-use front end required tremendous back-end complexity. The database facilitating the solution needed to handle rich information on more than 1 million individual grocery products, which had all been collected by mapping retailers' online product catalogs in granular detail. At its heart, this smart kitchen solution is all about the quality of the experience it offers users – whether it could truly make grocery shopping and managing kitchen restocking easier for customers – so its success absolutely depends on the database delivering real-time responsiveness and total reliability.

A major question in determining the specifics of the database we’d use for this project was the debate between an SQL or a NoSQL strategy. Each of these technologies has a lot to offer when applied to solutions for which they’re well-suited. Different IoT application use cases may benefit from utilizing SQL (relational) or NoSQL (non-relational) databases, and, to be both diplomatic and accurate, many projects might use both in tandem to great effect.

In our particular case, we discovered that NoSQL was the best fit for the task at hand, though the path to realizing it was a bit of a winding road. We actually began developing our solution using MySQL as the database, managed by our internal team. Unfortunately, this led to a lot of difficulties. We found MySQL replication and other administrative work to be disagreeable and full of thorny issues.

To alleviate what was becoming quite a pain point, we then decided to shift to a NoSQL database, recognizing that open source MongoDB would fit our solution much better, for reasons I’ll elaborate on. Also, having learned that managing the database internally wasn’t preferred or a great use of our resources, we enlisted mLab to be our MongoDB Database-as-a-Service provider. This accomplished two things: it made sure that the move to NoSQL MongoDB was handled seamlessly by experts, and provided our team the bandwidth to dive headlong into product development, which was really the best use of our team’s time.

NoSQL made sense in our case because the many data providers we work with use different data schemas. This understandably had created issues with MySQL, which requires defined schemas prior to accepting data. For enterprises in this position, our advice is to utilize the advantages of a NoSQL database in providing denormalized documents to quickly and easily accept any data in any format.
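A hedged sketch of that advice, assuming the pymongo driver (connection string, collection and fields are placeholders): two providers' differently-shaped product records land in the same MongoDB collection with no up-front schema migration.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # hypothetical URI
products = client["kitchen"]["products"]

products.insert_many([
    # provider A: nutrition data nested, prices in cents
    {"sku": "A-123", "name": "Oat Milk",
     "price_cents": 399, "nutrition": {"kcal": 46, "sugar_g": 3}},
    # provider B: flat layout, price as a decimal string
    {"sku": "B-987", "name": "Oat Milk 1L",
     "price": "3.99", "kcal": 46},
])

# One index keeps lookups fast despite the differing document shapes.
products.create_index("name")
print(products.count_documents({"name": {"$regex": "Oat"}}))
```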

In addressing the key issue of scalability, a NoSQL approach has some effective advantages for dealing with the vast magnitude of IoT datasets while maintaining high performance. With NoSQL, for instance, it’s possible to utilize however many sharded servers there’s a need for, while each single server can maintain a limited and pre-determined size, making scaling easy to execute. It’s also worth saying that, because NoSQL shifts a good deal of logic to the application and away from the database, it’s easier for enterprises to recruit the talent they need. Great Java or C# coders outnumber great database programmers, and the expertise to optimize performance for complex queries is a rare and valued skill. While this remains true for any database, implementing a NoSQL strategy makes it that much less challenging to put a team in place with the right skillsets. It’s also the case that NoSQL is rapidly rising in popularity throughout the industry – with this rise, the tooling, frameworks, and knowhow required to best utilize NoSQL are becoming increasingly prevalent as well.

As we’ve discovered through our experience with MongoDB, the non-relational database has proven appropriate to meeting the huge data-handling demands of always-on IoT applications. Again, relational databases may offer a better fit for certain IoT applications, such as those working with less sizeable or dynamic datasets. But for our particular application – by handling vast and dynamic datasets regardless of their structure – our open source, NoSQL MongoDB approach provides the high-speed read and write performance, scalability, and high availability needed to deliver positive and effective real-time experiences for IoT consumers.

Read more…

California Coastal Views and Rugged IoT

I spent the last few days on the Santa Barbara coast staring at the offshore oil rigs. With a marine layer and reflections of the sun, they look like large container ships that don't move (Imperial Walkers, if you're my son). Production from these rigs reached an all-time high of 68,798,091 barrels in 1995, and they are most famous for a spill that happened in January 1969, when Union Oil's Platform A experienced an uncontrolled blowout in the Dos Cuadras field that lasted approximately eight days. The spill of approximately 80,000 to 100,000 barrels of crude oil affected over forty miles of coastline. Several environmental laws were passed at the federal and state levels following the blowout, including the National Environmental Policy Act (NEPA) and the California Environmental Quality Act (CEQA).

I got to thinking more about these rigs because I could not access the Internet from where I was staying, and I was only 20 miles outside of town. This reminded me of articles that IoT Central member Scott Allen has been sharing about IIoT in some very remote and tough places. They are worth checking out in this issue, along with member contributions from Benson Chan, Vinay Solanki, Darren Tessitore, and Mark Shapiro.

Enjoy.

Rugged IIoT Saves Lives at “The Home of the World's Worst Weather”

Posted by Scott Allen

Imagine your worst winter day. Bone-chilling cold, howling, bitter winds, blinding snow and sleet, and your truck is encased in ice. What do you do? You tough it out, scrape the ice off the windshield and get to work. The radio network deployed at one of the world’s most important weather research facilities has to endure and perform in extremely brutal climates nearly every day of the year, 24/7/365. Lives depend on its successful transmission of weather data. And for over a decade, wireless data radios have gotten the job done at the Mount Washington Observatory.

Hardware or Software Security: Which is right for my IoT Device?

Posted by Mark Shapiro 

Since many embedded devices are deployed outside of the standard enterprise security perimeter, it is critical that security be included in the device itself. Ultimately, some combination of hardware and software may be required. Building protection into the device itself provides a critical security layer whatever options are used. Security must be considered early in the design of a new device or system.

Internet of Things: Sensors & Sensing

Posted by Vinay Solanki 

Continuing my series of posts on IoT/M2M, and having covered topics such as smart villages, communication protocols, the role of telecom (part A and part B), managing road accidents, IoT for the retail sector, smart home, smart clothes, IoT business models, best article aggregation, and also what IoT really means to a layman, I think sensors deserve my attention. Here I will cover just the basics of the different types of sensors and their applications.

Future-Proofing Your IoT Infrastructure

Posted by Benson Chan

For all the value and disruptive potential that Internet of Things (IoT) solutions provide, corporate buyers face a dilemma. Today’s IoT technologies are still immature point solutions that address emerging use cases with evolving technology standards. Buyers are concerned that what they buy today may become functionally or technologically obsolete tomorrow. Faced with this dilemma, many defer buying even if the IoT solutions they buy today offer tremendous value to their organizations. This post describes a planning strategy called “future-proofing” that helps managers, buyers, and planners deal with obsolescence.

Never Miss A Beat with Predictive Maintenance

Posted by Darren Tessitore 

“Don’t Fix It If It Ain’t Broke.” Horrible advice, especially if you run a manufacturing plant. Predictive maintenance is the idea of fixing something before it breaks, and it can save businesses LOTS of time, money, and frustration. Any industrial business should really be focusing on it.


Follow us on Twitter | Join our LinkedIn group | Members Only | For Bloggers | Subscribe

Read more…

How Customer Analytics has evolved...

Customer analytics has been one of the hottest buzzwords for years. A few years back it was solely the marketing department’s monopoly, carried out with limited volumes of customer data stored in relational databases like Oracle or appliances like Teradata and Netezza.
SAS and SPSS were the leaders in providing customer analytics, but their use was largely restricted to segmenting the customers most likely to buy your products or services.
In the 90s came web analytics, popular for page hits, time on sessions, and the use of cookies for visitors, which were then fed into customer analytics.
By the late 2000s, Facebook, Twitter, and all the other social channels changed the way people interacted with brands and each other. Businesses needed a presence on the major social sites to stay relevant.
With the digital age, things have changed drastically. The customer is superman now. Their mobile interactions have increased substantially, and they leave a digital footprint everywhere they go. They are more informed, more connected, always on, and looking for an exceptionally simple and easy experience.
This tsunami of data has changed customer analytics forever.
Today customer analytics is no longer restricted to marketing for churn and retention; the focus is shifting to improving the customer experience, and it is practiced by every department of the organization.
A lot of companies had problems integrating large volumes of customer data between various databases and warehouse systems, and they are not completely sure which key metrics to use for profiling customers. Hence, creating a customer 360-degree view became the foundation for customer analytics: it captures all customer interactions, which can then be used for further analytics.
From the technology perspective, the biggest change is the introduction of big data platforms, which can run analytics very fast on all the data an organization has, instead of relying on sampling and segmentation.
Then came cloud-based platforms, which can scale up and down as the analysis requires, so companies don’t have to invest upfront in infrastructure.
Predictive models of customer churn, retention, and cross-sell still exist today, but they run against more data than ever before.
Analytics itself has further evolved from descriptive to predictive to prescriptive. Merely showing what will happen next is no longer enough; recommending what actions to take is becoming more critical.
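To make the shift from predictive to prescriptive concrete, here is a toy churn-propensity sketch in Python with scikit-learn; the features (monthly spend, support tickets, tenure in months) and the tiny dataset are invented purely for illustration:

```python
# A toy churn model, for illustration only; real models train on far
# more data. Features and labels below are entirely made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features: [monthly spend, support tickets, months tenure]
X = np.array([[120, 0, 36], [45, 3, 4], [80, 1, 12], [30, 5, 2]])
y = np.array([0, 1, 0, 1])  # 1 = churned

model = LogisticRegression().fit(X, y)

# Prescriptive step: flag high-propensity customers for a retention offer.
new_customer = np.array([[50, 4, 3]])
churn_prob = model.predict_proba(new_customer)[0, 1]
if churn_prob > 0.5:
    print(f"High churn risk ({churn_prob:.0%}): trigger retention offer")
```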
There are various ways customer analytics is carried out:
  • Acquiring all the customer data
  • Understanding the customer journey
  • Applying big data concepts to customer relationships
  • Finding high-propensity prospects
  • Upselling by identifying related products and interests
  • Generating customer loyalty by discovering response patterns
  • Predicting customer lifetime value (CLV)
  • Identifying dissatisfied customers and churn patterns
  • Applying predictive analytics
  • Implementing continuous improvement
Hyper-personalization takes center stage now, giving your customer the right message, on the right platform, using the right channel, at the right time.
Now, via cognitive computing and artificial intelligence using IBM Watson, Microsoft, and Google cognitive services, customer analytics will become sharper still, as their deep learning neural network algorithms provide a game-changing capability.
Tomorrow there may not be just plain customer sentiment analytics based on feedback, surveys, or social media; with the help of cognitive computing, it may be based on what customers’ facial expressions show in real time.
There’s no doubt that customer analytics is absolutely essential for brand survival.
Read more…

Connected devices, or IoT, seem to have become the de facto solution for every industry today. The increase in connected devices leads to an increase in the amount of data transferred, stored, computed, and consumed across networks and devices. This creates a need for efficient data management and data security. IoT Device Lifecycle Management (DLM) plays a key role here, enabling industries to manage their connected devices with ease while providing additional advantages such as data security, remote control, and multi-protocol connectivity.

IoT Device Lifecycle Management is helping industries transition their systems to “smart” ecosystems. It plays an important role in enabling a broader view of the entire device infrastructure.

Let’s take a look at how IoT DLM is helping the Utility and Home Automation verticals:

1. Smart Grids:

(i) What is a smart grid?

A smart grid is the adoption of “smart” technologies in the expansion of the transmission and distribution network, enabling demand-based power production. Smart technology enables optimized utilization of energy resources by providing real-time insights into energy consumption, with the help of smart metering and automation at the distribution end.

A smart meter is a device that periodically stores electrical energy consumption data and informs the energy provider in a timely manner for monitoring and billing. Unlike previous metering methods, a smart meter has more advanced sensors, power consumption notifications, and bi-directional communication between the meter and the energy provider.
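As a minimal sketch of that reporting loop, assuming the meter pushes JSON readings upstream on a fixed interval (the identifiers, field names, topic, and the publish stub below are hypothetical, not taken from any specific AMI standard):

```python
# A minimal sketch of a smart meter's periodic report. The register
# read and the uplink are stubs standing in for real meter hardware
# and a real AMI transport (e.g., MQTT or cellular).
import json, time

def read_kwh():
    """Stand-in for the meter's register read; returns cumulative kWh."""
    return 1234.56  # hypothetical reading

def publish(topic, payload):
    """Stand-in for the AMI uplink to the energy provider."""
    print(topic, payload)

while True:
    report = {
        "meter_id": "MTR-001",       # hypothetical identifier
        "kwh_total": read_kwh(),
        "ts": int(time.time()),
    }
    publish("ami/meter/MTR-001/reading", json.dumps(report))
    time.sleep(900)  # report every 15 minutes, a common AMI interval
```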

How DLM Is Helping in Smart Energy Systems:

At present, government regulations are shifting towards energy conservation, motivating consumers towards smart metering. Device Lifecycle Management enables smart metering through the Advanced Metering Infrastructure (AMI), a system that enables two-way communication between the utility provider and the consumer.

DLM Benefits for the Consumer:

(i) Consumers can manage their energy consumption through the system, which continually shows energy utilization for every device connected to the AMI system.

(ii) A home area network (HAN) provides communication between devices, supporting a wide range of protocols and standards.

(iii) DLM includes a data analysis system that provides in-depth energy consumption analysis for each device in the network, helping consumers plan their energy utilization cost-effectively (see the sketch after this list).

(iv) Consumers have control of devices through mobile applications, so they can utilize and manage energy by scheduling up-time for each device.

(v) AMI + HAN + DLM together result in a Smart Grid System.
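A small sketch of the per-device analysis mentioned in point (iii), with invented readings and an assumed flat tariff, might look like this:

```python
# Toy per-device energy breakdown; the readings and tariff are made up.
from collections import defaultdict

# Hypothetical (device, kWh) readings collected over a day via the HAN.
readings = [
    ("water_heater", 3.2), ("hvac", 5.1), ("water_heater", 2.8),
    ("lighting", 0.9), ("hvac", 4.7),
]
TARIFF = 0.15  # assumed flat price, $/kWh

usage = defaultdict(float)
for device, kwh in readings:
    usage[device] += kwh

# Rank devices by consumption so the consumer sees the biggest costs first.
for device, kwh in sorted(usage.items(), key=lambda kv: -kv[1]):
    print(f"{device:>14}: {kwh:5.1f} kWh  (${kwh * TARIFF:.2f})")
```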

DLM Benefits for Utility Provider:

(i) The AMI system periodically sends utility providers data about load variations and the consumer’s peak energy utilization times.

(ii) Through DLM, the utility provider can point out customers’ peak energy consumption times, and consumers can focus on the devices running in that period and manage them accordingly.

(iii) The energy distributor can incorporate data analysis and get insights into a consumer’s monthly consumption, load variations, and peak load timings. Accordingly, the utility provider can enable dynamic pricing for consumers during peak hours (see the sketch after this list).

(iv) Through load analysis, the utility provider can also identify times of heavy energy usage and notify the customer about usage peaks; in turn, the customer can manage consumption or check for malfunctioning devices. This helps curb excessive usage and identify faulty devices.
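A rough sketch of the peak detection behind points (iii) and (iv), with an invented hourly load profile, an arbitrary threshold rule, and assumed price tiers:

```python
# Toy peak detection for dynamic pricing; all numbers are invented.
hourly_load_kw = [2.1, 1.8, 1.7, 1.9, 2.4, 3.8, 5.6, 6.9, 6.2, 4.1,
                  3.5, 3.3, 3.6, 3.4, 3.9, 4.8, 6.5, 7.8, 7.1, 5.9,
                  4.6, 3.7, 2.9, 2.3]

BASE_RATE, PEAK_RATE = 0.12, 0.20  # assumed $/kWh tiers
# Flag any hour 25% above the daily average as a peak (arbitrary rule).
threshold = 1.25 * (sum(hourly_load_kw) / len(hourly_load_kw))

for hour, load in enumerate(hourly_load_kw):
    peak = load > threshold
    rate = PEAK_RATE if peak else BASE_RATE
    tag = "PEAK, notify customer" if peak else ""
    print(f"{hour:02d}:00  {load:4.1f} kW  ${rate:.2f}/kWh  {tag}")
```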

2. Smart Building:

A smart building is the centralized control of building utilities like heating, air conditioning, lighting, security, and alarm systems. IoT Device Lifecycle Management plays a key role in smart building design, facilitating user comfort, energy efficiency, and longer device lifecycles. A smart building includes building automation through networking, communication protocols, sensors/actuators, an IoT gateway, ventilation control, an HVAC system, and other electronic devices for monitoring and control.

How Does the IoT Gateway Play a Role in a Smart Building?

(i) In building automation, the first stage is data input from sensors and actuators; all sensors are equipped with wired or wireless protocols (Bluetooth, ZigBee, Z-Wave, LAN, etc.) for communication with the IoT gateway.

(ii) The IoT gateway provides interfacing between sensors and the cloud, forming a bridge between them. It enables device software updates, device on-boarding, a control panel, diagnostic information, and more (see the sketch after this list).

(iii) It performs real-time data analysis on device and sensor data and provides the necessary output or command messages to the control system. A message can be an alarm, an HVAC control message, or another utility management command.

(iv) The IoT gateway enables data analysis for each device. Users can utilize energy efficiently by scheduling device up-time and down-time according to this analysis.

(v) It makes building automation and smart building implementation easy and reliable. It enables security through a layered security system (TPM and TEE, authorized connections, no third-party inclusion), covering both data and hardware security.
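As a minimal sketch of the bridging role described in point (ii), here is a hypothetical gateway loop that normalizes frames from one local protocol and forwards them upstream; the ZigBee driver and the cloud uplink are stubs, and all names are invented:

```python
# A minimal gateway sketch: poll a local protocol, normalize readings
# into one schema, and forward them to the cloud. Stubs throughout.
import json, time

def poll_zigbee():
    """Stand-in for a ZigBee driver; returns raw sensor frames."""
    return [{"addr": "0x1A2B", "type": "temp", "value": 21.7}]

def uplink(payload):
    """Stand-in for the cloud connection (e.g., an MQTT/HTTPS client)."""
    print("->", payload)

while True:
    for frame in poll_zigbee():
        # Normalize whatever the local protocol delivers into one schema.
        uplink(json.dumps({
            "sensor": frame["addr"],
            "metric": frame["type"],
            "value": frame["value"],
            "ts": int(time.time()),
        }))
    time.sleep(5)  # polling interval, chosen arbitrarily for the sketch
```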

How DLM Is an Essential Part of Building Automation:

(i) DLM enables remote control of building utilities like lighting, alarm systems, the HVAC system, etc.

(ii) HVAC system: Heating, Ventilation, and Air Conditioning (HVAC) is common in modern building construction. DLM enables remote control of the HVAC system with real-time data analysis; for example, if sensor data show a drop in temperature, DLM adjusts the air conditioning toward the required temperature. DLM controls pneumatic and hydraulic valves (ventilation, water piping) by sending control signals to actuators, resulting in complete mechanical control of the cooling air/water flow in the building (see the sketch after this list).

(iii) DLM offers a centralized alarm system for fire, gas leakage, humidity, temperature, etc. All alarms are remotely controlled, and the user gets a real-time notification if there is an alert.

(iv) It enables control of the building’s lighting system by adjusting intensity according to daylight. Input from photovoltaic sensors combined with DLM data analysis produces the control signals for the lights.

(v) DLM provides device authentication and verification whenever the system is updated. It enables a secure environment with a layered security system spanning hardware, data, and software.
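A toy sketch of the temperature control loop described in point (ii), using simple setpoint logic with hysteresis; the sensor read, the actuator command, and all values are invented stand-ins:

```python
# Toy HVAC setpoint control with a deadband to avoid rapid cycling.
SETPOINT_C = 22.0
HYSTERESIS = 0.5  # degrees of deadband around the setpoint

def read_temperature():
    """Stand-in for the room temperature sensor."""
    return 23.4  # hypothetical reading

def set_cooling(on: bool):
    """Stand-in for the actuator command DLM would send."""
    print("cooling", "ON" if on else "OFF")

temp = read_temperature()
if temp > SETPOINT_C + HYSTERESIS:
    set_cooling(True)    # too warm: open the cooling valves
elif temp < SETPOINT_C - HYSTERESIS:
    set_cooling(False)   # cool enough: close the valves
# within the deadband: leave the actuator state unchanged
```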

In summary, IoT Device Lifecycle Management is a key growth driver for many industries today. As the cases above show, it helps stakeholders on both sides of the equation: consumers and service providers.

Read more…

Imagine your worst winter day. Bone-chilling cold, howling, bitter winds, blinding snow and sleet, and your truck is encased in ice. What do you do? You tough it out, scrape the ice off the windshield and get to work.

The radio network deployed at one of the world’s most important weather research facilities has to endure and perform in extremely brutal climates nearly every day of the year, 24/7/365. Lives depend on its successful transmission of weather data. And for over a decade, wireless data radios have gotten the job done at the Mount Washington Observatory.

LOCATION: The private, non-profit Mount Washington Observatory (MWO) in New Hampshire, USA, one of the most important state-of-the-art climate research facilities in the world.

With a weather recording history dating back to 1932, the MWO’s mission is to research the Earth’s climate. Weather observations are reported to the National Weather Service and National Oceanic and Atmospheric Administration for use in nationwide and global forecasting models.

Additionally, the New Hampshire State Park (NHSP), US Forest Service Snow Rangers, and New Hampshire Fish and Game all rely on the MWO’s current weather data to determine the safety and viability of launching search operations.

In short, the MWO saves lives and provides critical climate data, and rugged wireless data radios deliver it – no matter what the weather conditions may be.

Located on the highest peak in the Northeast United States (elevation 6,288 ft.), the MWO operates mission-critical weather stations in notoriously brutal and erratic weather conditions that are amongst the worst in the world. The long-standing slogan of the MWO is “The Home of the World’s Worst Weather” and summit conditions certainly prove this.

During the summer, researchers encounter 50-100 mph winds with penetrating fog.  Winter conditions include sub-arctic temperatures, 140+ mph winds, freezing fog, and heavy glaze icing.  The weather can change rapidly, going from clear and warm to fogged-in and freezing within minutes.  Additionally, ice accretion rates of up to 12”/hour are often observed. Winter winds can change from light and variable to hurricane-force, and beyond, without notice, with blinding snow eliminating all visibility.  In fact, at one time Mt. Washington held the world record for recorded wind speed of 231 mph.

These unique conditions make the Observatory an ideal location for research and product testing. If a product is stamped “Mt Washington Tested”, know that it has experienced the harshest conditions imaginable on this continent.

It is because of these year-round brutal conditions that the MWO turns to proven data radio technology for mission-critical and extremely rugged wireless communications.

THE NETWORK

On its mountaintop weather station, MWO deploys a radio network of 900 MHz frequency hopping spread spectrum (FHSS) radios (both serial and Ethernet) connecting a network of 28 sensors and devices on five different remote weather stations. These stations and sensors measure temperature, humidity, wind speed/direction and ground temperature. Continuous links are vital to provide real-time weather feeds.

The master radio is located 4 miles away on the summit of 4,063 ft. Wildcat Mountain, with 5 client stations situated at 1,000 ft. intervals along the Mt. Washington Auto Road, a privately owned 7.6 mile gravel and tar road that winds its way to the summit at 6,288 ft. These combined stations comprise MWO’s Auto Road Vertical Profile (ARVP). The Auto Road is closed to the public in winter, but the staff of the MWO and the NHSP routinely travel its treacherous path to and from the summit in full-sized snowcats, breaking through snowdrifts of 10 and 20 feet, carving a notch into its side in the vicinity of the actual road.

Because this type of winter travel is so treacherous, current weather data along the road is crucial for the safety of the crew, and both the MWO and the NHSP rely on FreeWave radios to maintain the constant communications links between weather stations and data servers.

The FHSS radio network has been in operation since 2004.

All 6 weather stations are solar-powered in locations that only get sunlight approximately 40% of the year, so the MWO needs radios that consume minimal power while providing constant 24/7/365 connectivity on the Mount Washington Regional Mesonet. In meteorology, a mesonet is a network of automated weather and environmental monitoring stations designed to observe meteorological phenomena.

RESULTS

According to the MWO’s IT Manager, Peter Gagne, “For almost 13 years these radios have been on duty continuously, and I personally can attest to their durability and reliability in conditions that, frankly, radios shouldn’t survive. These radios routinely are exposed to bitter cold and winds that far exceed the radios’ specifications, and have always passed the test. It is because of this outstanding record of performance, as well as the superior customer support we receive, that we have decided to stay with FHSS radios, despite the multitude of competitors, in the upgrade of our ARVP sites this year of 2017.”

Highlights include:

  • Cost-effective, real-time data transmission enabled by a rugged serial communication solution.
  • Mount Washington Observatory is able to issue severe warnings that assist operations and rescue efforts.
  • Real-time weather data and highly reliable performance in extreme weather conditions.

FreeWave Technologies has been a supplier to the MWO for more than a decade and has provided a reliable and rugged wireless data communication network in spite of the brutal weather conditions. To learn more, visit: http://www.freewave.com/case-studies/.

Read more…

Upcoming IoT Events

More IoT News

IoT Career Opportunities