
The other day we were discussing and debating a solution to meet the sensing needs for access, temperature and humidity for devices which form part of a networking infrastructure ecosystem. The idea was to build an IoT-based system for monitoring and control.

The design discussions veered around the ability to collect data from the sensors and the types of short-range communication protocols which could be deployed. Questions were raised about whether we were permitted to use short-range communication protocols in sensitive areas such as customer data centres, since the customers who own those facilities may themselves be custodians of their end customers' data.

The hidden perils of data acquisition and data ownership reared their heads and needed to be addressed as we moved forward.

The data acquired by sensors is essentially Machine Generated Data (MGD). This post will dwell on the subject of data ownership of MGD as follows:

  1. Sensors (Data Acquisition and Communication)
  2. Machine Generated Data
  3. The Lifecycle of the MGD and the Ownership Paradigm
  4. Who should be the owner of the MGD?

1. Sensors (Data Acquisition and Communication):

In the IoT ecosystem, the physical computing frontier is managed by sensors. Sensors essentially perform three fundamental functions:

  • Sensing and acquiring the data
  • Communicating the data through appropriate protocols to internet cloud services for further aggregation and trend analysis
  • Powering the above two activities through a power supply

Additional functions would include processing/system management and a user interface.

The digital computing part comprises the IoT application. This is determined by the types of sensors, cloud connectivity, power sources, and (optionally) the user interface used in an IoT sensor device.

When making physical measurements such as temperature, strain, or pressure, we need a sensor to convert the physical properties into an electrical signal, usually voltage. Then, the signal must be converted to the proper amplitude and filtered for noise before being digitized, displayed, stored, or used to make a decision. Data-acquisition systems use ADCs (analog-to-digital converters) to digitize the signals with adequate signal conditioning.
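
As a concrete illustration of that chain, here is a minimal sketch in Python; the ADC resolution, sensor scaling and driver call are hypothetical stand-ins for whatever your hardware vendor provides.

```python
# Sketch of a data-acquisition chain: sample an ADC, condition the
# signal with a simple moving-average filter, then convert raw counts
# into engineering units. All device parameters below are hypothetical.
import random
from collections import deque

ADC_BITS = 12    # assumed 12-bit converter
V_REF = 3.3      # assumed ADC reference voltage (volts)

def read_adc_raw() -> int:
    """Stand-in for a vendor ADC driver call returning raw counts."""
    return random.randint(880, 920)   # simulated sensor output

def counts_to_celsius(counts: float) -> float:
    # Assumed linear temperature sensor: 10 mV per degree C, 500 mV offset.
    volts = counts * V_REF / (2 ** ADC_BITS - 1)
    return (volts - 0.5) / 0.010

window = deque(maxlen=8)   # moving-average filter to suppress noise

def sample_temperature() -> float:
    window.append(read_adc_raw())
    return counts_to_celsius(sum(window) / len(window))

print(f"{sample_temperature():.1f} degC")
```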

Sensor data communication to the cloud can be done in multiple ways, from wireline to wireless communication of various complexities. While wireline communication has some important benefits (such as reliability, privacy, and power delivery over the same wires), wireless communication is the key catalyst in the majority of IoT applications that were not previously practical with wired systems. Reliability, channel security, long range, low power consumption, ease of use, and low cost are now reaching levels previously thought infeasible.

Some examples of popular IoT wireless communication types: Wi-Fi, Bluetooth Low Energy (aka Bluetooth Smart), Zigbee (and other 802.15.4 mesh variants), cellular, LPWA (Low-Power, Wide-Area network variants: Ingenu, LoRaWAN, Sigfox, NB-LTE, Weightless), and Iridium satellite.
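
Whatever the radio layer, the reading usually ends up published to a cloud broker. Below is a hedged sketch of that step using MQTT, a protocol commonly used for this purpose; it assumes the paho-mqtt 1.x Python client, and the broker hostname and topic are made up.

```python
# Sketch: publish a sensor reading to a cloud MQTT broker.
# Assumes the paho-mqtt 1.x client; broker and topic are hypothetical.
import json
import time

import paho.mqtt.client as mqtt

BROKER = "broker.example.com"          # hypothetical broker host
TOPIC = "site1/rack42/temperature"     # hypothetical topic

client = mqtt.Client(client_id="sensor-gateway-01")
client.connect(BROKER, port=1883)
client.loop_start()                    # run the network loop in background

payload = json.dumps({"ts": time.time(), "celsius": 23.7})
client.publish(TOPIC, payload, qos=1)  # QoS 1: at-least-once delivery

client.loop_stop()
client.disconnect()
```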

2. Machine Generated Data (MGD):

Sensor data is an integral component of the increasing reality of the Internet of Things (IoT) environment. With IPv6, anything can be outfitted with a unique IP address and the capacity to transfer data over a network. Sensor data is essentially Machine Generated Data: MGD is data produced entirely by devices/machines through an event or observation.

By contrast, human-generated data records the direct result of human choices. Examples are buying on the web, making an inquiry, filling in a form, or making payments, with corresponding updates to a database. We will not consider the ownership of such data here and will limit this post to MGD.

3. The Journey of the MGD and the Ownership Paradigm:

Several distinct phases exist in the typical journey of Machine Generated Data.

Capture and Acquisition of Data – a machine or device based function performed through signal reception.

Processing and Synthesis of the Data – a function which ensures enrichment and integration of the data.

Publication of the Data – done by expert systems and analysts who work on exception management, triggers and trends.

Usage of Data – the end user acts on the processed and reported information.

Archival and Purging of Data – essentially done by the data maintenance team, with supervision.
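
One practical way to keep later ownership questions tractable is to carry provenance metadata with the data through each of these phases. The sketch below is purely illustrative; the field names and the phases-as-enum encoding are assumptions, not an established standard.

```python
# Sketch: tag machine-generated data with provenance as it moves
# through the lifecycle, so ownership claims can be audited later.
from dataclasses import dataclass, field
from enum import Enum, auto

class Phase(Enum):
    CAPTURED = auto()
    PROCESSED = auto()
    PUBLISHED = auto()
    USED = auto()
    ARCHIVED = auto()

@dataclass
class MgdRecord:
    value: float
    device_id: str         # which device recorded the reading
    device_owner: str      # who holds title to that device
    history: list = field(default_factory=list)

    def advance(self, phase: Phase, actor: str) -> None:
        """Record which party handled the data at each phase."""
        self.history.append((phase.name, actor))

reading = MgdRecord(23.7, device_id="rack42-th01",
                    device_owner="DC Operator Ltd")
reading.advance(Phase.CAPTURED, "sensor firmware")
reading.advance(Phase.PROCESSED, "analytics vendor")
print(reading.history)
```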

Now let us dwell on the ownership paradigms. They range from the origination of data, to adding value to the data through transformation, to monetising the data through the insights generated. Interestingly, let us explore whether there is any conclusive method for determining how ownership should be assigned. A number of players may be involved in the journey of the data (e.g. the user, the hardware manufacturer, the application developer, the provider of the database architecture and the purchaser of the data), each having an equal claim at different stages of this journey.

4. Who should be the owner of MGD?

Let me share the multiple and conflicting views:

  1. The owner of the device which records the data: In essence, the owner of machine-generated data (MGD) is the entity which holds title to the device that records the data. In other words, the entity that owns the IoT device also owns the data produced by that device.

But there could be a lack of clarity if the device is leased rather than owned. When real-world constructs such as leaseholds (of, say, servers) come into play, it gets complex and even murky.


  2. The owner is the user of the data: The other dimension is that data may be owned by one party and controlled by another. Possession of data does not necessarily equate to title. Through possession there is control; title is ownership. Each time data sets are copied, recopied and transmitted, these usage rights (control of the data) follow. There could be cases where the owner of the device is also the user of the data.

  3. The maker of the database, who invests in aggregating, processing and making the data usable, is the owner of the data: This paradigm has a number of buyers. The owner of a smart thermostat does not, for example, own the data about how he uses it. The only thing that is 'ownable' is an aggregation or collection of such data, provided there has been a relevant investment in carrying out that aggregation or collection (the individual user is very unlikely to have made that investment). The owner here could be the home automation company. The value generated through this investment could include producing market intelligence and exploiting the insights from the data to build market presence and differentiation.

  4. The purchaser of the data could be the owner of the data: An auto insurance company could buy vehicle-generated data (from the makers of automobiles) and design a product with targeted offerings for specific market segments based on, say, driving behaviour patterns and demographics. This may not be as easy as it seems – see http://joebarkai.com/who-owns-car-data/, which states that the owner of the vehicle, and not the maker of the car, owns the data collected from the electronic data recorder.

The value chain of who owns the data can be a complex one with multiple claimants, and as one aggregates more sources it just gets more complicated. A good example is the making of smart cities. The sources of data can come from multiple layers and operational areas, and city authorities make the effort to use the data in areas such as waste management, traffic congestion and air pollution. So does the city authority own the data?

My personal take is: if someone in the MGD value chain is making the data usable for a larger good, and in the process may monetize the data to cover the investments, that entity deserves to be the owner of the data, as that is where value is generated.


Posted on August 14, 2017

Read more…

Are you drowning in your Data Lake?

Today more than ever, every business is focused on collecting data and applying analytics to stay competitive. Big Data analytics has passed the hype stage and has become an essential part of business plans.

Data Lake is the latest buzzword for dumping every element of data you can find, internally or externally. If you Google the term data lake, you will get more than 14 million results. With the entry of Hadoop, everyone wants to dump their silos of data warehouses and data marts and create a data lake.
The idea behind a data lake is to have one central platform to store and analyze every kind of data relevant to the enterprise. With digital transformation, the data generated every day has multiplied several times over, and businesses are collecting this consumer data, Internet of Things data and other data for further analysis.
As storage has become cheaper, more data is being stored in its raw format in the hope of finding nuggets of information, but eventually this becomes unmanageable. It is like using your smartphone to take photographs left, right and center: when you want to show someone a specific photograph, finding it is very difficult.
Data lakes, if not maintained properly, have the potential to grow aimlessly, consuming the entire budget. Some companies have their data lakes overflowing from on-premise systems into the cloud.
Most data lakes lack governance, lack the tools and skills to handle large volumes of disparate data, and many lack a compelling business case. The water (the data) in your data lake has to be crystal clear and drinkable, or else the lake becomes a swamp.
Before jumping on the bandwagon of creating a data lake that may cost thousands of dollars and take months to implement, you should start by asking these questions:
  • What data do we want to store in the data lake?
  • How much data will be stored?
  • How will we access these massive amounts of data and get value from them easily?
Here are some guidelines to avoid drowning in your data lake.
  • First and foremost, create one or more business use cases that lay out exactly what will be done with the data that gets collected. That exercise will keep you from dumping meaningless data.
  • Determine the returns you want from the data lake. Developing a data lake is not a casual undertaking; you need good business benefits to come out of it.
  • Make sure your overall big data and analytics initiatives are designed to exploit the data lake fully and help achieve business goals.
  • Instead of getting caught in vendor traps and their buzzwords, focus on your needs and determine the best way to get there.
  • Deliver the data to a wide audience so they can verify it and give feedback while you create value.
There are many cloud vendors who can help you build a data lake – Microsoft Azure, Amazon S3, etc.
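
If you do proceed, one cheap discipline that keeps a lake navigable is landing raw data under a predictable, partitioned key layout instead of dumping files at the root. Here is a minimal sketch using boto3 against Amazon S3; the bucket name and key scheme are hypothetical.

```python
# Sketch: land raw IoT readings in S3 under a partitioned key layout
# (source/date) so the lake stays queryable instead of becoming a swamp.
# Bucket name and key scheme are hypothetical.
import json
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "acme-data-lake-raw"   # hypothetical bucket

def land_raw(source: str, record: dict) -> str:
    now = datetime.now(timezone.utc)
    key = (f"raw/{source}/year={now:%Y}/month={now:%m}/day={now:%d}/"
           f"{now:%H%M%S%f}.json")
    s3.put_object(Bucket=BUCKET, Key=key,
                  Body=json.dumps(record).encode("utf-8"))
    return key

print(land_raw("thermostats", {"device": "th-01", "celsius": 21.4}))
```
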
By making data available to data scientists and anyone who needs it, for as long as they need it, data lakes are a powerful lever for innovation and disruption across industries.
Read more…

In the United States, precision agriculture is one of the largest industries by both operational scale and economic impact. The technology utilized is typically on the cutting edge, especially for automation and control. Things like sensors, programmable IoT radios and generally more complex software applications have allowed that industry to evolve, domestically, to a point where land and other resources are used optimally. Internationally, although there have been ‘smart’ or ‘precision’ practices in certain sectors of agriculture, many countries are just now starting to adopt the technology to its fullest extent, including the ability to innovate via start-ups and new practices.

India & the Digital Agriculture Revolution

According to an article in India Times, the country is aiming to secure a 20 percent stake in the IoT market share in the next five years through its 'Digital India' initiative. While many might look at India and think of the sprawling and diverse urban environments that could offer some potential complications for IoT, it is rural areas seeing the most interesting developments. There has been a noticeable growth in tele-medicine operations, which can allow patients in remote areas to interact with doctors for consultation, eliminating the need to get to a city, or vice versa. Perhaps an even greater area of growth lies in the agricultural realm. According to the article, agriculture employs 50 percent of the country's population, so the potential for a digital revolution is high. Farmers are just starting to implement sensor technology, automation hardware, and even leading-edge tools like voluntary milking systems that allow cows to be milked by an automated machine according to biological needs.

Israel’s Precision Ag Start-Up Community

In Israel, where IoT technology is starting to mature, the name of the game is data collection and analytics. Mobile applications, sensor data collection hardware, and advanced analytics software are three areas in which Israel is seeing significant market growth, according to Israel21c:

Israel stands out in precision-ag subsectors of water management, data science, drones and sensors, says Stephane Itzigsohn, investment associate at OurCrowd. … “Multiple startups are aiming toward the same goal — providing good agricultural data — but approaching it from slightly different angles,” Itzigsohn tells ISRAEL21c. “One might use satellite images or aerial photography; another might use autonomous tractors. Not all will get to that peak in the long journey of farming becoming more efficient.”

For example, CropX, an investor-backed adaptive irrigation solution, uses sensors that can be placed throughout a farming area and synced with a smartphone, allowing operators to receive real-time data updates on things like soil and weather conditions. CropX is based in both Tel Aviv and San Francisco, indicating that the technology may be poised for wide international adoption in the future.

Analytics Drive Italy’s Drought Recovery

Italy is perhaps best known for a single agricultural export: wine. However, many would be surprised to find out that it is one of the top corn producers in the European Union, producing more than 7 million tons of corn in 2015, according to an RCR Wireless report. In 2016, the EU's total corn output dropped noticeably due to year-long droughts affecting production. In Italy, start-up companies collaborated with industrial ag operations to develop and deploy widespread soil sensor and water automation technology to help streamline farming practices and create a more efficient system for resource use. The technology allowed farmers to get a comprehensive look at their operations and identify high and low yield areas in order to better utilize the available space.

Precision Agriculture and the Industrial IoT

The continued maturation of IIoT technology is enabling countries around the globe to better utilize resources like water, energy, and land area to create better agricultural operations. As populations continue to expand, and food production becomes even more important, being able to connect these technologies across the globe could become a key factor in optimizing crop output in critical areas. Imagine the above farm in Italy being able to send its data to data scientists in Germany or Eastern Europe, who could in turn analyze it and provide actionable feedback. Or an industrial farm in Israel managing its yields sending that information in real time around the country. These possibilities are not far off, and as the networks, hardware and software continue to be adapted, the future of precision ag internationally will become the present.

For additional reading:

India Times: http://www.indiatimes.com/news/india/how-the-internet-of-things-is-digitizing-agriculture-speeding-up-rural-development-in-india-326546.html

Israel 21c: https://www.israel21c.org/5-israeli-precision-ag-technologies-making-farms-smarter/

RCRWireless: http://www.rcrwireless.com/20161005/big-data-analytics/precision-agriculture-omica-tag31-tag99

Read more…

20 Job Interview Questions for IoT Professionals

Bill McCabe knows everyone. He has to. He’s a thought leader in IoT, with a particular focus on recruiting. He’s authored dozens of articles on all things IoT and recruitment, and has placed a number of IoT professionals at organizations big and small. We wanted to know in particular, for the IoT job seeker, what are the top 20 questions they should be prepared to answer in their interview. Below is what Bill shared.

  1. What changes in the IoT do you feel are the most groundbreaking?

  2. How would you assess a security concern in our software?  

  3. What was the last training course you took?

  4. What is the most overlooked thing with the IoT during development and deployment?

  5. How will you take our technology to the next level?

  6. What effect will the Internet of Things have on your daily life?

  7. Do you think IoT will be a job killer or a job creator?

  8. What concerns do you have about IoT and privacy and security?

  9. What are the differences between the Industrial Internet of Things and the Internet of Things?

  10. What do you think will be the impact of IoT on Smart Cities?

We have 10 more important questions for you to consider in your IoT interview. To see the rest of the questions, become a member of IoT Central (it’s free!) and click here.

Did you get a great, interesting or hard IoT-related question during your interview? If so, let us know and we'll add it to this list. Leave your question in the comments section or email us.

 

Read more…

IoT Platforms: The Peak of Inflated Expectations

Gartner recently released their 2017 Emerging Technologies Hype Cycle. Where do IoT Platforms stand? At the peak of inflated expectations!

Do you agree?

Gartner says that the hype cycle reveals three distinct megatrends that will enable businesses to survive and thrive in the digital economy over the next five to 10 years.

Artificial intelligence (AI) everywhere, transparently immersive experiences and digital platforms are the trends that will provide unrivaled intelligence, create profoundly new experiences and offer platforms that allow organizations to connect with new business ecosystems.

The Emerging Technologies Hype Cycle is unique among most Gartner Hype Cycles because it garners insights from more than 2,000 technologies into a succinct set of compelling emerging technologies and trends. This Hype Cycle specifically focuses on the set of technologies that is showing promise in delivering a high degree of competitive advantage over the next five to 10 years.

"Enterprise architects who are focused on technology innovation must evaluate these high-level trends and the featured technologies, as well as the potential impact on their businesses," said Mike J. Walker, research director at Gartner. "In addition to the potential impact on businesses, these trends provide a significant opportunity for enterprise architecture leaders to help senior business and IT leaders respond to digital business opportunities and threats by creating signature-ready actionable and diagnostic deliverables that guide investment decisions."

Read more…
Building a system to get value from the Internet of Things (IoT) or Industrial Internet of Things (IIoT) is a complex process involving a number of very distinct components. We need an Architecture for IoT to define the components which are necessary and sufficient for the system to deliver value. An information system only delivers value if it completes the Information Value Chain, causing real-world action to take place in response to the data it collects. This is what the 5D Architecture does. Luckily, every IoT or IIoT system needs to perform the same 5 core functions in order to deliver value, and therefore the architecture of all these systems is — pleasingly — the same!
Read more…

Save IoT, Save The World

When looking for a title for this article, I remembered the famous phrase from the TV series Heroes: "Save the cheerleader, save the world". Sorry if, once more, I abuse shocking headlines to attract more readers.

Is the Internet of Things (IoT) in danger? In light of the latest events I have attended in Berlin and London, and news like "Intel To Amputate Three Modules For Internet Of Things, Including Joule", I really believe IoT is falling into Gartner's Trough of Disillusionment phase, and we need IoT heroes to push it faster towards the Plateau of Productivity phase.

The other part of the article's title, "Save the World," may sound pretentious, but the world needs saving. This year's hot spring and summer are confirming to even the most disbelieving that global warming is very real (read more at "Global Warming, Christmas and the Internet of Things"), and although I do not believe that IoT alone can save our blue planet, recent events like the Portugal forest fire show that IoT can help, and help much.

If we cannot control runaway pollution of our air and water, the world will end

Source: Ron Lake, Technology Evangelist at Galdos Systems Inc.

Let's go by parts.

Has the Interest in IoT Slowed Down?  Some Symptoms

The IoT no longer fills events on its own. Now events like Internet of Things World Europe 2017 or IoT Tech Expo Europe in Berlin need help from other technologies like AR/VR, AI, Blockchain, or 5G to attract exhibitors and visitors.

The heroes of IoT have lost their past evangelizing enthusiasm. What do IoT heroes need to do?

  • IoT industry heroes need to focus on customer value. It is important that they address real pain points rather than creating something gimmicky.
  • IoT heroes cannot do it alone; partnerships with other heroes are absolutely essential for success in the Internet of Things.
  • IoT heroes need to be more creative with new use cases. As sensors continue to decrease in cost and IoT-specific networks get rolled out, everybody expects the number of use cases to increase exponentially.
  • Raise awareness about the major concern, IoT security.
  • IoT heroes should follow the trends by pairing connectivity with AI/Blockchain/AR/VR heroes.

How can IoT save us from world challenges?

Gary Atkinson, Director of Emerging Technologies at ARM, identifies five main challenges that the planet is heading towards:

1. We're running out of agricultural land.

2. Water is our rarest commodity.

3. Energy needs to be cheaper to be efficient.

4. Healthcare is a growing problem.

5. Transport – everyone will be able to afford cars, but won't be able to afford to pay for fuel.

Save IoT, Save Agricultural land

If IoT agricultural solutions become cheap, have long-lasting batteries (10+ years), and transmit signals over at least 5 miles, smart farming will become a reality and we will have no excuse not to save agricultural land.


Save IoT, Save the Water

Water is currently the most precious natural resource on planet Earth.

On the occasion of World Water Day, tech giant IBM entered into a pact with Ireland's Dublin City University for a collaborative R&D pilot to leverage Internet of Things (IoT) technology to help protect water.

The IoT could, for instance, help desalination reach cost-effectiveness. India mostly uses pivot irrigation, which means 30% of land is lost and 50 to 60% of water is lost to evaporation. A switch to tape-based irrigation could save two-thirds of the water used.

Back in 2014, HydroPoint Data Systems utilised the Internet of Things (IoT) to help with water conservation efforts. According to the company and its partners, this system saved local people some $137m in expenses and 15 billion gallons of water in the first year alone.


Save IoT, Make Energy renewable and cheaper

Smarter, more efficient energy consumption has been the dream of environmentalists for decades. Now it is possible through the power of Internet of Things devices. Because of their connection capabilities, energy consumption – such as the power used in a commercial building or even a smart home – can be constantly monitored and adjusted.

Energy consumption could be reduced thanks to smarter consumption and the implementation of micro-generation and storage. Knowing that lighting is the second biggest consumer of energy (after motors), and that there are about 1 billion streetlights in the world, upgrading streetlight infrastructure would strongly impact world consumption.

Experts said that thanks to the Internet of Things, we can move from about 13 percent aggregate energy efficiency to 40 percent in the next 25 to 30 years.

Creating a new connected economy powered by renewable energy will cause a temporary surge in productivity worldwide as grids are modernized and infrastructure is rolled out. Installing wind and solar is labor intensive, for example, so for two generations, people will have plenty of work to do.

Additional info:

The IoT company SkyGrid, which is based in Melbourne and Sydney, is developing a smart hot-water system in partnership with hot-water company Quantum Energy. The aim is to intelligently control when a building's hot-water systems are switched on, so that energy isn't wasted heating water when no one is around to use it – something that currently wastes as much as 50% of a system's power.

  • EnLight works on streetlight efficiency
  • Freestyle has partnered with engineering firm PowerTec on an intelligent energy grid for Kangaroo Island in South Australia. Sensors and controllers in the grid intelligently manage energy sources to sway energy consumption towards renewables without sacrificing the reliability of the supply.
  • Top 10 Internet of Things Companies Disrupting the Energy Industry:
    • PingThings is combining big data and machine learning to change the way that state utility grids operate.
    • Actility employs IoT and machine-to-machine (M2M) communication to reinvent the way the energy sector operates.
    • Tibber is a personal assistant that can regulate a house’s energy consumption and buy more energy if the need arises.
    • Wattz is implementing solar power solutions that rely not on the sun's light, but on capturing ambient light from LED and compact fluorescent bulbs to recharge the batteries in IoT devices.
    • Positive Energy uses IoT devices and software to optimize the functional efficiency of industrial buildings and smart homes alike. 
    • Smappee allows users to turn devices on and off remotely. It also has the capability to monitor solar panel output values and gas and water usage in real-time.
    • GasZen allows customers to convert their traditional “dumb,” or non-networked, propane tanks into smart tanks that can be monitored by both the gas provider and the user remotely. 
    • 75Farenheit, beyond its ability to predict and adapt to changing climates, offers analytics and suggestions on how to make the operation of a building more efficient.
    • Inspire Energy is giving citizens the power to become a part of the growing clean energy movement.
    • Verdigris Technologies' primary target is energy consumption and waste.

Save IoT, Save Healthcare

Despite incredible improvements in health since 1950, a number of challenges remain which should have been easy to solve.

In a 2016 report by Deloitte we can read “Change is the new normal for the global health care sector. As providers, payers, governments, and other stakeholders strive to deliver effective, efficient, and equitable care, they do so in an ecosystem that is undergoing a dramatic and fundamental shift in business, clinical, and operating models. This shift is being fueled by aging and growing populations; the proliferation of chronic diseases; heightened focus on care quality and value; evolving financial and quality regulations; informed and empowered consumers; and innovative treatments and technologies — all of which are leading to rising costs and an increase in spending levels for care provision, infrastructure improvements, and technology innovations.”

The IoT has brought many exciting advances to healthcare, improving patient experiences, increasing the quality of care provided, as well as updating and streamlining healthcare operations. From digital assistants to ‘smart’ medicine bottles, a new wave of connected devices could help people live independently for longer.

According to Goldman Sachs, IoT functions could produce an estimated $32.4 billion in annual revenue (45% from remote patient monitoring, 37% from telehealth, and 18% from behavior modification). But healthcare IoT not only increases revenue; it also reduces cost by offering a more cost-effective method of managing chronic illness. The estimated $305 billion in savings comes from a combination of chronic disease management and telehealth.


Save IoT, Save Transportation 

I leave this topic for a special post in the coming months.

Key Takeaway: Save IoT, and IoT will help save the world

As I have commented many times, the IoT is a journey. Those who have spent more time in the race know that some stages are easier and others more difficult, but that is no reason to abandon the climb when it gets hard.

If we have not yet managed to give the IoT a single definition, it is not surprising that the term could disappear for business marketing reasons. Nor does it matter that technologies such as AI, VR/AR, robots and blockchain join IoT to solve world problems. We could call it "Unified Information Technology".

The world of 2017 has some immense problems, but it is scary to think about the challenges of the next 10, 20, 50 years. As we have seen, IoT must be an important enabler in saving the world.

IoT heroes, save the IoT, Save the World.

 Thanks in advance for your Likes and Shares

Thoughts? Comments?

Read more…

Here is the latest round-up of articles from IoT Central. Remember: get your friends and enemies to join IoT Central here.

Navigating the Critical Database Decision While Building Our IoT Application

Posted by Gena Minevich

The promise of IoT solutions comes from their tremendous ability to harness data on a scale that has never before been possible. This data, wrangled by countless transmitters and sensors, offers us a wealth of insights about everything from the homes we live in to the products we buy to the health of our own bodies – all while IoT applications provide the power to act upon this data in real-time. Delivering these remarkable capabilities calls for a similarly capable database, one that can match IoT applications’ stringent requirements around performance, scalability, and availability.

Ongoing trends in IoT device lifecycle management

Posted by Mohit Bhardwaj 

IoT device lifecycle management is the key element for industries to have complete insight into and control of their device infrastructure. Today, device lifecycle management enables many industries to transition to 'smart' ecosystems, like smart energy (a.k.a. the Internet of Energy or smart grid), smart buildings, smart retail, smart transportation, smart cities, smart factories, and smart agriculture. As more and more devices get connected, the challenges with data security, control, and management become critical. IoT remote device lifecycle management plays a key role in enabling a 360-degree data view of the device infrastructure.

Interview: Bringing Machine Learning to The Edge

Posted by David Oro

A couple of weeks ago, I spent a few hours at GE Digital's headquarters in San Ramon, CA. It was a great overview by several executives of how GE is using their Predix platform to create software to design, build, operate, and manage the entire asset lifecycle for the Industrial IoT. A big part of this transformation for GE involves hiring tons of software developers, acquisitions, and partnerships. One of those partnerships is with Silicon Valley based FogHorn Systems (GE Ventures, Dell Ventures, March Capital and a few others are investors). FogHorn is a developer of "edge intelligence" software for industrial and commercial IoT applications. FogHorn and GE are working very closely on many IIoT customer use cases, across verticals, bolstered by the integration of FogHorn with Predix. I turned to FogHorn Systems CEO David C. King to learn more about edge intelligence software for the Industrial IoT.

The Buzz of Platforms and the Bazaar of IoT Platforms

Posted by Somjit Amrit

Among the words, phrases and acronyms in the Tech world, "platform" seems to be the one that grabs the headlines. If one listens to any pitch from a start-up venture, it would not be uncommon to get the "platform pitch" in at least 1 out of 2 proposals. A lazy Google search on the "Top 20 Tech weary words" fetched me the result that "platform" was 3rd on the list. Words like "Being Platformed" have been coined, and there are a host of books on the significance of the platform in the technology world. I will not go into the virtues of platforms. I will dwell on how the leaders in respective segments are few (a maximum of 3), while in the IoT world we seem to have, by some counts, 170 of them (McKinsey) to 400 of them (Beecham Research). This is definitely a bewildering array to go through and investigate. What is a platform, and why are there only a few platform leaders?

Infographic: Securing Connected Cars

Posted by David Oro 

In my recent interview with Sam Shawki, the founder and chief executive officer of MagicCube, I wrote about getting a new Ram Truck and noted that it was a beast not just in size and towing power, but a beast of electronics and connectivity. According to Intertrust Technologies, the percentage of new cars shipped with Internet connectivity will rise from 13% in 2015 to 75% in 2020, and that in 2020, connected cars will account for 22% of all vehicles on the road. That number is sure to grow. More stats in the infographic below. 

AggreGate Server on Nanopi NEO

Posted by Victor Polyakov

We've tested AggreGate Server on Nanopi NEO, one of the smallest Linux-based single-board PCs. Despite its small size, this device simply rules! It has 512 MB of RAM on board, a 1.2 GHz quad-core CPU, a 10/100M Ethernet network interface, and many other interfaces to connect to the world. AggreGate's possibilities on the NEO board are similar to those on the Linux-based Tibbo Project System. It can act as a simple close-knit protocol gateway with intermediate data processing.



Read more…

Why Data Visualization Matters Now

Data visualization is not new; it has been around in various forms for thousands of years.
Ancient Egyptians used symbolic paintings, drawn on walls & pottery, to tell timeless stories of their culture for generations to come.
The human brain understands information via pictures more easily than sentences, essays, spreadsheets, etc. You must have seen traffic signs while driving. Why do they carry a single picture instead of a whole sentence like 'school ahead', 'deer crossing' or 'narrow bridge'? Because as a driver you can grasp the image faster while keeping your eyes on the road.
Over the last 25 years, technology has given us popular methods like line, bar, and pie charts showing company progress in different forms, and these still dominate the boardrooms.
Data visualization has become a fundamental discipline as it enables more and more businesses and decision makers to see big data and analytics presented visually. It helps identify the exact area that needs attention or improvement, rather than leaving it to the leaders to interpret as they want.
Until recently, making sense of all that raw data was too daunting for most, but recent computing developments have created new tools like Tableau and Qlik with striking visual techniques, especially for use online, including the use of animations.
There is a wealth of information hiding in your database, just waiting to be discovered. Even complicated historical data collected from disparate sources starts to make sense when shown pictorially. Data scientists do a fantastic job of analyzing this data using machine learning and finding relationships, but communicating the story to others is the last milestone.
In today's digital age, we as consumers generate tons of data every day, and businesses want to use that for hyper-personalization, sending us the right offers by collecting, storing and analyzing this data. Data visualization is the necessary ingredient to bring the power of this big data to the mainstream.
It is hard to tell how the data behaves by looking at a data table. Only when we apply visualization via graphs or charts do we get a clear picture of how the data behaves.
Data visualization allows us to quickly interpret the data and adjust different variables to see their effect, and technology is increasingly making it easier for us to do so.
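
As a tiny illustration of the point, the snippet below charts figures that would look inert in a table; the numbers are invented purely for demonstration.

```python
# Sketch: the same figures that sit flat in a table reveal a trend
# the moment they are charted. The data below is invented.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
revenue = [120, 132, 128, 155, 171, 190]   # hypothetical figures ($k)

plt.plot(months, revenue, marker="o")
plt.title("Monthly revenue (illustrative data)")
plt.xlabel("Month")
plt.ylabel("Revenue ($k)")
plt.grid(True)
plt.show()
```
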
The best data visualizations are ones that expose something new about the underlying patterns and relationships contained within the data. Data visualization brings multiple advantages, such as showing the big picture quickly and simply so action can follow.
Finally, as they say, "A picture is worth a thousand words", and that matters most when you are trying to show the relationships within the data.
Data is the new oil, but it is crude and cannot really be used unless it is refined; visualization is the refinement that brings out the gold nuggets.
Read more…

Among the words, phrases and acronyms in the Tech world, "platform" seems to be the one that grabs the headlines. If one listens to any pitch from a start-up venture, it would not be uncommon to get the "platform pitch" in at least 1 out of 2 proposals. A lazy Google search on the "Top 20 Tech weary words" fetched me the result that "platform" was 3rd on the list (https://www.businessinsider.com.au/the-worlds-top-20-tech-weary-words-for-2014-2014-5).

Words like "Being Platformed" have been coined, and there are a host of books on the significance of the platform in the technology world. I will not go into the virtues of platforms. I will dwell on how the leaders in respective segments are few (a maximum of 3), while in the IoT world we seem to have, by some counts, 170 of them (McKinsey) to 400 of them (Beecham Research). This is definitely a bewildering array to go through and investigate.

What is a platform, and why are there only a few platform leaders?

Stepping back: different people attach different views and meanings to the word "platform". To get a sense of the diversity of platforms, we have:

Browsers (Chrome and Firefox), smartphone operating systems (iOS and Android), blogging (WordPress, Medium), social media titans (YouTube, Facebook) and even Instagram are described as platforms. Uber, Airbnb and their ilk are widely described as 'marketplaces', 'platforms' or 'marketplace-platforms'. Then there are web services (Google Payments, Amazon Elastic Cloud) and gaming consoles (Xbox, Apple's iPod Touch, Sony PlayStation). One interesting point to note is that in each category the market is mostly duopolistic.

To accommodate this diversity, the safest definition of a platform would be:

  1. An extensible codebase of a software-based system that provides core functionality, the modules that interoperate with it, and the interfaces (aka Application Programming Interfaces, or APIs) through which they interoperate. In effect, the system abstracts a number of common functions for its users without exposing the complexity of building and managing them.
  2. The goal is to enable interactions between producers and consumers.
  3. This is enabled through three layers: the Network (connecting participants to the platform), Technology Infrastructure (helping create and exchange value) and Workflow and Data (matching participants with content, goods and services). A minimal sketch of the module-and-interface idea follows.
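
The sketch below illustrates point 1 only: a core exposing a stable interface that interchangeable modules implement. All names are made up for illustration and are not drawn from any real platform.

```python
# Sketch: a platform core with a stable interface (the "API") that
# interchangeable modules implement. Illustrative names only.
from abc import ABC, abstractmethod

class PlatformModule(ABC):
    """Contract every module must satisfy to interoperate."""

    @abstractmethod
    def handle(self, event: dict) -> dict:
        ...

class PlatformCore:
    def __init__(self) -> None:
        self._modules: list[PlatformModule] = []

    def register(self, module: PlatformModule) -> None:
        # Extensibility: new capability is plugged in, not rebuilt.
        self._modules.append(module)

    def dispatch(self, event: dict) -> list:
        return [m.handle(event) for m in self._modules]

class AlertModule(PlatformModule):
    def handle(self, event: dict) -> dict:
        return {"alert": event.get("celsius", 0) > 30}

core = PlatformCore()
core.register(AlertModule())
print(core.dispatch({"celsius": 35.2}))   # [{'alert': True}]
```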

This definition brings out the two dimensions of a platform: one for internal use and the other for external use.

  1. An internal dimension, ensuring all the necessary modules interoperate, and
  2. An external dimension, enabling interaction with the outside world and making the platform as accessible and usable as possible.

Internal-dimension-led platforms focus on internal productivity and efficiency and on their own users; development is internally sourced and the platform is essentially built for internal use. External-dimension-led platforms address both the supply (developer) side and the demand (user) side, and are sometimes termed 'two-sided' platforms. Beyond a point their development is crowd-sourced: developers enrich the platform, and the platform reaches out to them through APIs.

In most cases, if the external dimension is well evolved, the internal efficiencies come by default: design quality, selection of interfaces leading to interoperability, robustness of infrastructure, and seamlessness in workflow and data streaming.

External-dimension platforms compete for both users and developers.

One important aspect to remember here is that a platform by itself may not provide solutions to contextual and domain-specific problem statements. Applications built around the platform do that; these applications are what deliver the Return on Investment (RoI) from the platform.

In any segment you will have seen that the winners are few (at most 2 or 3; aspirants may be many, but they progressively wither away). The reasons were presented above: design quality, interoperability, infrastructure robustness, seamlessness in workflow and data flow and, last but not least, an excellent and friendly user interface. Not many can master all four aspects. Mastering them helps a platform acquire a critical mass of customers, which keeps growing, until a duopoly of sorts is created in the market space.

Successful platforms support a variety of business use cases in the present and strive for a design that can evolve over time and be, to an extent, future-ready.

The Bazaar of IoT platforms – the reasons behind it, and who will be the winners wading through the maze?

Coming now to the Internet of Things (IoT): the IoT movement repeatedly talks about platforms, but those definitions don't align with any of Uber, Medium or Android. The first issue is interoperability. And none of these align with each other either.

Now let us address the question of why there is a "plethora of platforms" in IoT.

It can be seen clearly that the typical architecture of an IoT solution is multilayered. Simply put, the layers are Device to Device (involving hardware and firmware with short-range communication), Device to Server (again involving hardware and communication) and Server to Server (where cloud-based applications and long-range communication hold the key, along with networking, data storage and data visualisation).
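
Read as code, those hops might look like the following sketch. The function names and payloads are hypothetical and stand in for real protocol stacks (BLE/Zigbee at layer one, MQTT or cellular at layer two, cloud SDKs at layer three).

```python
# Sketch of the three hops in a typical multilayered IoT stack.
# Hypothetical names; real systems would use BLE/Zigbee, MQTT/cellular
# and cloud service SDKs at the respective layers.

def device_to_device(sensor_reading: float) -> dict:
    # Layer 1: firmware plus short-range radio aggregates nearby sensors.
    return {"celsius": sensor_reading, "hop": "gateway"}

def device_to_server(packet: dict) -> dict:
    # Layer 2: the gateway forwards over a long-range link to an ingest server.
    packet["hop"] = "ingest-server"
    return packet

def server_to_server(packet: dict) -> dict:
    # Layer 3: cloud services store, analyse and visualise the data.
    packet["hop"] = "analytics"
    return packet

print(server_to_server(device_to_server(device_to_device(23.7))))
```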

So we see protocols and standards driven by their origins: in communication technologies (telecom companies like AT&T and Verizon lead here), in the data storage area (Amazon and Google lead the way), and on the application side (Azure from Microsoft and ThingWorx from PTC being the prominent ones). Companies which hold a library of business use cases, given the dominance they enjoy in their respective businesses (namely Bosch, GE, Honeywell), have the ambition to build their own community-based platforms. Then we have a host of start-ups who run a platform per business use case they address.

That is the genesis of the "plethora of platforms" in the multilayered solution stack of IoT. It adds complexity, and hence no one player can be a leader across all the layers as of today.

In the coming years it can be reckoned that there will be a shakeout in the market, and the surviving platforms could coalesce around key broad-based use cases: remote monitoring and environment conditioning, predictive maintenance, and process automation.

The ones which win the battle for supremacy will have cracked the codes of:

  1. Security,
  2. Open interfaces,
  3. Carrier-grade reliability,
  4. Service levels,
  5. Scalability,
  6. Seamless integration into the back-office environment that is essential to the enterprise's business operations, and
  7. Impressive usability and user interface.

Given the multitier architecture and the attendant complexity, it will be a while before a small group of winners starts to bubble to the top. Some of the also-ran aspirants may focus on particular domains, addressing a specific part of the ecosystem, or on industry segments like home or industrial, to justify their presence.

 

 

Read more…

Interview: Bringing Machine Learning to The Edge

A couple of weeks ago, I spent a few hours at GE Digital's headquarters in San Ramon, CA. Several executives gave a great overview of how GE is using their Predix platform to create software to design, build, operate, and manage the entire asset lifecycle for the Industrial IoT. A big part of this transformation for GE involves hiring tons of software developers, acquisitions, and partnerships.

One of those partnerships is with Silicon Valley based FogHorn Systems (GE Ventures, Dell Ventures, March Capital and a few others are investors). FogHorn is a developer of “edge intelligence” software for industrial and commercial IoT applications. FogHorn and GE are working very closely on many IIoT customer use cases, across verticals, bolstered by the integration of FogHorn with Predix.

I turned to FogHorn Systems CEO David C. King to learn more about edge intelligence software for the Industrial IoT. David has been at the helm of FogHorn since 2015, a year after its founding. Prior to FogHorn, David co-founded AirTight Networks, Inc., a technology leader in secure cloud-managed Wi-Fi. Before AirTight, he served as Chairman, President and Chief Executive Officer of Proxim Inc., a pioneer in WLANs and the first publicly traded Wi-Fi company, from 1993-2002.

Lots of talk about the edge in IoT. It’s my smartphone and my doorbell, as well as the sensor on a traffic light or a wind turbine. What exactly is the edge of the network and how do you define it?

We define edge as the closest compute point that can process real time streaming data. So in your case, all three -- phone, doorbell, sensors -- are edges because you can bring compute to the data on any of these platforms. The question is what compute is possible? The single variable filtering that you can do on a sensor is very simple when compared to the complex Machine Learning models that can execute on your phone.   

Analytics is done in the data center or cloud. You claim to do this at the edge now.  Please describe your offering.  

FogHorn has developed a tiny-footprint complex event processor (CEP) that provides advanced streaming analytics and machine learning capabilities at the edge. This powerful combination of being able to pre-process and cleanse the data and execute ML models, all in real time, brings the power of big data analytics to the edge. The FogHorn software platform is highly flexible and can be easily scaled to optimize for footprint and/or feature needs.
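
To give a flavour of what pre-processing and evaluating a stream at the edge can mean, here is a generic sliding-window sketch in Python. It is not FogHorn's engine or API; the threshold and cleansing rule are invented for illustration.

```python
# Generic edge-analytics pattern (not FogHorn's actual API): keep a
# sliding window over a sensor stream, discard implausible samples,
# and raise an event when the windowed average crosses a threshold.
from collections import deque

WINDOW = deque(maxlen=5)   # last 5 valid samples
THRESHOLD = 80.0           # hypothetical vibration limit

def on_reading(value: float):
    if not 0.0 <= value <= 200.0:     # cleanse: drop sensor glitches
        return None
    WINDOW.append(value)
    if len(WINDOW) < WINDOW.maxlen:
        return None                   # not enough history yet
    avg = sum(WINDOW) / len(WINDOW)
    if avg > THRESHOLD:
        return {"event": "threshold_exceeded", "avg": round(avg, 2)}
    return None

for v in [75, 78, 81, 999, 85, 90]:   # 999 is discarded as noise
    event = on_reading(v)
    if event:
        print(event)                  # fires on the final reading
```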

Tell us about a customer you’re working with and how they are applying your technology.

FogHorn Lightning is an extensible platform currently used by customers in manufacturing, oil and gas, power and water, renewable energy, mining, transportation, smart buildings/cities and other industrial verticals. The deployment patterns range across gateways, PLCs, and ruggedized servers in production at Fortune 100 sites. Common implementations of FogHorn Lightning are product quality inspection, predictive maintenance, and real-time health monitoring. Customers are seeing immediate business value; e.g. identifying defects in the early stages of manufacturing reduces scrap and increases yield. Additionally, there is a trend of customers using FogHorn to generate new streams of revenue by providing real-time smart maintenance for their end customers.

When compared to software-defined IIoT smart gateways, there are still millions more hardware-defined M2M gateways out there. At what point do we cross the chasm to smarter gateways, and where are we now in this cycle?

We are still very early in the adoption of IIoT technologies. Understandably, typical industrial sectors are conservative and have much longer adoption curves. However, we are beginning to observe that the ROI from edge intelligence is accelerating customer demand for FogHorn. We will cross the chasm once industries identify key use cases that generate new revenue streams, which is still about 3-5 years away.

You can’t talk about IoT without talking about security, and it’s even more important in the industrial sector. How do you address security concerns for your customers and what does the industry need to do to make IoT more secure?

Yes, you are right. When you think of IoT, especially IIoT, security is a top concern. Hacks such as “Devil’s Ivy” will become everyday events with increasingly connected devices. At FogHorn, our edge intelligence software runs very close to the data source, and is local to the asset. This implies that we are secure (like the assets) behind firewalls, and in a DMZ layer. And because most of our processing is done locally, we are less vulnerable to malicious hacks that occur when connected.

Because IIoT is still such a nascent set of technologies, we caution users to deploy solutions after thoroughly weighing the business value, and convenience versus security risk factors. My guiding question before any deployment: “Can I do this locally, without connecting to an external network?”. The answer is usually yes, and if otherwise, you should probably talk to us.

How can companies make their industrial processes better?

We understand that today’s industrial processes are highly complex and advanced, with many moving parts. While it may seem humanly impossible to optimize it any more without help from technology, we believe that a key asset is still untapped: your operator! Companies will start seeing incredible improvements once they translate the tribal knowledge on the plant floor into actionable insights. This can be further supplemented by techniques from machine learning, and artificial intelligence, to tease out the known unknowns, and also, the unknown unknowns.

Anything else you’d like to add?

FogHorn is redefining edge intelligence for IIoT. A year ago, we started our journey as a company that did analytics on tiny-footprint devices. Today, we have accelerated the transition to machine learning at the edge, and are very excited about the market validation. With our Operational Technology focus, we are looking forward to defining new business models and delivering transformational value for our industrial customers.

Read more…

Do you want to hire a Data Scientist?

As Tom Davenport noted a few years back, data scientist is still the hottest job of the century.
Data scientists are those elite people who solve business problems by analyzing tons of data, communicating the results in a very compelling way to senior leadership, and persuading them to take action.
They have the critical responsibility to understand the data and help businesses get more knowledgeable about their customers.
The importance of data scientists has risen to the top due to two key issues:
  • An increased need and desire among businesses to gain greater value from their data to stay competitive
  • Over 80% of the data/information that businesses generate and collect is unstructured or semi-structured and needs special treatment
So it is extremely important to hire the right person for the job. Requirements for being a data scientist are pretty rigorous, and truly qualified candidates are few and far between.
Data scientists are in very high demand, hard to attract, and come at a very high cost, so a wrong hire is all the more frustrating.
Here are some guidelines for checking them:
  • Check their logical reasoning ability
  • Problem-solving skills
  • Ability to collaborate and communicate with business folks
  • Practical experience working with Big Data tools
  • Statistical and machine learning experience
  • Should be able to clearly describe projects in which they have solved business problems
  • Should be able to tell a story from the data
  • Should know the latest in cognitive computing and deep learning
I have seen some of the smartest data scientists in my career do the best analysis but fail to communicate the results to senior leaders effectively. Ideally they should know the data in depth and be able to explain its significance properly. Data visualization comes in very handy at this stage.
Today, with digital disrupting every field, data science is affected as well.
Gartner has called this new breed citizen data scientists. Their primary job function is outside analytics; they don't know much about statistics, but they can work with ready-to-use algorithms available through APIs like Watson, TensorFlow, Azure and other well-known tools.
Good data scientists can make use of them to spread awareness and expand their influence.
It has become all the more important to hire the right data scientist, as they will show you the results that may make or break the company.
Read more…

This is my first post on IoT Central. Looking forward to hearing more about IoT from the members.

I am curious to see what people think of the MKR Labs report on how hackers can turn your Amazon Echo into a listening device. According to the report, it seems one of Amazon's most popular new products is vulnerable to "a physical attack that allows an attacker to gain a root shell on the underlying Linux operating system and install malware without leaving physical evidence of tampering." This type of malware can give hackers remote access to your Echo, the ability to grab customer authentication tokens, and the ability to stream live microphone audio to any other remote service without altering the functionality of the device.

Today, GearBrain did a post on this news. I am curious to see what others think about this type of hacking and how big an issue it is for Amazon and other manufacturers of voice-controlled digital assistants like Echo.

Read more…

Infographic: Securing Connected Cars

In my recent interview with Sam Shawki, the founder and chief executive officer of MagicCube, I wrote about getting a new Ram Truck and noted that it was a beast not just in size and towing power, but a beast of electronics and connectivity. According to Intertrust Technologies, the percentage of new cars shipped with Internet connectivity will rise from 13% in 2015 to 75% in 2020, and that in 2020, connected cars will account for 22% of all vehicles on the road. That number is sure to grow. More stats in the infographic below. 



Read more…

AggreGate Server on Nanopi NEO

We’ve tested AggreGate Server on Nanopi NEO, one of the smallest Linux-based single-board PCs.

Despite its small size, this device simply rules! It has 512 MB of RAM on board, a 1.2 GHz quad-core CPU, a 10/100M Ethernet network interface, and many other interfaces to connect to the world.

AggreGate's possibilities on the NEO board are similar to those on the Linux-based Tibbo Project System. It can act as a simple close-knit protocol gateway with intermediate data processing.

Check out our infrastructure monitoring tools, industrial automation software and other AggreGate IoT solutions on our website.

Read more…

The promise of IoT solutions comes from their tremendous ability to harness data on a scale that has never before been possible. This data, wrangled by countless transmitters and sensors, offers us a wealth of insights about everything from the homes we live in to the products we buy to the health of our own bodies – all while IoT applications provide the power to act upon this data in real-time. 

Delivering these remarkable capabilities calls for a similarly capable database, one that can match IoT applications' stringent requirements around performance, scalability, and availability. While these are the three core database concerns behind just about any application, I'd argue the IoT offers an even more mission-critical use case. Consider the magnitude of the data that IoT applications must process, along with the fact that many of these applications aren't even functional if they cannot perform with near real-time fidelity – let alone if they become wholly unavailable – and it's clear that IoT applications are introducing a new level of intensity when it comes to the demands placed on the database.

We know a thing or two about vetting and working with database solutions for IoT applications. Our project is the creation of an IoT Smart Kitchen Commerce solution – embedded within connected kitchen appliances – which links to grocery retailers and makes it easy to purchase needed items from within the customer's own kitchen. Enabling this system's simple-to-use front end required tremendous back-end complexity. The database facilitating the solution needed to handle rich information on more than 1 million individual grocery products, which had all been collected by mapping retailers' online product catalogs in granular detail. At its heart, this smart kitchen solution is all about the quality of the experience it offers users – whether it can truly make grocery shopping and managing kitchen restocking easier for customers – so its success absolutely depends on the database delivering real-time responsiveness and total reliability.

A major question in determining the specifics of the database we’d use for this project was the debate between an SQL or a NoSQL strategy. Each of these technologies has a lot to offer when applied to solutions for which they’re well-suited. Different IoT application use cases may benefit from utilizing SQL (relational) or NoSQL (non-relational) databases, and, to be both diplomatic and accurate, many projects might use both in tandem to great effect.

In our particular case, we discovered that NoSQL was the best fit for the task at hand, though the path to realizing it was a bit of a winding road. We actually began developing our solution by using MySQL as the database, managed by our internal team. Unfortunately, this led to a lot of difficulties. We found MySQL replication and other administrative work to be disagreeable and full of thorny issues.

To alleviate what was becoming quite a pain point, we then decided to shift to a NoSQL database, recognizing that open source MongoDB would fit our solution much better, for reasons I’ll elaborate on. Also, having learned that managing the database internally wasn’t preferred or a great use of our resources, we enlisted mLab to be our MongoDB Database-as-a-Service provider. This accomplished two things: it made sure that the move to NoSQL MongoDB was handled seamlessly by experts, and provided our team the bandwidth to dive headlong into product development, which was really the best use of our team’s time.

NoSQL made sense in our case because the many data providers we work with use different data schemas. This, understandably, had created issues with MySQL, which requires defined schemas prior to accepting data. For enterprises in this position, our advice is to exploit a NoSQL database's ability to store denormalized documents, quickly and easily accepting any data in any format.
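
As an illustration of that flexibility, the sketch below inserts two product documents with different shapes into one MongoDB collection via pymongo. The connection string, collection and field names are hypothetical, not our production schema.

```python
# Sketch: MongoDB accepts documents of differing shape in a single
# collection, so a new data provider with extra or missing fields
# needs no schema migration. All names and data are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
products = client["smart_kitchen"]["products"]

# Provider A supplies nutrition data; provider B supplies pack sizes.
products.insert_one({
    "sku": "A-1001", "name": "Rolled Oats",
    "nutrition": {"kcal_per_100g": 379, "protein_g": 13.2},
})
products.insert_one({
    "sku": "B-2002", "name": "Olive Oil",
    "pack_sizes_ml": [250, 500, 1000], "origin": "ES",
})

print(products.count_documents({}))
```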

In addressing the key issue of scalability, a NoSQL approach has some effective advantages for dealing with the vast magnitude of IoT datasets while maintaining high performance. With NoSQL, for instance, it’s possible to utilize however many sharded servers there’s a need for, while each single server can maintain a limited and pre-determined size, making scaling easy to execute. It’s also worth saying that, because NoSQL shifts a good deal of logic to the application and away from the database, it’s easier for enterprises to recruit the talent they need. Great Java or C# coders outnumber great database programmers, and the expertise to optimize performance for complex queries is a rare and valued skill. While this remains true for any database, implementing a NoSQL strategy makes it that much less challenging to put a team in place with the right skillsets. It’s also the case that NoSQL is rapidly rising in popularity throughout the industry – with this rise, the tooling, frameworks, and knowhow required to best utilize NoSQL are becoming increasingly prevalent as well.

As we’ve discovered through our experience with MongoDB, the non-relational database has proven appropriate to meeting the huge data-handling demands of always-on IoT applications. Again, relational databases may offer a better fit for certain IoT applications, such as those working with less sizeable or dynamic datasets. But for our particular application – by handling vast and dynamic datasets regardless of their structure – our open source, NoSQL MongoDB approach provides the high-speed read and write performance, scalability, and high availability needed to deliver positive and effective real-time experiences for IoT consumers.

Read more…
