
Guest blog post by Ajit Jaokar

In this post, I propose that IoT analytics should be part of ‘Smart objects’ and discuss the implications of doing so.

The term ‘Smart objects’ has been around since the days of Ubiquitous Computing.

However, as we have started building Smart objects, I believe that their meaning and definition have evolved.

Here is my view on how the definition of Smart objects has changed in a world of Edge Computing and increasing processing capacity.

At a minimum, a Smart object should have three things:

a) An identity (e.g., an IPv6 address)
b) Sensors and/or actuators
c) A radio (Bluetooth, cellular, etc.)

In addition, a Smart object could incorporate:

a) Physical context (e.g., location)
b) Social context (e.g., proximity in social media)

Extending further, smartness could also incorporate analytics.

Some of these analytics could be performed on the device itself – the computing-at-the-edge concept from Intel, Cisco and others.

However, Edge Computing as discussed today still has some limitations.

For example:

a) The need to incorporate multiple feeds from different sensors to reach a decision ‘at the edge’

b) The need for a workflow process, i.e. actions based on readings – again often at the edge, with its accompanying security and safety measures

To manage multiple sensor feeds, we need to understand concepts like sensor fusion (PDF) (source: Freescale).
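To make this concrete, here is a minimal sensor-fusion sketch in Python (our illustration, not from the Freescale paper): two noisy sensors measuring the same quantity are combined by inverse-variance weighting, the simplest special case of Kalman-style fusion. The readings and variances are made up.

```python
import numpy as np

def fuse(readings, variances):
    """Combine noisy estimates of the same quantity by inverse-variance
    weighting -- more trustworthy sensors get more weight."""
    w = 1.0 / np.asarray(variances, dtype=float)
    return float(np.sum(w * np.asarray(readings, dtype=float)) / np.sum(w))

# Two temperature sensors observing the same process: a precise one
# (variance 0.25) and a cheap one (variance 4.0).
print(fuse([21.3, 22.9], [0.25, 4.0]))  # ~21.39, pulled toward the precise sensor
```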

We already have some rudimentary workflow through mechanisms like IFTTT (If This Then That).
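An IFTTT-style rule reduces to a condition/action pair evaluated against each incoming reading. The sketch below shows the pattern; the threshold, field names and action are hypothetical:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    condition: Callable[[dict], bool]  # "if this"
    action: Callable[[dict], None]     # "then that"

rules = [
    # Hypothetical rule: alert when a temperature reading exceeds 30 C.
    Rule(condition=lambda r: r["temp_c"] > 30.0,
         action=lambda r: print(f"alert: overheating on {r['device_id']}")),
]

def on_reading(reading: dict) -> None:
    """Run every matching rule against one sensor reading."""
    for rule in rules:
        if rule.condition(reading):
            rule.action(reading)

on_reading({"device_id": "sensor-7", "temp_c": 31.2})  # triggers the alert
```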

In addition, rising CPU capacity brings greater intelligence to the device – for example, the Qualcomm Zeroth platform, which enables deep learning algorithms on the device.

So, in a nutshell, it’s an evolving concept – especially if we include IoT analytics in the definition of Smart objects, accepting that some of these analytics could be performed at the Edge.

We cover these ideas in the #DataScience for #IoT course and in the courses I teach at Oxford University.

Comments welcome

Follow us @IoTCtrl | Join our Community

Read more…

Time Series IoT applications in Railroads

Guest blog post by Ajit Jaokar


Authors: Vinay Mehendiratta, PhD, Director of Research and Analytics at Eka Software, and Ajit Jaokar, Data Science for IoT course

 

This blog post is part of a series of blogs exploring Time Series data and IoT.

The content and approach are part of the Data Science for Internet of Things practitioners course.  

Please contact [email protected] for more details.

Only for this month, we have a special part-payment pricing for the course (which begins in November).

We plan to develop these ideas further – including an IoT toolkit in the R programming language for IoT datasets. You can sign up for more posts from us HERE.

Introduction 

Over the last fifteen years, railroads in the US, Europe and other countries have been using RFID devices on their locomotives and railcars. Typically, this information is stored in traditional (i.e. mostly relational) databases. The RFID scanner provides the railcar number and locomotive number, and the railcar number is then mapped to the existing railcar and train schedules. Timestamps on the scanned data also give us the sequence of cars on the train. Scanning the RFID on a locomotive gives us the number of locomotives and the total horsepower assigned to the train, and tells us whether each locomotive is coupled at the front or the rear of the train.

The scanned data requires cleansing. Often, readings from a railcar RFID are missing at a certain scanner. In this case, the missing value is estimated by looking at the scanner readings before and after the problematic scanner to estimate the time of arrival.
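A minimal sketch of this cleansing step in Python/pandas; the table layout is hypothetical, and plain linear interpolation stands in for the distance- or run-time-weighted estimate a railroad would actually use:

```python
import pandas as pd

# Arrival times (minutes past midnight) for one railcar at successive
# scanners; the reading at scanner 2 was missed.
scans = pd.DataFrame({
    "scanner_seq": [1, 2, 3],
    "arrival_min": [360.0, None, 480.0],
})

# Estimate the missing value from the scanners before and after it.
scans["arrival_min"] = scans["arrival_min"].interpolate(method="linear")
print(scans)  # scanner 2 estimated at 420.0 (the midpoint)
```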

Major railroads have also defined their territory using links, where a link is the directional connection between two nodes. Railroads have put RFID scanners at major links.

An RFID scan thus gives information on railcar sequence within a train, locomotive consist and track occupancy in real time. Railroads store this real-time and historical data for analysis.

Figure 1: Use cases of Rail Time Series Data

 

 

Figure 1 above shows use cases of time series data in the railroad industry. We believe that all of these use cases are applicable to freight railroads. They can also be applied to passenger railroads with some changes. All of them involve the use of analytics and RFID data.

Uses of Real-Time Time Series Data

Here are some ways that time series data is, or can be, used by railroads in real time.

  1. Dispatching: Scanner data has been used in dispatching decisions for many years. It is used to display the latest locations of trains, and dispatchers combine this information with track type, train type and timetable information to determine the priority that should be assigned to each train.
  2. Information for Passengers: Passengers can use train arrival and departure estimates to plan their journeys.

 

Uses of Historical Time Series Data:

Here are some ways that historical time series data is, or can be, used by railroads.

  • Schedule Adherence – identify trains that are consistently delayed: We can identify trains that are on schedule, delayed or early. We can also identify trains that consistently occupy tracks for longer than the schedule permits. These are the trains that should be considered for a schedule change, and they are candidates for root cause analysis (a minimal sketch of this analysis follows this list).

  • Better Planning: We can determine whether the planned ‘sectional running times’ are accurate or need to be revisited. Sectional running times are generally determined from experience; they are network-level estimates and don’t consider local infrastructure (signals, track type). Sectional running times are used in developing train schedules and maintenance schedules at the network and local levels.

  • Infrastructure Improvement – Track Utilization: We can identify the sections of track where trains have the highest occupancy. This leads us to tracks that are being operated near or above track capacity, the assumption being that utilization above track capacity results in delays. We can identify the sets of trains, tracks, times of day and days of the week when occupancy is high or low. This provides insight into train movement and perhaps suggestions for schedule changes. We might also be able to determine whether trains are held up at stations/yards or on the mainline; an in-depth and careful analysis can tell us whether attention needs to be paid to yard operations or mainline operations.

  • Simulation Studies: RFID scan data provides the actual times of arrival and departure for every car (and hence every train). Modelers currently create hypothetical trains to feed simulation studies. This information (actual train arrival/departure times at every scanner, train consist, locomotive consist) is used in infrastructure expansion projects.

  • Maintenance Planning: Historical occupancy of tracks enables us to identify time windows in which future maintenance should be scheduled. Railroads use inspection cars to inspect and record track condition regularly, but some railroads face the challenge of getting accurate geo-coordinates for each segment of track. Careful analysis of this geo and time series data measures track health and deterioration. Satellite imagery is also becoming available more frequently; a combination of these two sources can serve well to inspect tracks, schedule maintenance, predict track failures and move maintenance gangs.

  • Statistical Analysis of Railroad Behavior:
  1. We can map train behavior against train definition (train type, schedule, train speed, train length) and track definition (signal type, track class, grade, curvature, authority type) and identify patterns.
  2. Passenger trains affect the operations of freight trains. Scanner data can be used to determine the delay they impose on freight trains.
  3. Time series information on railcars can be used to identify misrouted or lost cars.
  4. Locomotive consist information and time-series-based performance can be used together to determine, from history, the best locomotive consist (make, horsepower) for every track segment.
  5. Locomotives are costly assets for any railroad. Time series data can readily be used to determine locomotive utilization.
  • Demand Forecasting: Demand for empty railroad cars is a known indicator of a country’s economy. While demand for railroad cars varies with car type and macro-economic factors, it is worth getting insights from the historical record. The number of cars by car type can be estimated and forecast for every major origin-destination pair. The numbers of train starts and train ends at every origin and destination can be used to forecast the number of trains in a future month. The forecast number of trains helps a railroad determine the number of crew and locomotives it needs, as well as the load its tracks will bear. Forecast train counts can also be used in infrastructure studies.

 

  • Safety: Safety is the most important feature of railroad culture. Track maintenance and track wear and tear (track utilization) are all related to safety. Time series data on railcars, signal type, track type, train type, accident type and train schedules can be analyzed together to identify potential relationships (if any) between the relevant factors.

 

  • Train Performance Calculations: What is the unopposed running speed on a track with a given grade, curvature, locomotive consist, car type, and wind direction and speed? These factors were determined by Davis [1] in 1926. Could time series data help us calibrate the coefficients of the Davis equation for railcars with new designs?

  • Planning and Optimization: All of the findings above can be used to develop smarter optimization models for train scheduling, maintenance planning, locomotive planning, crew scheduling and railcar assignment.
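As referenced under Schedule Adherence above, here is a minimal pandas sketch that flags consistently delayed trains from scanner timestamps. The table layout, train IDs and the 10-minute threshold are illustrative assumptions, not part of the original article:

```python
import pandas as pd

# One row per (train, scanner) event, with scheduled and actual times.
events = pd.DataFrame({
    "train_id":  ["Q101", "Q101", "Q202", "Q202", "Q202"],
    "scheduled": pd.to_datetime(["06:00", "07:30", "06:15", "08:00", "09:40"]),
    "actual":    pd.to_datetime(["06:05", "07:36", "06:45", "08:55", "10:30"]),
})
events["delay_min"] = (events["actual"] - events["scheduled"]).dt.total_seconds() / 60

# Trains whose median delay exceeds 10 minutes are candidates for
# root cause analysis and a possible schedule change.
adherence = events.groupby("train_id")["delay_min"].agg(["median", "mean", "count"])
print(adherence[adherence["median"] > 10])
```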

 

Conclusion:

In this article, we have highlighted some use cases of time series data for railroads. There are many more factors that could be considered, especially in the choice of technology for implementing these time series algorithms. In subsequent posts, we will show how some of these use cases could be implemented in the R programming language.

To learn more about the Data Science for Internet of Things practitioners course, please contact [email protected]. You can sign up for more posts from us HERE.

Reference:

  1. Davis, W.J., Jr.: The tractive resistance of electric locomotives and cars, General Electric Review, vol. 29, October 1926.

Follow us @IoTCtrl | Join our Community

Read more…

The 10 Best Books to Read Now on IoT

At IoT Central we aim to cover all things industrial and IoT. Our site is segmented into five channels: Platforms, Apps & Tools, Data, Security and Case Studies. If you’re going to connect everything in the world to the Internet you should expect to cover a lot. That means plenty of reading, sharing and discussing.  

To tackle the reading part we reached out to our peers and friends and put together the 10 best books to read now on IoT. From theoretical to technical, we tried to find the most important and current reading while throwing in one or two relevant classics.

Below is the list we compiled. What books would you recommend?

Shaping Things

By Bruce Sterling


I first came across Bruce Sterling’s name when he wrote the November 1996 Wired cover story on Burning Man. I happened to attend the desert arts festival for the first time that year, and Bruce’s prose nailed the experience. I’ve been a fan of his ever since. "Shaping Things is about created objects and the environment, which is to say, it's about everything," says Bruce. This is a great higher-level book that looks at the technosocial transformation needed to understand the relationship between the Internet of Things and the environment in which it exists.

The Hardware Startup

By Renee DiResta, Brady Forrest, Ryan Vinyard


Consumer Internet startups seem to get all the media ink these days – think Airbnb, Instagram, WhatsApp, Uber. But many forget that much of the industry’s technological innovation began with hardware – think Fairchild Semiconductor, Xerox PARC and the stuff that came out of IBM. With an emphasis on ‘Things,’ IoT is set to usher in a new era of hardware startups, and any entrepreneur in this space should find this book a valuable read.

IoT M2M Cookbook

By Harald Naumann


If IoT devices can’t communicate, you’re not going to get much use out of them. Someone pointed me to Harald Naumann’s IoT M2M Cookbook. Harald is an M2M evangelist with a primary interest in the implementation of wireless applications, and his blog is chock-full of technical tips on wireless communications.

IoT Central members can see the full list here. Become a member today here

Read more…

Guest blog post by Jin Kim, VP Product Development for Objectivity, Inc.

Almost any popular, fast-growing market experiences at least a bit of confusion around terminology. Multiple firms are frantically competing to insert their own “marketectures,” branding, and colloquialisms into the conversation with the hope their verbiage will come out on top.

Add in the inherent complexity at the intersection of Business Intelligence and Big Data, and it’s easy to understand how difficult it is to discern one competitive claim from another. Everyone and their strategic partner is focused on “leveraging data to glean actionable insights that will improve your business.” Unfortunately, the process involved in achieving this goal is complex, multi-layered, and very different from application to application depending on the type of data involved.

For our purposes, let’s compare and contrast two terms that are starting to be used interchangeably – Information Fusion and Data Integration. These two terms in fact refer to distinctly separate functions with different attributes. By putting them side-by-side, we can showcase their differences and help practitioners understand when to use each.

Before we delve into their differences, let’s take a look at their most striking similarity. Both technologies and their attendant best practices are designed to integrate and organize data coming in from multiple sources in order to present a unified view of the data, making it easier for analytics applications to derive the “actionable insights” everyone is looking to generate.

However, Information Fusion diverges from Data Integration in a few key ways that make it much more appropriate for many of today’s environments.

• Data Reduction – Information Fusion is, first and foremost, designed to enable data abstraction. So, while Data Integration focuses on combining data to create consumable data, Information Fusion frequently involves “fusing” data at different abstraction levels and differing levels of uncertainty to support a narrower set of application workloads.

• Handling Streaming/Real-Time Data – Data Integration is best used with data-at-rest or batch-oriented data. The problem is that the most compelling applications associated with Big Data and the Industrial Internet of Things are often based on streaming sensor data. Information Fusion is capable of integrating, transforming and organizing all manner of data (structured, semi-structured and unstructured), and specifically time series data, for use by today’s most demanding analytics applications, bridging the gap between Fast Data and Big Data. Another way to put this: Data Integration creates an integrated data set in which the larger set is retained, whereas Information Fusion uses multiple techniques to reduce the amount of stateless data and provide only the stateful – valuable and relevant – data, delivering improved confidence (a small sketch of this reduction follows this list).

• Human Interfaces – Information Fusion also gives a human analyst the opportunity to incorporate their own contributions to the data in order to further reduce uncertainty. By adding, and saving into existing and new data, the inferences and detail that can only be derived through human analysis, organizations can maximize their analytics efforts and deliver a more complete “Big Picture” view of a situation.
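To make the reduction idea concrete, here is a minimal Python sketch (our illustration, not Objectivity’s implementation): instead of retaining every reading, as integration would, the stream is reduced to the state changes that carry information. The noise band and values are made up:

```python
def significant_changes(readings, noise_band=0.5):
    """Yield only the readings that move the observed state by more than
    the noise band -- the stateless points in between are dropped."""
    last = None
    for t, value in readings:
        if last is None or abs(value - last) > noise_band:
            last = value
            yield t, value

stream = [(0, 20.0), (1, 20.1), (2, 20.2), (3, 23.5), (4, 23.6), (5, 19.8)]
print(list(significant_changes(stream)))  # [(0, 20.0), (3, 23.5), (5, 19.8)]
```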

As you can see, Information Fusion, unlike Data Integration, focuses on deriving insight from real-time streaming data and enriching this stream with semantic context from other Big Data sources. This is a critical distinction, as today’s most advanced, mission-critical analytical applications are starting to look to Information Fusion to add real-time value.

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…

Looking Back on a Decade of Analytics

Guest blog post by Venkat Viswanathan

More than 10 years ago, on an early summer afternoon in 2005, I recall an interesting conversation with a friend about the potential of analytics, in an empty coffee shop on the beach. Having spent much of the previous decade helping clients derive value from traditional Business Intelligence (BI) and other IT implementations, we were conjecturing that the next wave might be business teams enhancing their decision making with better data and more detailed analysis, sometimes adopting advanced math. We spent hours talking about our collective experience, and the more we spoke, the more it piqued my curiosity.

Deriving Actionable Insights

Around that time, the emphasis on analytics was primarily on its predictive capabilities. Mainstream media played this up with cover stories in BusinessWeek (“Math Will Rock Your World” by Stephen Baker, January 2005) and in popular business books like Freakonomics (Steven Levitt and Stephen Dubner, 2005) and later Super Crunchers (Ian Ayres, 2007). However, within a couple of years, it became evident that the vast majority of businesses were still not convinced about the quality of the data they collect (many still aren’t!), and that the fundamental need was to create organizational change. Companies needed to understand that there was a better way to derive actionable business insights that would result in better decisions on the ground. More than just help with solving tough math problems, large enterprises needed the ability to dig deep into the data, working in the trenches to create a center of excellence tailored to address their specific gaps in knowledge, talent availability and the problem-solving skills essential to realizing the potential of analytics.

Along the way, changes in the business environment, innovations in the technology ecosystem, and the successes of data-driven business models – creating digital role models like Google AdWords and Facebook early on, and Uber and Airbnb more recently – have redefined the analytics opportunities for businesses. As social media platforms took off, companies took notice, and around 2009 consumer brands started warming up to “social listening.” Decoding social media conversations and other unstructured data became an important part of their analytics needs. Knowledge of semantic analysis, platform APIs and client business context proved crucial in combining machine intelligence with human intelligence and delivering digital business insights.

Visualizing the Future

Another wave of analytics for enterprises was around data visualization. I love saying “visualization is to analytics what email was to the Internet”: a simple, engaging and dynamic application that is used every day and helps with managing a business. The pioneering work and books of Edward Tufte and Stephen Few were an inspiration for many as they realized both the power of data visualization and that they needed help to do it well. Tableau’s meteoric rise and ever-expanding footprint meant that by 2012 many companies had already invested in it and were looking for knowledge and expertise in embedding it within their businesses. The visual medium is crucial to democratizing access to data and allowing business users to quickly navigate to actionable insights.

Fast forward to 2015 and, peering ahead, the future is bright! With the advent of multiple iterations of drones, intelligent machines, connected devices and wearables – from watches to activity trackers to devices that infer your state of mind from your breath – and the increasing buzz around the potential of the Internet of Things (IoT), we are at the cusp of some fundamental breakthroughs. In less than a decade, machine-generated data and algorithm-driven interactions will far outnumber traditional data sources and applications. Analytics will gain all-new interpretations and business applications. The future of the space has infinite possibilities and is exceedingly exciting. As we head toward the close of 2015, we can look forward to the coming year assured that it will be filled with innovation and advancement. I feel fortunate to be a part of this incredible industry and look forward to the many advancements to come.

Follow us @IoTCtrl | Join our Community

Read more…

Guest blog post by Ajit Jaokar

Introduction

This blog is a review of two books, both written by Ted Dunning and Ellen Friedman (published by O’Reilly) and available for free from the MapR site: Time Series Databases: New Ways to Store and Access Data and A New Look at Anomaly Detection.

The MapR platform is a key part of the Data Science for the Internet of Things (IoT) course at the University of Oxford, and I shall be covering these issues in the course.

In this post, I discuss the significance of time series databases from an IoT perspective, based on my review of these books. Specifically, we discuss classification and anomaly detection, which often go together in typical IoT applications. The books are easy to read, with analogies like HAL (from 2001: A Space Odyssey), and I recommend them.

 

Time Series data

The idea of time series data is not new. Historically, time series data could be stored even in simple structures like flat files. The difference now is the huge volume of data and the future applications made possible by collecting it – especially for IoT. These large-scale time series databases and applications are the focus of the book. Large-scale time series applications typically need a NoSQL database like Apache Cassandra, Apache HBase or MapR-DB. The book’s focus is Apache HBase and MapR-DB for the collection, storage and access of large-scale time series data.

Essentially, time series data involves measurements or observations of events as a function of the time at which they occurred. The airline ‘black box’ is a good example of time series data: it records data many times per second for dozens of parameters throughout the flight, including altitude, flight path, engine temperature and power, indicated air speed, fuel consumption, and control settings, and each measurement includes the time it was made. The analogy applies to sensor data. Increasingly, with the proliferation of IoT, time series data is becoming more common and universal. Data acquired through sensors is typically stored in time series databases, and the TSDB (time series database) is optimized for queries based on a range of time.

 

Time series data applications

Time series databases apply to many IoT use cases for example:

  • Trucking, to reduce taxes according to how much trucks drive on public roads (which sometimes incur a tax). It’s not just a matter of how many miles a truck drives but rather which miles.
  • A smart pallet can be a source of time series data that might record events of interest such as when the pallet was filled with goods, when it was loaded or unloaded from a truck, when it was transferred into storage in a warehouse, or even the environmental parameters involved, such as temperature.
  • Similarly, commercial waste containers, called dumpsters in the US, could be equipped with sensors to report on how full they are at different points in time.
  • Cell tower traffic can also be modelled as a time series, and anomalies such as flash-crowd events can be used to provide early warnings.
  • Data center monitoring can be modelled as a time series to predict outages and plan upgrades.
  • Similarly, satellites, robots and many other devices can be modelled as sources of time series data.

From these readings captured in a time series database, we can derive analytics such as:

  • Prognosis: What are the short- and long-term trends for some measurement or ensemble of measurements?

  • Introspection: How do several measurements correlate over a period of time?

  • Prediction: How do I build a machine-learning model based on the temporal behaviour of many measurements correlated to externally known facts?

  • Introspection: Have similar patterns of measurements preceded similar events?

  • Diagnosis: What measurements might indicate the cause of some event, such as a failure?

 

Classification and Anomaly detection for IoT

The books give examples of the use of anomaly detection and classification for IoT data.

For time-series-based IoT readings, anomaly detection and classification go together. Anomaly detection determines what normal looks like and how to detect deviations from normal.

When searching for anomalies, we don’t know in advance what their characteristics will be. Once we know the characteristics, we can use a different form of machine learning: classification.

Anomaly in this context just means different from what is expected – it does not refer to desirable or undesirable. Anomaly detection is a discovery process to help you figure out what is going on and what you need to look for; the anomaly-detection program must discover interesting patterns or connections in the data itself.

Anomaly detection and classification go together when it comes to solving real-world problems. Anomaly detection is used first, in the discovery phase, to help you figure out what is going on and what you need to look for. You could use the anomaly-detection model to spot outliers, then set up an efficient classification model to assign new examples to the categories you have already identified. You then update the anomaly detector to consider these new examples as normal, and repeat the process.
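A minimal sketch of this two-phase loop in Python, using scikit-learn stand-ins (IsolationForest as the anomaly detector, a random forest as the classifier) rather than the methods in the books; the data is synthetic:

```python
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(500, 3))   # ordinary sensor readings
spikes = rng.normal(6.0, 1.0, size=(20, 3))    # an as-yet-unknown failure mode

# Phase 1 -- discovery: flag readings that deviate from "normal".
detector = IsolationForest(random_state=0).fit(normal)
outliers = spikes[detector.predict(spikes) == -1]  # -1 marks anomalies

# Phase 2 -- once the outliers are understood and labelled, train a
# classifier to assign future readings to the known categories.
X = np.vstack([normal, outliers])
y = np.array([0] * len(normal) + [1] * len(outliers))
clf = RandomForestClassifier(random_state=0).fit(X, y)

# Finally, refit the detector with the new examples treated as normal,
# and repeat the discovery process on fresh data.
detector = IsolationForest(random_state=0).fit(X)
```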

The book goes on to give examples of the use of these techniques on EKG (electrocardiogram) data.

For example, faced with the challenge of finding an approachable, practical way to model ‘normal’ for a very complicated curve such as the EKG, we could use a type of machine learning known as deep learning.

Deep learning involves letting a system learn in several layers, in order to deal with large and complicated problems in approachable steps. Curves such as the EKG have repeated components that are separated in time rather than superposed. We can take advantage of the repetitive and separated nature of an EKG curve to accurately model its complicated shape and detect normal patterns using deep learning.

The book also describes a data structure called t-digest for the accurate calculation of extreme quantiles. t-digest was developed by one of the authors, Ted Dunning, as a way to accurately estimate extreme quantiles for very large data sets with limited memory use. This capability makes t-digest particularly useful for selecting a good threshold for anomaly detection. The t-digest algorithm is available in Apache Mahout as part of the Mahout math library. It is also available as open source at https://github.com/tdunning/t-digest.
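As an illustration, here is a sketch of threshold selection with a t-digest, assuming the open-source Python port of the algorithm (the tdigest package; the reference implementation linked above is Java). The score stream and the 99.9th-percentile choice are made up:

```python
import random
from tdigest import TDigest  # assumed: pip install tdigest

digest = TDigest()
for _ in range(100_000):                   # stream of anomaly scores; the
    digest.update(random.gauss(0.0, 1.0))  # digest keeps only bounded state

# Alert on scores beyond an extreme quantile of everything seen so far.
threshold = digest.percentile(99.9)
print(f"alert when score > {threshold:.2f}")
```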

 

Anomaly detection is a complex field and needs a lot of data. For example: what happens if you save only a month of sensor data at a time, but the critical events leading up to a catastrophic part failure happened six weeks or more before the event?

IoT from a large-scale data standpoint

To conclude, much of the complexity of IoT analytics comes from the management of large-scale data.

Collectively, interconnected objects and the data they share make up the Internet of Things (IoT).

Relationships between objects and people, between objects and other objects, conditions in the present, and histories of their condition over time can be monitored and stored for future analysis, but doing so is quite a challenge.

However, the rewards are also potentially enormous. That’s where machine learning and anomaly detection can provide a huge benefit.

For time series, the book covers themes such as:

  • Storing and Processing Time Series Data
  • The Direct Blob Insertion Design
  • Why Relational Databases Aren’t Quite Right
  • Architecture of OpenTSDB
  • Value Added: Direct Blob Loading for High Performance
  • Using SQL-on-Hadoop Tools
  • Using Apache Spark SQL
  • Advanced Topics for Time Series Databases (Stationary Data, Wandering Sources, Space-Filling Curves)

For anomaly detection:

  • Windows and Clusters
  • Anomalies in Sporadic Events
  • Website Traffic Prediction
  • Extreme Seasonality Effects

 

Links again: Time Series Databases: New Ways to Store and Access Data and A New Look at Anomaly Detection, by Ted Dunning and Ellen Friedman (published by O’Reilly).

Also, here is the link for the Data Science for the Internet of Things (IoT) course at the University of Oxford, where I hope to cover these issues in more detail in the context of MapR.

Follow us @IoTCtrl | Join our Community

Read more…

Guest blog post by Bill Vorhies

Summary:  NIST weighs in on the Internet of Things to create a common vocabulary and development roadmap.

In July we wrote about the seven-volume Big Data Technology Roadmap being developed by the National Institute of Standards and Technology (NIST), part of the Department of Commerce. You had an opportunity to review and comment on that final draft before publication; see that article here.

Well, they’re back, and this time with a final draft of their comprehensive roadmap for the Internet of Things (IoT). Despite the fact that IoT is the widely accepted name for the field, NIST elected to call its study “Framework for Cyber-Physical Systems (CPS)”. It doesn’t quite roll off the tongue like IoT, but don’t let that deter you from taking a look.

Like its Big Data predecessor, this is the result of collaboration among business, academia and government experts organized into the Cyber-Physical Systems Public Working Group (CPS PWG). At 227 pages, it’s a comprehensive reference on all things IoT from a wide range of contributors.

When the CPS PWG decided on its somewhat unusual name, it appears it was trying to draw a definition around a number of phrases, some of which have fallen out of use. In addition to IoT, these include the domains of M2M (machine-to-machine), the industrial internet and smart cities, among others. Those of us on the Big Data and predictive analytics side tend to view all of this as IoT.

Despite the odd naming, they created an interesting taxonomy of what makes CPS (IoT) different and distinguishable from other things:

The combination of the cyber and the physical, and their connectedness, is essential to CPS

  • CPS devices may be repurposed beyond applications that were their basis of design – e.g., a cell phone in a car may be used as a mobile traffic sensor; energy usage information may be used to diagnose equipment faults.
  • CPS networks may have “brokers” and other infrastructure-based devices and aggregators that are owned and managed by third parties, resulting in potential trust issues – e.g., publish and subscribe messaging, certificate authorities, type and object registries.
  • CPS are noted for enabling cross-domain applications – e.g., the intersection of manufacturing and energy distribution systems, smart cities, and consumer-based sensing.
  • Because CPS are designed to interact directly with the physical world, there is a more urgent need for emphasis on security, privacy, safety, reliability, and resilience, and corresponding assurance for pervasive interconnected devices and infrastructures.
  • CPS should be composable and may be service based. Components are available that may be combined into a system dynamically and the system architecture may be modified during runtime to address changing concerns. There are challenges, however. For example, timing composability may be particularly difficult. Also, it may not always be necessary or desired to purchase assets to build a system; instead, services can be purchased on a per-use basis, only paying for using the resources needed for a specific application and at the specific time of usage.

The document, which you can download here, covers nine broad areas:

  1. Functional
  2. Business
  3. Human
  4. Trustworthiness
  5. Timing
  6. Data
  7. Boundaries
  8. Composability
  9. Lifecycle

There are also excellent listings of references for those wishing a deeper dive, a good appendix of definitions and acronyms, and a number of well-detailed use cases to spur the imagination of IoT entrepreneurs.

The good news about these NIST studies – both this IoT study and its Big Data sibling – is that they are quite comprehensive and represent the thinking of a very wide range of public and private experts. They are also completely in the public domain. The bad news is that they take a long time to complete, between committee development and the public review process. This one started in 2014, and I don’t find much here about real-time or streaming analytics, or about combined analytic and transactional databases like SAP HANA or VoltDB, which are at today’s forefront of IoT enablement.

If you want to be part of the process, the Working Group is taking public comment via email until November 2, 2015.

The template for submitting comments is available here.  Please submit comments using the spreadsheet template by November 2, 2015 via email to [email protected].

The Draft CPS Framework is freely available for download here. An additional Technical Annex, Timing Framework for Cyber-Physical Systems, is also freely available for download here. Their homepage is at http://www.nist.gov/cps/cps-pwg-workshop.cfm.

 

September 28, 2015

Bill Vorhies, President & Chief Data Scientist – Data-Magnum - © 2015, all rights reserved.

 

About the author:  Bill Vorhies is President & Chief Data Scientist at Data-Magnum and has practiced as a data scientist and commercial predictive modeler since 2001.  Bill is also Editorial Director for Data Science Central.  He can be reached at:

[email protected] or [email protected]

Follow us @IoTCtrl | Join our Community

Read more…

This week at Gartner Symposium/ITxpo 2015, upwards of 10,000 CIOs and business technology professionals from around the world are gathering to talk all things IT. Gartner regularly polls its clients, and today it released its Top 10 Strategic Technology Trends for 2016.

Gartner defines a strategic technology trend as one with the potential for significant impact on the organization. Factors that denote significant impact include a high potential for disruption to the business, end users or IT, the need for a major investment, or the risk of being late to adopt.

As this community would expect, IoT dominates the majority of the list.

In this latest report everything is a device, and the general idea is that the digital mesh is a dynamic network linking various endpoints.


IoT related trends include:

The Device Mesh

The device mesh refers to an expanding set of endpoints people use to access applications and information or to interact with people, social communities, governments and businesses. It includes mobile devices, wearables, consumer and home electronic devices, automotive devices and environmental devices – such as sensors in the Internet of Things (IoT).

Ambient User Experience

While this trend focuses on augmented and virtual reality, IoT sensors play a key role in how this is implemented.


Information of Everything

Everything in the digital mesh produces, uses and transmits information. Advances in semantic tools such as graph databases as well as other emerging data classification and information analysis techniques will bring meaning to the often chaotic deluge of information.


Advanced Machine Learning

Gartner explores deep neural nets (DNNs) – an advanced form of machine learning particularly applicable to large, complex datasets – and claims this is what makes smart machines appear "intelligent." DNNs enable hardware- or software-based machines to learn for themselves all the features in their environment, from the finest details to broad, sweeping, abstract classes of content.

 

Autonomous Agents and Things

Gartner Research Fellow David Cearley says, "Over the next five years we will evolve to a postapp world with intelligent agents delivering dynamic and contextual actions and interfaces. IT leaders should explore how they can use autonomous things and agents to augment human activity and free people for work that only people can do. However, they must recognize that smart agents and things are a long-term phenomenon that will continually evolve and expand their uses for the next 20 years."

 

Adaptive Security Architecture

Security and IoT should go hand-in-hand. Gartner says that relying on perimeter defense and rule-based security is inadequate, especially as organizations exploit more cloud-based services and open APIs for customers and partners to integrate with their systems.

 

Advanced System Architecture, Mesh App and Service Architecture

These are three of the ten trends that I’m summarizing into one. All of them require more computing power and new ways of deploying software. Say goodbye to the monolithic approach and welcome agility. Application teams must create new, modern architectures to deliver agile, flexible and dynamic cloud-based applications with equally agile, flexible and dynamic user experiences that span the digital mesh.

 

Internet of Things Platforms

IoT platforms complement the mesh app and service architecture, and Mr. Cearley rounds out the trends by stating, "Any enterprise embracing the IoT will need to develop an IoT platform strategy, but incomplete, competing vendor approaches will make standardization difficult through 2018.”


There is still lots of work to do. Further reading here.

Read more…

5 Really Cool Internet of Things Sports Gadgets

Guest blog post by Bernard Marr

Elite level athletes have long had the ability to integrate data analysis principles into their training – monitoring and crunching data on their performance to help them break personal bests and world records.

Thanks to the explosion of the Internet of Things – the idea that just about any everyday object can be made “smart”, able to collect data and communicate wirelessly – these sorts of insights are now available to athletes and players at any level.

Here’s a rundown of what I think are five of the best Internet of Things enabled sports and training gadgets and apps which can help you to take your game to the next level:

Babolat Play Pure Drive racquet

Babolat has been producing tennis racquets for almost 150 years and has always moved with the times, its products evolving from wooden frames to metal and then carbon fibres. The influx of smart tech and data analysis in sports is the latest game changer, and Babolat has stayed on the ball here too, with the introduction of the Play Pure Drive racquet.

Sensors in the handle record every shot that is made, registering the direction of travel and the point of contact, as well as the force of the ball on the racquet.

Keeping all of the sensors in the handle means that the impact of their weight or positioning on the racquet’s handling is minimized. A smartphone app acts as a personal coach, analyzing the data collected by the racquet and comparing it with data from other players stored in its database, in order to suggest improvements to your game.

Sony Smart Tennis Sensor

You don’t have to buy a whole new racquet to benefit from smart tennis technology. Providing you have a compatible racquet, Sony’s Smart Tennis Sensor will simply clip on, allowing you to collect data on every shot. Like the Babolat it comes with its own app which is also a portal to the data collected and collated by other users of the service. Unlike the Babolat, the device can also record video, allowing you to review every shot after the game. This video can be overlaid with graphical visualizations created from the data captured by the racquet, allowing for even deeper insights into a player’s performance.

Adidas MiCoach Smart Ball

This smart football (or soccer ball, to Americans) aims to help you improve your play by providing instant feedback on the power and trajectory of your kicks. Like Babolat, Adidas is an old-school sports equipment manufacturer that has consistently moved with the times and clearly sees Big Data and analytics as the current driving force in sports tech development. The device hides all of its sensors right in the middle of the ball, where they won’t affect its dynamics, and transmits readings over Bluetooth to its partner smartphone app. It allows free kicks and penalties to be practiced even in a confined area – kick the ball against a wall and the visualizations will show how it would have travelled if you were in the middle of an open pitch.

Zepp Golf

Zepp Labs is a company established with the aim of bringing data analysis into consumer-level sports tech. Their Zepp Golf solution consists of a sensor-enabled glove which is worn during play, and which transmits data on the player’s swing to an analytical smart phone app. One insight which came up early in testing of the product was that older golfers tend to pull back less distance before a swing, resulting in less shot power. The personal coach element of the app monitors a player’s performance for these flaws and suggests remedial action in real-time. The company also produces smart products for baseball and tennis players, and has most recently moved into softball.

Sensoria Smart Sock

A sock might be one of the last gadgets you would expect to see “smartened up” for the Internet of Things age, but you would be wrong!

Sensoria has produced this sensor-stuffed smart sock for runners, which is able to measure not only how far and fast you travel but also the way your foot impacts the ground – helping you minimize the risk of stress or injury by maintaining good form throughout your run. Among its innovations is the sensor technology itself: textile-based sensors have been developed that can sit between the foot and the running shoe without causing discomfort, and can even be thrown in the washing machine with your regular laundry. Sensoria is another company founded specifically to produce sports equipment with analytic functionality; after raising money for its first product, the sock, via crowdfunding, it has gone on to develop a t-shirt and a sports bra incorporating the same textile sensor technology.

Follow us @IoTCtrl | Join our Community

Read more…

IoT practitioners are at the forefront of their companies’ digital initiatives. But is the rest of your company ready for its digital moment? Expectations in the C-Suite for digital transformation are high, but for many companies there is still more talk than action.

New research by McKinsey suggests only 17% of corporate boards are participating in strategy for big data or digital initiatives. The good news is that almost half of big companies have managed to get their CEOs personally involved, up from 23 percent in 2012.

Other findings from the survey include:

  • The most common hurdle to meeting digital priorities, executives say, is insufficient talent or leadership.

  • Across the C-Suite, 71% expect that over the next three years, digital trends and initiatives will result in greater top-line revenues for their business, and large shares expect their profitability will grow.

  • More than half of executives say that, in response to digital, their companies have adapted products, services, and touchpoints to better address customer needs.

  • Executives most often cite analytics and data science as the area where their organizations have the most pressing needs for digital talent, followed by mobile development and user experience.

  • Executives who report ample organizational support for adopting risky digital initiatives are twice as likely to work for a high-performing company as executives reporting resistance to risky initiatives due to fear of failure.

  • Forty-seven percent say cutting-edge digital work helps them attract and retain digital talent.

  • Companies’ priorities vary across industries, reflecting key sources of value in each sector: big data is a top priority in healthcare, for example, while automation is a greater focus in manufacturing.


The digital interconnection of billions of devices is today’s most dynamic business opportunity, and at present the Internet of Things remains a wide-open playing field for enterprises and digital strategy. According to the study, buy-in from the C-Suite and alignment with corporate culture and objectives are key to digital success.

You can read the complete survey here.

Read more…

Guest blog post by Ajit Jaokar

Introduction

 

In this series of exploratory blog posts, we explore the relationship between recurrent neural networks (RNNs) and IoT data. The article is written by Ajit Jaokar, Dr Paul Katsande and Dr Vinay Mehendiratta as part of the Data Science for Internet of Things practitioners course. Please contact [email protected] for more details.

 

RNNs are already used for time series analysis. Because IoT problems can often be modelled as time series, RNNs could apply to IoT data. In this multi-part blog, we first discuss time series applications, then discuss how RNNs apply to them; finally, we discuss their applicability to IoT.

 

In this article (Part One), we present the overall thought process behind the use of recurrent neural networks for time series applications – especially a type of RNN called the Long Short Term Memory (LSTM) network.

Time series applications

The process of prediction involves making claims about the future state of something based on its past values and current state. Many IoT applications (temperature readings, smart meter readings, etc.) have a time dimension. Classical pattern recognition problems are concerned with knowing dependencies between variables (regression) or with classifying input vectors into categories (classification). By including changes over time, we add a temporal dimension, giving us spatio-temporal data. Even when a phenomenon is continuous, it can be converted to a time series by sampling: phenomena like speech, ECGs, etc. can thus be modelled as time series when sampled.

 

Hence, we have a variable x changing in time, x_t (t = 1, 2, ...), and we would like to predict the value of x at time t+h. Time series forecasting is a problem of function approximation, and the forecast is made by computing an error measure over the time series. Also, given a time series model, we can solve many related problems: forecast the value of the variable at time t, x(t); classify the time series at a time in the future (will prices go up or down?); model one time series in terms of another (for example, oil prices in terms of interest rates); and so on.

Neural networks for Time series Prediction

However, while many scenarios (such as weather prediction, foreign exchange fluctuations, energy consumption, etc.) can be expressed as time series, formulating and solving the equations is hard in these cases, for reasons such as:

a) there are too many factors influencing the outcome;

b) there are hidden/unknown factors influencing the outcome.

In many such scenarios, the focus is not on finding the precise solution to the problem but rather on finding a possible steady state to which the system will converge. Neural networks often apply in such scenarios because they are able to learn from examples alone and can capture hidden and strongly non-linear dependencies.

Neural networks are trained on historical data, with the objective that the network will discover the hidden dependencies and be able to use them to predict the future. Thus, neural networks are not represented by an explicitly given model, and they can spot nonlinear dependencies in spatio-temporal patterns. They also ease the problem of feature engineering, i.e. finding the best representation of the sample data from which to learn a solution to your problem.

Neural networks for time series processing – incorporating the Time domain

When used for time series forecasting, the obvious first problem is: how do we model time in a neural network? For traditional applications of neural networks (such as pattern recognition or classification), we do not need to model the time dimension. Time is difficult to model in a neural network because it is constantly moving forward. However, by including a set of delays, we can retain successive values of the time series; each past value is then treated as an additional spatial dimension. This process of converting the (potentially infinite) time dimension into a spatial vector is called embedding. Because, for practical purposes, we need a limited set of values, we consider a history of the previous d samples (the embedding dimension), as shown in the figure below.

 

Source: http://www.cs.cmu.edu/afs/cs/academic/class/15782-f06/slides/timeseries.pdf.
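A minimal sketch of the embedding step in Python/NumPy: a scalar series becomes (window, target) pairs, with each of the previous d samples treated as one input dimension. The toy series and d = 3 are ours:

```python
import numpy as np

def embed(series, d, horizon=1):
    """Turn a 1-D series into supervised pairs: the previous d samples
    (the embedding dimension) predict the value `horizon` steps ahead."""
    series = np.asarray(series, dtype=float)
    X, y = [], []
    for t in range(d, len(series) - horizon + 1):
        X.append(series[t - d:t])
        y.append(series[t + horizon - 1])
    return np.array(X), np.array(y)

X, y = embed([1, 2, 3, 4, 5, 6], d=3)
print(X)  # [[1. 2. 3.], [2. 3. 4.], [3. 4. 5.]]
print(y)  # [4. 5. 6.]
```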

For the evolution of neural networks, see the previous post, Evolution of Deep Learning Models. Recurrent neural networks are often used for modelling time series. An example is Using Recurrent Neural Networks to Forecast Forex (PDF).

 

A recurrent neural network (RNN) is a class of artificial neural network in which connections between units form a directed cycle. This creates an internal state that allows the network to exhibit dynamic temporal behaviour. Unlike feedforward neural networks, RNNs can use their internal memory to process arbitrary sequences of inputs, which makes them applicable to tasks such as unsegmented connected handwriting recognition, where they have achieved the best known results (Wikipedia). The fundamental feature of an RNN is that the network contains at least one feedback connection, so activations can flow around in a loop. That enables the network to do temporal processing and learn sequences, e.g. perform sequence recognition/reproduction or temporal association/prediction. Using the same time-delay idea as above, a recurrent neural network can be converted into an equivalent traditional feedforward network by unfolding it over time, as shown below.

 

 

Source:  http://www.cs.bham.ac.uk/~jxb/INC/l12.pdf

 

RNNs are used to model time series because the feedback mechanism creates a ‘memory’, i.e. an ability to process the time dimension. Memory is important because many time series problems (such as traffic modelling) need long-term, historical modelling of time values. Long Short Term Memory networks (LSTMs) are a special kind of RNN capable of learning long-term dependencies, because they can retain information over long time frames. The figure below summarises feedforward and recurrent neural networks.

Source: http://deeplearning.cs.cmu.edu/notes/shaoweiwang.pdf
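To ground this, here is a minimal LSTM forecasting sketch using the Keras API (our choice of library; the article itself does not prescribe one). It trains a one-step-ahead predictor on a synthetic sine wave, with input windows built exactly as in the embedding sketch above; the layer size and epoch count are arbitrary:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Synthetic series and sliding windows of the previous d samples.
series = np.sin(np.linspace(0, 20 * np.pi, 2000))
d = 16
X = np.array([series[t - d:t] for t in range(d, len(series))])
y = series[d:]
X = X[..., np.newaxis]  # LSTMs expect (samples, timesteps, features)

model = Sequential([
    LSTM(32, input_shape=(d, 1)),  # the recurrent "memory" over d steps
    Dense(1),                      # one-step-ahead forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

print(model.predict(X[-1:], verbose=0))  # forecast of the next value
```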

Implications for IoT datasets

In subsequent articles, we will explore LSTMs in greater detail and implications for IoT data.

This article covers topics we teach in the Data Science for Internet of Things practitioners course. Please contact [email protected] for more details

 Follow us @IoTCtrl | Join our Community

Read more…

The smartphone on your belt is dramatically different from the flip phones of a decade ago. Technology continues to move at incredible speed, and we are truly living in a golden age. But where we are headed is unlike anywhere we have been.

In the future, the Internet of Things will be a reality in every sector. Smart systems will ship with sensors and robotics that simplify and automate manufacturing. These systems will operate over wired and wireless networks, and the resulting infrastructure will help us accomplish more during the course of a day.

This begins with physical objects built with sensors and actuators placed inside them. These individual parts will send and receive information in order to complete specific tasks. They will depend on real-time data, and this information will affect the big picture. In fact, each device on the assembly line will connect to a central system that orchestrates and synchronizes the entire operation, ensuring things run smoothly and as effectively as possible.


In order for smart manufacturing to work, there need to be systems in place that support the smart manufacturing vision: sensors must be placed in the equipment and a host system installed. This will help with logistics, order placement, procurement and other essential functions that affect the overall system.

So who does this? While your IT department could technically handle the task, it would be time-consuming and cost hundreds of man-hours to develop. A better choice is to engage a vendor who can help with the effort and create a functional, tightly integrated system that allows you to manage your manufacturing operations effectively. With new industry standards for manufacturing being released all the time, it is certain that the Internet of Things will play a pivotal role in the future of manufacturing automation.

An example is already visible in the food and beverage industry. Machines currently communicate sensitive information like temperature, humidity and the condition of their containers. Companies can also track shipments with identifying codes and determine where in the company items originated and where in the world they were shipped. In a case of contamination, they can quickly contact the locations that received potentially tainted items (a small sketch of such telemetry follows).
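As a concrete illustration of the paragraph above, here is a hedged sketch of a machine publishing its telemetry with the paho-mqtt Python client (1.x-style constructor); the broker address, topic layout and payload fields are all hypothetical:

```python
import json
import time
import paho.mqtt.client as mqtt  # assumed: pip install "paho-mqtt<2"

client = mqtt.Client()
client.connect("broker.example.com", 1883)  # hypothetical plant broker

reading = {
    "machine_id": "filler-03",
    "temp_c": 3.8,           # cold-chain temperature
    "humidity_pct": 41.0,
    "container_ok": True,
    "ts": time.time(),
}
client.publish("plant/line1/filler-03/telemetry", json.dumps(reading))
client.disconnect()
```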

When the Internet of Things becomes dominant on these manufacturing lines, there will be more power still. A central master computer will run the entire operation, with an intelligent way to analyze data, address concerns and remain independent at all times, all while continuing to meet the demands of production.

There is no denying that the Internet of Things will play an important role in the future of production. Goods will be released faster, and profits will rise. That makes it important to embrace it today and incorporate it into the current structure of your business. Doing so will help you be part of the future and remain a visionary in the industry.

 

Are you hiring ahead of the coming shift in how workers work?

Read more…

A Quick History of the Internet of Things

How Did We Create Such a Rich Market?

Want to know how the "Internet of Things" became a thing at all? To do so, you must look back to the start: the birth of networking and the explosion of consumer technology.

The internet isn’t that old, at least as far as the world wide web goes. In 1974, the structure we know and love today was born. Just ten years later, the first domain name system was introduced, allowing for easier networking. The first website came online in 1991, and the “internet,” as a network of connected devices in consumer homes, had been proposed only a scant two years before that; yet it came crashing into our mainstream world.

In no time the internet took over. By 1995, multiple websites and systems had come online. I remember watching crude bulletin board systems arise, then quickly be replaced by GeoCities pages and early websites. The first business webpages actually came in the form of reproduced fliers, essentially scanned and put online to promote companies. All of these new ideas came from imaginings that had taken place decades earlier.

The term “internet of things,” or “IoT,” is also not a new one. You can find references to it as far back as the idea of the internet itself, but if you survey an IoT team, it is more than likely that few know this. The history, or at least the ideology, goes back a great deal further than most people realize. This, of course, has ramifications for the marketplace, both in how older technology companies approach the space and in how traditional product introduction processes operate.

Thinkers across history could be responsible for coining the term, depending on the story you read. Some point to Tesla and Edison as the first to envision connected objects. Others look at the literal applications by Tim Berners Lee and Mark Weiser, the latter of whom famously created a water fountain synced to the activities of the NYSE. The founders of Nest could also make the list, having built one of the first truly non-computer connected objects.

Even the idealism and futurism of the 1950s and 1960s fed into Internet of Things thinking. Imagine a classic 60s technology ad displaying the "home of the future." Everything is connected and communicating, and people are never out of reach of their day-to-day technology.

Then, of course, there is Kevin Ashton, the man who comes up when you Google "who came up with the Internet of Things." Kevin is a frequent thinker in the space who is correctly credited with a verifiable coining of the term "Internet of Things." Like most corporate lingo, the origin is likely impossible to pin down completely, but the idea that the term was born in a boardroom is not surprising. The leaders who would go on to actually take these objects to market in the 90s included "traditional" players like IBM and Sony.

The point is that, no matter what route you pick to decipher the past, the rise of Internet of Things thinking is ubiquitous. From the moment "networking" arrived in everyday life, people were thinking about how it would impact our world.

1998 was a turning point in many ways. Apple returned to the market with the iMac, and the team that designed this platform would go on to design the iPhone and, most critical to IoT research, the iPod. Big-name manufacturers that had for most of their development focused on the PC were now investing in everyday objects with connectivity and technological features. The seeds of the smartphone era were planted, and with it would come the first real consumer-level IoT object based on existing computers.

The history of IoT is extraordinarily dense, and the reading of the history depends on who you ask. If you were to question a designer at IBM in the late 1980s, you would find ideas similar to what we now call IoT in constant use. However, if you ask an emerging startup from the early 2000s, you would find a wave of thinkers taking credit for the idea. The reality is somewhere in between: those who thought ahead about computers expected what we have today, billions of devices.

IoT has continued to grow and evolve, and projections are bright for this new way of using the internet. The future of IoT is now, with devices coming online every day. The world is reliant upon connected cars, connected medical devices and even connected homes.

Companies today are scrambling to get their own IoT systems online and moving, and new recruits are being brought in every day to head up IoT systems in companies from small to large. How well do they know the history of the space and exactly how broad it can be?

We want your input - please share your thoughts below!

Read more…

Charting the IoT Opportunity

By Venkat Viswanathan and Ravi Ravishankar

 

As the Internet of Things (IoT) gains momentum, it’s apparent that it will force change in nearly every industry, much like the Internet did. The trend will also cause a fundamental shift in consumer behavior and expectations, as did the Internet. And just like the Internet, the IoT is going to put a lot of companies out of business.

 

Despite these similarities, however, the IoT is really nothing like the Internet. It’s far more complex and challenging.

 

Lack of Standardization

Unlike the Internet, where the increased need for speed and memory was addressed as a by-product of the devices themselves, the sensors and devices connecting to the IoT network have, for the most part, inadequate processing or memory. Furthermore, no standard exists for communication and interoperability between these millions of devices. Samsung, Intel, Dell and other hardware manufacturers have set up a consortium to address this issue. Another equally powerful consortium formed by Haier, Panasonic, Qualcomm and others aims to do the exact same thing. This has raised concerns that each of these groups will engage in a battle to push their standard, resulting in no single solution.

 

New Communication Frontier

The Internet was designed for machine-to-human interactions. The IoT, on the other hand, is intended for machine-to-machine communications, which is very different in nature. The network must be able to support diverse equipment and sensors that are trying to connect simultaneously, and also manage the flow of large quantities of incredibly diverse data...all at very low costs. To meet these requirements, a completely new ecosystem—independent of the Internet—must evolve.

 

Data Privacy

The IoT also raises serious challenges for data security and privacy. Justifiably concerned consumers will call for stricter privacy standards and demand a greater role in determining what data they share. These aren’t the only security issues likely to arise. In order for a complete IoT ecosystem to emerge, multiple players must use data from connected devices—but who owns the data? Is it the device that emits it, the service provider that transports that information, or the company that uses it to provide the consumer better service offerings?

 

Geographic Challenges

For multinational organizations with data coming from various regions around the globe, things get even more complicated. Different countries have different data privacy laws. China and many parts of the EU, for example, will not let companies take data about their citizens out of their borders. This will result in the emergence of data lakes. To enable business decisions, companies must be able to access data within various geographies, run their analysis locally and disseminate the insights back to their headquarters…all in real-time and at low costs.   

 

In spite of all these challenges, the IoT is not something companies can afford to keep at arm’s length. Like the Internet, it will empower consumers with more data and insights than ever before, and they in turn will force companies to change the way they do business. From an analytics perspective, it’s very exciting. Companies will now have access to quality data that, if they combine it with other sources of information, can provide them with immense opportunities to stay relevant.

 

As an example, let’s look at the medical equipment industry. Typically these companies determine what equipment to sell based on parameters like number of beds and whether the facility is in a developing or developed market. However, these and other metrics are a poor substitute for evaluating need based on actual use. A small hospital in a developing country, for example, will diagnose and treat a much wider range of diseases than a similar facility in a more developed region. By equipping the machines with sensors, these manufacturers can obtain a better understanding of what is occurring within each facility and optimize selling decisions more effectively as a result.

 

This is just one example to underscore the tremendous potential that the IoT holds for businesses. In order to truly realize these and other opportunities, companies must understand the challenges outlined above and have a framework in place to address them. In the early days of the Internet, few could have predicted its transformative impact on all facets of our lives—personal and professional. As the IoT heads into its next phase of maturity, we can expect to see a similar effect emerge.

 

Ravi Ravishankar is Global Head of Product Marketing and Management at Equinix's Products, Services and Solutions Group and Venkat Viswanathan is Chairman at LatentView Analytics.

 

Originally Posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…

The Internet of Things encompasses a wide range of connected services, technologies, and hardware devices. Yet, for consumers, it is the growing number of portable and wearable devices that will be their main interface with IoT tech. The wearable device market is rapidly evolving, especially when it comes to smartwatches and fitness monitoring devices.

As opportunities grow, the wearables dominating the market are also changing. What does this mean for those involved in the development, marketing, and sales of these IoT-connected devices?

 How Big is the Wearable Market in 2015?

International Data Corporation (IDC) has predicted that wearable device shipments in 2015 will grow roughly 173% over the previous year's total. This translates to over 72 million devices, including smartwatches and health trackers. This growth has been largely driven by high-profile releases such as the Apple Watch in April of 2015, and also by widely publicized financial opportunities, Fitbit’s recent IPO being a prime example.

With the potential to move over 72 million units across the market, it is no surprise that leading technology companies like LG, Samsung, Sony, Microsoft, Apple, and Motorola are starting to increase their focus on wearable technology.

When we look closer at the marketplace, we see a strong mix of upstart companies and traditional players, with Fitbit, Garmin, and Xiaomi all new entrants. This blend of "old" technology giants and very new companies is promising - the marketplace is growing rapidly, and opportunity actually exists.

Future growth will be an incentive for further investment. IDC figures suggest that by 2019, global sales of wearables could exceed 150 million units. The market is wide open, with any company able to take a device to market positioned for growth.

Do these figures mean success for all involved in the wearable market? Not entirely.

Challenges for Businesses to Adapt

Although the overall market has grown, recent trends show that wearable fitness devices are losing out to increased smartwatch sales. Gartner’s latest research suggests that the dip could largely be associated with the increasing crossover in functionality between fitness devices and the latest smartwatches. 50 percent of those seeking a fitness wearable will end up choosing a smartwatch instead, and brands do not necessarily know why this shift is happening.

I think that one feature overlap is contributing to this. Fitness devices chiefly collect information relating to distance covered, physical location, and health, including heart rate. Nearly every smartwatch on the market today can do all of this, and more. For a savvy consumer, combining a Samsung Galaxy Gear smartwatch with a high-end Galaxy Note 4 or Galaxy S6 would provide GPS tracking, information on calories burnt, heart rate monitoring, and even blood oxygen levels. The technology is advancing year on year, and it is clear that the innovation gap is already closing.

There are two consequences I see with this lack of clear differentiation. The first is that fitness-focused products need to innovate or die. With the market consolidating around multi-feature devices over purpose-built tools, the goal should be innovation that differentiates. Put simply, the fitness trackers of the world need to do something that smartwatches cannot.

The second consequence is that companies like Fitbit and Nike, which are focused on fitness tracking, will need to lower prices to compete with integrated smartwatches. When a consumer is faced with a $120 fitness tracker and a $200 smartwatch with phone connectivity, alerts, and apps, the choice becomes very one-sided. Yet, the bottom of the market, and the sector more likely to actually increase sales of purpose-built trackers, is relatively unsaturated. 

Fitbit, Jawbone, and Nike make up 97% of the wearable fitness device market. In smartwatch territory, it is Samsung and Apple that lead the market. Looking at one of the least expensive fitness trackers, Fitbit's Zip, we see a $60 base price point. Even at this level, the casual user has to pause and think - their phone already does much or all of what the Zip does, and a waterproof fitness case is cheaper. Fitbit, in this case, needs either to differentiate more fundamentally or drop its price point.

Where is the Money in Wearables?

Even with staggering sales numbers, wearables are not in themselves a key revenue stream. Instead, it is the associated value that provides the biggest benefit to manufacturers.

Smartwatches, in particular, are seen as accessories. They are paired to smartphones and in turn can help to drive sales. They are also showpiece items. Even if Samsung, Apple, Sony etc. only manage to sell wearable technology to 10% of their smartphone customers (a speculative number), they will generate brand marketability, and logically would experience knock-on sales.

When it comes to companies like Nike, Fitbit, and Jawbone, the profit can come from connected services. Examples include subscription based exercise plans, analytics software, and in the case of Nike, a wearable can lead to increased apparel sales.

Still, there is an incredible gap for new entrants to the market. Apple and Samsung can rely on a massive pool of existing customers, and directly integrate their offerings into that group. Fitbit cannot, with no "hub" devices on the market. Even subscription-based models cannot make up for the gap. This makes the marketplace incredibly hard to predict going forward - nothing prevents a company like Samsung from releasing another mid-range watch and completely dividing the market. 

As with all IoT technology, the wearable device is only one part of the experience, and therefore only one part of the business model. It is the way in which data is collected, analyzed, and presented that provides the true value of any smart device. Smartwatches already have an advantage because they are highly integrated into their respective smartphone operating systems. Wearable fitness device companies have the opportunity to provide fitness tracking as a service, and must find new ways to monetize the service to generate direct revenue on top of initial hardware sales.

What does the Future Hold For Wearable Technology?

Over a billion smartphones were sold around the world in 2014. Global wearable sales make up less than 10% of that number. The challenge for manufacturers is to develop wearables that integrate easily with daily life and that consumers want to use every day.

While wearables are high in consumer mindshare, they are relatively low in actual penetration. Smartwatches are now able to integrate a fitness device with a smart device in a way that is both compelling and practical, but is it enough? Those in the industry will need the best ideas, the best strategies, and the best talent to ensure that in-demand products are developed in line with business goals, and that they result in strong financial growth.

 

When considering how to hire leadership for the emerging Internet of Things market, keeping these considerations in mind is critical. I can help guide your choices, find the best candidates, and bring IoT experience to your company. Contact me today for a consultation.

Read more…

What is the Internet of Everything (IoE)?

Guest blog post by Peter Diamandis, chairman and CEO of the X PRIZE Foundation, best known for its $10 million Ansari X PRIZE for private spaceflight.  Today the X PRIZE leads the world in designing and operating large-scale global competitions to solve market failures.

Every month I hold a webinar for my Abundance 360 executive mastermind members that focuses on different exponential technologies impacting billion-person problems.

This week I interviewed Padma Warrior, CTO and Chief Strategist of Cisco, to discuss the Internet of Everything (IOE).

Padma is a brilliant and visionary person, one of the most important female leaders of this decade.

She first got my attention when she quoted a recent Cisco study placing the value of IoE as a $19 trillion opportunity.

This blog is about how you can tap into that $19 Trillion.

What is the Internet of Everything (IoE)?

The Internet of Everything describes the networked connections between devices, people, processes and data.

By 2020, the IoE has the potential to connect 50 billion people, devices and things.

In the next 10 years, Cisco is projecting IoE will generate $19 trillion of value – $14 trillion from the private sector, and $5 trillion from governments and public sectors (initiatives like smart cities and infrastructure).

Imagine a Connected World

Let me try to paint an IoE picture for you.

Imagine a world in which everything is connected and packed with sensors.

50+ billion connected devices, loaded with a dozen or more sensors, will create a trillion-sensor ecosystem.

These devices will create what I call a state of perfect knowledge, where we'll be able to know what we want, where we want, when we want.

Combined with the power of data mining and machine learning, the value that you can create and the capabilities you will have as an individual and as a business will be extraordinary.

Here are a few basic examples to get you thinking:

  • Retail: Beyond knowing what you purchased, stores will monitor your eye gaze, knowing what you glanced at… what you picked up and considered, and put back on the shelf. Dynamic pricing will entice you to pick it up again.
  • City Traffic: Cars looking for parking cause 40% of traffic in city centers. Parking sensors will tell your car where to find an open spot.
  • Lighting: Streetlights and house lights will only turn on when you're nearby.
  • Vineyards/Farming: Today IoE enables winemakers to monitor the exact condition (temperature, humidity, sun) of every vine and recommend optimal harvest times. IoE can follow details of fermentation and even assure perfect handling through distribution and sale to the consumer at the wine store.
  • Dynamic pricing: In the future, everything has dynamic pricing, where supply and demand drive the price. Uber already knows when demand is high, or when I'm stuck miles from my house, and can charge more as a result. (A toy sketch of this idea follows the list.)
  • Transportation: Self-driving cars and IoE will make ALL traffic a thing of the past.
  • Healthcare: You will be the CEO of your own health. Wearables will be tracking your vitals constantly, allowing you and others to make better health decisions.
  • Banking/Insurance: Research shows that if you exercise and eat healthy, you're more likely to repay your loan. Imagine a variable interest rate (or lower insurance rate) depending on exercise patterns and eating habits?
  • Forests: With connected sensors placed on trees, you can make urban forests healthier and better able to withstand -- and even take advantage of -- the effects of climate change.
  • Office Furniture: Software and sensors embedded in office furniture are being used to improve office productivity, ergonomics and employee health.
  • Invisibles: Forget wearables, the next big thing is sensor-based technology that you can't see, whether they are in jewelry, attached to the skin like a bandage, or perhaps even embedded under the skin or inside the body. By 2017, 30% of wearables will be "unobtrusive to the naked eye," according to market researcher Gartner.
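
Since dynamic pricing at its simplest is just a ratio of demand to supply, here is a toy Python sketch of how such a rule could look. The clamping and the 3x cap are assumptions for illustration, not how Uber or anyone else actually prices:

def surge_price(base_price, demand, supply, cap=3.0):
    # Scale base_price by the demand/supply ratio, never below 1x
    # and never above the cap, so prices cannot run away.
    if supply <= 0:
        return round(base_price * cap, 2)
    multiplier = min(max(demand / supply, 1.0), cap)
    return round(base_price * multiplier, 2)

print(surge_price(10.00, demand=120, supply=40))  # 30.0 (capped at 3x)
print(surge_price(10.00, demand=30, supply=60))   # 10.0 (never below base)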

The Biggest Business Opportunities Will Be in Making Systems More Efficient

The Internet of Everything will become the nervous system of the human economy.

Entrepreneurs who capitalize on this will drive massive value and enable better decisions and reduce inefficiencies.

If you are an entrepreneur or running a business, you need to do two key things:

1. Digitize: Determine which of your processes are not yet digitized and find a way to digitize them. Then, collect data and analyze that data. Go from your old-style manual process (or data collection system) to an autonomous digital version.

2. Skate to the Puck: Have a brainstorm with the smartest members of your team (or find some local Singularity University alumni to join you) and ask yourselves the following questions:

  • What kind of sensors will exist in 3 years' time, and what kind of data could we be collecting?
  • In three years, which of our "things" will be connected and joining the Internet of Everything?

With the answers to these two basic questions, come up with the business opportunities that will exist in three years… and begin developing the business models, developing the software and planning out your domination.

This is the sort of content and conversations we discuss at my 250-person executive mastermind group called Abundance 360. The program is ~88% filled. You can apply here.

Share this email with your friends, especially if they are interested in any of the areas outlined above.

We are living toward incredible times where the only constant is change, and the rate of change is increasing.

Best,
Peter

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…

IoT in Transportation and Logistics

It is said that there are 100,000 freighters on the seas and that 90% of everything you have has come via container ship. The first time I saw Hong Kong Harbour from my swank room at the JW Marriott what struck me most was the number of container ships. As I scanned the waters I counted several dozen of the floating giants and imagined everything onboard was coming out of China and going somewhere on the planet.


Photo Credit: Andrew Smith via Flickr

Once at its destination port, a gantry crane unloads the containers and places them either on a truck or a rail car. The goods are then sent off to their respective warehouses, and another delivery vehicle most likely takes them to another vendor or supplier, where they might eventually end up in your garage. Just thinking about this one aspect of transportation and logistics is mind-boggling. And it's perfect for IoT.

The fine folks at Deloitte University Press have written a wonderful overview of IoT considerations for the shipping and logistics industries. Entitled “Shipping smarter: IoT opportunities in transport and logistics,” the report highlights that while companies in transport and logistics (T&L) have always been data-driven - with applications like real-time tracking of shipments, warehouse-capacity optimization, predictive asset maintenance, route optimization, and improved last-mile delivery - they still have a huge opportunity ahead of them in IoT.

The increasing number of connected devices, embedded sensors, and analytics technologies will only increase the volume of data and accelerate its flow. This will lead to more efficient use of transport infrastructure, better engagement with customers, and more informed decision making. The report has four recommendations for T&L and IoT, but what I found most thought provoking was their framework that captures the series and sequence of activities by which organizations create value from information: the Information Value Loop (see below).
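
As one concrete illustration of the applications above, here is a minimal predictive-asset-maintenance sketch in Python: flag an asset when the rolling average of a sensor feed drifts past a threshold. The window size and threshold are assumptions of mine, not figures from the Deloitte report:

from collections import deque

def rolling_alerts(readings, window=5, threshold=0.8):
    # Alert when the rolling mean of a sensor feed (say, vibration)
    # drifts past the threshold, so maintenance precedes failure.
    buf = deque(maxlen=window)
    alerts = []
    for t, value in enumerate(readings):
        buf.append(value)
        if len(buf) == window and sum(buf) / window > threshold:
            alerts.append(t)
    return alerts

print(rolling_alerts([0.2, 0.3, 0.4, 0.5, 0.7, 0.9, 1.1, 1.2]))  # [7]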

You can find the full report here. Further reading on the subject is listed after the graphic.



[Figure: Deloitte's Information Value Loop]

Further Reading



Read more…

IoT Big Swings

Last week Tom Davenport, a Distinguished Professor at Babson College, wrote about “GE’s Digital Big Swing” in the Wall Street Journal. As he cites in his latest piece, there are many others taking big swings in digital and IoT overall. (BTW - If you’re not following Tom, you really should do so now. His thoughts are a perfect mix of research and practice covering big data, analytics and changes in the digital landscape.)

During my time at Pivotal, I witnessed the digital big swing that GE took and saw the energy, effort and resources they were committing to make sure that whatever they made that could be connected to the Internet - jet engines, power plants, surgical imaging machines - would capture all data to improve products and the customer experience. I don’t think GE watchers - investors, competitors, partners - fully understand yet the enormity of this bet.

They keep making moves. This week the company announced the creation of GE Digital, a transformative move that brings together all of the digital capabilities from across the company into one organization.

Jeffrey Immelt, Chairman and CEO of GE, said, “As GE transforms itself to become the world’s premier digital industrial company, this will provide GE’s customers with the best industrial solutions and the software needed to solve real world problems. It will make GE a digital show site and grow our software and analytics enterprise from $6B in 2015 to a top 10 software company by 2020.”

GE, the industrial giant, a Top 10 software company? That’s taking GE’s slogan “Imagination at Work” and making it real.

Much like the cloud trend before it, the IoT trend is something where all major vendors are investing.

Yesterday at Salesforce’s behemoth customer conference Dreamforce, the company announced the Salesforce Internet of Things Cloud. Based on a home-grown data processing technology called Thunder, Salesforce touts their IoT Cloud as empowering businesses to connect data from the Internet of Things, as well as any digital content, with customer information, giving context to data and making it actionable—all in real-time.

With perhaps a nod of guilt to marketing hype, other notable big swings include:

  • IBM - The company has created an Internet of Things business unit and plans to spend $3 billion to grow its analytics capabilities so that organizations can benefit from the intelligence that connected devices can provide. According to IBM, as much as 90 percent of data that is generated by connected devices is never acted on or analyzed.

  • Cisco - Its approach focuses on six pillars for an IoT system - network connectivity, fog computing, security, data analytics, management and automation, and an application enablement platform. You can buy all the pieces of the system from Cisco, of course.

  • Samsung - They are betting on openness and industry collaboration. By 2017, all Samsung televisions will be IoT devices, and in five years all Samsung hardware will be IoT-ready. They also recently open sourced IoT.js, a platform for IoT applications written in JavaScript, and JerryScript, a JavaScript engine for small, embedded devices.

  • Monsanto - Their near billion dollar purchase of The Climate Corporation is combining The Climate Corporation’s expertise in agriculture analytics and risk-management with Monsanto’s R&D capabilities, and will provide farmers access to more information about the many factors that affect the success of their crops.

In the wake of these big swings will come new and exciting startups - sensor companies, chip players, software, analytics and device makers. If you know of a compelling start-up in the industrial IoT space, drop me a line at [email protected]. We would love to hear from you.




Read more…

Brontobytes, Yottabytes, Geopbytes, and Beyond

Guest blog post by Bill Vorhies

Now that everyone is thinking about IoT and the phenomenal amount of data that will stream past us and presumably need to be stored, we need to break out a vocabulary well beyond our comfort zone of mere terabytes (about the size of a good hard drive on your desk).

In this article, Beyond Just “Big” Data, author Paul McFedries argues for nomenclature even beyond Geopbytes (and I'd never heard of that one).  There is a presumption, though, that all that IoT data actually needs to be stored, which is misleading.  We may want to store some big chunks of it, but increasingly our tools allow for 'in-stream analytics' and for filtering the stream to identify only the packets we're interested in.  I don't know that we'll ever need to store Geopbytes, but you'll enjoy his argument.  Use the link Beyond Just “Big” Data.
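
Here is a minimal Python sketch of that in-stream filtering idea, assuming packets arrive as simple dicts; the field names and thresholds are made up for illustration:

def interesting(stream, sensor_ids, low=0.0, high=100.0):
    # Lazily consume an unbounded feed, one packet at a time, and
    # yield only out-of-range readings from sensors we care about.
    for packet in stream:
        out_of_range = not (low <= packet["value"] <= high)
        if packet["sensor"] in sensor_ids and out_of_range:
            yield packet

feed = iter([
    {"sensor": "temp-01", "value": 22.5},   # normal, passes by unstored
    {"sensor": "temp-01", "value": 181.0},  # anomaly worth keeping
    {"sensor": "hum-07",  "value": 55.0},   # sensor we don't care about
])

for alert in interesting(feed, {"temp-01"}):
    print(alert)  # {'sensor': 'temp-01', 'value': 181.0}

The point of the generator is that nothing is stored: the stream flows past, and only the packets worth keeping ever touch disk.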

Here's the beginning of his thoughts:

Beyond Just “Big” Data

We need new words to describe the coming wave of machine-generated information

When Gartner released its annual Hype Cycle for Emerging Technologies for 2014, it was interesting to note that big data was now located on the downslope from the “Peak of Inflated Expectations,” while the Internet of Things (often shortened to IoT) was right at the peak, and data science was on the upslope. This felt intuitively right. First, although big data—those massive amounts of information that require special techniques to store, search, and analyze—remains a thriving and much-discussed area, it’s no longer the new kid on the data block. Second, everyone expects that the data sets generated by the Internet of Things will be even more impressive than today’s big-data collections. And third, collecting data is one significant challenge, but analyzing and extracting knowledge from it is quite another, and the purview of data science.

Follow us @IoTCtrl | Join our Community

Read more…

Internet of Things? Maybe. Maybe Not.

Everything is connected; through the cloud, all machine-generated data are collected and widely shared over the Internet. That’s how we imagine IoT – the Internet of Things.

 

Correction: That’s how THEY imagine IoT. What WE envision here is not just the Internet of Things but also the Intelligence of Things. The idea is: when a device is equipped with connectivity and sensors, why not take another bold step and make the device intelligent? With an agile and affordable computing unit, every device has the power to analyze collected data and take fact-backed actions, thus making intelligence “in-place” a part of the Internet of Things, anywhere and at any time. Intelligence, according to Jeff Hawkins*, is defined by predictions.

 

Computers, home appliances, vehicles – even apparel and kitchenware – can be turned into thinking units.  They can help you act or react to the environment or your neighbours based on your behavioral routines and preferences. Your running shoes could control the friction of their soles according to your weight, the weather, and the kind of trail you choose. Your home theater system could fine-tune sound effects according to the movie genre and what time of day you are watching. There are plenty of exciting applications that come with the advent of intelligent things.

 

The question is, how does it work?

 

Data collected from sensors is uploaded to the cloud and fed into (machine) learning systems, while streaming data input triggers an analytic engine to predict the best outcome and react accordingly. Big data accumulates the background knowledge, while small data evokes intelligence in-place.
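
A minimal, library-free Python sketch of that split might look like the following: the "cloud" side reduces accumulated history to a couple of parameters, which are shipped to the device for in-place prediction on the live stream. The model and the field values are my assumptions for illustration, not the authors' implementation:

import statistics

def train(history):
    # "Cloud" side: reduce accumulated big data to a tiny model
    # (two numbers) that can be shipped down to the device.
    return {"mean": statistics.mean(history),
            "stdev": statistics.pstdev(history)}

def predict(model, reading, k=3.0):
    # "Device" side: in-place intelligence on streaming small data;
    # flag a reading more than k standard deviations from normal.
    return abs(reading - model["mean"]) > k * model["stdev"]

model = train([20.1, 19.8, 20.4, 20.0, 19.9, 20.2])  # uploaded history
for value in [20.3, 27.5]:                           # live stream on device
    print(value, "anomaly" if predict(model, value) else "ok")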

 

In-place computing, which fully utilizes the vast virtual address space of our existing 64-bit architecture, opens the window for this sci-fi-like scenario. Because it works in virtual memory space, it avoids hardware lock-in and offers cross-platform computing power. With Qualcomm's introduction of 64-bit CPUs for handheld devices, all mobile devices can now serve complicated computing jobs at your fingertips. In-place computing can thus be the catalyst for a new era of the “Intelligence of Things.”

 

*Check out this awesome video where Jeff Hawkins explains how brain science will change computing

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…
