
Guest post by Jason English, Principal Analyst, Intellyx

Surely you’ve caught some of the excitement about drones for enterprise use: packages and communications delivered to the world by these ultimately mobile IoT fliers. Heavy VC investment in commercial and supply chain drone applications could drive this sector to be worth as much as $13 billion by 2020.

We all remember Amazon teasing a drone-delivery future in this now-famous ad from 5 years ago. But there’s no way the online retailer will corner this game. Expect drone delivery research to advance quickly at leading transportation firms like FedEx, UPS and DHL. Uber Eats might even have drones fly over some sushi for engineers too busy for lunch.

But could drones possibly become passé for widespread business use before they can even get out of the hangar?

Drones are the ultimate IoT play for enterprise

Of all the interesting ‘things’ in the commercial IoT continuum, from geo-location tags in trucks and packages, to remote cameras, factory robots, smart sensors and controls, power meters, wearables and medical devices, nothing captures our imagination quite like a drone.

In a sense, drones can let our productive work ‘slip the surly bonds of earth,’ with the ability to move anything, and see anything, almost anywhere in the world. It gives businesses a flock of birds to command, rather than the two-dimensional constraints of surface dwelling gadgets and robots.

Take the telecommunications industry. The ability to dispatch a maintenance drone to inspect and verify the equipment on a relay tower can save a human technician a risky and time-consuming day trip up the pole for a visual inspection, improving service efficiency while reducing insurance premiums.

In many cases, drones are even replacing telco network infrastructure themselves, holding a tethered position to provide communication or wi-fi coverage to the ground below, especially in emergency outage conditions. Facebook killed its ambitious Aquila project to expand global internet access last year, but that isn’t stopping other regional and private drone network programs.

For oil and gas, or just about any industry that involves surveying or inspection, the value of drones with advanced cameras is self-evident. Real estate firms now commonly provide dramatic flyover footage of for-sale properties, for epic establishing shots, without the epic budget.

Big agriculture is getting in on the game, exploring inspecting, seeding and possibly even spraying or weeding large crop fields with unmanned farmer drones.

And of course, for logistics and delivery services, the needle is moving. A UPS pilot program employed drones atop trucks to more efficiently handle actual doorstep delivery of packages, potentially saving the cost of untold hours of truck drivers stopping and getting out of their brown van for each package.

Are drones a nuisance, or a security menace?

I recall swimming on the serene shores of Lake Kachess here in Washington a few years ago with family and friends, miles from civilization and its accompanying noises, when an electric-razor whirring sound broke the spell of nature. A hobbyist from another campsite was buzzing us.

The kids thought it was pretty cool, but I didn’t appreciate it. What if it runs out of batteries, or flies out of range of the controller while overhead?

As drones started dropping to consumer-friendly price points, I started seeing ‘No Drones Allowed’ signs in National Park sites like Sedona, Arizona, Crater Lake, Oregon, and at Snoqualmie Falls near my house (the site famous for the ‘Twin Peaks’ show exteriors). Certainly a few disruptive drone hobbyists caused such a response.

In entertainment, drones are often associated with less-than-desirable government uses of military and surveillance activity. Hollywood films often place spy drones in the employ of authoritarian antagonists and put killer drones under the joystick of covert operations teams.

With the miniaturization of electronics and ever-improving transmitter capabilities in a lightweight package, many drones have also proven easily hackable, and detailed specifications and software mods are readily available on the Dark Web for the mischievous.

Drones are also quite effective as mobile hacking platforms — in essence they are flying laptops after all. Drones can remotely sniff for network packets without a hacker needing to step onto the target’s corporate campus.

Not the best PR for this category of IoT devices.

Flying through FAA guidelines

Fortunately, the FAA has been closely regulating and tracking the use of drones (or UAS, ‘Unmanned Aircraft Systems,’ as it calls them) from the start, and has implemented measures such as a 5-mile ‘no fly zone’ for drones around sites such as airports, as well as a requirement that any operator of a drone weighing more than 0.55 pounds (most of them) obtain a specific license to fly.

Clearer guidelines certainly help, and lead to more responsible use of the technology. For its part, the FAA says it doesn’t want to inhibit innovation and commercial use of UAS, and based on news in drone industry journals like InterDrone, the agency is partnering with business operators to gather input on guidelines for situations such as night flight and flying over people.

Who’s Taking Down Drones?

I didn’t know this before I started writing this story, but it is actually illegal to shoot down drones in the United States — even if they venture onto private property — as much as I would expect some sort of ‘Castle Law’ to allow it in this gun-lobby-controlled nation. Drones are afforded the protections due any other commercial aircraft under Federal law.

So, short of the shotgun approach, who is taking down drones today?

  • Regulators. Most democratic nations seem to be fast-tracking commercial use approvals, in order to encourage additional innovation in the space and stay up to speed with the rest of the world. That said, expect new rules and licensing guidelines to develop.
  • Hackers. Certainly the strongest threat to commercial use of drones lies in the ability for determined saboteurs to intercept or interrupt control of these devices, which are optimized for performance and range, rather than encryption and security.
  • Organized Labor. Remember that UPS drone pilot program? Well-organized workers took issue with having much of their work automated by drones. Companies will need to consider the human side of their existing business when implementing drone programs.
  • Eagles. Yes, Dutch law enforcement officials developed a program to use the actual birds of prey, not the classic rock band, to snatch suspicious drones right out of the sky and ground them. How cool is that?

The Intellyx Take

Setting all the fun toys, military stigma, and regulation uncertainty aside, I expect commercial drones to become rather commonplace in the next five years, working alongside us — or, above us.

As drone technology improves, production costs will come down, while better sensors, IoT cybersecurity measures, and even onboard AI will come into play to make them a safer and situationally aware part of the automated fabric of many companies.

They’ll never be right for every kind of work though. Drones will need to expand and enhance the abilities of our human workforce to maintain strong support in the enterprise. In the end, businesses will still need to perform an objective cost-benefit analysis to determine where drones are best fit for purpose.

Then, let ‘em fly. Just don’t tell Rambo the Drone-Killing Ram.

©2019 Intellyx LLC. Sharing or reprint of this work, edited for length with attribution is authorized, under a Creative Commons Attribution-NoDerivatives 4.0 International License. At the time of this writing, none of the companies mentioned above are Intellyx customers. Image credits: No Drone Zone, Coconino National Forest; Drone, Witolt Wacshut; CC 2.0 license, Flickr.


The IoT Brings Smart Cities to Life

Guest article by Richard van Hooijdonk

In around 30 years, planet Earth will be home to almost ten billion people, 68 per cent of whom will live in urban areas. And those urban areas will face a torrent of problems, as authorities will have to rely on limited resources to provide public services to a growing number of citizens. Besides traffic congestion and the potential rise in crime rates, rapid urbanization could also lead to a number of environmental problems like air pollution and overwhelmed waste collection systems. To tackle these challenges and make cities more liveable and manageable, governments are increasingly turning to the smart city concept.

At the heart of this approach is the use of technology to improve public services such as transportation, water systems, waste disposal, and many others. And among all the technologies smart cities deploy, the Internet of Things stands out as the most important, as it’s a network of sensors and connected devices that collect data critical for understanding how urban areas function. As Stephen Brobst, the chief technology officer of Teradata, a big data analytics company, says, the IoT enables us to “get a view of the whole city across these different domains of the life of the city as it’s captured in the sensor data.”

The many ways in which the IoT helps smart cities

Investments in smart cities are ramping up across the world and are expected to grow from $80 billion this year to $135 billion by 2021. Part of that money is allocated for IoT projects that help governments and residents to increase energy efficiency, improve traffic flow, reduce pollution, cut costs, and enjoy a number of other benefits. In other words, the IoT helps smart cities to achieve many of their key goals. Take, for example, the problem of traffic congestion in cities, which is in large part caused by drivers looking for parking space. IoT sensors embedded into the city’s streets, as in the case of Barcelona, can detect empty parking spots and alert drivers through a smartphone app. This helps people park their cars faster, saving time and fuel while reducing harmful emissions.
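The parking flow above can be sketched end to end. This is a minimal, illustrative model only: the planar coordinates, sensor IDs, and service API are invented for the example, not drawn from Barcelona's actual system.

```python
# Hypothetical smart-parking backend: street sensors report occupancy,
# and the driver's app asks for the nearest free spot.
from dataclasses import dataclass
from math import hypot

@dataclass
class ParkingSpot:
    spot_id: str
    x: float  # simplified planar coordinates (metres)
    y: float
    occupied: bool = False

class ParkingService:
    def __init__(self):
        self.spots = {}

    def report(self, spot_id, x, y, occupied):
        """Called when an embedded street sensor publishes a new reading."""
        self.spots[spot_id] = ParkingSpot(spot_id, x, y, occupied)

    def nearest_free(self, x, y):
        """Return the closest unoccupied spot to the driver, or None."""
        free = [s for s in self.spots.values() if not s.occupied]
        if not free:
            return None
        return min(free, key=lambda s: hypot(s.x - x, s.y - y))

svc = ParkingService()
svc.report("A1", 0, 0, occupied=True)
svc.report("B7", 120, 40, occupied=False)
svc.report("C3", 300, 10, occupied=False)
best = svc.nearest_free(100, 50)
print(best.spot_id)  # B7 is the closest free spot
```

In a real deployment the `report` calls would arrive over a low-power radio network rather than as direct function calls, but the query logic is the same.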

Many smart cities also tend to promote bike-sharing services as a way to reduce pollution and congestion, but bike theft could be an obstacle to that plan. One way IoT tech can help solve this issue is through technology such as Bitlock, a keyless bike lock that’s unlocked by the user’s smartphone and tracks the bike’s GPS location. This helps police track and recover stolen bikes, while also allowing private and public organizations to analyze bike traffic patterns and find ways to improve the service.

IoT technology is also effective in tracking and analyzing water use in buildings. For instance, Banyan Water, a smart water management company, claims it has helped customers save more than seven billion liters of water since its inception in 2011. It does this by placing sensors and ultrasonic meters that track water consumption across a building, then using software to analyze the gathered data and flag anomalies such as leaks and overspending.
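Banyan Water's actual algorithms are not public, but the anomaly-flagging idea described here can be sketched with a simple rolling-statistics rule. The hourly readings, window size, and sigma threshold below are all assumptions for illustration.

```python
# Flag meter readings that deviate sharply from a rolling baseline --
# e.g. sustained flow at 3 a.m. that suggests a leak.
from statistics import mean, stdev

def find_anomalies(readings, window=24, threshold=3.0):
    """Return indices where a reading deviates from the trailing
    `window` readings by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# 48 hours of normal usage around 100 L/h, then a burst pipe at hour 48
usage = [100 + (i % 5) for i in range(48)] + [400]
print(find_anomalies(usage))  # the spike at index 48 is flagged
```

Production systems would typically also model time-of-day and seasonal patterns, but even this crude baseline catches the gross leaks that dominate water loss.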

Municipal waste management companies could benefit from the IoT, too, by placing sensors in waste collection sites, and instead of adhering to strict schedules, dispatching haulers only when collection is really needed. This could cut “overhead for waste makers by up to a whopping 60 percent.”
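The sensor-driven dispatch rule behind that saving is simple to illustrate. The bin IDs, fill readings, and 80 percent threshold below are invented for the sketch, not any vendor's product.

```python
# Dispatch a hauler only to bins whose ultrasonic fill sensor reports
# they are nearly full, instead of visiting every bin on a fixed schedule.
def bins_to_collect(fill_levels, threshold=0.8):
    """Return the sorted IDs of bins whose fill fraction meets the threshold."""
    return sorted(b for b, level in fill_levels.items() if level >= threshold)

readings = {"bin-12": 0.95, "bin-13": 0.30, "bin-14": 0.82}
print(bins_to_collect(readings))  # only the nearly full bins get a pickup
```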

Things to keep in mind when implementing IoT projects
Clearly, IoT technology can improve lives in urban areas in many different ways, but simply implementing the latest tech won’t necessarily make a city ‘smart’. Marc Jadoul, the head of IoT market development at Nokia, explains that even before the first sensor is installed, the authorities must define their future objectives and budget. The next step is to create broadband internet and IoT infrastructure that can sustain increased traffic. Jadoul also suggests that the authorities need to “think big, but start small” and “identify appropriate milestones and metrics” to be able to monitor their progress. Lastly, technology isn’t the goal, but rather an instrument to make people’s lives better and more connected. To that end, the authorities should promote citizens’ engagement in ‘smart’ projects by asking for their feedback and informing them of the progress. After all, “it’s citizens’ acceptance and engagement that will eventually determine success or failure of any smart city initiative,” Jadoul concludes.

Two key challenges for the IoT and smart cities
And while authorities and citizens see smart cities as a way to live better lives, hackers see them as a potential target. The wealth of data and sensitive services that connected devices produce can be abused by bad actors to disrupt a city’s operations. For instance, imagine if cyber-attacks crippled a traffic light system or a water filtration plant and the hackers asked for ransom. This makes cyber-security one of the key priorities of any smart city endeavour. Another challenge for authorities is the need to buy expensive servers, sensors, high-speed internet networks, and a range of other equipment. Many cities struggle to find the money, although IoT projects could lead to cost savings “to the tune of $2.3 trillion in efficiencies created and revenue generated worldwide by 2024.”

Just rolling out the tech won’t be enough
As our planet becomes increasingly crowded and more people flood to cities, authorities will be under pressure to provide public services to an ever-growing number of citizens and offset the negative consequences of urbanization. Technology such as the IoT and the concept of smart cities might be a solution and a way to fight traffic congestion, pollution, inadequate water systems, and a number of other problems. But for this approach to succeed, citizen acceptance and engagement is crucial, as simply rolling out the tech won’t be enough.

Author: Richard van Hooijdonk
International keynote speaker, trend watcher and futurist Richard van Hooijdonk offers inspiring lectures on how technology impacts the way we live, work and do business. Over 420,000 people have already attended his renowned inspiration sessions, in the Netherlands as well as abroad. He works together with RTL television and presents the weekly radio program ‘Mindshift’ on BNR news radio. Van Hooijdonk is also a guest lecturer at Nyenrode and Erasmus Universities. https://www.richardvanhooijdonk.com

Sources:

Cover photo by https://www.shutterstock.com/g/yingyaipumi

Azevedo, Mary Ann, https://newsroom.cisco.com/feature-content?type=webcontent&articleId=1868607.

Giarratana, Chris, https://www.trafficsafetystore.com/blog/how-iot-technology-is-creating-the-future-smart-cities/.

Glaeser, Edward and Helen Dempster, https://www.theigc.org/reader/contagion-crime-and-congestion-overcoming-the-downsides-of-density/cities-and-urbanisation-encourage-economic-growth-in-the-developing-world/.

Horwitz, Lauren, https://www.cisco.com/c/en/us/solutions/internet-of-things/smart-city-infrastructure-guide.html.

Ismail, Nick, https://www.information-age.com/smart-city-technology-123473905/.

Jadoul, Marc, https://www.nokia.com/blog/10-recommendations-creating-smart-city/.

Maddox, Teena, https://www.techrepublic.com/article/smart-cities-expected-to-invest-80b-in-technologies-in-2018/.

https://www.nationalgeographic.com/environment/habitats/urban-threats/.

http://www.sensanetworks.com/blog/waste-management-gets-sexy-smart-sensor-tech/

https://www.un.org/development/desa/en/news/population/world-population-prospects-2017.html.

https://www.un.org/development/desa/en/news/population/2018-revision-of-world-urbanization-prospects.html.


As we covered in the past, Gartner is out with its predictions for IoT, this time for the years 2018 through 2023. The announcement was made at the Gartner Symposium/ITxpo 2018 in Barcelona, Spain.

Nick Jones, research vice president at Gartner said, “The IoT will continue to deliver new opportunities for digital business innovation for the next decade, many of which will be enabled by new or improved technologies. CIOs who master innovative IoT trends have the opportunity to lead digital innovation in their business.”

And CIOs, if you're not paying attention, get on it. Gartner says you need skills and partners to support the IoT: come 2023, the average CIO will be responsible for more than three times as many endpoints as this year.

Gartner shortlisted the 10 most strategic IoT technologies and trends that will enable new revenue streams and business models, as well as new experiences and relationships:

Trend No. 1: Artificial Intelligence (AI)

Gartner forecasts that 14.2 billion connected things will be in use in 2019, and that the total will reach 25 billion by 2021, producing an immense volume of data. “Data is the fuel that powers the IoT and the organization’s ability to derive meaning from it will define their long term success,” said Mr. Jones. “AI will be applied to a wide range of IoT information, including video, still images, speech, network traffic activity and sensor data.”

The technology landscape for AI is complex and will remain so through 2023, with many IT vendors investing heavily in AI, variants of AI coexisting, and new AI-based tools and services emerging. Despite this complexity, it will be possible to achieve good results with AI in a wide range of IoT situations. As a result, CIOs must build an organization with the tools and skills to exploit AI in their IoT strategy.

Trend No. 2: Social, Legal and Ethical IoT

As the IoT matures and becomes more widely deployed, a wide range of social, legal and ethical issues will grow in importance. These include ownership of data and the deductions made from it; algorithmic bias; privacy; and compliance with regulations such as the General Data Protection Regulation.

“Successful deployment of an IoT solution demands that it’s not just technically effective but also socially acceptable,” said Mr. Jones. “CIOs must, therefore, educate themselves and their staff in this area, and consider forming groups, such as ethics councils, to review corporate strategy. CIOs should also consider having key algorithms and AI systems reviewed by external consultancies to identify potential bias.”

Trend No. 3: Infonomics and Data Broking

Last year’s Gartner survey of IoT projects showed 35 percent of respondents were selling or planning to sell data collected by their products and services. The theory of infonomics takes this monetization of data further by seeing it as a strategic business asset to be recorded in the company accounts. By 2023, the buying and selling of IoT data will become an essential part of many IoT systems. CIOs must educate their organizations on the risks and opportunities related to data broking in order to set the IT policies required in this area and to advise other parts of the organization.

Trend No. 4: The Shift from Intelligent Edge to Intelligent Mesh

The shift from centralized and cloud to edge architectures is well under way in the IoT space. However, this is not the end point, because the neat set of layers associated with edge architecture will evolve to a more unstructured architecture comprising a wide range of “things” and services connected in a dynamic mesh. These mesh architectures will enable more flexible, intelligent and responsive IoT systems — although often at the cost of additional complexities. CIOs must prepare for mesh architectures’ impact on IT infrastructure, skills and sourcing.

Trend No. 5: IoT Governance

As the IoT continues to expand, the need for a governance framework that ensures appropriate behavior in the creation, storage, use and deletion of information related to IoT projects will become increasingly important. Governance ranges from simple technical tasks such as device audits and firmware updates to more complex issues such as the control of devices and the usage of the information they generate. CIOs must take on the role of educating their organizations on governance issues and in some cases invest in staff and technologies to tackle governance.

Trend No. 6: Sensor Innovation

The sensor market will evolve continuously through 2023. New sensors will enable a wider range of situations and events to be detected, current sensors will fall in price to become more affordable or will be packaged in new ways to support new applications, and new algorithms will emerge to deduce more information from current sensor technologies. CIOs should ensure their teams are monitoring sensor innovations to identify those that might assist new opportunities and business innovation.

Trend No. 7: Trusted Hardware and Operating System

Gartner surveys invariably show that security is the most significant area of technical concern for organizations deploying IoT systems. This is because organizations often don’t have control over the source and nature of the software and hardware being utilised in IoT initiatives. “However, by 2023, we expect to see the deployment of hardware and software combinations that together create more trustworthy and secure IoT systems,” said Mr. Jones. “We advise CIOs to collaborate with chief information security officers to ensure the right staff are involved in reviewing any decisions that involve purchasing IoT devices and embedded operating systems.”

Trend No. 8: Novel IoT User Experiences

The IoT user experience (UX) covers a wide range of technologies and design techniques. It will be driven by four factors: new sensors, new algorithms, new experience architectures and context, and socially aware experiences. With an increasing number of interactions occurring with things that don’t have screens and keyboards, organizations’ UX designers will be required to use new technologies and adopt new perspectives if they want to create a superior UX that reduces friction, locks in users, and encourages usage and retention.

Trend No. 9: Silicon Chip Innovation

“Currently, most IoT endpoint devices use conventional processor chips, with low-power ARM architectures being particularly popular. However, traditional instruction sets and memory architectures aren’t well-suited to all the tasks that endpoints need to perform,” said Mr. Jones. “For example, the performance of deep neural networks (DNNs) is often limited by memory bandwidth, rather than processing power.”

By 2023, it’s expected that new special-purpose chips will reduce the power consumption required to run a DNN, enabling new edge architectures and embedded DNN functions in low-power IoT endpoints. This will support new capabilities such as data analytics integrated with sensors, and speech recognition included in low-cost, battery-powered devices. CIOs are advised to take note of this trend, as silicon chips enabling functions such as embedded AI will in turn enable organizations to create highly innovative products and services.

Trend No. 10: New Wireless Networking Technologies for IoT

IoT networking involves balancing a set of competing requirements, such as endpoint cost, power consumption, bandwidth, latency, connection density, operating cost, quality of service, and range. No single networking technology optimizes all of these and new IoT networking technologies will provide CIOs with additional choice and flexibility. In particular they should explore 5G, the forthcoming generation of low earth orbit satellites, and backscatter networks.

Gartner clients can learn more in the report “Top Strategic IoT Trends and Technologies Through 2023.”

Photo credit: Jim Templeton Cross www.templeton-cross.com, Gartner Symposium/ITxpo Barcelona 2011


When I ask people what they think the Internet of Things (IoT) is all about, the vast majority will say “smart homes,” probably based on personal experience. If I say that it is also about industries making use of data from sensors, then most people’s immediate reaction is to think of manufacturing. Sensors have been used for a long time in manufacturing, and the concept of using data generated at the edge to monitor and run automated processes is well understood.

This perception, however, is underselling the IoT. In practice, it can be applied anywhere.

Monitoring ‘things’

The use cases for industries with “things” to monitor are easy to identify.

Manufacturing is one of the most obvious. Connected sensors can be used to monitor and manage the health of manufacturing equipment, identify root causes of defects and improve quality.

Health care has equipment that generates digital information about how patients’ bodies are working (e.g., blood pressure) and what they look like (e.g., scans). There are numerous opportunities to monitor people’s health more closely and accurately and catch signs of disease early, or even avoid it altogether.

The insurance industry is using telematics to monitor driving behaviour and assess the risk posed by individual drivers. Telematics also helps with the claims process because information from before a crash can indicate who is at fault, and images of a damaged vehicle can be used to assess whether the car should be written off or repaired.

The IoT also, however, has potential in industries that, on the face of it, do not really have “things,” such as financial services. Banks and other financial providers are extremely interested in the IoT, focusing on “things” which do not belong to the banks themselves, but to customers: mobile phones and payment cards, for example. Banks can improve fraud detection by notifying customers each time their cards are used – in real time – and also checking that the customer is with the card at the time. That, clearly, is a huge service for customers: no more cloning and no more fraudulent transactions.
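The "is the customer with the card?" check described above can be sketched as a distance test between the phone's last GPS fix and the transaction terminal. The 5 km threshold and the data layout are illustrative assumptions, not any bank's actual fraud rule.

```python
# Flag a card-present transaction when the cardholder's phone is far
# from the terminal where the card was used.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def flag_transaction(phone_fix, terminal, max_km=5.0):
    """True if phone and terminal are implausibly far apart."""
    return haversine_km(*phone_fix, *terminal) > max_km

# Phone in central London, card swiped in Paris: flag for review
print(flag_transaction((51.5074, -0.1278), (48.8566, 2.3522)))  # True
```

A real system would combine this signal with spending patterns and merchant risk scores rather than blocking on distance alone (the phone may simply be at home).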

A change in business model

A fundamental shift in business model is being enabled by IoT analytics: a move from products to services. For example, Rolls-Royce is traditionally considered an engine manufacturer. The company made and sold engines, then sold services to maintain those engines. Now, however, rather than pay for maintenance, airlines can choose to pay an hourly rate for the time that the engine is propelling the aircraft. In other words, they pay for what they actually want: the plane in flight at particular times. Increasingly, individuals, too, are choosing to pay for a service rather than goods, such as access to a car-sharing service rather than owning a car.

This shift, however, has challenges for the service providers. If you are providing a service that includes a physical asset, you do not want to have to spend time and resources inspecting that asset. Instead, you want it to run itself as much as possible. The IoT allows providers to remotely monitor and collect data on all the important aspects of each asset – how it is performing, how it is being used and environmental factors, for example – and therefore automate much of its management.

The data collected from the IoT is only really useful when you can derive useful intelligence from it, and preferably in an automated way. This automation, however, requires intelligence, and that means artificial intelligence (AI).

The importance of AI – and the problem

This is one of the biggest reasons why the IoT is really taking off now: AI algorithms are becoming more usable. There is, however, still a problem. Most AI algorithms need huge amounts of data and computing power. They therefore rely on powerful servers and central data storage.

In computing terms, we humans perform most of our computation and decision making at the edge (in our brain) and in the (pre-)moment, referring to other sources (internet, library, other people) where our own processing power and memory will not suffice. This is more or less the complete opposite of the current AI algorithms, which tend to perform most of their calculations far from the data source, in servers, drawing on stored data.

To enable timely decision making in the world of IoT, you need to be able to deploy some of the cleverness (predictive models and decisioning rules) at the edge, closer to the “things” that you are managing. Some businesses are already doing this, whilst many others are still trying to figure out how to organise and make sense of the deluge of data available to them. Those at the forefront of combining AI and IoT have a huge opportunity to steal a march on their competition.
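One minimal sketch of pushing "cleverness" to the edge: a model trained centrally is exported as plain coefficients, and the gateway scores each reading locally, uplinking only the events that matter. The model, feature names, and threshold here are invented for illustration.

```python
# A centrally trained model exported as plain numbers for edge scoring.
# In practice this might be a small decision tree or quantized DNN.
EXPORTED_MODEL = {"weights": {"temp_c": 0.08, "vibration_g": 1.5}, "bias": -7.0}

def score(reading, model=EXPORTED_MODEL):
    """Linear risk score computed entirely on the edge device."""
    return sum(model["weights"][k] * reading[k] for k in model["weights"]) + model["bias"]

def on_sensor_reading(reading, alert_threshold=0.0):
    """Edge decisioning: act locally, uplink only when the score demands it."""
    s = score(reading)
    if s > alert_threshold:
        return {"action": "uplink_alert", "score": s}
    return {"action": "log_locally"}

print(on_sensor_reading({"temp_c": 40, "vibration_g": 0.2}))  # logged locally
print(on_sensor_reading({"temp_c": 85, "vibration_g": 1.1}))  # alert uplinked
```

The design choice mirrors the human analogy in the text: the routine computation happens "in the brain" at the edge, and the central servers are consulted only for retraining and for the events worth escalating.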

In my personal view, this is the biggest change in business models since the dot-com boom. And, as in the 1990s, there will be some big winners, and there will also be those who don’t quite get it right, and fall by the wayside.

by Jennifer Major, Head of IoT, SAS

This blog originally appeared as a SAS "Hidden Insights" blog.

Photo by Franki Chamaki on Unsplash


In 2016, the Industrial Internet Consortium gained agreement upon an understanding of the term “trustworthiness” and its effect on design and operation of an industrial system. At the core of that understanding was a definition of trustworthiness and the designation of five characteristics that define trustworthiness.

As defined by the IIC in its recently released Industrial Internet of Things Vocabulary v2.1 document: “Trustworthiness is the degree of confidence one has that the system performs as expected. Characteristics include safety, security, privacy, reliability and resilience in the face of environmental disturbances, human errors, system faults and attacks.”

Let’s take a deeper look at the 5 foundational characteristics at the core of trustworthiness:

  • Safety ensures that a system operates without causing unacceptable risk of physical injury or damage to people’s health, whether that harm is inflicted directly or indirectly, for example as the result of damage to property or to the environment.
  • Security protects a system from unintended or unauthorized access, change or destruction while Information Technology (IT) security ensures availability, integrity and confidentiality (AIC model) of data at rest, in motion or in use.
  • Reliability describes the ability of a system or component to perform its required functions under stated conditions for a specified period of time.
  • Resilience describes the ability of a system or component to prevent or at least reduce any serious impact of a disruption while maintaining an acceptable level of service.
  • Privacy protects the right of individuals to control or influence what information related to them may be collected and stored and by whom and to whom that information may be disclosed.

Achieving trustworthiness in industrial IoT systems requires recognizing that a complex IoT system is composed of subsystems, and of the components integral to those subsystems. The trustworthiness of the overall system depends upon the trustworthiness of each subsystem and each component, how they are integrated, and how they interact with each other. Trustworthiness must be pervasive in IoT systems, which means there must be trustworthiness by design and a means to achieve assurance that the trustworthiness aspects have been addressed properly. Permeation of trust is the flow of trust within a system from its overall usage down to its smallest components, and it requires trustworthiness of all aspects of the system. Trustworthiness requires ongoing effort over time as systems and circumstances change.

As such, the IIC Trustworthiness Task Group, in close cooperation with the IIC Security Working Group, is tasked with continually enhancing and refining the definition and role of trustworthiness in industrial systems as the IIoT continues to evolve. Ultimately, the goal is to move system designers from traditional safety thought processes into a new paradigm for system design that takes into consideration all 5 of the trustworthiness characteristics and their interactions within the system.

You can read more about trustworthiness and its relationship with industrial systems and the convergence of IT/OT in the Fall 2018 issue of the IIC's Journal of Innovation.

By Marcellus Buchheit, Co-founder of Wibu-Systems AG and President and CEO of Wibu-Systems USA

This blog originally appeared as a Wibu-Systems Blog

Read more…

How to Use IoT Datasets in #AI Applications

Guest post by Ajit Jaokar

Recently, Google launched Dataset Search, a great resource for finding datasets. In this post, I list some IoT datasets which can be used for machine learning or deep learning applications. But finding datasets is only part of the story. A static dataset is not enough for IoT, i.e. much of the interesting analysis happens in streaming mode. To create an end-to-end streaming implementation from a given dataset, we need full stack skills. These are more complex (and in high demand). In this post, I hence describe not only the datasets but also a full stack implementation. An end-to-end flow implementation is described in the book Agile Data Science, 2.0 by Russell Jurney. I use this book in my teaching at the Data Science for Internet of Things course at the University of Oxford, and I demonstrate the implementation from this book below. The views here represent my own.

In understanding an end-to-end application, the first problem is how to capture data from a wide range of IoT devices. The protocol typically used for this is MQTT, a lightweight, publish-subscribe-based messaging protocol used in IoT applications to manage large numbers of IoT devices that often have limited connectivity, bandwidth and power. MQTT integrates with Apache Kafka, a highly scalable distributed streaming platform. Kafka provides high scalability, longer storage and easy integration with legacy systems, and it ingests, stores, processes and forwards high volumes of data from thousands of IoT devices. (source: Kai Waehner)
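As a minimal sketch of what such an MQTT message might look like (the device id, metric name and topic layout below are illustrative assumptions, not taken from any particular dataset), a sensor event is typically a small JSON payload published to a hierarchical topic:

```python
import json
import time

def make_sensor_event(device_id, metric, value):
    """Build the kind of small JSON payload an MQTT client would publish."""
    return json.dumps({
        "device_id": device_id,
        "metric": metric,
        "value": value,
        "timestamp": int(time.time()),
    })

# Topic names are conventionally hierarchical, e.g. "sensors/<device>/<metric>".
topic = "sensors/car-42/engine_temp"
payload = make_sensor_event("car-42", "engine_temp", 92.7)
print(topic, payload)
```

An MQTT client library (e.g. Eclipse Paho) would then publish `payload` to `topic` on a broker, from where an MQTT-Kafka bridge can forward the event into a Kafka topic for downstream processing.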

Full stack – End to End

With this background, let us try to understand the end-to-end (full stack) implementation of an IoT dataset. This section is adapted from the Agile Data Science, 2.0 book.

Image source:  Agile Data Science, 2.0 by Russell Jurney

We have the following components

Events: occurrences with a relevant timestamp. Events can represent various things (e.g. logs from a server); in our case, they represent time series data from sensors, typically represented as JSON objects

Collectors are event aggregators which collect events from various sources and queue them for action by real-time workers. Typically, Kafka or Azure Event Hubs may be used at this stage.

Bulk storage – a file system capable of high I/O, for example S3 or HDFS

Distributed document store – e.g. MongoDB

Web application server – e.g. Flask or Node.js

The data processing is done via Spark. PySpark is used for the machine learning (with either the scikit-learn or Spark MLlib libraries) and the results are stored in MongoDB. Apache Airflow can be used for scheduling.
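To make the processing step concrete, here is a minimal pure-Python sketch of the kind of logic a streaming job applies to each incoming sensor value (the window size, threshold and sample readings are invented for illustration; in the pipeline above this logic would run inside a Spark job rather than in plain Python):

```python
from collections import deque

def make_rolling_detector(window=5, k=3.0):
    """Flag a reading as anomalous when it deviates from the rolling mean
    of recent readings by more than k standard deviations."""
    history = deque(maxlen=window)

    def score(value):
        anomalous = False
        if len(history) >= 2:
            mean = sum(history) / len(history)
            var = sum((v - mean) ** 2 for v in history) / len(history)
            if var > 0:
                anomalous = abs(value - mean) > k * var ** 0.5
        history.append(value)
        return anomalous

    return score

detect = make_rolling_detector()
readings = [20.0, 21.0, 20.5, 19.8, 55.0]
print([detect(v) for v in readings])  # → [False, False, False, False, True]
```

The same score-and-append pattern generalizes to a trained model: replace the rolling-statistics check with a call to the model's predict function on each event.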

Code

From the GitHub repository of Agile Data Science, 2.0:

https://github.com/rjurney/Agile_Data_Code_2/tree/training

The EC2 scripts: https://github.com/rjurney/Agile_Data_Code_2/blob/training/aws/ec2_bootstrap.sh

The real-time notebook with Spark ML/Streaming : https://github.com/rjurney/Agile_Data_Code_2/blob/training/ch08/Deploying%20Predictive%20Systems.ipynb

Finally, below are some of the reference datasets you can use with IoT.

To conclude, using the strategy and code described here – you could in principle, create an end to end streaming IoT application. 

IoT datasets

Utilities

Gas Sensor Array Drift Dataset Data Set

Water Treatment Plant Data Set

Internet Usage Data Data Set

Commercial Building Energy Dataset

Individual household electric power consumption Data Set

AMPds2: The Almanac of Minutely Power dataset (Version 2)

Commercial Building Energy Dataset Energy, - Smart Building Energy ...

Individual household electric power consumption Energy, Smart home ...

Energy, Smart home AMPds contains electricity, water, and natural g...

UK Domestic Appliance-Level Electricity Energy, Smart Home Power de...

Gas sensors for home activity monitoring Smart home Recordings of 8...

Smart cities

Traffic Sign Recognition Testsets

Pollution Measurements for the City of Brasov in Romania

GNFUV Unmanned Surface Vehicles Sensor Data Data Set

CGIAR dataset Agriculture, Climate - High-resolution climate datase...

Uber trip data Transportation About 20 million Uber pickups in New ...

Traffic Sign Recognition Transportation

Malaga datasets Smart City A broad range of categories such as ener...

CityPulse Dataset Collection Smart City Road Traffic Data, Pollutio...

Open Data Institute – node Trento Smart City Weather, Air quality, ...

Taxi Service Trajectory Transportation Trajectories performed by al...

T-Drive trajectory data Transportation Chicago Bus Traces data Tran...

CityPulse Dataset Collection

Taxi service trajectories

Health and home activity

Educational Process Mining Education, Recordings of 115 subjects’ a...

PhysioBank databases Healthcare - Archive of over 80 physiological ...

Saarbruecken Voice Database Healthcare - A collection of voice reco...

CASAS datasets for activities of daily living - Smart home Several ...

ARAS Human Activity Dataset - Smart home Human activity recognition...

MERLSense Data - Smart home, building Motion sensor data of residua...

SportVU Sport Video of basketball and soccer games captured from 6 ...

RealDisp Sport Includes a wide range of physical activities (warm u...

GeoLife GPS Trajectories Transportation A GPS trajectory by a seque...

Various sensor driving datasets

IoT Network Dataset

Various MHEALTH / physical activity dataset

Source: for some of the datasets Deep Learning for IoT Big Data and Streaming Analytics: A Survey

This article originally appeared on our sister site Data Science Central.

Read more…

Here's the latest IoT Central Digest. Encourage your friends and colleagues to be a part of our community by forwarding this newsletter to them. They can join IoT Central here. You can contribute your thoughts on IoT here.  

Featured Resources and Technical Contributions

Source for picture: contribution marked with a +

Read more…

Guest post by Kai Waehner

I built a scenario for a hybrid machine learning infrastructure leveraging Apache Kafka as a scalable central nervous system. The public cloud is used for training analytic models at extreme scale (e.g. using TensorFlow and TPUs on Google Cloud Platform (GCP) via Google ML Engine). The predictions (i.e. model inference) are executed on premises at the edge in a local Kafka infrastructure (e.g. leveraging Kafka Streams or KSQL for streaming analytics).

This post focuses on the on-premises deployment. I created a GitHub project with a KSQL UDF for sensor analytics. It leverages the new API features of KSQL to build UDF / UDAF functions easily with Java to do continuous stream processing on incoming events.

Use Case: Connected Cars - Real Time Streaming Analytics using Deep Learning

Continuously process millions of events from connected devices (sensors of cars in this example):

Connected_Cars_IoT_Deep_Learning

I built different analytic models for this. They are trained in the public cloud leveraging TensorFlow, H2O and Google ML Engine. Model creation is not the focus of this example. The final model is ready for production and can be deployed for doing predictions in real time.

Model serving can be done via a model server or natively embedded into the stream processing application. See the trade-offs of RPC vs. Stream Processing for model deployment and a ....

Demo: Model Inference at the Edge with MQTT, Kafka and KSQL

The Github project generates car sensor data, forwards it via Confluent MQTT Proxy to ....

This project focuses on the ingestion of data into Kafka via MQTT and processing of data via KSQL: MQTT_Proxy_Confluent_Cloud

A great benefit of Confluent MQTT Proxy is simplicity for realizing IoT scenarios without the need for a MQTT Broker. You can forward messages directly from the MQTT devices to Kafka via the MQTT Proxy. This reduces efforts and costs significantly. This is a perfect solution if you "just" want to communicate between Kafka and MQTT devices.

If you want to see the other part of the story (integration with sink applications like Elasticsearch / Grafana), please take a look at the Github project "KSQL for streaming IoT data". This realizes the integration with ElasticSearch and Grafana via Kafka Connect and the Elastic connector.

KSQL UDF - Source Code

It is pretty easy to develop UDFs. Just implement the function in one Java method within a UDF class:

            @Udf(description = "apply analytic model to sensor input")
            public String anomaly(String sensorinput) {
                // YOUR LOGIC
            }

Here is the full source code for the Anomaly Detection KSQL UDF.

How to run the demo with Apache Kafka and MQTT Proxy?

All steps to execute the demo are described in the GitHub project.

You just need to install Confluent Platform and then follow these steps to deploy the UDF, create MQTT events and process them via KSQL levera....

I use Mosquitto to generate MQTT messages. Of course, you can use any other MQTT client, too. That is the great benefit of an open and standardized protocol.

Hybrid Cloud Architecture for Apache Kafka and Machine Learning

If you want to learn more about the concepts behind a scalable, vendor-agnostic Machine Learning infrastructure, take a look at my presentation on Slideshare or watch the recording of the corresponding Confluent webinar "Unleashing Apache Kafka and TensorFlow in the Cloud".

https://www.slideshare.net/KaiWaehner/unleashing-apache-kafka-and-t...

Please share any feedback! Do you like it, or not? Any other thoughts?

This article originally appeared on our sister site Data Science Central.

Read more…


We often don’t compare technology to fables, but when it comes to the internet of things (IoT), the story of Pandora’s Box comes to mind. It’s a technology with great potential, but its weakness lies in its lack of basic security measures. We might even go as far as to ask: what security? These are the concerns we’re thinking about at IT Security Central.

As a completely remote company, we’re taking measures to understand how the internet of things can impact our company’s data security. Hackers look to exploit technology vulnerabilities to access valuable information. Hacking an IoT-connected fish tank or a smart fridge – these aren’t far-fetched stories. They are happening now.

The lack of secured IoT devices starts in the development phase. These devices are built on a basic Linux operating system with default credentials that buyers rarely change. When these devices are developed, security isn’t on the agenda; developers are looking at human behaviors and outside threats, when they should be looking inwards.

An unsecured IoT device is the weak link in the connection. As one of the fundamental purposes of the technology is to provide connection and accessibility, this one weak link can bring down the entire network. And if your remote workers’ BYOD devices are in some way connected to that network, your company just became vulnerable.

Remote work, or ‘the gig economy’, is expected to keep growing. According to the Global Mobile Workforce Forecast Update, employees working remotely are supposed to increase to 42.5% of the working population by 2022. At that time, the world is projected to see half of its population working outside the office either full-time or part-time.

Security vulnerabilities, remote workers and IoT – where is the connection? The scary thing is that remote workers are likely to already have IoT devices in their work environment, and most likely, they are not protected. These are mostly smart home devices that workers have acquired to make their daily lives easier. Common devices include the Amazon Echo, Neo and GeniCan.

The first step in active prevention is to make your employees aware of the importance of data security and then aid them with the tools for success.

Best Practices for Protecting Your Network (from Remote Workers)

With the wealth of internet-based security technologies, the idea of protecting your network with in-house servers and the traditional firewall is (well) old school. With cloud-based companies, you can now access and protect data in easy step-by-step processes, and the best news, most of these companies do the data management for you.

One of the most progressive approaches to remote worker security is to adopt a monitoring service that collects data and actively looks for anomalies in the network. Through data collection and analysis, monitoring software creates a user profile of normal, everyday behavior. The administrator can set alerts for when certain data repositories and files are accessed, or when sensitive data is moved. The longer a data breach goes undetected, the larger the financial implication for the company. Requiring remote workers to download and use remote monitoring software is one of the highest levels of protection against data loss.
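As a simplified sketch of the baseline-and-alert idea (the user names, resource names and the never-seen-before rule are invented for illustration; real monitoring products use much richer behavioral models), such a tool learns what each user normally touches and flags deviations:

```python
def build_baseline(events):
    """Record which resources each user touches during normal operation."""
    baseline = {}
    for user, resource in events:
        baseline.setdefault(user, set()).add(resource)
    return baseline

def alerts(baseline, events):
    """Flag accesses to resources a user has never touched before."""
    return [(u, r) for u, r in events if r not in baseline.get(u, set())]

normal = [("alice", "crm"), ("alice", "wiki"), ("bob", "repo")]
observed = [("alice", "wiki"), ("bob", "payroll-db")]
print(alerts(build_baseline(normal), observed))  # → [('bob', 'payroll-db')]
```

Alice revisiting the wiki matches her profile, while Bob's first-ever access to the payroll database stands out as exactly the kind of anomaly an administrator would want an alert on.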

But if monitoring isn’t on your agenda, these are a few basic tactics that employers can encourage remote workers to undertake.

Permissions Management

Though the workers are remote, administrators can set limits on data access. This process starts with a thorough analysis and understanding of each position. It’s important to understand who needs access to what information, and who doesn’t. Once this is understood, administrators can restrict information, and they can also set alerts for when information is accessed without prior approval.

Home Network Policy

Once employees leave the brick & mortar walls, the manager has little visibility into where, and on what internet network, they’re accessing information. But don’t fret; this freedom and flexibility is part of what makes remote work appealing. Where privacy might be a factor, we don’t suggest going as far as asking remote workers to eliminate IoT devices on their network. Rather, we encourage creating a policy that specifically states the security requirements IoT devices must meet in order for the work network to be accessed. By educating your employees, you can spare them, and yourself, the heartbreak of data loss.

Encryption

Encryption, encryption, encryption. You’ve heard the importance of encryption. For remote workers, the company can never be too safe, so it should go the extra mile and set remote workers up on an encrypted network. A VPN ensures all connections and communications are encrypted when the network is accessed, whether via IoT connectivity in their home or when remote employees connect to an unsecured public wi-fi network. A VPN provides the next level of security through encryption, and a hacker won’t be able to access communications or data without alerting administrators to a potential breach.

IoT devices are already integrating into our at-home lives, and when remote workers access their at-home networks, suddenly the topics collide. As more workers go remote, it’s important to look inwards towards security to see how everyday IoT devices impact company data. Take the time to ensure that remote workers are protecting the network effectively.

Guest post by Isaac Kohen. Isaac Kohen is the founder and CEO of Teramind (https://www.teramind.co/), an employee monitoring and insider threat prevention platform that detects, records, and prevents, malicious user behavior in addition to helping teams to drive productivity and efficiency. Isaac can be reached at [email protected]. Connect with Isaac on social media: LinkedIn, IT Security Central and Twitter @TeramindCo.

 

 

Read more…

Guest post by Romain Wurtz, Chief Technology Officer, NarrativeWave

As companies engage in the implementation of analytics and data science applications, many challenges lie ahead. According to the Harvard Business Review, many data science applications fail due to poor goal definition, a lack of understanding of the key data, or a lack of focus on business value.

We believe the best route to data analytics and particularly analytics for the Industrial Internet of Things, must have several key elements:

Key Elements of Effective Analytics:

  • Builds upon your Subject Matter Experts’ existing knowledge.
  • Allows engineers to use the platform and be part of the analytics process.
  • Enables automation of key processes.
  • Builds a solid foundation for more complex analytics (e.g. predictive).

This article takes a look at each of these elements in further detail and explores why they are important to driving value for your organization.

Having a platform built on your subject matter experts’ knowledge is the best starting point.

Your Subject Matter Experts (SMEs) and engineers have been building and maintaining your equipment for decades. Their expertise and knowledge is the best available expertise on how your equipment should be operated, maintained, and evaluated. Incorporating their knowledge to best evaluate data from the equipment and what that data means, is the ideal starting point for the application of analytics.

Analytics platforms using purely Machine Learning or Artificial Intelligence may lack insight on what the data means and the meaning of events within the data. Without human interaction or interpretation, more advanced analytics, such as predictions, have a difficult time achieving the desired outcome. Without a determined outcome, the process can take months to evaluate, and even then, the analytic effectiveness and accuracy can remain unknown and unproven.

We believe the best starting point for analytics is one that starts by using your own proven analytic methods as a foundation and then allows for a natural, building blocks approach.

Using a platform that allows engineers to be part of the process helps with the adoption of analytics.

Adopting new analytics and data driven business models is fundamentally about changing the way business has been done for many years. In an effort to make this transition, gaining adoption and trust of key players within your organization will significantly impact the success of a new program. Having a platform where SMEs can interact and engage, without having to be a data scientist or a developer, results in higher adoption and more impactful business outcomes for the organization.

Implementing a platform that automates current processes creates short-term and significant value.

In order to gain value from large data sets and sensor data, only a platform that starts to automate part of the process can create scalable value. Meaning, the platform must be able to interpret data, generate insights, and provide recommended outcomes for end users. Otherwise, it becomes just another way to visualize and explore data. This can add value on its own, but doesn’t reach the impact that automation provides. As noted earlier, building a system on your proven analytic methods, and then adding a layer of more advanced analytics, such as machine learning based predictions, is the best route to a highly accurate, automated platform.

Building a platform with a solid foundation of your experts’ knowledge is the best way to approach implementing an entire suite of analytics.

Building a platform configured by your own SMEs creates the optimal foundation for an entire range of analytics. Your experts can provide knowledge about significant areas such as:

  • The meaning of key data.
  • How sensors are related to each other.
  • What constitutes an actionable event?
  • What constitutes a false alarm?
  • Exceptions to the rule.

Once this knowledge is part of an automated platform, adding a full range of analytics becomes more impactful. For example, knowledge of what constitutes a false alarm can lead to an insight describing what turned a false alarm into a valid alarm and what indicators are worth automatically tracking. By contrast, an approach that solely tries to use machine learning or AI techniques without these key understandings, can struggle with the “right” business outcome, accuracy, dealing with exceptions, and delivering significant value to the business.
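As an illustrative sketch of capturing such knowledge (the rule names, event fields and thresholds are invented; NarrativeWave's actual platform is configured by engineers through its interface, not raw code), SME knowledge of false alarms can be encoded as simple, auditable predicates applied to each incoming alarm:

```python
def validate_alarm(event, false_alarm_rules):
    """Apply SME-authored rules: an alarm is dismissed as false
    if any known false-alarm condition matches the event."""
    for name, rule in false_alarm_rules.items():
        if rule(event):
            return ("false_alarm", name)
    return ("valid_alarm", None)

# Example SME rules: spikes during a scheduled maintenance window and
# missing readings (sensor dropout) are known false positives.
rules = {
    "maintenance_window": lambda e: e.get("in_maintenance", False),
    "sensor_dropout": lambda e: e.get("reading") is None,
}

print(validate_alarm({"reading": 9.2, "in_maintenance": True}, rules))
# → ('false_alarm', 'maintenance_window')
```

Because each rule is named and explicit, the system stays auditable rather than a "black box", and a machine learning layer can later be trained on which rule (if any) fired for each historical alarm.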

Business Cases & Outcomes

These business case examples show how we at NarrativeWave impact customers’ operations, profitability, unplanned downtime, and workforce efficiency.

Improved Accuracy of Event & Alarm Analysis.

Challenge: The traditional workflow of diagnosing events or alarms on large industrial assets is a manual process for engineers. A manufacturer was looking for a solution that would increase accuracy and reduce the risk of costly human errors. 

Solution: NarrativeWave’s platform allowed the customer’s engineers to create detection models and equations through the SaaS platform. Currently, this manufacturer receives accurate and automated root-cause analysis of events in near real-time.

Impact: The software provided a 25% increase in accuracy of diagnosing events, which means a more consistent, predictable solution for this manufacturer’s engineers and clients.

Reduced Time Spent Diagnosing Alerts & Alarms

Challenge: Sensors on large industrial assets generate millions of data points per second. When an alert was triggered, engineers spent hours conducting redundant, manual research to diagnose the problem and produce an actionable report for clients. The diagnostic process can take up to 16 hours and technicians were struggling to keep up with the expanding service requirements. 

Solution: The NarrativeWave platform automated their manual processes, delivering an analysis, actionable insights, recommendations, and a report to their engineers in less than 3 minutes. This allowed their engineers to make near real-time decisions on what happened, why it happened, and what to do next.

Impact: The outcome resulted in a 95% time savings in diagnosing alerts and alarms, which reduced unplanned equipment downtime, improved workforce efficiency, and enhanced service contract profitability. This proved the opportunity for a multi-million dollar savings per year for this OEM, and better supported real-time service contracts.

Optimized Productivity of Skilled Engineering Labor

Challenge: More than 50% of all industry alarms are false positives, which still have to be diagnosed and solved. A customer was looking for a solution that would allow their engineers to optimize their workflow and spend less time servicing invalid alarms. 

Solution: The NarrativeWave platform automated the root cause analysis of events to produce actionable insights based on the manufacturer’s data. The outcome was an explanation of the event that occurred and guidance on what to do next, which was provided to the engineers within a few minutes.

Impact: The platform accurately and quickly invalidated false alarms, allowing engineers to focus more time on resolving valid alarms and serving their clients. For the first time, engineers were being leveraged in the best way to impact this manufacturer’s operations.

Increased Efficiency in Creating Detection Models

Challenge: A large enterprise client had a robust analysis setup with 3 detection models and 150 threshold variants. The client’s process for iterating detection models originally took 3–4 months and required engineers to rely on development from either a software engineer, data scientist, or an outside software vendor. 

Solution: NarrativeWave’s platform provided an intuitive pipeline, enabling their business users to quickly create, manage, and iterate their own detection models. The platform is user-directed, managed and utilized by the customer’s internal engineers, without the ongoing need of developers or data scientists.

Impact: The iteration timeframe has been dramatically reduced since using NarrativeWave. More importantly, this customer’s engineers can setup iterations on their own, allowing for immediate impact on the business operations and for their clients.

Enhanced Next Generation Knowledge Base

Challenge: Engineers have been detecting alarms individually for 30 or more years. While working with a major engine manufacturer, NarrativeWave found the detection process was not recorded, standardized, or made available to other engineers and management within the organization. 

Solution: The platform is setup to record the engineers’ knowledge and feedback, resulting in a platform that gets smarter over time. Engineers can customize the business analysis and recommendations to make them as accurate as possible, therefore creating an evolving knowledge base for SMEs. 

Impact: The outcome resulted in the manufacturer, for the first time, being able to capture their engineers’ knowledge. This increased collaboration between engineers, improved standardization, and allowed valuable knowledge to be visible across the organization.

Improved Fleet Health & Management

Challenge: Manufacturers and equipment operators currently lack visibility into assets across their entire fleet, making it difficult to identify poorly performing assets and best performing assets. 

Solution: With NarrativeWave, asset performance can be evaluated near real-time, enabling organizations to better manage critical assets and plan for future actions, all by the click of a mouse.

Impact: The platform-wide view provides significant time-savings of tracking and managing fleet health for equipment manufacturers and operators. Additionally, the platform reduces unplanned downtime and helps organizations prevent critical equipment failures.

Improved Predictive Analytics & Maintenance

Challenge: Manufacturers and equipment operators are interested in deploying predictive models for better asset maintenance and warranty support. Pure machine learning approaches lack a solid foundational basis and can be difficult to implement successfully.

Solution: With the NarrativeWave Knowledge Base, key information such as the meaning of events, the relationship of sensors, and what constitutes a valid alarm are already known. By applying machine learning techniques to a solid NarrativeWave foundation, predictive analytics is more effectively implemented. 

Impact: This approach provides a strategic method of utilizing predictive analytics and improves the outcome of implementing analytics. The result is a highly accurate, auditable platform rather than a pure “black box” approach.

 

Read more…

 

Internet of Things (IoT) has generated a ton of excitement and furious activity. However, I sense some discomfort and even dread in the IoT ecosystem about the future – typical when a field is not growing at a hockey-stick pace . . .

“History may not repeat itself but it rhymes”, Mark Twain may have said. What history does IoT rhyme with?

 I have often used this diagram to crisply define IoT.

Even 10 years ago, the first two blocks in the diagram were major challenges; in 2017, sensors, connectivity, cloud and Big Data are entirely manageable. But extracting insights and more importantly, applying the insights in, say an industrial environment, is still a challenge. While there are examples of business value generated by IoT, the larger value proposition beyond these islands of successes is still speculative. How do you make it real in the fastest possible manner?

In slogan form, the value proposition of IoT is “Do more at higher quality with better user experience”. Let us consider a generic application scenario in industrial IoT.

IoT Data Science prescribes actions (“prescriptive analytics”) which are implemented, outcomes of which are monitored and improved over time. Today, humans are involved in this chain, either as observers or as actors (picking a tool from the shelf and attaching it to the machine).

BTW, when I mentioned “Better UX” in the slogan, I was referring to this human interaction elements improved by “Artificial Intelligence” via natural language or visual processing.

Today and for the foreseeable future, IoT Data Science is achieved through Machine Learning which I think of as “competence without comprehension” (Dennett, 2017). We cannot even agree on what human intelligence or comprehension is and I want to distance myself from such speculative (but entertaining) parlor games!

Given such a description of the state of IoT art in 2017, it appears to me that what is preventing us from hockey-stick growth is the state of IoT Data Science. The output of IoT Data Science has to serve two purposes: (1) insights for the humans in the loop and (2) lead us to closed-loop automation, BOTH with the business objective of “Do More at Higher Quality” (or increased throughput and continuous improvement).

Machine Learning has to evolve and evolve quickly to meet these two purposes. One, IoT Data Science has to be more “democratized” so that it is easy to deploy for the humans in the loop – this work is underway by many startups and some larger incumbents. Two, Machine Learning has to become *continuous* learning for continuous improvement which is also at hand (NEXT Machine Learning Paradigm: “DYNAMICAL" ML).

With IoT defined as above, when it comes to “rhyming with history”, I make the point (in Neural Plasticity & Machine Learning blog) that the current Machine Learning revolution is NOT like the Industrial Revolution (of steam engine and electrical machines) which caused productivity to soar between 1920 and 1970; it is more like the Printing Press revolution of the 1400s!

Printing press and movable type played a key role in the development of Renaissance, Reformation and the Age of Enlightenment. Printing press created a disruptive change in “information spread” via augmentation of “memory”. Oral tradition depended on how much one can hold in one’s memory; on the printed page, memories last forever (well, almost) and travel anywhere.

Similarly, IoT Data Science is in the early stages of creating disruptive change in “competence spread” via Machine Learning (which is *competence without comprehension*) based on Big Data analysis. Humans can process only a very limited portion of Big Data in their heads; Data Science can make sense of Big Data and provide competence for skilled actions.

 

To make the correspondence explicit, "information spread" in the present case is "competence spread"; "memory" analog is "learning" and "printed page" is "machine learning".

 

Just like Information Spread was enhanced by “augmented memory” (via printed page), Competence Spread will be enhanced by Machine Learning. Information Spread and the Printing Press “revolution” resulted in Michelangelo paintings, fractured religions and a new Scientific method. What will Competence Spread and IoT Data Science “revolution” lead to?!

From an abstract point of view, Memory involves more organization in the brain and hence a reduction in entropy. Printed page can hold a lot more “memories” and hence the Printing Press revolution gave us an external way to reduce entropy of “the human system”. Competence is also an exercise in entropy reduction; data get analyzed and organized; insights are drawn. IoT Data Science is very adept at handling tons of Big Data and extracting insights to increase competence; thus, IoT Data Science gives us an external way to reduce entropy.

What does such reduction in entropy mean in practical terms? Recognizing that entropy reduction happens for Human+IoT as a *system*, the immediate opportunity will be in empowering the human element with competence augmentation. What I see emerging quickly is, instead of a “personal” assistant, a Work Assistant: an individualized “machine learner” that enhances our *work* competence and will no doubt lead each of us to “do more at higher quality”. Beyond that, there is no telling what amazing things “competence-empowered human comprehension” will create . . .

I am no Industrial IoT futurist; in the year 1440, Gutenberg could not have foreseen Michelangelo paintings, fractured religions or a new Scientific method! Similarly, standing here in 2017, it is not apparent what new disruptions the IoT revolution will spawn that drop entropy precipitously. I for one am excited about the possibilities and surprises in store in the next few decades.

PG Madhavan, Ph.D. - “LEADER . . . of a life in pursuit of excellence . . . in IoT Data Science” 

http://www.linkedin.com/in/pgmad

This post originally appeared here.

Read more…

Quantifying IoT Insecurity Costs

Ever wonder what the real cost of IoT insecurity is?

Well, researchers at the University of California, Berkeley, School of Information recently published a report that attempts to lay out the costs to consumers in the context of DDoS attacks. The report focuses on exploiting vulnerable devices for their computing power and their ability to use their network’s bandwidth for cyberattacks—specifically DDoS attacks on Internet domains and servers.

The researchers infected several consumer IoT devices with the Mirai malware and measured how the devices used electricity and bandwidth resources in both uninfected and infected states. Their hypothesis: compromised IoT devices participating in a DDoS attack will use more resources (energy and bandwidth) and degrade the performance of a user’s network more than uninfected devices in normal daily operation.

Based on energy and bandwidth consumption, they developed a calculator to estimate the costs incurred by consumers when their devices are used in DDoS attacks. The costs of two recent and well-publicized attacks, plus one hypothetical attack, were calculated:

  • Krebs On Security Attack: According to their cost calculator, the total electricity and bandwidth consumption costs borne by consumers in this attack was $323,973.75.

  • Dyn, Inc. Attack: They calculate the total cost borne by consumers as $115,307.91.

  • "Worst-Case" Attack: This hypothetical “Worst-Case” scenario approximates the costs that could result if the Mirai botnet operated at its peak power using a UDP DDoS attack. The projected cost to consumers of this attack is $68,146,558.13.
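The shape of such a calculator is straightforward: multiply the per-device increase in energy and bandwidth consumption by local utility rates and by the size of the botnet. The sketch below illustrates the idea only; all rates and consumption deltas are illustrative assumptions, not the Berkeley report's actual figures.

```python
# Hedged sketch of a consumer-borne DDoS cost calculator, in the spirit of
# the Berkeley study. Every number here is an illustrative assumption.

def attack_cost(num_devices, extra_kwh_per_device, price_per_kwh,
                extra_gb_per_device, price_per_gb):
    """Total consumer cost: extra energy plus extra bandwidth across the botnet."""
    energy_cost = num_devices * extra_kwh_per_device * price_per_kwh
    bandwidth_cost = num_devices * extra_gb_per_device * price_per_gb
    return energy_cost + bandwidth_cost

# Example: 100,000 enslaved devices, each drawing an extra 0.5 kWh at
# $0.13/kWh and pushing an extra 30 GB at $0.03/GB over the attack window,
# which works out to roughly $96,500 borne by consumers.
cost = attack_cost(100_000, 0.5, 0.13, 30, 0.03)
print(f"${cost:,.2f}")
```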

Commenting on the study, Bob Noel, Director of Strategic Relationships and Marketing for Plixer, said, “Organizations with enslaved IoT devices on their network do not experience a high enough direct cost ($13.50 per device) to force them to worry about this problem. Where awareness and concern may gain traction is through class action lawsuits filed by DDoS victims. DDoS victims can suffer financial losses running into the millions of dollars, and legal action taken against corporations that took part in the distributed attack could be a mechanism to recuperate losses. Companies can reduce their risk of participating in DDoS attacks in a number of ways. They must stop deploying IoT as trusted devices, with unfettered access. IoT devices are purpose-built with a very narrow set of communication patterns. Organizations should take advantage of this and operate under a least privilege approach. Network traffic analytics should be used to baseline normal IoT device behavior and alarm on a single packet of data that deviates. In this manner it is easy to identify when an IoT device is participating as a botnet zombie, and organizations can remediate the problem and eliminate their risk of being sued.”
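The least-privilege approach Noel describes can be sketched in a few lines: because an IoT device's legitimate traffic is so narrow, a simple allow-list learned during a baseline window is enough to flag a single deviant packet. This is a minimal illustration of the concept; the device names and flows are hypothetical, and a production system would sit on real flow telemetry.

```python
# Minimal sketch of least-privilege baselining for IoT traffic: learn each
# device's narrow set of (destination, port, protocol) flows during a
# trusted baseline period, then alarm on any packet outside that set.

from collections import defaultdict

class IoTBaseliner:
    def __init__(self):
        self.allowed = defaultdict(set)  # device -> set of observed flows

    def learn(self, device, dst, port, proto):
        """Record a flow seen during the trusted baseline window."""
        self.allowed[device].add((dst, port, proto))

    def check(self, device, dst, port, proto):
        """True if the packet matches the baseline; False means alarm."""
        return (dst, port, proto) in self.allowed[device]

b = IoTBaseliner()
b.learn("camera-01", "cloud.vendor.example", 443, "tcp")
print(b.check("camera-01", "cloud.vendor.example", 443, "tcp"))  # True: normal telemetry
print(b.check("camera-01", "203.0.113.9", 53, "udp"))            # False: possible botnet traffic
```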

Or, as we've argued before, regulation is key. And now that we have an economic cost for IoT insecurity, we have better information for regulators to pursue strategies and legislation for enforcing workable security standards to reduce the negative impacts of insecure IoT devices on society.

 

 

 

Read more…

IoT Fight

Here's the latest IoT Central Digest. Encourage your friends and colleagues to be a part of our community by forwarding this newsletter to them. They can join IoT Central here. You can contribute your thoughts on IoT here.  

Featured Resources and Technical Contributions

 

Source for picture: contribution marked with a +

From our Sponsors

Read more…

Surprise! Operations and IT aren't getting along when it comes to IoT.

451 Research announced new survey results showing that operational technology (OT) and IT stakeholders are not aligned on IoT projects. It sure will be harder to drive business results if this doesn't get fixed. Here are some key findings:

Research shows that IT and OT personnel are not well aligned on IoT initiatives, and they need to cross that divide for those enterprise IoT projects to prove viable.

  • Only one-third of OT respondents (34%) said they ‘cooperate closely with IT’ on IoT projects from conception to operations.
  • While only a relatively small group of respondents said they were in ‘active conflict’ with IT over IoT, OT professionals are four times more likely than their IT counterparts to characterize the relationship that way.  
  • More than half (55%) of the OT survey respondents currently deploy IoT within their organization, and 44% have successfully moved those projects from proof of concept to full-scale deployment.
  • New operational efficiencies and data-analytics capabilities are driving successful projects; however, many IoT projects face roadblocks in the trial stage due to the IT and OT divide and budget, staff, and ROI concerns. 

Additional details are in the graphic below. Want the full findings? 451 Research will happily sell them to you.

 

Read more…

Counterfeit Menace

Here's the latest IoT Central Digest. Encourage your friends and colleagues to be a part of our community by forwarding this newsletter to them. They can join IoT Central here. You can contribute your thoughts on IoT here.  

Featured Resources and Technical Contributions

Source for picture: contribution marked with a +

From our Sponsors


Follow us on Twitter | Join our LinkedIn group | Members Only | For Bloggers | Subscribe

Read more…

IoT Survival Guide

Here's the latest IoT Central Digest. Encourage your friends and colleagues to be a part of our community by forwarding this newsletter to them. They can join IoT Central here. You can contribute your thoughts on IoT here.  

Featured Resources and Technical Contributions

 

Source for picture: contribution marked with a +

From our Sponsors



Read more…

As demand for location services in all areas of the Internet of Things (IoT) grows, so too has the requirement for precision location. For many applications, especially those that need to scale to cover large areas, providing “proximity zone” types of location is simply not accurate enough. That means the old way of determining location—primarily using Bluetooth beacons—is no longer sufficient.

Bluetooth beacons have been the go-to solution for determining location for years, but they have three limiting factors:

  • Beacons only work with smartphones, not tags, which limits how they can be used
  • At best, they can locate objects within 3-4 meters, which is fine for determining a general location but is not refined enough to meet the requirements of many of today’s applications
  • Beacons are battery-operated, which impacts their ability to deliver real-time location; frequent transmissions drain the device’s battery, meaning frequent replacements are necessary

The shortcomings of beacons and other location technologies that rely on smartphones have spawned an industry shift to a more network-centric approach, with the intelligence moving to the receiver antenna and a centralized software application rather than residing in a smartphone app. That, in turn, has launched the development of a wide range of active, low-cost Bluetooth Low Energy (BLE) tags with long battery life and optional on-board sensors.

Another shift occurring is a change in how signals from these tags are measured to determine location. The traditional method—using signal strength to estimate location—does not take into consideration how the signal will be impacted by its environment.  While a weak signal could indicate an object is far away from a beacon, it’s also possible a physical object, such as a concrete pillar or wall, is impacting the signal. 
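The weakness of signal-strength ranging falls out of the standard log-distance path-loss model used to convert received power into distance: a shift of a few dB, whether from distance or from a concrete pillar, maps to meters of error. The sketch below illustrates this; the constants (transmit power at 1 m, path-loss exponent) are typical textbook values, not figures from any specific beacon product.

```python
import math

# Why RSSI-based ranging is coarse: the log-distance path-loss model
# inverts received power into distance, but obstacles shift RSSI by many
# dB, which reads as meters of phantom distance. Constants are illustrative.

def rssi_to_distance(rssi_dbm, rssi_at_1m=-59.0, path_loss_exp=2.0):
    """Invert the log-distance model: d = 10^((P_1m - RSSI) / (10 * n))."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

print(rssi_to_distance(-69.0))  # free space: ~3.2 m
# The same 10 dB drop caused by a concrete wall rather than distance
# would wrongly read as the object being roughly 3x farther away.
```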

Two new approaches are emerging for BLE angle estimation. The first is based on the signal’s Angle of Arrival (AoA): the precise direction of the incoming signal relative to the receiver’s antenna array. With AoA, multiple antennas within the same device measure the signal, allowing the receiver to locate a tag or smartphone with an accuracy of 10 to 20 centimeters, not meters.
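The physics behind AoA can be shown in a few lines: with two antennas a known distance apart, the phase difference of the arriving carrier yields the angle via sin(theta) = (delta_phi * lambda) / (2 * pi * d). This is an illustrative sketch of the principle only, not Quuppa's or the Bluetooth specification's algorithm; the wavelength is a rough figure for BLE's 2.4 GHz band.

```python
import math

# Sketch of the AoA principle: the phase difference of the carrier wave
# between two antennas a known distance apart encodes the arrival angle.

def angle_of_arrival(delta_phi_rad, antenna_spacing_m, wavelength_m=0.125):
    """Angle of arrival (radians) from the inter-antenna phase difference."""
    s = (delta_phi_rad * wavelength_m) / (2 * math.pi * antenna_spacing_m)
    return math.asin(max(-1.0, min(1.0, s)))  # clamp for measurement noise

# Half-wavelength spacing (~6.25 cm) with a quarter-cycle phase shift:
theta = angle_of_arrival(math.pi / 2, 0.0625)
print(f"{math.degrees(theta):.1f} degrees")  # 30.0 degrees
```

Real arrays use many antennas and more robust estimators, but the extra elements refine exactly this phase-difference measurement.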

The second approach considers the signal’s Angle of Departure (AoD). Here, the location intelligence moves back to the mobile device. AoD works like “indoor GPS”: the fixed infrastructure devices (aka Locators) only broadcast and are not aware of the receiving devices, much as GPS satellites work. This means an unlimited number of devices can be located, with no privacy issues. 

As the use cases for indoor location services continue to grow, with every industry from manufacturing and logistics to healthcare and retail, to law enforcement and beyond clamoring for more precision, new approaches beyond Bluetooth beacons need to be considered. The AoA and AoD methodologies are quickly gaining momentum as the next generation of location technology.

Guest post by Antti Kainulainen, CTO & co-founder of Quuppa. Before Quuppa, he was with the Nokia Research Center (NRC) from 2005 to 2012, where he was the lead engineer on several projects related to indoor positioning. He also took part in the standardization work of Bluetooth wireless technology. Antti received his M.Sc. degree in technology from Helsinki University of Technology in 2007. He has 16 granted patents and 22 pending patent applications. More at www.quuppa.com

 

Read more…

Farm to Fork IoT

Here's the latest IoT Central Digest. Encourage your friends and colleagues to be a part of our community by forwarding this newsletter to them. They can join IoT Central here. You can contribute your thoughts on IoT here.  

Featured Resources and Technical Contributions

Source for picture: contribution marked with a +

From our Sponsors



Read more…