


Here's a quick list of the 6 Best Online Resources for Embedded Firmware. Enjoy!

  1. Reverse Engineering Stack Exchange: "Reverse Engineering Stack Exchange is a question and answer site for researchers and developers who explore the principles of a system through analysis of its structure, function, and operation."
  2. The Ganssle Group: We love Jack Ganssle! He writes, "I'm on a mission to help embedded developers produce better products faster. My newsletter, seminars and the 1200+ articles on this site all give better ways to build embedded products, while maximizing the fun of engineering hardware and firmware."
  3. The Embedded Systems Conference: "Experience the industry's largest, most comprehensive technical conference for embedded systems professionals. Connect with top-level engineers, developers, and decision makers at the forefront of driving embedded systems design. Showcase your latest software design innovation, hardware breakthroughs, hottest IoT solutions, and demo your services in-person to hundreds of attendees with active projects."
  4. Embedded Computing Design: "For the past 30+ years OpenSystems Media (formerly OpenSystems Publishing) has focused solely on the embedded computing market. OpenSystems Media offers balance: taking not only a broad, encompassing look at trends and technologies, but also focusing on certain solutions in-depth."
  5. Embedded.fm Podcast: "Embedded.fm is a site dedicated to the many aspects of engineering. We talk about the how, why, and what of engineering, usually devices. The site includes a weekly audio show created and hosted by Elecia White and Christopher White. Our guests include makers, entrepreneurs, educators, and normal, traditional engineers. The site also includes a blog written by Elecia White, Christopher White, Andrei Chichak, and Chris Svec."
Read more…

A few years ago, the idea of a “Telco in a Box” was very common in the telecommunications industry. Basically, it was a pre-integrated, turnkey real-time billing and customer care solution that enabled communications service providers (CSPs) to accelerate their growth strategies and increase profitability.

Companies like Accenture, Oracle, Redknee and Tech Mahindra aimed this concept at Mobile Virtual Network Operators (MVNOs), Tier 3 operators and Tier 1 sub-brands. The benefits of this solution were clear:

  • A low-risk, quick to launch turnkey solution
  • Go to market faster than competitors

It was only a matter of time before this marketing slogan reached the Internet of Things (IoT). And so it has, for now with little noise, but we will certainly see much more "IoT in a Box" in the coming months.

What is IoT in a Box, and what's in the box?

Today we could say that IoT in a Box is:

  • A pre-configured, fully integrated, enterprise-enabled IoT bundle optimized for IoT processing (Telco view)
  • All the required building blocks to develop a wireless IoT system (IoT Vendor view)

In the first case, the IoT in a Box must include some of the following components, depending on the application:

  • Hardware / Hardware as a Service
    • 1 or more battery-powered modules with sensors for monitoring, for instance, temperature, humidity, geo-location, movement, vibration, battery level or signal strength
    • 1 or more relay switches or actuators
    • 1 GSM chip (SIM) per module with a data plan
    • IoT gateway
  • Software / Software as a Service
    • Device management
    • Enterprise database with storage plan
    • Secure connectivity
    • Pre-configured dashboards
    • Pre-configured thresholds and alerts (see the sketch after this list)
    • Mobile app
  • Services / Services as a Service
    • Professional services (optional)
    • Support (basic included, premium optional)
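
To make “pre-configured thresholds and alerts” concrete, here is a minimal, hypothetical sketch in Python; the Rule class, the example thresholds and the alerts_for() helper are invented for illustration and are not tied to any vendor's product.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rule:
    sensor: str
    low: Optional[float] = None    # alert if the reading drops below this
    high: Optional[float] = None   # alert if the reading rises above this
    message: str = ""

# Examples of "pre-configured" rules that could ship with a box
RULES = [
    Rule(sensor="temperature", high=8.0, message="Cold-chain temperature too high"),
    Rule(sensor="battery_level", low=20.0, message="Module battery running low"),
]

def alerts_for(sensor: str, value: float) -> list:
    """Return the pre-configured alert messages triggered by one sensor reading."""
    triggered = []
    for rule in RULES:
        if rule.sensor != sensor:
            continue
        if (rule.low is not None and value < rule.low) or \
           (rule.high is not None and value > rule.high):
            triggered.append(rule.message)
    return triggered

print(alerts_for("temperature", 9.5))    # ['Cold-chain temperature too high']
print(alerts_for("battery_level", 12))   # ['Module battery running low']
```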

When you receive your IoT in a Box, all you must do is:

  1. charge your modules
  2. place them on (or in) things,
  3. log in to your own org to name your modules, and then
  4. turn on your modules. As soon as you activate a module, it starts to send sensor data, and you can start monitoring your things in near real time - online or using the mobile app.

“The concept behind a basic ‘IoT in a Box’ is that it takes you less than one hour to set up your own IoT system.”

In the second case, the IoT in a Box must include a Development Kit and all required building blocks to develop a wireless IoT system. We will see some examples later.

What if I want to expand the capabilities of my IoT application?

Although IoT in a Box is aimed at solving a simple business need, in certain scenarios or industries it may be necessary to extend the capabilities included in the box. In this regard, vendors must provide accessories, expansion modules, I/Os and peripherals, multi-standard connectivity options, and additional pre-configured dashboards and alerts depending on the industry and application.

Selling IoT in a Box

When I wrote Welcome to the first “Selling IoT” Master Class!, I did not emphasize selling IoT to the Small and Medium Business (SMB) and consumer markets. Yet the main objective that vendors pursue with “IoT in a Box” is precisely to increase sales in the SMB market. This is a huge market and vendors need a way to scale through channel partners, but as I do not consider myself an expert in selling to SMBs, I look forward to your advice.

Is IoT in a Box already in the market?

Due to confidentiality agreements, I cannot include information from the various vendors that will be selling IoT in a Box very soon. But we can already find some examples of IoT in a Box on the market. Below are some of them, based on public information.

T-Mobile IoT in a Box - With the T-Mobile IoT Box, you can implement your own M2M application with little effort. Connect your devices and sensors and transfer the collected data to a cloud system over the mobile network. A data interface provides processing and integration of the information into other systems, websites, or apps. The T-Mobile IoT Box consists of a developer board with an integrated M2M SIM card, several inputs/outputs and a Bluetooth Smart interface, an online portal and a RESTful API.

T-Mobile US – IoT promotion for device makers - Building on its movement into the internet of things (IoT) market, T-Mobile US announced a new IoT-specific pricing model as part of a promotion that includes a free Cat1 LTE module along with data services.

T-Mobile US, SVP Doug Chartier said: “The wireless industry needs simpler options for IoT to take off, and that’s exactly what we’re delivering.”

Telia M2M in a Box - M2M technology easy and affordable for any business. Telia M2M in a Box gives you a set of hardware with sensors providing you with real time information about position, movement and climate, which you can monitor directly in the web portal. A versatile and user-friendly measurement tool to observe, monitor and protect your business remotely.

Capgemini IoT-in-a-Box is a rapid, low-cost, low-risk method to pilot an IoT strategy and to test and define business cases; it provides a pre-configured, enterprise-ready IoT system for monitoring up to 25 devices. It simplifies the task of aligning, integrating and configuring all IoT components to provide rapid time to value.

IBM - The Intelligent Building – IoT Starter Kit (Enterprise Edition) is an out–of-the-box IoT solution for Intelligent Buildings. The kit provides seamless integration of the EnOcean Smart Gateway with the Watson IoT Platform.

Relayr - Industrial Grade Starter Kit for IoT Acceleration, powered by relayr, Intel, Dell and Bosch.

Microsoft – Solair IoT in a Box was an IoT plug-and-play kit to connect things, sensors and machines to a gateway and then, in a few clicks, instantly visualize data in the Solair application. After the acquisition of Solair, Microsoft has probably discontinued this offering.

Bosch - Bosch IoT Starter Kits come with pre-configured XDK devices plus cloud connectivity. It is as out-of-the-box as it could be!

HPE - HPE Uncorks IoT In A Box - Called (at least by Hewlett Packard Enterprise) the ‘industry’s first converged systems for the IoT’, the Edgeline EL1000 and Edgeline EL4000 systems ‘integrate data capture, control, compute and storage to deliver heavy-duty analytics and insights at the edge to enable real-time decision making.’

Electric Imp - IoT QuickStart by Electric Imp - Electric Imp’s IoT QuickStart Family is designed to help you cut the time to build, test and prototype complex IoT solutions all while maintaining industrial-strength security, scalability and control. Based on reference designs that Electric Imp experts have developed over the past five years, the IoT QuickStart Family appliances represent the most frequently requested secure connectivity and device prototype solutions, each delivered in a fraction of the time and cost required by custom-built solutions.

Creator Ci40 IoT Developer Kit - The Creator Ci40 board is a high-performance, low-power microcomputer that packs a cXT200 chip based on a subsystem optimized by us specifically for IoT applications. The cXT200 SoC includes a dual-core, dual-threaded MIPS CPU clocked at 550 MHz and an Ensigma connectivity engine that covers super-fast 802.11ac 2×2 MIMO Wi-Fi and low-power Bluetooth/Bluetooth low energy (Classic and Smart). See also: Imagination Launches ‘IoT In A Box’ Kickstarter, and Build a home IoT irrigation system with an ‘IoT-in-a-box’ kit.

Nextcloud Box – a private cloud and IoT solution for home users – from Nextcloud, Canonical and WDLabs. Nextcloud Box makes hosting a personal cloud simple and cost effective whilst maintaining a secure private environment that can be expanded with additional features via apps. The Nextcloud Box consists of a hard drive and a case, complemented by a Raspberry Pi 2 or a similar credit-card sized computer. The pre-configured, easy-to-use platform handles file storage and syncing, communication and more, requires no maintenance and enables users to install more functionality through apps like Spreed, OpenHab and Collabora Online. The box offers 1TB of storage at the price point of Eur 70. For information on where to buy please visit nextcloud.com/box.

WIKON – My M2M BOX – Our special expertise lies in the compliance with industrial standards for our product developments and the development of powerful embedded hardware and software. Special developments for explosion zones, adverse environmental conditions, IP-68 standards and extended temperature ranges are frequently in demand.

Mobica collaborates with Advantech to develop a complete IoT Solution - Mobica, a Silver member of Oracle Partner Network (OPN) and a global provider of leading-edge software engineering, testing and consultancy services, developed a solution which aggregates data from a variety of sensors and sends it to the Oracle Internet of Things Cloud Service for analysis and integration. Mobica used an Advantech UTX-3115 IoT gateway and an M2.COM-based WISE-1520 low-power Wi-Fi IoT node for sensor input.

The ThingBox Project - Use Internet of Things technologies without any technical knowledge and for free.

Eight best IoT starter kits: The best internet of things developer kits –

Imagination launches IoT kit – “IoT in a Box”: http://misteriotcom/2015/11/24/imagination-meluncurkan-kit-iot-iot-in-a-box/

There are many IoT vendors that offer devices, IoT platforms, apps and services bundled with the same purpose as IoT in a Box: to democratize the IoT.

IoT in a Box and IoT Marketplaces

As we know, “IoT is not only about connecting things, nor about controlling things”; it is about things becoming more intelligent so that companies can offer new services under new business models. I believe that IoT marketplaces will play a key role in the evolution of IoT in a Box. We already have some examples:

Libelium – The IoT Marketplace is a one-stop, click-and-buy online store. The company helps frustrated companies with pre-integrated solutions, from choosing the right hardware and cloud components to the application.

Telus IoT Marketplace – Connect the things that matter to your business by leveraging connected devices provided by their partner network.

ThingWorx Marketplace – gives easy access to everything you need to build and run your ThingWorx-based IoT application. All components listed on the ThingWorx Marketplace are customized, tested and guaranteed to work with the ThingWorx platform.

Intel IoT marketplace – Coming soon.

“IoT in a Box solutions that encompass infrastructure, networking, analytics, service enablement and monetization to connect devices, expose data, services and processes to applications, consumers and machines will be the foundation for IoT marketplaces”.

IoT Service in a Box, the logical evolution of IoT in a Box 

I believe that the logical evolution of IoT in a Box will be IoT Service in a Box, sold through IoT marketplaces. It is only a matter of time before we see:

  • Predictive Maintenance in a Box as a Service
  • Loss Prevention in a Box as a Service
  • Asset Location in a Box as a Service
  • Predictive Intrusion in a Box as a Service
  • Vending Machine Product Recommendation in a Box as a Service
  • Real-Time Micro-Inventory in a Box as a Service
  • Customer Emotion in a Box as a Service
  • … Your imagination is the limit


Read more…

As more and more companies jump on the IoT bandwagon, lured by its future business potential, value realization from IoT technologies remains as elusive as ever. If one takes into account the enormous amounts enterprises have spent to date on IoT products and solutions, and compares this with the new business opportunity IoT has generated so far, the investment can be summed up with the tagline “Chasing Millions with Billions”.

While this is true of most emerging technologies, and dominating a technology market more often than not becomes a battle of investments as companies outbid each other for acquisitions and market share, IoT leadership will be far more challenging than anything seen before.

So what is the curious case of IoT? Well, for starters, winning IoT leadership will determine the future existence of many companies. We will witness the demise of many companies and the rise of new giants by the time the dust settles. More than two years have elapsed since IoT became part of every boardroom discussion, and the battle has now moved on from strategy to execution.

With the universal appeal that comes from adding the adjectives “smart”, “connected” and “digital” to all the products and services we use or will use in the future, the IoT technology space has slowly morphed into a “battle of the platform”. Consortia of companies are aligning with each other and placing their bets on platform leadership.

So what is the problem with this shift? The very nature of platform development, encompassing the needs of applications for smart homes, smart cities, connected vehicles and products spanning domains such as healthcare, retail and manufacturing, means an enormous backlog for development of the “new” platform. IoT platform development is thus hit by the bane of the trinity: delivering full scope, on budget and on time for scalable market adoption. Estimates for building such a multi-purpose, all-encompassing platform with a full feature set would set any product development organization back a few billion dollars.

An additional complexity is the timeline for market availability of such a feature-complete platform, especially considering that it must ingest streaming data from thousands of disparate devices across multiple network protocols in real time. Above and beyond this, securing all endpoints and protecting devices from potential hacking would surely add several million dollars to the cost.

So what is the challenge with platform development? The problem is the very nature of the IoT market: universal appeal and low price points. Markets with such universal appeal can usually accommodate two or three players at most, so the many competing platforms now in development and spending big dollars face a high potential of failure. As more announcements are made and more investment pours in, the war for IoT supremacy will only get bloodier. The very nature of the digital market, which ensures a “winner takes all” outcome, is both the lure and the source of agony.

What does chasing Millions with Billions imply? As transaction volumes increase, transaction values decrease dramatically, and with smaller per-capita spending by end users, ROI calculations push the break-even date far into the future. The net present value of future cash flow projections, with diminished order sizes for the next few years, could at best accrue in the millions, while the upfront investment required to win IoT leadership would be on the order of billions. A more detailed analysis of the IoT economic perspective is presented in this previous series (http://bit.ly/2a2sfcq). Generally, the bigger the stakes at the end, the fiercer the competition becomes, and IoT will witness one of the longest-running investment wars for supremacy. While the winner will definitely take all, the pain for the competition will be intense. Many will drop out of the race in the short term due to lack of funding or a cash crunch, while a few giants with deep pockets will continue to wrestle on.

So will your strategy be the best? Will you leapfrog the paradox of earning millions with billions and come out as the eventual winner? And on which side of the competition will you stand when this IoT leadership war is over?

In the next series I will provide more recommendations for solving the curious case of IoT platform leadership. Please drop in your comments.

Note: This article is an independent view and presents the IoT story from a vendor-neutral perspective.

Read more…

If IoT is the Meteor, is OT the Dinosaur?

Today’s digital transformation of business and government will have the same effect. It will make short work of any organization that does not evolve rapidly. CEOs must quickly define where their organizations can compete for success, and lead them on that journey. If they can’t—or won’t—change, they risk fading away like the dinosaurs.
Read more…

In this article, I'll continue introducing AggreGate IoT Platform-based products. In 2010, two years after the AggreGate Network Manager release, we started the AggreGate SCADA/HMI project ‒ a fourth-generation SCADA system.

So what is fourth-generation SCADA? 

Wikipedia suggests the following definitions:

  1. First-generation SCADA systems are monolithic systems developed before Internet access became widespread. Such systems are no longer in operation.
  2. Second-generation SCADA solutions operate within the enterprise's local network. They employ IP networks to connect controllers, data collection servers, supervisory servers, and operator workstations.
  3. Third-generation SCADA architecture enables coordination of geographically distributed automatic process control systems. These systems span multiple manufacturing sites and remote monitored objects. Until recently, third-generation SCADA was the cutting edge, offering the ability to launch HMIs in mobile device browsers, edit projects remotely right on the production server, and test without shutting the server down or copying project files.
  4. Finally, fourth-generation SCADA should fit the Internet of Things. It implies decentralization and unification to a greater extent, i.e. the possibility of shifting the algorithm execution point between SCADA servers and controllers. Another indispensable feature is operation over cellular and satellite networks without a VPN (controllers with no static IP address can connect to SCADA servers operating in a cloud).

Naturally, every SCADA vendor evolves their products from generation to generation, while the previous versions stagnate because they are not compatible with the latest trends.

IoT Platform-based AggreGate SCADA/HMI (http://aggregate.tibbo.com/solutions/scada-hmi.html) has inherited all functions of fourth-generation SCADA:

  • Built on the Java platform, the system runs perfectly on Linux, which allows the SCADA core to run on embedded systems. Our OEM partners supply systems built on Raspberry Pi, BeagleBone Black and similar low-priced microcomputers. In addition, the SCADA core can access IP communications, serial ports, as well as discrete and analog inputs, etc.
  • The same solution operating on regular servers provides centralized data collection and HMI handling. Servers based on the unified architecture establish peering relations for data interchange with PLCs.
  • The system is fully compatible with all Tibbo programmable controllers and modules.
  • HMIs can be launched on Linux or Windows PCs and touch panels, or opened in web browsers.
  • There are no such concepts as “development environment” or “runtime environment” in our solution. Development is carried out via remote connection right on a production server, subject to role-based permissions. In addition, there are many ways of cloning the whole project or its parts. Platform capabilities for designing reference projects and derived products will be described in a separate article.
  • The AggreGate Platform is tailored to work with M2M devices: controllers can initiate connections to the server on their own. In our terminology, such controllers that connect to the server themselves are called agents.

 

There is still a question left: why have we developed another SCADA? The international market is saturated with such solutions.

The point is that AggreGate SCADA/HMI, as an AggreGate Platform add-on, is technically a set of drivers for data collection plus typical HMI vector images. All features necessary for SCADA are AggreGate Platform components: the GUI (widget) builder, report editor, alert and event control tools, tag modeling system, failover clustering technology, SDK with DDK, etc.

Our investment in SCADA system development was not so great compared to developing such a system from scratch. To implement industrial and building automation projects, we developed drivers for standard process control protocols (Modbus, OPC, OPC UA, BACnet, DNP3, etc.) and designed several thousand vector images.

Along with standard SCADA system functions, the AggreGate Platform adds exceptional features, for instance:

  • Statistics storage in Round-Robin Database (RRD) and NoSQL database (BigData)
  • Unlimited horizontal and vertical system scaling based on AggreGate distributed architecture
  • Data collection and control via both IT monitoring protocols (SNMP, FTP, JMX, SSH, WMI) and generic ones (SQL, SOAP, CORBA, LDAP…).

These features allow you to apply the system in many projects that are not typical for SCADA solutions. AggreGate SCADA/HMI, in particular, is used for manufacturers' fleet telemetry, MES replacement, and cell tower and data center engineering infrastructure monitoring (included in the AggreGate Data Center Supervisor solution).

In terms of AggreGate architecture and the project building concept, AggreGate SCADA/HMI resembles most other AggreGate-based products. A typical project development cycle includes:

  • Deploying a server or several servers in a failover configuration
  • Connecting to storage, which can be either a standard relational DBMS or the integrated Apache Cassandra DBMS capable of saving tens of thousands of tags per second
  • Connecting controllers and other data sources (e.g. external databases), configuring tag polling period
  • Configuring automated tag processing algorithms on a server side. These can be models determining additional calculated tags, alerts delivering e-mail and SMS notifications, schedules for performing certain jobs, etc.
  • Developing HMIs, dashboards, and navigation between them
  • Setting user roles, access, and external authentication via LDAP/AD configuration.

Running on Linux, the AggreGate server can collect data from OPC servers running on Windows. This is implemented over the IP network using the DCOM protocol. As a result, there is no longer any need to install the SCADA server and the OPC server on the same computer.

There are no such notions as “project”, “development environment”, and “runtime environment” in AggreGate SCADA/HMI. According to its concept, a single primary server is installed at a worksite. During the initial deployment phase, system engineers can connect to the server locally or remotely to develop HMIs, create PLC and user accounts, set up data storage, and so on. After this phase, the same server is used during commissioning and then on a regular basis, although migrating the system to another server is possible and simple.

The unified environment makes it possible to introduce modifications into the production server without any interruptions. To do so, one should:

  • Make temporary copies of one or two system components (for example, HMIs or alerts)
  • Introduce changes in the copy and test them
  • Replace the original component with the successfully modified copy.

One of the vital parts of a SCADA system is the GUI Builder. Inherited from the AggreGate Platform, the GUI Builder assists in drawing and animating HMIs containing both simple components (buttons, captions, text fields, lists, etc.) and complex ones (tables, multi-layer panes, tabbed panes, charts, geographical maps, dynamic SVG images, video windows, etc.).


Even though the AggreGate GUI Builder is similar to other editors of this kind, it has an outstanding feature. Alongside the standard absolute layout of visual components, any pane can use a grid layout similar to an HTML table. Furthermore, in a complex form with multiple panes (simple, multi-layer, tabbed, split panes), every pane can employ either an absolute or a grid layout.

Grid layout allows designing HMIs, data input forms, and dashboards that seamlessly adjust to any screen resolution. With absolute layout, components are scaled proportionally, so component height also increases, which leads to unacceptable results for almost all forms and dialogs.

HMIs are animated through bindings that copy data between server object properties and visual component properties in response to server and HMI events. The AggreGate expression language helps apply any operations to the replicated data on the fly (processing numbers, strings, dates and times, tables, etc.).

Any data processed by AggreGate can be utilized for reporting. Expression builder and integrated SQL-like query language help retrieve necessary indicators, and the system creates the optimal template for their visual representation. After this, you can customize the template using the report builder.

As for KPIs, you can configure alerts raised in response to critical object state events or detected event chains. The system sends alert notifications in almost any way (popup windows, sound notifications, e-mail messages, SMS). Automatically launched corrective actions can run either autonomously or under operator control. The alert module supports other typical industrial control features: flapping detection, hysteresis, prioritization, acknowledgement, escalation, etc.
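
Hysteresis is easier to show than to describe. The snippet below is a generic Python illustration of hysteresis-based alert evaluation, not AggreGate's actual implementation; the class name and thresholds are made up for the example.

```python
class HysteresisAlert:
    """Raise an alert above an upper threshold, clear it only below a lower one.

    The dead band between the two thresholds suppresses "flapping" around a
    single limit.
    """
    def __init__(self, raise_at: float, clear_at: float):
        assert clear_at < raise_at, "clear threshold must be below raise threshold"
        self.raise_at = raise_at
        self.clear_at = clear_at
        self.active = False

    def update(self, value: float) -> bool:
        """Feed one sample; return True while the alert is active."""
        if not self.active and value >= self.raise_at:
            self.active = True           # raise the alert
        elif self.active and value <= self.clear_at:
            self.active = False          # clear only after the value falls back
        return self.active

alert = HysteresisAlert(raise_at=80.0, clear_at=75.0)
for sample in [78, 81, 79, 76, 74, 80.5]:
    print(sample, alert.update(sample))
```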

AggreGate SCADA/HMI automates industrial processes, displays all necessary data in the operator center, provides visualization, saves information into a database, and creates reports ‒ in fact, everything that is expected from SCADA. The system promptly analyzes technological process efficiency and takes important decisions on its optimization, i.e. it partially performs MES software functions.

Usually, there are several SCADA installations operating simultaneously at a large enterprise. Every installation has its own function in a certain workshop. The systems are logically bound by the production chain, so their integration and automated KPI transmission to the MES/ERP levels are required. In the AggreGate ecosystem, this is carried out by exchanging parts of the unified data model between servers with the help of the distributed architecture (http://aggregate.tibbo.com/technology/architecture/distributed-architecture.html).

It often happens that within a single site or project it is necessary to implement not only SCADA but also an IT infrastructure management system, building automation, physical access control, automatic metering of power consumption, and other solutions in various combinations. AggreGate implements all of these features within one installation, with the ability to combine modules on a single server. Where might you come across this? For example, in data centers, where active networking equipment, climate sensors, UPSs, diesel generators, air conditioners, water-cooling systems, personnel access, and time and attendance must be monitored. More examples: cell towers, where the radio-relay equipment of the transport network, sector antenna parameters, intrusion detection sensors, and other systems must be monitored. In large warehouses, it is vital to monitor personnel access, loader activity, and the ventilation and lighting systems. Almost any large-scale site can benefit from merging its various monitoring and management systems.

In upcoming articles, we will describe the distinguishing features of our SCADA solution, various industrial automation problems and their solutions, as well as newsworthy projects we've taken part in.

Victor Polyakov, Managing Director, Tibbo Systems

Read more…

Guest blog post by Sandeep Raut



Digital transformation is helping all corners of life, and healthcare is no exception.
When patients are discharged from the hospital, they are given verbal and written instructions regarding their post-discharge care, but many of them are readmitted within 30 days for various reasons.
Over the last five years this 30-day readmission rate has been almost 19%, with over 25 billion dollars spent per year.

In October 2012 the Centers for Medicaid and Medicare Services (CMS) began penalizing hospitals with the highest readmission rates for health conditions like acute myocardial infarction (AMI), heart failure (HF), pneumonia (PN), chronic obstructive pulmonary disease (COPD) and total hip arthroplasty/total knee arthroplasty (THA/TKA).


Various steps can help reduce readmissions:

  • Send the patient home with 30-day medication supply, wrapped in packaging that clearly explains timing, dosage, frequency, etc
  • Have hospital staff make follow-up appointments with patient's physician and don't discharge patient until this schedule is set up
  • Use Digital technologies like Big Data & IoT to collect vitals and keep up visual as well as verbal communication with patients, especially those that are high risk for readmission. 
  • Kaiser Permanente & Novartis are using Telemedicine technologies like video cameras for remote monitoring to determine what's happening to the patient after discharge
  • Piedmont Hospital in Atlanta provides home care on wheels like case management, housekeeping services, transportation to the pharmacy and physician's office          
  • Use of Data Science algorithms to predict patients with high risk of readmission
  • Walgreens launched WellTransitions program where patients receive a medication review upon admission and discharge from hospital, bedside medication delivery, medication education and counseling, and regularly scheduled follow-up support by phone and online.
  • HealthLoop is a cloud based platform that automates follow-up care keeping doctors, patients and care-givers connected between visits with clinical information that is insightful, actionable, and engaging.
  • Propeller Health, a startup company in Madison, has developed an app and sensors that track medication usage and then send time and location data to a smartphone
  • Mango Health for iPhone and wearables like the Apple Watch makes managing your medications fun, easy, and rewarding. App features include: dose reminders, drug interaction info, a health history, and best of all - points and rewards, just for taking your medicines.
These emerging digital tools enable health care organizations to assess and better manage who is at risk for readmission and determine the optimal course of action for the patients.

Such tools also enable patients to live at home, in greater comfort and at lower cost, easing the burden on themselves and their families.
Digital is helping mankind in every way!
Read more…

Interactive Map of IoT Organizations -- TAKE 2

I am excited to launch the 2nd version of my Interactive Map of IoT Organizations. Thanks for all the support and encouragement from David Oro!

https://www.diku.ca/blog/2016/12/04/interactive-map-of-iot-organizations-take-2/

Here are the material changes from the first version:

  1. Each organization now has its specific address instead of being city-based
  2. The map now includes the Founder(s) of each organization and a link to more information about them. This is in addition to the “Founded” year, which was in the first version
  3. Cleanup of categories. Folks are still trying to determine what it means to be an IoT Platform. For me, it's most important to focus on standards and integration of systems, as there will be organizations that specialize in one aspect of an IoT platform, whether it's the analytics, rules engine, device management, workflow, or visualization functions.
  4. The initial launch of the map had 246 organizations; this new map has 759. Thanks to the many people on LinkedIn and in blog comments who suggested their companies, which accounted for 180 additional organizations. The other 330+ organizations I found on my own by trawling news, Twitter, IoT conference websites, and the “Partners” sections of each organization's site.

I set up a Twitter account, @EyeOhTee, and although I still need to tweet more, you may see some interesting news on there. Feel free to tweet out this post - plug, plug!

Besides the basic data shown on the map, I also track many more attributes of each product. I will publish additional findings and analysis on this blog and here on IoT Central.

I hope you find the map useful, and I would love to hear if, and how, it has helped you. Whether you located a company in your area to collaborate with, found a supplier for a problem you are trying to solve, or are just learning like me, it will have been worth the time I spend on this.

BGJ

Read more…

MQTT Library Demo

About The Application

To illustrate the use of the MQTT library, we have created two simple Tibbo BASIC applications called "mqtt_publisher" and "mqtt_subscriber".

In our MQTT demo, the publisher device is monitoring three buttons (Tibbits #38). This is done through the keypad (kp.) object.

The three buttons on the publisher device correspond to the red, yellow, and green LEDs (Tibbits #39) on the subscriber device.

As buttons are pushed and released, the publisher device calls mqtt_publish() with the topics "LED/Red", "LED/Yellow", and "LED/Green". Each topic's data is either 0 for "button released" or 1 for "button pressed". The related code is in the on_kp() event handler.

The subscriber device subscribes to all three topics with a single call to mqtt_sub() and the line "LED/#". This is done once, inside callback_mqtt_connect_ok().

For every notification message received from the server, callback_mqtt_notif() is invoked on the subscriber device. The LEDs are turned on and off inside this function's body.
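
For readers who think in Python rather than Tibbo BASIC, the sketch below is a rough functional equivalent of the demo's publish/subscribe logic using the Eclipse Paho client (paho-mqtt, 1.x-style callback API). The broker address is a placeholder; the topic names and payloads follow the description above.

```python
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"  # placeholder: address of the machine running the MQTT server

# --- Subscriber side: a single "LED/#" subscription covers Red, Yellow and Green ---
def on_connect(client, userdata, flags, rc):
    client.subscribe("LED/#")

def on_message(client, userdata, msg):
    # The payload is b"1" for "button pressed" and b"0" for "button released"
    print(f"{msg.topic} -> {'ON' if msg.payload == b'1' else 'OFF'}")

subscriber = mqtt.Client()
subscriber.on_connect = on_connect
subscriber.on_message = on_message
subscriber.connect(BROKER, 1883)
subscriber.loop_start()

# --- Publisher side: report a button state change on its topic ---
publisher = mqtt.Client()
publisher.connect(BROKER, 1883)
publisher.publish("LED/Red", "1")  # red button pressed
publisher.publish("LED/Red", "0")  # red button released
```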

Testing the MQTT demo

The demo was designed to run on our TPS3 boards, but you can easily modify it for other devices.

The easiest way to get the test hardware is to order "MQTTPublisher" and "MQTTSubscriber" TPS configurations.

You can also order all the parts separately:

  • On the publisher side:
    • TPP3 board in the TPB3 enclosure.
    • You will need Tibbits #00-3 in sockets S1, S3, S5; and
    • Tibbits #38 in sockets S2, S4, S6;
    • You will also need some form of power, i.e. Tibbit #10 and #18, plus a suitable 12V power adaptor.
  • On the subscriber side:
    • TPP3 board in the TPB3 enclosure.
    • You will need Tibbits #00-3 in sockets S1, S3, S5;
    • Tibbit #39-2 (red) in S2;
    • Tibbit #39-3 (yellow) in S4;
    • Tibbit #39-1 (green) in S6;
    • You will also need some form of power, i.e. Tibbit #10 and #18, plus a suitable 12V power adaptor.

Test steps

  • Install a suitable MQTT server. We suggest HiveMQ (www.hivemq.com):
    • Download the software here: www.hivemq.com/downloads/ (you will be asked to register).
    • Unzip the downloaded file.
    • Go to the "windows-service" folder and execute "installService.bat".
    • Go to the "bin" folder and launch "run.bat".
    • You do not need to configure any user names or passwords.
  • Open mqtt_publisher and mqtt_subscriber projects in two separate instances of TIDE, then correct the following in the projects' global.tbh files:
    • OWN_IP - assign a suitable unoccupied IP to the publisher and to the subscriber (you know that they will use two different IPs, right?);
    • MQTT_SERVER_HOST - set this to the address of the PC on which you run HiveMQ.
  • Select your subscriber and publisher devices as debug targets, and run corresponding demo apps on them.
  • Press buttons on the publisher to see the LEDs light up on the subscriber.
  • If you are running in debug mode you will see a lot of useful debug info printed in the output panes of both TIDE instances.
  • You can switch into the release mode to see how fast this works without the debug printing.
Read more…

Open Source for IoT Software Stacks

Guest post by Ian Skerrett, Eclipse Foundation

In the previous article, Three Software Stacks Required to Implement IoT, we introduced the three software stacks that are required for any IoT solution: 1) Constrained Devices, 2) IoT Gateways and Smart Devices, and 3) IoT Cloud Platforms. In part 2 of this series, we discuss how open source software communities, and in particular the Eclipse IoT open source community, are becoming key providers of the building blocks required to implement each of the three software stacks. Similar to how the LAMP (Linux/Apache HTTP Server/MySQL/PHP) stack has dominated web infrastructure, it is believed a similar open source stack will come to dominate IoT deployments.

The Importance of open stacks for IoT

The separation of concerns brought by separating any IoT architecture into three stacks is a great step forward for building scalable and maintainable solutions. What’s more, building a software stack on top of open technologies helps achieve the following:

  1. Open standards ensure interoperability – The use of proprietary communication protocols creates silos of IoT networks which cannot easily exchange information. Building IoT stacks on top of open standards (radio protocols, messaging protocols, etc.) helps with the overall interoperability of IoT.
  2. Software reuse reduces TCO – The total cost of ownership is an important consideration for any IoT solution provider. Open source technology is often made available as building blocks that one can re-use across several solutions. An IoT solution based on open source software may, for example, leverage the same protocol implementations in the devices and in the gateways.
  3. No vendor lock-in – Building a solution on top of proprietary technologies and software exposes you to the risk of a third-party vendor changing its roadmap or stopping support of its solution. An IoT stack based on open source technology enables solution providers to adapt the software to their needs if a feature is missing, without having to ask or wait for the feature to be implemented by a given vendor.
  4. Open stacks attract developers – Open source communities are vibrant ecosystems of companies and individuals who innovate and collaborate. A company using open source software will typically find it easier to attract or find developers who have the required skills to work with the stack.
  5. Reduced risk and time to market – Users of open source technology benefit from reusing technology that has been used and tested by others, which reduces the overall development time and ensures a smoother transition from prototype to pilot to production.

Open Source Technology for IoT

The open source community has become an active producer of technology for IoT solutions. Like the LAMP stack for websites, there are a set of open source projects that can be used as the building blocks for an IoT solution architecture.

The Eclipse IoT community is very active in providing the technology that can be used in each stack of an IoT solution. Eclipse IoT has 26 different open source projects that address different features of the IoT stacks. In addition to the Eclipse IoT projects, there are other open source projects that are also relevant to an IoT stack. The next few sections provide a brief summary of how Eclipse IoT and other open source projects can be used to implement the IoT stacks.

Open Source Stack for Constrained Devices

Eclipse IoT provides a set of libraries that can be deployed on a constrained embedded device to provide a complete IoT development stack.

  • IoT Operating Systems – RIOT, FreeRTOS, Zephyr, Apache Mynewt.
  • Hardware Abstraction – Eclipse Edje provides a high-level Java API for accessing hardware features provided by microcontrollers (e.g. GPIO, ADC, MEMS, etc.). It can directly connect to native libraries, drivers and board support packages provided by silicon vendors.
  • Device Management – Eclipse Wakaama provides a C implementation of the OMA LWM2M standard.
  • Communication – Open source projects like Eclipse Paho and Eclipse Wakaama provide implementations of IoT communication protocols such as, respectively, MQTT and LWM2M. Eclipse Paho has a C implementation of MQTT that is less than 2,000 LOC.

Open Source Stack for Gateways: Connected and Smart Things

Within the Eclipse IoT community there are a variety of projects that work to provide the capabilities that an IoT gateway requires.

Eclipse Kura provides a general purpose middleware and application container for IoT gateway services. An IoT gateway stack based on Eclipse Kura would include the following:

  • Operating system – Linux (Ubuntu/Ubuntu Core, Yocto-based linux distribution), Windows.
  • Application container or runtime environment – Eclipse Equinox or Eclipse Concierge (OSGi Runtime).
  • Connectivity support for devices – Eclipse Kura includes APIs to interface with the gateway I/Os (e.g. serial, RS-485, BLE, GPIO, etc.) and support for many field protocols that can be used to connect to devices, e.g. Modbus, CAN bus, etc.
  • Networking support – Eclipse Kura provides advanced networking and routing capabilities over a wide range of interfaces (cellular, Wi-Fi, Ethernet, etc.).
  • Data management & Messaging – Eclipse Kura implements a native MQTT-based messaging solution that allows applications running on the gateway to transparently communicate with a cloud platform, without having to deal with the availability of the network interfaces or how to represent IoT data. Support for additional messaging protocols is available through the built-in Apache Camel message routing engine.
  • Remote management – Eclipse Kura provides a remote management solution based on the MQTT protocol that allows you to monitor the overall health of an IoT gateway and to control (install, update, modify settings of) the software it is running.

Eclipse SmartHome provides an IoT gateway platform that is specifically focused on the home automation domain. An Eclipse SmartHome stack would include the following:

  • Operating system – Linux (Ubuntu/Ubuntu Core, Yocto-based linux distribution), Windows or macOS.
  • Application container or runtime environment – Eclipse Equinox or Eclipse Concierge (OSGi Runtimes).
  • Communication and Connectivity – Eclipse SmartHome brings support for many off-the-shelf home automation devices such as Belkin WeMo, LIFX, Philips Hue, Sonos, etc. Eclipse SmartHome focuses on enabling home automation solutions to communicate within an “Intranet of Things”; therefore offline capabilities are a paramount design goal.
  • Data management & Messaging – Eclipse SmartHome has an internal event bus, which can be exposed to external systems through e.g. SSE or MQTT. It furthermore provides mechanisms for persisting values in databases and for running local business logic through a rule engine.
  • Remote management – Eclipse SmartHome supports device onboarding and configuration through its APIs. It furthermore provides an infrastructure to perform firmware update of connected devices.

Eclipse 4DIAC provides an industrial-grade open source infrastructure for distributed industrial process measurement and control systems based on the IEC 61499 standard. 4DIAC is ideally suited for Industrie 4.0 and Industrial IoT applications in a manufacturing setting. The IEC 61499 standard defines a domain-specific modeling language for developing distributed industrial control solutions, providing a vendor-independent format and simplifying support for controller-to-controller communication.

Open Source Stack for IoT Cloud Platforms

The Eclipse IoT Community has a number of projects that are focused on providing the functionality required for IoT cloud platforms.

Eclipse Kapua is a modular platform providing the services required to manage IoT gateways and smart edge devices. Kapua provides a core integration framework and an initial set of core IoT services including a device registry, device management services, messaging services, data management, and application enablement.

The goal of Eclipse Kapua is to create a growing ecosystem of micro services through the extensions provided by other Eclipse IoT projects and organizations.

Eclipse OM2M is an IoT Platform specific for the telecommunication industry, based on the oneM2M specification. It provides a horizontal Common Service Entity (CSE) that can be deployed in an M2M server, a gateway, or a device. Each CSE provides Application Enablement, Security, Triggering, Notification, Persistency, Device Interworking, Device Management.

The Eclipse IoT community also has a number of standalone projects that provide functionality to address key features required for an IoT cloud platform. These projects can be used independently of Eclipse Kapua and over time some may be integrated into Kapua.

Connectivity and Protocol Support

  • Eclipse Hono provides a uniform API for interacting with devices using arbitrary protocols, as well as an extensible framework to add other protocols.
  • Eclipse Mosquitto provides an implementation of an MQTT broker.

Device Management and Device Registry

  • Eclipse Leshan provides an implementation of the OMA LWM2M device management protocol.  
  • Eclipse hawkBit provides the management tools to roll out software updates to devices and gateways.

Event management and application enablement

  • Eclipse Hono helps to expose consistent APIs for consuming telemetry data or sending commands to devices, so as to rationalize IoT application development.

Analytics and Visualization – Outside of the Eclipse IoT community there are many open source options for data analytics and visualization, including Apache Hadoop, Apache Spark, and Apache Storm. Within the Eclipse community, Eclipse BIRT provides support for dashboards and reporting of data stored in a variety of data repositories.

Open Source for Cross-Stack Functionality

Security

  • Eclipse tinydtls provides an implementation of the DTLS security protocol providing transport layer security between the device and server.
  • Eclipse ACS provides an access control service that allows each stack in an IoT solution to protect their resources using a RESTful interface.

Ontologies

  • Eclipse Unide is a protocol for Production Performance Management (PPM) in the manufacturing industry. It establishes an ontology for sharing machine performance information.
  • Eclipse Whiskers implements the OGC SensorThings API that provides a standard way to share location based information for devices.

Development Tools and SDKs

  • Eclipse Vorto provides a set of tools and repository for creating device information models.
  • Eclipse JDT and CDT allow for integrated development of IoT solutions. For example, Eclipse Kura applications can be tested and debugged from within the Eclipse Java IDE (JDT).
  • Eclipse Che provides a browser-based IDE that can be used for building IoT solutions.

Conclusion

An IoT solution requires a substantial amount of technology in the form of software, hardware, and networking. In this series of articles we have defined the software requirements across three different stacks and the open source software that can be used to build them.

The last twenty years have proven that open source software and open source communities are key providers of technology for the software industry. The Internet of Things is following a similar trend, and it is expected that more and more IoT solutions will be built on open source software.

For the past five years, the Eclipse IoT community has been very active in building a portfolio of open source projects that companies and individuals use today to build their IoT solutions. If you are interested in participating, please join us and visit https://iot.eclipse.org.

Read more…

Tibbo Project System (TPS) is a highly configurable, affordable, and innovative automation platform. It is ideal for home, building, warehouse, and production floor automation projects, as well as data collection, distributed control, industrial computing, and device connectivity applications.

Suppliers of traditional “control boxes” (embedded computers, PLCs, remote automation and I/O products, etc.) typically offer a wide variety of models differing in their I/O capabilities. Four serial ports and six relays. Two serial ports and eight relays. One serial port, four relays, and two sensor inputs. These lists go on and on, yet never seem to contain just the right mix of I/O functions you are looking for.

Rather than offering a large number of models, Tibbo Technology takes a different approach: Our Tibbo Project System (TPS) utilizes Tibbits® – miniature electronic blocks that implement specific I/O functions. Need three RS232 ports? Plug in exactly three RS232 Tibbits! Need two relays? Use a relay Tibbit. This module-based approach saves you money by allowing you to precisely define the features you want in your automation controller.

Here is a closer look at the process of building a custom Tibbo Project System.

Start with a Tibbo Project PCB (TPP)

 

 

A Tibbo Project PCB is the foundation of TPS devices.

Available in two sizes – medium and large – each board carries a CPU, memory, an Ethernet port, power input for +5V regulated power, and a number of sockets for Tibbit Modules and Connectors.

Add Tibbit® Blocks

Tibbits (as in “Tibbo Bits”) are blocks of prepackaged I/O functionality housed in brightly colored rectangular shells. Tibbits are subdivided into Modules and Connectors.

Want an ADC? There is a Tibbit Module for this. 24V power supply? Got that! RS232/422/485 port? We have this, and many other Modules, too.

Same goes for Tibbit Connectors. DB9 Tibbit? Check. Terminal block? Check. Infrared receiver/transmitter? Got it. Temperature, humidity, and pressure sensors? On the list of available Tibbits, too.

Assemble into a Tibbo Project Box (TPB)

Most projects require an enclosure. Designing one is a tough job. Making it beautiful is even tougher, and may also be prohibitively expensive. Finding or making the right housing is a perennial obstacle to completing low-volume and hobbyist projects.

Strangely, suppliers of popular platforms such as Arduino, Raspberry Pi, and BeagleBone do not bother with providing any enclosures, and available third-party offerings are primitive and flimsy.

Tibbo understands enclosure struggles and here is our solution: Your Tibbo Project System can optionally be ordered with a Tibbo Project Box (TPB) kit.

The ingenious feature of the TPB is that its top and bottom walls are formed by Tibbit Connectors. This eliminates a huge problem of any low-volume production operation – the necessity to drill holes and openings in an off-the-shelf enclosure.

The result is a neat, professional-looking housing every time, even for projects with a production quantity of one.

Like boards, our enclosures are available in two sizes – medium and large. Medium-size project boxes can be ordered in the LCD/keypad version, thus allowing you to design solutions incorporating a user interface.

 

Unique Online Configurator

To simplify the process of planning your TPS we have created an Online Configurator.

The Configurator allows you to select a Tibbo Project PCB (TPP), “insert” Tibbit Modules and Connectors into the board's sockets, and specify additional options. These include choosing whether or not you wish to add a Tibbo Project Box (TPB) enclosure, LCD and keypad, DIN rail mounting kit, and so on. You can choose to have your system shipped fully assembled or as a parts kit.

The Configurator makes sure you specify a valid system by watching out for errors. For example, it verifies that the total power consumption of your future TPS device does not exceed the available power budget. The Configurator also checks the placement of Tibbits, ensuring that there are no mistakes in their arrangement.

Completed configurations can be immediately ordered from our online store. You can opt to keep each configuration private, share it with other registered users, or make it public for everyone to see.

Develop your application


Like all programmable Tibbo hardware, Tibbo Project System devices are powered by Tibbo OS (TiOS).

Use our free Tibbo IDE (TIDE) software to create and debug sophisticated automation applications in Tibbo BASIC, Tibbo C, or a combination of the two languages.

To learn more about the Tibbo Project System click here

Read more…

OPC Server from Tibbo Technology

OPC – “Open Platform Communications” – is a set of standards and specifications for industrial telecommunication. OPC specifies the transfer of real-time plant data between control devices from different manufacturers. It was designed to provide a common bridge between process control hardware and Windows-based software applications, and to reduce the duplicated effort performed by hardware manufacturers and their software partners.

 

The most widely used OPC specification, OPC Data Access (OPC DA), is supported by the Tibbo OPC Server. Any device compatible with the Tibbo AggreGate protocol can be a data source. AggreGate is a white-label IoT integration platform using up-to-date network technologies to control, configure, monitor and support electronic devices, along with distributed networks of such devices. It also helps you collect device data in the cloud, where you can slice and dice it according to your needs. In addition, the platform lets other enterprise applications transparently access this data via the AggreGate server.

The Tibbo OPC Server has the AggreGate network protocol built in. It can both interact with Tibbo devices via the AggreGate agent protocol and connect to an AggreGate server. Open-source implementations of the AggreGate agent protocol are published for the Java, C#, and C++ programming languages, so your connection scheme is not restricted to AggreGate servers or Tibbo devices only.

 

Examples

A simple example: a TPS reads Tibbit #29 (an ambient temperature meter) and forwards the data to the OPC server via the AggreGate agent protocol.

A more complex example: a Windows-based PC controls a wood processing machine by means of an AggreGate server through the Modbus protocol. If the Tibbo OPC Server is linked with the AggreGate server, the data from the machine is sent to the Tibbo OPC Server, and we can therefore operate and monitor the machine via any OPC client.

Technical Specification

  • Compatibility with Windows XP/2003 or later (Microsoft Visual C++ 2013 redistributable is required - installed automatically)

  • Support of DA Asynchronous I/O 2.0 and Synchronous I/O with COM/DCOM technology

Tibbo OPC Server transmits the information on the Value, Quality and Timestamp of an item (tag) to the OPC Client applications. These fields are read from the AggreGate variables.

 

The process values are set to Bad [Configuration Error] quality if OPC Server loses communication with its data source (AggreGate Agent or AggreGate Server). The quality is set to Uncertain [Non-Specific] if the AggreGate variable value is empty.

The table below shows how AggreGate variable types map to OPC data types:

AggreGate Data Type | OPC Data Type
INTEGER             | VT_I4
STRING              | VT_BSTR
BOOLEAN             | VT_BOOL
LONG                | VT_I8
FLOAT               | VT_R4
DOUBLE              | VT_R8
DATE                | VT_DATE
DATATABLE           | VT_BSTR (by default)
COLOR               | VT_I4
DATA                | VT_BSTR

To learn more about Tibbo OPC server, click here

Read more…

The IoT communication protocols

Guest post by James Stansberry

Messaging protocols for “lightweight” IoT nodes

A fascinating article from Philip N. Howard at George Washington University asserts that based on multiple sources, the number of connected devices surpassed the number of people on the planet in 2014. Further, it estimates that by 2020 we will be approaching 50 billion devices on the Internet of Things (IoT).

Philip N. Howard’s Study of Connected Devices

In other words, while humans will continue to connect their devices to the web in greater numbers, a bigger explosion will come from “things” connecting to the web that weren’t before, or which didn’t exist, or which now use their connection as more of a core feature.

The question is, how will these billions of things communicate between the end node, the cloud, and the service provider?

This article dives into that subject as it relates to a particular class of devices that are very low cost, battery-powered, and which must operate at least seven years without any manual intervention.

In particular, it looks at two emerging messaging protocols that address the needs of these “lightweight” IoT nodes. The first, MQTT, dates all the way back to 1999 and is very old by today’s standards. The second, CoAP, is relatively new but gaining traction.

IoT Communication Protocol Requirements

One definition of IoT is connecting devices to the internet that were not previously connected. A factory owner may connect high-powered lights. A triathlete may connect a battery-powered heart-rate monitor. A home or building automation provider may connect a wireless sensor with no line power source.

But the important thing here is that in all the above cases the “Thing” must communicate through the Internet to be considered an “IoT” node.

Since it must use the Internet, it must also adhere to the Internet Engineering Task Force’s (IETF) Internet Protocol Suite. However, the Internet has historically connected resource-rich devices with lots of power, memory and connection options. As such, its protocols have been considered too heavy to apply wholesale for applications in the emerging IoT.

Internet Protocol Suite Overview

There are other aspects of the IoT which also drive modifications to IETF’s work. In particular, networks of IoT end nodes will be lossy, and the devices attached to them will be very low power, saddled with constrained resources, and expected to live for years.

The requirements for both the network and its end devices might look like the table below. This new model needs new, lighter-weight protocols that don’t require large amounts of resources.

MQTT and CoAP address these needs through small message sizes, message management, and lightweight message overhead. We look at each below.

Requirements for low-cost, power-constrained devices and associated networks

MQTT and CoAP: Lightweight IoT Communications Protocols

MQTT and CoAP allow for communication from Internet-based resource-rich devices to IoT-based resource-constrained devices. Both CoAP and MQTT implement a lightweight application layer, leaving much of the error correction to message retries, simple reliability strategies, or reliance on more resource rich devices for post-processing of raw end-node data.

Conceptual Diagram of MQTT and CoAP Communication to Cloud / Phone

MQTT Overview

IBM invented Message Queuing Telemetry Transport (MQTT) for satellite communications with oil field equipment. It had reliability and low power at its core and so made good sense to be applied to IoT networks.

The MQTT standard has since been adopted by the OASIS open standards society and released as version 3.1.1. It is also supported within the Eclipse community, as well as by many commercial companies who offer open source stacks and consulting.

MQTT uses a “publish/subscribe” model, and requires a central MQTT broker to manage and route messages among an MQTT network’s nodes. Eclipse describes MQTT as “a many-to-many communication protocol for passing messages between multiple clients through a central broker.”

MQTT uses TCP for its transport layer, which is characterized as “reliable, ordered and error-checked.”
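To make the pub/sub flow concrete, here is a minimal sketch using the open-source Eclipse Paho Python client (paho-mqtt, 1.x-style API); the broker hostname and topic names are placeholders, not part of the MQTT standard.

```python
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # Subscribe once the TCP connection to the broker is up.
    client.subscribe("sensors/kitchen/temp", qos=1)

def on_message(client, userdata, msg):
    # The broker routes every message published on a matching topic to us.
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()   # paho-mqtt 1.x style; 2.x additionally takes a CallbackAPIVersion argument
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883)            # hypothetical central broker
client.publish("sensors/kitchen/temp", "21.5", qos=1)
client.loop_forever()                                  # network loop: keep-alives, retries, acknowledgements
```

Note that publisher and subscriber only ever address the broker; they never need to know about each other, which is the space decoupling described below.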

MQTT Strengths

Publish / Subscribe Model

MQTT’s “pub/sub” model scales well and can be power efficient. Brokers and nodes publish information and others subscribe according to the message content, type, or subject. (These are MQTT standard terms.) Generally the broker subscribes to all messages and then manages information flow to its nodes.

There are several specific benefits to the Pub/Sub model.

Space decoupling

While the node and the broker need to have each other’s IP address, nodes can publish information and subscribe to other nodes’ published information without any knowledge of each other since everything goes through the central broker. This reduces overhead that can accompany TCP sessions and ports, and allows the end nodes to operate independently of one another.

Time decoupling

A node can publish its information regardless of other nodes’ states. Other nodes can then receive the published information from the broker when they are active. This allows nodes to remain in sleepy states even when other nodes are publishing messages directly relevant to them.

Synchronization decoupling

A node that is in the midst of an operation is not interrupted to receive a published message to which it is subscribed. The message is queued by the broker until the receiving node finishes its current operation. This saves operating current and reduces repeated work by avoiding interruptions of ongoing operations or sleepy states.

Security

MQTT uses unencrypted TCP and is not “out-of-the-box” secure. But because it uses TCP it can – and should – use TLS/SSL internet security. TLS is a very secure method for encrypting traffic but is also resource intensive for lightweight clients due to its required handshake and increased packet overhead. For networks where energy is a very high priority and security much less so, encrypting just the packet payload may suffice.
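As a hedged illustration, enabling TLS with the same paho-mqtt client is essentially a one-line change; the broker address below is a placeholder, and 8883 is the conventional MQTT-over-TLS port.

```python
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.tls_set()                                 # validate the broker certificate against the system CA store
client.connect("broker.example.com", 8883)       # hypothetical broker, MQTT-over-TLS port
client.loop_start()
client.publish("sensors/door/state", "closed", qos=1)
```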

MQTT Quality of Service (QoS) levels

The term “QoS” means other things outside of MQTT. In MQTT, “QoS” levels 0, 1 and 2 describe increasing levels of guaranteed message delivery.

MQTT QoS Level 0 (At most once)

This is commonly known as “Fire and forget” and is a single transmit burst with no guarantee of message arrival. This might be used for highly repetitive message types or non-mission critical messages.

MQTT QoS Level 1 (At least once)

This attempts to guarantee that a message is received at least once by the intended recipient. Once a published message is received and understood by the intended recipient, it acknowledges the message with an acknowledgement message (PUBACK) addressed to the publishing node. Until the PUBACK is received by the publisher, it stores the message and retransmits it periodically. This type of message may be useful for a non-critical node shutdown.

MQTT QoS Level 2 (Exactly once)

This level attempts to guarantee the message is received and decoded by the intended recipient. This is the most secure and reliable MQTT level of QoS.  The publisher sends a message announcing it has a QoS level 2 message. Its intended recipient gathers the announcement, decodes it and indicates that it is ready to receive the message. The publisher relays its message. Once the recipient understands the message, it completes the transaction with an acknowledgement. This type of message may be useful for turning on or off lights or alarms in a home.
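The three delivery levels map directly onto the qos argument of a publish call. A minimal, hypothetical sketch with paho-mqtt (broker and topics are placeholders):

```python
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)   # hypothetical broker
client.loop_start()

# QoS 0 - "at most once": a single transmit burst, no acknowledgement expected.
client.publish("home/livingroom/temp", "21.5", qos=0)

# QoS 1 - "at least once": retransmitted until the broker returns a PUBACK; duplicates are possible.
client.publish("home/livingroom/humidity", "40", qos=1)

# QoS 2 - "exactly once": a four-step PUBLISH/PUBREC/PUBREL/PUBCOMP exchange.
client.publish("home/alarm/armed", "true", qos=2)
```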

Last Will and Testament

MQTT provides a “last will and testament (LWT)” message that can be stored in the MQTT broker in case a node is unexpectedly disconnected from the network. This LWT retains the node’s state and purpose, including the types of commands it published and its subscriptions. If the node disappears, the broker notifies all subscribers of the node’s LWT. And if the node returns, the broker notifies it of its prior state. This feature accommodates lossy networks and scalability nicely.
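In paho-mqtt, the will is registered before connecting; the broker then publishes it on the client's behalf if the connection drops without a clean disconnect. A hedged sketch (topic and payloads are placeholders):

```python
import paho.mqtt.client as mqtt

client = mqtt.Client()
# Registered before connect(): delivered by the broker only if we vanish unexpectedly.
client.will_set("sensors/garage/status", payload="offline", qos=1, retain=True)
client.connect("broker.example.com", 1883)   # hypothetical broker
client.publish("sensors/garage/status", "online", qos=1, retain=True)
client.loop_forever()
```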

Flexible topic subscriptions

An MQTT node may subscribe to all messages within a given functionality. For example a kitchen “oven node” may subscribe to all messages for “kitchen/oven/+”, with the “+” as a wildcard. This allows for a minimal amount of code (i.e., memory and cost). Another example is if a node in the kitchen is interested in all temperature information regardless of the end node’s functionality. In this case, “kitchen/+/temp” will collect any message in the kitchen from any node reporting “temp”. There are other equally useful MQTT wildcards for reducing code footprint and therefore memory size and cost.
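A hedged sketch of the same wildcard subscriptions with paho-mqtt; '+' matches exactly one topic level, and MQTT also defines '#' for all remaining levels.

```python
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)   # hypothetical broker
client.subscribe("kitchen/oven/+")           # every message addressed to the oven node
client.subscribe("kitchen/+/temp")           # any kitchen node reporting "temp"
client.loop_forever()
```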

Issues with MQTT

Central Broker

The use of a central broker can be a drawback for distributed IoT systems. For example, a system may start small with a remote control and window shade, thus requiring no central broker. Then as the system grows, for example adding security sensors, light bulbs, or other window shades, the network naturally grows and expands and may have need of a central broker. However, none of the individual nodes wants to take on the cost and responsibility as it requires resources, software and complexity not core to the end-node function.

In systems that already have a central broker, it can become a single point of failure for the complete network. For example, if the broker is a powered node without a battery back-up, then battery-powered nodes may continue operating during an electrical outage while the broker is off-line, thus rendering the network inoperable.

TCP

TCP was originally designed for devices with more memory and processing resources than may be available in a lightweight IoT-style network. For example, the TCP protocol requires that connections be established in a multi-step handshake process before any messages are exchanged. This drives up wake-up and communication times, and reduces battery life over the long run.

TCP also works best when two communicating nodes hold their sockets open for each other continuously in a persistent session, which again may be difficult for energy- and resource-constrained devices.

Wake-up time

Again, using TCP without session persistence can require incremental transmit time for connection establishment. For nodes with periodic, repetitive traffic, this can lead to lower operating life.

CoAP Overview

With the growing importance of the IoT, the Internet Engineering Task Force (IETF) took on lightweight messaging and defined the Constrained Application Protocol (CoAP). As defined by the IETF, CoAP is for “use with constrained nodes and constrained (e.g., low-power, lossy) networks.” The Eclipse community also supports CoAP as an open standard, and like MQTT, CoAP is commercially supported and growing rapidly among IoT providers.

CoAP is a client/server protocol and provides a one-to-one “request/report” interaction model with accommodations for multicast, although multicast is still in the early stages of IETF standardization. Unlike MQTT, which has been adapted to IoT needs from a decades-old protocol, the IETF specified CoAP from the outset to support IoT with lightweight messaging for constrained devices operating in a constrained environment. CoAP is designed to interoperate with HTTP and the RESTful web through simple proxies, making it natively compatible with the Internet.

Strengths of CoAP

Native UDP

CoAP runs over UDP which is inherently and intentionally less reliable than TCP, depending on repetitive messaging for reliability instead of consistent connections. For example, a temperature sensor may send an update every few seconds even though nothing has changed from one transmission to the next. If a receiving node misses one update, the next will arrive in a few seconds and is likely not much different than the first.

UDP’s connectionless datagrams also allow for faster wake-up and transmit cycles as well as smaller packets with less overhead. This allows devices to remain in a sleepy state for longer periods of time conserving battery power.
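For comparison with the MQTT sketches above, here is a minimal, hypothetical CoAP request using the open-source aiocoap Python library; the device URI and resource path are placeholders.

```python
import asyncio
from aiocoap import Context, Message, GET

async def main():
    ctx = await Context.create_client_context()
    # A single datagram exchange over UDP: no handshake, no persistent session.
    request = Message(code=GET, uri="coap://sensor.local/temperature")
    response = await ctx.request(request).response
    print(response.code, response.payload.decode())

asyncio.run(main())
```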

Multi-cast Support

A CoAP network is inherently one-to-one; however it allows for one-to-many or many-to-many multi-cast requirements. This is inherent in CoAP because it is built on top of IPv6 which allows for multicast addressing for devices in addition to their normal IPv6 addresses. Note that multicast message delivery to sleeping devices is unreliable or can impact the battery life of the device if it must wake regularly to receive these messages.

Security

CoAP uses DTLS on top of its UDP transport protocol. Like TCP, UDP is unencrypted but can be – and should be – augmented with DTLS.

Resource / Service Discovery

CoAP uses URIs to provide standard presentation and interaction expectations for network nodes. This allows a degree of autonomy in the message packets, since the target node’s capabilities are partly understood from its URI details. In other words, a battery-powered sensor node may have one type of URI while a line-powered flow-control actuator may have another. Nodes communicating with the battery-powered sensor node might be programmed to expect longer response times, more repetitive information, and limited message types. Nodes communicating with the line-powered flow-control actuator might be programmed to expect rich, detailed messages delivered very rapidly.
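Resource discovery itself is standardized: a CoAP node lists its resources in CoRE Link Format at /.well-known/core (RFC 6690). A hedged aiocoap sketch, with the host name as a placeholder:

```python
import asyncio
from aiocoap import Context, Message, GET

async def discover(host):
    ctx = await Context.create_client_context()
    # /.well-known/core returns a link-format list of the node's resources and their attributes.
    request = Message(code=GET, uri=f"coap://{host}/.well-known/core")
    response = await ctx.request(request).response
    print(response.payload.decode())

asyncio.run(discover("sensor.local"))
```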

Asynchronous Communication

Within the CoAP protocol, most messages are sent and received using the request/report model; however, there are other modes of operation that allow nodes to be somewhat decoupled. For example, CoAP has a simplified “observe” mechanism similar to MQTT’s pub/sub that allows nodes to observe others without actively engaging them.

As an example of the “observe” mode, node 1 can observe node 2 for specific transmission types, then any time node 2 publishes a relevant message, node 1 receives it when it awakens and queries another node. It’s important to note that one of the network nodes must hold messages for observers. This is similar to MQTT’s broker model except that there is no broker requirement in CoAP, and therefore no expectation of being able to hold or queue messages for observers.

There are currently draft additions to the standard which may provide a similar CoAP function to MQTT’s pub/sub model over the short-to-medium term. The leading candidate today is a draft proposal from Michael Koster, allowing CoAP networks to implement a pub/sub model like MQTT’s mentioned above.  

Issues with CoAP

Standard Maturity

MQTT is currently a more mature and stable standard than CoAP. It’s been Silicon Labs’ experience that it is easier to get an MQTT network up and running very quickly than a similar one using CoAP. That said, CoAP has tremendous market momentum and is rapidly evolving to provide a standardized foundation with important add-ons in the ratification pipeline now.

It is likely that CoAP will reach a similar level of stability and maturity as MQTT in the very near term. But the standard is evolving for now, which may present some troubles with interoperability.

Message Reliability (QoS level)

CoAP’s “reliability” is the counterpart of MQTT’s QoS and provides a very simple distinction between a “confirmable” message and a “non-confirmable” message. A confirmable message is acknowledged with an acknowledgement message (ACK) from the intended recipient. This confirms the message was received but stops short of confirming that its contents were decoded correctly, or at all. A non-confirmable message is “fire and forget.”
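A hedged sketch of the two message types with aiocoap; whether a library honors an explicitly set message type can vary by version, and the actuator URI and payloads are placeholders.

```python
import asyncio
from aiocoap import Context, Message, PUT
from aiocoap.numbers.types import Type

async def main():
    ctx = await Context.create_client_context()

    # Confirmable (CON): retransmitted until the peer returns an ACK.
    con = Message(code=PUT, mtype=Type.CON,
                  uri="coap://actuator.local/valve", payload=b"open")
    await ctx.request(con).response

    # Non-confirmable (NON): "fire and forget", no ACK expected.
    non = Message(code=PUT, mtype=Type.NON,
                  uri="coap://actuator.local/valve", payload=b"close")
    await ctx.request(non).response

asyncio.run(main())
```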

Summary

The two messaging protocols MQTT and CoAP are emerging as leading lightweight messaging protocols for the booming IoT market. Each has benefits and each has issues. As leaders in mesh networking where lightweight nodes are a necessary aspect of almost every network, Silicon Labs has implemented both protocols, including gateway bridging logic to allow for inter-standard communication.

Further Reading

MQTT

Specification - http://docs.oasis-open.org/mqtt/mqtt/v3.1.1/os/mqtt-v3.1.1-os.html

Excellent source for MQTT information – http://www.hivemq.com/mqtt-essentials-wrap-up/

CoAP

Specification - https://tools.ietf.org/html/rfc7252

Excellent source for CoAP information - http://coap.technology/

MQTT-SN

Specification – http://mqtt.org/2013/12/mqtt-for-sensor-networks-mqtt-sn

General coverage of IoT messaging protocols

Excellent white paper on using MQTT, CoAP, and other messaging protocols –  http://www.prismtech.com/sites/default/files/documents/MessagingComparsionNov2013USROW_vfinal.pdf

This article originally appeared here.

Read more…

New IoT App Makes Drivers Safer

Transportation has become one of the most frequently highlighted areas where the internet of things can improve our lives. Specifically, a lot of people are excited about the IoT's potential to further the progress toward entire networks of self-driving cars. We hear a lot about the tech companies that are involved in building self-driving cars, but it's the IoT that will actually allow these vehicles to operate. In fact, CNET quoted one IoT expert just last year as saying that because of the expanding IoT, self-driving cars will rule the roads by 2030.

On a much smaller scale, there are also some niche applications of the IoT that are designed to fix specific problems on the road. For instance, many companies have looked to combat distracted driving by teenagers through IoT-related tools. As noted by PC World, one device called the Smartwheel monitors teens' driving activity by sensing when they're keeping both hands on the wheel. The device sounds an alert when a hand comes off the wheel and communicates to a companion app that compiles reports on driver performance. This is a subtle way in which the IoT helps young drivers develop better habits.

In a way, these examples cover both extremes of the effect the IoT is having on drivers. One is a futuristic idea that's being slowly implemented to alter the very nature of road transportation. The other is an application for individuals meant to make drivers safer one by one. But there are also some IoT-related tools that fall somewhere in the middle of the spectrum. One is an exciting new app that seeks to make the roads safer for the thousands of shipping fleet drivers operating on a daily basis.

At first this might sound like a niche category. However, the reality is that the innumerable companies and agencies relying on shipping and transportation fleets have a ton of drivers to take care of. That means supervising vehicle performance, safety, and more for each and every one of them. That process comprises a significant portion of road activity, particularly in cities and on highways. These operations can be simplified and streamlined through Networkfleet Driver, which Verizon describes as a tool to help employees manage routes, maintenance, communication, and driving habits all in one place.

The app can communicate up-to-date routing changes or required stops, inform drivers of necessary vehicle repairs or upkeep, and handle communication from management. It can also make note of dangerous habits (like a tendency to speed or make frequent sudden stops), helping the driver to identify bad habits and helping managers to recommend safer performance. All of this is accomplished through various IoT sensors on vehicles interacting automatically with the app, and with systems that can be monitored by management.

The positive effect, while difficult to quantify, is substantial. Fleet drivers make up a significant portion of road activity, and through the use of the IoT we can make sure that the roads are safer for everyone.

Read more…

The Internet of Things has raised concerns over security. Nowadays, it is possible to control your home using your smartphone. In the coming years, mobile devices will work as a remote control to operate all the things in your house.

Some devices display one or several vulnerabilities that can be exploited by the hackers to infiltrate them and the whole network of the connected home. For instance:

1. During configuration, data, including the device ID and MAC address, is sometimes transmitted in plain text.

2. The communication between the device and the app passes unencrypted through the manufacturer’s servers.

3. The hotspot is poorly secured with a weak username and password and sometimes remains active after configuration.

4. The device comes pre-installed with a Telnet client carrying default credentials.

With rising cases of identity theft and vishing, it has become absolutely necessary to install any of these 5 free tools in your smartphone in order to keep your data safe from hackers. 

1- LastPass - It lets you store passwords in a secure vault that is easy to use, searchable and organized the way you like. It is perhaps the safest vault available online today that lets you store password data for unlimited websites. 

2- Lookout - This tool offers security for today's mobile generation. It is a free app that protects your iOS or Android device around the clock from mobile threats such as unsecure WiFi networks, malicious apps, fraudulent links, etc. It has a worldwide network of 100 million mobile sensors, the world's largest mobile data set and smarter machine intelligence to keep your smartphone secure from all kinds of threats. 

3- Authy - This app generates secure 2 step verification tokens on your device and protects your account from hackers and hijackers by adding an additional layer of security. Moreover, it offers secure cloud backup, multi device synchronization and multi factor authentication. 2 step authentication is the best kind of security available today that ensures your accounts don't get hacked. 

4- BullGuard - It protects your smartphone from all forms of viruses and malware. With built-in, rigorous anti-theft functionality, BullGuard enables you to lock, locate and wipe your device remotely in case it gets lost or stolen. It allows automatic scans so that the security remains updated. Moreover, it doesn't drain your battery. 

5- Prey - It is a lightweight theft protection software that lets you keep an eye over your mobile devices in case you have more than one and you are leaving one in your home. Prey lets you recover the phone in case it gets stolen. After installing the software on your laptop, tablet or phone, Prey will sleep silently in the background awaiting your command. Once remotely triggered from your Prey account, your device will gather and deliver detailed evidence back to you, including a picture of who's using it – often the crucial piece of data that police officers need to take action. 

 

Read more…

As if the Internet of Things (IoT) was not complicated enough, the marketing team at Cisco introduced its Fog Computing vision in January 2014, also known as Edge Computing to other, more purist vendors.

Given Cisco's frantic activity in its Internet of Everything (IoE) marketing campaigns, it is not surprising that many bloggers have resorted to shocking headlines around this subject, taking advantage of the IoT hype.

I hope this post helps you better understand the role of Fog Computing in the IoT Reference Model and how companies are using intelligent IoT gateways in the fog to connect the "Things" to the cloud, through some application areas and examples of Fog Computing.

The problem with the cloud

As the Internet of Things proliferates, businesses face a growing need to analyze data from sources at the edge of a network, whether mobile phones, gateways, or IoT sensors. Cloud computing has a disadvantage: It can’t process data quickly enough for modern business applications.

The IoT owes its explosive growth to the connection of physical things and operation technologies (OT) to analytics and machine learning applications, which can help glean insights from device-generated data and enable devices to make “smart” decisions without human intervention. Currently, such resources are mostly being provided by cloud service providers, where the computation and storage capacity exists.

However, despite its power, the cloud model is not applicable to environments where operations are time-critical or internet connectivity is poor. This is especially true in scenarios such as telemedicine and patient care, where milliseconds can have fatal consequences. The same can be said about vehicle to vehicle communications, where the prevention of collisions and accidents can’t afford the latency caused by the roundtrip to the cloud server.

“The cloud paradigm is like having your brain command your limbs from miles away — it won’t help you where you need quick reflexes.”

Moreover, having every device connected to the cloud and sending raw data over the internet can have privacy, security and legal implications, especially when dealing with sensitive data that is subject to separate regulations in different countries.

IoT nodes are closer to the action, but for the moment, they do not have the computing and storage resources to perform analytics and machine learning tasks. Cloud servers, on the other hand, have the horsepower, but are too far away to process data and respond in time.

The fog layer is the perfect junction where there are enough compute, storage and networking resources to mimic cloud capabilities at the edge and support the local ingestion of data and the quick turnaround of results.

The variety of IoT systems and the need for flexible solutions that respond to real-time events quickly make Fog Computing a compelling option.

Fog Computing: oh my god, another layer in IoT!

A study by IDC estimates that by 2020, 10 percent of the world’s data will be produced by edge devices. This will further drive the need for more efficient fog computing solutions that provide low latency and holistic intelligence simultaneously.

“Computing at the edge of the network is, of course, not new -- we've been doing it for years to solve the same issue with other kinds of computing.”

Fog Computing, or Edge Computing, is a paradigm championed by some of the biggest IoT technology players, including Cisco, IBM, and Dell. It represents a shift in architecture in which intelligence is pushed from the cloud to the edge, localizing certain kinds of analysis and decision-making.

Fog Computing enables quicker response times, unencumbered by network latency, as well as reduced traffic, selectively relaying the appropriate data to the cloud.

The concept of Fog Computing attempts to transcend some of these physical limitations. With Fog Computing, processing happens on nodes physically closer to where the data is originally collected, instead of sending vast amounts of IoT data to the cloud.

Photo Source: http://electronicdesign.com/site-files/electronicdesign.com/files/uploads/2014/06/113191_fig4sm-cisco-fog-computing.jpg

The OpenFog Consortium

The OpenFog Consortium was founded on the premise that open architectures and standards are essential for the success of a ubiquitous Fog Computing ecosystem.

The collaboration among tech giants such as ARM, Cisco, Dell, GE, Intel, Microsoft and Schneider Electric defining an Open, Interoperable Fog Computing Architecture is without any doubt good news for a vibrant supplier ecosystem.

The OpenFog Reference Architecture is an architectural evolution from traditional closed systems and the burgeoning cloud-only models to an approach that emphasizes computation nearest the edge of the network when dictated by business concerns or the functional requirements of critical applications.

The OpenFog Reference Architecture consists of putting micro data centers or even small, purpose-built high-performance data analytics machines in remote offices and locations in order to gain real-time insights from the data collected, or to promote data thinning at the edge, by dramatically reducing the amount of data that needs to be transmitted to a central data center. Without having to move unnecessary data to a central data center, analytics at the edge can simplify and drastically speed analysis while also cutting costs.
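As an illustration of "data thinning" at the edge, here is a hypothetical Python sketch of a fog-node filter that forwards only readings that deviate noticeably from the recent local average; the window size, threshold, and uplink function are invented for the example.

```python
from statistics import mean

WINDOW_SIZE = 60      # keep roughly one minute of 1 Hz samples locally
THRESHOLD = 2.0       # forward only readings that depart by more than 2 units

recent = []

def send_to_cloud(reading: float) -> None:
    # Placeholder for the real uplink (MQTT publish, HTTPS POST, etc.).
    print(f"forwarding anomalous reading: {reading}")

def ingest(reading: float) -> None:
    recent.append(reading)
    if len(recent) > WINDOW_SIZE:
        recent.pop(0)
    # Keep the raw sample at the edge unless it looks anomalous.
    if abs(reading - mean(recent)) > THRESHOLD:
        send_to_cloud(reading)

for sample in (20.1, 20.2, 20.1, 26.5, 20.3):
    ingest(sample)
```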

Benefits of Fog Computing

  • Frees up network capacity - Fog computing uses much less bandwidth, which means it doesn't cause bottlenecks and similar congestion. Less data movement on the network frees up capacity that can then be used for other things.
  • It is truly real-time - Fog computing offers much lower latency than any other cloud computing architecture we know today. Since data analysis is done on the spot, it represents a true real-time concept, which makes it a perfect match for the needs of the Internet of Things.
  • Boosts data security - Collected data is more secure when it doesn't travel. It also makes data storage simpler, because the data stays in its country of origin; sending data abroad might violate certain laws.
  • Analytics is done locally - Fog computing enables developers to access the most important IoT data from other locations, while keeping piles of less important information in local storage.

Disadvantages of Fog Computing

  • Some companies don't like their data being outside their premises - With Fog Computing, lots of data is stored on the devices themselves (which are often located outside of company offices), and this is perceived as a risk by part of the developer community.
  • The whole system can sound a little confusing - A concept that includes a huge number of devices, located all around the world, that store, analyze and send their own data can sound utterly confusing.

Read more: http://bigdata.sys-con.com/node/3809885

Examples of Fog Computing

The applications of fog computing are many, and it is powering crucial parts of IoT ecosystems, especially in industrial environments. See below some use cases and examples.

  • Thanks to the power of fog computing, New York-based renewable energy company Envision has been able to obtain a 15 percent productivity improvement from the vast network of wind turbines it operates. The company is processing as much as 20 terabytes of data at a time, generated by 3 million sensors installed on the 20,000 turbines it manages. Moving computation to the edge has enabled Envision to cut down data analysis time from 10 minutes to mere seconds, providing them with actionable insights and significant business benefits.
  • Plat One is another firm using fog computing to improve data processing for the more than 1 million sensors it manages. The company uses the Cisco-ParStream platform to publish real-time sensor measurements for hundreds of thousands of devices, including smart lighting and parking, port and transportation management and a network of 50,000 coffee machines.
  • In Palo Alto, California, a $3 million project will enable traffic lights to integrate with connected vehicles, hopefully creating a future in which people won’t be waiting in their cars at empty intersections for no reason.
  • In transportation, it’s helping semi-autonomous cars assist drivers in avoiding distraction and veering off the road by providing real-time analytics and decisions on driving patterns.
  • It also can help reduce the transfer of gigantic volumes of audio and video recordings generated by police dashboard and video cameras. Cameras equipped with edge computing capabilities could analyze video feeds in real time and only send relevant data to the cloud when necessary.

See more at: Why Edge Computing Is Here to Stay: Five Use Cases By Patrick McGarry  

What is the future of fog computing?

The current trend shows that fog computing will continue to grow in usage and importance as the Internet of Things expands and conquers new grounds. With inexpensive, low-power processing and storage becoming more available, we can expect computation to move even closer to the edge and become ingrained in the same devices that are generating the data, creating even greater possibilities for inter-device intelligence and interactions. Sensors that only log data might one day become a thing of the past.

Janakiram MSV wondered whether Fog Computing will be the next big thing in the Internet of Things. It seems obvious that while the cloud is a perfect match for the Internet of Things, there are other scenarios and IoT solutions that demand low-latency ingestion and immediate processing of data, where Fog Computing is the answer.

Does the fog eliminate the cloud?

Fog computing improves efficiency and reduces the amount of data that needs to be sent to the cloud for processing. But it’s here to complement the cloud, not replace it.

The cloud will continue to have a pertinent role in the IoT cycle. In fact, with fog computing shouldering the burden of short-term analytics at the edge, cloud resources will be freed to take on the heavier tasks, especially where the analysis of historical data and large datasets is concerned. Insights obtained by the cloud can help update and tweak policies and functionality at the fog layer.

And there are still many cases where the centralized, highly efficient computing infrastructure of the cloud will outperform decentralized systems in performance, scalability and costs. This includes environments where data needs to be analyzed from largely dispersed sources.

“It is the combination of fog and cloud computing that will accelerate the adoption of IoT, especially for the enterprise.”

In essence, Fog Computing allows for big data to be processed locally, or at least in closer proximity to the systems that rely on it. Newer machines could incorporate more powerful microprocessors, and interact more fluidly with other machines on the edge of the network. While fog isn’t a replacement for cloud architecture, it is a necessary step forward that will facilitate the advancement of IoT, as more industries and businesses adopt emerging technologies.

'The Cloud' is not Over

Fog computing is far from a panacea. One of the immediate costs associated with this method pertains to equipping end devices with the necessary hardware to perform calculations remotely and independent of centralized data centers. Some vendors, however, are in the process of perfecting technologies for that purpose. The tradeoff is that by investing in such solutions immediately, organizations will avoid frequently updating their infrastructure and networks to deal with ever increasing data amounts as the IoT expands.

There are certain data types and use cases that actually benefit from centralized models. Data that carries the utmost security concerns, for example, will require the secure advantages of a centralized approach or one that continues to rely solely on physical infrastructure.

Though the benefits of Fog Computing are undeniable, the Cloud has a secure future in IoT for most companies with less time-sensitive computing needs and for analysing all the data gathered by IoT sensors.

 

Thanks in advance for your Likes and Shares

Thoughts? Comments?

Read more…

Originally posted on Data Science Central

Printed electronics are being touted as the next big thing in the Internet of Things (IoT), a technology rightly regarded as a boon of advancing technology. Silicon-based sensors were the first to be associated with IoT technology. These sensors have numerous applications, such as tracking data from airplanes, wind turbines, engines, and medical devices, among other internet-connected devices.

However, these silicon-based sensors are not suitable for several other applications. Bendable packaging and premium items are some of the applications where embedded sensors do not work. For such applications, printed electronics fit the need. Using sensor technology, information is transferred to smart labels that can be attached to packages and tracked in real time.

Some Applications of Printed Sensor Technology

Grocery Industry: While the bar code is the standard technology used in the grocery sector, it has limitations in terms of the data it can store. Also, for some products, packaging can run up to 30-40% of the cost, so printed sensors are well suited to reducing packaging costs. For such needs, a printed sensor is the most apt solution for real-time information about a product’s temperature, moisture, location, movement, and much more. Companies can check these parameters to validate freshness and prevent substantial spoilage. Smart labels are also used to validate the authenticity of products.

Click here to get report.

Healthcare: The use of smart labels enables manufacturers and logistics firms to track the usage and disposal of pharmaceuticals and to control inventory. Smart labels on patients’ clothing make it possible to check body temperature, the dampness of adult diapers, or bandages in assisted-living scenarios.

Logistics: Radio frequency identification (RFID) was, until recently, the standard tag used by logistics companies to identify shipping crates carrying perishable products. RFID is increasingly being replaced by smart labels that enable tracking of individual items. This allows companies to track products at the item level rather than at the shipping-container level.

Biosensors Lead Printed and Flexible Sensors Market

As per the research study, the global market for printed and flexible sensors is estimated to grow at a fast pace, due to which several investors are interested in pouring funds into the market. This is expected to create potential opportunities for commercialization and product innovation. In addition, several new players are also projected to participate in order to gain a competitive advantage in the market. In 2013, the global printed and flexible sensors market stood at US$6.28 bn and is projected to be worth US$7.51 bn by the end of 2020. The market is expected to register a healthy 2.50% CAGR between 2012 and 2020, as per the study.

The rapid growth in individual application segments and several benefits over the conventional sensors are some of the key factors driving the global market for printed and flexible sensors. In addition, the developing global market for Internet of Things is further anticipated to fuel the growth of the market in the next few years. On the flip side, several challenges in conductive ink printing are estimated to hamper the growth of the market for printed and flexible sensors in the near future.

Biosensors are the most extensively used and hold the largest share of the global market for printed and flexible sensors. Glucose strips incorporating a biosensor are one of the most sought-after ways to track and monitor glucose levels among diabetics, and thus account for a multi-billion-dollar segment of the market. Monitoring heart function, kidney disease, and cancer are other emerging applications where printed biosensor technology is being utilized.

The expanding automobile industry holds promise for piezoelectric type printed flexible sensors for performance testing during production. Due to these varied applications of printed and flexible sensors, the global market for printed and flexible sensors will expand at a slow but steady 2.5% CAGR in the next six years starting from 2012.

Follow us @IoTCtrl | Join our Community

Read more…

Soft Pasture

By Ben Dickson. This article originally appeared here.

The Internet of Things (IoT) is one of the most exciting phenomena of the tech industry these days. But there seems to be a lot of confusion surrounding it as well. Some think about IoT merely as creating new internet-connected devices, while others are more focused on creating value through adding connectivity and smarts to what already exists out there.

I would argue that the former is an oversimplification of the IoT concept, though it accounts for the most common approach that startups take toward entering the industry. It’s what we call greenfield development, as opposed to the latter approach, which is called brownfield.

Here’s what you need to know about greenfield and brownfield development, their differences, the challenges, and where the right balance stands.

Greenfield IoT development

In software development, greenfield refers to software that is created from scratch in a totally new environment. No constraints are imposed by legacy code, no requirements to integrate with other systems. The development process is straightforward, but the risks are high as well because you’re moving into uncharted territory.

In IoT, greenfield development refers to all these shiny new gadgets and devices that come with internet connectivity. Connected washing machines, smart locks, TVs, thermostats, light bulbs, toasters, coffee machines and whatnot that you see in tech publications and consumer electronic expos are clear examples of greenfield IoT projects.

Greenfield IoT development is adopted by some well-established brands as well as a lineup of startups that are rushing to climb the IoT bandwagon and grab a foothold in one of the fastest growing industries. It is much easier for startups to enter greenfield development because they have a clean sheet and no strings attached to past development.

But it also causes some unwanted effects. First of all, when things are created independent of each other and their predecessors, they tend to pull the industry in separate ways. That is why we see the IoT landscape growing in many different directions at the same time, effectively becoming a fragmented hodgepodge of incompatible and non-interoperable standards and protocols. Meanwhile, the true future of IoT is an ecosystem of connected devices that can autonomously inter-communicate (M2M) without human intervention and create value for the community. And that’s not where these isolated efforts are leading us.

Also, many of these companies are blindly rushing into IoT development without regard to the many challenges they will eventually face. Many of the ideas we see are plain stupid and make the internet of things look like the internet of gadgets. Nice-to-haves start to crowd out must-haves, and the IoT’s real potential for disruption and change will become obscured by the image of a luxury industry.

As is the case with most nascent industries, a lot of startups will sprout and many will wither and die before they can muster the strength to withstand the tidal waves that will wash over the landscape. And in their wake, they will leave thousands and millions of consumers with unsupported devices running buggy—and potentially vulnerable—software.

On the consumer side, greenfield products will impose the requirement to throw away appliances that should have worked for many more years. And who’s going to flush hundreds or thousands of hard-earned dollars down the drain to buy something that won’t necessarily solve a critical problem?

On the industrial side, the strain is going to be even more amplified. The costs of replacing entire infrastructures are going to be stellar, and in some cases the feat will be impossible.

This all doesn’t mean that greenfield development is bad. It just means that it shouldn’t be regarded as the only path to developing IoT solutions.

Brownfield IoT development

Again, to take a cue from software development, brownfield development refers to any form of software that is created on top of legacy systems or with the aim of coexisting with other software that is already in use. This imposes constraints and requirements that limit developers’ design and implementation decisions. The development process can become challenging and arduous, requiring the kind of meticulous analysis, design and testing that many upstart developers don’t have the patience for.

The same thing applies to IoT, but the challenges become even more accentuated. In brownfield IoT development, developers inherit hardware, embedded software and design decisions. They can’t deliberate on where they want to direct their efforts and will have to live and work within a constrained context. Throwing away all the legacy stuff will be costly. Some of it has decades of history, testing and implementation behind it, and manufacturers aren’t ready to repeat that cycle all over again for the sake of connectivity.

Brownfield is especially important in industrial IoT (IIoT), such as smart buildings, bridges, roads, railways and all infrastructure that have been around for decades and will continue to be around for decades more. Connecting these to the cloud (and the fog), collecting data and obtaining actionable insights might be even more pertinent than having a light bulb that can be turned on and off with your smartphone. IIoT is what will make our cities smarter, more efficient, and create the basis to support the technology of the future, shared economies, fully autonomous vehicles and things that we can’t imagine right now.

But as with its software development counterpart, brownfield IoT development is very challenging, and that’s why manufacturers and developers are reluctant, even loath, to engage in it. As a result, we’re missing out on a lot of the opportunities that IoT can provide.

So which is better?

There’s no single preference. There should be balance and coordination between greenfield and brownfield IoT development. We should see more efforts that bridge the gap between the many dispersed efforts in IoT development: a collective push toward establishing standards that ensure present and future IoT devices can seamlessly connect and combine their functionality and power. I’ve addressed some of these issues in a piece I wrote for TechCrunch a while back, and I think there’s a lot we can learn from the software industry. I’ll be writing about it again, because I think a lot needs to be done to have IoT development head in the right direction.

The point is, we don’t need to reinvent the wheel. We just have to use it correctly.

Read more…