
What is Going on with Residential IoT Cyber Security?

You have surely heard about the DDoS attacks on Dyn’s DNS service last October 21st. The news broke that many well-known Internet services were unavailable; according to Hacker News, Twitter, Etsy, Spotify and other sites were affected. Up to this point there is nothing new, just another DDoS attack. A large-company outage means big news, but there is still a key point in this equation that has not been addressed.

  • Was Residential or Consumer IoT affected?

According to Dyn’s report, the attack came from “100,000 malicious endpoints”.

In the second-to-last paragraph they state: “Not only has it highlighted vulnerabilities in the security of ‘Internet of Things’ (IoT) devices that need to be addressed, but it has also sparked further dialogue in the internet infrastructure community about the future of the internet.”

Put both quotes together: 100,000 IoT devices were hacked. That is astonishing.

There has been no news about how the 100,000 IoT device customers have been affected or supported:

  • Do they still have the Bot inside their device? 
  • Do the devices work correctly? 
  • Do they know they have been hacked? 
  • Do they know they are at risk? 
  • Will the Bots change and do other things? 
  • Will the Bots leave backdoors in their home networks?
  • How long will it take for another Bot to hack their IoT device?
  • What are Consumer Protection Agencies doing about this?
  • What are Governments doing?

This is no joke: we are talking about 100,000 devices and their customers, and it therefore has to be addressed very seriously.

Dyn and the Internet community will address the issue. That’s fine! But how and when will they solve the Residential IoT vulnerability problem? Residential IoT needs to be Secured, Monitored and its software Updated. Enterprise IoT already contemplates this, but Residential IoT does not. Individual devices are sold with no security, and in the best case, even if they are well developed and secured, they still need to be monitored, because software always has vulnerabilities, no matter how well and securely it has been developed.

All the questions above cannot be solved using security policies inside IoT devices or in the Internet itself. More has to be done! This is a game changer: Home Networks have to be monitored and secured to prevent Malware and Attacks. If not, the Internet will soon be like hell.

The Residential IoT Avalanche

Gartner estimates that by 2020 there will be 25 billion IoT devices; of these, 13 billion will be Residential Home Devices, more than 50% of the total. Imagine if only 1% of these devices are vulnerable: that is 130 million devices to hack.

  • Are the Internet Home Users aware of the risk they are taking?
  • Are their Home Networks and Gateways (GW/Router) secure?
  • Will the Internet itself be reliable and secure?

How to Secure Home Networks

Twenty years ago, Home Networks only had PCs with well-developed software, for example Windows, but many vulnerabilities were exploited to hack Residential and Enterprise PCs. This problem gave rise to many Anti-Malware (AM) software companies dedicated to safeguarding Windows PCs. The same is happening right now with Residential IoT.

IoT devices often cannot run AM software, or their suppliers are not interested in incorporating it. They are generally too small and run only specific, dedicated software; i.e., they cannot easily be protected with AM software embedded in the devices themselves:

  • This is a big problem. How can it be solved?
  • Where and how can AM software safeguard Home Networks, GWs and IoT?

Every Home Network connects to the Internet through the GW, which is the main door into our Home. As with Houses, shouldn’t an armored door be used to prevent thieves from coming in? The GW is the door to the Internet and it is also another device with CPU and Memory, a processing unit that can do the job. Why not use it to block hackers before they even get in? Thanks to FTTH and IoT itself, Gateways have become more powerful. If a GW does not have the power to cope with AM Security, then a security appliance should be connected to it. Using a secure GW, the entire Home Network will be protected from Malware and Attacks.
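
To make the idea concrete, here is a minimal sketch, in Python, of what gateway-side blocking could look like. The names and the blocklist are hypothetical placeholders, not any vendor's actual firmware API:

```python
# Minimal sketch: a gateway-side filter that drops outbound connections to
# destinations on a threat blocklist. Hypothetical example only.
MALICIOUS_DESTINATIONS = {"198.51.100.23", "botnet-c2.example.net"}

def allow_outbound(device_ip: str, destination: str) -> bool:
    """Return True to forward the connection, False to drop it at the GW."""
    if destination in MALICIOUS_DESTINATIONS:
        print(f"[gateway] blocked {device_ip} -> {destination}")
        return False
    return True

# Example: an IoT camera trying to reach a command-and-control host is dropped.
print(allow_outbound("192.168.1.40", "203.0.113.7"))            # True
print(allow_outbound("192.168.1.40", "botnet-c2.example.net"))  # False
```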

Many Security Providers and new startups have already foreseen the Secure GW solution.

Current Residential IoT/GW Security Innovation Trends

As described before, the most effective way to protect your Home IoT is to safeguard the Home Network using the GW. This is currently being done with two innovative solutions:

Solution #1: Attach a physical AM Security Appliance to the Home GW.

Solution #2: Embed AM Security software directly into the Home GW.

Solution #1 is an interesting and effective approach: another device with more CPU and Memory means more processing power, but it adds another gadget for the end user and it has to be physically connected to the Home GW’s 1 Gbit port.

The Pros: The Appliance adds an extra device to manage security, leaving the GW as it is. The customers will manage alerts and/or security configurations through a simple app on their smartphones. 

The Cons: All the traffic has to pass through the appliance over a 1 Gbit port, which needs a cable connected to the GW. Customers want fewer physical gadgets; they already have many, such as the GW itself, the IPTV DVB decoder, the ONT, game consoles, printers, cables, etc. Another device is not a bad solution, but the current trend is to reduce home devices and cables; this solution will work, but in a few years Solution #2 will make Solution #1 obsolete.

Solution #2: The security software either comes embedded in the GW device or is installed remotely.

The Pros: The customer will only manage alerts and/or security configurations, with a simple mobile app, that’s all. Simple, no physical appliance, no wires. 

The Cons: Many current GW hardware devices don’t have sufficient CPU and/or Memory capacity to run security software, but with the FTTH and IoT boom, Gateways are becoming more and more powerful, and in a few years most of them, if not all, will have the power to run AM software.

Make it Simple, Intelligent and Economically Viable for Retail

Both solutions have their pros and cons, and both should, at least, address basic security surveillance. Many threats can be addressed using Cloud Intelligent Processing to analyze Home Network metadata (freeing the GW CPU from many security tasks). But most important of all is the combined Residential Cloud Intelligence: for example, if a new threat is detected and blocked on one provider’s vulnerable IoT device, the fix will automatically be propagated to all of the security provider’s customers, avoiding mass propagation and hacking damage.

Residential Device “Internet Use Patterns” will be supervised and any mismatch will be reported to the customer or automatically be blocked if a malicious attacker is detected.
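
As a rough sketch of how such supervision could work (hypothetical names, assuming the gateway can see per-device traffic metadata; this is not any specific product's API), each device keeps a learned baseline, and traffic far outside that baseline raises an alert:

```python
# Sketch: flag a device whose traffic deviates from its learned baseline.
# Hypothetical gateway-side metadata monitoring, illustrative only.
from dataclasses import dataclass, field

@dataclass
class DeviceProfile:
    known_destinations: set = field(default_factory=set)
    avg_bytes_per_hour: float = 0.0

def is_anomalous(profile: DeviceProfile, destination: str, bytes_last_hour: int) -> bool:
    """Report a mismatch when the device talks to an unknown host or its
    hourly traffic jumps to 10x the learned average."""
    new_destination = destination not in profile.known_destinations
    traffic_spike = bytes_last_hour > 10 * max(profile.avg_bytes_per_hour, 1.0)
    return new_destination or traffic_spike

camera = DeviceProfile({"cloud.vendor.example"}, avg_bytes_per_hour=2_000_000)
print(is_anomalous(camera, "cloud.vendor.example", 1_500_000))  # False: normal behaviour
print(is_anomalous(camera, "203.0.113.9", 80_000_000))          # True: alert the customer
```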

Customers don’t, or cannot, properly maintain their Home IoT. The solution should control possible problems such as vulnerable firmware, recommend changing easy or default passwords, block dangerous port access, grant or deny access, etc. Most of these simple actions will be prompted on the user’s smartphone, and the problem will easily be solved through a simple one-click menu.

And finally, and probably most importantly, customers don’t want to and can’t pay for a highly sophisticated solution. A next-generation-firewall-type solution is out of scope and too expensive; the solution has to be smart and economically viable or sales will fall flat.

There is no need to drill down into what can and cannot be done; both solutions are effective. Solution #1 is good, but #2 sits at the core of the Home Network, the GW, and is simpler for the end user, though it may take some time before all GWs have sufficient power and capacity.

Conclusions

  • There are millions of Residential IoT Devices being hacked, but most users are unaware and the press doesn’t really talk about it.
  • Residential IoT is in general insecure, and with the predicted IoT Avalanche, hackers will take advantage of the situation and make the Internet hell.
  • Residential IoT must be Secured, Monitored and its software Updated using the Home GW Router.
  • Make it Simple, Intelligent and Economically Viable for Retail.
  • IoT Residential Customers must be 100% aware of the security risks; this must be strongly driven by Consumer Agencies, Governments, the Press, IoT Suppliers and Security Vendors.

If the security actions described in this publication are not addressed correctly, the Internet and all of us will have to learn the hard way. 

Juan Mora Zamorano

Independent Security Contractor

https://es.linkedin.com/in/morajuan

Read more…

IoT Future – 34 Billion new Devices in 4 Years?

Many industry experts and consumers point to the Internet of Things (IoT) as the next Industrial Revolution, or the next Internet.

Why? Simply because the IoT will be the future form of interaction between businesses, governments, consumers and the physical world.

The most recent studies indicate that by 2020 more than 34 billion devices will be connected to the internet, across many sectors (industrial, agriculture, transportation, wearable devices, smart cities, smart houses, etc.).

Of these 34 billion, the IoT will account for 23 billion devices; the other 11 billion will be regular devices such as smartphones, tablets, smartwatches, etc.

Figure: IoT device evolution graph (Source: BI Intelligence)

The business sector will account for the biggest share of these devices, since the IoT can reduce operational costs, increase production, and expand businesses into new market niches.

Government will take the second biggest share of connected devices, in smart cities, speeding up public processes and increasing citizens’ quality of life.

Last but not least, home users will have plenty of IoT devices: smart houses, wearable devices and more.

So the future can really be summed up in a few words: "The future is Data".

Read more…

12 Steps to Stop the Next IoT Attack in its Tracks

The recent distributed denial-of-service (DDoS) IoT attack against DNS is a wake-up call showing how fragile the Internet can be.

The IoT attack against Domain Name Servers from a botnet of thousands of devices means it’s way past time to take IoT security seriously. The bad actors around the world who previously used PCs, servers and smartphones to carry out attacks have now set their sights on the growing tidal wave of IoT devices. It’s time for consumers and enterprises to protect themselves and others by locking down their devices, gateways and platforms. While staying secure is a never-ending journey, here’s a list of twelve actions you can take to get started:

  1. Change the default usernames and passwords on your IoT devices and edge gateways to something strong.
  2. Device telemetry connections must be outbound-only. Never listen for incoming commands or you’ll get hacked.
  3. Devices should support secure boot with cryptographically signed code by the manufacturer to ensure firmware is unaltered.
  4. Devices must have enough compute power and RAM to create a transport layer security (TLS) tunnel to secure data in transit.
  5. Use devices and edge gateways that include a Trusted Platform Module (TPM) chip to securely store keys, connection strings and passwords in hardware.
  6. IoT platforms must maintain a list of authorized devices, edge gateways, associated keys and expiration dates/times to authenticate each device.
  7. The telemetry ingestion component of IoT platforms must limit IP address ranges to just those used by managed devices and edge gateways.
  8. Since embedded IoT devices and edge gateways are only secure at a single point in time, IoT platforms must be able to remotely update their firmware to keep them secure.
  9. When telemetry arrives in an IoT platform, the queue, bus or storage where data comes to rest must be encrypted.
  10. Devices and edge gateways managed by an IoT platform must update/rotate their security access tokens prior to expiration.
  11. Field gateways in the fog layer must authenticate connected IoT devices, encrypt their data at rest and then authenticate with upstream IoT platforms.
  12. IoT platforms must authenticate each device sending telemetry and blacklist compromised devices to prevent attacks.
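
As a small illustration of items 2 and 4 above, here is a hedged Python sketch of a device-side sender that only ever opens outbound connections and wraps them in TLS. The endpoint name and payload are placeholders, not a real service:

```python
# Sketch: outbound-only telemetry over TLS. The device initiates the
# connection and never listens for inbound commands. Host is a placeholder.
import json
import socket
import ssl

def send_telemetry(reading: dict, host: str = "ingest.example.com", port: int = 443) -> None:
    context = ssl.create_default_context()  # verifies the server certificate chain
    with socket.create_connection((host, port), timeout=10) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(json.dumps(reading).encode("utf-8"))

# Example call (commented out because the endpoint is fictitious):
# send_telemetry({"device_id": "sensor-42", "temp_c": 21.5})
```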

Keeping the various components that make up the IoT value chain secure requires constant vigilance. In addition to doing your part, it’s important to hold the vendors of the IoT devices, gateways and platforms accountable for delivering technology that’s secure today and in the future.

Read more…

What’s inside the Internet of Things?

The Internet has evolved to the point where it is simply there for us at all times. We don’t even think about how we connect to a network or analyze the technical details of the connection, just as we don’t care who our communications service provider is. Widespread Wi-Fi penetration and the gradual rollout of IPv6 enable thousands of simple devices and trackers to interoperate continuously and send data “to the cloud”. Fast infrastructure advancement has resulted in the older Machine-to-Machine (M2M) term being replaced by the more up-to-date Internet of Things (IoT).

While they build up a sort of distributed intelligence, IoT devices still need centralized management: a system or service able to fine-tune the devices, provide storage and interpret collected data. As the “brain” of the device cloud infrastructure, the management system also enlarges machine knowledge bases and updates device software.

Operators study data aggregated by groups or time periods and visualize it. This data is then delivered to various Business Intelligence Systems for more detailed analysis. Curiously enough, even if we speak about personal devices (e.g. fitness trackers), almost every cloud service operator analyses the collected data usage statistics anonymously for further device/service development.

Development of IoT devices is becoming simpler and cheaper, enabling small companies to enter the market. Plenty of businesses realize the need to build a management system, but they underestimate its development complexity and ignore the need for industrial server technologies (such as failover clustering and multi-server distributed architecture). Typically, such development starts in house. IoT devices successfully introduced to the market lead to rapid growth in users, causing long-term problems with service scaling and performance.

Anticipating further problems and being unable to form a server-based software development team quickly, IoT operators usually outsource the central system development and focus on devices only. Yet this doesn’t solve the problem, as third-party developers start building the system from scratch, lacking the time and resources to apply serious technologies.

The AggreGate Platform was born in 2002. At that time we were producing serial-over-IP converters and needed a central server that would transmit data between converters hidden behind firewalls or NAT and thus unable to communicate directly. The first product version, called LinkServer, was written in C++ and was available only as a service that simply transmitted data flows without any processing.

A short while later our converters developed into freely programmable controllers. They “understood” the data flowing through them, so we wanted the central server to do the same. At about the same time we realized that 90% of the time spent developing a monitoring and device management system goes into reinventing the wheel, with very little effort put into solving actual business problems.

Since 2004 the system, ported to Java, has evolved into a framework for device management. For quite a few years we worked without a clear understanding of the result we wanted to achieve. Fortunately, we avoided working with a single customer or in a single industry by keeping our system flexible.

Now AggreGate Platform is applied to a great variety of industries, including Remote Monitoring and Service, IT Infrastructure and Network Monitoring, SCADA/HMI and Process Automation, Access Control, Building Automation, Fleet Management, Vending Machine and Self-service Kiosk Management, Sensor Network Monitoring, People and Vehicle Counting, Centralized Event and Incident Management, Digital Signage and Mobile Device Management.

 

Major Platform Tasks

Figuratively speaking, AggreGate is a LEGO constructor for prompt device cloud interface development. Allowing IoT solution architects to focus mainly on hardware and business logic, it solves the following infrastructure tasks:

  • Maintaining communication between servers and devices connected via unreliable cellular and satellite links
  • Unified approach to device data regardless of its physical meaning
  • Storing large volumes of collected events and historical data in various databases (relational, round-robin, NoSQL)
  • Visual building of complex source data analysis and event correlation chains
  • Modeling the integration of data from multiple devices and the calculation of all infrastructure KPIs
  • Fast operator and system engineer interface building using out-of-the-box “bricks” without any coding
  • Implementing integration scenarios via ready-to-use universal connectors (SQL, HTTP/HTTPS, SOAP, CORBA, SNMP, etc.)

 

System Unification

Being universal, AggreGate Platform unites various monitoring and management systems. It helps avoid extra integration points and decreases the number of integration scenarios. For example, the integrated monitoring system has a single integration point with Service Desk/ITSM/Maintenance Management systems for incident (alert) delivery. It also integrates with Inventory/Asset Management systems for collecting information on available physical assets and their influence on business services.

In such cases, role-based access control provides various departments with customized system scenarios and unique operator interfaces.

Platform Architecture

The Platform includes the following essential components:

  • Server is a Java-based application providing communication with devices, data storage and automated data processing. Servers can be grouped into clusters for high availability and keep peer-to-peer relations in distributed installations. AggreGate Server manages an embedded web server which in turn serves the web interfaces.
  • Unified Console is cross-platform desktop client software enabling simultaneous work with one or several servers in administrator, system engineer or operator mode.
  • Agent is a library that can be integrated into IoT device firmware to handle communication with servers, unify device setup, perform operations on a device and send asynchronous events. Libraries are available for several languages (Java, .NET, C/C++, Android Java, etc.). There is no need to deploy an agent if communication with the server uses standard or proprietary protocols; in that case a separate device driver is developed for the server. The agent can also be implemented as a separate hardware device (a gateway).
  • Open-source API for extending the functionality of all other components and implementing complex integration scenarios.

The Server supervises reading device data and writing changes back; this process is called bidirectional synchronization. The server keeps a device snapshot containing the last values of device metrics, along with changes made by operators or system modules that have not yet been written to a device due to communication downtime. Configuration changes are delivered to devices on a “best effort” basis, making it possible to configure device groups even if some devices are offline.

The Server also receives and processes incoming connections from devices that have no public static IP addresses.

Device data and events merge into a unified data model. Within this model, each device is represented as a so-called context in a hierarchical context structure. Each context includes a set of formalized data elements of three types: variables (properties, settings, attributes), functions (methods, operations), and events (notifications). A context also contains metadata describing all available elements. Therefore, all context data and metadata are entirely stored in the current context. This technology is called device normalization. Device drivers and agents create a normalized presentation of various device types.

There are some parallels with object-oriented programming, where objects typically have properties, events and methods. Properties are internal device variables, methods are operations performed by a device, and events describe how a device notifies the server of internal data or environment changes.

Virtually any device can be described as a set of properties, methods and events. For example, a remotely controlled water tank can have a “water level” property to show the current amount of water in the tank and “turn valve on/off” methods to control the valve letting the water into/out of the tank. This smart water tank may also generate a number of notifications, such as “nearly empty”, “nearly full” and “overflow”. We have developed more than 100 Java-based drivers, and the normalization concept has also proved to be an advantage. Moreover, a lot of current “universal” protocols (such as OPC UA, JMX or WMI) use similar data models.
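
To make the normalization idea concrete, here is a small Python sketch of the water-tank example as a context holding variables, functions and events. It is illustrative only and does not reproduce AggreGate's actual context API:

```python
# Sketch: a "normalized" device exposed as variables (properties),
# functions (operations) and events, mirroring the water-tank example.
from typing import Callable, Dict, List

class DeviceContext:
    def __init__(self, name: str):
        self.name = name
        self.variables: Dict[str, float] = {}
        self.functions: Dict[str, Callable[[], None]] = {}
        self.event_log: List[str] = []

    def emit(self, event: str) -> None:
        self.event_log.append(event)  # e.g. "nearly empty", "nearly full", "overflow"

tank = DeviceContext("water_tank_1")
tank.variables["water_level"] = 0.42                              # property
tank.functions["open_valve"] = lambda: tank.emit("valve opened")  # operation
tank.functions["close_valve"] = lambda: tank.emit("valve closed")

tank.functions["open_valve"]()                 # invoke an operation on the device
if tank.variables["water_level"] > 0.95:
    tank.emit("nearly full")                   # event notifying the server
print(tank.event_log)
```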

All Server contexts are a part of a hierarchical structure called context tree. Though the contexts match diverse objects (devices, users, reports, alerts, etc.), they have a unified interface and can interoperate within the server context tree, offering a high level of flexibility. The same principle enables various servers to interact in a distributed installation.


Every connected device allows operators to perform direct configuration (device configuration reading and modification), direct management (forcing device operation performance manually), and direct monitoring (viewing the device event log in near-real-time mode).

Events and changes of device metric values are stored in the server storage. Depending on the system task, the storage type can vary. For example, a Raspberry Pi micro-server uses the simplest file storage, while the central server of a distributed installation can use a NoSQL-based Apache Cassandra cluster, storing tens of thousands of events per second out of an original stream of hundreds of thousands of events per second.

However, in most cases a regular relational database is used as storage. Using an ORM layer (Hibernate) provides compatibility with MySQL, Oracle, Microsoft SQL Server, PostgreSQL, and other DBMSs.

Device data and events affect the life cycle of active server objects allowing the server to react to environmental condition changes. These active objects include:

  • Alerts, which convert a particular object state or chain of events into a new event type called an incident
  • Models, which convert source events and values into user-defined events and value types by applying business rules
  • Scheduler, which ensures tasks are performed on schedule, even after server downtime
  • Sensors and several other object types


Active objects are able to add new types of variables, functions and events in the unified data model, send custom variables and event changes to storage, and invoke device and other object operations in automated mode.

You can use widgets for building data entry forms, tables, dynamic maps, charts and HMIs. They can be combined in dashboards, both global (based on aggregated KPIs and showing the whole infrastructure state) and object-oriented (displaying a single device or infrastructure component state).


Widgets and report templates are built in specialized visual editors seamlessly integrated into the AggreGate Platform ecosystem. The GUI Builder helps design complex interfaces consisting of multiple nested containers with visual components. In addition to the absolute layout typical of such editors, you can use a grid layout familiar to anyone who has edited tables in an HTML page. The grid layout makes it possible to build scalable, multi-size data entry forms and tables.

As a result, first-line or second-line operator interfaces developed by using data visualization tools include dashboards with widgets, forms, tables, diagrams, reports, HMIs, and navigation between them.

The GUI Builder supports dozens of out-of-the-box components, such as captions, text fields, buttons, checkboxes, sliders as well as spinners, lists, date/time selectors, scales, and pointers. Among more complex components are trees, video windows, dynamic vector SVG images, geographical maps based on Google Maps/Bing/Yandex/OpenStreetMap. The list of supported diagrams includes classic charts, statistics charts, Gantt charts, and polar charts.

All widgets designed in the GUI Builder also operate via the web interface, including in non-Java browsers, i.e. on mobile devices. Only HTML5 and JavaScript support is required.

Properties related to server objects (devices, models, alerts) and UI components are linked together using bindings. Such bindings define when and where data should be taken, how to process it and where to place the results. While processing data, the bindings use expression and query languages.

A binding using an expression resembles a Microsoft Excel formula. Such a formula takes data from several cells, applies mathematical operations or data processing functions to it, and places the result into the current cell. An expression is likewise a formula describing where data should be taken from and what sort of changes to apply to it.
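
As a conceptual sketch only (AggreGate's real expression and query syntax differs), a binding can be pictured as: read source properties, evaluate a formula over them, and write the result into a target UI property, much like a spreadsheet cell:

```python
# Conceptual sketch of a binding: source properties -> expression -> target property.
# Illustrative only; this is not AggreGate's expression or query syntax.
from typing import Callable, Dict

def apply_binding(model: Dict[str, float],
                  expression: Callable[[Dict[str, float]], float],
                  ui: Dict[str, float],
                  target: str) -> None:
    """Evaluate the expression against the model and place the result into
    the target UI property, like a formula filling a spreadsheet cell."""
    ui[target] = expression(model)

model = {"tank.water_level": 0.42, "tank.capacity_l": 500.0}
widget: Dict[str, float] = {}
apply_binding(model,
              lambda m: m["tank.water_level"] * m["tank.capacity_l"],
              widget, "gauge.value")
print(widget)  # {'gauge.value': 210.0}
```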

The query language is very similar to regular SQL. It also aggregates data from various tables into one by using filtering, sorting, grouping, etc. The difference between the classic SQL and the embedded query language is that the latter uses virtual tables built on-the-fly from diverse unified model data as a source. Every query checks operator/system object access permissions automatically. With this in mind, the query language has an obvious advantage over direct SQL queries to the server database.

To solve more challenging data processing tasks, you can easily write a script in Java or even a dedicated plugin. However, every script written for data processing by any of our partners is a warning to us: why does one need a platform if classic development in a familiar environment (such as Eclipse or IDEA) is still required?

And finally, a few words about the distributed architecture technology. Our concept implies customizable peering relationships between servers, so that one server (the provider) links a part of its unified data model to another server (the consumer). This allows the consumer server’s objects to interact with the provider server’s objects as equals. A single server can have an unlimited number of links; moreover, such a server can be both a provider and a consumer with respect to neighboring servers.

Distributed architecture makes it possible to solve various large-scale system tasks:

  • Horizontal system scaling.
  • Distributed monitoring with local monitoring server installations and intermediate data storage at remote sites.

  • Vertical scaling, dividing functions between servers into several logical levels.

Read more…

Interactive Map of IoT Organizations -- TAKE 2

I am excited to launch the 2nd version of my Interactive Map of IoT Organizations. Thanks for all the support and encouragement from David Oro!

https://www.diku.ca/blog/2016/12/04/interactive-map-of-iot-organizations-take-2/

Here are the material changes from the first version:

  1. Each organization now has its specific address instead of being city-based.
  2. The map now includes the Founder(s) of each organization and a link to more information about them. This is in addition to the “Founded” year, which was in the first version.
  3. Cleanup of categories. Folks are still trying to determine what it means to be an IoT Platform. For me, it’s most important to focus on standards and integration of systems, as there will be organizations that specialize in one aspect of an IoT platform, whether it’s the analytics, rules engine, device management, workflow, or visualization functions.
  4. The initial launch of the map had 246 organizations; this new map has 759. Thanks to the many people on LinkedIn and in blog comments who suggested their companies, which accounted for 180 additional organizations. The other 330+ organizations I found on my own by trawling news, Twitter, IoT conference web sites, and the “Partners” sections of each organization.

I set up a Twitter account @EyeOhTee and although I still need to tweet more, you may see some interesting news on there and feel free to tweet out this post, plug plug!

Besides the basic data shown on the map, I also track many more attributes of each product. I will publish additional findings and analysis on this blog and here on IoT Central.

I hope you find the map useful and I would love to hear if, and how, it has helped you. Whether you located a company in your area to collaborate with, found a supplier for a problem you are trying to solve, or are just learning like me, it will have been worth the time I spend on this.

BGJ

Read more…

TPS-based Office Aircon Controller Application

Figure 1: TPS-based aircon controller

Air conditioning consumes a lot of energy, and air conditioners are expensive to run. In general, AC systems, older ones in particular, do not have any real temperature feedback. You set the temperature on your remote, but alas, it has absolutely nothing to do with the actual temperature in the room. Even when it gets colder outside, many aircons keep blasting cold air into your space. As a result, you have to constantly readjust the temperature as needed for optimal comfort throughout the day.

No doubt, AC systems are improving day by day, but there are still old systems that cannot be updated. In some instances, it’s absolutely impossible to invest in a new system. Sometimes it is simply impractical to rip the old aircon out and install a new one: a basic aircon has many parts that are typically split between an outside and an inside unit, so you may have to undergo a drastic interior renovation. In the Tibbo office in Taipei, we are trapped in exactly this situation. We just have to get by with the AC system we’ve got. Our aircon is controlled with a dozen infrared remotes lying around.

Some time ago, we set out to create a management system for our dated HVAC system. We used Tibbo Project System (TPS) for this endeavor. Our spec for the aircon controller consisted of exactly two items:

  • The aircon must run or not run depending on whether the lights are on or off. The formula is simple: no lights = no people = no need to run the AC.
  • The temperature in the room must be monitored by the device, which stops the aircon whenever the room has cooled to the preset point.

To achieve our goal, we used a TPS2L system equipped with these Tibbits:

  • Ambient temperature probe
  • IR code processor Tibbit (#26)
  • IR front-end Tibbit (#27)
  • Ambient light sensor Tibbit (#28)

Let us tell you about the probe. The probe replaces the ambient temperature meter (Tibbit #29). It is nice to have the meter built right into the TPS. The problem is, the meter is affected by the internal heat of the TPS system itself. This influence is especially noticeable for the TPS2L device – its LCD really warms up the box! The new probe has the same circuit as the Tibbit #29, with the added benefit of being external to the TPS device. Now the measurements are accurate.

Here is a look at the items you need to set up in the menu:

IR commands. This is where you train your IR code processor to be able to transmit two commands: “On,” and “Off.” For the “On” command, use the lowest temperature that your aircon’s remote allows you to set (usually 16 degrees C). The logic here is that when you need to lower the temperature in the room you can use the coldest temperature setting, and when the room cools down to the preset temperature, the aircon is turned off. So really, you only need two commands.

Target temperature. You don’t need to set it here. There are dedicated buttons on the main screen.

Pre-cool start time. This is something we added along the way. Now it is possible to turn the aircon on, once a day, even before we all arrive at the office. Our day starts at 9 am. We set this time for 8:30 am, and by the time we get in, the office is nice and cool (while the scorching Taipei summer keeps on raging outside). The pre-cool timer is hardcoded for 45 minutes. If the lights are still off at 9:15 the aircon is turned off.

Brightness threshold. This is the brightness that the TPS will consider to correspond to “lights on.” The value is not expressed in any standard measurement units; it’s just the value the Tibbit #28 returns. So, how do you know what number to set here? Simple: the brightness is displayed on the main screen, like this: “Light level: 718”. Note the value with the lights off and on, then set the threshold to some value in the middle between the two.

Temp. meas. adjustment. This is useful when you choose to use the Tibbit #29. As we’ve explained above, its measurements are affected by the internal heat of the TPS itself. You can use a regular thermometer to determine the measurement error. For example, if your thermometer reads 25C and the TPS shows 28C, then you must adjust the reading down by 3 degrees C. The data returned by the new external probe needs no adjustment.
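
Putting the menu items above together, the control logic boils down to a small decision loop. The sketch below is written in Python purely for readability (the real controller runs Tibbo BASIC/C on the TPS), with hypothetical helper names standing in for the Tibbit readings and IR commands, and the pre-cool timer omitted:

```python
# Sketch of the aircon controller's decision step. read_brightness,
# read_temperature and send_ir are placeholders for the Tibbit #28 reading,
# the external temperature probe, and the Tibbit #26/#27 IR commands.
BRIGHTNESS_THRESHOLD = 500   # midpoint between "lights off" and "lights on" readings
TARGET_TEMP_C = 24.0

def control_step(read_brightness, read_temperature, send_ir, aircon_on: bool) -> bool:
    """Return the new on/off state of the aircon after one control pass."""
    lights_on = read_brightness() > BRIGHTNESS_THRESHOLD
    temp_c = read_temperature()

    if not lights_on:                          # no lights = no people = no AC
        if aircon_on:
            send_ir("OFF")
        return False
    if temp_c > TARGET_TEMP_C and not aircon_on:
        send_ir("ON")                          # "ON" is the remote's coldest setting (16 C)
        return True
    if temp_c <= TARGET_TEMP_C and aircon_on:
        send_ir("OFF")                         # room has cooled to the preset point
        return False
    return aircon_on

# Example pass: lights are on and the room is warm, so the aircon is switched on.
state = control_step(lambda: 718, lambda: 27.5, lambda cmd: print("IR:", cmd), False)
print(state)  # True
```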

Further work

In phase 2 of this project we will connect our aircon controller to an AggreGate server. It will be possible to control the system via a smartphone app, which we are going to design for this purpose. Now you know why our configuration menu has items like Network, AggreGate, etc. Stay tuned!

Figure 2: Aircon

Read more…

Tibbo Project System (TPS) is a highly configurable, affordable, and innovative automation platform. It is ideal for home, building, warehouse, and production floor automation projects, as well as data collection, distributed control, industrial computing, and device connectivity applications.

Suppliers of traditional “control boxes” (embedded computers, PLCs, remote automation and I/O products, etc.) typically offer a wide variety of models differing in their I/O capabilities. Four serial ports and six relays. Two serial ports and eight relays. One serial port, four relays, and two sensor inputs. These lists go on and on, yet never seem to contain just the right mix of I/O functions you are looking for.

Rather than offering a large number of models, Tibbo Technology takes a different approach: Our Tibbo Project System (TPS) utilizes Tibbits® – miniature electronic blocks that implement specific I/O functions. Need three RS232 ports? Plug in exactly three RS232 Tibbits! Need two relays? Use a relay Tibbit. This module-based approach saves you money by allowing you to precisely define the features you want in your automation controller.

Here is a closer look at the process of building a custom Tibbo Project System.

Start with a Tibbo Project PCB (TPP)

 

 

A Tibbo Project PCB is the foundation of TPS devices.

Available in two sizes – medium and large – each board carries a CPU, memory, an Ethernet port, power input for +5V regulated power, and a number of sockets for Tibbit Modules and Connectors.

Add Tibbit® Blocks

Tibbits (as in “Tibbo Bits”) are blocks of prepackaged I/O functionality housed in brightly colored rectangular shells. Tibbits are subdivided into Modules and Connectors.

Want an ADC? There is a Tibbit Module for this. 24V power supply? Got that! RS232/422/485 port? We have this, and many other Modules, too.

Same goes for Tibbit Connectors. DB9 Tibbit? Check. Terminal block? Check. Infrared receiver/transmitter? Got it. Temperature, humidity, and pressure sensors? On the list of available Tibbits, too.

Assemble into a Tibbo Project Box (TPB)

Most projects require an enclosure. Designing one is a tough job. Making it beautiful is even tougher, and may also be prohibitively expensive. Finding or making the right housing is a perennial obstacle to completing low-volume and hobbyist projects.

Strangely, suppliers of popular platforms such as Arduino, Raspberry Pi, and BeagleBone do not bother with providing any enclosures, and available third-party offerings are primitive and flimsy.

Tibbo understands enclosure struggles and here is our solution: Your Tibbo Project System can optionally be ordered with a Tibbo Project Box (TPB) kit.

The ingenious feature of the TPB is that its top and bottom walls are formed by Tibbit Connectors. This eliminates a huge problem of any low-volume production operation – the necessity to drill holes and openings in an off-the-shelf enclosure.

The result is a neat, professional-looking housing every time, even for projects with a production quantity of one.

Like boards, our enclosures are available in two sizes – medium and large. Medium-size project boxes can be ordered in the LCD/keypad version, thus allowing you to design solutions incorporating a user interface.

 

Unique Online Configurator

To simplify the process of planning your TPS we have created an Online Configurator.

The Configurator allows you to select the Tibbo Project PCB (TPP), “insert” Tibbit Modules and Connectors into the board’s sockets, and specify additional options. These include choosing whether or not you wish to add a Tibbo Project Box (TPB) enclosure, LCD and keypad, DIN rail mounting kit, and so on. You can choose to have your system shipped fully assembled or as a parts kit.

Configurator makes sure you specify a valid system by watching out for errors. For example, it verifies that the total power consumption of your future TPS device does not exceed available power budget. Configurator also checks the placement of Tibbits, ensuring that there are no mistakes in their arrangement.
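
As a hedged sketch of the kind of validation described here (the numbers and the function are illustrative placeholders, not the actual Online Configurator logic or Tibbo's published figures), a power-budget check simply sums the consumption of the selected Tibbits and compares it to what the board can supply:

```python
# Sketch: verify that the selected Tibbits stay within the board's power budget.
# Values and names are illustrative placeholders only.
def check_power_budget(tibbit_current_ma: dict, budget_ma: float) -> bool:
    total = sum(tibbit_current_ma.values())
    if total > budget_ma:
        print(f"Invalid configuration: {total} mA requested, {budget_ma} mA available")
        return False
    return True

selected = {"RS232 port": 20, "relay": 70, "ambient light sensor": 5}
print(check_power_budget(selected, budget_ma=500.0))  # True: within budget
```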

Completed configurations can be immediately ordered from our online store. You can opt to keep each configuration private, share it with other registered users, or make it public for everyone to see.

Develop your application


Like all programmable Tibbo hardware, Tibbo Project System devices are powered by Tibbo OS (TiOS).

Use our free Tibbo IDE (TIDE) software to create and debug sophisticated automation applications in Tibbo BASIC, Tibbo C, or a combination of the two languages.

To learn more about the Tibbo Project System click here

Read more…

OPC Server from Tibbo Technology

OPC (“Open Platform Communications”) is a set of standards and specifications for industrial telecommunication. OPC specifies the transfer of real-time plant data between control devices from different manufacturers. It was designed to bridge process-control hardware and Windows-based software applications through a common interface, and aimed to reduce the duplicated effort required of hardware manufacturers and their software partners.

 

The most typical OPC specification, OPC Data Access (OPC DA), is supported by Tibbo OPC Server. Any device compatible with the Tibbo AggreGate protocol can be a data source. AggreGate is a white-label IoT integration platform using up-to-date network technologies to control, configure, monitor and support electronic devices, along with distributed networks of such electronic devices. It also helps you collect device data in the cloud, where you can slice and dice it in alignment with your needs. In addition, the platform lets other enterprise applications transparently access this data via the AggreGate server.

Tibbo OPC Server has the AggreGate network protocol embedded. It can both interact with any Tibbo device via the AggreGate agent protocol and connect to an AggreGate server. Open-source implementations of the AggreGate agent protocol are published for the Java, C#, and C++ programming languages, so your connection scheme is not restricted to AggreGate servers or Tibbo devices only.

 

Examples

A simple example: TPS reads Tibbit #29 (Ambient temperature meter) and forwards data to OPC server via AggreGate agent protocol.

A more complex example: we have a Windows-based PC controlling a wood processing machine by means of AggreGate server through the Modbus protocol. If Tibbo OPC server is linked with AggreGate server, the data from the machine is sent to Tibbo OPC server, and therefore, we can operate and monitor the machine via any OPC client.

Technical Specification

  • Compatibility with Windows XP/2003 or later (Microsoft Visual C++ 2013 redistributable is required - installed automatically)

  • Support of DA Asynchronous I/O 2.0 and Synchronous I/O with COM/DCOM technology

Tibbo OPC Server transmits the information on the Value, Quality and Timestamp of an item (tag) to the OPC Client applications. These fields are read from the AggreGate variables.

 

The process values are set to Bad [Configuration Error] quality if OPC Server loses communication with its data source (AggreGate Agent or AggreGate Server). The quality is set to Uncertain [Non-Specific] if the AggreGate variable value is empty.

The table below shows the concordance between AggreGate variable types and OPC data types:

AggreGate Data Type | OPC Data Type
INTEGER   | VT_I4
STRING    | VT_BSTR
BOOLEAN   | VT_BOOL
LONG      | VT_I8
FLOAT     | VT_R4
DOUBLE    | VT_R8
DATE      | VT_DATE
DATATABLE | VT_BSTR (by default)
COLOR     | VT_I4
DATA      | VT_BSTR
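
For illustration, the concordance above could be expressed as a simple lookup. This is only a sketch of the mapping from the table, not part of the actual server:

```python
# Sketch: the AggreGate-to-OPC type concordance from the table above as a lookup.
AGGREGATE_TO_OPC = {
    "INTEGER": "VT_I4",  "STRING": "VT_BSTR", "BOOLEAN": "VT_BOOL",
    "LONG": "VT_I8",     "FLOAT": "VT_R4",    "DOUBLE": "VT_R8",
    "DATE": "VT_DATE",   "COLOR": "VT_I4",    "DATA": "VT_BSTR",
    "DATATABLE": "VT_BSTR",  # by default
}

def opc_type_for(aggregate_type: str) -> str:
    return AGGREGATE_TO_OPC[aggregate_type.upper()]

print(opc_type_for("float"))  # VT_R4
```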

To learn more about Tibbo OPC server, click here

Read more…

Will There Be A Dominant IIoT Cloud Platform?

When you think about consumer cloud platforms, which ones come to mind? Amazon AWS, Microsoft Azure and Google’s Cloud Platform are likely to be at the top of your list. But what about industrial cloud platforms? Which ones rise to the top for you? Well, GE’s Predix, Siemens’ MindSphere, and the recently announced Honeywell Sentience are likely to be on any short list of industrial cloud platforms. But they aren’t the only ones in this space. Cisco’s Jasper, IBM’s Watson IoT, Meshify, Uptake, and at least 20 others are competing to manage all those billions of sensors that are expected to make up the Industrial Internet of Things (IIoT). Which one do you think will end up dominating the market?

A Brief Overview of Cloud Computing

To answer the above question, let's start with a very brief overview of cloud computing to put industrial cloud platforms in their proper context. Cloud platforms are one of several services that cloud computing providers offer, with the main ones being: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).

  • Infrastructure as a Service (IaaS) – provides processing, storage, networks, and other basic computing resources. IaaS is used to deploy other services (i.e., PaaS and SaaS), as well as web applications. Customers don't manage or control the underlying infrastructure, but they are able to access the servers and storage and can operate a virtual data center in the cloud. There are a lot of IaaS providers. The biggest ones are Amazon Web Services (AWS), Windows Azure, Google Compute Engine, Rackspace Open Cloud, and IBM SmartCloud Enterprise. (GE’s Predix platform for the Industrial Internet will be available to run on the Microsoft Windows Azure Cloud.)
  • Platform as a Service (PaaS) – used for application development, providing cloud components to software. A PaaS makes it quick and easy to develop, test, and deploy applications. Customers can't control or manage the underlying infrastructure, but they can control the applications and the configuration of the application-hosting environment. GE’s Predix, Honeywell's Sentience, and Siemens' MindSphere are PaaS offerings for industrial applications. Software development firms such as Cumulocity, Bosch IoT, and Carriots also provide PaaS offerings for industry.
  • Software as a Service (SaaS) – delivers cloud-based applications to a user via a web browser or a program interface. The advantage of a SaaS is that you don't need to run or install specific applications on individual computers, which not only saves time and money but also simplifies maintenance and support. Common SaaS examples include Google Apps and Cisco WebEx. Siemens' Industrial Machinery Catalyst on the Cloud is an example of an industrial SaaS (using AWS infrastructure).

Figure: The three main services of cloud computing (SaaS, PaaS, IaaS)

Industrial Cloud Platforms by Industrial Companies

Industrial cloud platforms have a much deeper focus on operational technology than consumer platforms. They are designed to allow data gathering throughout manufacturing production processes, in order to improve performance and predict failures before they happen. Here are three industrial cloud platforms from long-time industrial companies:

  • GE Predix is a platform-as-a-service (PaaS) specifically designed for industrial data and analytics. It can capture and analyze the unique volume, velocity and variety of machine data within a highly secure, industrial-strength cloud environment. GE Predix is designed to handle data types that consumer cloud services are not built to handle.
  • Siemens MindSphere is an open platform, based on the SAP HANA (PaaS) cloud, which allows developers to build, extend, and operate cloud-based applications. OEMs and application developers can access the platform via open interfaces and use it for services and analysis such as the online monitoring of globally distributed machine tools, industrial robots, or industrial equipment such as compressors and pumps. MindSphere also allows customers to create digital models of their factories with real data from the production process.
  • Honeywell Sentience is the recently announced cloud infrastructure from Honeywell Process Solutions. It is a secure, scalable, standards-based “One Honeywell” IoT platform that will accelerate time-to-market for connected solutions, lower the cost-to-market, and enable innovative new SaaS business models. It will have global security standards embedded throughout the solution and support applications that are plug & play and scalable.

Industrial Cloud Platforms by Software Development Firms

Industrial companies aren’t the only ones developing industrial cloud platforms. There’s already a long list of industrial cloud platforms available from software development firms. Here are a few worth mentioning:

  • C3 IoT is a PaaS that enables organizations to leverage data – telemetry from sensors and devices, data from diverse enterprise information systems, and data from external sources (such as social media, weather, traffic, and commodity prices) – and employ advanced analytics and machine learning at scale, in real time, to capture business insights for improved operations, enhanced customer engagement, and differentiated products and services. C3 IoT is led by Silicon Valley entrepreneur Thomas Siebel. It has closed deals with the U.S. State Department and the French utility ENGIE SA, based on C3 IoT’s focus on machine-generated data.
  • Uptake is a predictive analytics SaaS platform provider that offers industrial companies the ability to optimize performance, reduce asset failures, and enhance safety. Uptake integrates data science and workflow connectivity to provide high-value solutions using massive data sets. In 2015, it entered into a partnership with heavy construction equipment manufacturer Caterpillar to jointly develop an end-to-end platform for predictive diagnostics in order to help Caterpillar customers monitor and optimize their fleets more effectively.
  • Meshify is an Industrial IoT platform for tracking, monitoring, and analyzing devices. The Meshify suite of tools provides all the features needed to deploy, monitor, control, and analyze the results of an IoT solution. Despite being a young technology business, it has a growing portfolio of industrially oriented clients, including Henry Pump, Sierra Resources, Stallion Oilfield Services, Gems Sensors & Controls and MistAway Systems.
Table 1: The List of Industrial Cloud Platform Providers

Amazon AWS, AT&T M2X, Bosch IoT, Carriots, Cumulocity, GE Predix, IBM Watson IoT, Intel IoT, Cisco Jasper, Losant IoT, MS Azure, ThingWorx, SAP HANA Cloud, Thethings.io, C3 IoT, Uptake, Amplia IoT, XMPRO, Meshify, TempoIQ, Bitstew Systems, Siemens MindSphere, AirVantage, Honeywell Sentience

What Do You Think Will Be The Future of the Industrial Cloud?

Will there be a dominant Industrial Cloud Platform? It's hard to say at this point. GE Predix is hoping for 500,000 connected devices by the beginning of 2017, while C3 IoT is said to have 70 million devices connected to its platform already.

The Crowded Cloud of Industrial Platforms

Will this market consolidate around a few big-name platforms, or will a lesser known provider be the winner and take all? 

This post originally appeared here. 

Read more…

7 things that are getting smarter in the IoT era

The Internet of Things is surrounded by a lot of buzz, and for good reason. It is one of the most revolutionary technologies and the closest we’ve come to predicting our future. Of course, the IoT is not based on spells and witchcraft (it’s way scarier than that), but on machine-to-machine communication, cloud computing and networks of small sensors that collect and analyze data. In this article we’ll share some of the things and processes that will change in the IoT era.

Home security systems

Today you can monitor home security cameras from your smartphone screen. More advanced home security systems go even further. They come with different types of sensors that monitor air quality, motion, sound, vibration and temperature. These systems use machine learning to determine the normal activity in your home and send alerts to your smartphone when something out of the ordinary occurs. Because of their smart machine learning approach, home security systems based on the IoT concept drastically reduce the incidence of false alarms.

Bed

Even our beds will become smart. At the moment you can buy several types of sleep trackers, from ones that come in the form of a bracelet and measure your heart rate and blood pressure, to smart mattresses that can connect to home automation systems, prepare your bed temperature, track your heart and breathing rate and wake you up in the morning. These special mattresses also collect information about your sleep and give you recommendations for improving your bed rest.

Energy use

Recently several companies released Wi-Fi enabled sensors that connect to the home electrical panel and control and track your energy use. These small sensors recognize all appliances and gadgets by their “power signatures” and can monitor the energy use and break it down to every single device. They allow you to take a deep look into your monthly energy use, recognize and deal with critical points, and save money on utility bills. Like many other home security and home automation systems, these sensors learn to interpret the activity of your home devices and send warnings when incidents happen.

All home appliances and systems

All-in-one smart home automation systems can control several home appliances at once. People can use them to turn their porch lights on and off when they are on vacation and to preheat their home or their oven before they arrive home from work. These systems also control various conditions in your home and use smart sensors and machine learning to create the perfect comfort. Some home automation systems also come with a Bluetooth speaker and a microphone and they can work as voice assistants.

Self-storage monitoring

Self-storage monitoring protects stored goods from climate changes, theft and other unforeseen incidents. New storage monitoring systems based on the IoT concept control storage lighting, air-conditioning and security. They also use sensors to track variables that are critical for perishable goods like temperature and humidity. You can find these smart storages in many different cities around the world. 

Construction sites

Construction site managers can use IoT solutions to monitor the work of heavy machinery and the movement of construction employees. This basically means that they don’t need to leave their trailer office. Sensors track the movement of supply and dumping trucks through geo-location technology and ensure that everything runs as scheduled. If there are any irregularities in the work of heavy machinery, supply trucks or employees, the site manager will be instantly notified by smartphone push notification.

Emergency vehicles

In many cities the only connection between emergency vehicles and their headquarters is established through old-fashioned radios. This offers limited control in emergency situations. Advanced telematics has already appeared in many emergency vehicles around the world. This technology allows lone drivers to receive updates in real time about the environment they are entering, including speeding, harsh driving events, or incidents involving other team members. Employees at headquarters also receive information about an emergency vehicle’s hours of service, speed, siren state and location. This way, they can easily schedule the vehicle’s regular maintenance and minimize its downtime.

Internet of Things is the biggest tech trend that is happening at the moment. It will completely rock our world and bring a lot of positive disruption to every segment of our lives. Soon, we’ll be able to control all of our possessions through one smart app, which will leave us more time to focus on ourselves and our friends and family.

Read more…

New IoT App Makes Drivers Safer

Transportation has become one of the most frequently highlighted areas where the internet of things can improve our lives. Specifically, a lot of people are excited about the IoT's potential to further the progress toward entire networks of self-driving cars. We hear a lot about the tech companies that are involved in building self-driving cars, but it's the IoT that will actually allow these vehicles to operate. In fact, CNET quoted one IoT expert just last year as saying that because of the expanding IoT, self-driving cars will rule the roads by 2030.

On a much smaller scale, there are also some niche applications of the IoT that are designed to fix specific problems on the road. For instance, many companies have looked to combat distracted driving by teenagers through IoT-related tools. As noted by PC World, one device called the Smartwheel monitors teens' driving activity by sensing when they're keeping both hands on the wheel. The device sounds an alert when a hand comes off the wheel and communicates to a companion app that compiles reports on driver performance. This is a subtle way in which the IoT helps young drivers develop better habits.

In a way, these examples cover both extremes of the effect the IoT is having on drivers. One is a futuristic idea that's being slowly implemented to alter the very nature of road transportation. The other is an application for individuals meant to make drivers safer one by one. But there are also some IoT-related tools that fall somewhere in the middle of the spectrum. One is an exciting new app that seeks to make the roads safer for the thousands of shipping fleet drivers operating on a daily basis.

At first this might sound like a niche category. However, the reality is that the innumerable companies and agencies relying on shipping and transportation fleets have a ton of drivers to take care of. That means supervising vehicle performance, safety, and more for each and every one of them. That process comprises a significant portion of road activity, particularly in cities and on highways. These operations can be simplified and streamlined through Networkfleet Driver, which Verizon describes as a tool to help employees manage routes, maintenance, communication, and driving habits all in one place.

The app can communicate up-to-date routing changes or required stops, inform drivers of necessary vehicle repairs or upkeep, and handle communication from management. It can also make note of dangerous habits (like a tendency to speed or make frequent sudden stops), helping the driver to identify bad habits and helping managers to recommend safer performance. All of this is accomplished through various IoT sensors on vehicles interacting automatically with the app, and with systems that can be monitored by management.

The positive effect, while difficult to quantify, is substantial. Fleet drivers make up a significant portion of road activity, and through the use of the IoT we can make sure that the roads are safer for everyone.

Read more…

Do not stop asking for security in IoT

Almost three years ago, I wrote in my IoT blog the posts “Are you prepared to answer M2M/IoT security questions of your customers?” and “There is no consensus on how best to implement security in IoT”, given the importance that security has in fulfilling the promise of the Internet of Things (IoT).

And during this time I have been sharing my opinion about the key role of IoT security with other international experts, in articles such as “What is the danger of taking M2M communications to the Internet of Things?” and at events (Cycon, IoT Global Innovation Forum 2016).

Security has always been a tradeoff between cost and benefit

I am honest when I say that I do not know how McKinsey calculates the total impact that IoT will have on the world economy in 2025, even for one specific sector, or whether they have taken the challenge of security into account, but it hardly matters: “The opportunities generated by IoT far outweigh the risks”.

With increased IoT opportunity come increased security risks and a flourishing IoT security market (according to Zion Research, the IoT security market will grow to USD 464 million by 2020).

A decade of breaches and the biggest attack target yet is looming

We all know the negative impact that news about cyber-attacks has on society and enterprises. In less than a decade, according to ICS-CERT (US) data, reported incidents have gone from 39 in 2010 to 295 in 2015.

In a survey published by AT&T, the company reported a 458% increase in vulnerability scans of IoT devices over the last two years.

It is a temptation for hackers to test their skills on connected objects, whether connected cars or smart home appliances. But I am afraid they will go far beyond that, attacking smart factories, smart transportation infrastructure or smart grids.

With millions of unprotected devices out there, a multitude of IoT networks and IoT platforms, and developers lacking security expertise, I am one more who believes the biggest attack target yet is looming.

 New Threats

With the Internet of Things, we should be prepared for new attacks and we must design new essential defences.

The complex IoT Security Threat Map from Beecham Research provides an overlaid summary of the full set of threat and vulnerability analyses used to help clients shape their strategies. This Threat Map summarizes many of the top five features from each of those analyses:

1. the top external threats and internal vulnerabilities of IoT applications;

2. the need for robust authentication, authorisation and confidentiality;

3. the features and interactions between the multiple networks used together in IoT;

4. the complexities of combining the Service Sector-optimised capabilities of differing Service Enablement Platforms;

5. the implementation and defences of edge device operating systems, chip integration and the associated Root of Trust.

 New Vulnerabilities

The OWASP Internet of Things Project is designed to help manufacturers, developers, and consumers better understand the security issues associated with the Internet of Things, and to enable users in any context to make better security decisions when building, deploying, or assessing IoT technologies.

The project looks to define a structure for various IoT sub-projects such as Attack Surface Areas, Testing Guides and Top Vulnerabilities. Below are the top IoT vulnerabilities.

A Subex white paper presenting their IoT solution adds some real examples of these vulnerabilities.

Insecure Web Interface: To exploit this vulnerability, an attacker uses weak credentials or captures plain-text credentials to access the web interface. The impact includes data loss and denial of service, and can lead to complete device takeover. An insecure web interface was exploited by hackers in 2014 to compromise Asus routers that were shipped with a default admin username and password.

Insufficient Authentication/Authorization: Exploitation of this vulnerability involves an attacker brute-forcing weak passwords or poorly protected credentials to access a particular interface. The impact of this kind of attack is usually denial of service, and it can also lead to compromise of the device. This vulnerability was exploited by ethical hackers to access the head unit of a Jeep Cherokee via its WiFi connectivity. The WiFi password for the Jeep Cherokee unit is generated automatically based on the time when the car and head unit are started up. By guessing the time and using brute-force techniques, the hackers were able to gain access to the head unit.
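To illustrate why a time-seeded password is weak, the hypothetical Python sketch below (the derivation function is invented, not the real head-unit algorithm) shows that an attacker who can bound the startup time to a ten-minute window has only a few hundred candidate passwords to try:

# Illustrative only: if a password is derived deterministically from the
# startup time, bounding that time to a small window shrinks the search space.
import hashlib

def password_from_epoch(epoch_seconds: int) -> str:
    # Hypothetical derivation: hash the startup time, keep 8 hex characters.
    return hashlib.sha256(str(epoch_seconds).encode()).hexdigest()[:8]

assumed_start = 1_430_000_000      # attacker's estimate of the startup time
window_seconds = 10 * 60           # uncertainty of roughly ten minutes

candidates = [password_from_epoch(assumed_start + s) for s in range(window_seconds)]
print(len(candidates))             # 600 guesses, versus 16**8 (about 4.3 billion)
                                   # for a truly random 8-hex-character key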

Insecure Network Services: An attacker uses vulnerable network services to attack the device itself or to bounce attacks off the device. Attackers can then use the compromised devices to facilitate attacks on other devices. This vulnerability was exploited by hackers who used 900 CCTV cameras globally to launch a DDoS attack on a cloud platform service.

Lack of Transport Encryption: A lack of transport encryption allows third parties to view data transmitted over the network. This kind of attack can lead to compromise of the device or of user accounts, depending on the data exposed. This weakness was exhibited by Toy Talk's server domain, which was susceptible to the POODLE attack. Toy Talk helps the Hello Barbie doll talk to a child by uploading the child's words to a server and providing an appropriate response after processing them. Though no hack was reported, such a vulnerability could easily lead to one.
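On the defensive side, a device or companion app can simply refuse legacy protocol versions. Below is a minimal Python sketch, using a placeholder endpoint, of a client that verifies the server certificate and requires at least TLS 1.2 before sending any device data:

# Minimal sketch: reject the legacy SSL/TLS versions behind attacks such as
# POODLE by pinning a modern minimum protocol version and verifying the
# server certificate before transmitting anything.
import socket
import ssl

HOST = "iot.example.com"   # placeholder endpoint, not a real service
PORT = 443

context = ssl.create_default_context()            # verifies cert and hostname
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSLv3/TLS 1.0/TLS 1.1

with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("negotiated:", tls_sock.version())  # e.g. 'TLSv1.3'
        tls_sock.sendall(b'{"device_id": "demo", "temp_c": 21.5}')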

Privacy Concerns: Hackers use different vectors to view and/or collect personal data that is not properly protected. The impact of this attack is the collection of personal user data. This vulnerability was exemplified by the VTech hack, wherein hackers were able to steal the personal data of parents as well as children using VTech's tablet.

Who owns the problem?

With the IoT we are creating a very complicated supply chain with lots of stakeholders, so it's not always clear 'who owns the problem'. Take, by way of example, a simple home application with no super-installers around: if you buy a central heating system and controller that requires you to push a button to increase the temperature, then if it stops working you contact the company who supplied it. But if you buy a central heating boiler from one company, a wireless temperature controller from another, download a mobile app from another and have a weather station from yet another supplier, then whose job is it to make sure it's all secure and reliable? The simple cop-out is to say 'the homeowner bought the bits and connected them together, therefore it's their responsibility' – well, I'm sorry, but that isn't good enough!

Manufacturers can't divest themselves of responsibility simply because the homeowner bought several component parts from different retailers. As a manufacturer you have a responsibility to ensure that your product is secure and reliable in any of the possible scenarios and use cases, which means that manufacturers need to work together to ensure interoperability – we all own the problem!

This might come as a shock to some companies and industries, but at some level even competitors have to work together to agree on and implement architectures and connectivity that are secure and reliable. Standardization is a good example of this: the companies actively working together in ISO, ETSI, the Bluetooth SIG and so on are often fierce competitors, but they all recognize the need to work together to define common, secure and reliable platforms around which they can build interoperable products.

If cybersecurity is already top of mind for many organizations, is the alarm about the lack of security in IoT justified?

In these last three years of IoT evangelization, there has been no event or article that did not gather questions or comments on IoT security and privacy.

The good news is that, according to the AT&T State of IoT Security survey 2015, 85% of global organizations are considering, exploring or implementing an IoT strategy; the bad news is that only 10% are fully confident that their connected devices are secure.

Source: AT&T State of IoT Security survey 2015

And if we consider Auth0's report, it scares me that only 10% of developers believe that most IoT devices on the market right now have the necessary security in place.

 

Source: Auth0

In a publication titled “Cybersecurity and the IoT”, EY defines three stages to classify the current status of organizations in the implementation of IoT security.

Stage 1: Activate

Organizations need to have a solid foundation of cybersecurity. This comprises a comprehensive set of information security measures, which will provide basic (but not good) defense against cyber-attacks. At this stage, organizations establish their fundamentals — i.e., they “activate” their cybersecurity.

Stage 2: Adapt

Organizations change — whether for survival or for growth. Threats also change. Therefore, the foundation of information security measures must adapt to keep pace and match the changing business requirements and dynamics otherwise they will become less and less effective over time. At this stage, organizations work to keep their cybersecurity up-to-date; i.e., they “adapt” to changing requirements.

Stage 3: Anticipate

Organizations need to develop tactics to detect and deter potential cyber-attacks. They must know exactly what they need to protect their most valuable assets, and rehearse appropriate responses to likely attack/incident scenarios: this requires a mature cyber threat intelligence capability, a robust risk assessment methodology, an experienced incident response mechanism and an informed organization. At this stage, organizations are more confident about their ability to handle more predictable threats and unexpected attacks; i.e., they anticipate cyber-attacks.

 

What enterprises need to do

If you are thinking only about the benefits of IoT without considering security as a key component of your strategy, you will probably regret it very soon. Below are some recommendations, whether you are about to start your IoT journey or have already started it. I hope it is not too late for wise advice.

Key Takeaways

With the proliferation and variety of IoT devices, IoT networks, IoT platforms, clouds, and applications, during the next few years we will see new vulnerabilities and a variety of new attacks. Progress in the security technologies and processes that prevent them will be key to the adoption of IoT by enterprises and consumers.

The future Internet of Things world will need an end-to-end security approach to protect physical and digital assets. The ecosystems of this fragmented market must understand the need for Security by Design and avoid the temptation to reduce cost at the expense of security.

Do not stop asking for security when you buy a connected product or use an IoT service; the pressures of time to market, competitive prices and lack of resources must not be an excuse for failing to offer secure IoT solutions to enterprises, consumers and citizens.

 

Thanks in advance for your Likes and Shares

Thoughts ? Comments ?

Read more…

As if the Internet of Things (IoT) were not complicated enough, the marketing team at Cisco introduced its Fog Computing vision in January 2014, also known as Edge Computing to other, more purist vendors.

Given Cisco's frantic activity in its Internet of Everything (IoE) marketing campaigns, it is not surprising that many bloggers have abused shocking headlines around this subject, taking advantage of the IoT hype.

I hope this post helps you better understand the role of Fog Computing in the IoT Reference Model and how companies are using IoT intelligent gateways in the fog to connect the "things" to the cloud, through some application areas and examples of Fog Computing.

The problem with the cloud

As the Internet of Things proliferates, businesses face a growing need to analyze data from sources at the edge of a network, whether mobile phones, gateways, or IoT sensors. Cloud computing has a disadvantage: It can’t process data quickly enough for modern business applications.

The IoT owes its explosive growth to the connection of physical things and operation technologies (OT) to analytics and machine learning applications, which can help glean insights from device-generated data and enable devices to make “smart” decisions without human intervention. Currently, such resources are mostly being provided by cloud service providers, where the computation and storage capacity exists.

However, despite its power, the cloud model is not applicable to environments where operations are time-critical or internet connectivity is poor. This is especially true in scenarios such as telemedicine and patient care, where milliseconds can have fatal consequences. The same can be said about vehicle to vehicle communications, where the prevention of collisions and accidents can’t afford the latency caused by the roundtrip to the cloud server.

“The cloud paradigm is like having your brain command your limbs from miles away — it won’t help you where you need quick reflexes.”

Moreover, having every device connected to the cloud and sending raw data over the internet can have privacy, security and legal implications, especially when dealing with sensitive data that is subject to separate regulations in different countries.

IoT nodes are closer to the action, but for the moment, they do not have the computing and storage resources to perform analytics and machine learning tasks. Cloud servers, on the other hand, have the horsepower, but are too far away to process data and respond in time.

The fog layer is the perfect junction where there are enough compute, storage and networking resources to mimic cloud capabilities at the edge and support the local ingestion of data and the quick turnaround of results.

The variety of IoT systems and the need for flexible solutions that respond to real-time events quickly make Fog Computing a compelling option.

Fog Computing: oh my God, another layer in IoT!

A study by IDC estimates that by 2020, 10 percent of the world’s data will be produced by edge devices. This will further drive the need for more efficient fog computing solutions that provide low latency and holistic intelligence simultaneously.

“Computing at the edge of the network is, of course, not new -- we've been doing it for years to solve the same issue with other kinds of computing.”

Fog Computing, or Edge Computing, is a paradigm championed by some of the biggest IoT technology players, including Cisco, IBM and Dell. It represents a shift in architecture in which intelligence is pushed from the cloud to the edge, localizing certain kinds of analysis and decision-making.

Fog Computing enables quicker response times, unencumbered by network latency, as well as reduced traffic, selectively relaying the appropriate data to the cloud.

The concept of Fog Computing attempts to transcend some of these physical limitations: processing happens on nodes physically closer to where the data is originally collected, instead of sending vast amounts of IoT data to the cloud.

Photo Source: http://electronicdesign.com/site-files/electronicdesign.com/files/uploads/2014/06/113191_fig4sm-cisco-fog-computing.jpg

The OpenFog Consortium

The OpenFog Consortium was founded on the premise that open architectures and standards are essential for the success of a ubiquitous Fog Computing ecosystem.

The collaboration among tech giants such as ARM, Cisco, Dell, GE, Intel, Microsoft and Schneider Electric defining an Open, Interoperable Fog Computing Architecture is without any doubt good news for a vibrant supplier ecosystem.

The OpenFog Reference Architecture is an architectural evolution from traditional closed systems and the burgeoning cloud-only models to an approach that emphasizes computation nearest the edge of the network when dictated by business concerns or by the critical functional requirements of the application.

The OpenFog Reference Architecture consists of putting micro data centers or even small, purpose-built high-performance data analytics machines in remote offices and locations in order to gain real-time insights from the data collected, or to promote data thinning at the edge, by dramatically reducing the amount of data that needs to be transmitted to a central data center. Without having to move unnecessary data to a central data center, analytics at the edge can simplify and drastically speed analysis while also cutting costs.
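As a concrete illustration of data thinning, here is a minimal Python sketch, with assumed thresholds and record format, in which a fog node reduces a minute's worth of raw sensor samples to one compact summary plus any out-of-range readings before anything is sent to the central data center:

# Minimal sketch of edge data thinning: forward a small summary and the
# anomalous readings instead of every raw sample.
from statistics import mean

ALERT_THRESHOLD = 75.0   # assumed value above which a reading is "interesting"

def thin(readings_last_minute: list[float]) -> dict:
    """Reduce ~60 raw samples to one small record for the cloud."""
    return {
        "count": len(readings_last_minute),
        "min": min(readings_last_minute),
        "max": max(readings_last_minute),
        "avg": round(mean(readings_last_minute), 2),
        "alerts": [r for r in readings_last_minute if r > ALERT_THRESHOLD],
    }

raw = [70.1, 70.4, 71.0, 80.2, 70.8]   # pretend this is a minute of samples
print(thin(raw))                        # the cloud receives a handful of numbers
                                        # instead of the full raw stream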

Benefits of Fog Computing

  • Frees up network capacity - Fog computing uses much less bandwidth, which means it doesn't cause bottlenecks or similar congestion. Less data movement on the network frees up capacity, which can then be used for other things.
  • It is truly real-time - Fog computing offers much lower latency than any cloud computing architecture we know today. Since all data analysis is done on the spot, it is a true real-time concept, which makes it a perfect match for the needs of the Internet of Things.
  • Boosts data security - Collected data is more secure when it doesn't travel. It also makes data storage simpler, because data stays in its country of origin; sending data abroad might violate certain laws.
  • Analytics is done locally - Fog computing lets developers access the most important IoT data from other locations, while keeping piles of less important information in local storage.

Disadvantages of Fog Computing

  • Some companies don't like their data being outside their premises - With Fog Computing, lots of data is stored on the devices themselves (which are often located outside company offices); part of the developer community perceives this as a risk.
  • The whole system sounds a little confusing - A concept involving a huge number of devices, located all around the world, that store, analyze and send their own data can sound utterly confusing.

Read more: http://bigdata.sys-con.com/node/3809885

Examples of Fog Computing

The applications of fog computing are many, and it is powering crucial parts of IoT ecosystems, especially in industrial environments. See below some use cases and examples.

  • Thanks to the power of fog computing, New York-based renewable energy company Envision has been able to obtain a 15 percent productivity improvement from the vast network of wind turbines it operates. The company is processing as much as 20 terabytes of data at a time, generated by 3 million sensors installed on the 20,000 turbines it manages. Moving computation to the edge has enabled Envision to cut down data analysis time from 10 minutes to mere seconds, providing them with actionable insights and significant business benefits.
  • Plat One is another firm using fog computing to improve data processing for the more than 1 million sensors it manages. The company uses the Cisco-ParStream platform to publish real-time sensor measurements for hundreds of thousands of devices, including smart lighting and parking, port and transportation management and a network of 50,000 coffee machines.
  • In Palo Alto, California, a $3 million project will enable traffic lights to integrate with connected vehicles, hopefully creating a future in which people won’t be waiting in their cars at empty intersections for no reason.
  • In transportation, it’s helping semi-autonomous cars assist drivers in avoiding distraction and veering off the road by providing real-time analytics and decisions on driving patterns.
  • It also can help reduce the transfer of gigantic volumes of audio and video recordings generated by police dashboard and video cameras. Cameras equipped with edge computing capabilities could analyze video feeds in real time and only send relevant data to the cloud when necessary.

See more at: Why Edge Computing Is Here to Stay: Five Use Cases By Patrick McGarry  

What is the future of fog computing?

The current trend shows that fog computing will continue to grow in usage and importance as the Internet of Things expands and conquers new grounds. With inexpensive, low-power processing and storage becoming more available, we can expect computation to move even closer to the edge and become ingrained in the same devices that are generating the data, creating even greater possibilities for inter-device intelligence and interactions. Sensors that only log data might one day become a thing of the past.

Janakiram MSV wondered whether Fog Computing will be the next big thing in the Internet of Things. It seems obvious that while the cloud is a perfect match for much of the Internet of Things, there are other scenarios and IoT solutions that demand low-latency ingestion and immediate processing of data, where Fog Computing is the answer.

Does the fog eliminate the cloud?

Fog computing improves efficiency and reduces the amount of data that needs to be sent to the cloud for processing. But it’s here to complement the cloud, not replace it.

The cloud will continue to have a pertinent role in the IoT cycle. In fact, with fog computing shouldering the burden of short-term analytics at the edge, cloud resources will be freed to take on the heavier tasks, especially where the analysis of historical data and large datasets is concerned. Insights obtained by the cloud can help update and tweak policies and functionality at the fog layer.

And there are still many cases where the centralized, highly efficient computing infrastructure of the cloud will outperform decentralized systems in performance, scalability and costs. This includes environments where data needs to be analyzed from largely dispersed sources.

“It is the combination of fog and cloud computing that will accelerate the adoption of IoT, especially for the enterprise.”

In essence, Fog Computing allows for big data to be processed locally, or at least in closer proximity to the systems that rely on it. Newer machines could incorporate more powerful microprocessors, and interact more fluidly with other machines on the edge of the network. While fog isn’t a replacement for cloud architecture, it is a necessary step forward that will facilitate the advancement of IoT, as more industries and businesses adopt emerging technologies.

'The Cloud' is not Over

Fog computing is far from a panacea. One of the immediate costs associated with this method pertains to equipping end devices with the necessary hardware to perform calculations remotely and independently of centralized data centers. Some vendors, however, are in the process of perfecting technologies for that purpose. The tradeoff is that by investing in such solutions immediately, organizations will avoid frequently updating their infrastructure and networks to deal with ever-increasing data volumes as the IoT expands.

There are certain data types and use cases that actually benefit from centralized models. Data that carries the utmost security concerns, for example, will require the security advantages of a centralized approach, or of one that continues to rely solely on physical infrastructure.

Though the benefits of Fog Computing are undeniable, the Cloud has a secure future in IoT for most companies with less time-sensitive computing needs and for analysing all the data gathered by IoT sensors.

 

Thanks in advance for your Likes and Shares

Thoughts ? Comments ?

Read more…

Customer 360º view in Digital age

In today’s digital age of customer hyper-personalization, organizations identify opportunities for real time engagement based on data-driven understanding of customer behavior.
Customers have taken control of their purchase process. With websites, blogs, Facebook updates, online reviews and more, they use multiple sources of information to make decisions and often engage with a brand dozens of times between inspiration and purchase.
It’s important that organizations collect every customer interaction in order to identify sentiments of happy & unhappy customers.
Companies can get a complete 360º view of customers by aggregating data from the various touch points that a customer may use to contact a company, purchase products and receive service or support.
This Customer 360º snapshot should include the following (a minimal data-structure sketch follows the list):
  • Identity: name, location, gender, age and other demographic data
  • Relationships: their influence, connections, associations with others
  • Current activity: orders, complaints, deliveries, returns
  • History: contacts, campaigns, processes, cases across all lines of business and channels
  • Value: which products or services they are associated with, including history
  • Flags: prompts to give context, e.g. churn propensity, up-sell options, fraud risk, mood of last interactions, complaint record, frequency of contact
  • Actions: expected, likely or essential steps based on who they are and the fact they are calling now
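As a minimal sketch of how such a snapshot might be modeled as a single queryable record, the Python data structure below covers the categories above; the field names and types are illustrative, not any vendor's schema:

# Illustrative Customer 360º record; not a vendor schema.
from dataclasses import dataclass, field

@dataclass
class Customer360:
    # Identity
    name: str
    location: str
    # Relationships
    connections: list[str] = field(default_factory=list)
    # Current activity and history
    open_orders: list[str] = field(default_factory=list)
    complaints: list[str] = field(default_factory=list)
    past_campaigns: list[str] = field(default_factory=list)
    # Value and flags
    products_owned: list[str] = field(default_factory=list)
    churn_propensity: float = 0.0     # 0.0 (loyal) to 1.0 (likely to churn)
    upsell_options: list[str] = field(default_factory=list)
    # Next best actions suggested to an agent
    recommended_actions: list[str] = field(default_factory=list)

profile = Customer360(name="Jane Doe", location="Madrid",
                      products_owned=["broadband"], churn_propensity=0.7,
                      recommended_actions=["offer loyalty discount"])
print(profile.recommended_actions)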

The 360º view of customers also often requires a big data analytics strategy to marry structured data (data that can reside in the rows and columns of a database) with unstructured data (audio files, video files, social media data).
Many companies, like Nestle and Toyota, are using social media listening tools to gather what customers are saying on sites like Facebook and Twitter, and predictive analytics tools to determine what customers may research or purchase next.
What are the returns of Customer 360º:
  • All customer touch point data in a single repository for fast queries
  • Next best actions or recommendations for customers
  • All key metrics in a single location for business users to know and advise customers
  • Intuitive and customizable dashboards for quick insights
  • Real time hyper personalized customer interaction
  • Enhanced customer loyalty

Customer 360º helps achieve Single View of Customer across Channels – online, stores, marketplaces, Devices – wearables, mobile, tablets, laptops & Interactions – purchase, posts, likes, feedback, service.

This is further used for customer analytics – predict churn, retention, next best action, cross-sell & up-sell opportunities, profitability, life time value.
Global leaders in customer experience are Apple, Disney, Emirates.
A word of caution though: focus on and collect only the customer data that can help improve the customer journey.
Read more…

IoT Devices Common Thread in Colossal DDoS Attacks

EDITOR'S NOTE: This story originally appeared on the A10 Networks blog.

A pair of distributed denial-of-service (DDoS) attacks against high-profile targets last week rank among the largest DDoS attacks on record. And a common thread has emerged: these attacks are leveraging botnets comprising hundreds of thousands of unsecured Internet of Things (IoT) devices.

OVH attack reaches 1 Tbps

European Web hosting company OVH confirmed last week that it suffered a string of DDoS attacks that neared the 1 Tbps mark. On Twitter, OVH CTO Octave Klaba said the attacks OVH suffered were “close to 1 Tbps” and noted that the flood of traffic was fueled by a botnet made up of nearly 150,000 digital video recorders and IP cameras capable of sending 1.5 Tbps in DDoS traffic. Klaba said OVH servers were hit by multiple simultaneous attacks exceeding 100 Gbps each, totaling more than 1 Tbps. The most severe single attack documented by OVH reached 93 million packets per second (Mpps) and 799 Gbps.
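As a rough back-of-the-envelope check (my own estimate, not a figure from OVH): 150,000 compromised recorders and cameras pushing on the order of 10 Mbps each works out to about 1.5 Tbps of aggregate capacity, which is consistent with the 1.5 Tbps potential Klaba cited for the botnet and the near-1 Tbps peaks actually observed.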

SC Magazine UK quoted security researcher Mustafa Al-Bassam as saying the DDoS attack against OVH is “the largest DDoS attack ever recorded.”

Krebs gets slammed

The OVH attack came on the heels of another gargantuan DDoS incident, this one targeting respected cybersecurity blog Krebsonsecurity.com, which knocked the site offline for several hours.

“The outage came in the wake of a historically large distributed denial-of-service (DDoS) attack which hurled so much junk traffic at Krebsonsecurity.com that my DDoS protection provider Akamai chose to unmoor my site from its protective harbor,” Brian Krebs wrote, adding that he has since implemented DDoS protection from Google’s Project Shield.

The attack on Krebs clocked in at a massive 620 Gbps in size, which is several orders of magnitude more traffic than is typically necessary to knock most websites offline.

SecurityWeek reported that Krebs believes the botnet used to target his blog mostly consists of IoT devices — perhaps millions of them — such as webcams and routers that have default or weak credentials.

“There is every indication that this attack was launched with the help of a botnet that has enslaved a large number of hacked so-called ‘Internet of Things,’ (IoT) devices — mainly routers, IP cameras and digital video recorders (DVRs) that are exposed to the Internet and protected with weak or hard-coded passwords,” Krebs wrote.

Reports indicate that the attack was in response to Krebs reporting on and exposing vDOS, a service run by two Israelis who were offering a DDoS-as-a-Service play and were arrested after Krebs’ story was published.

IoT insecurity

Security researchers have warned that improperly secured IoT devices are more frequently being used to launch DDoS attacks. Symantec last week noted that hackers can easily hijack unsecured IoT devices due to lack of basic security controls and add them to a botnet, which they then use to launch a DDoS attack.

“Poor security on many IoT devices makes them soft targets and often victims may not even know they have been infected,” Symantec wrote. “Attackers are now highly aware of lax IoT security and many pre-program their malware with commonly used and default passwords.”
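That observation points to a simple mitigation on the manufacturer's side: refuse to put a device online while it still uses a factory-default credential pair. The Python sketch below is hypothetical (the default list and the length policy are illustrative, not taken from any vendor SDK):

# Hypothetical provisioning check: reject well-known default credentials
# and trivially short passwords before activating a device.
COMMON_DEFAULTS = {("admin", "admin"), ("root", "root"),
                   ("admin", "1234"), ("user", "user")}

def credentials_acceptable(username: str, password: str) -> bool:
    """Return True only if the pair is not a known default and is long enough."""
    if (username.lower(), password.lower()) in COMMON_DEFAULTS:
        return False
    return len(password) >= 12

print(credentials_acceptable("admin", "admin"))             # False
print(credentials_acceptable("ops", "long-unique-phrase"))  # True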

And while DDoS attacks remain the main purpose of IoT malware, Symantec warned that the proliferation of devices and their increased processing power may create new ways for threat actors to leverage IoT, such as cryptocurrency mining, information stealing and network reconnaissance.

 

Read more…
