


For IoT and M2M device security assurance, it's critical to introduce automated software development tools into the development lifecycle. Although software tools play an important role in quality assurance generally, that role becomes even more critical when security is part of a new or existing product's requirements.

Automated Software Development Tools

There are three broad categories of automated software development tools that are important for improving quality and security in embedded IoT products:

  • Application lifecycle management (ALM): Although not specific to security, these tools cover requirements analysis, design, coding, testing and integration, configuration management, and many other aspects of software development. With a security-first embedded development approach, however, they can help automate security engineering as well. For example, requirements analysis tools (in conjunction with vulnerability management tools) can ensure that security requirements and known vulnerabilities are tracked throughout the lifecycle. Design automation tools can incorporate secure design patterns and then generate code that avoids known security flaws (e.g. buffer overflows or unchecked input data). Configuration management tools can require code inspection or static analysis reports before code is checked in. Test automation tools can be used to test "abuse" cases against the system. In general, ALM tools have a role in secure development just as they do across the entire project.
  • Dynamic Application Security Testing (DAST): Dynamic testing tools require program execution to generate useful results. Examples include unit testing tools, test coverage analyzers, memory analyzers, and penetration testing tools. Test automation tools are important for reducing the testing load on the development team and, more importantly, for detecting vulnerabilities that manual testing may miss.
  • Static Application Security Testing (SAST): Static analysis tools work by analyzing source code, bytecode (e.g., compiled Java), and binary executable code. No code is executed in static analysis; instead, the analysis reasons about the potential behavior of the code. Static analysis is relatively efficient at analyzing a codebase compared to dynamic tools. Static analysis tools also analyze code paths that are untested by other methods and can trace execution and data paths through the code. Static analysis can be incorporated early in the development phase to analyze existing, legacy, and third-party source and binaries before incorporating them into your product. As new source is added, incremental analysis can be used in conjunction with configuration management to ensure quality and security throughout.

Figure 1: The application of various tool classes in the context of the software development lifecycle.

Although adopting any class of tools helps productivity, security, and quality, using a combination of them is recommended. No single class of tools is a silver bullet [1]. The best approach automates the use of a combination of tools from all categories and is based on a risk-based rationale for achieving high security within budget.

The role of static analysis tools in a security-first approach

Static analysis tools provide critical support in the coding and integration phases of development. Ensuring continuous code quality, in both the development and maintenance phases, greatly reduces the costs and risks of security and quality issues in software. In particular, static analysis provides the following benefits:

  • Continuous source code quality and security assurance: Static analysis is often first applied to a large codebase as part of its initial integration, as discussed below. However, where it really shines is after an initial code quality and security baseline is established. As each new code block is written (file or function), it can be scanned by the static analysis tools, and developers can deal with the errors and warnings quickly and efficiently before checking code into the build system. Detecting errors and vulnerabilities, and enforcing secure coding standards (discussed below), at the source, with the developers themselves, yields the biggest impact from the tools.
  • Tainted data detection and analysis: Analysis of the data flows from sources (i.e. interfaces) to sinks (where data gets used in a program) is critical in detecting potential vulnerabilities from tainted data. Any input, whether from a user interface or a network connection, is a potential security vulnerability if used unchecked. Many attacks are mounted by feeding specially crafted data into inputs, designed to subvert the behavior of the target system. Unless data is verified to be acceptable in both length and content, it can be used to trigger error conditions or worse. Code injection and data leakage are possible outcomes of these attacks, which can have serious consequences. (A minimal source-to-sink sketch follows this list.)
  • Third-party code assessment: Most projects are not greenfield development and require the use of existing code from within a company or from a third party. Performing testing and dynamic analysis on a large existing codebase is hugely time consuming and may exceed budget and schedule limits. Static analysis is particularly suited to analyzing large code bases and providing meaningful errors and warnings that indicate both security and quality issues. GrammaTech CodeSonar binary analysis can analyze binary-only libraries and provide reports similar to source analysis when source is not available. In addition, CodeSonar binary analysis can work in a mixed source and binary mode to detect errors in the usage of external binary libraries from the source code.
  • Secure coding standard enforcement: Static analysis tools analyze source syntax and can be used to enforce coding standards. Various code security guidelines are available such as SEI CERT C [2] and Microsoft's Secure Coding Guidelines [3]. Coding standards are good practice because they prevent risky code from becoming future vulnerabilities. As mentioned above, integrating these checks into the build and configuration management system improves the quality and security of code in the product.
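
To make the source-to-sink idea concrete, here is a minimal, hypothetical C sketch of the kind of tainted-data defect a static analysis tool is designed to flag, together with a checked version in the spirit of secure coding guidance such as SEI CERT C. The function names, buffer size, and validation rules are illustrative assumptions, not taken from any particular product or standard.

    #include <stdio.h>
    #include <string.h>

    #define CMD_MAX 32

    /* Risky: 'request' arrives from a network socket (a taint source) and is
     * copied into a fixed-size buffer (a sink) without any length check.
     * A taint-tracking static analysis would flag this path as a potential
     * buffer overflow. */
    void handle_request_unsafe(const char *request)
    {
        char cmd[CMD_MAX];
        strcpy(cmd, request);          /* overflows if request >= 32 bytes */
        printf("executing: %s\n", cmd);
    }

    /* Safer: validate length and content before the data reaches the sink. */
    int handle_request_checked(const char *request)
    {
        char cmd[CMD_MAX];

        if (request == NULL || strlen(request) >= sizeof(cmd))
            return -1;                 /* reject oversized input */

        for (const char *p = request; *p != '\0'; ++p) {
            if (*p < 0x20 || *p > 0x7E)
                return -1;             /* reject non-printable characters */
        }

        strncpy(cmd, request, sizeof(cmd) - 1);
        cmd[sizeof(cmd) - 1] = '\0';
        printf("executing: %s\n", cmd);
        return 0;
    }

A taint analysis would report the first function because data from the untrusted source reaches the fixed-size buffer without a length check, and would accept the second because the input is validated before it reaches the sink.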

As part of a complete tools suite, static analysis provides key capabilities that other tools cannot. The payback for adopting static analysis is the early detection of errors and vulnerabilities that traditional testing tools may miss. This helps ensure a high level of quality and security on an on-going basis.

Conclusion

Machine-to-machine and IoT device manufacturers that incorporate a security-first design philosophy, with formal threat assessments and automated tools, produce devices better secured against the accelerating threats on the Internet. Modifying an existing, successful software development process to include security at the early stages of product development is key. Smart use of automated tools to develop new code and to analyze existing and third-party code allows development teams to meet strict budget and schedule constraints. Static analysis of both source and binaries plays a key role in a security-first development toolset.

References

  1. No Silver Bullet – Essence and Accident in Software Engineering, Fred Brooks, 1986
  2. SEI CERT C Coding Standard, Software Engineering Institute, Carnegie Mellon University
  3. Outsource Code Development Driving Automated Test Tool Market, VDC Research, IoT & Embedded Blog, October 22, 2013

 

Read more…


With the announcement of the Cisco Solution for LoRaWAN™, Service Providers have an integrated solution that enables them to extend their network reach to where they’ve never gone before – i.e., offering IoT services for devices and sensors that are battery powered, have low data rates, and require long-distance communications. The solution opens new markets and new revenue streams for Service Providers, and can be deployed in a wide range of use cases in Industrial IoT and Smart City applications such as:

  • Asset Tracking and Management
  • Logistics
  • Smart Cities (e.g., smart parking, street lighting, waste management, etc.)
  • Intelligent buildings
  • Utilities (e.g., water and gas metering)
  • Agriculture (e.g., soil, irrigation management)


Our Cisco Mobile Visual Networking Index estimates that while LoRa is in its early stages now, these types of Low Power Wide Area (LPWA) connections will quickly gain traction, and that by 2020 there will be more than 860 million devices using them to connect. One of the reasons for such an aggressive adoption forecast, especially in North America and Western Europe, is that LoRa® works over readily available unlicensed spectrum. Cisco is a founding Board member of the LoRa® Alliance, formed in January 2015 with the goal of standardizing LPWA networks in order to stimulate the growth of Internet of Things (IoT) applications.

Cisco has been working with a number of Mobile Operators who are trialing and deploying LoRa® networks to target new low-power IoT services such as metering, location tracking, and monitoring services. Many Mobile Operators are looking at LoRa® as complementary to Narrowband IoT (NB-IoT), an upgrade to current mobile networks that drops the transmit power and data rates of the LTE standard to increase battery life. As NB-IoT networks, devices, and ecosystems will not be commercialized until 2017, LoRa® gives Operators (and all SPs, in fact) a way to gain a head start on offering new IoT services based on various new low-cost business models.

Cisco’s approach to IoT is to deliver integrated solutions that enable SPs to support different classes of service, aligned with specific pricing models, across unlicensed (Wi-Fi, LoRa) and licensed (2G/3G/LTE and, soon, NB-IoT) radio spectrum as demanded by the IoT application. Our multi-access network strategy for IoT is complemented by the Cisco Ultra Services Platform (USP) – our comprehensive, virtualized services core, which includes mobile packet core, policy, and services functions. Cisco USP delivers the scalability and flexibility that Operators focusing on IoT need as more and varied “things” get connected to their networks.

Cisco continues to integrate and evolve solutions such as LoRaWAN™ to help Service Providers of all types capitalize on new IoT opportunities and transform into next-generation IoT Service Providers.

Read more…

Originally Posted and Written by: Michelle Canaan, John Lucker, & Bram Spector

Connectivity is changing the way people engage with their cars, homes, and bodies—and insurers are looking to keep pace. Even at an early stage, IoT technology may reshape the way insurance companies assess, price, and limit risks, with a wide range of potential implications for the industry.

Insurers’ path to growth: Embrace the future

In 1997, Progressive Insurance pioneered the use of the Internet to purchase auto insurance online, in real time.1 In a conservative industry, Progressive’s innovative approach broke several long-established trade-offs, shaking up traditional distribution channels and empowering consumers with price transparency.

This experiment in distribution ended up transforming the industry as a whole. Online sales quickly forced insurers to evolve their customer segmentation capabilities and, eventually, to refine pricing. These modifications propelled growth by allowing insurers to serve previously uninsurable market segments. And as segmentation became table stakes for carriers, a new cottage industry of tools, such as online rate comparison capabilities, emerged to capture customer attention. Insurers fought to maintain their competitive edge through innovation, but widespread transparency in product pricing over time created greater price competition and ultimately led to product commoditization. The tools and techniques that put the insurer in the driver’s seat slowly tipped the balance of power to the customer.

This case study of insurance innovation and its unintended consequences may be a precursor to the next generation of digital connectivity in the industry. Today, the availability of unlimited new sources of data that can be exploited in real time is radically altering how consumers and businesses interact. And the suite of technologies known as the Internet of Things (IoT) is accelerating the experimentation of Progressive and other financial services companies. With the IoT’s exponential growth, the ways in which citizens engage with their cars, homes, and bodies are getting smarter each day, and they expect the businesses they patronize to keep up with this evolution. Insurance, an industry generally recognized for its conservatism, is no exception.

IoT technology may still be in its infancy, but its potential to reshape the way insurers assess, price, and limit risks is already quite promising. Nevertheless, since innovation inevitably generates unintended possibilities and consequences, insurers will need to examine strategies from all angles in the earliest planning stages.

To better understand potential IoT applications in insurance, the Deloitte Center for Financial Services (DCFS), in conjunction with Wikistrat, performed a crowdsourcing simulation to explore the technology’s implications for the future of the financial services industry. Researchers probed participants (13 doctorate holders, 24 cyber and tech experts, 20 finance experts, and 6 entrepreneurs) from 20 countries and asked them to imagine how IoT technology might be applied in a financial services context. The results (figure 1) are not an exhaustive compilation of scenarios already in play or forthcoming but, rather, an illustration of several examples of how these analysts believe the IoT may reshape the industry.2

Figure 1

CONNECTIVITY AND OPPORTUNITY

Even this small sample of possible IoT applications shows how increased connectivity can generate tremendous new opportunities for insurers, beyond personalizing premium rates. Indeed, if harnessed effectively, IoT technology could potentially boost the industry’s traditionally low organic growth rates by creating new types of coverage opportunities. It offers carriers a chance to break free from the product commoditization trend that has left many personal and commercial lines to compete primarily on price rather than coverage differentiation or customer service.

For example, an insurer might use IoT technology to directly augment profitability by transforming the income statement’s loss component. IoT-based data, carefully gathered and analyzed, might help insurers evolve from a defensive posture—spreading risk among policyholders and compensating them for losses—to an offensive posture: helping policyholders prevent losses and insurers avoid claims in the first place. And by avoiding claims, insurers could not only reap the rewards of increased profitability, but also reduce premiums and aim to improve customer retention rates. Several examples, both speculative and real-life, include:

  • Sensors embedded in commercial infrastructure can monitor safety breaches such as smoke, mold, or toxic fumes, allowing for adjustments to the environment to head off or at least mitigate a potentially hazardous event.
  • Wearable sensors could monitor employee movements in high-risk areas and transmit data to employers in real time to warn the wearer of potential danger as well as decrease fraud related to workplace accidents.
  • Smart home sensors could detect moisture in a wall from pipe leakage and alert a homeowner to the issue prior to the pipe bursting. This might save the insurer from a large claim and the homeowner from both considerable inconvenience and losing irreplaceable valuables. The same can be said for placing IoT sensors in business properties and commercial machinery, mitigating property damage and injuries to workers and customers, as well as business interruption losses.
  • Socks and shoes that can alert diabetics early on to potential foot ulcers, odd joint angles, excessive pressure, and how well blood is pumping through capillaries are now entering the market, helping to avoid costly medical and disability claims as well as potentially life-altering amputations.3

Beyond minimizing losses, IoT applications could also potentially help insurers resolve the dilemma with which many have long wrestled: how to improve the customer experience, and therefore loyalty and retention, while still satisfying the unrelenting market demand for lower pricing. Until now, insurers have generally struggled to cultivate strong client relationships, both personal and commercial, given the infrequency of interactions throughout the insurance life cycle from policy sale to renewal—and the fact that most of those interactions entail unpleasant circumstances: either deductible payments or, worse, claims. This dynamic is even more pronounced in the independent agency model, in which the intermediary, not the carrier, usually dominates the relationship with the client.

The emerging technology intrinsic to the IoT that can potentially monitor and measure each insured’s behavioral and property footprint across an array of activities could turn out to be an insurer’s holy grail, as IoT applications can offer tangible benefits for value-conscious consumers while allowing carriers to remain connected to their policyholders’ everyday lives. While currently, people likely want as few associations with their insurers as possible, the IoT can potentially make insurers a desirable point of contact. The IoT’s true staying power will be manifested in the technology’s ability to create value for both the insurer and the policyholder, thereby strengthening their bond. And while the frequency of engagement shifts to the carrier, the independent agency channel will still likely remain relevant through the traditional client touchpoints.

By harnessing continuously streaming “quantified self” data, using advanced sensor connectivity devices, insurers could theoretically capture a vast variety of personal data and use it to analyze a policyholder’s movement, environment, location, health, and psychological and physical state. This could provide innovative opportunities for insurers to better understand, serve, and connect with policyholders—as well as insulate companies against client attrition to lower-priced competitors. Indeed, if an insurer can demonstrate how repurposing data collected for insurance considerations might help a carrier offer valuable ancillary non-insurance services, customers may be more likely to opt in to share further data, more closely binding insurer and customer.

Leveraging IoT technologies may also have the peripheral advantage of resuscitating the industry’s brand, making insurance more enticing to the relatively small pool of skilled professionals needed to put these strategies in play. And such a shift would be welcome, considering that Deloitte’s Talent in Insurance Survey revealed that the tech-savvy Millennial generation generally considers a career in the insurance industry “boring.”4 Such a reputational challenge clearly creates a daunting obstacle for insurance executives and HR professionals, particularly given the dearth of employees with necessary skill sets to successfully enable and systematize IoT strategies, set against a backdrop of intense competition from many other industries. Implementing cutting-edge IoT strategies could boost the “hip factor” that the industry currently lacks.

With change comes challenges

While most stakeholders might see attractive possibilities in the opportunity for behavior monitoring across the insurance ecosystem, inevitable hurdles stand in the way of wholesale adoption. How insurers surmount each potential barrier is central to successful evolution.

For instance, the industry’s historically conservative approach to innovation may impede the speed and flexibility required for carriers to implement enhanced consumer strategies based on IoT technology. Execution may require more nimble data management and data warehousing than currently in place, as engineers will need to design ways to quickly aggregate, analyze, and act upon disparate data streams. To achieve this speed, executives may need to spearhead adjustments to corporate culture grounded in more centralized location of data control. Capabilities to discern which data are truly predictive versus just noise in the system are also critical. Therefore, along with standardized formats for IoT technology,5 insurers may see an increasing need for data scientists to mine, organize, and make sense of mountains of raw information.

Perhaps most importantly, insurers would need to overcome the privacy concerns that could hinder consumers’ willingness to make available the data on which the IoT runs. Further, increased volume, velocity, and variety of data propagate a heightened need for appropriate security oversight and controls.

For insurers, efforts to capitalize on IoT technology may also require patience and long-term investments. Indeed, while bolstering market share, such efforts could put a short-term squeeze on revenues and profitability. To convince wary customers to opt in to monitoring programs, insurers may need to offer discounted pricing, at least at the start, on top of investments to finance infrastructure and staff supporting the new strategic initiative. This has essentially been the entry strategy for auto carriers in the usage-based insurance market, with discounts provided to convince drivers to allow their performance behind the wheel to be monitored, whether by a device installed in their vehicles or an application on their mobile device.

Results from the Wikistrat crowdsourcing simulation reveal several other IoT-related challenges that respondents put forward. (See figure 2.)6

Figure 2

Each scenario implies some measure of material impact to the insurance industry. In fact, together they suggest that the same technology that could potentially help improve loss ratios and strengthen policyholder bonds over the long haul may also make some of the most traditionally lucrative insurance lines obsolete.

For example, if embedding sensors in cars and homes to prevent hazardous incidents increasingly becomes the norm, and these sensors are perfected to the point where accidents are drastically reduced, this development may minimize or eliminate the need for personal auto and home liability coverage, given the lower frequency and severity of losses that result from such monitoring. Insurers need to stay ahead of this, perhaps even eventually shifting books of business from personal to product liability as claims evolve from human error to product failure.

Examining the IoT through an insurance lens

Analyzing the intrinsic value of adopting an IoT strategy is fundamental in the development of a business plan, as executives must carefully consider each of the various dimensions to assess the potential value and imminent challenges associated with every stage of operationalization. Using Deloitte’s Information Value Loop can help capture the stages (create, communicate, aggregate, analyze, act) through which information passes in order to create value.7

The value loop framework is designed to evaluate the components of IoT implementation as well as potential bottlenecks in the process, by capturing the series and sequence of activities by which organizations create value from information (figure 3).

Figure 3

To complete the loop and create value, information passes through the value loop’s stages, each enabled by specific technologies. An act is monitored by a sensor that creates information. That information passes through a network so that it can be communicated, and standards—be they technical, legal, regulatory, or social—allow that information to be aggregated across time and space. Augmented intelligence is a generic term meant to capture all manner of analytical support, collectively used to analyze information. The loop is completed via augmented behavior technologies that either enable automated, autonomous action or shape human decisions in a manner leading to improved action.8
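
The value loop is a conceptual framework rather than an API, but a toy sketch can make the flow of a single reading through its stages easier to picture. The C fragment below is purely illustrative; the types, thresholds, and stage functions are assumptions invented for this example, not part of the Deloitte framework itself.

    #include <stdio.h>
    #include <stdbool.h>

    /* Hypothetical mapping of the value loop's five stages (create, communicate,
     * aggregate, analyze, act) onto the journey of a single sensor reading. */
    typedef struct {
        double value;        /* create: raw value produced by the sensor         */
        long   timestamp;    /* communicate: attached when sent over the network */
        double running_avg;  /* aggregate: combined with historical readings     */
        bool   anomaly;      /* analyze: result of comparing against a standard  */
    } reading_t;

    static reading_t communicate(double value, long timestamp)
    {
        reading_t r = { value, timestamp, 0.0, false };
        return r;
    }

    static void aggregate(reading_t *r, double prev_avg)
    {
        r->running_avg = (prev_avg + r->value) / 2.0;
    }

    static void analyze(reading_t *r, double limit)
    {
        r->anomaly = (r->value > limit);
    }

    static void act(const reading_t *r)
    {
        if (r->anomaly)
            printf("alert: reading %.1f exceeds the agreed limit\n", r->value);
        else
            printf("reading %.1f is within range\n", r->value);
    }

    int main(void)
    {
        double sample = 73.4;                           /* create      */
        reading_t r = communicate(sample, 1700000000L); /* communicate */
        aggregate(&r, 70.0);                            /* aggregate   */
        analyze(&r, 72.0);                              /* analyze     */
        act(&r);                                        /* act         */
        return 0;
    }

In a real deployment each stage would be a separate system (device firmware, network, data platform, analytics, and a notification or actuation service) rather than a handful of functions, but the sequence of hand-offs is the same.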

For a look at the value loop through an insurance lens, we will examine an IoT capability already at play in the industry: automobile telematics. By walking through the stages of the framework, we can scrutinize how monitoring driving behavior is poised to eventually transform the auto insurance market, with a vast infusion of value to both consumers and insurers.

Auto insurance and the value loop

Telematic sensors in the vehicle monitor an individual’s driving to create a personalized stream of data. The connected car, via in-vehicle telecommunication sensors, has been available in some form for over a decade.9 The key value for insurers is that sensors can closely monitor individual driving behavior, which directly corresponds to risk, for more accuracy in underwriting and pricing.

Originally, sensor manufacturers made devices available to install on vehicles; today, some carmakers are already integrating sensors into showroom models, available to drivers—and, potentially, their insurers—via smartphone apps. The sensors collect data (figure 4) which, if properly analyzed, might more accurately predict the unique level of risk associated with a specific individual’s driving and behavior. Once the data is created, an IoT-based system could quantify and transform it into “personalized” pricing.

Figure 4

Sensors’ increasing availability, affordability, and ease of use break what could otherwise be a bottleneck at this stage of the Information Value Loop, one that other IoT capabilities in their early stages still face.

IoT technology aggregates and communicates information to the carrier to be evaluated. To identify potential correlations and create predictive models that produce reliable underwriting and pricing decisions, auto insurers need massive volumes of statistically and actuarially credible telematics data.

In the hierarchy of auto telematics monitoring, large insurers currently lead the pack when it comes to usage-based insurance market share, given the amount of data they have already accumulated or might potentially amass through their substantial client bases. In contrast, small and midsized insurers—with less comprehensive proprietary sources—will likely need more time to collect sufficient data on their own.

To break this bottleneck, smaller players could pool their telematics data with peers either independently or through a third-party vendor to create and share the broad insights necessary to allow a more level playing field throughout the industry.

Insurers analyze data and use it to encourage drivers to act by improving driver behavior/loss costs. By analyzing the collected data, insurers can now replace or augment proxy variables (age, car type, driving violations, education, gender, and credit score) correlated with the likelihood of having a loss with those factors directly contributing to the probability of loss for an individual driver (braking, acceleration, cornering, and average speed, as figure 4 shows). This is an inherently more equitable method to structure premiums: Rather than paying for something that might be true about a risk, a customer pays for what is true based on his own driving performance.
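
As a purely hypothetical illustration of what pricing on "what is true" might look like, the sketch below turns the behavioral factors from figure 4 into a simple weighted score that scales a base premium. The weights, thresholds, and formula are invented for this example and do not represent any insurer's actual rating model.

    #include <stdio.h>

    /* Hypothetical per-1,000-mile driving metrics captured by a telematics device. */
    typedef struct {
        double hard_brakes;     /* hard braking events                  */
        double hard_accels;     /* rapid acceleration events            */
        double sharp_corners;   /* sharp cornering events               */
        double avg_speed_over;  /* average mph over posted speed limits */
    } telematics_t;

    /* Invented weights: each unit of risky behavior nudges the premium factor up. */
    static double behavior_factor(const telematics_t *t)
    {
        double score = 0.02 * t->hard_brakes
                     + 0.03 * t->hard_accels
                     + 0.02 * t->sharp_corners
                     + 0.05 * t->avg_speed_over;
        double factor = 0.85 + score;  /* start below 1.0 to reward safe driving */
        if (factor > 1.50)
            factor = 1.50;             /* cap the surcharge */
        return factor;
    }

    int main(void)
    {
        telematics_t safe  = { 1.0, 0.5, 1.0, 0.0 };
        telematics_t risky = { 9.0, 7.0, 6.0, 4.0 };
        double base_premium = 1200.0;  /* annual premium, illustrative only */

        printf("safe driver pays:  %.2f\n", base_premium * behavior_factor(&safe));
        printf("risky driver pays: %.2f\n", base_premium * behavior_factor(&risky));
        return 0;
    }

The point of the sketch is the shift in inputs: the structure holds measured behavior rather than proxies such as age, gender, or credit score.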

But even armed with all the data necessary to improve underwriting for “personalized” pricing, insurers need a way to convince millions of reluctant customers to opt in. To date, insurers have used the incentive of potential premium discounts to engage consumers in auto telematics monitoring.10 However, this model is not necessarily attractive enough to convince the majority of drivers to relinquish a measure of privacy and agree to usage-based insurance. It is also unsustainable for insurers that will eventually have to charge rates actually based on risk assessment rather than marketing initiatives.

Substantiating the point about consumer adoption is a recent survey by the Deloitte Center for Financial Services of 2,193 respondents representing a wide variety of demographic groups, aiming to understand consumer interest in mobile technology in financial services delivery, including the use of auto telematics monitoring. The survey identified three distinct groups among respondents when asked whether they would agree to allow an insurer to track their driving experience, if it meant they would be eligible for premium discounts based on their performance (figure 5).11 While one-quarter of respondents were amenable to being monitored, just as many said they would require a substantial discount to make it worth their while (figure 5), and nearly half would not consent.

Figure 5

While the Deloitte survey was prospective (asking how many respondents would be willing to have their driving monitored telematically), actual recruits have proven difficult to bring on board. Indeed, a 2015 LexisNexis study on the consumer market for telematics showed that usage-based insurance enrollment remained at only 5 percent of households from 2014 to 2015 (figure 6).12

Figure 6

Both of these survey results suggest that premium discounts alone have not and likely will not induce many consumers to opt in to telematics monitoring going forward, and would likely be an unsustainable model for insurers to pursue. The good news: Research suggests that, while protective of their personal information, most consumers are willing to trade access to that data for valuable services from a reputable brand.13 Therefore, insurers will likely have to differentiate their telematics-based product offerings beyond any initial early-adopter premium savings by offering value-added services to encourage uptake, as well as to protect market share from other players moving into the telematics space.

In other words, insurers—by offering mutually beneficial, ongoing value-added services—can use IoT-based data to become an integral daily influence for connected policyholders. Companies can incentivize consumers to opt in by offering real-time, behavior-related services, such as individualized marketing and advertising, travel recommendations based on location, alerts about potentially hazardous road conditions or traffic, and even diagnostics and alerts about a vehicle’s potential issues (figure 7).14 More broadly, insurers could aim to serve as trusted advisers to help drivers realize the benefits of tomorrow’s connected car.15

Many IoT applications offer real value to both insurers and policyholders: Consider GPS-enabled geo-fencing, which can monitor and send alerts about driving behavior of teens or elderly parents. For example, Ford’s MyKey technology includes tools such as letting parents limit top speeds, mute the radio until seat belts are buckled, and keep the radio at a certain volume while the vehicle is moving.16 Other customers may be attracted to “green” monitoring, in which they receive feedback on how environmentally friendly their driving behavior is.
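
To illustrate the mechanics behind geo-fencing (the distance formula below is the standard great-circle calculation; the fence center, radius, coordinates, and alert wording are hypothetical), a monitoring app could check each GPS fix against an allowed zone roughly as follows:

    #include <math.h>
    #include <stdio.h>

    #define EARTH_RADIUS_KM 6371.0
    #define PI 3.14159265358979323846

    /* Great-circle (haversine) distance between two GPS fixes, in kilometers. */
    static double haversine_km(double lat1, double lon1, double lat2, double lon2)
    {
        double rad  = PI / 180.0;
        double dlat = (lat2 - lat1) * rad;
        double dlon = (lon2 - lon1) * rad;
        double a = sin(dlat / 2) * sin(dlat / 2) +
                   cos(lat1 * rad) * cos(lat2 * rad) * sin(dlon / 2) * sin(dlon / 2);
        return 2.0 * EARTH_RADIUS_KM * atan2(sqrt(a), sqrt(1.0 - a));
    }

    int main(void)
    {
        /* Hypothetical fence: a 10 km radius around the family home. */
        double home_lat = 40.7128, home_lon = -74.0060, fence_km = 10.0;

        /* Latest GPS fix reported by the monitored vehicle. */
        double car_lat = 40.8500, car_lon = -73.9000;

        double d = haversine_km(home_lat, home_lon, car_lat, car_lon);
        if (d > fence_km)
            printf("alert: vehicle is %.1f km from home (fence is %.1f km)\n", d, fence_km);
        else
            printf("vehicle is inside the fence (%.1f km from home)\n", d);
        return 0;
    }

The same distance check, run against each incoming telematics fix, is what lets a carrier or parent receive an alert the moment a vehicle leaves the agreed zone.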

Insurers can also look to offer IoT-related services exclusive of risk transfer—for example, co-marketing location-based services with other providers, such as roadside assistance, auto repairs, and car washes, may strengthen loyalty to a carrier. They can also include various non-vehicle-related service options, such as alerts about nearby restaurants and shopping, perhaps in conjunction with points earned through good driving behavior in loyalty programs or through gamification, which could be redeemed at participating vendors. Indeed, consumers may be reluctant to switch carriers based solely on pricing, knowing they would be abandoning accumulated loyalty points as well as a host of personalized apps and settings.

For all types of insurance—not just auto—the objective is for insurers to identify the expectations that different types of policyholders may have, and then adapt those insights into practical applications through customized telematic monitoring to elevate the customer experience.

Telematics monitoring has demonstrated benefits even beyond better customer experience for policyholders. Insurers can use telematics tools to expose an individual’s risky driving behavior and encourage adjustments. Indeed, people being monitored by behavior sensors will likely improve their driving habits and reduce crash rates—a result to everyone’s benefit. This “nudge effect” indicates that the motivation to change driving behavior is likely linked to the actual surveillance facilitated by IoT technology.

The power of peer pressure is another galvanizing influence that can provoke beneficial consumer behavior. Take fitness wearables, which incentivize individuals to do as much exercise as, or more than, the peers with whom they compete.17 In fact, research done in several industries points to an individual’s tendency to be influenced by peer behavior above most other factors. For example, researchers asked four separate groups of utility consumers to cut energy consumption: one for the good of the planet, a second for the well-being of future generations, a third for financial savings, and a fourth because their neighbors were doing it. The only group that elicited any drop in consumption (at 10 percent) was the fourth—the peer comparison group.18

Insurers equipped with not only specific policyholder information but also aggregated data that puts a user’s experience in a community context have a real opportunity to influence customer behavior. Since people generally resist violating social norms, if a trusted adviser offers data that compares customer behavior to “the ideal driver”—or, better, to a group of friends, family, colleagues, or peers—they will, one hopes, adopt safer habits.

Figure 7

The future ain’t what it used to be—what should insurers do?

After decades of adherence to traditional business models, the insurance industry, pushed and guided by connected technology, is taking a road less traveled. Analysts expect some 38.5 billion IoT devices to be deployed globally by 2020, nearly three times as many as today,19 and insurers will no doubt install their fair share of sensors, data banks, and apps. In an otherwise static operating environment, IoT applications present insurers with an opportunity to benefit from technology that aims to improve profits, enable growth, strengthen the consumer experience, build new market relevance, and avoid disruption from more forward-looking traditional and nontraditional competitors.

Incorporating IoT technology into insurer business models will entail transformation to elicit the benefits offered by each strategy.

  • Carriers must confront the barriers associated with conflicting standards—data must be harvested and harnessed in a way that makes the information valid and able to generate valuable insights. This could include modernizing in-house legacy systems to make them more flexible, building or buying new systems, or collaborating with third-party sources to develop more standardized technology for harmonious connectivity.
  • Corporate culture will need a facelift—or, likely, something more dramatic—to overcome longstanding conventions on how information is managed and consumed across the organization. In line with industry practices around broader data management initiatives,20 successfully implementing IoT technology will require supportive “tone at the top,” change management initiatives, and enterprisewide training.
  • With premium savings already proving insufficient to entice most customers to allow insurers access to their personal usage data, companies will need to strategize how to convince or incentivize customers to opt in—after all, without that data, IoT applications are of limited use. To promote IoT-aided connectivity, insurers should look to market value-added services, loyalty points, and rewards for reducing risk. Insurers need to design these services in conjunction with their insurance offerings, to ensure that both make best use of the data being collected.
  • Insurers will need to carefully consider how an interconnected world might shift products from focusing on cleaning up after disruptions to forestalling those disruptions before they happen. IoT technology will likely upend certain lines of businesses, potentially even making some obsolete. Therefore, companies must consider how to heighten flexibility in their models, systems, and culture to counterbalance changing insurance needs related to greater connectivity.
  • IoT connectivity may also potentially level the playing field among insurers. Since a number of the broad capabilities that technology is introducing do not necessarily require large data sets to participate (such as measuring whether containers in a refrigerated truck are at optimal temperatures to prevent spoilage21 or whether soil has the right mix of nutrients for a particular crop22), small to midsized players or even new entrants may be able to seize competitive advantages from currently dominant players.
  • And finally, to test the efficacy of each IoT-related strategy prior to implementation, a framework such as the Information Value Loop may become an invaluable tool, helping forge a path forward and identify potential bottlenecks or barriers that may need to be resolved to get the greatest value out of investments in connectivity.

The bottom line: IoT is here to stay, and insurers need to look beyond business as usual to remain competitive.

The IoT is here to stay, the rate of change is unlikely to slow anytime soon, and the conservative insurance industry is hardly impervious to connectivity-fueled disruption—both positive and negative. The bottom line: Insurers need to look beyond business as usual. In the long term, no company can afford to engage in premium price wars over commoditized products. A business model informed by IoT applications might emphasize differentiating offerings, strengthening customer bonds, energizing the industry brand, and curtailing risk either at or prior to its initiation.

IoT-related disruptors should also be considered through a long-term lens, and responses will likely need to be forward-looking and flexible to incorporate the increasingly connected, constantly evolving environment. With global connectivity reaching a fever pitch amid increasing rates of consumer uptake, embedding these neoteric schemes into the insurance industry’s DNA is no longer a matter of if but, rather, of when and how.


Read more…

Guest blog post by Bernard Marr

What does big data know about you?

Quite a lot.

Every time we use a computer, access our phones, or open an app on a tablet, we’re leaving a digital trail. Most people are vaguely aware that Google knows what they’ve searched for, or that Facebook knows who their friends are, but it goes much, much deeper than that.

I’ve compiled a list of 21 things Big Data knows about almost every one of us — right now:

  1. Of course, Google knows what you’ve searched for. So do Bing, Yahoo!, and every other search engine. And your ISP knows every website you’ve ever visited. Ever (even in private browsing).
  2. Google also knows your age and gender — even if you never told them. They make a pretty comprehensive ads profile of you, including a list of your interests (which you can edit) to decide what kinds of ads to show you.
  3. Facebook knows when your relationship is going south. Based on activities and status updates on Facebook, the company can predict (with scary accuracy) whether or not your relationship is going to last.
  4. Google knows where you’ve travelled, especially if you have an Android phone.
  5. And the police know where you’re driving right now — at least in the U.K., where closed circuit televisions (CCTV) are ubiquitous. Police have access to data from thousands of networked cameras across the country, which scan license plates and take photographs of each car and its driver. In the U.S., many cities have traffic cameras that can be used similarly.
  6. Your phone also knows how fast you were going when you were traveling. (Be glad they don’t share that information with the police!)
  7. Your phone has also probably deduced where you live and work.
  8. The Internet knows where your cat lives, thanks to the hidden geolocation metadata we share when we publish photos of our cats on Instagram and other social media networks.
  9. Your credit card company knows what you buy and where. This has raised concerns that what you buy and where you shop might impact your credit score, since purchasing data can be used to decide whether you’re a credit risk.
  10. Your grocery store knows what brands you like. For every point a grocery store or pharmacy doles out, they’re collecting mountains of data about your purchasing habits and preferences. The chains are using the data to serve up personalized experiences when you visit their websites, personalized coupon offers, and more.
  11. HR knows when you’re going to quit your job. An HR software company called Workday is testing an algorithm that analyzes text in documents and can predict from that information which employees are likely to leave the company.
  12. Target knows if you’re pregnant. (Sometimes even before your family does.)
  13. YouTube knows what videos you’ve been watching. And even what you’ve searched for on YouTube.
  14. Amazon knows what you like to read, and Netflix knows what you like to watch. Even your public library knows what kinds of media you like to consume.
  15. Apple and Google know what you ask Siri and Cortana.
  16. Your child’s Barbie doll is also telling Mattel what she and your child talk about.
  17. Police departments in some major cities, including Chicago and Kansas City, know you’re going to commit a crime — before you do it.  
  18. Your auto insurance company knows when and where you drive — and they may penalize you for it, even if you’ve never filed a claim.
  19. Data brokers can help unscrupulous companies identify vulnerable consumers. For example, they may identify a population as a “credit-crunched city family” and then direct advertisements at you for payday loans.
  20. Facebook knows how intelligent you are, how satisfied you are with your life, and whether you are emotionally stable or not – simply based on a big data analysis of the ‘likes’ you have clicked.
  21. Your apps may have access to a lot of your personal data. Angry Birds gets access to your contact list in your phone and your physical location. Bejeweled wants to know your phone number. Some apps even access your microphone to record what’s going on around you while you use them.

This is actually just the tip of the iceberg. As we dive deeper into the benefits big data can provide to us, we’ll also be happily coughing up more and more data. The iPhone Health app, for instance, can collect data about all kinds of intimately personal things about your health.

It’s up to us, as consumers, to be aware of what we’re giving away, when, and to whom. I would love to hear your concerns and comments on this topic.

Bernard Marr is a best-selling author & keynote speaker. His new book: 'Big Data in Practice: How 45 Successful Companies Used Big Data Analytics to Deliver Extraordinary Results'


Read more…

The technology sector is buzzing with predictions and hype about the Internet of Things (IoT), but many people are still confused about what it means, what the real world opportunities are and why businesses should be looking into IoT.

At a fundamental and simplistic level, the Internet of Things refers to 'physical objects which are linked via wired or wireless networks'.

These physical objects could be anything (such as medical machines, vehicles, building systems, signage, toasters, smoke alarms, temperature sensors, weather monitors, intelligent tags or rubbish bins, for example). Almost any object, in any sector, in any location could potentially join the Internet of Things, so it's no wonder that Gartner predicts there will be 50 billion devices connected by 2020 (and other analysts estimate several orders of magnitude more).

Typically the Internet of Things is used to gather data and insight, find efficiencies, automate tasks or improve an experience or service. At Smarter Technology Solutions (STS) we put this down to a simple formula: with greater insight comes better decisions.

I know what you're thinking: why would you connect an object like a rubbish bin to the Internet?

Well, it's a simple example, but it has tremendous flow-on effects. By simply tracking the fill level of a rubbish bin using a smart sensor, councils and waste providers can learn a few important facts, such as fill-level trends, how often the bin really needs emptying and when, how to better plan waste collection services (e.g. timing bin collection near food outlets to avoid lunchtimes), and which areas may need more or fewer bins (to assist with city and service planning).
By collecting just the fill-level data of a waste bin, the following benefits could be attained (a minimal sketch of the threshold logic appears after the list):

  1. Reduction in cost, as fewer bin collections mean fewer waste trucks on the road, no unnecessary collections for a bin that's only 20% full, and less labour to complete waste collection. This also provides a level of operational efficiency and optimized processes.
  2. Environmental benefit - with waste not overflowing and truck usage reduced, the flow-on environmental impact, pollution and fuel consumption are minimized. By ensuring waste bins are placed in convenient locations, littering and scattered waste are also minimized.
  3. Service improvements - truck collection routes can be optimized, waste bins can be collected at convenient times, and planning of future or additional services can be adjusted, as the data to identify trends and verify assumptions is available.
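
The decision logic behind such a service can be very small. The C sketch below is illustrative only: the bin IDs, fill levels, and 75% threshold are invented for this example, and a production system would pull live sensor readings rather than hard-coded values.

    #include <stdio.h>

    #define NUM_BINS 4
    #define COLLECT_THRESHOLD 75.0   /* collect once a bin is 75% full (illustrative) */

    /* Hypothetical fill-level readings (percent full) reported by bin sensors. */
    typedef struct {
        int    id;
        double fill_percent;
    } bin_reading_t;

    int main(void)
    {
        bin_reading_t bins[NUM_BINS] = {
            { 101, 20.0 },   /* barely used: skip, saving a truck stop          */
            { 102, 82.5 },   /* near a food outlet before lunch: collect        */
            { 103, 48.0 },
            { 104, 91.0 },
        };

        for (int i = 0; i < NUM_BINS; i++) {
            if (bins[i].fill_percent >= COLLECT_THRESHOLD)
                printf("bin %d: %.1f%% full -> add to today's collection route\n",
                       bins[i].id, bins[i].fill_percent);
            else
                printf("bin %d: %.1f%% full -> skip, check again tomorrow\n",
                       bins[i].id, bins[i].fill_percent);
        }
        return 0;
    }

The same readings, accumulated over weeks, are what produce the fill-level trends used for route planning and bin placement decisions described above.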

More complex examples of IoT include:

  • Intelligent transport systems which update digital signage on the highway and adjusts the traffic lights in real time to divert traffic, optimise traffic flow and reduce congestion;
  • A farm which uses sensors to measure soil moisture, chemical levels and weather patterns, adjusting the watering and treatment schedules accordingly;
  • A building which draws the blinds to block out the afternoon sun, reducing the power needed to cool the building and keeping the environment comfortable;
  • Health-care devices which monitor patients and auto-alert medical practitioners once certain symptoms or attributes are detected; 
  • Trucks which automatically detect mechanical anomalies and auto schedule themselves in for preventative maintenance once they reach certain thresholds; 
  • Asset tracking of fleet vehicles within a services company which provides operations staff with fleet visibility to quickly dispatch the closest resource to a job based on proximity to the next task;
  • Water, gas and electricity meters which send in their own readings on a monthly basis, with trend analysis that can detect potential water or gas leaks; or
  • A retail store which analyses your in-store behavior or purchasing patterns and recommends products to you based on previous choices and your personal preferences.

At Smarter Technology Solutions we specialize in consulting with organizations to understand the benefits of IoT, designing best-fit solutions, engineering and implementing those solutions, and supporting the ongoing needs of the organization. This results in 3 key outcomes:

  • Discovery of New Opportunities - With better visibility, trends, opportunities, correlations and inefficiencies can be understood. From this, products, services and business models can be adjusted or changed to achieve competitive advantage.
  • Improved Efficiency - By identifying inefficiencies in existing business practices, work-flows can be improved and more automated services can be provided.
  • Improved Services - With trends and real-time data, businesses are able to make smarter decisions and alter the way services are delivered.

www.smartertechnologysolutions.com.au

Read more…

The Internet of Things (IoT) concept promises to improve our lives by embedding billions of cheap purpose-built sensors into devices, objects and structures that surround us (appliances, homes, clothing, wearables, vehicles, buildings, healthcare tech, industrial equipment, manufacturing, etc.).

IoT Market Map -- Goldman Sachs

What this means is that billions of sensors, machines and smart devices will simultaneously collect volumes of big data, while processing real-time fast data from almost everything and... almost everyone!!!

IoT vision is not yet reality

Simply stated, the Internet of Things is all about the power of connections.

Consumers, for the moment anyway, seem satisfied to have access to gadgets, trendy devices and apps which they believe will make them more efficient (efficient doesn't necessarily mean productive), improve their lives and promote general well-being.

Corporations on the other hand, have a grand vision that convergence of cloud computing, mobility, low-cost sensors, smart devices, ubiquitous networks and fast-data will help them achieve competitive advantages, market dominance, unyielding brand power and shareholder riches.

Global Enterprises (and big venture capital firms) will spend billions on the race for IoT supremacy. These titans of business are chomping at the bit to develop IoT platforms, machine learning algorithms, AI software applications & advanced predictive analytics. The end-game of these initiatives is to deploy IoT platforms on a large scale for:

  • real-time monitoring, control & tracking (retail, autonomous vehicles, digital health, industrial & manufacturing systems, etc.)
  • assessment of consumers, their emotions & buying sentiment,
  • managing smart systems and operational processes,
  • reducing operating costs & increasing efficiencies,
  • predicting outcomes, and equipment failures, and
  • monetization of consumer & commercial big data, etc.

 

IoT reality is still just a vision

No technology vendor (hardware or software), service provider, consulting firm or self-proclaimed expert can fulfill the IoT vision alone.

Recent history with tech hype-cycles has proven time and again that 'industry experts' are not very accurate predicting the future... in life or in business!

Having said this, it only makes sense that fulfilling the promise of IoT demands close collaboration & communication among many stake-holders.

A tech ecosystem is born

IoT & Industrial IoT comprise a rapidly developing tech ecosystem. Momentum is building quickly and will drive sustainable future demand for:

  • low-cost hardware platforms (sensors, smart devices, etc.),
  • a stable base of suppliers, developers, vendors & distribution,
  • interoperability & security (standards, encryption, APIs, etc.),
  • local to global telecom & wireless services,
  • edge to cloud networks & data centers,
  • professional services firms (and self-proclaimed experts),
  • global strategic partnerships,
  • education and STEM initiatives, and
  • broad vertical market development.

I'll close with one final thought; "True IoT leaders and visionaries will first ask why, not how..!"

Read more…

Guest blog post by Peter Bruce

When Apple CEO Tim Cook finally unveiled his company’s new Apple Watch in a widely-publicized rollout, most of the press coverage centered on its cost ($349 to start) and whether it would be as popular among consumers as the iPod or iMac.

Nitin Indurkhya saw things differently.

“I think the most significant revelation was that of ResearchKit,” Indurkhya said. “It allows the iWatch to gather huge amounts of health-related data from its sensors that could then be used for medical research, an area that has traditionally been plagued by small samples and inconsistent and costly data collection, and for preventive care.”

Indurkhya is in a perfect position to know. He teaches text mining and other online courses for Statistics.com and the Institute for Statistics Education. And if you’ve ever wondered about the origins of a term we hear everywhere today – Big Data – the mystery is over. Indurkhya, along with Sholom Weiss, first coined "Big Data" in a predictive data mining book in 1998. (“I never anticipated Big Data becoming a buzzword,” he said, “although we did expect the concept to take off.”)

The ResearchKit already has five apps that link users to studies on Parkinson's disease, diabetes, asthma, breast cancer and heart disease. Cook has touted other health benefits from Apple Watch, including its ability to tap users with a reminder to get up and move around if they have been sitting for a while. “We've taken (the mobile operating system) iOS and extended it into your car, into your home, into your health. All of these are really critical parts of your life,” Cook told a Goldman Sachs technology and Internet conference recently.

That helps explain the media fascination over another new Apple product. But it also tells us the importance of learning about Big Data. Having access to large amounts of raw numbers alone doesn’t necessarily change our lives. The transformation occurs when we master the skills needed to understand both the potential and the limitations of that information.

The Apple Watch exemplifies this because the ResearchKit essentially recruits test subjects for research studies through iPhone apps and taps into Apple Watch data. The implications for privacy, consent, sharing of data, and other ethical issues, are enormous. The Apple Watch likely won’t be the only device in the near future to prompt these kinds of concerns. It all leads to the realization that we need to be on a far more familiar basis with how data is collected and used than we’ve ever had to be in the past.

“We are increasingly relying on decisions, often from "smart" devices and apps that we accept and even demand, that arise from data-based analyses,” Indurkhya said. “So we do need to know when to, for example, manually override them in particular instances.

“Allowing our data to be pooled with others has benefits as well as risks. A person would need to understand these if they are to opt for a disclosure level that they are comfortable with. Otherwise the danger is that one would go to one or the other extreme, full or no participation, and have to deal with unexpected consequences.”

The Big Data questions raised by the Apple Watch are similar to the concerns over access to and disclosure of other reams of personal information. Edward Snowden’s leaks most famously brought these kinds of worries into play, publicizing the spying on ordinary Americans by the National Security Agency. There’s also commonly expressed fear that Big Data is dehumanizing, and that it’s used more for evil than for good.

These fears, Indurkhya noted, have seeped into the popular culture. Consider this list of Big Data movies: War Games, in which a supercomputer is given control of all United States defense assets. Live Free or Die Hard, in which a data scientist hacker hopes to eventually bring down the entire U.S. financial system. Even Batman gets into the act, hacking every cell phone in Gotham.

Little wonder people might shy away from studying big data. But that would be a mistake, said Indurkhya, who has a rebuttal for all the Hollywood hyped-fears.

First, he said, there are strong parallels between the Big Data revolution and the industrial revolution. Look at history. Despite all the dire predictions, machines aren't "taking over the world" and neither will Big Data.

Second, it’s also helpful to appreciate what Big Data gives us. It provides us with better estimates - they are more accurate and our confidence in them is higher. Perhaps more importantly, it provides estimates in situations where, in the absence of Big Data, answers were not obtainable at all, or not readily accessible. Think about searching the web for  "Little Red Riding Hood and Ricky Ricardo."  Even in the early days of the internet, you would have gotten lots of results individually for "Little Red Riding Hood" and "Ricky Ricardo," but it was not until Google had accumulated a massive enough data set, and perfected its Big Data search techniques, that you could reliably get directed to the "I Love Lucy" episode where Ricky dramatically reenacts the story for little Ricky. 

Data specialists can set policies and procedures that protect us from some of the risks of Big Data.  But we also need to become much more familiar with how our data is collected, analyzed, and distributed. If the Apple Watch rollout proves anything, it might be this: Going forward, we’ll all have to be as smart about data as our devices.


Read more…

Machine learning is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, and a vastly improved understanding of the human genome. It will play a big part in the IoT. From our friends at R2D3 comes a very interesting visual introduction to machine learning. Check it out here

Read more…

Keeping Up With Tech Trends

With the continuing evolution of technology, it's not surprising that trends are constantly changing as well. Many companies try to create new trends, or keep up with and ride the current ones, as they launch new tech startups that will hook the public and keep them wanting more.

Take Flappy Bird, for example. Although the application was released in May 2013, it made huge waves in 2014 and even became the most downloaded free game in the Apple App Store, earning $50,000 a day! After feeling guilty about its addictive nature, the creator removed the game from application stores. That gave other developers the opportunity to create similar applications of their own, and not long after, hundreds of Flappy Bird-like applications appeared in the app stores. Now the hype has died down, and Flappy Bird is becoming just another faded tech-trend memory.

If you're one of the companies making plans for a new tech trend, or you're simply drafting one, make sure you pay attention to what we think are the four rising tech trends this year. Bring out your pads and take note, people!

Think Smart

Nowadays, everything is getting smart. Companies are creating better smartphones. Don't know what the time is? Check your smart watch. You can now even wear your computer with smart glasses, and smart homes are getting popular too. Yes, developers are finding new ways to bring smarter things into this world. Create your own application or device that will keep the momentum going.

Some Privacy, Please

With plenty of news emerging about invasions of privacy, people are becoming more conscious about keeping things to themselves or sharing them with just a few people. Private applications like Snapchat, or ones that secure your pictures and other information, have made waves because people favour them for their privacy-promising qualities. Look into creating, or capitalizing on, applications that cater to this consumer need.

Drones!

With Amazon launching the video for Prime Air, you know the drones are coming. And given the strong interest from everyone who watched the video, we can predict that drones will no longer be something we only see on TV but something we'll be experiencing very soon.

Data Forever

With the growing popularity of social media and the web, unlimited data became, and still is, a huge trend. Look into ways to use that data for better advertising, better products, or a better consumer experience. The possibilities are endless.

Read more…
The ‘connected’ car, not to be confused with the self-driving, autonomous car, is defined as any vehicle equipped with Internet access that allows data to be sent to and from the vehicle.

Since the automobile was invented, car makers have been trying to add features that reduce driver error. Today’s car has the computing power of 20 personal computers, features about 100 million lines of programming code, and processes up to 25 gigabytes of data an hour.

Digital technology is also changing how we use and interact with our cars, and in more ways than you probably realize.

The market for smart vehicles is certainly set for takeoff and many analysts predict they could revolutionize the world of automobiles in much the same way smartphones have changed the face of telecommunications.

Is your car connected to the Internet? Millions of vehicles around the world already have embedded Internet access, offering their drivers a multitude of smart options and benefits. These include better engine controls, automatic crash notifications and safety alerts, to name just a few. Owners can also interact with their connected vehicles through apps from any distance.

Vehicle-to-vehicle communications, for example, could help automobiles detect one another's presence and location to avoid accidents. That could be especially useful when it comes to driver-less cars - another advance already very much in development. Similar technology could help ensure that cars and their drivers slow down for school zones or stop at red lights.

Connected vehicle technologies provide the tools to make transformational improvements in safety, significantly reducing the number of lives lost each year through crash-prevention applications.

The Connected Car will be optimized to track and report its own diagnostics, which is part of its appeal for safety-conscious drivers.
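
To make this concrete, here is a minimal sketch (TypeScript) of what a self-reported diagnostics payload and upload might look like. The field names and the fleet-service URL are invented for illustration; no manufacturer's actual API is implied, and a runtime with a global fetch (Node 18+ or similar) is assumed.

```typescript
// Hypothetical shape of a periodic diagnostics report from a connected car.
interface DiagnosticsReport {
  vin: string;               // vehicle identification number
  timestamp: string;         // ISO 8601 timestamp of the snapshot
  odometerKm: number;
  engineTempC: number;
  tirePressureKpa: number[]; // one reading per tire
  troubleCodes: string[];    // OBD-II style fault codes, if any
}

// Post the report to a made-up fleet service endpoint over HTTPS.
async function reportDiagnostics(report: DiagnosticsReport): Promise<void> {
  await fetch("https://fleet.example.com/v1/diagnostics", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
}
```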

Connected cars offer superior infotainment services such as navigation, traffic, weather, mobile apps, email and entertainment.

Auto insurers also have much to gain from the connected car revolution, as personalized, behavior-based premiums are already becoming the new industry standard.

OEMs and dealers must embrace the Big Data revolution now, so they’re ready to harness the plethora of data that will become available as more and more connected cars hit the roads.

Cloud computing powers much of the audio streaming capabilities and dashboard app functions that are becoming more commonplace in autos.

In the next 5 years it seems that non-connected cars will become a thing of the past.  Here are some good examples of connected cars:

  • Mercedes-Benz models introduced this year can link directly to Nest, the Internet of Things-powered smart home system, to remotely activate a home’s temperature controls prior to arrival.
  • Audi has developed a 12.3-inch, fully digital 3D-graphics dashboard in partnership with NVIDIA.
  • Telematics company OnStar can remotely shut down your stolen car, helping police recover it.
  • ParkMe provides real-time, dynamic parking information and guides drivers to open parking lots and meters. It is also integrating with mobile payments.

The next wave is the driverless, fully equipped and connected car, with no steering wheel, brakes, gas pedal or other major controls. You just have to sit back, relax and enjoy the ride!

This article originally appeared here.
Read more…

Node.js and The Internet of Things

Last year, we interviewed Patrick Catanzariti and asked him if Javascript will be the language of IoT. It was one of our most shared Q&As. Charlie Key's talk at the Node Community Conference provides a nice overview of how Node is driving adoption of IoT. In software development, Node.js is an open-source, cross-platform runtime environment for developing server-side Web applications. Although Node.js is not a JavaScript framework, many of its basic modules are written in JavaScript, and developers can write new modules in JavaScript.

Here's his presentation and a look at where the Internet of Things market is today, and how technologies like Node.js (JavaScript) and the Intel Edison are making it easier to create connected solutions.

The major topics include: 
* What is the Internet of Things 
* Where is IoT Today 
* 4 Parts of IoT (Collect, Communicate, Analyze, Act) 
* Why JavaScript is Good for IoT 
* How Node.js is Making a Dent in the Internet of Things 
* What npm Modules are used for Hardware (Johnny-Five, Cylon.js, MRAA; see the sketch after this list) 
* What is the Intel Edison 
* How to Best Work with the Edison 
* Tips for Edison (MRAA, Grove Kit, UPM) 
* Where the World of JavaScript and IoT is Going 
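
As a taste of the npm hardware modules mentioned above, here is a minimal Johnny-Five sketch that blinks an LED on an attached Arduino-class board. It is an illustrative snippet rather than part of the talk; it assumes `npm install johnny-five` plus, for TypeScript, the community type definitions, and the pin number and interval are arbitrary.

```typescript
// Blink an LED with Johnny-Five, one of the npm robotics/IoT modules named above.
import * as five from "johnny-five";

const board = new five.Board();  // auto-detects a connected Arduino-class board

board.on("ready", () => {
  const led = new five.Led(13);  // LED wired to digital pin 13
  led.blink(500);                // toggle every 500 ms
});
```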

Read more…

The IoT Database

Phillip Zito at the highly resourceful blog Building Automation Monthly has consolidated multiple IoT frameworks and offerings into the IoT Database, where you will find links to each framework and offering. He says that, over time, he will be working on summary articles for each one, and he could use your help: if you have an offering or framework you would like added to the list, feel free to add it in the comments. You can find the IoT Database here

Read more…

The IoT User Experience Urgency

As we evolve toward a software-defined world, there’s a new user experience urgency emerging.  That’s because the definition of “user” is going to be vastly expanded.  In the Internet of Things (IoT) era, users include machines.

Companies today are generating, collecting and analyzing more data than ever before.  They want to get better insights into their customers and their business operations.  This is driving substantial investments in new architectures that extend to cloud and mobility.

They’re also yielding to user demands for more and newer sources of big data.  They’re experimenting with data lakes to store this potential trove.  And they’re investing in data blending and visualization technologies to analyze it all.

In the IoT world of the near future, however, much of this analysis is going to be done by machines with deep learning capabilities.  With forecasts for as many as 50 billion connected devices by 2020, the experience of these “users” with the applications they engage with will be no less critical to achieving strategic objectives than customer experience is now – and will remain.

But how are companies going to get smarter if user experience sucks?  Where is this greater insight going to come from if whatever business intelligence software they’ve deployed is not performing to user expectations?

They’re not going to win customer satisfaction and loyalty by frustrating users.  And the risks involved with disappointing machine users could be catastrophic.

It's Time to Get Strategic

More companies have come to realize the strategic value of their data.  As such, they’re seeking ways to get a higher return on those data assets.  The databases – both transactional and analytic – they’ve invested in are critical to corporate strategy.

In order to maximize the performance of business-critical apps, companies must get strategic about user experience and application performance.  Monitoring technologies can no longer be implemented as short-term tactical bandages.

They might put out a brush fire temporarily, but they create more complexity and management headaches in the long run.  They often don’t work well together and generate more false positives than a smoke detector with a failing battery.  Annoying, right?

IT teams are going to have to get more efficient with their ops data.   They will need a standardized approach to integrating diverse data sets, including those from SaaS applications and IaaS or PaaS clouds.  This is critical to gaining physical and logical knowledge of the computing environment across the entire application delivery chain.

Next-generation data integration technologies can unify ops data from traditional monitoring solutions with real-time streams of machine data and other types of big data.  They automate much of the cleansing, matching, error handling and performance monitoring that IT Ops teams often struggle with manually.

As this ops data grows with IoT, it can be fed into a data lake for analysis.  In fact, IT teams can kill two birds with one stone.  First, IT Ops data is a natural fit as an early test case for a data lake.  And by starting now they can hone skill sets for big data analytics and the coming IoT data deluge.

IT Ops teams, which are increasingly becoming part of DevOps teams, can learn from and share their experiences with data management and analytics teams – as well as business teams.  It makes sense to bring application governance and data governance together because they share a common goal: ensuring that users have access to the highest quality data at the point of decision to optimize business outcomes and mitigate risks.

The Path to ROI and Risk Management Objectives

This environment necessitates communication and collaboration among IT and business teams to proactively anticipate, identify and resolve application performance and user experience problems.  It also facilitates orchestration and management of both internally and externally sourced services efficiently to improve decision-making and business outcomes.

Through a unified approach to performance analytics, IT can help their companies leverage technology investments to discover, interpret and respond to the myriad events that impact their operations, security, compliance and competitiveness.  Ops data efficiency becomes actionable to facilitate strategic initiatives and positively impact financial results.

Successful strategy implementation manifests in return on investment (ROI) and risk management.  Multiple studies, including ours and the annual Puppet Labs State of DevOps report, confirm that companies taking a strategic approach to user experience and application performance outperform their respective peer groups in financial metrics and market performance.

Vendors in this space – usually referred to as application performance management (APM) – need to advance their thinking and technology.  Machine learning and predictive analytics are going to be table stakes in the IoT future.

APM vendors have a choice: they can maintain a focus on human user experience, which will always be essential, or they can think more broadly about user experience in the IoT world.  Because some of today’s enterprise customers – which produce everything from home monitoring devices and appliances to turbine engines, agricultural machinery and healthcare equipment – could one day well become competitors.

By capturing data from embedded sensors and applying advanced analytics to provide customers using their equipment with deeper insights, they could close out what will become the lion’s share of the IoT user experience market.  Leading manufacturers are already there.


Originally posted on Big Data News by Gabriel Lowy


Read more…

IoT Dictionary and M2M Industry Terms


Here's a great resource  from Aeris - an IoT Dictionary.

Aeris Communications has been in the machine-to-machine market for some time and is both a technology provider and a cellular network operator delivering comprehensive IoT / M2M services.

This glossary includes key terms of the IoT (Internet of Things) and M2M (machine-to-machine) communications industry, including wireless and cellular technologies spanning many different markets. It is updated to reflect current terminology and usage. It's a crowd-sourced resource, so feel free to contact Aeris with suggestions.

Also, if you need an IT-related dictionary, I just love WhatIs.com.

Read more…

This is an interesting resource for data scientists, especially those contemplating a career move to IoT (Internet of things). Many of these modern, sensor-based data sets, collected via Internet protocols and various apps and devices, relate to the energy, urban planning, healthcare, engineering, weather, and transportation sectors.

Sensor data sets repositories

Originally posted on Data Science Central


Read more…

Eight IoT Analytics Products

Vitria IoT Platform

Vitria’s IoT analytics platform enables you to transform your business operations and boost revenue growth through faster analytics, smarter actions, and better outcomes.

It delivers faster, unified analytics via a Temporal Analytics Engine over all data types and cycles. Smarter actions enable better outcomes by combining prescriptive analytics with intelligent actions. Self-service and automation capabilities empower teams to accelerate time-to-value and create analytics solutions in minutes rather than months.

Tellient

Tellient's IoT Analytics gives you the whole story with beautiful graphs for humans, organized data for machines, designed for the Internet of Things. As the only analytics platform built specifically for the Internet of Things, Tellient's IoT Analytics helps manufacturers of smart connected devices know what those devices are doing so they can make them better.

ParStream

ParStream’s Analytics Platform was purpose-built for scale to handle the massive volumes and high velocity of IoT data. The Platform helps companies generate timely, actionable insights from IoT data by providing more innovative and efficient ways to analyze that data – faster, with greater flexibility and closer to the source. The Platform uniquely queries at the source of data for real-time analysis as data is being loaded. It also provides unified analytics of real-time data in every query and generates more accurate insights for decision-makers with the continuous import of new data.

IBM IoT Platform

IBM Internet of Things Foundation provides simple, but powerful application access to IoT devices and data to help you rapidly compose analytics applications, visualization dashboards and mobile IoT apps.

Dell Statistica IoT Platform

Dell Statistica is a business data analytics tool capable of delivering a wide range of solutions across sectors, from process optimization in manufacturing to fraud detection in banking. It even allows analytics on the gateway, providing faster local insights.

Splunk IoT Platform

Splunk offers a platform for operational intelligence that helps you search, monitor, analyze and visualize machine-generated big data from websites, networks and other IoT devices. In a recent announcement, Splunk said it will deliver real-time analytics and visualization for the AWS IoT service.

Intel® IoT Analytics Platform

This beta cloud-based analytics system for IoT includes resources for the collection and analysis of sensor data. Using this service, you can jump-start data acquisition and analysis without having to invest in large-scale storage and processing capacity.

Pentaho IoT Platform

Sensor, machine-to-machine, and network data are expected to play a larger role in analytics as the Internet of Things becomes a reality. However, these data types present significant challenges related to data volume and variety, as well as predictive modeling. Pentaho provides the ability to blend operational data with data from your IT systems of record and deliver intelligent analytics to those stakeholders who need them most.


Originally posted on Data Science Central



Read more…

Will Javascript be the Language of IoT?


JavaScript has proven itself worthy for web applications, both client and server side, but does it have potential to be the de-facto language of IoT?  

This is a topic I posed to Patrick Catanzariti, founder of DevDiner.com, a site for developers looking to get involved in emerging tech. Patrick is a regular contributor and curator of developer news and opinion pieces on new technology such as the Internet of Things, virtual/augmented reality and wearables. He is a SitePoint contributing editor, an instructor at SitePoint Premium and O'Reilly, a Meta Pioneer and freelance web developer who loves every opportunity to tinker with something new in a tech demo.

Why does IoT require a de facto language any more than any other system? Wouldn't that stifle future language evolution?

Honestly, I think it's a bit too much to ask for every single IoT device out there to run on JavaScript or any one de facto language. That's unbelievably tough to manage. Getting the entire world of developers to agree on anything is pretty difficult. Whatever solution the world of competing tech giants and startups comes to (which is likely to be a rather fragmented one if current trends are anything to go by), the most important thing is that these devices need to be able to communicate effectively with each other and with as few barriers as possible. They need to work together. It's the "Internet of Things". The entire benefit of connecting anything to the Internet is allowing it to speak to other devices at a massive scale. I think we'd be able to achieve this goal even with a variety of languages powering the IoT. So from that standpoint, I think it's totally okay for various devices to run on whichever programming language suits them best.

On the other hand, we need to honestly look at the future of this industry from a developer adoption and consistency perspective. The world of connected devices is going to skyrocket. We aren't talking about a computer in every home, we're talking dozens of interconnected devices in every home. If each one of those devices is from a different company who each decided on a different programming language to use, things are going to get very tough to maintain. Are we going to expect developers to understand all programming languages like C, C++, JavaScript, Java, Go, Python, Swift and more to be able to develop solutions for the IoT? Whilst I'm not saying that's impossible to do and I'm sure there'll be programmers up to the task of that - I worry that will impact the quality of our solutions. Every language comes with its quirks and best practices, it'll be tough to ensure every developer knows how to create best practice software for every language. Managing the IoT ecosystem might become a costly and difficult endeavour if it is that fragmented.

I've no issue with language evolution, however if every company decides to start its own language to better meet the needs of the IoT, we're going to be in a world of trouble too. The industry needs to work together on the difficulties of the IoT, not separately. The efforts of the Open Interconnect Consortium, AllSeen Alliance and IoT Trust Framework are all positive signs towards a better approach.

C, C++ and Java always seem to be the foundational languages used across platforms, so why do you think JavaScript will be the programming language of IoT?

My position is actually a bit more open than having JavaScript as the sole programming language of the IoT. I don't think that's feasible. JavaScript isn't great as a lower level language for memory management and the complexities of managing a device to that extent. That's okay. We are likely to have a programming language more suited to that purpose, like C or C++, as the de facto standard operational language. That would make perfect sense and has worked for plenty of devices so far. The issues I see are in connecting these devices together nicely and easily.

My ideal world would involve having devices running on C or C++ with the ability to also run JavaScript on top for the areas in which JavaScript is strongest. The ability to send out messages in JSON to other devices and web applications. That ability alone is golden when it comes to parsing messages easily and quickly. The Internet can speak JavaScript already, so for all those times when you need to speak to it, why not speak JavaScript? If you've got overall functionality which you can share between a Node server, front end web application and a dozen connected IoT devices, why not use that ability?

JavaScript works well with the event driven side of things too. When it comes to responding to and emitting events to a range of devices and client web applications at once, JavaScript does this pretty well these days.
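
To ground those two points with a toy illustration (editorial, not part of Patrick's answer): Node's built-in EventEmitter plus JSON.stringify is all it takes to model a device that emits readings other apps can parse directly. The sensor name and values below are invented.

```typescript
import { EventEmitter } from "events";

// A toy "device" that emits temperature readings as plain objects.
const device = new EventEmitter();

device.on("reading", (reading: { sensor: string; celsius: number; at: string }) => {
  // JSON is the lingua franca: any web app or Node server can parse this payload.
  console.log("outgoing message:", JSON.stringify(reading));
});

// Emit a fake reading once a second, event-driven style.
setInterval(() => {
  device.emit("reading", {
    sensor: "living-room-temp",        // invented sensor id
    celsius: 20 + Math.random() * 2,   // fake measurement
    at: new Date().toISOString(),
  });
}, 1000);
```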

JavaScript is also simpler to use, so for a lot of basic functionality like triggering a response on a hardware pin or retrieving data from a sensor, why overcomplicate it? If it's possible to write code that is clear and easy for many developers to understand and use without needing to worry about the lower level side of things - why not? We have a tonne of JavaScript developers out there already building for the web and having them on board to work with joining these devices to their ecosystem of web applications just makes sense.

Basically, I think we're looking at a world where devices run programming languages like C at their core but also can speak JavaScript for the benefits it brings. Very similar to what it looks like IoT.js and JerryScript will bring. I really like the Pebble Smartwatch's approach to this. Their watches run C but their apps use JavaScript for the web connectivity.

When it comes to solutions like IoT.js and JerryScript, they're written initially in C++. However they're providing an entire interface to work with the IoT device via JavaScript. One thing I really like about the IoT.js and JerryScript idea is that I've read that it works with npm - the Node Package Manager. This is a great way of providing access to a range of modules and solutions that already exist for the JavaScript and Node ecosystems. If IoT.js and JerryScript manage memory effectively and can provide a strong foundation for all the low level side of things, then it could be a brilliant way to help make developing for the IoT easier and more consistent with developing for the web with all the benefits I mentioned earlier. It would be especially good if the same functionality was ported to other programming languages too, that would be a fantastic way of getting each IoT device to some level of compatibility and consistency.

I'm hoping to try IoT.js and JerryScript out on a Raspberry Pi 2 soon, I'm intrigued to see how well it runs everything.

What do developers need to consider when building apps for IoT?

Security - If you are building an IoT device which is going to ship out to thousands of people, think security first. Make sure you have a way of updating all of those devices remotely (yet securely) with a security fix if something goes wrong. There will be bugs in your code. Security vulnerabilities will be found in even the most core technologies you are using. You need to be able to issue patches for them!

Battery life - If everyone needs to change your brand of connected light bulbs every two months because they run out of juice - that affects the convenience of the IoT. IoT devices need to last a long time. They need to be out of the way. Battery life is crucial. Avoid coding things in a way which drains battery power unnecessarily.

Compatibility - Work towards matching a standard like the Open Interconnect Consortium or AllSeen Alliance. Have your communication to other devices be simple and open so that your users can benefit from the device working with other IoT devices in new and surprising ways. Don't close it off to your own ecosystem!

What tools do you recommend for developing apps in IoT?

I'm a fan of the simple things. I still use Sublime Text for my coding most of the time as it's simple and out of the way, yet supports code highlighting for a range of languages and situations. It works well!

Having a portable 4G Wi-Fi dongle is also very very valuable for working on the go with IoT devices. It serves as a portable home network and saves a lot of time as you can bring it around as a development Wi-Fi network you turn on whenever you need it.

Heroku is great as a quick free platform to host your own personal IoT prototypes on too while you're testing them out. I often set up Node servers in Heroku to manage my communication between devices and it is the smoothest process I've found out of all of the hosting platforms so far.

For working locally - I've found a service called ngrok is perfect. It creates a tunnel to the web from your localhost, so you can host a server locally but access it online via a publicly accessible URL while testing. I've got a guide on this and other options like it on SitePoint.

Are you seeing an uptick in demand for IoT developers?

I've seen a demand slowly rising for IoT developers but not much of a developer base that is taking the time to get involved. I think partially it is because developers don't know where to start or don't realise how much of their existing knowledge already applies to the IoT space. It's actually one of the reasons I write at SitePoint as a contributing editor - my goal is to try and get more developers thinking about this space. The more developers out there who are getting involved, the higher the chances we hit those breakthrough ideas that can change the world. I really hope that having devices enabled with JavaScript helps spur on a whole community of developers who've spent their lives focused on the value of interconnected devices and shared information get involved in the IoT.

My latest big website endeavour called Dev Diner (http://www.devdiner.com) aims to try and make it easier for developers to get involved with all of this emerging tech too by providing guides on where to look for information, interviews and opinion pieces to get people thinking. The more developers we get into this space, the stronger we will all be as a community! If you are reading this and you're a developer who has an Arduino buried in their drawer or a Raspberry Pi 2 still in their online shopping cart - just do it. Give it a go. Think outside the box and build something. Use JavaScript if that is your strength. If you're stronger at working with C or C++, work to your strength but know that JavaScript might be a good option to help with the communication side of things too.

For more on Patrick’s thoughts on Javascript, read his blog post “Why JavaScript and the Internet of Things?” and catch his O’Reilly seminar here.

Read more…

Guest blog post by Ajit Jaokar

By Ajit Jaokar (@ajitjaokar). Please connect with me on LinkedIn if you want to stay in touch and receive future updates.

Cross posted from my blog - I look forward to discussion/feedback here

Note: The paper below is best read as a pdf which you can download from the blog for free

Background and Abstract

This article is a part of an evolving theme. Here, I explain the basics of Deep Learning and how Deep learning algorithms could apply to IoT and Smart city domains. Specifically, as I discuss below, I am interested in complementing Deep learning algorithms using IoT datasets. I elaborate these ideas in the Data Science for Internet of Things program which enables you to work towards being a Data Scientist for the Internet of Things  (modelled on the course I teach at Oxford University and UPM – Madrid). I will also present these ideas at the International conference on City Sciences at Tongji University in Shanghai  and the Data Science for IoT workshop at the Iotworld event in San Francisco


Deep Learning

Deep learning is often thought of as a set of algorithms that ‘mimics the brain’. A more accurate description would be an algorithm that ‘learns in layers’. Deep learning involves learning through layers which allows a computer to build a hierarchy of complex concepts out of simpler concepts.

The obscure world of deep learning algorithms came into the public limelight when Google researchers fed 10 million random, unlabeled images from YouTube into their experimental Deep Learning system. They then instructed the system to recognize the basic elements of a picture and how these elements fit together. The system, comprising 16,000 CPUs, was able to identify images that shared similar characteristics (such as images of cats). This canonical experiment showed the potential of Deep learning algorithms. Deep learning algorithms apply to many areas including Computer Vision, Image recognition, pattern recognition, speech recognition, behaviour recognition, etc.

 

How does a Computer Learn?

To understand the significance of Deep Learning algorithms, it’s important to understand how computers think and learn. Since the early days, researchers have attempted to create computers that think. Until recently, this effort has been rules-based, adopting a ‘top-down’ approach. The top-down approach involved writing enough rules for all possible circumstances.  But this approach is obviously limited by the number of rules and by its finite rules base.

To overcome these limitations, a bottom-up approach was proposed. The idea here is to learn from experience. The experience was provided by ‘labelled data’. Labelled data is fed to a system and the system is trained based on the responses. This approach works for applications like Spam filtering. However, most data (pictures, video feeds, sounds, etc.) is not labelled and if it is, it’s not labelled well.

The other issue is in handling problem domains which are not finite. For example, the problem domain in chess is complex but finite because there are a finite number of primitives (32 chess pieces) and a finite set of allowable actions (on 64 squares).  But in real life, at any instant, we have a large or potentially infinite number of alternatives. The problem domain is thus very large.

A problem like playing chess can be ‘described’ to a computer by a set of formal rules.  In contrast, many real-world problems are easily understood by people (intuitive) but not easy to describe (represent) to a computer (unlike chess). Examples of such intuitive problems include recognizing words or faces in an image. Such problems are hard to describe to a computer because the problem domain is not finite. Thus, the problem description suffers from the curse of dimensionality, i.e. when the number of dimensions increases, the volume of the space increases so fast that the available data becomes sparse. Computers cannot be trained on sparse data. Such scenarios are not easy to describe because there is not enough data to adequately represent the combinations represented by the dimensions. Nevertheless, such ‘infinite choice’ problems are common in daily life.

How do Deep learning algorithms learn?

Deep learning deals with ‘hard’, intuitive problems which have few or no rules and high dimensionality. Here, the system must learn to cope with unforeseen circumstances without knowing the rules in advance. Many existing systems, like Siri’s speech recognition and Facebook’s face recognition, work on these principles.  Deep learning systems are possible to implement now because of three reasons: High CPU power, Better Algorithms and the availability of more data. Over the next few years, these factors will lead to more applications of Deep learning systems.

Deep Learning algorithms are modelled on the workings of the Brain. The Brain may be thought of as a massively parallel analog computer which contains about 10^10 simple processors (neurons) – each of which requires a few milliseconds to respond to input. To model the workings of the brain, in theory, each neuron could be designed as a small electronic device which has a transfer function similar to a biological neuron. We could then connect each neuron to many other neurons to imitate the workings of the Brain. In practice, it turns out that this model is not easy to implement and is difficult to train.

So, we make some simplifications in the model mimicking the brain. The resultant neural network is called “feed-forward back-propagation network”.  The simplifications/constraints are: We change the connectivity between the neurons so that they are in distinct layers. Each neuron in one layer is connected to every neuron in the next layer. Signals flow in only one direction. And finally, we simplify the neuron design to ‘fire’ based on simple, weight driven inputs from other neurons. Such a simplified network (feed-forward neural network model) is more practical to build and use.

Thus:

a)      Each neuron receives a signal from the neurons in the previous layer

b)      Each of those signals is multiplied by a weight value.

c)      The weighted inputs are summed, and passed through a limiting function which scales the output to a fixed range of values.

d)      The output of the limiter is then broadcast to all of the neurons in the next layer.

Image and parts of the description in this section adapted from: Seattle robotics site
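
Steps a) to d) translate almost line for line into code. The sketch below (TypeScript, with arbitrary weights and a sigmoid as the limiting function) computes the output of one layer of such a feed-forward network.

```typescript
// Limiting function: squashes the weighted sum into the range (0, 1).
function sigmoid(x: number): number {
  return 1 / (1 + Math.exp(-x));
}

// One layer of a feed-forward network, following steps a) to d) above.
// weights[j][i] is the weight on the link from input neuron i to output neuron j.
function layerOutput(inputs: number[], weights: number[][]): number[] {
  return weights.map(neuronWeights => {
    // a) receive signals, b) multiply by weights, c) sum and pass through the limiter
    const weightedSum = neuronWeights.reduce((sum, w, i) => sum + w * inputs[i], 0);
    return sigmoid(weightedSum); // d) this output is broadcast to the next layer
  });
}

// Example: 3 inputs feeding a layer of 2 neurons (weights chosen arbitrarily).
console.log(layerOutput([0.5, 0.1, 0.9], [
  [0.2, -0.4, 0.7],
  [-0.6, 0.1, 0.3],
]));
```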

The most common learning algorithm for artificial neural networks is called Back Propagation (BP), which stands for “backward propagation of errors”. To use the neural network, we apply the input values to the first layer, allow the signals to propagate through the network and read the output. A BP network learns by example, i.e. we must provide a learning set that consists of some input examples and the known correct output for each case. So, we use these input-output examples to show the network what type of behaviour is expected. The BP algorithm allows the network to adapt by adjusting the weights, propagating the error value backwards through the network. Each link between neurons has a unique weighting value. The ‘intelligence’ of the network lies in the values of the weights. With each iteration of the errors flowing backwards, the weights are adjusted. The whole process is repeated for each of the example cases. Thus, to detect an object, programmers would train a neural network by rapidly sending across many digitized versions of data (for example, images) containing those objects. If the network did not accurately recognize a particular pattern, the weights would be adjusted. The eventual goal of this training is to get the network to consistently recognize the patterns that we recognize (e.g. cats).
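
The weight-adjustment idea can be sketched for the simplest possible case: a single sigmoid neuron trained on one input/output example. Full back propagation repeats exactly this kind of update layer by layer, passing the error backwards; the learning rate, initial weights and training pair below are arbitrary.

```typescript
const learningRate = 0.5;

function squash(x: number): number {
  return 1 / (1 + Math.exp(-x)); // sigmoid limiting function
}

// One training step for a single sigmoid neuron (the simplest special case
// of the back propagation weight update).
function trainStep(weights: number[], inputs: number[], target: number): number[] {
  const weightedSum = weights.reduce((sum, w, i) => sum + w * inputs[i], 0);
  const output = squash(weightedSum);
  // Error term: how wrong we were, scaled by the slope of the limiting function.
  const delta = (target - output) * output * (1 - output);
  // Nudge each weight in the direction that reduces the error.
  return weights.map((w, i) => w + learningRate * delta * inputs[i]);
}

// "Given an input, this is the correct output" - repeat the example many times.
let weights = [0.1, -0.2];
for (let epoch = 0; epoch < 1000; epoch++) {
  weights = trainStep(weights, [1, 0], 1); // input [1, 0] should produce 1
}
console.log(weights); // the first weight grows so the output approaches 1
```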

How does Deep Learning help to solve intuitive problems?

The whole objective of Deep Learning is to solve ‘intuitive’ problems, i.e. problems characterized by high dimensionality and no rules.  The above mechanism demonstrates a supervised learning algorithm based on a limited modelling of neurons – but we need to understand more.

Deep learning allows computers to solve intuitive problems because:

  • With Deep learning, Computers can learn from experience but also can understand the world in terms of a hierarchy of concepts – where each concept is defined in terms of simpler concepts.
  • The hierarchy of concepts is built ‘bottom up’ without predefined rules by addressing the ‘representation problem’.

This is similar to the way a child learns ‘what a dog is’, i.e. by understanding the sub-components of the concept – e.g. the behaviour (barking), the shape of the head, the tail, the fur, etc. – and then putting these concepts together in one bigger idea, i.e. the dog itself.

The (knowledge) representation problem is a recurring theme in Computer Science.

Knowledge representation incorporates theories from psychology which look to understand how humans solve problems and represent knowledge.  The idea is that if, like humans, computers were to gather knowledge from experience, it would avoid the need for human operators to formally specify all of the knowledge that the computer needs to solve a problem.

For a computer, the choice of representation has an enormous effect on the performance of machine learning algorithms. For example, based on the sound pitch, it is possible to know if the speaker is a man, woman or child. However, for many applications, it is not easy to know what set of features represent the information accurately. For example, to detect pictures of cars in images, a wheel may be circular in shape – but actual pictures of wheels may have variants (spokes, metal parts etc). So, the idea of representation learning is to find both the mapping and the representation.

If we can find representations and their mappings automatically (i.e. without human intervention), we have a flexible design to solve intuitive problems.   We can adapt to new tasks and we can even infer new insights without observation. For example, based on the pitch of the sound – we can infer an accent and hence a nationality. The mechanism is self learning. Deep learning applications are best suited for situations which involve large amounts of data and complex relationships between different parameters. Training a Neural network involves repeatedly showing it that: “Given an input, this is the correct output”. If this is done enough times, a sufficiently trained network will mimic the function you are simulating. It will also ignore inputs that are irrelevant to the solution. Conversely, it will fail to converge on a solution if you leave out critical inputs. This model can be applied to many scenarios as we see below in a simplified example.

An example of learning through layers

Deep learning involves learning through layers which allows a computer to build a hierarchy of complex concepts out of simpler concepts. This approach works for subjective and intuitive problems which are difficult to articulate.

Consider image data. Computers cannot understand the meaning of a collection of pixels. Mappings from a collection of pixels to a complex Object are complicated.

With deep learning, the problem is broken down into a series of hierarchical mappings – with each mapping described by a specific layer.

The input (representing the variables we actually observe) is presented at the visible layer. Then a series of hidden layers extracts increasingly abstract features from the input, with each layer concerned with a specific mapping. However, note that this process is not predefined, i.e. we do not specify what the layers select.

For example: From the pixels, the first hidden layer identifies the edges

From the edges, the second hidden layer identifies the corners and contours

From the corners and contours, the third hidden layer identifies the parts of objects

Finally, from the parts of objects, the fourth hidden layer identifies whole objects

Image and example source: Yoshua Bengio book – Deep Learning

Implications for IoT

To recap:

  • Deep learning algorithms apply to many areas including Computer Vision, Image recognition, pattern recognition, speech recognition, behaviour recognition etc
  • Deep learning systems are possible to implement now because of three reasons: High CPU power, Better Algorithms and the availability of more data. Over the next few years, these factors will lead to more applications of Deep learning systems.
  • Deep learning applications are best suited for situations which involve large amounts of data and complex relationships between different parameters.
  • Solving intuitive problems: Training a Neural network involves repeatedly showing it that: “Given an input, this is the correct output”. If this is done enough times, a sufficiently trained network will mimic the function you are simulating. It will also ignore inputs that are irrelevant to the solution. Conversely, it will fail to converge on a solution if you leave out critical inputs. This model can be applied to many scenarios

In addition, we have limitations in the technology. For instance, we have a long way to go before a Deep learning system can figure out that you are sad because your cat died (although it seems Cognitoys, based on IBM Watson, is heading in that direction). The current focus is more on identifying photos and guessing ages from photos (based on Microsoft’s Project Oxford API).

And we do indeed have a way to go, as Andrew Ng reminds us to think of Artificial Intelligence as building a rocket ship:

“I think AI is akin to building a rocket ship. You need a huge engine and a lot of fuel. If you have a large engine and a tiny amount of fuel, you won’t make it to orbit. If you have a tiny engine and a ton of fuel, you can’t even lift off. To build a rocket you need a huge engine and a lot of fuel. The analogy to deep learning [one of the key processes in creating artificial intelligence] is that the rocket engine is the deep learning models and the fuel is the huge amounts of data we can feed to these algorithms.”

Today, we are still limited by technology from achieving scale. Google’s neural network that identified cats had 16,000 nodes. In contrast, a human brain has an estimated 100 billion neurons!

There are some scenarios where Back propagation neural networks are well suited:

  • A large amount of input/output data is available, but you’re not sure how to relate it to the output. Thus, we have a larger number of “Given an input, this is the correct output” type scenarios which can be used to train the network because it is easy to create a number of examples of correct behaviour.
  • The problem appears to have overwhelming complexity. The complexity arises from a low rules base, high dimensionality, and data which is not easy to represent.  However, there is clearly a solution.
  • The solution to the problem may change over time, within the bounds of the given input and output parameters (i.e., today 2+2=4, but in the future we may find that 2+2=3.8) and Outputs can be “fuzzy”, or non-numeric.
  • Domain expertise is not strictly needed because the output can be purely derived from inputs: This is controversial because it is not always possible to model an output based on the input alone. However, consider the example of stock market prediction. In theory, given enough cases of inputs and outputs for a stock value, you could create a model which would predict unknown scenarios if it was trained adequately using deep learning techniques.
  • Inference:  We need to infer new insights without observation. For example, based on the pitch of the sound – we can infer an accent and hence a nationality

Given an IoT domain, we could consider the top-level questions:

  • What existing applications can be complemented by Deep learning techniques by adding an intuitive component (e.g. in smart cities)?
  • What metrics are being measured and predicted? And how could we add an intuitive component to the metric?
  • What applications exist in Computer Vision, Image recognition, pattern recognition, speech recognition, behaviour recognition, etc. which also apply to IoT?

Now, extending more deeply into the research domain, here are some areas of interest that I am following.

Complementing Deep Learning algorithms with IoT datasets

In essence, these techniques/strategies complement Deep learning algorithms with IoT datasets.

1)      Deep learning algorithms and Time series data: Time series data (coming from sensors) can be thought of as a 1D grid taking samples at regular time intervals, and image data can be thought of as a 2D grid of pixels. This allows us to model Time series data with Deep learning algorithms (most sensor / IoT data is time series).  It is relatively less common to explore Deep learning and Time series – but there are some instances of this approach already (Deep Learning for Time Series Modelling to predict energy loads using only time and temp data); a minimal windowing sketch follows after point 3 below.

2)      Multiple modalities: Multimodality in deep learning algorithms is being explored. In particular, cross-modality feature learning, where better features for one modality (e.g., video) can be learned if multiple modalities (e.g., audio and video) are present at feature learning time.

3)      Temporal patterns in Deep learning: In their recent paper, Ph.D. student Huan-Kai Peng and Professor Radu Marculescu, from Carnegie Mellon University’s Department of Electrical and Computer Engineering, propose a new way to identify the intrinsic dynamics of interaction patterns at multiple time scales. Their method involves building a deep-learning model that consists of multiple levels; each level captures the relevant patterns of a specific temporal scale. The newly proposed model can be also used to explain the possible ways in which short-term patterns relate to the long-term patterns. For example, it becomes possible to describe how a long-term pattern in Twitter can be sustained and enhanced by a sequence of short-term patterns, including characteristics like popularity, stickiness, contagiousness, and interactivity. The paper can be downloaded HERE
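
To make point 1 concrete, a usual first step is to treat the sensor stream as a 1D grid and slice it into fixed-length windows that become training examples, with the next sample as the target. The sketch below (TypeScript) does just that; the window length of 4 and the made-up readings are arbitrary choices for illustration.

```typescript
// Turn a 1D sensor series into (window -> next value) training pairs,
// the usual first step before feeding time series data to a learning algorithm.
function toTrainingPairs(
  series: number[],
  windowSize: number
): { input: number[]; target: number }[] {
  const pairs: { input: number[]; target: number }[] = [];
  for (let i = 0; i + windowSize < series.length; i++) {
    pairs.push({
      input: series.slice(i, i + windowSize), // the last few samples...
      target: series[i + windowSize],         // ...are used to predict the next one
    });
  }
  return pairs;
}

// Example: hourly temperature readings from a sensor (made-up values).
const readings = [21.0, 21.4, 22.1, 22.8, 23.0, 22.6, 21.9];
console.log(toTrainingPairs(readings, 4));
```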

Implications for Smart cities

I see Smart cities as an application domain for the Internet of Things. Many definitions exist for Smart cities/future cities. From our perspective, Smart cities refer to the use of digital technologies to enhance performance and wellbeing, to reduce costs and resource consumption, and to engage more effectively and actively with citizens (adapted from Wikipedia). Key ‘smart’ sectors include transport, energy, health care, water and waste. A more comprehensive list of Smart City/IoT application areas is: intelligent transport systems (including automatic and autonomous vehicles), medical and healthcare, environment, waste management, air quality, water quality, accident and emergency services, and energy (including renewables). In all these areas we could find applications to which we could add an intuitive component based on the ideas above.

Typical domains will include Computer Vision, Image recognition, pattern recognition, speech recognition and behaviour recognition. Of special interest are new areas such as self-driving cars – e.g. the Lutz pod – and even larger vehicles such as self-driving trucks.

Conclusions

Deep learning involves learning through layers which allows a computer to build a hierarchy of complex concepts out of simpler concepts. Deep learning is used to address intuitive applications with high dimensionality.  It is an emerging field and over the next few years, due to advances in technology, we are likely to see many more applications in the Deep learning space. I am specifically interested in how IoT datasets can be used to complement deep learning algorithms. This is an emerging area with some examples shown above. I believe that it will have widespread applications, many of which we have not fully explored (as in the Smart city examples).

I see this article as part of an evolving theme. Future updates will explore how Deep learning algorithms could apply to IoT and Smart city domains. Also, I am interested in complementing Deep learning algorithms using IoT datasets.

I elaborate these ideas in the Data Science for Internet of Things program  (modelled on the course I teach at Oxford University and UPM – Madrid). I will also present these ideas at the International conference on City Sciences at Tongji University in Shanghai  and the Data Science for IoT workshop at the Iotworld event in San Francisco

Please connect with me if you want to stay in touch on linkedin and for future updates


Read more…