
Notable IoT Announcements at CES 2016


CES 2016 has just concluded: 170,000 attendees from across the globe and 3,600 vendors gathered across 2.4 million net square feet of exhibit space to debut the latest products and services spanning the entire consumer tech ecosystem.

The show has come a long way since spinning out of the Chicago Music Show in 1967. Products that have debuted at CES include the videocassette recorder, the compact disc player, HDTV, the Microsoft Xbox and smart appliances.

Each year a new category seems to be added to the consumer electronics mix. In 2015 the big buzzword was the Internet of Things, and its momentum carried over to 2016, with more than 1,000 exhibitors unveiling IoT technologies. For a community like ours, focused on the industrial side of the IoT, what does a consumer electronics show have to do with our world?

A lot, actually.

Here are the notable announcements from CES 2016:

 

WiFi HaLow

For industrial IoT heads, this is probably the most notable announcement to come out of the show. The Wi-Fi Alliance® introduced a low-power, long-range standard dubbed Wi-Fi HaLow™.

In the IoT space with billions of sensors to be placed everywhere, the industry is in need of a low power Wi-Fi solution. Wi-Fi HaLow will be a designation for products incorporating IEEE 802.11ah technology. Wi-Fi HaLow operates in frequency bands below one gigahertz, offering longer range, lower power connectivity to Wi-Fi certified products.
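
For a rough sense of why sub-gigahertz operation buys range: free-space path loss grows with frequency, so at the same distance a 900 MHz link loses about 8.5 dB less than a 2.4 GHz one. The back-of-the-envelope sketch below uses only the standard free-space path loss formula; real deployments also gain from better wall penetration at lower frequencies, so treat the numbers as illustrative.

    import math

    def fspl_db(distance_km: float, freq_mhz: float) -> float:
        """Free-space path loss in dB (distance in km, frequency in MHz)."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    # Compare a 900 MHz HaLow-style link with 2.4 GHz Wi-Fi over the same 1 km.
    loss_900 = fspl_db(1.0, 900)      # ~91.5 dB
    loss_2400 = fspl_db(1.0, 2400)    # ~100.0 dB
    print(f"Sub-GHz advantage: {loss_2400 - loss_900:.1f} dB")   # ~8.5 dB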

Edgar Figueroa, President and CEO of Wi-Fi Alliance said, “Wi-Fi HaLow is well suited to meet the unique needs of the Smart Home, Smart City, and industrial markets because of its ability to operate using very low power, penetrate through walls, and operate at significantly longer ranges than Wi-Fi today. Wi-Fi HaLow expands the unmatched versatility of Wi-Fi to enable applications from small, battery-operated wearable devices to large-scale industrial facility deployments – and everything in between.”

Many devices that support Wi-Fi HaLow are expected to operate in 2.4 and 5 GHz as well as 900 MHz, allowing devices to connect with Wi-Fi’s ecosystem of more than 6.8 billion installed devices. Like all Wi-Fi devices, HaLow devices will support IP-based connectivity to natively connect to the cloud, which will become increasingly important in reaching the full potential of the Internet of Things. Dense device deployments will also benefit from Wi-Fi HaLow’s ability to connect thousands of devices to a single access point.

The bad news? The Wi-Fi Alliance isn't planning to roll out HaLow certifications until sometime in 2018, and even when it arrives, it might not become the de facto standard. There are others vying for the crown.

 

AT&T

AT&T held a developer summit at the Palms Resort that was all about emerging technologies, products and services. A year ago, AT&T launched the M2X Data Service, a cloud-based data storage service for enterprise IoT developers. At CES, the company announced the commercial launch of Flow Designer, a cloud-based tool developed at the AT&T Foundry that lets IoT developers quickly build new applications. AT&T also said it is on track to have 50% of its software built on open source, working with OpenDaylight, OPNFV, ON.Lab, the Linux Foundation, OpenStack and others. Rachel King of ZDNet has an interview with AT&T President and CEO Ralph de la Vega here.
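
For flavor, here is a minimal sketch of what pushing a sensor reading into M2X looked like from the developer side. The endpoint shape and X-M2X-KEY header follow M2X's public v2 REST documentation as I recall it; the device ID, stream name and key are placeholders, so treat this as illustrative rather than copy-paste ready.

    import requests

    M2X_KEY = "your-m2x-api-key"     # placeholder credential
    DEVICE_ID = "your-device-id"     # placeholder device
    STREAM = "temperature"           # placeholder stream name

    # Post one value to a device stream over the v2 REST API.
    resp = requests.post(
        f"https://api-m2x.att.com/v2/devices/{DEVICE_ID}/streams/{STREAM}/values",
        headers={"X-M2X-KEY": M2X_KEY, "Content-Type": "application/json"},
        json={"values": [{"value": 22.4}]},
    )
    resp.raise_for_status()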


 

Ericsson

Ericsson and Verizon announced joint activities to further the development and deployment of cellular low-power wide-area (LPWA) networking for a diverse range of IoT applications. Ericsson introduced three IoT solutions for smart homes and cities:

  • Smart Metering as a Service puts consumers in control and enables utility companies to offer "smart" services to consumers in the future.

  • User & IoT Data Analytics enables controlled access and exposure of data from cellular and non-cellular devices and creates value through cross-industry offerings.

  • Networks Software 17A diversifies cellular for massive IoT, supporting millions of IoT devices in a single cell site, a 90 percent reduction in module cost, 10+ years of battery life and a sevenfold improvement in cell coverage.

 

IBM Watson

Last year, IBM announced a USD 3 billion investment in the Internet of Things, and in October it announced plans to acquire The Weather Company, accelerating IBM's efforts in an IoT market expected to reach USD 1.7 trillion by 2020.

IBM furthered that commitment with five related IoT announcements at CES, involving SoftBank, Whirlpool, Under Armour, Pathway Genomics and Ford. What IBM does with Watson in the consumer space will carry over to the industrial space, and vice versa. With the tremendous volumes of data from IoT, Watson's cognitive computing power will be one way to exploit this new resource. Fortune's Stacey Higginbotham has more here.

 

Intel

Lady Gaga aside, Intel's messaging at CES came through a lot clearer than Qualcomm's 14 announcements! Rather than focus on technical aspects, Intel announced innovative technologies and collaborations aimed at delivering amazing experiences throughout daily life - something we often forget to do as we get enamored with the 1s and 0s. From unmanned aerial vehicles and wearables to new PCs and tablets, Intel made sure its chips were inside. On the industrial front was the DAQRI Smart Helmet, an augmented-reality helmet for the industrial worker, powered by an Intel® Core™ M processor.


 

Qualcomm

Qualcomm made a mind-boggling 14 announcements during CES. Probably the most interesting was the Qualcomm® Snapdragon™ X5 LTE modem (9x07). Qualcomm says the chip is multimode and supports LTE Category 4 download speeds of up to 150 Mbps. It's designed for a range of mobile broadband applications and for IoT use cases that demand higher data rates.

 

Samsung

The President and CEO of Samsung Electronics, BK Yoon, delivered the opening keynote speech at CES, calling for greater openness and collaboration across industries to unlock the infinite possibilities of the Internet of Things. Mr. Yoon announced a timetable for making Samsung technology IoT-enabled: by 2017, all Samsung televisions will be IoT devices, and within five years all Samsung hardware will be IoT-ready. He also emphasized the importance of developers in building the IoT and announced that Samsung will invest more than USD 100 million in its developer community in 2015.

 

ZigBee Alliance

The ZigBee Alliance, a non-profit association of companies creating open, global standards that define the Internet of Things for use in consumer, commercial and industrial applications, announced that it is working with the Thread Group on an end-to-end solution for IP-based IoT networks. The solution will become part of the ZigBee Alliance’s comprehensive set of product development specifications, technologies, and branding and certification programs.

 

I’m sure there were many more industrial Internet of Things announcements. Let me know what I missed in the comments section below.




Read more…

5 Ways SMS Messaging Will Play Out in IoT

Technophiles and dreamers unite in their joint vision of a future where our lives are connected via a network of devices that electronically talk both to each other and to us. This intelligent design, often – and fondly – referred to as the 'Internet of Things', is a way to semi-automate everything from our homes to our workplaces to all kinds of fun and functional activities in between. While there are already glimpses of new ways to push the boundaries of cutting-edge communication, SMS messaging is currently slated to play a major role in how we live our day-to-day lives in a smarter way.

Housekeeping Reimagined

How many times have you forgotten to transfer wet clothes from the washer to the dryer and found yourself needing to rewash a stinky pile of long-sitting laundry? Perhaps you're notorious for not watering your plants, or maybe you're nagged by the idea that you've driven off and left the garage door up. As programming develops and device interconnectivity grows, you might get a text message from your smart flower pot or pre-programmed appliances alerting you to the error and giving you options for corrective action. Even better, the appliances' connection to the internet will allow them to guide you towards the proper amount of water for your plant or the temperature your fridge ideally should be at, so you can make an educated decision about what steps to take next. If it sounds too good to be true, think again; by 2019, there will be 1.9 billion home devices connected across the IoT, to the tune of some $490 billion in revenue.
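
As a minimal sketch of the device side of such an alert, here is how an appliance controller might send that text through Twilio's Python client, one widely used SMS gateway. The credentials and phone numbers are placeholders, and the trigger condition is invented for illustration.

    from twilio.rest import Client

    client = Client("ACCOUNT_SID", "AUTH_TOKEN")   # placeholder credentials

    def alert_owner(message: str) -> None:
        """Send an SMS from the appliance's provisioned number to its owner."""
        client.messages.create(
            to="+15551234567",       # owner's phone (placeholder)
            from_="+15557654321",    # appliance's Twilio number (placeholder)
            body=message,
        )

    # e.g. fired when a wash cycle ended 45 minutes ago and the door never opened
    alert_owner("Washer: cycle finished 45 min ago and the door hasn't opened.")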

Medical Milestones

The Internet of Things holds great promise for the medical industry. Geoff Zawolkow, CEO of Lab Sensor Solutions, says, "Sensors are changing the face of medicine. Mobile sensors are used to automatically diagnose disease and suggest treatment, bringing us closer to having a Star Trek type Tricorder. Also mobile sensors will ensure the quality of our drugs, diagnostic samples and other biologically sensitive materials through remote monitoring, tracking and condition correction." SMS may connect medical professionals in their quest for quicker and more accurate diagnoses, but there are real-world applications for everyday use as well; there are already pill boxes that will text you a reminder if you forget to take your daily dose, and a clever wearable gadget could send an alert to your phone if your heart rate or blood pressure read abnormally.

Security

We all seek to protect our homes and loved ones, and the Internet of Things is making that easier and easier. Smart locks with electronic sensors can be activated – or deactivated, should your child arrive home to an empty house and find themselves unable to remember the entry code – by text, and should a break-in occur, emergency services and other chosen parties will get an SMS update as well.

Promoting Independent Living

The Internet of Things can help elderly relatives live alone longer by providing a constant connection between them and their caregivers. A network of wireless sensors placed around the home and even worn on the person can track, log, and transmit a person's daily activities and alert the proper authorities if there's a heat flare (fire), lack of regular activity (sudden illness or a fall), or even a fever or elevated heart rate. The alert threshold can be adjusted to maintain privacy and allow for discretion in all but certain circumstances deemed text-worthy by those involved. The result is greater independence for our parents and grandparents and peace of mind for those who love them the most.

Streamlining Inventory and Ordering

Running out of milk or eggs at home is inconvenient enough, but in the restaurant industry inventory mistakes can be practically catastrophic. Connected coolers, freezers, pantries, and storage containers can send an automated SMS message when a product drops below a set level, with embedded data that can be plugged into an ordering system or forwarded straight to the distributor to maximize efficiency.
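
A sketch of the threshold logic such a connected pantry might run is below; the SKUs, levels and message format are invented for illustration, with the idea that the text body doubles as a machine-parsable reorder payload.

    import json

    REORDER_LEVELS = {"milk_gal": 6, "eggs_doz": 10}   # illustrative thresholds
    current_stock = {"milk_gal": 4, "eggs_doz": 12}    # latest sensor readings

    for sku, level in REORDER_LEVELS.items():
        on_hand = current_stock[sku]
        if on_hand < level:
            payload = {"sku": sku, "on_hand": on_hand, "reorder_qty": level * 2}
            # Human-readable text with an embedded payload for the ordering system.
            sms_body = f"LOW STOCK {sku}: {on_hand} left. ORDER:{json.dumps(payload)}"
            print(sms_body)   # hand off to the SMS gateway / distributor feed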

Experts say that there may be as many as 30 billion devices in the collective Internet of Things by the year 2020 – a huge web of connected devices that work in concert to make our lives bigger, better, and more efficient. Read more about the impending electronic evolution and prepare yourself for a brave new world.

Read more…

2016 Predictions: IoT

Internet of Things (IoT) has garnered massive attention across the tech industry and portends major productivity advances for businesses of all types. The coming year holds significant promise as, potentially, the year in which actual (i.e., not simply existing products and services rechristened as IoT) business and industrial IoT deployments hit the mainstream.

Here are a few predictions from Bsquare.

1.  The fragmentation of IoT will gain speed

The IoT market is so broad that research data analyzing size and growth aspects of the market become almost meaningless. In fact, with such a broadly defined market, saying the IoT market will generate x trillions of dollars of economic impact is analogous to saying the same about the “technology” market. Interesting maybe, but hardly actionable for buyers, investors, suppliers or other market participants.

Going forward, the industry will begin to break the IoT market apart into subsets that actually fit together. Many of these discrete markets may even dispense with the IoT acronym.

At the highest level this segmentation is already occurring. Consumer and business IoT, for example, are discrete markets having very little in common. Going further, and just looking at business IoT, segments are emerging for devices and device software, cloud services, machine learning, predictive analytics, and others that allow trends to be more accurately identified and tracked.

2.  Business IoT will see the fastest pace of innovation

While interesting developments are occurring in the consumer space, and these will continue to garner disproportionate attention, the most meaningful innovation will occur on the business side of the market. This is primarily due to the fact that while consumers may purchase IoT products for purely personal reasons, businesses embark on IoT initiatives with specific business objectives in mind, most of which translate directly or indirectly into improved financial outcomes. As a result, business IoT systems are considerably more complex and multifaceted, in many cases touching many core operational systems, yet afford more opportunities to innovate.

Advances in on-board device logic, machine learning, complex rule processing, predictive analytics, as well as IoT platforms in general, will raise the bar in terms of business outcome improvements that can be derived with IoT.

3.  A movement will be launched to capitalize the ‘o’ in IoT

Acronyms have long been smiled upon by technology people. They contribute to verbal economy and improve communications efficiency. For example, saying “TCP” requires five fewer syllables than “transmission control protocol.” And while there is no convention for the structure of acronyms, they are typically all caps in order to distinguish them from abbreviations.

However, starting with voice over IP (VoIP), technology acronyms started to get playful with capitalization. This was followed by fill-in-the-blank-as-a-service (SaaS, IaaS, PaaS, etc.) and, among others, IoT. The rationale for this was undoubtedly that words like “over,” “as,” “a,” and “of” are not important enough to warrant capitalization. This wouldn’t be a problem, and might even be slightly amusing, if it weren’t for the fact that the acronym most frequently appearing near IoT is ROI (return on investment). How do we account for the fact that “on” warrants a capital “O” while “of” has to get by with a small “o”?

4.  Analyst estimates will start to decline… but become more realistic

Stupendously ridiculous numbers have been bandied about regarding the potential size of the IoT market, the number of things participating, and total economic impact. The zeal to outdo one another quickly led to numbers in the trillions (one large company forecast economic impact at $82 trillion; by way of reference, nominal gross product for the planet earth is roughly $74 trillion (to be fair, the author didn’t specify a planet)). It seemed only a matter of time before someone would finally break out the “Q” word. E.g., “global economic value attributable to IoT is expected to eclipse two quadrillion dollars by the year… .”

As we get closer to reality many of these forecasts have been ratcheted down, in some cases by an order of magnitude. We expect these refinements will continue but at the same time become more realistic. In some ways, the progression of market forecasts follows the shape of Gartner’s well-known hype curve—progressively more outlandish estimates followed by a crashing back down to earth and finally settling into more realistic and sustainable ranges.

5.  2016 will be the year actual business IoT deployments accelerate

As with any new technology, suppliers have a propensity to rechristen products and/or services they already offer using terminology associated with that new technology. Hence it might appear that the business-oriented IoT market is already going gangbusters when in fact it’s still in its infancy. This tendency is understandable and, in some cases, not completely without merit. But what is truly interesting for businesses is complete systems in which intelligent devices generate data that is captured by enterprise systems in order to automatically drive desired business outcomes. This, more than anything else, is why IoT is not even remotely the same as M2M.

For possibly the first time, 2016 will mark the beginning of complete, large scale IoT systems that directly and automatically link devices with business outcomes.

Read more…

Decision Scientist vs. Data Scientist

Someone asked a question in a LinkedIn forum on the distinction between Decision Science and Data Science.

My 2 cents. Others may disagree, given their particular orientation. Given a choice, we are in the Decision Science domain; given a firehose, we are in the Data Science domain. I would suggest that Decision Science ends where Data Science begins. Decision Scientists don't necessarily work with Big Data. Data Scientists are specialized to work with Big Data (or to recognize when something isn't truly Big Data) and all the problems associated with that domain.

Decision Scientists build decision support tools to enable decision makers to make decisions, or take action, under uncertainty with a data-centric bias. Traditional analytics falls under this domain. Often decision makers like linear solutions that provide simple, explainable, socializable decision making frameworks. That is, they are looking for a rationale. Data Scientists build machines to make decisions about large-scale complex dynamical processes that are typically too fast (velocity, veracity, volume, etc.) for a human operator/manager. They typically don't concern themselves with whether the algorithm is explainable or socializable, but are more concerned with whether it is functional, reliable, accurate, and robust.
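
To make the contrast concrete: a Decision Scientist might hand a manager an explainable linear model whose coefficients double as the rationale, while a Data Scientist wires the fitted model into an unattended scoring loop. A toy sketch along those lines, using scikit-learn on synthetic data (the feature names and decision rule are invented for illustration):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))                  # e.g. recency, frequency, spend
    y = (X @ np.array([1.5, -0.8, 0.4]) + rng.normal(size=1000) > 0).astype(int)

    model = LogisticRegression().fit(X, y)

    # Decision Science: coefficients are the explainable, socializable rationale.
    for name, coef in zip(["recency", "frequency", "spend"], model.coef_[0]):
        print(f"{name}: {coef:+.2f}")

    # Data Science: the same model embedded in an unattended scoring loop,
    # acting on each arriving observation with no human in the loop.
    for reading in rng.normal(size=(5, 3)):         # stand-in for a data stream
        action = model.predict(reading.reshape(1, -1))[0]
        print("act" if action else "hold")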

When decision making has to be done at scale, iteratively, repetitively, in a non-convex domain, in real time, you need a Data Scientist and lots of compute power, bandwidth, storage capacity, scalability, etc. The dynamic nature of the generating process, which leads to high-volume, high-velocity data, is in my mind the differentiating factor between the two domains.

The transition from manually input data sources to sensor-driven, real-time data sources is the underlying theme of the "Internet of Things", and this is especially true of the "Industrial Internet of Things", with complex machines interconnected via hundreds of sensors relaying data continually. The underlying math may be somewhat sharable, but doing it at scale, at velocity, etc. requires an end-to-end approach and a greater emphasis on high-speed algorithms. This is particularly true when you are talking about IoT, and especially IIoT, where you do not want humans making decisions "on the fly".

A Decision Science problem might be something like a marketing analytics problem where you are segmenting a customer base, identifying a high margin target market, qualifying leads, etc. Here the cost of a bad 'decision' is relatively low. I view Decision Science from the perspective of "decision making under uncertainty" (see Decision theory) which suggests a statistical perspective to begin with. A Data Science problem might be more like "How do I dynamically tweak a Jet engine's performance in flight to ensure efficiency/uptime during flight, to achieve stable and reliable output, to optimize fuel consumption, to reduce maintenance costs, and to extend the useful service life of the engine?" The cost of a failure in flight can be pretty high. Here, you would want to have a reliable machine making those computations and decisions in real time, relying on both ground and in-flight streaming data for real time condition monitoring.

In reality, practitioners tend to be "Agnostic" Scientists that are transferring skills from one domain to another, with varying degrees of comfort with the different tools out there. However, a Data Scientist is more likely to have a diverse background that spans multiple disciplines including statistics, computing, engineering, etc. In these examples, the decision maker is a key differentiating factor in my view. A Decision Scientist typically provides a framework for decision making to a human being; a Data Scientist provides a framework for decision making to a machine.

When machines become more human, this distinction may be called into question, but for now, I think it works.

What do you think? Did I just give you a choice? Or, did I just deluge you with data?  Your call.

---

Note: In the above marketing analytics example, I am not referring to clickstream data; rather, I refer to historical records stored on an RDBMS somewhere.

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…

Data Security Trends for 2016

Data Security Professionals: What You Need to Know NOW: Trends for 2016

There are some scary things happening in data security. Along with the rise of the Internet of Things there has been a corresponding push by hackers to wrest the cloud from us law-abiding folks.

“Gartner is predicting that 6.4 billion connected “things" will be in use globally by the end of 2016 - up 30 percent from 2015 - and that number is expected to reach 20.8 billion by the year 2020. As more Internet connected devices hit the market, so too do the vulnerabilities that come with them, as evidenced by highly-publicized incidents of 2015 where researchers exploited vulnerabilities in planes, guns, medical devices and automobiles.

As the Internet of Things market expands and innovates, researchers will continue to find and uncover exploitable vulnerabilities in these newly connected “things,” which will in turn continue to fan the flames of responsible disclosure.” - Information Management

Companies are having a difficult time finding data security pros who know how to conquer this new frontier of data security in this “every business is an IT business age.”

Information Management Magazine had some cool ideas on this front:

Consolidation of IT Security

Big companies are buying out medium companies, and then these really big companies are eating all of the “little fish” in sight. Dell buys EMC. Cisco buys Lancope. They all begin to buy companies like Adallom, Aorato and Secure Islands. It’s not going to stop next year; in fact, it will accelerate.

“It’s worth noting that offering up a “one stop shop” experience is completely different than being able to integrate technologies together to offer a seamless user experience.” Will that seamless user experience include seamless security?

Responsible Disclosure

You’ve got a Certified Hacker on staff who has uncovered some issues that overlap into the public domain. How much are you legally (never mind morally) required to divulge to regulators and/or competitors? According to IM, this issue will only get thornier as 2016 progresses: 

“‘White hat’ hackers, hired to scope out flaws in systems, are already facilitating company/researcher relationships within the technology industry via bug bounty programs. However, it seems that many segments of the manufacturing industry would rather utilize lawyers to block research altogether than address the vulnerabilities that are uncovered. Another option for security researchers to consider is self-regulation, where they accept the risks and responsibilities associated with their findings.”

Smaller Businesses Up Security Spending

Remember the famous hacks of 2015? They were publicized more than ever before. Companies like LastPass, Securus Technologies, VTech and TalkTalk are being targeted by “cybercriminals because they’re seen as less secure, while oftentimes owning valuable customer data.” These cyberattacks will grow in 2016.

People in the Cloud Share Responsibility

If you deploy in the cloud you share security responsibilities. Small to medium companies are hiring internally or taking advantage of Cloud Services’ security add-ons in contracts. To get a quick primer, check out Amazon’s shared responsibility model.

The other items in Information Management’s list include improved incident response protocols including communications and crisis management to calm investors and consumers; and enhanced collaboration among our communities as “security professionals are utilizing tools and platforms in order to better share and collaborate on security research and uncovering and responding to threats.” The folks at IM “expect this to increase and become more formalized amongst organizations, industry verticals and individual practitioners over the next year.”

What trends would you like us to keep an eye on for you as a cutting-edge data security specialist or leader? Let us know! We’d love to include your favorite topics right here. Email me. Until then, stay safe!

Read more…

Guest blog post by Bernard Marr

US agricultural manufacturer John Deere has always been a pioneering company. Its eponymous founder personally designed, built and sold some of the first commercial steel ploughs. These made the lives of settlers moving into the Midwest during the middle of the 19th century much easier and established the company as an American legend.

Often at the forefront of innovation, it is no surprise that it has embraced Big Data enthusiastically – assisting pioneers with the taming of the virtual wild frontier just as it did with the real one.

In recent years, it has focused efforts on providing Big Data and Internet of Things solutions that let farmers (and, in the case of its industrial division with the black and yellow logo, builders) make informed decisions based on real-time analysis of captured data.

So in this post I want to take a look at some of John Deere’s innovations in the virtual realm, and how they are leading to change which is said to be “revolutionizing” the world of farming.

Smart farms

The world’s population is growing rapidly, which means there is always going to be an increasing demand for more food. With the idea of genetically modified food still not appealing to public appetites, increasing the efficiency of production of standard crops is key to this. To this end, John Deere has launched several Big Data-enabled services which let farmers benefit from crowdsourced, real-time monitoring of data collected from its thousands of users.

They are designed by the company’s Intelligent Solutions Group, and the vision is that one day even large farms will be manageable by a small team of humans working alongside a fleet of robotic tools, all connected and communicating with each other.

To this end, they are working on a suite of services to allow everything from land preparation to seeding, fertilizing and harvesting to be controlled from a central hub.

The total land available can be split into sections and “Prescriptions” issued with precise instructions for seed density, depth and fertilization. These decisions are informed by Big Data: data aggregated from the thousands of users feeding their own readings back to the service for analysis.
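
As a purely hypothetical sketch of what one of these “Prescriptions” might look like as a data structure, consider the following; the field names, thresholds and agronomy numbers are invented for illustration, not drawn from John Deere's actual services.

    from dataclasses import dataclass

    @dataclass
    class Prescription:
        section_id: str
        seed_density_per_ha: int     # seeds per hectare
        seed_depth_cm: float
        fertilizer_kg_per_ha: float

    def prescribe(avg_yield_t_per_ha: float, section_id: str) -> Prescription:
        # Toy rule: push density and fertilizer up on historically weaker sections.
        weak = avg_yield_t_per_ha < 8.0
        return Prescription(
            section_id=section_id,
            seed_density_per_ha=85_000 if weak else 75_000,
            seed_depth_cm=5.0,
            fertilizer_kg_per_ha=180.0 if weak else 140.0,
        )

    print(prescribe(7.2, "field-12/NE"))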

Crowd sourced agriculture

Myjohndeere.com is an online portal which allows farmers to access data gathered from sensors attached to their own machinery as they work the fields, as well as aggregated data from other users around the world. It is also connected to external datasets including weather and financial data.

These services allow farmers to make better-informed decisions about how to use their equipment, where they will get the best results, and what return their investment is providing.

For example, fuel usage of different combines can be monitored and correlated with their productivity levels. By analyzing the data from thousands of farms, working with many different crops in many different conditions, it is possible to fine-tune operations for optimum levels of production.

The system also helps to minimize downtime by predicting, based on crowdsourced data, when and where equipment is likely to fail. This data can be shared with engineers who will stand ready to supply new parts and service machinery as and when it is needed – cutting down on waste caused by expensive machinery sitting idle.

Another service is Farmsight, launched in 2011. It allows farmers to make proactive decisions about what crops to plant where, based on information gathered in their own fields and those of other users. This is where the “prescriptions” can be assigned to individual fields, or sections of fields, and machinery remotely reprogrammed to alter its behavior according to the “best practice” suggested by the analytics.

As well as increasing farmers’ profits and hopefully creating cheaper, more abundant food for the world, there are potential environmental gains, too.

Pesticides and fertilizer can often cause pollution of air and waterways, so having more information on the precise levels needed for optimum production means that no more than is necessary will be used.

Who owns your agricultural data?

Of course, with all of this data being generated and shared – there is one question which needs answering – who owns it?

Deere offers what it calls its Deere Open Data Platform, which lets farmers share data with each other (or choose not to, if they wish) and also with third-party application developers, who can use the APIs to connect equipment from other manufacturers, or to offer their own data analysis services.

But this has not stopped many farmers from asking why they should effectively pay for their own data, and why John Deere and other companies providing similar services shouldn't pay them instead, according to American Farm Bureau Federation director Mary Kay Thatcher.

Talks are currently ongoing between the AFBF and companies including John Deere, Monsanto and DuPont over how these concerns should be addressed. As well as privacy worries, there are concerns that having too much information could allow traders in financial markets to manipulate prices.

Farming is one of the fundamental activities which makes us human and distinguishes us from animals. Once we developed farms, we no longer needed to constantly be on the move in the pursuit of food and fertile foraging spots, leading to the development of towns, cities and civilization.

The future of farming?

With the development of automation and Big Data, we are starting to delegate those responsibilities to robots – not because farmers are lazy (they really aren’t, as anyone who lives in an area where agricultural activity goes on will tell you!) but because they can often do it better.

Sure, John Deere’s vision of vast areas of farmland managed by a man sitting at a computer terminal with a small team of helpers will lead to fewer employment opportunities for humans working the land, but that has been the trend for at least the last century, regardless.

And the potential for huge positive change – in a world facing overpopulation and insufficient food production, particularly in the developing nations – is something that could benefit everyone on the planet.

I hope you found this post interesting. I am always keen to hear your views on the topic and invite you to comment with any thoughts you might have.

About: Bernard Marr is a globally recognized expert in analytics and big data. He helps companies manage, measure, analyze and improve performance using data.

His new book is Big Data: Using Smart Big Data, Analytics and Metrics To Make Better Decisions and Improve Performance. You can read a free sample chapter here.

 

Follow us @IoTCtrl | Join our Community

Read more…

IT Ops Challenge

Each layer of technology in the data centre is becoming progressively more complex to control and manage. The average server environment now has thousands of configuration parameters (e.g. Windows OS contains 1,500+, IBM WebSphere Application Server 16,000+, and Oracle WebLogic 60,000+). The growing interdependence and complexity of interaction between applications also makes it increasingly difficult to manage and control business services.

IT change is very much a fact of life, and it takes place at every level of the application and infrastructure stack. It also impacts pretty much every part of the business! To meet these development challenges, businesses have adopted agile development processes to accelerate application release schedules. By employing practices such as continuous integration and continuous build, they are able to generate hundreds of production changes each day. eBay, for example, is estimated to make around 35,000 changes per year!

Industry analyst firm Forrester has stated that, “If you can’t manage today’s complexity, you stand no chance managing tomorrow’s. With each passing day, the problem of complexity gets worse. More complex systems present more elements to manage and more data, so growing complexity exacerbates an already difficult problem. Time is now the enemy because complexity is growing exponentially and inexorably.”

The tools we use to manage IT infrastructure have been around for many years but are only capable of measuring what has already happened. Nor are they designed to deal with the complexity and dynamics of modern IT technologies. IT operations teams need to be able to automate the collection and analysis of vast quantities of data down to the finest resolution, and to highlight any changes, in order to unify the various operations silos. None of the traditional tools are up to this ‘big data’ problem!

Big data for operations is still a relatively new paradigm. Gartner has defined the sector as “IT Operations Analytics,” one that can enable smarter and faster decision-making in a dynamic IT environment, with the objective of delivering better services to your customers. Forrester Research defines IT analytics as “the use of mathematical algorithms and other innovations to extract meaningful information from the sea of raw data collected by management and monitoring technologies.”

Despite the field's relative youth, a lot has already moved on. Here are a few interesting findings:

  • Customer analytics (48%), operational analytics (21%), and fraud & compliance (21%) are now the top three uses for Big Data.
  • 15% of enterprises will use IT operations analytics technologies to deliver intelligence for both business execution and IT operations.
  • The market is expected to go mainstream in 2018, making up 10% of the $20+ billion IT Operations Management software category.
  • 89% of business leaders believe big data will revolutionize business operations in the same way the Internet did.
  • 79% agree that ‘companies that do not embrace Big Data will lose their competitive position and may even face extinction’.

Where to use ITOA?

IT Operations Analytics (ITOA), also known as Advanced Operational Analytics or IT Data Analytics, encapsulates technologies primarily used to discover complex patterns in high volumes of ‘noisy’ IT system availability and performance data. Gartner has outlined five core applications for ITOA, and two further uses round out the list below:

  • Root Cause Analysis: The models, structures and pattern descriptions of IT infrastructure or application stack being monitored can help users pinpoint fine-grained and previously unknown root causes of overall system behavior pathologies.
  • Proactive Control of Service Performance and Availability: Predicts future system states and the impact of those states on performance.
  • Problem Assignment: Determines how problems may be resolved or, at least, direct the results of inferences to the most appropriate individuals or communities in the enterprise for problem resolution.
  • Service Impact Analysis: When multiple root causes are known, the analytics system’s output is used to determine and rank the relative impact, so that resources can be devoted to correcting the fault in the most timely and cost-effective way possible.
  • Complement Best-of-breed Technology: The models, structures and pattern descriptions of IT infrastructure or application stack being monitored are used to correct or extend the outputs of other discovery-oriented tools to improve the fidelity of information used in operational tasks (e.g., service dependency maps, application runtime architecture topologies, network topologies).
  • Real-time application behavior learning: Learns and correlates application behavior based on user patterns and the underlying infrastructure across various application patterns, creates metrics from such correlated patterns and stores them for further analysis.
  • Dynamic baseline thresholds: Learns the behavior of the infrastructure under various application and user patterns, determines the optimal behavior of infrastructure and technology components, benchmarks and baselines the low and high water marks for specific environments, and dynamically adjusts those baselines as infrastructure and user patterns change, without any manual intervention (a minimal sketch of the idea follows this list).
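
Below is a minimal sketch of what such dynamic baselining can mean in practice: a rolling mean and standard deviation define the low and high water marks, and they move with the data rather than being set by hand. The window size, warm-up length and 3-sigma band are illustrative choices, not anyone's product defaults.

    from collections import deque
    import statistics

    class DynamicBaseline:
        """Rolling mean +/- 3 sigma as self-adjusting low/high water marks."""

        def __init__(self, window: int = 100, warmup: int = 30):
            self.samples = deque(maxlen=window)
            self.warmup = warmup

        def update(self, value: float) -> bool:
            """Record a metric sample; return True if it breaches the baseline."""
            breach = False
            if len(self.samples) >= self.warmup:
                mean = statistics.fmean(self.samples)
                sigma = statistics.stdev(self.samples)
                breach = abs(value - mean) > 3 * sigma
            self.samples.append(value)
            return breach

    baseline = DynamicBaseline()
    for latency_ms in [12, 13, 11, 12, 14] * 10 + [55]:   # steady load, then a spike
        if baseline.update(latency_ms):
            print(f"anomaly: {latency_ms} ms")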

By employing advanced analytics to harness vast volumes of highly diverse data from applications and endpoints across an organisation’s IT infrastructure, ITOA solutions give IT service desks instant awareness of issues as they occur – often before the person at the other end is even aware of them. Along with awareness, they deliver an understanding of how these issues could in turn affect both the IT infrastructure and the wider business.

Conclusion

IT operations teams are being challenged to run larger, more complex, hybrid and geographically dispersed IT systems that are constantly in a state of change without growing the number of people or resources. Everything from system successes to system failures and all points in between are logged and saved as IT operations data. IT services, applications, and technology infrastructure generate data every second of every day. All that raw, unstructured, or polystructured data is critical in managing IT operations successfully. The problem is that doing more with less requires a level of efficiency that can only come from complete visibility and intelligent control based on the detailed information coming out of IT systems.

ITOA provides a set of powerful tools that can generate the insight needed to help IT operations teams proactively determine risks, impacts, or the potential for outages arising from events in the environment. It offers operations a new way to proactively manage IT system performance, availability, and security in complex and dynamic environments, with fewer resources and greater speed. ITOA contributes to both the top and bottom lines of any organization by cutting IT operations costs and increasing business value through greater user experience and reliability of business transactions.

ITOA technologies are still relatively immature, and Gartner has stated that it will take another 2-5 years for them to reach maturity. However, smart MSPs are moving fast to incorporate these technologies into their portfolios, and IT consumers are starting to demand them from their partners. In the next few years, it is forecast that the vast majority of Global 2000 companies will have deployed IT Operations Analytics platforms as a central component of their architecture for monitoring critical applications and IT services. The key message: if you have not already started to look at ITOA, it is time to start planning…

Originally posted on Data Science Central
Follow us @IoTCtrl | Join our Community
Read more…

50 Predictions for the Internet of Things in 2016

Earlier this year I wrote a piece asking “Do you believe the hype?” It called out an unlikely source of hype: the McKinsey Global Institute. The predictions for IoT in the years to come are massive. Gartner believes IoT is a central tenet of the top strategic technology trends in 2016. Major technology players are also taking big swings. Louis Columbus, writing for Forbes, gathered all the 2015 market forecasts and estimates here.

So what better way to end the year and look into the future than by asking the industry for their predictions for the IoT in 2016. We asked for predictions aimed at the industrial side of the IoT. What new technologies will appear? Which companies will succeed or fail? What platforms will take off? What security challenges will the industry face? Will enterprises finally realize the benefits of IoT? We heard from dozens of startups, big players and industry soothsayers. In no particular order, here are the Internet of Things Predictions for 2016.


Photo Credit: Sean Creamer via Flickr

Nathaniel Borenstein, inventor of the MIME email protocol and chief scientist at Mimecast

“The maturation of the IoT will cause entirely new business models to emerge, just as the Internet did. We will see people turning to connected devices to sell things, including items that are currently "too small" to sell, thus creating a renewed interest in micropayments and alternate currencies. Street performers, for example, might find they are more successful if a passerby had the convenience of waving a key fob at their "donate here" sign. The IoT will complicate all aspects of security and privacy, causing even more organizations to outsource those functions to professional providers of security and privacy services.”

Adam Wray, CEO, Basho

"The deluge of Internet of Things data represents an opportunity, but also a burden for organizations that must find ways to generate actionable information from (mostly) unstructured data. Organizations will be seeking database solutions that are optimized for the different types of IoT data and multi-model approaches that make managing the mix of data types less operationally complex.”

Geoff Zawolkow, CEO, Lab Sensor Solutions

“Sensors are changing the face of medicine. Mobile sensors are used to automatically diagnose disease and suggest treatment, bringing us closer to having a Star Trek type Tricorder. Also mobile sensors will ensure the quality of our drugs, diagnostic samples and other biologically sensitive materials through remote monitoring, tracking and condition correction.”

Zach Supalla, CEO, Particle

“2016 isn't the Year of IoT (yet) - it's a bump in the road. The industry has been claiming it’s the year of IoT for the last five years - let’s stop calling it the year of the IoT and start calling it the year of experimentation. 2016 will be the year that we recognize the need for investment, but we’re still deeply in the experimental phase. 2016 will be the bump-in-the-road year - but at the end of it, we’ll have a much better idea of how experiments should be run, and how organizations can “play nicely” within their own walls to make IoT a reality for the business.”

Borys Pratsiuk, Ph.D, Head of R&D Engineering, Ciklum

"The IoT in medicine in 2016 will be reflected in deeper consumption of the biomedical features for non-invasive human body diagnostics. Key medical IoT words for next year are the following: image processing, ultrasound, blood analysis, gesture detection, integration with smart devices. Bluetooth and WiFi will be the most used protocols in the integration with mobile."

Brian T. Patterson, President, EMerge Alliance US Representative, International Electrotechnical Council

“IoT to Enable an Enernet: 2016 will see the IoT starting to play a major role in the evolution of a new, more resilient, efficient, flexible and sustainable 21st-century electric energy platform. IoT-connected sensors and microcontrollers will enable the effective and efficient management of a true mesh network of building- and community-level microgrids, which in turn will enable greater use of distributed renewable energy sources like solar, wind, biofuel micro-turbines and fuel cells. The convergence of data networks and physical energy grids will give rise to what will become the Enernet, a data-driven transactional energy network.”

Chris Rommel, Executive VP, IoT & Embedded Technology, VDC Research

“PaaS Solution Evolution to Cannibalize IoT Platform Opportunity: The landscape of Platform-as-a-Service (PaaS) solutions is changing rapidly. In 2015, leading PaaS providers IBM, Oracle, and SAP threw their hats into the “IoT platform” ring. As quickly as the value of PaaS solutions had been placed on the consumerization and user experiences of development platform offerings, premiums have now been placed on the ease of back-end integrations. However, the value associated with time to market in the Internet of Things marketplace is too high. IoT solution development and engineering organizations still want the flexible benefits offered by PaaS development, but they also require a breadth of out-of-the-box integrations to mitigate the downstream engineering and deployment hassles caused by heterogeneous IoT systems and network topologies. The desire and need for enterprise organizations to tightly integrate deployed systems' operations with enterprise business functions are reshaping PaaS selection. The need for tight, out-of-the-box integrations extends beyond the back-end, however. Bi-directional integration is critical. The heterogeneous nature of the IoT and wide range of device form factors, components and functions is too complex and costly to rely on bespoke integrations. As such, we expect the aforementioned PaaS leaders to accelerate their ecosystem development efforts in 2016. Although we likely won’t see any real winners yet emerge in the IoT PaaS space, I do expect the investments made by the aforementioned large players to threaten the market opportunity available to smaller IoT-focused platform vendors like Arrayent and Carriots.”

Laurent Philonenko, CTO, Avaya

“Surge in connected devices will flood the network – the increasing volume of data and need for bandwidth for a growing number of IoT connected devices such as healthcare devices, security systems and appliances will drive traditional networks to the breaking point. Mesh topologies and Fabric-based technologies will quickly become adopted as cost-effective solutions that can accommodate the need for constant changes in network traffic.”


Lila Kee, Chief Product Officer and Vice President, Business Development, GlobalSign

“Prediction: PKI becomes ubiquitous security technology within the Internet of Things (IoT) market. It's hard to think of a consumer device that isn't connected to the Internet these days - from our baby monitors to our refrigerators to our fitness devices. With the increase of connected devices of course comes risk of exposing privacy and consumer data. But, what happens when industrial devices and critical infrastructure connect to the Internet and get hacked? The results can be catastrophic. Security and safety are real concerns for the Internet of Things (IoT) and especially in the Industrial Internet of Things (IIoT). Regarding security, the industrial world has been a bit of a laggard, but now equipment manufacturers are looking to build security in right at the design and development stages. Unless the security challenges of IIoT can be managed, the exciting progress that has been made in this area of connected devices will slow down dramatically. PKI has been identified as a key security technology in the IIoT space by the analyst community and organizations supporting the IIoT security standards. In 2016, we expect that PKI will become ubiquitous security technology within the IoT market. There will be an increased interest in PKI, how it plays in the IoT market and how it needs to advance and scale to meet the demands of billions of devices managed in the field.”

IoT Central members can see all the predictions here. Become a member today here.

Read more…

Guest blog post by Humera Malik

Getting ready for the Industrial Internet of Things

The IoT has been influencing the creation of a connected world where unbelievable amounts of data are generated from just about every imaginable thing. These connected, and previously dormant, things are now going to be able to communicate: my dog's collar, the milk carton, my car, my thermostat, my watch, my washer and dryer (not that I would particularly enjoy that conversation), you name it. This connectivity and mass data production is exciting because it is defining the future by creating an ecosystem where a huge variety of technologies have to work together, thus breaking down silos. Coming from the telco world, I find that refreshing.

So what are we supposed to do with this influx of data and connected devices? We've been sitting down with IoT decision makers in the industry, and we've noticed a common theme – at the end of the day, they all want to generate value for their customers from this technology.

Transition before Transformation

The realization of data as an asset is a big leap. In certain industries, less than 1% of the data generated is leveraged to optimize and predict in the working environment. The number of connected devices, and the data they will generate, is a daily moving target (a recent forecast says 28 billion devices in the next 5 years).

Be it 20 billion or more, we must take measures to accommodate this growth. Both the challenges and the opportunities lie in roughly two realms: the consumer world and the industrial world. In the consumer world it is easier to measure value with the advent of some of these wearables, although I am on the fence about the privacy logistics (another topic for another day).

But, in my opinion, the industrial world is where we will see the next wave of technological evolution, through revolutionary smart machinery – machines that can learn and adapt. Big giants like GE and others are already developing connected equipment and devices that will shape the future of the industrial world. In the meantime, before this industrial transformation, small steps need to be taken to help many industries adapt and prepare for the connected future.

Learn and Adapt

We've seen manufacturing, energy, and healthcare take the lead by implementing the beginning steps: sensors and connected devices. The front-runners are working to decipher a flood of new data sets, as well as trying to leverage the wealth of historical data that is readily available and that, in my opinion, we cannot afford to lose track of. Combined, this old and new data will allow us to predict the future – and isn't that the dream of modern business?

But before your machine learning systems can take over, optimizing efficiency for cost control while you go and figure out new revenue streams, my advice is to be realistic. The most success has been gained by focusing on and prioritizing an IoT strategy and a roadmap – for example, building efficient production environments with the KPIs that will optimize for you, rather than a fully connected factory floor. Build a roadmap, and focus on the end KPIs rather than on the technology and a platform. Take small steps towards adopting an IoT strategy; we have seen too much investment go into platforms too early and yield little ROI.

The worlds of IoT and IoE continue to change, adapt and surprise daily, making it very easy to get overwhelmed. The key is to understand how it can help your business and then to develop a path to adopt this technology. The value of IoT lies in business outcomes, not in underlying technology. 

Dat-uh helps create business value from your IoT investments. View original blog post here

Follow us @IoTCtrl | Join our Community

Read more…

Nanoscale devices and systems, characterized by sizes of 100 nanometers and below, are predominantly used to study phenomena such as near-field behavior in electromagnetics and optics, single-electron effects and quantum confinement in electronics, and other effects in biological, fluidic and mechanical systems. The Internet of Nano Things (IoNT) is the interconnection of such nanoscale devices over the internet and other communication networks, providing seamless communication and data transmission across a given range of operations. The growing interconnection of nanosensors and nanotechnology with consumer devices and other physical assets has vastly expanded the scope of the internet. The IoNT provides capabilities for inter-communication between nano devices for data collection, processing and sharing with end users, and has found applications in industries such as healthcare, manufacturing, transportation and logistics, energy and utilities, media and entertainment, retail and other services.

Internet of Nano Things infrastructure can be deployed as a combination of various nanotechnologies and nano devices, depending on the area of operation and the bandwidth required for a particular application. Deploying IoNT infrastructure provides high-speed communication and reduces the bandwidth pressure on existing communication infrastructure. Development of the IoNT primarily focuses on improving processing capabilities, providing larger storage capacity at lower cost and increasing the role of communication terminals. IoNT infrastructure can be deployed across various ecosystems such as electromagnetic waves, Wi-Fi, Li-Fi, radio frequency identification (RFID) and nano antennas.

One of the major factors driving the growth of the IoNT market is the rise of ubiquitous connectivity. With the increasing number of computing devices and interconnection capabilities over the internet, various industrial applications of IoNT have been identified. The interconnection of nano devices has enabled efficient communication of data between different devices or components of a system. Thus, through IoNT, organizations are able to reduce the complexity of communication and increase the efficiency of processes using such connected devices. Moreover, government support for the development of IoNT technology for healthcare has further increased demand for and awareness of IoNT. However, the growth of the IoNT market faces challenges around privacy and security: since critical data is communicated between devices over the internet, concerns about the security of that data have arisen. Another factor hindering growth is the huge capital investment required to develop nanotechnology.

Immense growth opportunities for the IoNT market have been identified through its applications in various sectors such as healthcare, transportation, logistics, manufacturing, media, entertainment and retail. With technological advancements in the IoNT, the scope of IoNT and its applications is on the rise. Major companies are conducting research on nanotechnology and constantly developing nano systems with a wider scope of applications. Some of the major players in the IoNT market are Intel Corporation, Cisco Systems Inc., Qualcomm Incorporated, Juniper Networks and IBM Corporation in the U.S., Schneider Electric and Alcatel-Lucent S.A. in France, and SAP S.E. and Siemens AG in Germany, among others.

This research report analyzes this market on the basis of its market segments, major geographies, and current market trends. Geographies analyzed under this research report include 
  • North America 
  • Asia Pacific 
  • Europe
  • Rest of the World  

This report provides comprehensive analysis of 
  • Market growth drivers 
  • Factors limiting market growth
  • Current market trends 
  • Market structure
  • Market projections for upcoming years 

This report is a complete study of current trends in the market, industry growth drivers, and restraints. It provides market projections for the coming years. It includes analysis of recent developments in technology, Porter’s five force model analysis and detailed profiles of top industry players. The report also includes a review of micro and macro factors essential for the existing market players and new entrants along with detailed value chain analysis. 

The report can be seen  here. 

Originally posted on Data Science Central



Follow us @IoTCtrl | Join our Community
Read more…

Guest blog post by Bernard Marr

I have to admit, I love the swell of Internet of Things devices and technology we’re seeing right now.  

Smart alarm clocks that can wake you without waking your bedmate, smart showers that help you save water, a smart kettle or coffee pot you can start with your phone, a smart frying pan to alert you before you burn the eggs — and that’s all before breakfast.

Some IoT devices seem silly while others seem poised to help change the world.  But the one thing they all have in common is that they aren’t really about the things themselves.

Source for picture: click here (2014 Gartner chart)

The Internet of Things is really all about data.

Big data is all about data; the Internet of Things involves data, devices, and connectivity. But in the end, they come down to the same thing.

And while many companies are jumping on the IoT bandwagon — from big providers like Cisco and Dell all the way down to Kickstarter-funded startups — they’re all going to have to watch their data if they want to succeed.

I have three main suggestions for companies looking to benefit from the data their smart devices, sensors, and RFID tags gather:

  • Ensure your data is accurate. As IoT devices reach consumers, they must become less expensive to produce and maintain. This could result in a loss of accuracy. For example, if you wear three different types of fitness monitors during a workout, you’re likely to get three different measurements of how many calories you’ve burned. At an individual level, the accuracy probably doesn’t matter that much — it won’t make a huge difference to me if I’ve burned 400 calories or 337 calories — but on a larger scale it could matter a great deal, depending on what the company wants to learn from the data. Building a strategy on inaccurate data will not yield good results (a minimal consistency check is sketched after this list).
  • Protect your data chain. Because people are only just beginning to think of data as a business resource, they are also only just beginning to think of protecting that resource. Any erroneous or altered data streams could affect the accuracy of the dataset on the whole. So far, security at the sensor level is the most difficult to achieve and maintain, but without a secure data chain, you can’t rely on accurate data.
  • Collect the right data (and ask the right questions).  As with all big data projects, data itself is useless without the right questions driving it. IoT data must have the right kinds of analysis projects behind it to make it useful for any business.  You can know every movement of every shipment you ever send, but unless you can use that data to improve your processes, it is useless — expensive, and useless. IoT data collection has to be backed up by solid data analysis to be useful, and the business units need to be involved in dictating what data is collected and what analyses are performed.
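
To make the accuracy check concrete, here is a minimal Python sketch — the device names, readings and 10% tolerance are all invented for illustration — that flags workouts where trackers disagree too much to trust the aggregated number:

    # Hypothetical example: compare calorie estimates from several trackers
    # and flag workouts where the devices disagree too much to trust the data.

    def relative_spread(readings):
        """Spread of the readings as a fraction of their mean."""
        average = sum(readings) / len(readings)
        return (max(readings) - min(readings)) / average

    workout = {"tracker_a": 400, "tracker_b": 337, "tracker_c": 365}  # kcal

    spread = relative_spread(list(workout.values()))
    if spread > 0.10:  # tolerate up to 10% disagreement; threshold is a guess
        print(f"Devices disagree by {spread:.0%}; treat this record as low-confidence")
    else:
        print("Readings are consistent enough to aggregate")

At the individual level the disagreement is a shrug; at the dataset level, systematically flagging (or weighting down) low-confidence records is one way to keep cheap sensors from quietly corrupting a strategy.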

So, while it is awfully cool to have a smart yoga mat to help you improve your poses, or a smart refrigerator so you don’t forget the milk, the IoT trend won’t go very far just for being clever or cool. But when the businesses behind those cool toys put the data they collect to work, they may find that the Internet of Things is real business.

How do you feel about the Internet of Things and smart devices?  Cool or creepy?  Business boom or bust?  I’d love to hear your opinions in the comments below.

Follow us @IoTCtrl | Join our Community

Read more…

Top 5 Trends in Big Data Analytics

While many of us recognize that actionable data insights empower companies to drive sales, loyalty and superior customer experiences, the thought of making sense of enormous quantities of information, and of unifying it, is daunting. But that is slowly changing. Experts forecast that most companies will allocate budgets this year to find the best tools and resources to really harness their data, and that 2015 will undoubtedly be the year of big data.

Information gathering has developed radically, and C-level executives and their teams now recognize that they have to join the big data arms race to keep and grow their customer base, and to stay competitive in today's data-driven marketplace. Terms like in-memory databases, sensor data, customer data platforms and predictive analytics will become more widely understood.

With terabytes of data being gathered by companies at multiple touchpoints, platforms, devices and offline locations, companies will start to focus more on owning their data, being able to access, visualize and control it, and monetizing their audience in real time with the right content. More emphasis will likely be placed on how ethically data is collected, how clean it is, and on avoiding becoming an information hoarder that accumulates data you don't really need.

Here are the top 5 data trends that we predict will reign in 2015:

1. Data agility will take center stage


It's not sufficient to just own quantities of customer information if this info is not agile. More companies are seeking approaches that are simple, quick and easy to offer unified and protected use of customer information, across departments and systems. CMOs, CTOs, information scientists, business analysts, programmers and sales teams possess precisely the same pressing need for tools and training to assist them navigate their customer data. With the growing popularity of wearables, sensors and IoT apparatus, there's additional real time information flooding in. Plus having customer information saved on multiple legacy platforms and third party vendor systems only makes information agility that much more challenging. Most firms only use about 12.5% of their available data to grow their company. Having access to the proper tools which make customer information more agile and easy to use is going to be a significant focus of businesses in 2015.

2. Information is the New Gold & Puts Businesses In Control
For many businesses, the most commonly faced data need is ownership and unification: volumes of data are generated every second and stored on multiple legacy platforms with dated architecture, and all this customer data cannot be accessed in a single place to get a "complete view" of the customer. But with technology that makes data unification easier, and the introduction of new tools, businesses are beginning to appreciate the value of controlling and owning their customer data. The frustrations of working with multiple third-party vendors to gain possession of data, along with a lack of data rights that would permit automatically pulling data from those vendors, are major pain points to be addressed. Companies can now select from a variety of platforms, such as Umbel, to help gather first-party customer data from multiple online and offline sources, platforms and vendors, own and unify the data, and use it in real time to power and optimize marketing and sales efforts.

3. The Rise of Customer Information Platforms

While DMPs and CRMs help fulfill many business needs, today's marketers want a centralized customer data platform, like Umbel, that analyzes their customer base and delivers deep insights. Very few businesses have one genuinely complete, unified customer database solution. They are largely still using multiple systems and platforms that collect data separately.

A CMO's top priority will probably be to possess a reliable Customer Info Platform that collects exact customer information from all online and offline touch points (enclosed web site visits and purchases, social interactions, beacon info, cellular and in store interactions etc.), removes duplicates and appends it with added data (demographic, geographic, behavioral, brand kinship) from other trusted sources.

4. Info Democratization Across Departments
The abundance of customer data available to brands today is staggering, and yet many companies have not fully used it to supercharge marketing and sales efforts. One of the biggest hurdles marketers face is that access to this data is quite limited at most firms. First, only larger companies with IT resources have had the capacity to gather, store, analyze and monetize this precious data. Second, even where data was being collected, only the IT department and/or the business analytics teams had access to it, and the sales and marketing teams that actually use the data had to go through a convoluted, time-consuming procedure to get the insights and data they need.

With new tools like Umbel, teams don't need a data scientist to make sense of their data.

For data to be genuinely valuable to an organization, it must be democratized across teams and departments, empowering all employees, irrespective of their technical expertise, to access it and make more informed decisions. In 2015 more companies will start to use automated platforms that enable anyone in the organization to view, analyze and act on customer data.

5. Mobile Data and Strategy Will Become Vital to Advertising
According to eMarketer, mobile search ad spend in the U.S. grew 120.8% in 2013 (an overall gain of 122.0% for all mobile advertisements). Meanwhile, desktop advertisement spending went up by just 2.3% last year. Mobile apps and sites have become essential components of any retailer's marketing plan. To remain competitive, companies need a seamless, secure, fast and intuitive experience on mobile devices, along with the ability to capture this mobile data and add it to a unified customer database. Having unified customer data from every touchpoint (including mobile and offline) will enable firms to identify trends and shape a better customer experience. More companies are becoming aware of how important it is to unify their data and compare analytics across all platforms to help them create personalized marketing campaigns centered on a "complete customer view."

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…

Guest blog and great infographic from Matt Zajechowski

It’s no secret that analytics are everywhere. We can now measure everything, from exabytes of organizational “big data”  to smaller, personal information like your heart rate during a run. And when this data is collected, deciphered, and used to create actionable items, the possibilities, both for businesses and individuals, are virtually endless.

One area tailor-made for analytics is the sports industry. In a world where phrases like “America’s pastime” are thrown around and “the will to win” is revered as an intangible you can’t put a number on, stats lovers with PhDs in analytics are becoming more and more essential to sports franchises. Since the sabermetric revolution, sports franchises have begun investing time and money in using sports analytics from wearable technology to help their athletes train and even make more money from their stadiums.

Today, Sports Fans Prefer the Couch Over the Stadium

For decades, television networks have tried to create an at-home experience that’s on par with the stadium experience — and they’ve succeeded emphatically. In a 1998 ESPN poll, 54% of sports fans reported that they would rather be at the game than watch it at home; however, when that same poll was readministered in 2011, only 29% preferred being at the game.

While this varies by sport to some degree, the conclusion is clear: people would rather watch a game in the comfort of their own climate-controlled homes, with easy access to the fridge and a clean bathroom, than experience the atmosphere of the stadium in person. Plus, sports fans today want the ability to watch multiple games at once; it’s not unusual for diehard fans to have two televisions set up with different games on, plus another game streaming on a tablet.

However, fans could be persuaded to make their way back to the stadiums; 45% of “premium fans” (who always or often buy season tickets) would pay more money for a better in-person experience. That’s where wearable technology comes into play.

Wearable Data — for Fans Too

At first glance, the sole application of wearable technology and data science would seem to be monitoring and improving athlete performance. These tasks might include measuring heart rate and yards run, timing reactions and hand speed, gauging shot arc, and more, while also monitoring the body for signs of concussion or fatigue.

And that’s largely true. For example, every NBA arena now uses SportVU, a series of indoor GPS technology-enabled cameras, to track the movements of the ball and all players on the court at a rate of 25 times per second. With that data, they can use myriad statistics concerning speed, distance, player separation, and ball possession to decide when to rest players.
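
As a rough illustration of what such a tracking feed makes possible, here is a minimal Python sketch — the coordinates are invented; only the 25-samples-per-second rate comes from the article — that turns position samples into distance covered and average speed:

    import math

    # Hypothetical 25 Hz player-tracking samples: (x, y) court positions in feet.
    SAMPLE_RATE_HZ = 25
    positions = [(10.0, 12.0), (10.4, 12.1), (10.9, 12.3), (11.5, 12.6)]

    # Sum the straight-line distances between consecutive samples.
    distance_ft = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

    elapsed_s = (len(positions) - 1) / SAMPLE_RATE_HZ
    speed_fps = distance_ft / elapsed_s

    print(f"Distance: {distance_ft:.1f} ft, average speed: {speed_fps:.1f} ft/s")

Aggregated over a game, statistics like these feed the speed, distance and player-separation numbers teams use for rest decisions.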

Similarly, Adidas’ Micoach is used by the German national soccer team during training to monitor speed, running distances, and heart rates of each player. In fact, this system is credited with the decision to sub in German soccer player Mario Gotze in the 88th minute of the 2014 World Cup final; in the 113th minute, the midfielder scored the World Cup-winning goal.

However, some sports franchises are using that wearable technology to benefit the fan sitting in the stadium. For example, the Cleveland Cavaliers’ Quicken Loans Arena (an older stadium) was retrofitted with SportVU; however, they don’t use it just for determining when LeBron James needs a break. Instead, the Cavs use the data tracked by SportVU to populate their Humungotron with unique statistics tracked in real time during the game. The Cavs then took this data to the next level by using the stats in their social media marketing and to partner with various advertisers.

How Analytics Are Improving the Stadium Experience

Besides sharing interesting statistics on the JumboTron during the game, stadiums are using data from athletes and fans to enhance the spectators’ experience. In fact, stadiums are actually mirroring the in-home experience, through various apps and amenities that reach the spectator right in their seat.

And at times, they’re going above and beyond simply imitating the in-home experience. Take the Sacramento Kings, for example. In 2014, the team partnered with Google to equip many of its courtside personnel (mascots, reporters, and even dancers) with Google Glass. Fans were able to stream close-up, first-person views of the action through their mobile devices, allowing them to feel closer than their upper-level seats would suggest.

Levi’s Stadium in Santa Clara (home of the San Francisco 49ers) boasts a fiber optic network that essentially powers every activity in their thoroughly modern stadium. The stadium contains 680 Wi-Fi access points (one for every 100 seats in the stadium) and around 12,000 ethernet ports, allowing everything from video cameras to phones to connect to a 40 gigabit-per-second network that’s 10,000 times faster than the federal classification for broadband. 1,700 wireless beacons use a version of Bluetooth to triangulate a fan’s position within the stadium and give them directions. And for fans who don’t want to leave their seats, a specially developed app can be used for tickets, food delivery to your seat, and watching replays of on-field action.

The Miami Dolphins, meanwhile, have partnered with IBM and use technology from their “Smart Cities” initiative to monitor and react to weather forecasts, parking delays, and even shortages of concessions at specific stands in Sun Life Stadium. The Dallas Cowboys’ AT&T Stadium features 2,800 video monitors throughout the stadium as well as more than five million feet of fiber optic cable, used for everything from gathering data to ordering food in-suite.

NFL teams aren’t the only franchises making use of sports analytics. The Barclays Center, home of the Brooklyn Nets, uses Vixi to display properly hashtagged tweets on multiple big screens throughout the arena. They also use AmpThink, a series of networking tools that require the user to submit some personal information before logging onto the arena’s Wi-Fi; that way, they’re able to collect data on how and where people are logging in, as well as what services they’re using while in the arena. Fans can already order food and drink from their seats and replay sequences from various camera angles, and in the future, they’ll be able to use an app that gives information about restroom waits and directions to the restrooms with the shortest lines.

To some, the increase in connectivity might seem to take away from the experience of watching a game live; after all, how can you enjoy live action if you’re constantly staring down at your phone? On the contrary: by employing these apps to judge the shortest bathroom lines or order food directly to their seats, fans are able to stay in their seats longer and watch more of the game.

While this technology certainly isn’t cheap (and its cost will be reflected in increased ticket prices), those extra minutes of action may be worth the higher cost to some fans. Ultimately, it’s up to the fans to decide whether paying more for tickets is worth the premium experience — and the time saved waiting in line.

Bringing Fans Back, One Byte at a Time

Sports teams aren’t going to lose their fans to television without a fight. And with the majority of sports franchises embracing wearable and mobile data in some form or another, it’s a natural transition for marketing departments to apply that data to the fan experience. With easy access to Wi-Fi, snacks, replays, and shorter restroom lines, sports fans can combine the atmosphere of game day with the comfort of being in their own homes.

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…

The Importance of Smart City Contests

Earlier this week Microsoft billionaire Paul Allen announced that he was teaming up with the U.S. Department of Transportation (DOT) to offer a $50 million prize to the winner of a “Smart City” competition aimed at promoting high-tech solutions to traffic snarls.

The aim is to show what is possible when communities use technology to connect transportation assets into an interactive network. The Smart City Challenge will concentrate federal resources in one mid-sized city, selected through a nationwide competition. Up to $40 million in funding will go to the city that puts forward bold, data-driven ideas to improve lives by making transportation safer, easier and more reliable. DOT will partner with Vulcan, Paul Allen’s venture arm, to offer an additional $10 million to the winning city to support infrastructure for electric vehicles.

Photo: Courtesy of Cisco via Flickr

February is the deadline to submit proposals for transit innovations, and DOT’s experts will select five proposals as finalists. Each of the finalists will receive $100,000 in federal funding for further development, and the winner will be announced next June. The competition is open to U.S. mid-sized cities, defined as cities with a 2010 census population between 200,000 and 850,000. You can see the guidelines here.

Fifty million dollars may not sound like much compared to overall spending on transportation, but for cities of this size it’s a great start for creating a smarter city.

This week’s announcement is one of many smart city competitions announced over the years, and surely there will be more to come. Cities are where the majority of people will live: by 2050, some estimates predict that as many as seven out of 10 people on Earth will live in an urban area. Continued population increases will exceed the capacity of human administrators alone.

Cities will have to get a whole lot smarter.

This is why you are seeing more and more contests for cities to get smarter, and to be more open. Witness cities like Glasgow, which won the UK’s Future Cities competition, Barcelona’s Smart City ecosystem, India’s Smart City Challenge, the Obama Administration's Smart City challenge, and New York’s efforts to build a smart city.

What this means is that sensors will be woven into every aspect of daily life. By 2020, the number of thermostats, pressure gauges, accelerometers, acoustic microphones, cameras, meters and other micro-electromechanical measuring devices linked to the Internet is predicted by Cisco to reach 50 billion worldwide.

Think solar-powered, Wi-Fi-connected trash cans that let rubbish collectors know when they are full, sensors that alert public works directors to clogged sewers, traffic cameras connected to an IP network that notify engineers in a central location of mounting traffic jams, air quality sensors that monitor pollution, and rooftop acoustic sensors that triangulate the sound of gunshots.

These contests are a way to drive industry toward a new era of more efficient and responsive government, driven by real-time data. The IoT will also drive new economic opportunity and business development, centered on the creation, analysis and intelligent use of these data feeds. The benefits are many: increased cost savings, bolstered civic engagement, and strengthened public health and safety.

Cheers to more contests and to the winners, which will be all of us.

Further reading: Wall Street Journal Special Report: As World Crowds In, Cities Become Digital Laboratories

 

Read more…

Guest blog post by Bill Vorhies

We spend so much time thinking about consumers that it’s refreshing to find applications of Big Data and advanced analytics that are not linked to selling something. Last week I wrote about predictive analytics’ role in student success in colleges. This week a friend pointed me to this very interesting article on The Role of Big Data in Medicine. The article was published by McKinsey & Company and features interviews with Dr. Eric Schadt, the founding director of the Icahn Institute for Genomics and Multiscale Biology at New York’s Mount Sinai Health System. Here are a few of the highlights.

Evolution or revolution?

The role of big data in medicine is one where we can build better health profiles and better predictive models around individual patients so that we can better diagnose and treat disease.

One of the main limitations with medicine today and in the pharmaceutical industry is our understanding of the biology of disease. Big data comes into play around aggregating more and more information around multiple scales for what constitutes a disease—from the DNA, proteins, and metabolites to cells, tissues, organs, organisms, and ecosystems. Those are the scales of the biology that we need to be modeling by integrating big data. If we do that, the models will evolve, the models will build, and they will be more predictive for given individuals.

How wearables are poised to transform medicine

Wearable devices and engagement through mobile health apps represent the future—not just of the research of diseases, but of medicine. I can be confident in saying that, because today in medicine, a normal individual who is generally healthy spends maybe ten minutes in front of a physician every year. What that physician can possibly score you on to assess the state of your health is very minimal.

What the wearable-device revolution provides is a way to longitudinally monitor your state—with respect to many different dimensions of your health—to provide a much better, much more accurate profile of who you are, what your baseline is, and how deviations from that baseline may predict a disease state or sliding into a disease state. That means we’ll be able to intervene sooner to prevent you from that kind of slide.
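
The baseline idea lends itself to a simple illustration. Here is a minimal Python sketch — the readings are invented, and the two-standard-deviation threshold is my assumption, not something from the interview — of flagging deviations from a personal baseline:

    from statistics import mean, stdev

    # Hypothetical daily resting heart rates (bpm) from a wearable.
    history = [62, 64, 61, 63, 65, 62, 63, 64, 62, 63]

    baseline = mean(history)
    spread = stdev(history)

    def check_reading(value, z_threshold=2.0):
        """Flag readings more than z_threshold standard deviations from baseline."""
        z = (value - baseline) / spread
        if abs(z) > z_threshold:
            return f"deviation from baseline (z={z:.1f}); worth a closer look"
        return "within normal range"

    print(check_reading(63))  # a typical day
    print(check_reading(78))  # a sharp deviation

The point of longitudinal monitoring is exactly this: the baseline is yours, not a population average, so a reading that would be unremarkable for someone else can be an early warning for you.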

What big data means for patients

What I see for the future for patients is engaging them as a partner in this new mode of understanding their health and wellness better and understanding how to make better decisions around those elements.

Most of their data collection will be passive, so individuals won’t have to be active every day—logging things, for example—but they’ll stay engaged because they’ll get a benefit from it. They’ll agree to have their data used in this way because they get some perceived benefit.

A better understanding of Alzheimer’s disease

For a long time, plaques and tangles were the driving force for how people were seeking to understand Alzheimer’s and to come up with preventative or more effective treatments. What we were able to do was engage modern technology—the genomics technologies—and go to some of the established brain banks and carry out a much deeper profiling in a completely data-driven way.

We didn’t have to constrain ourselves by the plaques-and-tangles hypothesis. We could say, “We’re going to sequence all the DNA in different brain regions. We’re going to sequence the RNA,” which is a more active sort of sensor of what’s going on at the deep molecular level in different parts of the brain. And then, “We’re going to try to reconstruct predictive or network models to understand how the millions of variables we’re measuring are connected to one another in a cause–effect sort of way,” and, “We’re going to see how those models change between the disease state and the normal, nondemented state.”

Think of these networks as a graphical model where the nodes in the network are different genes, clinical features and DNA variants, and the edges indicate relationships between those variables that we observe over the population of brains we profiled. What we were very surprised to find is that the most important network for Alzheimer’s had nothing directly to do with tangles or plaques, but with the immune system. We directly implicated microglial cells—which are sort of the macrophage-type cells of the brain that keep the brain healthy—as a key driver of Alzheimer’s disease.

The Future

One of the biggest problems around big data, and the predictive models that could be built on that data, really centers on how you engage others to benefit from that information. Beyond the tools that we need to engage noncomputational individuals with this type of information and decision making, training is another element. They’ve grown up in a system that runs very counter to this information revolution. So we’ve started placing much more emphasis on the coming generation of physicians and on how we can transform the curriculum of the medical schools.

Read the entire article here.

Follow us @IoTCtrl | Join our Community

Read more…

Unraveling Real-Time Predictive Analytics

Guest blog post by Ram Sangireddy

As a product manager in the domain of predictive analytics, I own the responsibility to build predictive analytics capabilities for consumer-facing and/or enterprise platforms; the business applications range from item recommendations for consumers to prediction of event outcomes based on classification models, demand forecasting for supply optimization, and so on. We usually see applications where a predictive model built using machine learning technique(s) is leveraged to score a new set of data, and that new set of data is most often fed to the model on demand as a batch.

However, the more exciting aspect of my recent work has been in the realm of real-time predictive analytics, where each single observation (raw data point) has to be used to compute the predicted outcome; note that this is a continuous process as the stream of new observations continuously arrive and the business decisions based on the predicted outcomes have to be made in real-time. A classic use case for such a scenario is the credit card fraud detection: when a credit card swipe occurs, all the data relevant to the nature of the transaction is fed to a pre-built predictive model in order to classify if the transaction is fraudulent, and if so deny it; all this has to happen in a split second at scale (millions of transactions each second) in real-time. Another exciting use case is the preventive maintenance in Internet of Things (IoT), where continuous streaming data from thousands/millions of smart devices have to be leveraged to predict any possible failure in advance to prevent/reduce downtime.

Let me address some of the common questions that I often receive in the context of real-time predictive analytics.

What exactly is real-time predictive analytics – does that mean we can build the predictive model in real-time? A data scientist requires an aggregated mass of data which forms the historical basis over which the predictive model can be built. The model building exercise is a deep subject by itself and we can have a separate discussion about that; however, the main point to note is that model building for better predictive performance involves rigorous experimentation, requires sufficient historical data, and is a time consuming process. So, a predictive model cannot be built in “real-time” in its true sense.

Can the predictive model be updated in real-time? Again, model building is an iterative process involving rigorous experimentation, so if the premise is to update the model on each new observation arriving in real time, it is not practical to do so, for several reasons. First, retraining the model involves feeding in the base data set plus the new observation (choosing either to drop older data points to keep the data set the same size, or to let it grow) and therefore rebuilding the model. There is no practical way of “incrementally updating the model” with each new observation, unless the model is a simple rule-based one. For example: predict “fail” if the observation falls outside two standard deviations of the sample mean. In such a simple model it is possible to recompute and update the mean and standard deviation of the sample to include the new observation, even while the outcome for the current observation is being predicted. But for our discussion of predictive analytics here, we are considering more complex machine learning and statistical techniques.
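
For concreteness, here is a minimal Python sketch of that simple rule-based case. It uses Welford's online algorithm — my choice for the illustration; the author only specifies the two-standard-deviation rule — so the mean and standard deviation update incrementally as each observation is scored:

    import math

    class TwoSigmaDetector:
        """Flag observations outside two standard deviations of the running mean,
        updating the mean/std incrementally (Welford's algorithm) as data arrives."""

        def __init__(self):
            self.n = 0
            self.mean = 0.0
            self.m2 = 0.0  # running sum of squared deviations from the mean

        def score_and_update(self, x):
            # Score against the current statistics (needs a few points to warm up).
            flagged = False
            if self.n >= 2:
                std = math.sqrt(self.m2 / (self.n - 1))
                flagged = abs(x - self.mean) > 2 * std
            # Incremental update: no need to revisit earlier observations.
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)
            return flagged

    detector = TwoSigmaDetector()
    for reading in [10.1, 9.8, 10.3, 10.0, 9.9, 14.7]:
        if detector.score_and_update(reading):
            print(f"Predict 'fail' for reading {reading}")

Note how special this case is: the entire "model" is two running statistics. Anything richer — a tree ensemble, a regression with interactions — has no comparably cheap per-observation update, which is the author's point.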

Second, even if technologies made it possible to feed the large volume of data, including the new observation, to rebuild the model in a split second each time, there would be no tangible benefit in doing so. The model does not change much with just one more data point. Drawing an analogy: if one wants to measure how much weight has been lost through an intensive running program, it is common sense that the needle does not move much if measured after every mile run. One has to accumulate a considerable number of miles before experiencing any tangible change in weight! The same is true in data science: rebuild the model only after aggregating a considerable volume of new data, so that the rebuild makes a tangible difference to the model.

(Even recent developments such as Cloudera Oryx, which move beyond Apache Mahout and similar tools limited to batch processing for both model building and prediction, focus on real-time prediction while rightly keeping model building batch-based. For example, Oryx has a computational layer and a serving layer: the former periodically builds or updates the model on aggregated data in batch in the back end, and the latter serves queries to the model in real time via an HTTP REST API.)

Then, what is real-time predictive analytics? It is when a predictive model (built/fitted on a set of aggregated data) is deployed to perform run-time prediction on a continuous stream of event data, enabling decision making in real time. There are two aspects to achieving this. First, the predictive model built by a data scientist via a stand-alone tool (R, SAS, SPSS, etc.) has to be exported in a consumable format (PMML is a preferred method across machine learning environments these days; we have done this, and also via other formats). Second, a streaming operational analytics platform has to consume the model (PMML or another format) and translate it into the necessary predictive function (via open-source jPMML or Cascading Pattern or Zementis’ commercially licensed UPPI or other interfaces), and also feed it the processed streaming event data (via a stream processing component in a CEP engine or similar) to compute the predicted outcome.

This deployment of a complex predictive model, from its parent machine learning environment to an operational analytics environment, is one possible route to achieving continuous run-time prediction on streaming event data in real time.
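
As a rough sketch of what the serving side looks like, here is a minimal Python example of run-time scoring over a stream of transaction events. Everything in it is invented for illustration: the stub class stands in for a model exported from a machine learning environment (e.g., as PMML) and consumed by the platform, and the event fields and decision rule are hypothetical:

    class ThresholdModel:
        """Stand-in for a classifier built offline on aggregated historical data."""
        def predict(self, rows):
            # Toy rule: flag large amounts at high-risk merchants as fraud (1).
            return [1 if amount > 1000 and risk > 0.8 else 0
                    for amount, risk, _hours in rows]

    model = ThresholdModel()

    def features(txn):
        """Turn a raw transaction event into the model's feature vector."""
        return [txn["amount"], txn["merchant_risk"], txn["hours_since_last"]]

    def handle(txn):
        # Run-time prediction only; no retraining happens in this path.
        verdict = model.predict([features(txn)])[0]
        action = "DENY" if verdict == 1 else "approve"
        print(f"Transaction {txn['id']}: {action}")

    # In production this loop would be fed by a stream processor (CEP, Kafka, etc.).
    events = [
        {"id": 1, "amount": 42.50, "merchant_risk": 0.1, "hours_since_last": 5.0},
        {"id": 2, "amount": 2400.00, "merchant_risk": 0.9, "hours_since_last": 0.1},
    ]
    for event in events:
        handle(event)

The essential property is the one described above: each event is scored on arrival against a model that was built offline, and the scoring path contains no model-building work at all.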

Follow us @IoTCtrl | Join our Community

Read more…

IoT Without the Internet?

This is a guest post from Dana Blouin who writes about IoT at www.danablouin.com. You can follow him on Twitter @DanaBlouin

I recently took a trip to northern Thailand to support a volunteer effort at a school that serves a rural hill tribe. The project was backed by three Thai startups: Knit by Jib, Drvr and Bangkok Bike Finder. All are outstanding startups that value doing social good and believe that education is a fundamental necessity.

Village 1

The school itself isn’t easy to get to. After taking an overnight train from Bangkok to Chiang Mai, we still had two days of travel ahead, including the last leg: a three-hour drive in a 4×4 truck over mountain roads and paths.

While I was visiting this village, I wondered how I could help this school through my specific knowledge of and capabilities with the Internet of Things. There was just one issue: the village the school serves is completely off the grid — no power, no internet, and to get a mobile signal you have to travel about 15 km. How do you get the benefit of the IoT without actually having internet, or power for that matter?

Well, the power issue is fairly easy to address; this is Thailand, after all, and sun is one thing we have in abundance. A number of low-power devices can run on solar power and can be housed in a weatherproof case of some kind.

Of course there are solutions for such a lack of connectivity: a device can cache data and then send it up to the cloud later for analysis, or a mobile device can query sensors once in range to get real-time data.
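
A minimal Python sketch of that store-and-forward pattern — the file name, record fields and upload hook are all invented for illustration — where readings accumulate in a local cache and are cleared only once an upload actually succeeds:

    import json
    import time

    CACHE_FILE = "readings.jsonl"

    def record(reading):
        """Append a reading to the local cache; works with no connectivity."""
        with open(CACHE_FILE, "a") as f:
            f.write(json.dumps(reading) + "\n")

    def try_flush(upload):
        """Attempt to send cached readings; keep them if the upload fails."""
        try:
            with open(CACHE_FILE) as f:
                readings = [json.loads(line) for line in f]
        except FileNotFoundError:
            return
        if readings and upload(readings):
            open(CACHE_FILE, "w").close()  # clear the cache only after success

    record({"sensor": "water_tank", "level_pct": 74, "ts": time.time()})
    # try_flush() would be called opportunistically, e.g. when a phone with
    # connectivity comes into range of the sensor node.
    try_flush(upload=lambda batch: False)  # stand-in: no connectivity yet
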

So the tech solutions are there; the key was finding out how the IoT could help this school. I spent a lot of time exploring the area around the school and talking with the teachers to assess what the school needs to provide a better learning environment for the community it serves.

Water is really the primary concern for the school at this point. They can only get fresh water about five months out of the year. Water, being rather essential for life, clearly moves to the top of the list. Knit by Jib is working on a project now to move the point on the stream from which the school's water is sourced further upstream, which should allow them to get clean, fresh water year round.

Just because the IoT can’t physically bring the water to the school doesn’t mean it won’t have a role to play. I can envision sensors on the water tanks at the school to measure level, letting the teachers know when they need to open the valves to fill the tanks, or even possibly some sensors to check water quality; I still have some more research to do on this front.

Another issue the teachers face is nutrition, as the diet of the locals is very limited. It is often the case that the only balanced meal the students get each day is the one prepared by the teachers at school. To this end, the teachers currently run a school garden where they grow food used in some of the students’ meals. An automated watering system linked to a soil moisture sensor seems like a simple project that could be put together to help out in this regard (a rough sketch follows below). Of course, because there is no electricity in this village, the system would have to be solar powered so it could operate consistently, and that whole system would then need to be maintained. All are interesting challenges.
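
Here is the kind of control logic such a watering system might run — a minimal Python sketch in which the threshold, timing and both hardware functions are invented stubs, since a real build would drive an actual sensor and valve:

    import time

    MOISTURE_THRESHOLD = 0.30   # fraction; below this, treat the soil as dry
    WATERING_SECONDS = 120

    def read_soil_moisture():
        """Stub for reading a soil moisture sensor (0.0 = dry .. 1.0 = saturated)."""
        return 0.25

    def open_valve(seconds):
        """Stub for driving a solar-powered pump or valve relay."""
        print(f"Watering for {seconds} s")

    while True:
        if read_soil_moisture() < MOISTURE_THRESHOLD:
            open_valve(WATERING_SECONDS)
        time.sleep(30 * 60)  # check every half hour to conserve power

The simplicity is the point: a loop like this could run on a cheap solar-powered microcontroller, and building it could double as a hands-on lesson for the students.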

Ultimately it comes down to how much benefit this school can get from technology projects like these. Just installing them could help a little, but I am unsure how much benefit that would really bring. I have been thinking that the biggest contribution these projects could make would be as a learning experience for the children, and maybe to provide some inspiration along the way. The point of these projects and outreach is to benefit the school in its mission to provide quality education to this remote, rural area. I have a lot more thought to put into this before I can decide what benefit, if any, can be offered through technology. I will be sure to post more here as I work through ideas.

Read more…

Big Data from Small Devices?

Predictions are in our DNA.  Millions of us live with them daily, from checking the weather to reading daily horoscopes.   When it comes to Big Data, the industry has shown no shortage of predictions for 2014.  In fact, you might have read about insights on women in data science, ambitions for Machine Learning or a vision for the consumerization of Advanced Analytics.

It is quite difficult to accurately assess when these predictions will materialize. Some of them will see the light of day in 2014, but many might take until 2020 to fully mature.

Wearable Devices and Big Data

Take the case of wearable devices. There is no question that mobile phones, tablets and smart watches will become pervasive over the next 5 years. According to Business Insider, the market for wearables could reach $12B in 2018, and these devices have strong potential for changing our habits altogether.

The only issue is how quickly we will adopt them and in turn get clear value from them. Pioneers like Robert Scoble have made a great case for the opportunity but have also provided a down-to-earth perspective for the rest of us (his recent article on “Why Google Glass is doomed” is a gem).

So I predict that, while the tipping point for such technologies might be 2014, the true disruption might not happen before 2020. Why? Definitions and Center of Design.

For starters, the definition of a “wearable device” is still very loose. I’m a big fan of devices like the Jawbone UP, the Fitbit and the Basis watch. In fact, I’ve already built an analytical system that allows me to visualize my goals and to measure and predict my progress. My “smart devices” collect information I couldn’t easily understand before and offer the opportunity to know more about myself. Big Data growth will primarily come from these types of smart devices.

The wearables that are still confusing are the so-called “smart watches”. These watches, in my opinion, suffer from a “Center of Design” dilemma.

Let me explain: the technology industry is famous for wanting new technologies to sunset old ones. When Marc Benioff introduced Chatter, he said it would obliterate email. When PC shipments went down, the industry rushed to talk about the “Post-PC” era. Has either of these trends fully materialized yet?

The answer is unfortunately not simple. Smart watches, phones, tablets and PCs all have distinct use cases, just like email and social apps. Expecting one technology to completely overlap another disregards what I call a product’s “center of design”. The expression refers to the idea that a particular technology can be stretched to many uses but is particularly relevant for a set of defined use cases. Let’s take the example of the phone, tablet and PC:

  • A phone is best used for quickly checking texts, browsing emails, calendar invites…and of course making phone calls (duh!)
  • A tablet is best used for reading and browsing websites, documents, books and emails.  Typing for 12 hours and creating content is possible but it’s not a tablet’s center of design…
  • A PC or a MacBook is best for creating content for many hours. They might be best for typing, correcting and working on projects that require lots of editing.

When I see an ad like this on the freeway, I really question the value of an additional device. What can a watch add in this case, if the wrist that wears it is also connected to a hand that holds a much more appropriate device?

Big Data from wearables is a predictive insight for 2020, in my opinion, because I think that by then the broad public will have embraced them in use cases that truly add value to their lives.

--

Bruno Aziza is a Big Data entrepreneur and author. He has led Marketing at multiple start-ups and has worked at Microsoft, Apple and BusinessObjects/SAP. One of his startups sold to Symantec in 2008, and two of them have raised tens of millions and experienced triple-digit growth. Bruno is currently Chief Marketing Officer at Alpine Data Labs, loves soccer and has lived in France, Germany and the U.K.

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…

The Internet of Things refers to the network of physical objects embedded with software, electronics, network connectivity and sensors that enable those objects to collect and exchange data. It can be thought of as an invisible, intelligent network of things that communicate directly or indirectly with each other, using the internet to improve experience and efficiency. The Internet of Things enables communication between physical objects and other internet-enabled systems and devices. Technological advancement in healthcare, government initiatives for the expansion of the Internet of Things, and the need to increase efficiency and reduce costs are the major factors driving the Internet of Things market globally. However, security risks and over-reliance on technology pose challenges to the growth of the Internet of Things market.

 

The global Internet of Things market is segmented on the basis of technology into: ZigBee, Bluetooth low energy (BLE), near field communication (NFC), Wi-Fi and radio frequency identification (RFID). Wi-Fi technology led the global Internet of Things market in 2014, owing largely to its wide usage across settings such as campuses, schools, office buildings, lodging and residential homes, among others. Wi-Fi can provide a secure connection with software-defined networking by scanning and securing devices at the network entry point.

 

By application, the global Internet of Things market is divided into: industrial, automotive, consumer electronics, retail, healthcare and others (including energy and utilities, and entertainment). As of 2014, the industrial sector was the largest contributor to the global Internet of Things market. The potential for cyber-physical systems to improve productivity in the supply chain and production process is increasing demand for the Internet of Things. Healthcare and consumer electronics are expected to be the fastest-growing applications in the global Internet of Things market.

 

By geography, as of 2014, North America led the global Internet of Things market, accounting for 38.6% of the overall market. The rapid growth of the industrial, automotive and healthcare industries is the major factor driving the growth of the Internet of Things in North America. Europe held the second-largest market share, and demand for the Internet of Things is expected to increase in the region during the forecast period. The market in Europe is primarily driven by government regulation supporting the growth of the Internet of Things; strong regulations will ensure the effective operability of the Internet of Things concept in various application areas. The automotive industry is expected to contribute a large share in Europe. Asia Pacific is expected to be the fastest-growing region throughout the forecast period.

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…
