
Can the Public Internet Secure Our Digital Assets?

There is a lot of talk, and, indeed, hype, these days about the internet of things. But what is often overlooked is that the internet of things is also an internet of shared services and shared data. What’s more, we are becoming too heavily reliant on public internet connectivity to underpin innovative new services.

Take this as an example. Back in April, Ford Motor Company, Starbucks and Amazon announced and demonstrated an alliance that would allow a consumer to use Alexa to order and pay for their usual coffee selection from their car. Simply saying, “Alexa: ask Starbucks to start my order,” would trigger the sequence of events required to enable you to drive to the pickup point and collect your already-paid-for coffee with no waiting in line.

Making that transaction happen behind the scenes involves a complex integration of the business processes of all the companies involved. Let’s be clear: this is about data protection. For this series of transactions to be successfully handled, they must be able to share customer payment data, manage identity and authentication, and match personal accounts to customer profiles.

Because all of that critical data can be manipulated, changed or stolen, cyberattacks pose significant data protection risks for nearly any entity anywhere. These ambitious consumer innovations assume that the “secure” network underpinning this ecosystem for the transfer of all that valuable personal data is the public internet. And that’s the point – it isn’t secure.

As we’ve talked about previously on Syniverse's blog Synergy, the public internet poses a systemic risk to businesses and to confidential data. In short, when we are dealing on a large scale with highly sensitive data, the level of protection available today for data that, at any point, touches the public internet is substantially inadequate.

And this alliance between Ford and Starbucks is just one example of the type of innovation, across many different industry and consumer sectors, that we can expect to see a lot of in the very near future. These services will connect organizations that are sharing data and information about businesses and about consumers – about their purchase history, their preferences and requirements, and also about their likely future needs. This is potentially a very convenient and desired service from a consumer’s point of view, but at what cost?

We need security of connectivity, security from outside interference and the security of encrypted transfer and protection for our personal and financial data. And we need to be able to verify the protection of that data at all times by ensuring attribution and identity – both concepts we’ll explore more deeply in an upcoming blog post. And that’s a level of security that the public internet simply cannot provide.

Last month, an internet-based global ransomware attack took down systems and services all over the world – affecting sensitive personal healthcare data in the U.K. in particular.

Whether it is personal health records, financial records, data about the movement of freight in a supply chain, or variations in energy production and consumption, these are digital assets. Businesses, institutions and government bodies all over the world have billions of digital assets that must be constantly sent to and from different parties. And those assets require the type of high-level data protection that is not currently possible because of the systemic risk posed by the insecure public internet.

As mentioned in my last blog post on Synergy, there is an alternative. Some companies using private IP networks were able to carry on regardless throughout the high-profile cyberattacks that have been capturing headlines in the last year. That’s because those companies were not reliant on the public internet. Instead, they were all using what we are beginning to term “Triple-A” networks on which you can specify the speed and capacity of your Access to the network while guaranteeing the Availability of your connection. What’s more, on a Triple-A network, Attribution is securely controlled, so you know who and what is accessing your network and the level of authority granted both to the device accessing the network and to its user.

The public internet cannot provide or compete with a Triple-A level of security, nor should we expect it to. It cannot live up to the stringent data protection requirements necessary for today’s critical digital assets. We cannot remain content that so much infrastructure, from banking to transport to power supplies, relies on a network with so many known vulnerabilities. And we must consider whether we want to carry on developing an industrial internet of things and consumer services on a public network.

We will continue to explore these issues on this blog, to highlight different approaches, and examine the requirements of the secure networks of the future. And in the process, we’ll take a look at the work being done to build more networks with a Triple-A approach.

Read more…

How Can You Cope With the Rise of Dark Data?

At this point, everyone has heard about what big data analytics can do for marketing, research, and internal productivity. However, only about 20% of all data created is ever collected and analyzed. The other 80% is known as dark data: data that is collected but never analyzed or made searchable. So, what is the purpose of this data, and why is it taking up terabytes’ worth of storage space on servers around the world?

Examples of Dark Data

  • Media: Audio, video and image files are often not indexed, making it difficult to gain insights from them. The contents of these media files, such as the people in a recording or the dialogue within a video, remain locked within the file itself.

  • Social Data: Social media analytics have improved drastically over the last few years. However, data can only be gathered from a user’s point of entry to their exit point. If a potential customer follows a link on Facebook and then sends the page to five friends in a group chat, the firm will not realize its advertisement had six touchpoints rather than just one.

  • Search Histories: For many companies, especially in the financial services, healthcare, and energy industries, regulations are a constant concern. As legal compliance standards change, firms worry that they will end up deleting something valuable, so search and log histories are retained but rarely analyzed.

As analytics and automation improve, more dark data is being dragged out into the light. AI, for example, is getting far better at speech recognition. This allows media files to be automatically tagged with metadata and audio files to be transcribed in real time. Social data is also starting to be tracked with far better accuracy. In doing so, companies will be able to better understand their customers, their interests, and their buying habits. This will allow marketers to create limited, targeted ads based on a customer’s location that bring in more revenue while reducing cost.
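As a rough illustration of how audio dark data can be turned into something searchable, here is a minimal sketch using the open-source SpeechRecognition package and its free Google web API; the file name and metadata fields are hypothetical.

```python
# Minimal sketch: transcribe an audio file so its contents become searchable
# metadata rather than dark data. Assumes the open-source SpeechRecognition
# package (pip install SpeechRecognition); the file name is hypothetical.
import json
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.AudioFile("support_call_0142.wav") as source:   # hypothetical recording
    audio = recognizer.record(source)

try:
    transcript = recognizer.recognize_google(audio)      # free web API, rate-limited
except sr.UnknownValueError:
    transcript = ""                                       # speech was unintelligible

# Store the transcript as searchable metadata alongside the media file.
metadata = {"file": "support_call_0142.wav", "transcript": transcript}
print(json.dumps(metadata, indent=2))
```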

The explosion of data we are currently seeing is only the tip of the big data iceberg. As IoT and wearable devices continue their integration into our daily lives, the amount of data we produce will only grow. Companies are looking to get ahead of the curve and ensure they can gain as much insight from this data as possible. If these firms do not have a plan to create actionable insights from this currently dark data, they ultimately could fall behind and lose out to competitors with a bigger focus on analytics.

The original story was published on the ELEKS Trends Blog; visit it for more insights.

Read more…

An Open and Dangerous Place

Let’s just say it: The public internet is great, but it’s an unfit, wide-open place to try to conduct confidential business.

More and more, the public nature of the internet is causing business and government leaders to lose sleep. The global ransomware attacks this year that crippled infrastructure and businesses across Europe clearly show the concern is not only justified but also growing.

As a result, internet and privacy regulations, like GDPR and PSD2, are front and center as governments around the world increasingly look at the web and how it’s being used. This is creating competing and contradictory objectives.

On the one hand, governments want to protect consumer privacy and data; on the other, they want to be able to monitor what certain folks are up to on the internet. And in both cases, they can at least claim to be looking to protect people.

Regardless of the difficulty of the task, there is no doubt the big governments are circling and considering their options.

Speaking in Mexico in June, German Chancellor Angela Merkel touted the need for global digital rules, like those that exist for financial markets, and said those rules need to be enforceable through bodies like the World Trade Organization.

From a business perspective, I can applaud the ambition, but it does seem a little like trying to control the uncontrollable. The truth is that the public internet has come to resemble the old Wild West. It is an increasingly dangerous place to do business, with more than its fair share of rustlers, hustlers, and bandits to keep at bay.

The public internet connects the world and nearly all its citizens. When it comes to connecting businesses, national infrastructures, and governments themselves, trying to regulate the Wild West of the public internet simply isn’t an option. Instead, it’s time to take a step back and look for something different.

We believe organizations that want to conduct business, transfer data, monitor equipment and control operations globally – with certainty, security and privacy – should not be relying on the public internet. The sheer number of access points and endpoints creates an attack surface that is simply too wide to protect, especially with the increased trending of fog and edge networks that we’ve discussed on previous Syniverse blog posts.

Just last week, the online gaming store CEX was hacked. In an instant, around two million customers found their personal information and financial data had been exposed. Consumers in America, the U.K. and Australia are among those affected. As I said, the public internet presents an ever-widening attack surface.

Recently on the Syniverse blog, we’ve been talking about the need to develop private, closed networks where businesses, national utilities and governments can truly control not just access, but activity. Networks that are always on and ones where the owners always know who is on them and what they are doing. Networks that are private and built for an exact purpose, not public and adaptable.

Trying to apply or bolt on rules, regulations and security processes after the fact is never the best approach, especially when you are trying to apply them to a service that is omnipresent and open to anybody 24/7.

When we look at the public internet, we see fake actors, state actors, hackers and fraudsters roaming relatively freely. We see an environment where the efforts to police that state might raise as many issues as they solve.

Instead, it’s time for global businesses to build a new world. It’s time to leave the old Wild West and settle somewhere safer. It’s time to circle the wagons around a network built for purpose. That is the future.

Read more…

Why Edge Computing Is an IIoT Requirement

How edge computing is poised to jump-start the next industrial revolution.

From travel to fitness to entertainment, we now have killer apps for many things we never knew we needed. Over the past decade, we’ve witnessed tremendous improvements in terms of democratizing data and productivity across the consumer world.

Building on that, we’re entering a new era of software-defined machines that will transform productivity, products and services in the industrial world. This is the critical link which will drive new scenarios at even faster rates of innovation. By 2020, the Industrial Internet of Things (IIoT) is expected to be a $225 billion market.

To jump-start the productivity engine of IIoT, real-time response is needed at the machine level, at scale, and that requires an edge-plus-cloud architecture designed specifically for the Industrial Internet. From Google Maps to weather apps, we’ve been experiencing the benefits of cloud and edge computing working together in our daily lives for quite some time.

But what is the edge? The edge is the physical location that allows computing closer to the source of data. Edge computing enables data analytics to occur, and the resulting insights to be gleaned, closer to the machines. While edge computing isn’t new, it’s beginning to take hold in the industrial sector – and the opportunity is far greater than anything we’ve seen in the consumer sector. Here’s why:

Real-time data in a real-time world: The edge is not merely a way to collect data for transmission to the cloud. We are now able to process, analyze and act upon the collected data at the edge within milliseconds. It is the gateway for optimizing industrial data. And when millions of dollars and human lives are on the line, edge computing is essential for optimizing industrial data at every aspect of an operation.

Take wind farms, for example. If the wind direction changes, the edge software onsite collects and analyzes this data in real time and then tells the wind turbine to adjust appropriately via an edge device, such as a field agent and connected control system, capturing more kinetic energy. Because the data is not sent to the cloud, the processing time is significantly faster. This increases the turbines’ production and ultimately distributes more clean energy to our cities, increasing the value of renewable energy.
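To make the pattern concrete, here is a toy sketch of an edge control loop in the spirit of the wind-farm example above. The sensor read and yaw command are hypothetical stand-ins, simulated so the snippet runs on its own; the point is only that the decision and the action stay on-site, with no cloud round trip.

```python
# Illustrative edge control loop: read a local sensor, decide, actuate locally.
# read_wind_direction() and set_yaw() are hypothetical device interfaces,
# simulated here so the sketch is self-contained.
import random
import time

YAW_TOLERANCE_DEG = 5.0  # assumed dead band to avoid constantly hunting the wind

def read_wind_direction() -> float:
    # Hypothetical sensor read; simulated with noise around 270 degrees.
    return 270.0 + random.uniform(-15, 15)

def set_yaw(angle_deg: float) -> None:
    # Hypothetical command to the local turbine controller (field agent).
    print(f"yaw -> {angle_deg:.1f} deg")

def edge_loop(current_yaw: float = 270.0, cycles: int = 10) -> None:
    for _ in range(cycles):
        wind_dir = read_wind_direction()
        error = (wind_dir - current_yaw + 180) % 360 - 180  # shortest rotation
        if abs(error) > YAW_TOLERANCE_DEG:
            current_yaw = (current_yaw + error) % 360
            set_yaw(current_yaw)  # decision and action stay on-site, no cloud hop
        time.sleep(0.05)          # roughly 20 decisions per second

edge_loop()
```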

Big data, big trade-offs: The harsh and remote conditions of many industrial sites make it challenging to connect and cost-effectively transmit large quantities of data in real-time. We are now able to add intelligence to machines at the edge of the network, in the plant or field. Through edge computing on the device, we’re bringing analytics capabilities closer to the machine and providing a less expensive option for optimizing asset performance.

Consider the thousands of terabytes of data from a gas turbine. Sending this data to the cloud to run advanced analytics may be technologically possible, but it is certainly too cost-prohibitive to do on a daily basis. Through edge computing, we can capture streaming data from a turbine and use it in real time to prevent unplanned downtime and optimize production to extend the life of the machine.

What’s Next

Today, only 3% of data from industrial assets is usable. Connecting machines from the cloud to the edge will dramatically increase usable data by providing greater access to high-powered, cost-effective computing and analytics tools at the machine and plant level.

Consider the fact that for years traditional control systems were designed to keep a machine running the same way day in and day out for the lifecycle of the machine. At GE Energy Connections, we recently debuted the Industrial Internet Control System (IICS), which successfully allows machines to see, think and do and will enable machine learning at scale. To take IICS to the next level, we’re creating an ecosystem of edge offerings to accelerate widespread adoption across the industrial sector. We’re advancing this ecosystem and empowering app developers who want to play a role in driving the new industrial era. 

Currently, to add value to a software system, a developer writes the code, ports it into the legacy software stack, shuts down the devices and, finally, updates it. That’s all going to change. We are working on creating an opportunity for any developer to create value-added edge applications. Customers will be able to port the necessary apps to their machines without having to shut them down, just as we do on our phones today. Companies will be able to download apps for their needs and update them frequently to ensure their business is running smoothly. While no one likes to run out of battery on their smartphone, an outage for a power plant is far more costly, so the ability to port apps without shutting down devices, and to detect issues before they occur, will be a game changer.

From wind turbines to autonomous cars, edge computing is poised to completely revolutionize our world. It’s forcing change in the way information is sent, stored and analyzed.  And there’s no sign of slowing down.

Read more…

How Customer Analytics has evolved...

Customer analytics has been one of the hottest buzzwords for years. A few years back it was the marketing department’s monopoly, carried out on limited volumes of customer data stored in relational databases like Oracle or appliances like Teradata and Netezza.
SAS and SPSS were the leaders in providing customer analytics, but it was largely restricted to segmenting the customers most likely to buy your products or services.
In the 90s came web analytics, popular for tracking page hits, session times and visitor cookies, and then feeding those signals into customer analytics.
By the late 2000s, Facebook, Twitter and all the other social channels changed the way people interacted with brands and each other. Businesses needed to have a presence on the major social sites to stay relevant.
With the digital age, things have changed drastically. The customer is superman now. Their mobile interactions have increased substantially, and they leave a digital footprint everywhere they go. They are more informed, more connected, always on, and looking for an exceptionally simple and easy experience.
This tsunami of data has changed customer analytics forever.
Today customer analytics is no longer restricted to marketing for churn and retention; the focus is shifting to improving the customer experience, and it is practiced by every department of the organization.
A lot of companies have had problems integrating large volumes of customer data across various databases and warehouse systems, and they are not completely sure which key metrics to use for profiling customers. Hence the customer 360-degree view became the foundation for customer analytics: it captures all customer interactions, which can then be used for further analytics.
From the technology perspective, the biggest change is the introduction of big data platforms, which can run analytics very fast on all the data an organization has, instead of relying on sampling and segmentation.
Then came cloud-based platforms, which scale up and down as the analysis requires, so companies don’t have to invest upfront in infrastructure.
Predictive models of customer churn, retention and cross-sell still exist today, but they run against more data than ever before.
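As a minimal sketch of what such a churn model can look like in practice, here is a scikit-learn example trained on a hypothetical customer-360 extract; the file name and column names are assumptions for illustration only, not a prescribed schema.

```python
# Minimal churn-model sketch on a hypothetical customer-360 extract.
# The CSV file and column names are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_360.csv")            # hypothetical extract
features = ["tenure_months", "monthly_spend", "support_tickets", "logins_last_30d"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score hold-out customers by churn probability and report model quality.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```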
Analytics itself has further evolved from descriptive to predictive to prescriptive. Merely showing what will happen next is no longer enough; knowing what actions to take is becoming more critical.
There are various ways customer analytics is carried out:
·       Acquiring all the customer data
·       Understanding the customer journey
·       Applying big data concepts to customer relationships
·       Finding high propensity prospects
·       Upselling by identifying related products and interests
·       Generating customer loyalty by discovering response patterns
·       Predicting customer lifetime value (CLV)
·       Identifying dissatisfied customers & churn patterns
·       Applying predictive analytics
·       Implementing continuous improvement
Hyper-personalization now takes center stage, giving your customer the right message, on the right platform, using the right channel, at the right time.
Now, with cognitive computing and artificial intelligence using IBM Watson, Microsoft and Google cognitive services, customer analytics will become sharper as deep learning neural network algorithms provide a game-changing edge.
Tomorrow, customer sentiment analytics may not be based only on feedback, surveys or social media; with the help of cognitive technology, it may read what a customer’s facial expressions show in real time.
There’s no doubt that customer analytics is absolutely essential for brand survival.
Read more…
The digital revolution has created significant opportunities and threats for every industry. Companies that cannot or do not change their business model fast enough in response to a disruption are unlikely to survive.
It is extremely important to do a digital maturity assessment before embarking on digital transformation.
Digital leaders must respond to the clear and present threat of digital disruption by transforming their businesses. They must embed digital capabilities into the very heart of their business, making digital a core competency, not a bolt-on. Creating lasting transformative digital capabilities requires you to build a customer-centric culture within your organization.
This requires organizations to acquire and develop new capabilities, including disruptive technologies like Big Data, Analytics and the Internet of Things, as well as newer business models.
A digital maturity model measures the readiness of the organization to attain higher value in digital customer engagement, digital operations or digital services. It helps in the incremental adoption of digital technologies and processes to drive competitive strategies, greater operational agility and faster responses to rapidly changing market conditions.
Businesses can use the maturity model to define a roadmap and measure progress against its milestones.
The levels of maturity are defined in slightly different ways across the many reports available; adopt the definitions that make the most sense for your business.

·     Level 1: Project-based solutions are developed for a particular problem; no integration with homegrown systems; unaware of risks and opportunities
·     Level 2: Departmentalized projects, still not known across the organization; little integration
·     Level 3: Solutions are shared between departments for a common business problem; better integration
·     Level 4: Organization-wide digital efforts; highly integrated; an adaptive, fail-fast-and-improve culture
·     Level 5: Driven by the CXOs; customer-centric; complete transformation of the organization
Here are the 7 categories in which a business should ask questions of all stakeholders to gauge the maturity of its digital transformation and identify improvements and priorities.
1.   Strategy & Roadmap – How the business operates or transforms to increase its competitive advantage through digital initiatives that are embedded within the overall business strategy
2.   Customer – Are you providing an experience to customers on their preferred channels, online and offline, anytime, on any device?
3.   Technology – Do you have the relevant tools and technologies to make data available across all systems?
4.   Culture – Do you have the organization structure and culture to drive digital from the top down?
5.   Operations – Digitizing and automating processes to enhance business efficiency and effectiveness
6.   Partners – Are you utilizing the right partners to augment your expertise?
7.   Innovation – How are employees encouraged to bring continuous innovation to how they serve customers?
Finally, how do you know when you are digitally transformed?
·             When nobody has “Digital” in their title
·             When there is no marketing focused on digital within the organization
·             When there is no digital strategy separate from the company’s business strategy
Read more…
As businesses try to leverage every IoT opportunity by finding ways to partner with top universities and research centers, here is a list of the top 20 co-occurring topics among the top 500 Internet of Things authors in the academic field. This gives an idea of the research frontiers being explored by the leaders.
Read more…

18 Big Data tools you need to know!!

In today’s digital transformation, big data has given organizations an edge: analyzing customer behavior and hyper-personalizing every interaction results in cross-sell, improved customer experience and, obviously, more revenue.
The market for Big Data has grown steadily as more and more enterprises have implemented a data-driven strategy. While Apache Hadoop is the most well-established tool for analyzing big data, there are thousands of big data tools out there, all of them promising to save you time and money and to help you uncover never-before-seen business insights.
I have selected a few to get you going.
Avro: Developed by Doug Cutting, it is a data serialization system used for encoding the schema of Hadoop files.
 
Cassandra: is a distributed and Open Source database. Designed to handle large amounts of distributed data across commodity servers while providing a highly available service. It is a NoSQL solution that was initially developed by Facebook. It is used by many organizations like Netflix, Cisco, Twitter.
 
Drill: An open source distributed system for performing interactive analysis on large-scale datasets. It is similar to Google’s Dremel, and is managed by Apache.
 
Elasticsearch: An open source search engine built on Apache Lucene. Developed in Java, it can power extremely fast searches that support your data discovery applications.
 
Flume: is a framework for populating Hadoop with data from web servers, application servers and mobile devices. It is the plumbing between sources and Hadoop.
 
HCatalog: is a centralized metadata management and sharing service for Apache Hadoop. It allows for a unified view of all data in Hadoop clusters and allows diverse tools, including Pig and Hive, to process any data elements without needing to know physically where in the cluster the data is stored.
 
Impala: provides fast, interactive SQL queries directly on your Apache Hadoop data stored in HDFS or HBase using the same metadata, SQL syntax (Hive SQL), ODBC driver and user interface (Hue Beeswax) as Apache Hive. This provides a familiar and unified platform for batch-oriented or real-time queries.
 
JSON: Many of today’s NoSQL databases store data in the JSON (JavaScript Object Notation) format that has become popular with web developers.
 
Kafka: is a distributed publish-subscribe messaging system that offers a solution capable of handling all data flow activity and processing this data on a consumer website. This type of data (page views, searches, and other user actions) is a key ingredient of the current social web.
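To show the publish-subscribe pattern in a few lines, here is a minimal sketch using the kafka-python client; the broker address and topic name are assumptions, and a Kafka broker must already be running for it to work.

```python
# Minimal publish/subscribe sketch with kafka-python (pip install kafka-python).
# Broker address and topic name are illustrative assumptions.
import json
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("page-views", {"user": "u123", "url": "/pricing"})  # publish an event
producer.flush()

consumer = KafkaConsumer(
    "page-views",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:   # subscribe to the stream and process each event
    print(message.value)
    break
```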
 
MongoDB: is a document-oriented NoSQL database, developed as open source. It comes with full index support and the flexibility to index any attribute and scale horizontally without affecting functionality.
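A minimal document-store sketch with the pymongo driver is shown below; the database, collection and field names are made up for illustration, and a local MongoDB instance is assumed to be running.

```python
# Minimal document-store sketch with pymongo (pip install pymongo).
# Database, collection and field names are illustrative assumptions.
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

# Insert a JSON-like document and index an arbitrary attribute.
orders.insert_one({"customer": "u123", "total": 42.50, "items": ["latte", "muffin"]})
orders.create_index([("customer", ASCENDING)])

for doc in orders.find({"customer": "u123"}):
    print(doc)
```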
 
Neo4j: is a graph database that boasts performance improvements of up to 1000x or more when compared with relational databases.
Oozie: is a workflow processing system that lets users define a series of jobs written in multiple languages, such as MapReduce, Pig and Hive, and then intelligently links them to one another. Oozie allows users to specify dependencies between jobs.
 
Pig: is a Hadoop-based language developed by Yahoo. It is relatively easy to learn and is adept at very deep, very long data pipelines.
 
Storm: is an open source, free system for real-time distributed computing. Storm makes it easy to reliably process unstructured data flows in real time. It is fault-tolerant and works with nearly all programming languages, though Java is typically used. Originally open-sourced by Twitter, Storm is now part of the Apache family.
 
Tableau: is a data visualization tool with a primary focus on business intelligence. You can create maps, bar charts, scatter plots and more without the need for programming. Tableau recently released a web data connector that allows you to connect to a database or API, giving you the ability to bring live data into a visualization.
 
ZooKeeper: is a service that provides centralized configuration management and a naming registry for large distributed systems.
 
Every day more tools are added to the big data technology stack, and it is extremely difficult to keep up with each and every one of them. Select a few that you can master and keep upgrading your knowledge.
Read more…
The IoT needs to be distinguished from the Internet. The Internet, of course, represents a globally connected set of networks, irrespective of wired or wireless interconnection. IoT, on the other hand, specifically refers to the ability of a ‘device’ to be tracked or identified within an IP structure, according to the original conception.
Read more…
Today, in the digital age, every business is using big data and machine learning to target users with messaging in a language they really understand, and to push offers, deals and ads that appeal to them across a range of channels.
With the exponential growth in data from people and the internet of things, a key to survival is to use machine learning to make that data more meaningful and relevant, enriching the customer experience.
Machine learning can also wreak havoc on a business if improperly implemented. Before embracing this technology, enterprises should be aware of the ways machine learning can fall flat. Data scientists have to take extreme care while developing these models so that they generate the right insights for the business to consume.
Here are 5 ways to improve the accuracy and predictive ability of a machine learning model and ensure it produces better results.
·       Ensure that you have a variety of data that covers almost all scenarios and is not biased toward any one situation. In the early Pokémon Go days there were reports that the game favored mostly white neighborhoods; the creators of the algorithm had failed to provide a diverse training set and didn’t spend time in other neighborhoods. Instead of working with limited data, ask for more data; that will improve the accuracy of the model.
·       Often the data received has missing values. Data scientists have to treat outliers and missing values properly to increase accuracy. There are multiple methods to do this: impute the mean, median or mode for continuous variables, and use a separate class for categorical variables. For outliers, either delete them or apply transformations.
·       Finding the right variables or features that have the maximum impact on the outcome is one of the key aspects. This comes from better domain knowledge and visualization. It’s imperative to consider as many relevant variables and potential outcomes as possible before deploying a machine learning algorithm.
·       Ensembling combines multiple models to improve accuracy, using techniques such as bagging and boosting. An ensemble can deliver better predictive performance than any single model; random forests are often used for ensembling (see the sketch after this list).
·       Re-validate the model at a proper frequency. It is necessary to score the model with new data every day, week or month, depending on how the data changes. If required, rebuild the models periodically with different techniques to challenge the model currently in production.
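Here is the ensembling sketch referenced above: a single decision tree compared with a random-forest ensemble in scikit-learn. A built-in toy dataset is used so the example is self-contained; a real project would of course substitute its own data.

```python
# Compare a single decision tree with a random-forest (bagged) ensemble.
# Uses a scikit-learn toy dataset so the sketch runs on its own.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

single_tree = DecisionTreeClassifier(random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0)  # ensemble of trees

# Cross-validated accuracy: the ensemble typically beats the single model.
print("single tree  :", cross_val_score(single_tree, X, y, cv=5).mean())
print("random forest:", cross_val_score(forest, X, y, cv=5).mean())
```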
There are some more ways but the ones mentioned above are foundational steps to ensure model accuracy.
Machine learning puts superpowers in the hands of an organization, but as the Spider-Man movie reminds us, “with great power comes great responsibility,” so use it properly.
Read more…

Smart IoT - Generate Greatest Value

Digital Transformation

We have now entered the era of a new digital revolution, in particular the Internet of Things (IoT). The digital revolution marked the start of the information age. We use the Internet almost every day; it has become one of the established ways for us to work together, share our lives with others, shop, teach, research, and learn. The next wave of the Internet, however, is not about people. It is about things.

All about IoT

IoT is defined as the network of physical objects that can be accessed through the Internet. These objects contain embedded technology that allows them to interact with their internal states or the external environment.

IoT can be characterized as networks of sensors and actuators connected to computing systems, where the computing systems can monitor or manage the status and actions of connected objects and machines, and the connected sensors can likewise monitor the natural world, people, and animals. The core of IoT is not just about connecting things to the Internet. It is about how to generate and use the big data from those things to create new value for people, and about how we enable new exchanges of value between them. In other words, when objects can sense and communicate, IoT changes how and where decisions are made, and who makes them, to obtain a better outcome, solution, or service.

Smart IoT

Fundamental to the value of IoT is, in fact, the Internet of smart things (smart IoT). Supported by intelligent optimization, smart IoT can increase the productivity of work and enhance people’s quality of life. Let us take “cities” – the engines of global economic growth – as an example. Smart cities have the potential to dramatically improve the lives of everyone. In intelligent transportation systems (ITS), smart IoT can not only monitor the status of the transportation network, but also optimize traffic signal control to relieve congestion and provide travelers with better routes and appropriate transportation information. Combining IoT and machine learning (ML) can also make our roads safer. The benefits of smart IoT have also been demonstrated in healthcare, logistics, the environment, and the smart home, in the form of better quality, energy conservation, and increased efficiency.

Smart IoT remains in its infancy in terms of technology development and its effect on the global economy and our daily lives. Most IoT data is not actually used today, even in this era of big data, and most IoT deployments have no built-in intelligence, even in this era of artificial intelligence (AI). The IoT systems in use today are mostly for anomaly detection and control, as opposed to optimization and prediction. Given the tremendous anticipated growth of the Internet over the next 10 years, one of the vital challenges and opportunities before us is to invent, and apply in real-world programs, ways to make the IoT smarter so that it generates the greatest value.
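For a sense of what the basic "anomaly detection and control" pattern mentioned above looks like, here is a minimal sketch that flags sensor readings far from a running mean. The reading stream is simulated and the thresholds are illustrative assumptions, not a production design.

```python
# Minimal streaming anomaly detection: flag readings that drift far from
# the recent mean. The sensor stream is simulated; thresholds are assumptions.
import random
import statistics

window = []  # recent readings

def is_anomaly(value, z_threshold=3.0):
    if len(window) < 30:                      # not enough history yet
        return False
    mean = statistics.fmean(window)
    stdev = statistics.pstdev(window) or 1e-9
    return abs(value - mean) / stdev > z_threshold

for _ in range(500):
    reading = random.gauss(70.0, 1.5)         # simulated temperature sensor
    if random.random() < 0.01:
        reading += 15.0                       # inject an occasional fault
    if is_anomaly(reading):
        print(f"anomaly detected: {reading:.1f}")
    window.append(reading)
    window = window[-200:]                    # keep a sliding window of history
```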

 

 

Read more…
The Internet of Things (IoT) began as an emerging trend and has now become one of the key elements of Digital Transformation that is driving the world in many respects.
If your thermostat or refrigerator is connected to the Internet, it is part of the consumer IoT. If your factory equipment has sensors connected to the internet, it is part of the Industrial IoT (IIoT).
IoT has an impact on end consumers, while IIoT has an impact on industries like Manufacturing, Aviation, Utility, Agriculture, Oil & Gas, Transportation, Energy and Healthcare.
IoT refers to the use of "smart" objects, which are everyday things from cars and home appliances to athletic shoes and light switches that can connect to the Internet, transmitting and receiving data and connecting the physical world to the digital world.
IoT is mostly about human interaction with objects. Devices can alert users when certain events or situations occur, or monitor activities (a minimal alert sketch follows the list below):
·       Google Nest sends an alert when the temperature in the house drops below 68 degrees
·       Garage door sensors alert when open
·       Turn up the heat and turn on the driveway lights a half hour before you arrive at your home
·       Meeting room that turns off lights when no one is using it
·       A/C switch off when windows are open
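Here is the alert sketch referenced above: a toy version of the consumer-IoT pattern where a reading crossing a threshold triggers a notification. The sensor read and notification channel are hypothetical stand-ins.

```python
# Toy consumer-IoT alert: notify the user when a reading crosses a threshold.
# The thermostat read and the push channel are hypothetical stand-ins.
ALERT_THRESHOLD_F = 68.0

def read_thermostat_f():
    return 66.5   # hypothetical reading from a connected thermostat

def notify(message):
    print("push notification:", message)   # stand-in for a real push service

temperature = read_thermostat_f()
if temperature < ALERT_THRESHOLD_F:
    notify(f"House temperature dropped to {temperature:.1f}F")
```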
IIoT, on the other hand, focuses more on worker safety and productivity, and monitors activities and conditions with remote-control capability:
·       Drones to monitor oil pipelines
·       Sensors to monitor Chemical factories, drilling equipment, excavators, earth movers
·       Tractors and sprayers in agriculture
·       Smart cities might be a mix of commercial IoT and IIoT.
IoT is important but not critical, while an IIoT failure often results in life-threatening or other emergency situations.
IIoT provides an unprecedented level of visibility throughout the supply chain. Individual items, cases, pallets, containers and vehicles can be equipped with auto identification tags and tied to GPS-enabled connections to continuously update location and movement.
IoT generates medium to high volumes of data, while IIoT generates truly huge amounts of data (a single turbine compressor blade can generate more than 500GB of data per day), so it requires Big Data, cloud computing and machine learning as part of its computing stack.
In the future, IoT will continue to enhance our lives as consumers, while IIoT will enable efficient management of the entire supply chain.
Read more…
How many times have you listened to the advice of a friend, a colleague or someone you know about investing in the stock market? Many people have gained and lost fortunes through this guesswork, and the younger generation is now more wary of handing over their hard-earned money to someone else to invest.
Until recently, you had two options for investing: hire a human financial advisor or do it yourself. Human advisors charge substantial fees, starting at a minimum of 1% of the value of the assets they manage. The do-it-yourself option requires a lot of time and energy, and you may lose money through overtrading, panic-selling during downturns and trying to time the market; the issue for many individuals is that they aren’t cut out to go it alone.
This is where robo-advisors have scored more over humans.
A robo-advisor is an online, automated wealth management service based on data science algorithms, with no or minimal human intervention, that allocates, deploys and rebalances your investments (spreading your money across stocks, mutual funds and bonds to balance risk).
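To make the rebalancing idea concrete, here is a minimal sketch that computes the trades needed to bring a portfolio back to a target allocation; the asset classes, weights and values are hypothetical.

```python
# Minimal rebalancing sketch: compute trades that restore a target allocation.
# Asset classes, target weights and current values are hypothetical.
target_weights = {"stocks": 0.60, "bonds": 0.30, "cash": 0.10}
holdings_value = {"stocks": 7200.0, "bonds": 2100.0, "cash": 700.0}

total = sum(holdings_value.values())
for asset, weight in target_weights.items():
    desired = total * weight
    drift = holdings_value[asset] - desired       # positive means overweight
    action = "sell" if drift > 0 else "buy"
    print(f"{asset:6s}: {action} {abs(drift):8.2f} to return to {weight:.0%}")
```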
The robo-advisor industry is in its infancy. Online life is migrating from personal desktop computing to laptops to tablets and, finally, to mobile.
Here are some of the advantages of using a robo-advisor:
·       Cheaper fees or free compared to traditional financial advisors
·       Automatic diversification into various options
·       Easy online access as we all are accustomed to shiny apps on mobile
·       Safer than picking your own stocks
·       You don’t need a degree in finance to understand the recommendations.
Big data and advanced analytics can help broaden the scope of robo-advice dramatically, incorporating financial planning into broader retirement planning, tax planning, vacation savings and higher-education planning.
Robo-advisors have typically targeted the millennial segment, because these young investors want to save and multiply money faster and often don’t have enough patience and wealth to warrant the attention and interest of a human advisor.
High-net-worth individuals also think online and automated investment tools can positively affect their wealth manager’s advice and decision-making.
Overall, robo-advisors provide a good user experience with the latest digital technologies, such as slick apps and fancy interfaces. These platforms make sure that they fit right in with your daily online browsing, and they are great options for novice investors who are just starting out and want to dip their toes into the world of investments, or for people with a simple financial plan who just need an affordable, straightforward place to start their retirement planning.
Wealthfront and Betterment are two popular commercial, fee-based robo-advisors available today. In the free category, WiseBanyan and Charles Schwab are gaining ground.
But it won’t be long before Amazon, Google, Facebook and Apple get in on the robo-advisor industry.
Robo-advice is certainly here to stay, and it has its place in the wealth management landscape of tomorrow. But what’s missing most with robo-advisors is the personal touch. In this age of hyper-personalization, the lack of a human element is one area where robo-advisors may fall short.
A robo-advisor can’t replace the trusted, long-standing advisor your elders have worked with, who lives nearby, can rush right over in case of need, and knows you and your family.

With the pace of improvement that artificial intelligence and machine learning are bringing, robo-advice has the potential to become highly personalized and specific over time.
Read more…
Today, with the digitization of everything, 80 percent of the data being created is unstructured.
Audio, video, our social footprints, the data generated from conversations between customer service reps and customers, and the tons of legal document text processed in the financial sector are all examples of unstructured data stored in big data systems.
Organizations are turning to natural language processing (NLP) technology to derive understanding from the myriad unstructured data available online and in call logs.
Natural language processing (NLP) is the ability of computers to understand human speech as it is spoken. NLP is a branch of artificial intelligence that has many important implications on the ways that computers and humans interact. Machine Learning has helped computers parse the ambiguity of human language.
Apache OpenNLP, the Natural Language Toolkit (NLTK) and Stanford NLP are open source NLP libraries used in the real-world applications below.
Here are multiple ways NLP is used today:
The most basic and well-known application of NLP is spell checking in Microsoft Word.
Text analysis, also known as sentiment analytics, is a key use of NLP. Businesses are most concerned with understanding how their customers feel emotionally and using that data to improve their service.
Email filters are another important application of NLP. By analyzing the emails that flow through their servers, email providers can calculate the likelihood that an email is spam based on its content, using Bayesian (naive Bayes) spam filtering.
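In the spirit of the Bayesian filtering just described, here is a toy naive Bayes spam classifier built with scikit-learn; the tiny training set is made up purely for illustration.

```python
# Toy naive Bayes spam filter with scikit-learn; the training set is made up.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "win a free prize now", "cheap loans click here",
    "meeting agenda for tomorrow", "lunch at noon?",
]
labels = ["spam", "spam", "ham", "ham"]

# Bag-of-words features feeding a multinomial naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

print(model.predict(["free prize inside"]))        # likely 'spam'
print(model.predict_proba(["free prize inside"]))  # spam likelihood from content
```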
Call center representatives engage with customers and hear lists of specific complaints and problems. Mining this data for sentiment can lead to incredibly actionable intelligence that can be applied to product placement, messaging, design, or a range of other use cases.
Google and Bing and other search systems use NLP to extract terms from text to populate their indexes and to parse search queries.
Google Translate applies machine translation technologies in not only translating words, but in understanding the meaning of sentences to provide a true translation.
Many important decisions in financial markets use NLP by taking plain text announcements, and extracting the relevant info in a format that can be factored into algorithmic trading decisions. E.g. news of a merger between companies can have a big impact on trading decisions, and the speed at which the particulars of the merger, players, prices, who acquires who, can be incorporated into a trading algorithm can have profit implications in the millions of dollars.
Since the invention of the typewriter, the keyboard has been the king of the human-computer interface. But today, voice-driven virtual assistants like Amazon’s Alexa, Google Now, Apple’s Siri and Microsoft’s Cortana respond to vocal prompts and do everything from finding a coffee shop to getting directions to the office, as well as tasks like turning on the lights at home or switching on the heat, depending on how digitized and wired-up our lives are.
Question Answering - IBM Watson is the most prominent example of question answering via information retrieval that helps guide in various areas like healthcare, weather, insurance etc.
Natural language processing therefore plays a very important role in new human-machine interfaces. It is an essential tool for leading-edge analytics and for the near future.
Read more…
Analytics and big data have disrupted many industries, and now they are on the verge of scoring major points in sports. Over the past few years, the world of sports has experienced an explosion in the use of analytics.
Until a few years ago, experience, gut feelings and superstition traditionally shaped the decision-making process in sports.
It first started with the Oakland Athletics’ general manager, Billy Beane, who applied analytics to selecting the right players. This was the first well-known use of statistics and data to make decisions in professional sports.
Today, every major professional sports team either has an analytics department or an analytics expert on staff.  From coaches and players to front offices and businesses, analytics can make a difference in scoring touchdowns, signing contracts or preventing injuries.
Big name organizations such as the Chicago Cubs, and Golden State Warriors are realizing that this is the future of sports and it is in their best interest to ride the wave while everyone else is trying to learn how to surf.
Golden State Warriors, have similarly used big data sets to help owners and coaches recruit players and execute game plans.
SportVu has six cameras installed in the NBA arenas to track the movements of every player on the court and the basketball 25 times per second. The data collected provides a plethora of innovative statistics based on speed, distance, player separation and ball possession to improve next games.
Adidas miCoach app works by having players attach a wearable device to their jerseys. Data from the device shows the coach who the top performers are and who needs rest. It also provides real-time stats on each player, such as speed, heart rate and acceleration.
Patriots developed a mobile app called Patriots Game Day Live, available to anyone attending a game at Gillette Stadium. With this app, they are trying to predict the wants and needs of fans, special content to be delivered, in-seat concession ordering and bathroom wait times.
FiveThirtyEight.com provides more than just baseball coverage. It has over 20 journalists crunching numbers for fans to gain a better understanding of an upcoming game, series or season.
Motus’ new sleeves track a pitcher’s throwing motion, measuring arm stress, speed and shoulder rotation. The advanced data generated from this improves a player’s health, performance and career. Experts can now predict with greater confidence if and when a pitcher with a certain throwing style will get injured.

In the recent Cricket World Cup, every team had its own team of data analysts. They used various technologies, such as cloud platforms and visualizations, to predict scores, player performance, player profiles and more. Around 40 years’ worth of Cricket World Cup data is being mined to produce insights that enhance the viewer’s experience.
Analytics can advance the sports fans' experience as teams and ticket vendors compete with the at-home experience -- the better they know their fans, the better they can cater to them.
This collection of data is also used for internet ads, which can help with the expansion and growth of your organization through social media platforms or websites. 
  • What would be the most profitable food served at the concession stand?
  • What would be the best prices to sell game day tickets?
  • Which player on the team is the most productive?
  • Which players in the draft will become all-stars, and which ones will be considered role players?
  • Understand the fans behavior at the stadium via their app and push relevant information accordingly.
In this Digital age, Analytics are the present and future of professional sports. Any team that does not apply them to the fullest is at a competitive disadvantage.
Read more…

The Untapped Potential of Data Analytics

The potential of big data just keeps growing. For taking full advantage, companies need to incorporate analytics into their strategic objectives.

A research report from McKinsey Global Institute (MGI), suggests that the opportunity and applications continue to expand in the data-driven world.

With rapid technological transformation, the question for businesses is how to position themselves uniquely in a world leveraging analytics. Over 2.5 quintillion bytes of data are generated every day. As information pours in via various digital platforms, VR applications and mobile phones, the need for data storage capacity has increased.

The transformational potential

The recent progress shows the potential of big data and analytics in more than five distinct domains. However, transforming to a data-driven decision-making organisation is not always simple.

The first challenge is to incorporate data and analytics, along with business objectives, into a core strategic vision. The second is the lack of talent for adopting analytics: recent reports note that, despite training programs, the supply of talent is not enough to match demand. The next step is to develop the right business processes and framework, including the data infrastructure.

Simply combining technology systems along with the existing business operations isn't enough. For ensuring a successful transformation, all aspects of business activity need to be evaluated and combined to realize the full potential of data analytics.

Incorporating data analytics

The next generation of analytic tools will unleash even bigger opportunities. With new machine-learning, deep-learning and artificial-intelligence capabilities, an enormous variety of applications can be enabled which provide customer service, manage logistics and analyze data.

Technology and productivity gains seem an advantage, but also carry the risk of people losing jobs. A case of automation is the AI software developed by Bridgewater Associates, the world's largest hedge fund to improve efficiency.

With Data and analytics shaking up every industry, the effects will only become more noticeable as adoption reaches the masses.

As machines gain unprecedented capabilities to solve complex problems, organizations can harness these capabilities to create their unique value proposition and solve problems.

 

Read more…

Do you know how powerful real-time analytics can be?

In today’s digital age, the world has become smaller and faster.
Global audio and video calls, once available only in corporate offices, are now available to the common man on a smartphone.
Consumers have more product information and comparisons than the manufacturers, at any time, in any place and on any device.
Gone are the days when organizations loaded data into their data warehouse overnight and made decisions based on BI the next day. Today organizations need actionable insights faster than ever before to stay competitive, reduce risks, meet customer expectations, and capitalize on time-sensitive opportunities – in real time or near real time.
Real time is often defined in microseconds, milliseconds or seconds, while near real time is measured in seconds or minutes.
With real-time analytics, the main goal is to solve problems quickly as they happen, or even better, before they happen. Real-time recommendations create a hyper-personal shopping experience for each and every customer.
The Internet of Things (IoT) is revolutionizing real-time analytics. Now, with sensor devices and the data streams they generate, companies have more insight into their assets than ever before.
Several industries are already using this streaming data and putting real-time analytics to work:
·        Churn prediction in Telecom
·        Intelligent traffic management in smart cities
·        Real-time surveillance analytics to reduce crime
·        Impact of weather and other external factors on stock markets to take trading decisions
·        Real-time staff optimization in Hospitals based on patients 
·        Energy generation and distribution based on smart grids
·        Credit scoring and fraud detection in financial & medical sector
Here are some real world examples of real-time analytics:
·        City of Chicago collects data from 911 calls, bus & train locations, 311 complaint calls & tweets to create a real-time geospatial map to cut crimes and respond to emergencies
·        The New York Times pays attention to their reader behavior using real-time analytics so they know what’s being read at any time. This helps them decide which position a story is placed and for how long it’s placed there
·        Telefonica the largest telecommunications company in Spain can now make split-second recommendations to television viewers and can create audience segments for new campaigns in real-time
·        Invoca, the call intelligence company, is embedding IBM Watson cognitive computing technology into its Voice Marketing Cloud to help marketers analyze and act on voice data in real-time.
·        Verizon now enables artificial intelligence and machine learning, predicting the customer intent by mining unstructured data and correlations
·        Ferrari, Honda and Red Bull use data generated by over 100 sensors in their Formula One cars and apply real-time analytics, giving drivers and their crews the information they need to make better decisions about pit stops, tire pressures, speed adjustments and fuel efficiency.
Real-time analytics helps get the right products in front of the people looking for them, and offer the right promotions to the people most likely to buy. For gaming companies, it helps in understanding which types of individuals are playing which game and in crafting an individualized approach to reach them.
As the pace of data generation and the value of analytics accelerate, real-time analytics is the topmost choice for riding this tsunami of information.
More and more tools, such as Cloudera Impala, AWS, Spark and Storm, offer the possibility of real-time processing of Big Data and provide analytics.
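As a small taste of what stream processing with one of these tools looks like, here is a minimal PySpark Structured Streaming sketch that counts events per sensor over one-minute windows as they arrive. The socket source, port and line format are illustrative assumptions; a production job would more likely read from Kafka or a cloud stream.

```python
# Minimal Spark Structured Streaming sketch: count events per sensor per minute.
# Socket source, port and "sensor_id,value" line format are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("realtime-counts").getOrCreate()

# Each line arriving on the socket is assumed to look like "sensor_id,value".
lines = (spark.readStream
         .format("socket")
         .option("host", "localhost")
         .option("port", 9999)
         .load())

events = lines.selectExpr("split(value, ',')[0] as sensor_id",
                          "current_timestamp() as event_time")

counts = (events
          .groupBy(window(col("event_time"), "1 minute"), col("sensor_id"))
          .count())

# Print running counts to the console as each micro-batch is processed.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```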

Now is the time to move beyond just collecting, storing and managing data, and to take rapid action on continuously streaming data – in real time!

Read more…
