According to Cisco, there are currently 10 billion things – phones, PCs and other devices – connected to the Internet. That is merely six-tenths of one percent of the devices and things that exist right now. There are over one trillion devices out there right this very minute that are not talking to the Internet – but soon enough they will be.
Kevin Ashton, cofounder and executive director of the Auto-ID Center at MIT, first mentioned the Internet of Things in a presentation he made to Procter & Gamble in 1999. Here’s how Ashton explains the potential of the Internet of Things:
“Today computers -- and, therefore, the Internet -- are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes (a petabyte is 1,024 terabytes) of data available on the Internet were first captured and created by human beings by typing, pressing a record button, taking a digital picture or scanning a bar code.
The problem is, people have limited time, attention and accuracy -- all of which means they are not very good at capturing data about things in the real world. If we had computers that knew everything there was to know about things -- using data they gathered without any help from us -- we would be able to track and count everything and greatly reduce waste, loss and cost. We would know when things needed replacing, repairing or recalling and whether they were fresh or past their best.”
The broadband divide could prove to be a real hampering force to the Internet of Things movement that is gaining speed today. Cloud, mobility and big data are all converging into a seamless network, but the success of this convergence depends heavily on the ability to actually move and access the data. And considering that millions of additional devices (some of which are just sensors) will enter the equation, it's time for further investment, and quickly. According to the CIO Survey, organizations are in a prime position to innovate and make significant changes.
CONNECT ANY THING OVER ANY NETWORK
The Internet of Things (IoT) is a computing concept that describes a future where everyday physical objects will be connected to the Internet and be able to identify themselves to other devices. It is significant because an object that can represent itself digitally becomes something greater than the object by itself. No longer does the object relate just to you; it is now connected to surrounding objects and databases. When many objects act in unison, they are known as having "ambient intelligence."
Business Model focusing more on Data
In other words, as the physical and digital worlds integrate more closely with each other, and the number of connected devices is predicted to reach 25 billion by 2018, the IoT will enhance and evolve our ability to manage and process information. It’s a more context-oriented world, because there is better data.

When a technology is new, people first do all the obvious things that look like the old market, only more efficiently. On the early Internet, GNN ran web ads that looked like old newspaper ads. Later came Google search, which was a different way of doing advertising, because it focused more on data. Now we’ve got social search and social networks. The business model moves toward something that is more native to the technology. Uber is an early IoT company, and other businesses will pop up that do more native things. Much of what is available today, though, consists of components that require highly specialized knowledge and skills to make use of.

The Internet of Things and its partner in crime, big data, can also impact society at a much higher level. By enabling better decision making through a better understanding of data, we can tackle socioeconomic issues like poverty and disease, education, and quality of life around the world. You know that soccer ball that generates electricity (an awesome invention, btw)? The IoT is the next exponent up.
IoT focus on what matters most to you
The Internet of Things is not a futuristic, aspirational technology trend. It’s here today in the devices, sensors, cloud infrastructure, and data and business intelligence tools you are already using. Rather than thinking about the Internet of Things in terms of everything – such as billions of devices and sensors – focus on what matters most to you. Instead of thinking about the massive amount of data being produced, think about how one piece of data can provide value to your business.

The DIY Maker community has its Arduino and Raspberry Pi boards for creating educational experiments, but even those require a bit of study to make sense of. The only project I know of that seems to be pointing toward making IoT available as a platform for anyone to create with is TOI (thingsoninternet.biz) and its VIPER platform. It is a set of open components available from many sources, and it offers Python as the programming language. Python was created to be an easy programming language to learn, but until VIPER it was not suitable for embedded devices. Look for this interesting product on Kickstarter and use it to point a direction for the rest of the industry.
That said, the notion of “The Internet of things” is something unstoppable. More and more devices will become Internet enabled, not less. What needs to be addressed is rock-solid security (logical and physical) combined with privacy laws and policies. At the same time, a comprehensive set of government acts, laws, and regulatory frameworks and technical standards needs to be developed to harness the potential of new models of interactions among the machines and people.
By capturing real-time inventory data from vending machines, smart shelves and other instrumented sources of retail data, you learn customer preferences, letting you quickly adjust your product mix to increase sales.
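As a minimal sketch of that idea, the snippet below ranks products by popularity from a stream of instrumented sales events. The event fields and product names are invented for illustration; a real deployment would read from the vending machines' telemetry feed.

```python
# Hypothetical sketch: ranking products from instrumented vending-machine
# sales events so the product mix can be adjusted. Field names are invented.
from collections import Counter

def rank_products(sales_events):
    """Count sales per product and return products ordered by popularity."""
    counts = Counter(event["product"] for event in sales_events)
    return [product for product, _ in counts.most_common()]

events = [
    {"machine": "A1", "product": "water"},
    {"machine": "A1", "product": "cola"},
    {"machine": "B2", "product": "water"},
    {"machine": "B2", "product": "water"},
]
print(rank_products(events))  # ['water', 'cola']
```

The same aggregation, run per machine or per shelf, is what would drive the product-mix decision.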
Every week, thousands of new apps hit the mobile market. Unfortunately, the number of hackers working assiduously to tap into these apps to implant malware or phish for user information is also on the increase. By implication, there is every need to take the security of mobile users very seriously, particularly when it comes to app development.
Apart from being highly vigilant about security, app developers need to be able to identify these security issues and know how to avoid them, so as to be able to provide users with the security they need to keep their information and other data safe. Security issues can be experienced in various forms during any mobile application development process; some of which are explained below.
Failure to implement secure communications to servers
Most apps are designed to connect back to a server, particularly those applications that control sensitive user information. Therefore, as a critical area of concern, mobile app developers must ensure safe transit between the app and the server: nothing should travel unprotected where it can be intercepted on an insecure WiFi connection. Basically, this type of security is achieved through SSL certificates and encryption. User information can be compromised particularly if developers fail to employ the right SSL libraries.
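A minimal sketch of the server-side of that habit, using Python's standard `ssl` module: the default context already verifies certificates and hostnames, and the common mistake is switching those checks off to silence errors during development.

```python
# Sketch: enforcing certificate verification when an app talks to its server.
# ssl.create_default_context() verifies certificates and hostnames by default;
# the point is never to disable those checks in production code.
import socket
import ssl

def make_tls_context():
    """Context that verifies server certificates and hostnames (the default)."""
    context = ssl.create_default_context()
    # Never do: context.check_hostname = False
    # Never do: context.verify_mode = ssl.CERT_NONE
    return context

def fetch_securely(host, port=443):
    """Open a verified TLS connection and report the negotiated version."""
    with socket.create_connection((host, port)) as sock:
        with make_tls_context().wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()
```

On a mobile platform the equivalent is using the platform's default TLS stack (or a vetted SSL library) rather than hand-rolled trust logic.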
Inability to plan for physical security breaches
Nothing much can be done to prevent theft or loss of mobile devices. In fact, mobile app developers have very little role to play in this. However, they can greatly help to minimize the problem by implementing a local session timeout. Usually, users are obligated to enter a password from time to time to access an app. Rather than making this a daily occurrence, the password requirement can be enforced once a week, or after every fifth use of the app. A local session timeout can also prevent the use of software that helps users remember passwords.
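A local session timeout reduces to one comparison: how long ago was the user last authenticated? The sketch below uses the one-week window mentioned above; the constant is an assumption you would tune per app.

```python
# Sketch of a local session timeout: re-authentication is demanded only after
# a chosen inactivity window, not on every launch.
import time

SESSION_TIMEOUT = 7 * 24 * 3600  # one week in seconds (assumed policy)

def needs_reauth(last_auth_time, now=None):
    """True when the stored session is older than the timeout window."""
    now = time.time() if now is None else now
    return (now - last_auth_time) >= SESSION_TIMEOUT

# A session authenticated 8 days ago has expired; one from an hour ago has not.
print(needs_reauth(0, now=8 * 24 * 3600))  # True
print(needs_reauth(0, now=3600))           # False
```

The stored `last_auth_time` should live in the platform's secure storage, not in a plain preferences file.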
The use of weak encryption or an entire lack of encryption
Obviously, computing power improves constantly, which makes older algorithms obsolete and easy to crack. Failing to use encryption, or using weak encryption, in an app can put sensitive user information at risk of exposure. In the course of using certain apps, users are obligated to input sensitive data like personal identification information or credit card numbers. It is sad to know that this information can be stolen, particularly in the absence of good encryption. An app is more likely to be attacked as it becomes more popular. So, if you are looking to push your app to the top, there is every need to invest in good encryption.
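The weak-versus-strong contrast is easiest to see with password storage. The sketch below, using only Python's standard library, puts an obsolete fast hash next to a deliberately slow key-derivation function: the salt and high iteration count are what make each attacker guess expensive.

```python
# Contrast: an obsolete fast hash vs. a slow, salted key-derivation function.
# MD5 can be brute-forced cheaply; PBKDF2 with a random salt and a high
# iteration count makes every guess expensive for an attacker.
import hashlib
import os

def weak_hash(password):
    """Don't do this: fast, unsalted, trivially brute-forced."""
    return hashlib.md5(password.encode()).hexdigest()

def strong_hash(password, salt=None, iterations=600_000):
    """Salted PBKDF2-HMAC-SHA256; store (salt, digest, iterations) together."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

# Verification re-derives with the stored salt and compares digests.
salt, digest = strong_hash("hunter2")
assert strong_hash("hunter2", salt)[1] == digest
```

The 600,000-iteration figure is an assumption in line with current guidance for PBKDF2-SHA256; for data at rest (rather than passwords), a vetted authenticated-encryption library is the right tool.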
Bypassing systematic security testing
Most importantly, app developers need to consider themselves the last line of defense. You put your app's users at risk when you fail to ensure a secure app. In every development process, testing is very important, and as such there is no need to rush to release an app. Be sure to test every common inlet for security issues, such as sensors, GPS, the camera, and even the development platform. Viruses and malware are no respecters of apps – every app is vulnerable to attack.
Developers should do as much as possible to avoid leaking crash and debug logs during testing; these are common places hackers take advantage of to find app vulnerabilities. Apart from increasing the speed of an app, disabling NSLog statements during iPhone app development helps avoid such vulnerabilities on iOS. Likewise, an Android app remains vulnerable until the Android debug log is cleared.
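The same discipline, expressed in Python terms as a rough analogy to stripping NSLog calls from a release build: route diagnostics through a logger and silence debug output outside debug builds, so sensitive values never reach production logs. `DEBUG_BUILD` is an assumed flag your build configuration would set.

```python
# Analogy (not iOS code): suppress debug-level diagnostics in release builds
# so sensitive values never land in production logs.
import logging

DEBUG_BUILD = False  # assumption: set by your build configuration

logging.basicConfig(level=logging.DEBUG if DEBUG_BUILD else logging.WARNING)
log = logging.getLogger("app")

log.debug("session token: %s", "abc123")  # suppressed in release builds
log.warning("login failed")               # still reported
```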
Lack of proper planning for data caching vulnerabilities
Unlike standard laptops and desktops, mobile devices are well known for their ability to store short-term information for longer periods. This caching generally helps to increase speed. However, since hackers can easily access cached information, it is entirely possible for mobile devices to be susceptible to security breaches. A major way of avoiding the problem is to require a password to use the app. However, this can affect the popularity of your app, as most users find passwords quite inconvenient. Alternatively, you can program the cache to be erased automatically every time users reboot their mobile device – another meaningful solution to data caching vulnerabilities.
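A sketch of that cache-clearing alternative: sensitive entries live in a session-scoped store whose wipe method is hooked to the reboot or app-restart event, so cached data never outlives a session. The class and method names are invented for illustration.

```python
# Sketch: a session-scoped cache that is wiped on reboot/app restart, so
# sensitive cached values never persist beyond a session.
class SessionCache:
    def __init__(self):
        self._store = {}

    def put(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)

    def wipe(self):
        """Hook this to the reboot/restart event to drop all cached data."""
        self._store.clear()

cache = SessionCache()
cache.put("card_last4", "4242")
cache.wipe()                    # simulated reboot
print(cache.get("card_last4"))  # None
```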
Adopting other developers’ code
Developing an app from scratch can be very time-consuming, but the availability of numerous free code samples has greatly simplified the process. Interestingly, some hackers publish anonymous code in the hope that unsuspecting application developers will pick it up. Through this, they gain easy, free access to whatever information they want once the app has been designed and released.
Although it is never a bad thing to build upon other people's ideas, it is highly essential to carry out relevant research before doing so. To avoid security issues, it is advisable to make use of code from reliable sources. So, if you're looking to build upon the work of a third party, use sources you can trust. As a matter of fact, always use verified and trusted sources for code, and be on the lookout for phishing scams by reading the code line by line.
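One mechanical habit that supports this: pin a checksum for any third-party file you adopt and verify it before use. In the sketch below the "published" digest is computed in-line for the demo; in practice it would come from the vendor's signed release notes.

```python
# Sketch: verify a pinned SHA-256 checksum before adopting third-party code.
# Here the expected digest is computed in-line for demonstration; in practice
# it comes from a source you trust, out-of-band from the download itself.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_download(data: bytes, expected_digest: str) -> bool:
    """True only when the bytes match the pinned digest exactly."""
    return sha256_of(data) == expected_digest

snippet = b"def helper():\n    return 42\n"
pinned = sha256_of(snippet)  # stand-in for the vendor-published digest

print(verify_download(snippet, pinned))                    # True
print(verify_download(snippet + b"# tampered\n", pinned))  # False
```

Checksums catch tampering in transit; they do not replace reading the code line by line, since a malicious snippet hashes just as cleanly as an honest one.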
Slow patching of app
Just because your app has been launched does not mean you are done with the development process. Hackers are always on the move; they do not relent in their efforts to break into an app, and they work very fast. Often they search for apps with irregular security updates, then exploit those security gaps to bring the app down. It pays to perform regular security updates by revisiting the app often.
However, users may be unable to get these patches on time, because they have to accept and download them. Additionally, the approval process for a patch on the iOS platform can take up to a week, so patches can take a while to reach users. To this end, you can put user information at risk if you fail to stay right on top of new security updates.
When it comes to creating apps that deal with confidential matters such as personal information and customer credit cards, there is no room for error. For any app developer, the repercussions of even the smallest security breach can be catastrophic. As a matter of fact, it is your duty to protect both your app and its users. So take all necessary precautions so as not to get caught unawares.
Antarctica occupies a unique place in the human exploration mythos. The vast expanse of uninhabitable land, twice the size of Australia, has birthed legendary stories of human perseverance and cautionary tales about the indomitable force of nature. However, since those early years, Antarctica has become a rich research center for all different kinds of data collection – from climate change, to biology, to seismic activity and more. And although today there are many organizations with field stations running this data collection, the nature of its, well, nature still presents daily challenges that technology has had a hand in helping address.
Can You Send Data Through Snow?
British Antarctic Survey (BAS) – of recent Boaty McBoatface fame – has been entrenched in this brutal region for over 60 years. The BAS endeavors to gather data on the polar environment and search for indicators of global change. Its studies of sediments, ice cores, meteorites, the polar atmosphere and ever-changing ice shelves are vitally important and help predict the global climate of the future. Indeed, the BAS is one of the most essential research institutions in the world.
In addition to two research ships, five aircraft and five research stations, the BAS relies on state of the art data gathering equipment to complete its mission. From GPS equipment to motion and atmospheric sensors, the BAS deploys only the most precise and reliable equipment available to generate data. Reliable equipment is vital because of the exceedingly high cost of shipping and repair in such a remote place.
To collect this data, BAS required a network that could reliably transmit it in what could be considered one of the harshest environments on the planet. This means deploying GPS equipment, motion and atmospheric sensors, radios and more that could stand up to the daily tests.
In order to collect and transport the data in this harsh environment, BAS needed a ruggedized solution that could handle the freezing temperatures (-58 degrees F in the winter), strong winds and snow accumulation. Additionally, the solution needed to be low power due to the region’s lack of power infrastructure.
Halley VI Research Station is a highly advanced platform for global earth, atmospheric and space weather observation. Built on a floating ice shelf in the Weddell Sea, Halley VI is the world’s first re-locatable research facility. It provides scientists with state-of-the-art laboratories and living accommodation, enabling them to study pressing global problems from climate change and sea-level rise to space weather and the ozone hole (Source: BAS website).
The BAS monitors the movement of Brunt Ice Shelf around Halley VI using highly accurate remote field site GPS installations. It employs FreeWave radios at these locations to transmit data from the field sites back to a collection point on the base.
Once there, the data undergoes postprocessing and is sent back to Cambridge, England for analysis. Below are a Google Maps representation of the location of the Halley VI Research Station and a satellite image (from 2011) showing the first 9 remote GPS systems in relation to Halley VI.
Data transport and collection at Halley VI requires highly ruggedized, yet precise and reliable wireless communication systems to be successful. Antarctica is the highest, driest, windiest and coldest region on earth, and environmental conditions are extremely harsh year round. Temperatures can drop below -50°C (-58°F) during the winter months.
Winds are predominantly from the east. Strong winds usually pick up the dusty surface snow, reducing visibility to a few meters. Approximately 1.2 meters of snow accumulates each year on the Brunt Ice Shelf and buildings on the surface become covered and eventually crushed by snow.
This part of the ice shelf is also moving westward by approximately 700 meters per year. There is 24-hour darkness for 105 days per year when Halley VI is completely isolated from the outside world by the surrounding sea ice (Source: BAS Website).
Additionally, the components of the wireless ecosystem need to be low power due to the region’s obvious lack of power infrastructure. These field site systems have been designed from ‘off the shelf’ available parts that have been integrated and ‘winterized’ by BAS for Antarctic deployment.
The BAS turned to wireless data radios from FreeWave that ensure uptime and that can transport data over ice – typically a hindrance to RF communications. Currently, the network consists of 19 FreeWave 900 MHz radios, each connected to a remote GPS station containing sensors that track the movement of the Brunt Ice Shelf near the Halley VI Research Station.
The highly advanced GPS sensors accurately determine the Shelf’s position and dynamics, before reporting this back to a base station at Halley VI. Throughput consists of a 200 kilobit file over 12 minutes, and the longest range between a field site and the research station is approximately 30 kilometers.
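As a back-of-the-envelope check of the figures above, a 200-kilobit file delivered over 12 minutes works out to a very modest average bit rate:

```python
# Average throughput implied by the numbers in the text:
# a 200-kilobit file delivered over a 12-minute window.
file_bits = 200 * 1000      # 200 kilobits
window_seconds = 12 * 60    # 12 minutes

bits_per_second = file_bits / window_seconds
print(round(bits_per_second))  # 278 bit/s
```

At roughly 278 bit/s, the link is slow by consumer standards but entirely adequate for periodic GPS position reports, which is what makes low-power, long-range 900 MHz radio a sensible fit.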
Deployment of the GPS field site is done by teams of 3-4 staff using a combination of sledges and skidoo, or Twin Otter aircraft, depending on the distance and the abundance of ice features such as crevassing. As such, wireless equipment needed to be lightweight and easy to install and configure because of obvious human and material resource constraints.
In addition, the solution has to revolve around low power consumption. FreeWave radios have more than two decades of military application and many of the technical advancements made in collaboration with its military partners have led to innovations around low power consumption and improved field performance. The below image shows an example of a BAS remote GPS site, powered by a combination of batteries, a solar panel and a wind turbine (penguin not included).
FreeWave Technologies has been a supplier to the BAS for nearly a decade and has provided a reliable wireless IoT network in spite of nearly year-round brutal weather conditions. To learn more, visit: http://www.freewave.com/technology/.
One of the main attractions of automated analytics appears to be the perception that it represents an automated process able to learn from data without the need to program any rules. Furthermore, it is perceived that the IOT will allow organisations to apply analytics to data generated by any physical asset or business process, and thereafter to use automated analytics to monitor asset performance, detect anomalies and generate problem resolution / trouble-shooting advice – all without any programming of rules!
In reality, automated analytics is a powerful technology for turning data into actionable insight / knowledge and thereby represents a key enabling technology for automation in Industrial IOT. However, automated analytics alone cannot deliver complete solutions for the following reasons:
i- In order for analytics to learn effectively it needs data that spans the spectrum of normal, sub-normal and anomalous asset/process behaviour. Such data can become available relatively quickly in a scenario where there are tens or hundreds of thousands of similar assets (central heating boilers, mobile phones etc.). However, this is not the case for more complex equipment / plants / processes, where the volume of available fault or anomalous-behaviour data is simply not large enough to facilitate effective analytics learning/modelling. As a result, any generated automated analytics will be very restricted in scope and will flag a large number of anomalies for operating conditions that simply do not exist in the data.
ii- By focussing on data analytics alone we are ignoring the most important asset of any organisation; namely the expertise of its people in how to operate plants / processes. This expertise covers condition / risk assessment, planning, configuration, diagnostics, trouble-shooting and other skills that can involve decision making tasks. Automating decision making and applying it to streaming real-time IOT data offers huge business benefits and is very complementary to automated analytics, in that it addresses the very areas in point (i) above where data coverage is incomplete but human expertise exists.
Capturing expertise into an automated decision making system does require the programming of rules and decisions, but that need not be lengthy or cumbersome in a modern rules/decision automation technology such as Xpertrule. Decision making tasks can be represented graphically, so that a subject matter expert can easily author and maintain them without the involvement of a programmer. This can be done using graphical, easy-to-edit decision flows, decision trees, decision tables and rules. From my experience with this approach, a substantial decision making task of tens of decision trees can be captured and deployed within a few weeks.
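To make the idea concrete, here is a minimal sketch of an expert-authored fault-diagnosis tree held as plain data, so a subject matter expert can read and edit the branches directly. The thresholds, units and fault names are invented for illustration and are not taken from Xpertrule or from the mill application described later.

```python
# Sketch: an expert-authored decision tree kept as readable data.
# Thresholds, units and fault labels are invented for illustration.
FAULT_TREE = {
    "question": lambda m: m["particle_size"] > 90,     # microns (assumed)
    "yes": {
        "question": lambda m: m["mill_speed"] < 1200,  # rpm (assumed)
        "yes": "Fault: mill speed too low for target grind",
        "no": "Fault: worn classifier wheel suspected",
    },
    "no": "Normal operation",
}

def diagnose(node, measurements):
    """Walk the tree until a leaf (a string diagnosis) is reached."""
    while isinstance(node, dict):
        node = node["yes"] if node["question"](measurements) else node["no"]
    return node

print(diagnose(FAULT_TREE, {"particle_size": 95, "mill_speed": 1000}))
# Fault: mill speed too low for target grind
```

A graphical authoring tool presents exactly this structure as editable boxes and arrows; the point is that each branch is a statement an expert can verify, not an opaque model weight.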
Given the complementary nature of automated analytics and automated decisions, I would recommend the use of symbolic learning data analytics techniques. Symbolic analytics generates rules/tree structures from data which are interpretable and understandable to domain experts. Whilst rules/tree analytics models are marginally less accurate than deep learning or other ‘black box’ models, the transparency of symbolic data models offers a number of advantages:
i- The analytics models can be validated by the domain experts
ii- The domain experts can add additional decision knowledge to the analytics models
iii- The transparency of the data models gives the experts insights into the root causes of problems and highlights opportunities for performance improvement.
Combining automated knowledge from data analytics with automated decisions from domain experts can deliver a paradigm shift in the way organisations use IOT to manage their assets / processes. It allows organisations to deploy their best practice expertise 24/7 real time throughout the organisation and rapidly turn newly acquired data into new and improved knowledge.
Below are example decision and analytics knowledge from an industrial IOT solution that we developed for a major manufacturer of powder processing mills. The solution monitors the performance of the mills to diagnose problems and to detect anomalous behaviour:
The Fault diagnosis tree below is part of the knowledge captured from the subject matter experts within the company
The tree below is generated by automated data analytics and relates the output particle size to other process parameters and environmental variables. The tree is one of many analytics models used to monitor anomalous behaviour of the process.
The above example demonstrates both the complementary nature of rules and analytics automation and the interpretability of symbolic analytics. In my next posting I will cover the subject of the rapid capture of decision making expertise using decision structuring and the induction of decision trees from decision examples provided by subject matter experts.