There is a lot of talk, and, indeed, hype, these days about the internet of things. But what is often overlooked is that the internet of things is also an internet of shared services and shared data. What’s more, we are becoming too heavily reliant on public internet connectivity to underpin innovative new services.
Take this as an example. Back in April, Ford Motor Company, Starbucks and Amazon announced and demonstrated an alliance that would allow a consumer to use Alexa to order and pay for their usual coffee selection from their car. Simply saying, “Alexa: ask Starbucks to start my order,” would trigger the sequence of events required to enable you to drive to the pickup point and collect your already-paid-for coffee with no waiting in line.
Making that transaction happen behind the scenes involves a complex integration of the business processes of all the companies involved. Let’s be clear: this is about data protection. For this series of transactions to be successfully handled, they must be able to share customer payment data, manage identity and authentication, and match personal accounts to customer profiles.
Because all of that critical data can be manipulated, changed or stolen, cyberattacks pose significant data protection risks for nearly any entity anywhere. The ambition of some of these consumer innovations makes an assumption that the “secure” network underpinning this ecosystem for the transfer of all that valuable personal data is the public internet. And that’s the point – it’s not secure.
As we’ve talked about previously on Syniverse's blog Synergy, the public internet poses a systemic risk to businesses and to confidential data. In short, when we are dealing on a large scale with highly sensitive data, the level of protection available today for data that, at any point, touches the public internet is substantially inadequate.
And this alliance between Ford and Starbucks is just one example of the type of innovation, across many different industry and consumer sectors, that we can expect to see much more of in the very near future. These services will connect organizations that share data and information about businesses and about consumers – about their purchase history, their preferences and requirements, and their likely future needs. This is potentially a very convenient and desirable service from a consumer’s point of view, but at what cost?
We need security of connectivity, security from outside interference and the security of encrypted transfer and protection for our personal and financial data. And we need to be able to verify the protection of that data at all times by ensuring attribution and identity – both concepts we’ll explore more deeply in an upcoming blog post. And that’s a level of security that the public internet simply cannot provide.
Last month, an internet-based global ransomware attack took down systems and services all over the world – affecting sensitive personal healthcare data in the U.K. in particular.
Whether it is personal health records, financial records, data about the movement of freight in a supply chain, or variations in energy production and consumption, these are digital assets. Businesses, institutions and government bodies all over the world have billions of digital assets that must be constantly sent to and from different parties. And those assets require the type of high-level data protection that is not currently possible because of the systemic risk posed by the insecure public internet.
As mentioned in my last blog post on Synergy, there is an alternative. Some companies using private IP networks were able to carry on regardless throughout the high-profile cyberattacks that have been capturing headlines in the last year. That’s because those companies were not reliant on the public internet. Instead, they were all using what we are beginning to term “Triple-A” networks on which you can specify the speed and capacity of your Access to the network while guaranteeing the Availability of your connection. What’s more, on a Triple-A network, Attribution is securely controlled, so you know who and what is accessing your network and the level of authority granted both to the device accessing the network and to its user.
The public internet cannot provide or compete with a Triple-A level of security, and nor should we expect it to. It cannot live up to the stringent data protection requirements necessary for today’s critical digital assets. We cannot remain content that so much infrastructure, from banking, to transport and to power supplies, relies on a network with so many known vulnerabilities. And we must consider whether we want to carry on developing an industrial internet of things and consumer services on a public network.
We will continue to explore these issues on this blog, to highlight different approaches, and examine the requirements of the secure networks of the future. And in the process, we’ll take a look at the work being done to build more networks with a Triple-A approach.
At this point, everyone has heard about what big data analytics can do for marketing, research, and internal productivity. However, only about 20% of all data created is ever collected and analyzed. The other 80% is known as dark data: data that is collected but not analyzed or made searchable. So, what is the purpose of this data, and why is it taking up terabytes’ worth of storage space on servers around the world?
Examples of Dark Data
Media: Audio, video and image files oftentimes will not be indexed, making them difficult to gain insights from. Contents of these media files, such as the people in the recording or dialogue within a video, will remain locked within the file itself.
Social Data: Social media analytics have improved drastically over the last few years. However, data can only be gathered from a user’s point of entry to their exit point. If a potential customer follows a link on Facebook, then sends the visited website to five friends in a group chat, the firm will not realize its advertisement had six touchpoints, not just one.
Search Histories: For many companies, especially in the financial service, healthcare, and energy industries, regulations are a constant concern. As legal compliance standards change, firms worry that they will end up deleting something valuable.
As analytics and automation improve, more dark data is being dragged out into the light. AI, for example, is getting far better at speech recognition. This allows media files to be automatically tagged with metadata and audio files to be transcribed in real time. Social data is also starting to be tracked with far better accuracy. As a result, companies will be able to better understand their customers, their interests, and their buying habits, allowing marketers to create targeted ads based on a customer’s location that bring in more revenue while reducing cost.
The explosion of data we are currently seeing is only the tip of the big data iceberg. As IoT and wearable devices continue their integration into our daily lives, the amount of data we produce will only grow. Companies are looking to get ahead of the curve and ensure they can gain as much insight from this data as possible. If these firms do not have a plan to create actionable insights from this currently dark data, they ultimately could fall behind and lose out to competitors with a bigger focus on analytics.
The original story was published on the ELEKS Trends Blog; visit it to get more insights.
Let’s just say it: The public internet is great, but it’s an unfit, wide-open place to try to conduct confidential business.
More and more, the public nature of the internet is causing business and government leaders to lose sleep. The global ransomware attacks this year that crippled infrastructure and businesses across Europe clearly show the concern is not only justified but also growing.
As a result, internet and privacy regulations, like GDPR and PSD2, are front and center as governments around the world increasingly look at the web and how it’s being used. This is creating competing and contradictory objectives.
On the one hand, governments want to protect consumer privacy and data; on the other, they want to be able to monitor what certain folks are up to on the internet. And in both cases, they can at least claim to be looking to protect people.
Regardless of the difficulty of the task, there is no doubt the big governments are circling and considering their options.
Speaking in Mexico in June, German Chancellor Angela Merkel touted the need for global digital rules, like those that exist for financial markets, rules that would need to be enforceable through bodies like the World Trade Organization.
From a business perspective, I can applaud the ambition, but it does seem a little like trying to control the uncontrollable. The truth is that the public internet has come to resemble the old Wild West. It is an increasingly dangerous place to do business, with more than its fair share of rustlers, hustlers, and bandits to keep at bay.
The public internet connects the world and nearly all its citizens. When it comes to connecting businesses, national infrastructures, and governments themselves, trying to regulate the Wild West of the public internet simply isn’t an option. Instead, it’s time to take a step back and look for something different.
We believe organizations that want to conduct business, transfer data, monitor equipment and control operations globally – with certainty, security and privacy – should not be relying on the public internet. The sheer number of access points and endpoints creates an attack surface that is simply too wide to protect, especially with the increasing adoption of fog and edge networks that we’ve discussed in previous Syniverse blog posts.
Just last week, the online gaming store CEX was hacked. In an instant, around two million customers found their personal information and financial data had been exposed. Consumers in America, the U.K. and Australia are among those affected. As I said, the public internet presents an ever-widening attack surface.
Recently on the Syniverse blog, we’ve been talking about the need to develop private, closed networks where businesses, national utilities and governments can truly control not just access, but activity. Networks that are always on and ones where the owners always know who is on them and what they are doing. Networks that are private and built for an exact purpose, not public and adaptable.
Trying to apply or bolt on rules, regulations and security processes after the fact is never the best approach. Especially if you are trying to apply them to a service that is omnipresent and open to anybody 24/7.
When we look at the public internet, we see fake actors, state actors, hackers and fraudsters roaming relatively freely. We see an environment where the efforts to police that state might raise as many issues as they solve.
Instead, it’s time for global businesses to build a new world. It’s time to leave the old Wild West and settle somewhere safer. It’s time to circle the wagons around a network built for purpose. That is the future.
The best results will occur when technology and humans collaborate to create an entire ecosystem, which technology alone cannot achieve.
The other day we were debating the design of a solution to meet the sensing needs – access, temperature and humidity – of devices that form part of a networking infrastructure ecosystem. The idea was to build an IoT-based system for monitoring and control.
The design discussions centered on the ability to collect data from the sensors and on the types of short-range communication protocols that could be deployed. Questions were raised about whether it was compliant to use short-range communication protocols in sensitive areas such as customer data centres, whose operators may themselves be custodians of their end customers’ data.
The hidden perils of data acquisition and data ownership reared their heads and needed to be addressed as we moved forward.
The data acquired by sensors is essentially Machine Generated Data (MGD). This post will dwell on the ownership of MGD under the following headings:
- Sensors ( Data Acquisition and Communication )
- Machine Generated Data
- The Lifecycle of the MGD and the Ownership Paradigm
- Who should be the owner of the MGD?
- Sensors (Data Acquisition and Communication):
In the IoT ecosystem, the physical computing frontier is managed by sensors. Sensors essentially perform three fundamental functions:
- Sensing and acquiring the data
- Communicating the data through appropriate protocols, relaying readings to internet cloud services for further aggregation and trend analysis
- Drawing on a power supply to energize the above
Additional functions would include processing/system management and a user interface.
The Digital Computing part comprises the IoT application. This is determined by the types of sensors, cloud connectivity, power sources, and (optionally) user interface used in an IoT sensor device. The following diagram showcases the primacy of sensors in a typical IoT Ecosystem.
When making physical measurements such as temperature, strain, or pressure, we need a sensor to convert the physical properties into an electrical signal, usually voltage. Then, the signal must be converted to the proper amplitude and filtered for noise before being digitized, displayed, stored, or used to make a decision. Data-acquisition systems use ADCs (analog-to-digital converters) to digitize the signals with adequate signal conditioning.
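The signal chain described above – sense, condition, digitize, convert to engineering units – can be sketched in a few lines. This is an illustrative simulation, not vendor code: the amplifier gain, the 10-bit 3.3 V ADC, and the degrees-per-volt scaling are all assumptions chosen for the example.

```python
# Illustrative data-acquisition chain: a small analog reading is amplified
# into the ADC's input range, quantized by an ideal 10-bit ADC, then
# converted back to engineering units. All parameters are hypothetical.

def condition(voltage_v, gain=100.0):
    """Amplify a small sensor voltage (e.g. a thermocouple's millivolts)."""
    return voltage_v * gain

def adc_read(voltage_v, v_ref=3.3, bits=10):
    """Quantize a conditioned voltage with an ideal ADC (clamped to range)."""
    levels = 2 ** bits
    return int(max(0.0, min(voltage_v, v_ref)) / v_ref * (levels - 1))

def to_celsius(code, v_ref=3.3, bits=10, deg_per_volt=25.0):
    """Convert an ADC code to temperature using a hypothetical scaling."""
    volts = code / (2 ** bits - 1) * v_ref
    return volts * deg_per_volt

raw_mv = 0.012                      # 12 mV from the sensor
code = adc_read(condition(raw_mv))  # conditioned to 1.2 V, then digitized
print(code, round(to_celsius(code), 1))
```

In a real system the conditioning stage would also include filtering for noise before the ADC, as noted above; that step is omitted here for brevity.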
Sensor data communication to the cloud can be done in multiple ways, from wireline to wireless communication of various complexities. While wireline communication has some important benefits (such as reliability, privacy, and power delivery over the same wires), wireless communication is the key catalyst in the majority of IoT applications that were not previously practical with wired systems. Reliability, channel security, long range, low power consumption, ease of use, and low cost are now reaching levels previously thought infeasible.
Some examples of recently popular IoT wireless communication types: Wi-Fi, Bluetooth Low Energy (aka Smart), Zigbee (and other mesh 802.15.4 variants), cellular, LPWA (Low-Power, Wide-Area network variants: Ingenu, LoRaWAN, Sigfox, NB-LTE, Weightless), and Iridium satellite.
- Machine Generated Data (MGD):
Sensor data is an integral component of the increasingly real Internet of Things (IoT) environment. With IPv6, anything can be outfitted with a unique IP address and the capacity to transfer data over a network. Sensor data is essentially Machine Generated Data: MGD is data produced entirely by devices or machines through an event or observation.
By contrast, human-generated data records the direct result of human choices – buying on the web, making an inquiry, filling in a form, making payments, each with corresponding database updates. We will not consider the ownership of that data in this post and will limit ourselves to MGD.
- The Lifecycle of the MGD and the Ownership Paradigm:
Several distinct phases exist in the typical journey of Machine Generated Data:
- Capture and acquisition of data – a machine- or device-based function through signal reception.
- Processing and synthesis of the data – a function that ensures enrichment and integration of the data.
- Publication of the data – done by expert systems and analysts who work on exception management, triggers and trends.
- Usage of data – the end user acts on the processed and reported information.
- Archival and purging of data – essentially done by the data maintenance team under supervision.
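The lifecycle phases above can be sketched as a simple pipeline. The record fields, the site name, and the alert threshold below are hypothetical, purely to make each phase concrete.

```python
# A minimal sketch of the MGD lifecycle phases: capture, process, publish,
# use, archive. Field names and thresholds are illustrative assumptions.

def capture(raw_signal):
    """Phase 1: capture and acquisition - a device-side reading."""
    return {"value": raw_signal, "unit": "C"}

def process(record, site="DC-1"):
    """Phase 2: processing and synthesis - enrich and integrate the data."""
    return {**record, "site": site, "status": "enriched"}

def publish(record, threshold=30.0):
    """Phase 3: publication - flag exceptions for analysts."""
    record["alert"] = record["value"] > threshold
    return record

def use(record):
    """Phase 4: usage - the end user acts on the reported information."""
    return "escalate" if record["alert"] else "log"

def archive(store, record):
    """Phase 5: archival - retained by the data maintenance team."""
    store.append(record)
    return store

store = []
rec = publish(process(capture(34.5)))
action = use(rec)
archive(store, rec)
print(action, len(store))
```

Notice that a different party may control each function here (device owner, platform operator, analyst, end user, maintenance team), which is exactly why the ownership question below is hard.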
Now let us dwell on the ownership paradigms. They range from originating the data, to adding value to the data through transformation, to monetizing the data through the insights generated. Interestingly, let us explore whether there is any conclusive method for determining how ownership should be assigned. A number of players may be involved in the journey of the data (e.g. the user, the hardware manufacturer, the application developer, the provider of the database architecture and the purchaser of the data), each with an equal claim at different stages of that journey.
- Who should be the owner of the MGD?
Let me share the multiple and conflicting views:
- The owner of the device that records the data: In essence, the owner of machine-generated data (MGD) is the entity that holds title to the device that records the data. In other words, the entity that owns the IoT device also owns the data produced by that device.
But there could be a lack of clarity if the device is leased rather than owned. When real-world constructs such as leaseholds (of, say, servers) come into play, it gets complex and even murky.
- The user of the data is the owner: Another dimension is that data may be owned by one party and controlled by another. Possession of data does not necessarily equate to title: through possession there is control; title is ownership. Under so-called usage rights, each time data sets are copied, recopied and transmitted, control of the data follows them. There are also cases where the owner of the device is the user of the data.
- The maker of the database, who invests in aggregating, processing and making the data usable, is the owner of the data: This paradigm has a number of adherents. The owner of a smart thermostat does not, for example, own the data about how he uses it. The only thing that is ‘ownable’ is an aggregation or collection of such data, provided there has been a relevant investment in carrying out that aggregation or collection (the individual user is very unlikely to have made that investment). The owner here could be the home automation company. The value generated through this investment could include producing market intelligence and exploiting the insights from the data to build market presence and differentiation.
- The purchaser of the data could be the owner: An auto insurance company could buy vehicle-generated data (from the makers of automobiles) and design a product for targeted offerings to specific market segments based on, say, driving behaviour patterns and demographics. This may not be as easy as it seems – see http://joebarkai.com/who-owns-car-data/, which states that the owner of the vehicle, and not the maker of the car, owns the data collected from the electronic data recorder.
The value chain of who owns the data can be a complex one with multiple claimants, and as one aggregates more sources it only gets more complicated. A good example is the making of smart cities. The sources of data can come from multiple layers and operational areas, and city authorities make use of the data in areas such as waste management, traffic congestion and air pollution. So does the city authority own the data?
My personal take is that if someone in the MGD value chain makes the data usable for a larger good, and in the process monetizes it to cover the investment, that entity deserves to be the owner of the data, because that is where the value is generated.
Posted on August 14, 2017
It is digital or die. You are easy prey if you don’t change. Adopt the technologies that make the most sense for your business.
With its growing prevalence, the Internet of Things is ushering in a new form of ecommerce – the Commerce of Things, where everyday objects are internet connected and capable of initiating a series of purchases on their own. This new way of buying and selling online is radically changing traditional ecommerce rules and creating a new set of challenges for companies. In this new world of commerce, the product sale is no longer just a transaction; it’s the beginning of an ongoing relationship between brands and customers. Successful online brands are focused on nurturing this relationship – and taking deliberate steps to turn transactional customers into loyal members.
There is a subtle but critical difference between a repeat customer and a member. Understanding this difference is the key to succeeding in an environment that is swiftly becoming a hyper-connected network of consumers who value the access and amenities that come with membership.
How do you build these relationships?
1.) Create lasting relationships to make members out of customers. Members share the experience and the story of the brand, rather than just execute a basic business transaction or product purchase. For years, Disney, where everything is a show and employees are cast members, has stood by the adage “Be Our Guest,” calling to their customers in a more intimate, personable way. Cable companies refer to their customers as “subscribers;” LinkedIn has always called users “members.”
To move customers from “transaction to membership” on a relationship continuum, companies must provide extra, incremental value that replaces pure monetary benefits with more intangible rewards of being, in Disney’s case, a guest.
2.) Use data and metrics to strengthen relationships. Once a company starts to grow its base of members, a whole new set of metrics becomes the benchmark for evaluating the customer relationship.
Asking one simple question, “What is a subscriber’s actual usage?” can yield revelations regarding whether someone is a transactional customer or an invested member. For example, January is the peak season for signing new members at fitness centers around the country. Are those who sign up then really members? If they are not actually getting personal value out of their membership, then the relationship remains transactional and fleeting at best.
Good data is powerful. If the data shows customers are not acting like members, then a company can follow up to discern the true nature of the relationship and figure out how it can become more valuable to the customer. This creates a win for both the customer and the company.
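The "actual usage" question above can be made concrete with a tiny usage-based segmentation. This is a sketch under stated assumptions: the account IDs, visit counts, and the four-visits-per-month threshold are all hypothetical.

```python
# A hedged sketch: classify accounts as "member" vs "transactional" from
# monthly usage counts. The threshold is an illustrative assumption; a real
# program would tune it per business.

def classify(monthly_visits, active_threshold=4):
    """An account that uses the service regularly behaves like a member;
    one that signed up but rarely shows up is still transactional."""
    avg = sum(monthly_visits) / len(monthly_visits)
    return "member" if avg >= active_threshold else "transactional"

# Hypothetical January gym signups, with visits for Jan/Feb/Mar.
january_signups = {
    "a100": [12, 10, 9],   # keeps coming back after January
    "a101": [5, 1, 0],     # the classic January-only signup
}
segments = {acct: classify(visits) for acct, visits in january_signups.items()}
print(segments)
```

Accounts flagged as transactional are exactly the ones worth a follow-up, as described above, to learn how the relationship could become more valuable.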
Delta Air Lines’ SkyMiles program, for example, makes great use of data to cut through barriers that could otherwise prevent strong relationships from developing. When members call in, the automated phone system quickly recognizes callers based on their phone numbers, addresses them by name and asks about recent or upcoming trips.
Personalizing interactions, continually making improvements and utilizing customer insights are key in this new, Commerce of Things world. Taking these steps can help transform transactional customers into loyal members – and take an online business to the next level.
Manufacturers seek quantifiable ROI before making leap to IIoT implementation
By now, most manufacturers have heard of the promise of the Industrial Internet of Things (IIoT).
In this bold new future of manufacturing, newly installed sensors will collect previously unavailable data on equipment, parts, inventory and even personnel that will then be shared with existing systems in an interconnected “smart” system where machines learn from other machines and executives can analyze reports based on the accumulated data.
By doing so, manufacturers can stamp out inefficiencies, eliminate bottlenecks and ultimately streamline operations to become more competitive and profitable.
However, despite the tremendous potential, there is a palpable hesitation by some in the industry to jump into the deep end of the IIoT pool.
When asked, manufacturers say this hesitation stems from one primary concern: If we invest in IIoT, what specific ROI can we expect, and when? How will it streamline my process such that it translates into greater efficiencies and actual revenue in the short and long term?
Although it may come as a surprise, the potential return can actually be identified and quantified prior to any implementation. Furthermore, implementations can be scalable for those that want to start with “baby steps.”
In many cases, this is being facilitated by a new breed of managed service providers dedicated to IIoT that have the expertise to conduct in-plant evaluations that pinpoint a specific, achievable ROI.
These managed service providers can then implement and manage all aspects from end-to-end so manufacturers can focus on core competencies and not becoming IIoT experts. Like their IT counterparts, this can often be done on a monthly fee schedule that minimizes, or eliminates, up-front capital investment costs.
Despite all the fanfare for the Internet of Things, the truth is many manufacturers still have a less-than-complete understanding of what it is and how it applies to industry.
While it might appear complicated from the outside looking in, IIoT is merely a logical extension of the increasing automation and connectivity that has been a part of the plant environment for decades.
In fact, in some ways many of the component parts and pieces required already exist in a plant or are collected by more manual methods.
However, a core principle of the Industrial “Internet of Things” is to vastly supplement and improve upon the data collected through the integration of sensors in items such as products, equipment, and containers that are integral parts of the process.
In many cases, these sensors provide a tremendous wealth of critical information required to increase efficiency and streamline operations.
Armed with this new information, IIoT then seeks to facilitate machine-to-machine intelligence and interaction so that the system can learn to become more efficient based on the available data points and traffic patterns. In this way, the proverbial “left hand” now knows what the “right hand” is doing.
In addition, the mass of data collected can then be turned into reports that can be analyzed by top executives and operations personnel to provide further insights on ways to increase operational savings and revenue opportunities.
In manufacturing, the net result can impact quality control, predictive maintenance, supply chain traceability and efficiency, sustainable and green practices and even customer service.
BRINGING IT ALL TOGETHER
The difficulty, however, comes from bridging the gap between “here” and “there.”
Organizations need to do more than just collect data; it must be turned into actionable insights that increase productivity, generate savings, or uncover new income streams.
For Pacesetter, a national processor and distributor of flat rolled steel that operates processing facilities in Atlanta, Chicago and Houston, IIoT holds great promise.
“At Pacesetter, there are so many ways we can use sensors to streamline our operation,” says CEO Aviva Leebow Wolmer. “I believe we need to be constantly investigating new technologies and figuring out how to integrate them into our business.”
Pacesetter has always been a trendsetter in the industry. Despite offering a commodity product, the company often takes an active role in helping its customers identify ways to streamline operations as well.
The company is currently working with Industrial Intelligence, a managed service provider that offers full, turnkey end-to-end installed IIoT solutions, to install sensors in each of its facilities to increase efficiency by using dashboards that allow management to view information in real time.
“Having access to real-time data from the sensors and being able to log in and see it to figure out the answer to a problem or question so you can make a better decision – that type of access is incredible,” says Leebow Wolmer.
She also appreciates the perspective that an outsider can bring to the table.
“Industrial Intelligence is in so many different manufacturing plants in a given year and they see different things,” explains Leebow Wolmer. “They see what works, what doesn’t, and can provide a better overall solution not just from the IIoT perspective but even best practices.”
For Pacesetter, the move to IIoT has already yielded significant returns.
In a recently completed project, Industrial Intelligence installed sensors designed to track production schedules throughout the plant. The information revealed two bottlenecks: one in which coils were not immediately ready for processing – slowing production – and another where the skids on which they are placed for shipping were often not ready.
By making the status of both coil and skids available for real time monitoring and alerting key personnel when production slowed, Pacesetter was able to push the production schedule through the existing ERP system.
This increased productivity at the Atlanta plant by 30%. Similar implementations in the other two facilities yielded similar increases in productivity.
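The kind of bottleneck detection described above can be sketched from station timestamps alone: flag any stage whose average wait exceeds a threshold. The station names, coil IDs, and the 15-minute threshold below are hypothetical, not Pacesetter's actual configuration.

```python
# Illustrative bottleneck detection from production-tracking sensor events.
# Each event is (item_id, stage, ISO timestamp); a "wait" is the time between
# an item's arrival at consecutive stages. All data here is invented.

from datetime import datetime

def waits_by_stage(events):
    """Return average wait in minutes before each stage, per item history."""
    waits, last_seen = {}, {}
    for item, stage, ts in events:
        t = datetime.fromisoformat(ts)
        if item in last_seen:
            _, prev_t = last_seen[item]
            waits.setdefault(stage, []).append((t - prev_t).total_seconds() / 60)
        last_seen[item] = (stage, t)
    return {s: sum(w) / len(w) for s, w in waits.items()}

def bottlenecks(events, threshold_min=15.0):
    """Stages whose average wait exceeds the threshold."""
    return [s for s, avg in waits_by_stage(events).items() if avg > threshold_min]

events = [
    ("coil-1", "staging",  "2017-08-14T08:00:00"),
    ("coil-1", "slitter",  "2017-08-14T08:40:00"),  # waited 40 min
    ("coil-1", "skidding", "2017-08-14T08:50:00"),  # waited 10 min
    ("coil-2", "staging",  "2017-08-14T09:00:00"),
    ("coil-2", "slitter",  "2017-08-14T09:30:00"),  # waited 30 min
    ("coil-2", "skidding", "2017-08-14T09:38:00"),  # waited 8 min
]
print(bottlenecks(events))
```

In a live deployment these averages would feed a real-time dashboard and alerting, with the flagged stage pushed back into the ERP scheduling as described above.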
TAKING THE FIRST STEP
According to Darren Tessitore, COO of Industrial Intelligence, the process of examining the possible ROI begins with a factory walk-through by experts trained in manufacturing process improvement and IoT engineers who understand the back-end technologies.
A detailed analysis is then prepared, outlining the scope of the recommended IIoT implementation, exact areas and opportunities for improvement and the location of new sensors.
“The analysis gives us the ability to build the ROI,” says Tessitore. “We’re going to know exactly how much money this will make by making the changes. This takes much of the risk out of it so executives are not guessing how it might help.”
Once completed, a company like Industrial Intelligence can then provide a turnkey, end-to-end-solution.
According to Tessitore, this covers the entire gamut: all hardware and software, station monitors, etc.; the building of real-time alerts, reports & analytics; training management on how to use data points to increase profits; and even continuously monitoring and improving the system as needed.
“Unless you’re a huge company, you really don’t have somebody who can come in and guide you and create a cost effective solution to help you compete with the larger players in the space,” says Pacesetter’s Leebow Wolmer. “I think that’s what Industrial Intelligence offers that can’t be created on your own.”
“It’s not a one-size-fits-all approach,” she adds. “They have some things that can give you a little bit of IIoT or they can take an entire factory to a whole new level. By doing this they can be cost effective for a variety of sizes of organizations.”
For quite some time, the terms “machine learning” and “deep learning” have been seeping into business language, especially in relation to Artificial Intelligence (AI), analytics and Big Data. These approaches to AI hold great promise for creating self-teaching, autonomous systems that can revolutionize various industries.
What is Machine Learning (ML)?
Machine learning is a subfield of AI. The basic principle is that machines collect data and learn from it for themselves. It is arguably the most powerful tool in the business Artificial Intelligence kit. One of the interesting advantages of ML is that the training and knowledge gained from analyzing huge data sets can be applied to many functions – speech recognition, facial recognition, translation, object recognition and various other tasks – and excel at them.
Compared to hand-coding a software tool with specific instructions for completing a task, ML gives a system a way to discover patterns on its own and make the required predictions.
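The contrast between hand-coding rules and learning them can be shown with a toy sketch. The data and the "learner" below are invented for illustration: instead of a programmer hard-coding a cutoff value, the program derives one from labeled examples (a minimal stand-in for what real ML libraries do at far larger scale).

```python
# Toy illustration (made-up data): learn a decision rule from examples
# instead of hand-coding it.

def learn_threshold(examples):
    """Learn a decision boundary from (value, label) pairs.

    Uses the midpoint between the two class means -- a deliberately
    tiny stand-in for real machine learning training.
    """
    lo = [v for v, label in examples if label == 0]
    hi = [v for v, label in examples if label == 1]
    return (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2

def predict(threshold, value):
    """Classify a new value using the learned threshold."""
    return 1 if value >= threshold else 0

# Hypothetical sensor readings tagged "normal" (0) or "hot" (1)
training = [(10, 0), (12, 0), (14, 0), (30, 1), (32, 1), (34, 1)]
t = learn_threshold(training)          # midpoint of 12 and 32 -> 22.0
print(predict(t, 13), predict(t, 31))  # -> 0 1
```

The point is not the algorithm, which is trivial here, but the workflow: the rule comes from the data, so feeding in different data yields a different rule with no reprogramming.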
What is Deep Learning?
Deep learning is a subset of ML. It applies ML techniques to real-life problems using neural networks that loosely simulate human decision-making. Deep learning is expensive, however, and requires extensive data sets for training, because of the large number of parameters the learning algorithm must estimate. As a result, it can produce many false positives in the early stages of training.
To get a feel for this, consider how a deep learning algorithm learns what a cat looks like. A huge data set of pictures is used to learn the underlying details that separate a cat from similar animals such as a panther, cheetah or fox.
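A real image model is far beyond a blog snippet, but the learning principle can be sketched with a single artificial neuron trained by gradient descent. The feature values below (ear roundness, tail bushiness) and the labels are entirely invented; real deep learning stacks millions of such units and learns the features itself from raw pixels.

```python
import math

def sigmoid(z):
    """Squash a score into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=500, lr=0.5):
    """Train one neuron (logistic regression) by gradient descent."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y               # gradient of the log-loss
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

# Invented features: (ear roundness, tail bushiness); 1 = cat, 0 = fox
data = [((0.9, 0.2), 1), ((0.8, 0.3), 1), ((0.2, 0.9), 0), ((0.3, 0.8), 0)]
w, b = train(data)
p_cat = sigmoid(w[0] * 0.85 + w[1] * 0.25 + b)
print(p_cat)  # high probability for this cat-like input
```

Each of those weight updates is one "parameter estimate"; the cost of deep learning comes from repeating this over millions of parameters and examples, which is why the large data sets mentioned above are unavoidable.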
How Machine Learning and Deep Learning Affect Jobs
There is a doom-and-gloom hysteria surrounding machine learning and AI. Most of it is about people being put out of work, fueled by well-publicized cases where machines carried out specific job-related tasks and delivered impressive results.
Indeed, it has become a huge paranoia, but it turns out that machine learning performs tasks, not jobs. A job is made up of many tasks, and ML programs are not nearly that flexible.
That doesn’t mean machine learning and deep learning won’t affect your job; they already have, and will continue to do so. Most importantly, whether they are a benefit or a threat will depend on how you react once you recognize their impact.
There are plenty of reasons white-collar jobs are a prime target for deep learning and related technologies, and many experts believe the professional impact of AI, deep learning and other automation could drastically change workforce numbers.
In short, we are already seeing the changes machine learning and deep learning bring. They have sharply reduced the role of professionals who act as knowledge gatekeepers, and they have driven a positive shift from reactive toward proactive services.
Antarctica occupies a unique place in the human exploration mythos. The vast expanse of uninhabitable land, twice the size of Australia, has birthed legendary stories of human perseverance and cautionary tales about the indomitable force of nature. Since those early years, however, Antarctica has become a rich research center for all kinds of data collection – from climate change to biology to seismology and more. And although many organizations now run field stations collecting this data, the nature of its, well, nature still presents daily challenges that technology has had a hand in helping address.
Can You Send Data Through Snow?
British Antarctic Survey (BAS) – of recent Boaty McBoatface fame – has been entrenched in this brutal region for over 60 years. The BAS endeavors to gather data on the polar environment and search for indicators of global change. Its studies of sediments, ice cores, meteorites, the polar atmosphere and ever-changing ice shelves are vitally important and help predict the global climate of the future. Indeed, the BAS is one of the most essential research institutions in the world.
In addition to two research ships, five aircraft and five research stations, the BAS relies on state-of-the-art data-gathering equipment to complete its mission. From GPS equipment to motion and atmospheric sensors, the BAS deploys only the most precise and reliable equipment available to generate data. Reliability is vital because of the exceedingly high cost of shipping and repair in such a remote place.
To collect this data, BAS required a network that could reliably transmit it in what could be considered one of the harshest environments on the planet. That meant deploying GPS equipment, motion and atmospheric sensors, radios and more that could stand up to the daily tests.
In order to collect and transport the data in this harsh environment, BAS needed a ruggedized solution that could handle freezing temperatures (-58 degrees F in the winter), strong winds and snow accumulation. Additionally, the solution needed to be low power due to the region’s lack of power infrastructure.
Halley VI Research Station is a highly advanced platform for global earth, atmospheric and space weather observation. Built on a floating ice shelf in the Weddell Sea, Halley VI is the world’s first re-locatable research facility. It provides scientists with state-of-the-art laboratories and living accommodation, enabling them to study pressing global problems from climate change and sea-level rise to space weather and the ozone hole (Source: BAS website).
The BAS monitors the movement of Brunt Ice Shelf around Halley VI using highly accurate remote field site GPS installations. It employs FreeWave radios at these locations to transmit data from the field sites back to a collection point on the base.
Once there, the data undergoes postprocessing and is sent back to Cambridge, England, for analysis. Below, a Google Maps representation shows the location of the Halley VI Research Station, and a satellite image (from 2011) shows the first nine remote GPS systems in relation to Halley VI.
Data transport and collection at Halley VI requires highly ruggedized, yet precise and reliable wireless communication systems to be successful. Antarctica is the highest, driest, windiest and coldest region on Earth, and environmental conditions are extremely harsh year-round. Temperatures can drop below -50°C (-58°F) during the winter months.
Winds are predominantly from the east. Strong winds usually pick up the dusty surface snow, reducing visibility to a few meters. Approximately 1.2 meters of snow accumulates each year on the Brunt Ice Shelf and buildings on the surface become covered and eventually crushed by snow.
This part of the ice shelf is also moving westward by approximately 700 meters per year. There is 24-hour darkness for 105 days per year when Halley VI is completely isolated from the outside world by the surrounding sea ice (Source: BAS Website).
Additionally, the components of the wireless ecosystem need to be low power due to the region’s obvious lack of power infrastructure. These field site systems have been designed from ‘off the shelf’ available parts that have been integrated and ‘winterized’ by BAS for Antarctic deployment.
The BAS turned to wireless data radios from FreeWave that ensure uptime and that can transport data over ice – typically a hindrance to RF communications. Currently, the network consists of 19 FreeWave 900 MHz radios, each connected to a remote GPS station containing sensors that track the movement of the Brunt Ice Shelf near the Halley VI Research Station.
The highly advanced GPS sensors accurately determine the Shelf’s position and dynamics, before reporting this back to a base station at Halley VI. Throughput consists of a 200 kilobit file over 12 minutes, and the longest range between a field site and the research station is approximately 30 kilometers.
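Those figures imply a very modest effective data rate, which the back-of-the-envelope calculation below works through. The 200-kilobit file and 12-minute window come from the description above; everything else is simple arithmetic.

```python
# Effective data rate for a Halley VI GPS link, using the figures
# given in the text: a 200-kilobit file delivered over 12 minutes.
file_bits = 200 * 1000     # 200 kilobits
window_s = 12 * 60         # 12 minutes in seconds
rate_bps = file_bits / window_s
print(round(rate_bps, 1))  # -> 277.8 bits per second
```

A few hundred bits per second is tiny by consumer standards, but for periodic position reports over 30 km of ice, trading throughput for range, reliability and low power is the sensible design choice.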
Deployment of the GPS field site is done by teams of 3-4 staff using a combination of sledges and skidoo, or Twin Otter aircraft, depending on the distance and the abundance of ice features such as crevassing. As such, wireless equipment needed to be lightweight and easy to install and configure because of obvious human and material resource constraints.
In addition, the solution has to revolve around low power consumption. FreeWave radios have more than two decades of military application and many of the technical advancements made in collaboration with its military partners have led to innovations around low power consumption and improved field performance. The below image shows an example of a BAS remote GPS site, powered by a combination of batteries, a solar panel and a wind turbine (penguin not included).
FreeWave Technologies has been a supplier to the BAS for nearly a decade and has provided a reliable wireless IoT network in spite of nearly year-round brutal weather conditions. To learn more, visit: http://www.freewave.com/technology/.