The Internet of Things is slated to be one of the most disruptive technologies we’ve ever seen. It’s going to change a great deal - including how we look at and use the cloud.
Software-defined cars. Internet-connected ‘smart’ fridges, coffee machines, and televisions. Wearable technology like smartwatches and smartglasses. The Internet of Things is going to change everything from how we work to how we drive to how we live our lives. And as it does so, it’s also going to change the cloud.
It already is, actually.
Enter fog computing. It’s an extension of the cloud, born out of the fact that there are more Internet-connected devices in the world than ever before (by 2020, Gartner predicts, there will be 6.4 billion). Given this influx, the traditional, centralized model of the cloud is no longer viable.
“Today, there might be hundreds of connected devices in an office or data center,” writes Ahmed Banafa of Thoughts On Cloud. “In just a few years, that number could explode to thousands or tens of thousands, all connected and communicating. Most of the buzz around fog has a direct correlation with IoT. The fact that everything from cars to thermostats are gaining web intelligence means that direct user-end computing and communication may soon be more important than ever.”
It makes far more sense to move the real computing and processing closer to client devices - to carry out analysis at the network’s edge. See, the thing about the Internet of Things is that it depends on acting on data within very short timeframes. Even the slight delay introduced by routing everything through a distant data center is unacceptable.
Consider the following examples:
A self-driving car is communicating with the vehicles and traffic infrastructure around it, and analyzing traffic and weather conditions. While it may communicate with a central server, it needs to be able to analyze and aggregate data immediately, lest it cause an accident.
Autonomous tunneling and boring machines at a mining site ensure workers don’t have to subject themselves to hazardous underground conditions. These machines must be capable of analyzing and storing terabytes of data, as network connectivity hundreds of feet underground is near-impossible. They also must be able to communicate with other mining infrastructure, as well as a central server, uploading processed data to the cloud when mining is finished.
Sensors at an oil well must connect to a cloud server to provide headquarters with a real-time view of the facility. These sensors, however, must be capable of analyzing data on-site before it is uploaded.
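The pattern running through all three examples - analyze locally and immediately, upload only compact results to the cloud - can be sketched in a few lines. This is a hypothetical illustration, not any particular product’s API; the class name, threshold, and in-memory “uploads” list standing in for a cloud endpoint are all assumptions for the sake of the example.

```python
# Hypothetical sketch of the fog/edge pattern described above: a node
# analyzes raw sensor readings locally and ships only small summaries
# upstream, rather than streaming every sample to a central server.
from statistics import mean

class EdgeNode:
    """Buffers raw readings and uploads periodic summaries."""

    def __init__(self, window_size=10, alarm_threshold=100.0):
        self.window_size = window_size          # readings per summary
        self.alarm_threshold = alarm_threshold  # handled locally, instantly
        self.buffer = []
        self.uploads = []                       # stands in for a cloud API

    def ingest(self, reading):
        # Time-critical decisions happen at the edge, with no round trip.
        if reading > self.alarm_threshold:
            self.trigger_local_alarm(reading)
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            self.flush_to_cloud()

    def trigger_local_alarm(self, reading):
        print(f"ALARM: reading {reading} exceeded threshold")

    def flush_to_cloud(self):
        # Only a compact summary crosses the network, not every sample.
        summary = {
            "count": len(self.buffer),
            "mean": mean(self.buffer),
            "max": max(self.buffer),
        }
        self.uploads.append(summary)
        self.buffer = []

node = EdgeNode(window_size=5)
for r in [20.0, 22.0, 21.0, 19.0, 23.0]:
    node.ingest(r)
print(node.uploads)  # one compact summary instead of five raw readings
```

The key design choice is that the alarm path never touches the network at all - exactly the property the self-driving car and the boring machine depend on.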
In each of the examples above, distributed computing works together with a more traditional cloud model to better enable connected equipment and sensors. And that’s where the cloud slots in with IoT. It’s still the cloud - but it’s changed in order to adapt to new workflows, business processes, and an entirely new world.
“With the increase in data and cloud services utilization, fog computing will play a key role in helping reduce latency and improve the user experience,” writes Data Center Knowledge’s Bill Kleyman. “We are now truly distributing the data plane and pushing advanced services to the edge. By doing so, administrators are able to bring rich content to the user faster, more efficiently, and - very importantly - more economically.”
Photo credit: Mr. & Mrs. Gray
About the Author:
Tim Mullahy is the General Manager at Liberty Center One. Liberty Center One is a new breed of data center located in Royal Oak, MI. Liberty can host any customer solution regardless of space, power, or networking/bandwidth requirements.