Over the years, IoT has made its way into complex consumer markets and made millions of lives easier and smarter. Without a doubt, the industry holds enormous potential for upcoming entrepreneurs to introduce innovative solutions. In fact, the number of IoT start-ups grew by 27% between 2019 and mid-2020.
While many of these IoT projects have made the cut, others are struggling to realize the intended ROI. Tempting as it is, this remains a highly challenging space in which to build sustainable companies.
While a new organization is a collaborative effort of many people, it is the leaders who hold the vision and spearhead the transformation. For those setting out on their tech start-up journeys, here’s what you can learn from the best.
Targeting the right KPIs
The rule for achieving your KPIs is simple – never ignore them. Start-ups that planned around their KPIs were able to meet them quickly and smoothly. From product ideation to distributing budgets across marketing, development, customer acquisition and retention, the complete lifecycle should be evaluated periodically through metrics such as Customer Acquisition Cost (CAC), Customer Retention Rate (CRR) and Life-Time Value (LTV).
Customer Retention Rate (CRR) is the percentage of customers a business retains over a given period of time. High retention rates are a clear sign of a successful product and satisfied customers, while high attrition rates mean the opposite. Life-Time Value (LTV) is the net value of a customer to the business. When these metrics are evaluated in relation to each other, such as the LTV/CAC ratio, the overall capital efficiency of a company can be estimated.
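As a rough illustration, these metrics reduce to simple arithmetic. All figures below are hypothetical, chosen only to show how the numbers fit together:

```python
# Hypothetical figures for illustration only.
marketing_spend = 50_000            # total sales & marketing spend in a period
new_customers = 200                 # customers acquired in that period
cac = marketing_spend / new_customers   # Customer Acquisition Cost

customers_start = 1_000
customers_end = 1_050
# CRR excludes customers acquired during the period.
crr = (customers_end - new_customers) / customers_start * 100

monthly_revenue_per_customer = 40   # average revenue per customer per month
gross_margin = 0.7
avg_lifetime_months = 24
ltv = monthly_revenue_per_customer * gross_margin * avg_lifetime_months

print(f"CAC = ${cac:.0f}, CRR = {crr:.0f}%, LTV = ${ltv:.0f}")
print(f"LTV/CAC = {ltv / cac:.2f}")  # a ratio around 3 is often cited as healthy
```

With these sample numbers the LTV/CAC ratio comes out below 3, which would signal that acquisition spend is high relative to the value each customer brings.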
IoT enables you to take a step further in tracking KPIs:
- Track end usage using IoT and deduce usage analytics.
- Take user feedback at the point of usage using IoT and deduce user experience in real-time.
- Use remote device management to monitor the health of your IoT solution and run diagnostics to find and fix issues. This helps keep MTTR (Mean Time to Repair) as low as possible.
These advanced indicators can directly help you reduce expenses and increase revenue by improving customer experience.
Why are these important? Entrepreneurs who stayed focused on these KPIs have seen a 10x increase in business efficiency. This is an important takeaway for budding entrepreneurs who have to justify their investments periodically. Since CAC has risen by 50% over the past few years, not ignoring performance KPIs is the foremost lesson for every new leader.
Rapid adaptation to change
IoT is not the same as it was five years ago. In fact, it may no longer be the ‘new technology on the block’. It is continuously evolving, and start-ups have no choice but to keep experimenting with newer builds and processes. Whether it is embracing technologies such as edge computing or anonymizing data transfers, IoT products must keep upgrading. Likewise, project owners can improve their development process by formally collaborating with other companies – IoT is a complex mesh, and more hands help simplify it. So be it outsourcing resourcing requirements to a partner or outsourcing end-to-end product development, start-ups must weigh their choices and utilize the available expertise optimally.
A few entrepreneurs have resolved this complexity by walking the middle path. They sensed that the risk of not embracing change is greater than the risk of failing. Budding entrepreneurs must therefore understand that experimentation doesn’t have to replace existing processes. It can be an additional vertical committed to embracing contemporary product offerings or technologies.
Despite the world being restricted indoors due to Covid, the following tech entrepreneurs have brilliantly led their workforce and achieved impressive results.
CEO - Armis Security
Armis Security ventured into, attempted and mastered a market that most companies are scared of trying – IoT cybersecurity. Led by the hugely ambitious Yevgeny Dibrov, Armis is a security platform that discovers devices across the network, analyses their behavior and identifies risks. For an industry plagued by cybersecurity threats, Armis is a huge reassurance. The company has a line-up of customers across sectors such as healthcare, automotive, finance and manufacturing.
As the start-up approaches its fifth anniversary, CEO Yevgeny Dibrov says: “As companies accelerate their digital transformation initiatives, securely enable employees to work from home long-term, and adopt 5G, we are seeing an explosion of connected devices. At the same time, this uptick has increased the risk profile for businesses, especially around ransomware attacks, which is driving even more demand for our industry-leading agentless device security platform.”
CEO - Ioterra Inc.
When most start-ups were swaying in the hype of IoT, Ioterra foresaw the complications and immediately seized the opportunity to close a huge gap in the IoT ecosystem – the challenge of quickly sourcing reliable IoT service partners and other resources needed for a successful IoT initiative. Unlike other technology markets, IoT is a rare space whose sourcing complications span IoT services as well as solutions from all walks of technology – hardware, software and wireless communications. Besides delaying projects, sourcing difficulties lead to cost overheads. As an IoT consultant himself, Daniel, along with his team, created a digital marketplace that enables project owners to seek sourcing assistance based on their business model, type and sector.
Daniel says, “Startups are advised to ensure a minimum of 12-18 months of runway. The most important reasoning behind this thinking is that you would invariably pivot 2-3 times before you get it right and you need to survive until then. Unless you watch the KPIs regularly and quickly pivot adapting to what you see on the ground, you cannot build a growing startup”.
CEO - Helium
Technologies from all sectors and markets have started to embrace Web 3.0, and Helium is IoT’s big bet. It is a platform that empowers businesses to develop connectivity for devices and sensors over a peer-to-peer wireless network. CEO Amir Haleem, who has always been ambitious about wireless coverage for low-power IoT devices, aims to bring more projects onto the stage.
He says, “We’ve worked hard to bring native geo-location to everything that connects to the network. This opens up all sorts of interesting use cases that haven’t been seen yet, which have otherwise been impossible to build.”
What’s common to all of them? The ethos to grow
Ultimately, no start-up can grow without the mindset to win. Although most tech leaders ensure a learning culture within the organization, the motivation is often missing at the employee level. This largely happens when leaders don’t communicate their vision to the workforce and keep them restricted to task assignments. The ethos to grow has to be reflected at the individual level, and that’s the hack to organizational success that many don’t get right.
Moreover, missing KPIs and failing to retrospect on those misses together with your teams is a big flaw. In a startup environment where the team structure is mostly lean, entrepreneurs must share quarterly progress with everyone. Besides keeping everyone aligned on the expected outcomes, such sessions surface innovative ideas to achieve results more efficiently. Therefore, upcoming entrepreneurs should ensure a work culture that acknowledges creative input.
Motivated employees with a growth mindset, diligent tracking of KPIs and quick adaptability to change lay a solid foundation for success.
When I think about the things that held the planet together in 2020, it was digital experiences delivered over wireless connectivity that made remote things local.
While heroes like doctors, nurses, first responders, teachers, and other essential personnel bore the brunt of the COVID-19 response, billions of people around the world found themselves cut off from society. In order to keep people safe, we were physically isolated from each other. Far beyond the six feet of social distancing, most of humanity weathered the storm from their homes.
And then little by little, old things we took for granted, combined with new things many had never heard of, pulled the world together. Let’s take a look at the technologies and trends that made the biggest impact in 2020 and where they’re headed in 2021:
The global Internet infrastructure from which everything else is built is an undeniable hero of the pandemic. This highly-distributed network designed to withstand a nuclear attack performed admirably as usage by people, machines, critical infrastructure, hospitals, and businesses skyrocketed. Like the air we breathe, this primary facilitator of connected, digital experiences is indispensable to our modern society. Unfortunately, the Internet is also home to a growing cyberwar and security will be the biggest concern as we move into 2021 and beyond. It goes without saying that the Internet is one of the world’s most critical utilities along with water, electricity, and the farm-to-table supply chain of food.
People are mobile and they stay connected through their smartphones, tablets, in cars and airplanes, on laptops, and other devices. Just like the Internet, the cellular infrastructure has remained exceptionally resilient to enable communications and digital experiences delivered via native apps and the web. Indoor wireless connectivity continues to be dominated by WiFi at home and all those empty offices. Moving into 2021, the continued rollout of 5G around the world will give cellular endpoints dramatic increases in data capacity and WiFi-like speeds. Additionally, private 5G networks will challenge WiFi as a formidable indoor option, but WiFi 6E with increased capacity and speed won’t give up without a fight. All of these developments are good for consumers who need to stay connected from anywhere like never before.
With many people stuck at home in 2020, web conferencing technology took the place of traveling to other locations to meet people or receive education. This technology isn’t new and includes familiar players like GoToMeeting, Skype, WebEx, Google Hangouts/Meet, BlueJeans, FaceTime, and others. Before COVID, these platforms enjoyed success, but most people preferred to fly on airplanes to meet customers and attend conferences while students hopped on the bus to go to school. In 2020, “necessity is the mother of invention” took hold and the use of Zoom and Teams skyrocketed as airplanes sat on the ground while business offices and schools remained empty. These two platforms further increased their stickiness by increasing the number of visible people and adding features like breakout rooms to meet the demands of businesses, virtual conference organizers, and school teachers. Despite the rollout of the vaccine, COVID won’t be extinguished overnight and these platforms will remain strong through the first half of 2021 as organizations rethink where and when people work and learn. There are far too many players in this space, so look for some consolidation.
“Stay at home” orders and closed businesses gave e-commerce platforms a dramatic boost in 2020 as they took the place of shopping at stores or going to malls. Amazon soared to even higher heights, Walmart upped their game, Etsy brought the artsy, and thousands of Shopify sites delivered the goods. Speaking of delivery, the empty city streets became home to fleets of FedEx, Amazon, UPS, and DHL trucks bringing packages to your front doorstep. Many retail employees traded working at customer-facing stores for working in distribution centers, as long as they could outperform robots. Even though people are looking forward to hanging out at malls in 2021, the e-commerce, distribution center, delivery truck trinity is here to stay. This ball was already in motion and got a rocket boost from COVID. This market will stay hot in the first half of 2021 and then cool a bit in the second half.
The COVID pandemic really took a toll on restaurants in 2020, with many of them going out of business permanently. Those that survived had to pivot to digital and other ways of doing business. High-end steakhouses started making burgers on grills in the parking lot, while takeout pizzerias discovered they finally had the best business model. Having a drive-thru lane was definitely one of the keys to success in a world without waiters, busboys, and hosts. “Front of house” was shut down, but the “back of house” still had a pulse. Adding mobile web and native apps that allowed customers to easily order from operating “ghost kitchens” and pay with credit cards or Apple/Google/Samsung Pay enabled many restaurants to survive. A combination of curbside pickup and delivery from the likes of DoorDash, Uber Eats, Postmates, Instacart and Grubhub made this business model work. A surge in digital marketing also took place where many restaurants learned the importance of maintaining a relationship with their loyal customers via connected mobile devices. For the most part, 2021 has restaurateurs hoping for 100% in-person dining, but a new business model that looks a lot like catering + digital + physical delivery is something that has legs.
The Internet of Things
At its very essence, IoT is all about remotely knowing the state of a device or environmental system along with being able to remotely control some of those machines. COVID forced people to work, learn, and meet remotely and this same trend applied to the industrial world. The need to remotely operate industrial equipment or an entire “lights out” factory became an urgent imperative in order to keep workers safe. This is yet another case where the pandemic dramatically accelerated digital transformation. Connecting everything via APIs, modeling entities as digital twins, and having software bots bring everything to life with analytics has become an ROI game-changer for companies trying to survive in a free-falling economy. Despite massive employee layoffs and furloughs, jobs and tasks still have to be accomplished, and business leaders will look to IoT-fueled automation to keep their companies running and drive economic gains in 2021.
Closed movie theaters, football stadiums, bowling alleys, and other sources of entertainment left most people sitting at home watching TV in 2020. This turned into a dream come true for streaming entertainment companies like Netflix, Apple TV+, Disney+, HBO Max, Hulu, Amazon Prime Video, Youtube TV, and others. That said, Quibi and Facebook Watch didn’t make it. The idea of binge-watching shows during the weekend turned into binge-watching every season of every show almost every day. Delivering all these streams over the Internet via apps has made it easy to get hooked. Multiplayer video games fall in this category as well and represent an even larger market than the film industry. Gamers socially distanced as they played each other from their locked-down homes. The rise of cloud gaming combined with the rollout of low-latency 5G and Edge computing will give gamers true mobility in 2021. On the other hand, the video streaming market has too many players and looks ripe for consolidation in 2021 as people escape the living room once the vaccine is broadly deployed.
With doctors and nurses working around the clock as hospitals and clinics were stretched to the limit, it became increasingly difficult for non-COVID patients to receive the healthcare they needed. This unfortunate situation gave tele-medicine the shot in the arm (no pun intended) it needed. The combination of healthcare professionals delivering healthcare digitally over widespread connectivity helped those in need. This was especially important in rural areas that lacked the healthcare capacity of cities. Concurrently, the Internet of Things is making deeper inroads into delivering the health of a person to healthcare professionals via wearable technology. Connected healthcare has a bright future that will accelerate in 2021 as high-bandwidth 5G provides coverage to more of the population to facilitate virtual visits to the doctor from anywhere.
Working and Living
As companies and governments told their employees to work from home, it gave people time to rethink their living and working situation. Lots of people living in previously hip, urban, high-rise buildings found themselves residing in not-so-cool, hollowed-out ghost towns comprised of boarded-up windows and closed bars and cafés. Others began to question why they were living in areas with expensive real estate and high taxes when they no longer had to be close to the office. This led to a 2020 COVID exodus out of pricey apartments/condos downtown to cheaper homes in distant suburbs, as well as the move from pricey areas like Silicon Valley to cheaper destinations like Texas. Since you were stuck in your home, having a larger house with a home office, fast broadband, and a backyard became the most important thing. Looking ahead to 2021, a hybrid model of work-from-home plus occasionally going into the office is here to stay, as employees will no longer tolerate sitting in traffic two hours a day just to sit in a cubicle in a skyscraper. The digital transformation of how and where we work has truly accelerated.
Data and Advanced Analytics
Data has shown itself to be one of the world’s most important assets during the time of COVID. Petabytes of data have continuously streamed in from all over the world letting us know the number of cases, the growth or decline of infections, hospitalizations, contact-tracing, free ICU beds, temperature checks, deaths, and hotspots of infection. Some of this data has been reported manually while many other sources are fully automated from machines. Capturing, storing, organizing, modeling and analyzing this big data has elevated the importance of cloud and edge computing, global-scale databases, advanced analytics software, and machine learning. This is a trend that was already taking place in business and now has a giant spotlight on it due to its global importance. There’s no stopping the data + advanced analytics juggernaut in 2021 and beyond.
2020 was one of the worst years in human history and the loss of life was just heartbreaking. People, businesses, and our education system had to become resourceful to survive. This resourcefulness amplified the importance of delivering connected, digital experiences to make previously remote things into local ones. Cheers to 2021 and the hope for a brighter day for all of humanity.
In previous years, many Internet of Things (IoT) predictions have failed, and 2020 was no exception – this time with the virus outbreak as the justification.
In my article "2020 IoT Trends and Predictions: Be prepared for the IoT Tsunami", I wrote that we should be prepared for the Internet of Things (IoT) tsunami, "but it won't be in 2020". I didn't imagine the special circumstances of the year "MMXX". Today, I clearly see that Covid-19’s impact is difficult to ignore looking forward into 2021 and beyond. This pandemic is going to accelerate adoption in many industries that have been affected and will have to change how they operate.
The year 2020 has been significant in terms of the emergence of technologies, creating a much better space for IoT to flourish and grow.
I'm not going to make my own predictions this year. Instead, I have taken it upon myself, for my followers, to collect and publish the predictions of other recognized or enthusiastic voices of the IoT.
Here I summarize some of them. My advice is to keep relying on optimistic predictions as there are many Reasons to Believe in Internet of Things.
- Forrester - Predictions 2021: Technology Diversity Drives IoT Growth
- Network connectivity chaos will reign. We expect interest in satellite and other lower-power networking technologies to increase by 20% in the coming year.
- Connected device makers will double down on healthcare use cases. In 2021, proactive engagement using wearables and sensors to detect patients’ health at home will surge.
- Smart office initiatives will drive employee-experience transformation. We expect at least 80% of firms to develop comprehensive on-premises return-to-work office strategies that include IoT applications to enhance employee safety and improve resource efficiency.
- The near ubiquity of connected machines will finally disrupt traditional business. In 2021, field service firms and industrial OEMs will rush to keep up with customer demand for more connected assets and machines.
- Consumer and employee location data will be core to convenience. In 2021, brands must utilize location to generate convenience for consumers or employees with virtual queues, curbside pickup, and checking in for reservations.
- CRN - 5 Hot IoT Trends To Watch in 2021 And Beyond
- Changes In Real Estate Trends Will Push Smart Office Initiatives
- The Internet Of Behavior Is Coming To Your Workplace
- Location Data Will Become More Prominent
- This Year’s Pivot To Remote Operations Will Expand Connected Assets
- Connected Health Care Will Ramp Up In 2021
- The future of IoT: 5 major predictions for 2021, based on Forrester
- Techopedia - 6 IoT Predictions for 2021: What's Next? –
- An Increase in IoT Remote Workforce Management Products
- More IoT-Enabled Options for Smart Cities
- Improving Driving and Autonomous Vehicles
- The IoT Will Boost Predictive Maintenance
- The Connected Home over Internet Protocol (CHIP) Standard Will Become a Reality
- Market Enticements With Multipurpose Products
- Forbes - 5 IoT Trends To Watch In 2021
- Can You Turn Off Your Alexa? We'll likely see an increase in the security surrounding smart devices, including AI-driven, automated ability to scan networks for IoT devices.
- More Use Cases in More Industries - the IoT has the ability to mean big money for almost any industry.
- IoT Helping to Build Digital Twins - the IoT may be the perfect partner for the development of digital twins, for almost any application. Especially for things like construction, engineering, and architecture, that could mean huge cost and time savings.
- IoT and Data Analytics - the IoT is no longer just about monitoring behavior and spitting out data. It's about processing data quickly and making recommendations (or taking actions) based on those findings.
- Improving Data Processing at the Edge - With the confluence of 5G networks, an increase in IoT and IIoT devices, and a dramatic increase in the amount of data we are collecting, I don't see this trend going anywhere but up
- Security Today - By 2021, 36 billion IoT devices will be installed around the world.
- IoT Agenda - Mitch Maiman - Intelligent Product Solutions (IPS)- IoT predictions for 2021–
- Medical IoT
- Radio frequency services
- AI and augmented reality
- Electric vehicles
- Remote work
- IoT Agenda - Carmen Fontana, Institute of Electrical and Electronics Engineers - Top 5 IoT predictions for growing use cases in 2021
- Wearables will blur the line between consumer gadgets and medical devices
- Consumers will be more concerned about data privacy
- AI IoT products will be more accessible
- Digital twin adoption will explode due to increased remote work
- Edge computing will benefit from green energy investment
- IoT World Today - IoT Trends 2021: A Focus on Fundamentals, Not Nice-to-Haves. IoT trends in 2021 will focus on core needs such as health-and-safety efforts and equipment monitoring, but IoT in customer experience will also develop.
- Rockwell Automation Predictions for 2021
- IT/OT Integration is critical for answering the $77 billion need for IIoT
- Edge is the new cloud
- Digital twins save $1 trillion in manufacturing costs
- Pandemic promotes AR training as the new standard for a distributed workforce
- Automation accelerates employee advancement through human-machine interface
- Top 5 IoT Predictions For 2021: What the Future Holds
- Private IoT networks
- Digital health
- Cities would turn smarter
- Remote offices
- Improved location services
- online - 10 IoT Trends for 2020/2021: Latest Predictions According to Experts
- J2 Innovations - Smart Building, Equipment and IoT Trends for 2021 –
- Remote work and management
- Changing the way we work
- Flexible spaces
- Digital processes- 2021 will see ever more processes becoming digital.
- The convergence of IT and OT - The industry will continue to see a concerted push to integrate and leverage the vast amounts of valuable data derived from Operational Technologies (OT) into the Information Technology (IT) side of the enterprise
- A new kind of interoperability - A good example of that is The Web of Things (WoT), which is an open source standard being pioneered by Siemens
- Krakul - IoT trends to expect in 2021 - Cloud service providers are the most prominent vendors within the IoT space. 2021 will also see a rise in IoT development partnerships. Brands that require more than cloud transformation will need a hardware partner to ensure IoT devices perform to both consumer and business needs. Whether those IoT device applications will be used by consumers, businesses or industry, the common concerns shaping IoT solutions for 2021 include:
- user safety
- return on investment (ROI) for the business case
- Analysys Mason - predictions for business connectivity, communications, IoT and security in 2021 –
- A major mobile operator will buy one of the IoT market disruptors.
- A new deployment model for private LTE/5G networks will emerge – the public industrial network
- Private networks will become a topic for financial sponsors.
- TBR (Ezra Gottheil) - 2021 Devices & Commercial IoT Predictions
- AI in IoT will increasingly be encapsulated in specific functions like recognition and detection
- Conversational user interfaces, based on voice or typed communication, will play an increasing role in business Solutions
- The emergence of the Chief Data Officer role will increase organizational clarity, accelerating IoT adoption
- Predictions for Embedded Machine Learning for IoT in 2021
- From increasingly capable hardware to TinyML, embedded machine learning will make strides in 2021.
- More capable microcontrollers combined with on-device machine learning at the edge are poised to develop further in 2021. These developments will drive further advances in video surveillance, manufacturing and more.
- The impact of COVID-19 on the global supply chain, however, may stunt innovation and growth of embedded machine learning.
- Frost & Sullivan -Top 4 Growth Opportunities in the Internet of Things Industry for 2021
- Exponential growth of edge computing in public and private networks
- Convergence between IT and OT to drive end-user concerns on IIoT security, privacy, and data protection
- Emerging techs: convergence of IoT, AI, and blockchain
- The future of retail post COVID-19
Although the global pandemic has influenced product introduction timelines, causing some things to be fast-tracked while others lose priority, enterprises, consumers, and other stakeholders will continue to drive demand for new and improved Internet of Things applications, technologies, and solutions in 2021 across verticals and geographies.
IoT will continue to gain footholds, as people and enterprises become comfortable and familiar with the technology and it is incorporated into daily life in seamless ways.
Don't forget that by 2025 there are projected to be 75.44 billion IoT devices installed worldwide. That is a whopping number, which will relentlessly soar further and have a positive impact on our lives and businesses alike.
I expect an exciting year for IoT advancements in 2021. And you?
Written by: Mirko Grabel
Edge computing brings a number of benefits to the Internet of Things: reduced latency, improved resiliency and availability, lower costs, and local data storage (to assist with regulatory compliance), to name a few. In my last blog post I examined some of these benefits as a means of defining exactly where the edge is. Now let’s take a closer look at how edge computing benefits play out in real-world IoT use cases.
Benefit No. 1: Reduced latency
Many applications have strict latency requirements, but when it comes to safety and security applications, latency can be a matter of life or death. Consider, for example, an autonomous vehicle applying brakes or roadside signs warning drivers of upcoming hazards. By the time data is sent to the cloud and analyzed, and a response is returned to the car or sign, lives can be endangered. But let’s crunch some numbers just for fun.
Say a Department of Transportation in Florida is considering a cloud service to host the apps for its roadside signs. One of the vendors on the DoT’s shortlist is a cloud in California. The DoT’s latency requirement is less than 15 ms. The speed of light in fiber works out to about 5 μs/km, and the distance from the U.S. east coast to the west coast is about 5,000 km. Do the math and the resulting round-trip latency is 50 ms. It’s pure physics. If the DoT requires a real-time response, it must move the compute closer to the devices.
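The arithmetic is easy to reproduce. Using the figures quoted above (~5 μs/km in fiber, ~5,000 km coast to coast), a quick sketch:

```python
# Back-of-the-envelope propagation delay for the roadside-sign example.
SPEED_IN_FIBER_US_PER_KM = 5   # ~5 microseconds per kilometer in optical fiber
distance_km = 5_000            # U.S. east coast to west coast

one_way_ms = SPEED_IN_FIBER_US_PER_KM * distance_km / 1_000
round_trip_ms = 2 * one_way_ms          # 50 ms, far above the 15 ms budget

# Maximum distance that still fits the latency budget:
budget_ms = 15
max_round_trip_km = budget_ms * 1_000 / SPEED_IN_FIBER_US_PER_KM  # 3,000 km of fiber
max_one_way_km = max_round_trip_km / 2                            # 1,500 km away
```

Note that this counts propagation delay only; queuing, serialization, and processing time only add to it, which is why the compute has to move closer to the signs rather than the connection getting faster.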
Benefit No. 2: Improved resiliency/availability
Critical infrastructure requires the highest level of availability and resiliency to ensure safety and continuity of services. Consider a refinery gas leakage detection system. It must be able to operate without Internet access. If the system goes offline and there’s a leakage, that’s an issue. Compute must be done at the edge. In this case, the edge may be on the system itself.
While it’s not a life-threatening use case, retail operations can also benefit from the availability provided by edge compute. Retailers want their Point of Sale (PoS) systems to be available 100% of the time to service customers. But some retail stores are in remote locations with unreliable WAN connections. Moving the PoS systems onto their edge compute enables retailers to maintain high availability.
Benefit No. 3: Reduced costs
Bandwidth is almost infinite, but it comes at a cost. Edge computing allows organizations to reduce bandwidth costs by processing data before it crosses the WAN. This benefit applies to any use case, but here are two example use cases where it is very evident: video surveillance and preventive maintenance. For example, a single city-deployed HD video camera may generate 1,296 GB a month. Streaming that data over LTE easily becomes cost prohibitive. Adding edge compute to pre-aggregate the data significantly reduces those costs.
Manufacturers use edge computing for preventive maintenance of remote machinery. Sensors are used to monitor temperatures and vibrations. The currency of this data is critical, as the slightest variation can indicate a problem. To ensure that issues are caught as early as possible, the application requires high-resolution data (for example, 1,000 samples per second). Rather than sending all of this data over the Internet to be analyzed, edge compute is used to filter the data, and only averages, anomalies and threshold violations are sent to the cloud.
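A minimal sketch of that edge-side filter might look like the following. The threshold and outlier parameters are hypothetical, purely for illustration:

```python
from statistics import mean, stdev

def summarize_window(readings, threshold=80.0, z_limit=3.0):
    """Collapse one window of raw samples (e.g. 1,000 readings per second)
    into a compact payload: the average plus any threshold violations
    or statistical outliers worth sending to the cloud."""
    avg = mean(readings)
    sd = stdev(readings) if len(readings) > 1 else 0.0
    violations = [r for r in readings if r > threshold]
    anomalies = [r for r in readings
                 if sd > 0 and abs(r - avg) / sd > z_limit]
    return {"avg": avg, "violations": violations, "anomalies": anomalies}

# 1,000 vibration samples collapse to a handful of numbers over the WAN.
window = [70.0] * 998 + [85.0, 70.5]
payload = summarize_window(window)
```

Here only the window average and the single out-of-range reading leave the device, instead of a thousand raw samples, which is exactly where the bandwidth savings come from.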
Benefit No. 4: Comply with government regulations
Countries are increasingly instituting privacy and data retention laws. The European Union’s General Data Protection Regulation (GDPR) is a prime example. Any organization that has data belonging to an EU citizen is required to meet the GDPR’s requirements, which includes an obligation to report leaks of personal data. Edge computing can help these organizations comply with GDPR. For example, instead of storing and backhauling surveillance video, a smart city can evaluate the footage at the edge and only backhaul the meta data.
Canada’s Water Act: National Hydrometric Program is another edge computing use case that delivers regulatory compliance benefits. As part of the program, about 3,000 measurement stations have been implemented nationwide, and any missing data requires justification. Storing data at the edge ensures retention even when connectivity is lost.
Bonus Benefit: “Because I want to…”
Finally, some users simply prefer to have full control. By implementing compute at the edge rather than in the cloud, users gain greater flexibility. We have seen this in manufacturing: technicians want full control over the machinery, and edge computing gives them that control as well as independence from IT. The technicians know the machinery best, and security and availability remain top of mind.
By reducing latency and costs, improving resiliency and availability, and keeping data local, edge computing opens up a new world of IoT use cases. Those described here are just the beginning. It will be exciting to see where we see edge computing turn up next.
Originally posted here
Arm DevSummit 2020 debuted this week (October 6 – 8) as an online virtual conference focused on engineers, providing them with insights into the Arm ecosystem. The summit lasted three days, over which Arm painted an interesting technology story about the current and future state of computing and where developers fit within that story. I’ve been attending Arm TechCon (which has become Arm DevSummit) for more than half a decade now, and as I perused the content, I noticed several takeaways for developers working on microcontroller-based embedded systems. In this post, we will examine these key takeaways, and I’ll point you to some of the sessions that I think may pique your interest.
(For those of you who aren’t yet aware, you can register for free up until October 21st and still watch the conference materials up until November 28th. Click here to register)
Takeaway #1 – Expect Big Things from NVIDIA’s Acquisition of Arm
As many readers probably already know, NVIDIA is in the process of acquiring Arm. This acquisition has the potential to be a focal point of a technological revolution in computing, particularly around artificial intelligence, and it will also impact nearly every embedded system at the edge and beyond. While many of us have probably wondered what plans NVIDIA CEO Jensen Huang may have for Arm, the October 6th keynotes included a fireside chat between Jensen Huang and Arm CEO Simon Segars. Listening to this conversation is well worth the time: it gives developers some insight into the future, as well as assurance that the Arm business model will not be dramatically upended.
Takeaway #2 – Machine Learning for MCUs is Accelerating
It is sometimes difficult at a conference to get a feel for what is real and what is a little more smoke and mirrors. Sometimes announcements are real, but they take several years to filter their way into the market and affect how developers build systems. Machine learning is one of those technologies: I find there is a lot of interest around it, but developers aren’t quite sure what to do with it yet, at least in the microcontroller space. When we hear machine learning, we think artificial intelligence, big datasets, and more processing power than will fit on an MCU.
There were several interesting talks at DevSummit around machine learning such as:
- Beyond ML – A Neuromorphic Approach to AI by Paul Isaacs
- tinyML Development with TensorFlow Lite for Microcontrollers and CMSIS-NN by Peter Warden and Fredrik Knutsson
- uTVM, an AI Compiler for Arm Microcontrollers by Thomas Gall
- Machine Learning Made Possible for Embedded Developers with Zero AI Skills by Francois de Rochebouet
Some of these were foundational, providing embedded developers with the fundamentals to get started, while others offered hands-on explorations of machine learning with development boards. The takeaway I gather here is that the effort to bring machine learning capabilities to microcontrollers, so that they can be leveraged in industry use cases, is accelerating. A lot of effort is being put into ML algorithms, tools, frameworks, and even the hardware. Several talks mentioned Arm’s Cortex-M55 core, which will include Helium technology to help accelerate machine learning and DSP processing capabilities.
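One core trick that makes ML fit into MCU memory is quantizing model weights to 8 bits. Here is a minimal sketch of the idea in Python; it is not the actual TensorFlow Lite or CMSIS-NN implementation, just the principle:

```python
# Minimal sketch of symmetric 8-bit weight quantization, the core idea
# behind fitting neural networks into MCU memory (illustrative only).

def quantize_int8(weights):
    """Map float weights to int8 values plus a single float scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    return [q * scale for q in quantized]

weights = [0.5, -1.27, 0.02, 1.0]   # 16 bytes as float32
q, scale = quantize_int8(weights)   # 4 bytes of int8 plus one scale
approx = dequantize(q, scale)       # close to the originals
```

The 4x memory reduction (plus faster integer math on cores with DSP/Helium extensions) is what makes inference on a Cortex-M class device practical.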
Takeaway #3 – The Constant Need for Reinvention
In my last takeaway, I alluded to the fact that things are accelerating. Acceleration is not just happening in the technologies that we use to build systems, though. The range of application domains we can apply these technologies to is dramatically expanding. Not only can we start to deploy security and ML technologies at the edge, but also in domains such as space and medical systems. There were several interesting talks about how technologies are being used around the world to solve interesting and unique problems, such as protecting vulnerable ecosystems, mapping the sea floor, fighting diseases, and much more.
By carefully watching and listening, you’ll notice that many speakers have been involved in many different types of products over their careers and are constantly having to reinvent their skill sets, capabilities, and even their interests! This is what makes working in embedded systems so interesting: it is constantly changing and evolving, and as engineers we don’t get to sit idly behind a desk. As Arm, NVIDIA, and many of the other ecosystem partners and speakers show us, technology is rapidly changing, but so are the problem domains that we can apply these technologies to.
Takeaway #4 – Mbed and Keil are Evolving
There are also interesting changes coming to Arm toolchains and tools like Mbed and Keil MDK. In Reinhard Keil’s talk, “Introduction to an Open Approach for Low-Power IoT Development“, developers got an insight into the changes coming to Mbed and Keil, with the core focus being IoT development. The talk focused on the endpoint and discussed how Mbed and Keil MDK are being moved to an online platform designed to help developers move through product development faster, from prototyping to production. Keil Studio Online is currently in early access and will be released early next year.
(If you are interested in endpoints and AI, you might also want to check-out this article on “How Do We Accelerate Endpoint AI Innovation? Put Developers First“)
Arm DevSummit had a lot to offer developers this year, without the need to travel to California to participate (although I greatly missed catching up with friends and colleagues in person). If you haven’t already, I recommend checking out the DevSummit and watching a few of the talks I mentioned. There were certainly many more talks, and I’m still sifting through everything. Hopefully a few sessions will inspire you and give you a feel for where the industry is headed and how you will need to pivot your own skills in the coming years.
Originally posted here
An edge device is the network component that is responsible for connecting a local area network to an external or wide area network, which can be accessed from anywhere. Edge devices offer several new services and improved outcomes for IoT deployments across all markets. Smart services that rely on high volumes of data and local analysis can be deployed in a wide range of environments.
An edge device provides local data to an external network. If the protocols of the local and external networks differ, it also translates this information, bridging both network boundaries. Edge devices can run diagnostics and populate data automatically; however, a secure connection between the field network and cloud computing is necessary. In the event of a loss of internet connection or a cloud outage, the edge device stores data until the connection is re-established, so no process information is lost. Local data storage is optional, and not all edge devices offer it; it depends on the application and the service to be implemented at the plant.
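The store-and-forward behaviour described above can be pictured with a small sketch. This is illustrative only; the class and method names are invented:

```python
# Illustrative store-and-forward buffer: readings are held locally while
# the uplink is down and flushed, in order, once it returns.
from collections import deque

class EdgeBuffer:
    def __init__(self):
        self.pending = deque()

    def record(self, reading, uplink):
        self.pending.append(reading)
        if uplink.connected:
            while self.pending:
                uplink.send(self.pending.popleft())

class FakeUplink:
    """Stand-in for the real cloud connection."""
    def __init__(self):
        self.connected = False
        self.received = []

    def send(self, reading):
        self.received.append(reading)

uplink = FakeUplink()
buf = EdgeBuffer()
buf.record(21.5, uplink)   # offline: held locally
buf.record(21.7, uplink)
uplink.connected = True
buf.record(21.9, uplink)   # back online: backlog flushed in order
```

The cloud side receives every reading in its original order, which is why no process information is lost across an outage.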
How does an edge device work?
An edge device has a very straightforward working principle: it communicates between two different networks and translates one protocol into another, while creating a secure connection with the cloud.
An edge device can be configured via local access or over the internet/cloud. In general, an edge device is plug-and-play: its setup is simple and does not require much time.
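The protocol-translation role can be illustrated with a toy example that re-publishes raw field-network register values as a JSON payload for the cloud. The register map, scales, and field names here are all hypothetical:

```python
# Toy protocol translation: raw field-network register values re-published
# as a JSON payload for the cloud. Register map and names are hypothetical.
import json

REGISTER_MAP = {0: ("temperature_c", 0.1), 1: ("pressure_bar", 0.01)}

def translate(registers):
    """Turn {address: raw_int} readings into a cloud-friendly JSON document."""
    payload = {}
    for addr, raw in registers.items():
        name, scale = REGISTER_MAP[addr]
        payload[name] = round(raw * scale, 2)
    return json.dumps(payload, sort_keys=True)

msg = translate({0: 215, 1: 101})
# msg == '{"pressure_bar": 1.01, "temperature_c": 21.5}'
```

Note that the device only reads register values; nothing is written back to the field network, matching the one-way communication model described below.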
Why should I use an edge device?
Depending on the service required at the plant, the edge device will be the crucial point for collecting information and creating an automatic digital twin of your device in the cloud.
Edge devices are an essential part of IoT solutions, since they connect the information from a network to a cloud solution. They do not affect the network; they only collect data from it and never interfere with communication between the control system and the field devices. By using an edge device to collect information, the user does not need to touch the control system. Edge communication is one-way: nothing is written into the network, and data is acquired with the highest possible security.
Edge device requirements
Edge devices must meet certain requirements under all conditions to perform in different scenarios. These may include storage, network, and latency requirements.
Sensor data is collected in near real-time by an edge server. For services like image recognition and visual monitoring, edge servers are located in very close proximity to the device, meeting low-latency requirements. Edge deployments need to ensure that these services are not lost through poor development practice or inadequate processing resources at the edge. Maintaining data quality and security at the edge while enabling low latency is a challenge that needs to be addressed.
IoT services are agnostic to the data communication topology. The user requires the data through the most effective means possible, which in many cases will be mobile networks; in some scenarios, Wi-Fi or local mesh networking may be the most effective mechanism for collecting data so that latency requirements can be met.
Users require data at the edge to be kept as secure as it is when stored and used elsewhere. These challenges must be met given the larger attack surface and scope at the edge. Data authentication and user access are as important at the edge as they are on the device or at the core. Additionally, the physical security of edge infrastructure needs to be considered, as it is likely to be housed in less secure environments than dedicated data centers.
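Data authentication at the edge can be as simple as attaching a keyed hash to each payload. Below is a minimal HMAC sketch; it is illustrative, and a real deployment would also handle secure key provisioning and rotation:

```python
# Sketch of payload authentication at the edge: each message carries an
# HMAC tag so the receiving side can verify origin and integrity.
import hashlib
import hmac

DEVICE_KEY = b"per-device-secret"  # hypothetical pre-shared key

def sign(payload: bytes) -> str:
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, tag: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign(payload), tag)

tag = sign(b'{"temp": 21.5}')
ok = verify(b'{"temp": 21.5}', tag)        # genuine payload verifies
tampered = verify(b'{"temp": 99.9}', tag)  # modified payload fails
```

A per-device key also gives the cloud a way to tell devices apart, which supports the user-access requirements mentioned above.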
Data quality at the edge is a key requirement for guaranteed operation in demanding environments. To maintain data quality at the edge, applications must ensure that data is authenticated, replicated as required, and assigned to the correct data classes and categories.
Flexibility in future enhancements
Additional sensors can be added and managed at the edge as requirements change. Sensors such as accelerometers, cameras, and GPS, can be added to equipment, with seamless integration and control at the edge.
Local storage is also essential here: in the event of a loss of internet connection or a cloud outage, the edge device stores data until the connection is re-established, so no process information is lost. As noted, local data storage is optional and not all edge devices offer it; it depends on the application and the service to be implemented at the plant.
Originally posted here
After so many years of evangelizing the Internet of Things (IoT), developing IoT products, selling IoT services, and using IoT technologies, it is hard to believe that today there are as many defenders as detractors of these technologies. Why does the doubt still assail us: "Believe or Not Believe in the IoT"? Why do we keep saying every year that the time for IoT is finally now?
Does it not seem strange that, having already experienced the power of change that comes with connected devices on ourselves (wearables), in our homes, in cities, in transportation, and in business, we still have so many non-believers? Maybe it is because expectations in 2013 were so great that now, in 2020, we need more tangible and realistic data and facts to continue believing.
In recent months I have had more time to review my articles and some white papers and I think I have found some reasons to continue believing, but also reasons not to believe.
Below are some of these reasons, for you to decide where to position yourself.
Top reasons to believe
- McKinsey continues to present new opportunities with IoT
- In 2015, in “Internet of Things: Mapping the value beyond the hype”, the company estimated a potential economic impact of as much as US $11.1 trillion per year by 2025 for IoT applications in nine settings.
- In 2019, in “Growing opportunities in the Internet of Things”, they said that “The number of businesses that use the IoT technologies has increased from 13 percent in 2014 to about 25 percent today. And the worldwide number of IoT connected devices is projected to increase to 43 billion by 2023, an almost threefold increase from 2018.”
- In 2019, Gartner predicted that by 2021 there would be over 25 billion live IoT endpoints, enabling an unlimited number of IoT use cases.
- Harbor Research considers that the market opportunity for industrial internet of things (IIoT) and industry 4.0 is still emergent.
- Solutions are not completely new but are evolving from the convergence of existing technologies; creative combinations of these technologies will drive many new growth opportunities;
- As integration and interoperability across the industrial technology “stack” relies on classic IT principles like open architectures, many leading IT players are entering the industrial arena;
- IoT regulation is coming - The lack of regulation is one of the biggest issues associated with IoT devices, but things are starting to change in that regard as well. The U.S. government was among the first to take the threat posed by unsecured IoT devices seriously, introducing several IoT-related bills in Congress over the last couple of years. It all began with the IoT Cybersecurity Improvement Act of 2017, which set minimum security standards for connected devices obtained by the government. This legislation was followed by the SMART IoT Act, which tasked the Department of Commerce with conducting a study of the current IoT industry in the United States.
- Synergy of IoT and AI - IoT supported by artificial intelligence considerably enhances success in a large repertory of everyday applications, with the dominant ones being enterprise, transportation, robotics, industrial, and automation systems applications.
- Believe in superpowers again, thanks to IoT - Today, IoT sensors are everywhere: in your car, in electronic appliances, in traffic lights, probably even on the pigeon outside your window (it’s true, it happened in London!). IoT sensors will help cities map air quality, identify high-pollution pockets, and trigger alerts if pollution levels rise dangerously, while tracking changes over time and taking preventive measures to correct the situation. Thanks to IoT, connected cars will communicate seamlessly with IoT sensors and find empty parking spots easily. Sensors in your car will also communicate with your GPS and the manufacturer’s system, making maintenance and driving a breeze! City sensors will identify high-traffic areas and regulate traffic flows by updating your GPS with alternate routes, and can also identify and repair broken street lamps. IoT will be our knight in shining, super-strong metallic armor and prevent disasters like floods, fires, and even road accidents, simply by monitoring the fatigue levels of truck drivers! Washing machines, refrigerators, and air conditioners will self-monitor their usage, performance, and servicing requirements, triggering alerts before potential breakdowns and optimizing performance with automatic software updates. IoT sensors will help medical professionals monitor pulse rates, blood pressure, and other vitals more efficiently, triggering alerts in case of emergencies. Soon, nano-sensors in smart pills will make healthcare super-personalized and 10x more efficient!
Top reasons not to believe
- Three-fourths of IoT projects fail globally - Governments and enterprises across the globe are rolling out Internet of Things (IoT) projects, but almost three-fourths of them fail, impacted by factors like culture and leadership, according to US tech giant Cisco (2017). Businesses are spending $745 billion worldwide on IoT hardware and software in 2019 alone; yet three out of every four IoT implementations are failing.
- Few IoT projects survive the proof-of-concept stage - About 60% of IoT initiatives stall at the proof-of-concept (PoC) stage. If the right steps aren’t taken in the beginning (say, you don’t think far enough beyond the IT infrastructure), you end up in limbo: caught between the dream of what IoT could do for your business and the reality of today’s ROI. That spot is called PoC purgatory.
- IoT security is still a big concern - The 2019 annual report of SonicWall Capture Labs threat researchers, analyzing data from over 200,000 malicious events, indicated a 217.5 percent increase in IoT attacks in 2018.
- There are several obstacles companies face in both calculating and realizing ROI from IoT. Very few companies can quantify their current, pre-IoT costs. The instinct is often to stop after calculating the cost impact on the layer of operations immediately adjacent to the potential IoT project. For example, when quantifying the baseline cost of reactive (versus predictive or prescriptive) maintenance, too many companies include only downtime from unexpected outages, and may not consider the reduced life of the machine, maintenance overtime, lost sales due to long lead times, supply chain volatility risk for spare parts, and so on.
- Privacy, and no, that’s not the same as security - The big corporations don’t expect to make a big profit on the devices themselves; the big money in IoT is in big data. And enterprises and consumers do not want to expose everything sensors are learning about their company or themselves.
- No killer application - I suggest reading my article “Worth it waste your time searching the Killer IoT Application?"
- No Interoperable Technology ecosystems - We have a plethora of IoT vendors, both large and small, jumping into the fray and trying to establish a foothold, in hopes of either creating their own ecosystem (for the startups) or extending their existing one (for the behemoths).
- Digital fatigue - It is hard enough to explain IoT; now more technologies such as artificial intelligence, blockchain, 5G, and AR/VR are joining the party, and of course companies say “enough”.
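The ROI blind spot mentioned above, counting only downtime, is easy to make concrete with a toy calculation. All figures here are hypothetical:

```python
# Toy pre-IoT cost baseline for reactive maintenance (figures hypothetical).
# Counting only downtime understates the true baseline, and therefore the ROI.
annual_costs = {
    "unplanned_downtime": 120_000,
    "reduced_machine_life": 40_000,
    "maintenance_overtime": 25_000,
    "lost_sales_long_lead_times": 60_000,
    "spare_parts_supply_risk": 15_000,
}
naive_baseline = annual_costs["unplanned_downtime"]  # what many companies count
true_baseline = sum(annual_costs.values())           # the full picture
understatement = true_baseline / naive_baseline      # more than 2x here
```

Even with invented numbers, the point stands: an IoT project judged against the naive baseline can look like a failure while actually paying for itself.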
You have the last word
We can go on forever looking for reasons to believe or not believe in IoT, but we cannot deny the evidence of the millions of connected devices already out there, and the millions more that will soon be waiting for us to exploit their full potential.
I still believe. But you have the last word.
Scott Rosenthal and I go back about a thousand years; we've worked together, helped midwife the embedded field into being, had some amazing sailing adventures, and recently took a jaunt to the Azores just for the heck of it. Our sons are both big data people; their physics PhDs were perfect entrees into that field, and both now work in the field of artificial intelligence.
At lunch recently we were talking about embedded systems and AI, and Scott posed a thought that has been rattling around in my head since. Could AI replace firmware?
Firmware is a huge problem for our industry. It's hideously expensive. Only highly-skilled people can create it, and there are too few of us.
What if an AI engine of some sort could be dumped into a microcontroller and the "software" then created by training that AI? If that were possible - and that's a big "if" - then it might be possible to achieve what was hoped for when COBOL was invented: programmers would no longer be needed as domain experts could do the work. That didn't pan out for COBOL; the industry learned that accountants couldn't code. Though the language was much more friendly than the assembly it replaced, it still required serious development skills.
But with AI, could a domain expert train an inference engine?
Consider a robot: a "home economics" major could create scenarios of stacking dishes from a dishwasher. Maybe these would be in the form of videos, which were then fed to the AI engine as it tuned the weighting coefficients to achieve what the home ec expert deems worthy goals.
My first objection to this idea was that these sorts of systems have physical constraints. With firmware I'd write code to sample limit switches so the motors would turn off if at an end-of-motion extreme. During training an AI-based system would try and drive the motors into all kinds of crazy positions, banging destructively into stops. But think how a child learns: a parent encourages experimentation but prevents the youngster from self-harm. Maybe that's the role of the future developer training an AI. Or perhaps the training will be done on a simulator of some sort where nothing can go horribly wrong.
Taking this further, a domain expert could define the desired inputs and outputs, and then a poorly-paid person could do the actual training. CEOs will love that. With that model, a strange parallel emerges to computation a century ago: before the computer age, "computers" were people doing simple math to create tables of logs, trig, ballistics, etc. A roomful of them labored at a problem. They weren't particularly skilled and didn't make much, but they did the rote work under the direction of one master. Maybe AI trainers will be somewhat like that.
Like we outsource clothing manufacturing to Bangladesh, I could see training, basically grunt work, being sent overseas as well.
I'm not wild about this idea as it means we'd have an IoT of idiots: billions of AI-powered machines where no one really knows how they work. They've been well-trained but what happens when there's a corner case?
And most of the AI literature I read suggests that inference successes of 97% or so are the norm. That might be fine for classifying faces, but a 3% failure rate of a safety-critical system is a disaster. And the same rate for less-critical systems like factory controllers would also be completely unacceptable.
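That 3% scales badly. A quick back-of-the-envelope calculation, with purely illustrative numbers:

```python
# Why "97% accurate" can be unacceptable: failures scale with operations.
# Illustrative numbers for a fleet of embedded controllers.
success_rate = 0.97
ops_per_day = 1_000_000  # decisions made across the fleet per day
failures_per_day = ops_per_day * (1 - success_rate)
```

At a million inferences a day, a 3% failure rate means tens of thousands of bad decisions daily, which is exactly why it is a disaster for safety-critical systems.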
But the idea is intriguing.
Original post can be viewed here
Feel free to email me with comments.
A System on Chip (SoC) is essentially an integrated circuit that integrates an entire computer system onto a single platform. It combines the power of the CPU with the other components it needs to perform and execute its functions; it is in charge of using the other hardware and running your software. The main advantages of an SoC are lower latency and power savings.
It is made of various building blocks:
- Core + Caches + MMU – An SoC has a processor at its core which defines its functions; normally, an SoC has multiple processor cores. For a “real” processor (e.g. an Arm Cortex-A9), the core is the main thing kept in mind while choosing an SoC, and it may be assisted by a SIMD co-processor such as NEON.
- Internal RAM – IRAM is composed of very high-speed SRAM located alongside the CPU. It acts similarly to a CPU cache and is generally very small. It is used in the first phase of the boot sequence.
- Peripherals – These can be a simple ADC, a DSP, or a graphics processing unit, connected to the core via some bus. A low-power/real-time co-processor helps the main core with real-time tasks or handles low-power states. Examples of such IP cores are USB, PCI-E, SGX, etc.
An SoC uses RAM to store temporary data during and after bootstrap. It is the memory an embedded system uses during regular operation.
Storage is the source of data the SoC reads; it holds all the software components needed for the system to work. In an embedded system or single-board computer, it is often an SD card; in other cases, it can be NAND, NOR, or SPI data flash memory.
An SoC must have external interfaces for standard communication protocols such as USB, Ethernet, and HDMI. It also includes wireless protocols such as Wi-Fi and Bluetooth.
Software
First of all, we introduce the boot chain which is the series of actions that happens when an SoC is powered up.
Boot ROM: This is a piece of code stored in ROM which is executed by the booting core when it is powered on. It contains instructions for configuring the SoC to allow it to execute applications. The configurations performed by the Boot ROM include initialization of the core’s registers and stack pointer, enabling caches and line buffers, programming the interrupt service routine, and clock configuration.
Boot ROM also implements a Boot Assist Module (BAM) for downloading an application image from external memories using interfaces like Ethernet, SD/MMC, USB, CAN, UART, etc.
1st stage bootloader
The first-stage bootloader performs the following:
- Setup the memory segments and stack used by the bootloader code
- Reset the disk system
- Display a string “Loading OS…”
- Find the 2nd stage boot loader in the FAT directory
- Read the 2nd stage boot loader image into memory at 1000:0000
- Transfer control to the second-stage bootloader
The Boot ROM copies it into the SoC’s internal RAM, so it must be tiny enough to fit that memory, usually well under 100 kB. It initializes the external RAM and the SoC’s external memory interface, as well as other peripherals that may be of interest (e.g. disabling watchdog timers). Once done, it executes the next stage, which, depending on the context, may be called MLO, SPL, or something else.
2nd stage bootloader
This is the main bootloader and can be ten times bigger than the first stage; it completes the initialization of the relevant peripherals.
- Copy the boot sector to a local memory area
- Find kernel image in the FAT directory
- Read kernel image in memory at 2000:0000
- Reset the disk system
- Enable the A20 line
- Setup interrupt descriptor table at 0000:0000
- Setup the global descriptor table at 0000:0800
- Load the descriptor tables into the CPU
- Switch to protected mode
- Clear the prefetch queue
- Setup protected mode memory segments and stack for use by the kernel code
- Transfer control to the kernel code using a long jump
The Linux kernel is the main component of a Linux OS and is the core interface between hardware and processes, managing resources as efficiently as possible. The kernel performs the following jobs:
- Memory management: Keep track of memory, how much is used to store what, and where
- Process management: Determine which processes can use the processor, when, and for how long
- Device drivers: Act as an interpreter between the hardware and the processes
- System calls and security: Receive requests for the service from processes
To put the kernel in context, a Linux machine can be interpreted as having three layers:
- The hardware: The physical machine—the base of the system, made up of memory (RAM) and the processor (CPU), as well as input/output (I/O) devices such as storage, networking, and graphics.
- The Linux kernel: The core of the OS. It is a software residing in memory that tells the CPU what to do.
- User processes: The running programs that the kernel manages; collectively, they make up user space. The kernel allows processes and servers to communicate with each other.
Init and rootfs – init is the first non-kernel task to run, and has PID 1. It initializes everything needed to use the system. In production embedded systems, it also starts the main application; in such systems, it is either BusyBox or a custom-crafted application.
View original post here
Now more than ever, there are billions of edge products in the world. But without proper cloud computing, making the most of electronic devices that run on Linux or any other OS would not be possible.
And so, a question many people keep asking is: which is the best Software-as-a-Service platform for managing edge devices through cloud computing? While edge device management may not be new, the fact that the cloud computing space is not fully exploited means there is a lot to do in the cloud space.
Product remote management is especially necessary in the 21st century and beyond. Because of the increasing number of devices connected to the Internet of Things (IoT), a reliable SaaS platform should help with fixing software glitches from anywhere in the world. From smart homes, stereo speakers, and cars to personal computers, any product connected to the internet needs real-time protection from hacking threats such as unlawful access to business or personal data.
Data, being the most vital asset, is constantly at risk, especially if individuals using edge products do not connect to trusted, reliable, and secure edge device management platforms.
Bridges the Gap Between Complicated Software And End Users
Cloud computing is the new frontier through which SaaS platforms help manage edge devices in real-time. But something even more noteworthy is the increasing number of complicated software that now run edge devices at homes and in workplaces.
Edge device management, therefore, ensures everything runs smoothly. From fixing bugs, running debugging commands to real-time software patch deployment, cloud management of edge products bridges a gap between end-users and complicated software that is becoming the norm these days.
Even more importantly, going beyond physical firewall barriers is a major necessity in the remote management of edge devices. A reliable Software-as-a-Service platform therefore ensures that data encryption for edge devices is not only hackproof but also accessible only to the right people. Moreover, the deployment of secure routers and access tools is especially critical in cloud computing when managing edge devices. And so, developers behind successful SaaS platforms conduct regular security checks over the cloud and design and implement solutions for edge products.
Reliable IT Infrastructure Is Necessary
Software-as-a-service platforms that manage edge devices focus on having a reliable IT infrastructure and centralized systems through which they can conduct cloud computing. It is all about remotely managing edge devices with the help of an IT infrastructure that eliminates challenges such as connectivity latency.
Originally posted here
Introducing Profiler, by Auptimizer: Select the best AI model for your target device — no deployment required.
Profiler is a simulator for profiling the performance of Machine Learning (ML) model scripts. Profiler can be used during both the training and inference stages of the development pipeline. It is particularly useful for evaluating script performance and resource requirements for models and scripts being deployed to edge devices. Profiler is part of Auptimizer. You can get Profiler from the Auptimizer GitHub page or via pip install auptimizer.
Why we built Profiler
The cost of training machine learning models in the cloud has dropped dramatically over the past few years. While this drop has pushed model development to the cloud, there are still important reasons for training, adapting, and deploying models to devices. Performance and security are the big two, but cost savings are also an important consideration, as the costs of transferring and storing data, and of building models for millions of devices, tend to add up. Unsurprisingly, machine learning for edge devices, or Edge AI as it is more commonly known, continues to become mainstream even as cloud compute becomes cheaper.
Developing models for the edge opens up interesting problems for practitioners.
- Model selection now involves taking into consideration the resource requirements of these models.
- The training-testing cycle becomes longer due to having a device in the loop because the model now needs to be deployed on the device to test its performance. This problem is only magnified when there are multiple target devices.
Currently, there are three ways to shorten the model selection/deployment cycle:
- The use of device-specific simulators that run on the development machine and preclude the need for deployment to the device. Caveat: Simulators are usually not generalizable across devices.
- The use of profilers that are native to the target device. Caveat: They need the model to be deployed to the target device for measurement.
- The use of measures like FLOPS or Multiply-Add (MAC) operations to give approximate measures of resource usage. Caveat: The model itself is only one (sometimes insignificant) part of the entire pipeline (which also includes data loading, augmentation, feature engineering, etc.)
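To make the last caveat concrete, a rough MAC count for a single convolution layer can be computed from its shapes alone. The helper below is a hypothetical illustration (not part of Auptimizer or Profiler) of why such counts are easy to obtain but cover only the model itself:

```python
def conv2d_macs(in_channels: int, out_channels: int, kernel_size: int,
                out_h: int, out_w: int) -> int:
    """Multiply-accumulate (MAC) operations for one square-kernel Conv2d layer.

    Each output element needs kernel_size * kernel_size * in_channels
    multiply-adds, and there are out_h * out_w * out_channels output elements.
    """
    return out_h * out_w * out_channels * kernel_size * kernel_size * in_channels

# e.g. a 7x7 conv from 3 to 64 channels with a 112x112 output map:
print(conv2d_macs(3, 64, 7, 112, 112))  # 118013952 (~118M MACs)
```

Counts like this say nothing about data loading, augmentation, or feature engineering, which is exactly why they can mislead model selection.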
In practice, if you want to pick a model that will run efficiently on your target devices but do not have access to a dedicated simulator, you have to test each model by deploying on all of the target devices.
Profiler helps alleviate these issues. Profiler allows you to simulate, on your development machine, how your training or inference script will perform on a target device. With Profiler, you can understand CPU- and memory-usage as well as run-time for your model script on the target device.
How Profiler works
Profiler encapsulates the model script, its requirements, and corresponding data into a Docker container. It uses user-inputs on compute-, memory-, and framework-constraints to build a corresponding Docker image so the script can run independently and without external dependencies. This image can then easily be scaled and ported to ease future development and deployment. As the model script is executed within the container, Profiler tracks and records various resource utilization statistics including Average CPU Utilization, Memory Usage, Network I/O, and Block I/O. The logger also supports setting the Sample Time to control how frequently Profiler samples utilization statistics from the Docker container.
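The sampling loop behind such a logger can be sketched in a few lines of Python. This is a simplified illustration, not Profiler's actual implementation; `get_stats` stands in for whatever call reads utilization figures from the Docker API:

```python
import time
from statistics import mean

def sample_utilization(get_stats, sample_time=1.0, duration=5.0):
    """Poll a stats source every `sample_time` seconds for `duration` seconds
    and aggregate the samples into summary statistics.

    `get_stats` is assumed to return a dict like {"cpu": <percent>, "mem_mb": <MB>}.
    """
    samples = []
    deadline = time.monotonic() + duration
    while time.monotonic() < deadline:
        samples.append(get_stats())      # one utilization snapshot
        time.sleep(sample_time)          # the configurable Sample Time
    return {
        "avg_cpu_percent": mean(s["cpu"] for s in samples),
        "peak_memory_mb": max(s["mem_mb"] for s in samples),
        "n_samples": len(samples),
    }
```

A shorter Sample Time gives finer-grained curves at the cost of more polling overhead, which is why Profiler exposes it as a setting.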
Get Profiler: Click here
How Profiler helps
Our results show that Profiler can help users build a good estimate of model runtime and memory usage for many popular image/video recognition models. We conducted over 300 experiments across a variety of models (InceptionV3, SqueezeNet, Resnet18, MobileNetV2–0.25x, -0.5x, -0.75x, -1.0x, 3D-SqueezeNet, 3D-ShuffleNetV2–0.25x, -0.5x, -1.0x, -1.5x, -2.0x, 3D-MobileNetV2–0.25x, -0.5x, -0.75x, -1.0x, -2.0x) on three different devices — LG G6 and Samsung S8 phones, and NVIDIA Jetson Nano. You can find the full set of experimental results and more information on how to conduct similar experiments on your devices here.
The addition of Profiler brings Auptimizer closer to the vision of a tool that helps machine learning scientists and engineers build models for edge devices. The hyperparameter optimization (HPO) capabilities of Auptimizer help speed up model discovery. Profiler helps with choosing the right model for deployment. It is particularly useful in the following two scenarios:
- Deciding between models — The ranking of the run-times and memory usages of the model scripts measured using Profiler on the development machine is indicative of their ranking on the target device. For instance, if Model1 is faster than Model2 when measured using Profiler on the development machine, Model1 will be faster than Model2 on the device. This ranking is valid only when the CPUs are running at full utilization.
- Predicting model script performance on the device — A simple linear relationship relates the run-times and memory usage measured using Profiler on the development machine with the usage measured using a native profiling tool on the target device. In other words, if a model runs in time x when measured using Profiler, it will run approximately in time (a*x+b) on the target device (where a and b can be discovered by profiling a few models on the device with a native profiling tool). The strength of this relationship depends on the architectural similarity between the models but, in general, the models designed for the same task are architecturally similar as they are composed of the same set of layers. This makes Profiler a useful tool for selecting the best suited model.
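The calibration of a and b described above amounts to an ordinary least-squares fit over a handful of (Profiler time, on-device time) pairs. A minimal sketch, with hypothetical function names:

```python
def fit_linear(profiler_times, device_times):
    """Least-squares fit of device_time ~ a * profiler_time + b
    from a few models profiled both ways."""
    n = len(profiler_times)
    mx = sum(profiler_times) / n
    my = sum(device_times) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(profiler_times, device_times))
         / sum((x - mx) ** 2 for x in profiler_times))
    b = my - a * mx
    return a, b

def predict_device_time(a, b, profiler_time):
    """Estimated on-device run-time for a new model's Profiler measurement."""
    return a * profiler_time + b
```

Once a and b are fitted for a device, new candidate models need only a Profiler run on the development machine to get a run-time estimate.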
Profiler continues to evolve. So far, we have tested its efficacy on select mobile- and edge-platforms for running popular image and video recognition models for inference, but there is much more to explore. Profiler might have limitations for certain models or devices and can potentially result in inconsistencies between Profiler outputs and on-device measurements. Our experiment page provides more information on how to best set up your experiment using Profiler and how to interpret potential inconsistencies in results. The exact use case varies from user to user but we believe that Profiler is relevant to anyone deploying models on devices. We hope that Profiler’s estimation capability can enable leaner and faster model development for resource-constrained devices. We’d love to hear (via github) if you use Profiler during deployment.
Originally posted here
Authors: Samarth Tripathi, Junyao Guo, Vera Serdiukova, Unmesh Kurup, and Mohak Shah — Advanced AI, LG Electronics USA
Summary: Learn how businesses are leveraging the Internet of Things (IoT) to enhance their business processes and ensure long-term success in a fiercely competitive market.
In this IT era, the latest technology is making its way into our day-to-day lives. It has influenced our lives to a great extent and has also changed the way we work. We now use gadgets and modern equipment that ease our work and help us complete it more smoothly and accurately than ever before. Technologies like Machine Learning, Big Data Analytics, and Artificial Intelligence have steadily established themselves across different industries. Among them, one technology that has gained significant importance is the Internet of Things (IoT), which has affected various sectors to a great extent.
The use of IoT-enabled devices has enhanced the way people live their lives. According to Gartner's prediction, more than 25 billion IoT devices will be on the market by 2021. The use of IoT will drive new innovation for businesses, customers, and society.
The growth in IoT usage has driven improvements in various sectors such as healthcare, education, and entertainment. It is now possible to track assets in real time, monitor the ups and downs of the human body, automate homes, monitor the environment, and more, all thanks to the Internet of Things (IoT).
Internet of Things: Why Businesses Need It
As per a report by Cisco, more than 500 billion devices will be connected to the Internet by 2030. Each connected device will include sensors that collect data by interacting with the environment and communicate over a network very accurately.
All this will become possible through the Internet of Things (IoT), the network of all these connected devices. These smart devices generate data that IoT applications use to aggregate, analyze, and deliver insight, which helps them respond accurately to users' actions.
The Internet of Things is continuously improving with each passing day. Because it connects multiple things to each other, businesses gain real-time access to all the information on the network, which has proved beneficial for improving their business processes. It provides multiple benefits to the businesses that adopt it; the following list walks through the main advantages IoT offers for your business.
1. Offers a Large Amount of Data
Almost all businesses these days have realized the power of the Internet of Things and have started adopting it. As more and more businesses step up to adopt this technology, the total market value of IoT is predicted to grow rapidly, reaching $3 trillion by 2026.
IoT-enabled devices are able to collect huge amounts of data from the network with the help of embedded sensors. This information can be beneficial for businesses, as they can easily learn what their customers really want, how to fulfill their demands in the best possible way, and much more.
2. Better Customer Service
Every business these days boils down to satisfying its customers and offering them the best on demand. The combination of IoT-based devices with an app like Spotify can provide quick access to customer behavior. It helps businesses analyze data that includes customers' preferences, the time they spend making a particular purchase, the language they prefer, and much more.
All this information can help businesses enhance their customer support and come up with advanced solutions that satisfy all their customers' needs. Using this information, you can diversify your business according to new market trends and grab all the opportunities that come your way.
3. Ability to Monitor and Track Things
IoT-enabled devices allow businesses to track and monitor their employees' activities. They can easily know what their employees are working on, how many tasks they have completed, what progress they have made, and much more. They can even share information with their employees in real time about the current project and get information back from them whenever they want.
4. Save Money and Resources
There is no doubt that machine-to-machine (M2M) communication has grown dramatically in recent years. It is estimated that the total number of M2M connections will grow rapidly from 5 billion in 2014 to 27 billion by 2024.
Machines have taken the place of humans in many business sectors, saving businesses a huge amount of money and resources that they used to spend on human labor. Nowadays, work such as answering customers' queries, managing accounts, and keeping other business records is performed by applications and software developed using the latest technologies, including the Internet of Things.
5. Automate Business Processes
IoT helps businesses find the best ways to make their processes faster and better. It can show them which areas can be automated, so that they can reduce their employees' workload and save a huge amount of time and resources. If, as an entrepreneur, you feel that your business needs automation, IoT can analyze each area of your business and show you which ones can run without human interaction.
6. Helps to offer Personalized Experiences
As stated above, businesses can get information about their ideal customers with the help of IoT-enabled devices. They can learn their purchase preferences, likes, dislikes, and much more, and use that knowledge to provide a personalized experience.
As per new Epsilon research, 80% of consumers prefer to purchase from a brand that offers them personalized experiences. For example, businesses can generate accurate bills based on analyzed IoT data and provide various discounts and offers, as more than 74% of customers expect automatic crediting of coupons and loyalty points.
Wonders of the Internet of Things Have a Long Way to Go!
There are certain areas that businesses have yet to tap, as they are unable to implement IoT technology in every aspect of their business environment. Some businesses have not yet adopted this modern technology at all and are missing various opportunities for success. There are many ways in which IoT can work wonders for every business sector. As the technology continues to evolve through research and the efforts of brilliant minds, IoT will have much more to offer businesses in the near future.
When businesses implement the Internet of Things, they will see improvements in their employees' productivity, speed, and efficiency, which directly affect profit. Hence, examine your business niche and find out whether you can implement IoT in your business environment. It is the demand of the times to stand out from the rest, and you can do it with IoT; implement this technology, even in a basic way, if possible.
Today, IoT devices are widely used by industries and households: smart bulbs adjust the intensity of light by themselves, doctors check patient data remotely, IoT sensors help in warehousing, and more; the potential is seemingly endless. There are billions of IoT devices in the field, and billions more are expected in the next few years. The data that IoT devices produce is stored in the cloud; for example, a health monitor collects information about our health and stores it in the cloud. This information is further analyzed to provide us with better services, but if someone manages to get that data, they can violate our privacy. It is therefore important to ensure the confidentiality and integrity of IoT solutions while mitigating cybersecurity risks. There are many ways attackers can make their way into your system.
The most common IoT cyber attacks are:
A botnet is a network of compromised systems that attackers control remotely to distribute malware, managed by botnet operators via Command-and-Control servers (C&C servers). Attackers use botnets at large scale for many things, such as stealing private information, exploiting online banking data, and sending spam and phishing emails.
A man-in-the-middle attack is one in which an attacker seeks to interrupt and breach communication between two separate systems. It can be dangerous because the attacker secretly intercepts and relays messages between two parties who believe they are communicating directly with each other.
The main strategy of identity theft is to amass data; with a little patience, a lot of information can be pieced together. Data is generally available on the internet and, combined with social media information and data from smartwatches, fitness trackers, smart meters, smart fridges, and more, it gives a good all-around picture of your identity.
Recent research indicates that 85% of customers lack confidence in IoT device security, so it is important to secure IoT devices by eliminating IoT cybersecurity risks.
Here are some best practices to ensure IoT cybersecurity:
Secure boot
Secure boot helps a system stop attacks and infections from malware; it is a feature embedded in IoT devices to detect tampering with the system. It works like a security gate: it restricts unauthorized code by validating the digital signature of the boot image, so tampered images are blocked from running before they can attack the system. Deploying secure boot in the IoT ecosystem is important to ensure cybersecurity.
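In spirit, that gate works like the following check. This is a deliberately stripped-down sketch: real secure boot verifies a public-key signature chained to a hardware root of trust, not a bare hash stored by the boot ROM as assumed here:

```python
import hashlib
import hmac

def boot_check(firmware_image: bytes, trusted_digest_hex: str) -> bool:
    """Refuse to run an image whose SHA-256 digest does not match the
    digest trusted by the boot ROM (constant-time comparison)."""
    digest = hashlib.sha256(firmware_image).hexdigest()
    return hmac.compare_digest(digest, trusted_digest_hex)

trusted = hashlib.sha256(b"official firmware v1.2").hexdigest()
print(boot_check(b"official firmware v1.2", trusted))  # True: genuine image boots
print(boot_check(b"tampered firmware", trusted))       # False: tampered image blocked
```

The constant-time comparison matters: a naive `==` on digests can leak timing information to an attacker probing the boot process.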
Secured passwords with two-factor authentication
You can activate two-factor authentication on almost any IoT device. It is important because it ensures authorized access to devices and builds trust into the system. Two-factor authentication, combined with strong, unique passwords, keeps IoT devices from being vulnerable to cyber attacks and restricts attackers from making their way into the system.
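The second factor on most device dashboards is a time-based one-time password (TOTP, RFC 6238), which can be generated with nothing but the standard library. A minimal sketch:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, interval: int = 30, now=None) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant).

    The shared secret is base32-encoded, as in authenticator-app QR codes.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if now is None else now) // interval)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59s -> "94287082"
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", digits=8, now=59))
```

Because the code changes every 30 seconds, a stolen password alone is not enough to log in to the device.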
Disabling the UPnP feature
The UPnP feature allows an IoT device to connect to other IoT devices; for example, smart bulbs can be paired with Google Home to turn them on or off via voice command. The feature is convenient for users but poses cybersecurity risks at the same time: if hackers manage to make their way into one device, they will easily be able to find the other devices connected to it. Most IoT devices let you disable the UPnP feature from their settings.
Secure data storage
Data kept in a large enterprise system is usually secured, but the flash storage of an embedded device often holds important data that is not immediately secured or encrypted, which can open you up to cybersecurity risk. It is therefore important to have system-level encryption for the storage of sensitive information. If the flash storage on an embedded device is not encrypted, someone can easily take a peek at your data.
Securing IoT devices from cyberattacks is important for households, and it is equally important for industries to ensure the confidentiality and integrity of their IoT devices and the data those devices produce. Researchers find that data breaches linked to IoT devices have increased rapidly in the past few years; according to a study by Ponemon, the share of cyberattacks due to unsecured connected devices has increased from 15% to 25% in the last two years. The importance of securing IoT devices can never be downplayed.
Piyush Jain is the founder and CEO of Simpalm, an app development company in Virginia. Piyush founded Simpalm in 2009 and has grown it to be a leading mobile and web development company in the DMV area. With a Ph.D. from Johns Hopkins and a strong background in technology and entrepreneurship, he understands how to solve problems using technology. Under his leadership, Simpalm has delivered 300+ mobile apps and web solutions to clients in startups, enterprises and the federal sector.