


by Evelyn Münster

IoT systems are complex data products: they consist of digital and physical components, networks, communications, processes, data, and artificial intelligence (AI). User interfaces (UIs) are meant to make this level of complexity understandable for the user. However, building a data product that can explain data and models to users in a way that they can understand is an unexpectedly difficult challenge. That is because data products are not your run-of-the-mill software product.

In fact, 85% of all big data and AI projects fail. Why? I can say from experience that it is not the technology but rather the design that is to blame.

So how do you create a valuable data product? The answer lies in a new type of user experience (UX) design. With data products, UX designers are confronted with several additional layers that are not usually found in conventional software products: it’s a relatively complex system, unfamiliar to most users, and comprises data and data visualization as well as AI in some cases. Last but not least, it presents an entirely different set of user problems and tasks than customary software products.

Let’s take things one step at a time. My many years in data product design have taught me that it is possible to create great data products, as long as you keep a few things in mind before you begin.

As a prelude to the UX design process, make sure you and your team answer the following nine questions:

1. Which problem does my product solve for the user?

The user must be able to understand the purpose of your data product in a matter of minutes. It can help to assign the product to one of the five categories of tasks that data products perform: actionable insights, performance feedback loop, root cause analysis, knowledge creation, and trust building.

2. What does the system look like?

Do not expect users to already know how to interpret the data properly. They need to be able to construct a fairly accurate mental model of the system behind the data.

3. What is the level of data quality?

The UI must reflect the quality of the data. A good UI leads the user to trust the product.

4. What is the user’s proficiency level in graphicacy and numeracy?

Conduct user testing to make sure that your audience will be able to read and interpret the data and visuals correctly.

5. What level of detail do I need?

Aggregated data is often too abstract to explain the system or to build user trust. A good way to counter this challenge is to use details that explain things. Then again, too much detail can also be overwhelming.

6. Are we dealing with probabilities?

Probabilities are tricky and require explanations. The common practice of cutting out all uncertainties makes the UI deceptively simple – and dangerous.

7. Do we have a data visualization expert on the design team?

UX design applied to data visualization requires a special skillset that covers the entire process, from data analysis to data storytelling. It is always a good idea to have an expert on the team or, alternatively, have someone to reach out to when required.

8. How do we get user feedback?

As soon as the first prototype is ready, you should collect feedback through user testing. The prototype should present content in the most realistic and consistent way possible, especially when it comes to data and figures.

9. Can the user interface boost our marketing and sales?

If the user interface clearly communicates what the data product does and what the process is like, then it could take on a new function: selling your products.

To sum up: we must acknowledge that data products are an unexplored territory. They are not just another software product or dashboard, which is why, in order to create a valuable data product, we will need a specific strategy, new workflows, and a particular set of skills: Data UX Design.

Originally posted HERE 

Read more…

by Stephanie Overby

What's next for edge computing, and how should it shape your strategy? Experts weigh in on edge trends and talk workloads, cloud partnerships, security, and related issues


All year, industry analysts have been predicting that edge computing – and complementary 5G network offerings – will see significant growth, as major cloud vendors deploy more edge servers in local markets and telecom providers push ahead with 5G deployments.

The global pandemic has not significantly altered these predictions. In fact, according to IDC’s worldwide IT predictions for 2021, COVID-19’s impact on workforce and operational practices will be the dominant accelerator for 80 percent of edge-driven investments and business model change across most industries over the next few years.

First, what exactly do we mean by edge? Here’s how Rosa Guntrip, senior principal marketing manager, cloud platforms at Red Hat, defines it: “Edge computing refers to the concept of bringing computing services closer to service consumers or data sources. Fueled by emerging use cases like IoT, AR/VR, robotics, machine learning, and telco network functions that require service provisioning closer to users, edge computing helps solve the key challenges of bandwidth, latency, resiliency, and data sovereignty. It complements the hybrid computing model where centralized computing can be used for compute-intensive workloads while edge computing helps address the requirements of workloads that require processing in near real time.”

Moving data infrastructure, applications, and data resources to the edge can enable faster response to business needs, increased flexibility, greater business scaling, and more effective long-term resilience.

“Edge computing is more important than ever and is becoming a primary consideration for organizations defining new cloud-based products or services that exploit local processing, storage, and security capabilities at the edge of the network through the billions of smart objects known as edge devices,” says Craig Wright, managing director with business transformation and outsourcing advisory firm Pace Harmon.

“In 2021 this will be an increasing consideration as autonomous vehicles become more common, as new post-COVID-19 ways of working require more distributed compute and data processing power without incurring debilitating latency, and as 5G adoption stimulates a whole new generation of augmented reality, real-time application solutions, and gaming experiences on mobile devices,” Wright adds.

8 key edge computing trends in 2021


Noting the steady maturation of edge computing capabilities, Forrester analysts said, “It’s time to step up investment in edge computing,” in their recent Predictions 2020: Edge Computing report. As edge computing emerges as ever more important to business strategy and operations, here are eight trends IT leaders will want to keep an eye on in the year ahead.

1. Edge meets more AI/ML


Until recently, pre-processing of data via near-edge technologies or gateways had its share of challenges due to the increased complexity of data solutions, especially in use cases with a high volume of events or limited connectivity, explains David Williams, managing principal of advisory at digital business consultancy AHEAD. “Now, AI/ML-optimized hardware, container-packaged analytics applications, frameworks such as TensorFlow Lite and tinyML, and open standards such as the Open Neural Network Exchange (ONNX) are encouraging machine learning interoperability and making on-device machine learning and data analytics at the edge a reality.” 

Machine learning at the edge will enable faster decision-making. “Moreover, the amalgamation of edge and AI will further drive real-time personalization,” predicts Mukesh Ranjan, practice director with management consultancy and research firm Everest Group.

“But without proper thresholds in place, anomalies can slowly become standards,” notes Greg Jones, CTO of IoT solutions provider Kajeet. “Advanced policy controls will enable greater confidence in the actions made as a result of the data collected and interpreted from the edge.” 

 

2. Cloud and edge providers explore partnerships


IDC predicts a quarter of organizations will improve business agility by integrating edge data with applications built on cloud platforms by 2024. That will require partnerships across cloud and communications service providers, and some pairings between wireless carriers and the major public cloud providers are already emerging.

According to IDC research, the systems that organizations can leverage to enable real-time analytics are already starting to expand beyond traditional data centers and deployment locations. Devices and computing platforms closer to end customers and/or co-located with real-world assets will become an increasingly critical component of this IT portfolio. This edge computing strategy will be part of a larger computing fabric that also includes public cloud services and on-premises locations.

In this scenario, edge provides immediacy and cloud supports big data computing.

 

3. Edge management takes center stage


“As edge computing becomes as ubiquitous as cloud computing, there will be increased demand for scalability and centralized management,” says Wright of Pace Harmon. IT leaders deploying applications at scale will need to invest in tools to “harness step change in their capabilities so that edge computing solutions and data can be custom-developed right from the processor level and deployed consistently and easily just like any other mainstream compute or storage platform,” Wright says.

The traditional approach to data center or cloud monitoring won’t work at the edge, notes Williams of AHEAD. “Because of the rather volatile nature of edge technologies, organizations should shift from monitoring the health of devices or the applications they run to instead monitor the digital experience of their users,” Williams says. “This user-centric approach to monitoring takes into consideration all of the components that can impact user or customer experience while avoiding the blind spots that often lie between infrastructure and the user.”

As Stu Miniman, director of market insights on the Red Hat cloud platforms team, recently noted, “If there is any remaining argument that hybrid or multi-cloud is a reality, the growth of edge solidifies this truth: When we think about where data and applications live, they will be in many places.”

“The discussion of edge is very different if you are talking to a telco company, one of the public cloud providers, or a typical enterprise,” Miniman adds. “When it comes to Kubernetes and the cloud-native ecosystem, there are many technology-driven solutions competing for mindshare and customer interest. While telecom giants are already extending their NFV solutions into the edge discussion, there are many options for enterprises. Edge becomes part of the overall distributed nature of hybrid environments, so users should work closely with their vendors to make sure the edge does not become an island of technology with a specialized skill set.”

 

4. IT and operational technology begin to converge


Resiliency is perhaps the business term of the year, thanks to a pandemic that revealed most organizations’ weaknesses in this area. IoT-enabled devices (and other connected equipment) drive the adoption of edge solutions where infrastructure and applications are being placed within operations facilities. This approach will be “critical for real-time inference using AI models and digital twins, which can detect changes in operating conditions and automate remediation,” IDC’s research says.

IDC predicts that the number of new operational processes deployed on edge infrastructure will grow from less than 20 percent today to more than 90 percent in 2024 as IT and operational technology converge. Organizations will begin to prioritize not just extracting insight from their new sources of data, but integrating that intelligence into processes and workflows using edge capabilities.

Mobile edge computing (MEC) will be a key enabler of supply chain resilience in 2021, according to Pace Harmon’s Wright. “Through MEC, the ecosystem of supply chain enablers has the ability to deploy artificial intelligence and machine learning to access near real-time insights into consumption data and predictive analytics as well as visibility into the most granular elements of highly complex demand and supply chains,” Wright says. “For organizations to compete and prosper, IT leaders will need to deliver MEC-based solutions that enable an end-to-end view across the supply chain available 24/7 – from the point of manufacture or service  throughout its distribution.”

 

5. Edge eases connected ecosystem adoption


Edge not only enables and enhances the use of IoT, but it also makes it easier for organizations to participate in the connected ecosystem with minimized network latency and bandwidth issues, says Manali Bhaumik, lead analyst at technology research and advisory firm ISG. “Enterprises can leverage edge computing’s scalability to quickly expand to other profitable businesses without incurring huge infrastructure costs,” Bhaumik says. “Enterprises can now move into profitable and fast-streaming markets with the power of edge and easy data processing.”

 

6. COVID-19 drives innovation at the edge


“There’s nothing like a pandemic to take the hype out of technology effectiveness,” says Jason Mann, vice president of IoT at SAS. Take IoT technologies such as computer vision enabled by edge computing: “From social distancing to thermal imaging, safety device assurance and operational changes such as daily cleaning and sanitation activities, computer vision is an essential technology to accelerate solutions that turn raw IoT data (from video/cameras) into actionable insights,” Mann says. Retailers, for example, can use computer vision solutions to identify when people are violating the store’s social distance policy.

 

7. Private 5G adoption increases


“Use cases such as factory floor automation, augmented and virtual reality within field service management, and autonomous vehicles will drive the adoption of private 5G networks,” says Ranjan of Everest Group. Expect more maturity in this area in the year ahead, Ranjan says.

 

8. Edge improves data security


“Data efficiency is improved at the edge compared with the cloud, reducing internet and data costs,” says ISG’s Bhaumik. “The additional layer of security at the edge enhances the user experience.” Edge computing is also not dependent on a single point of application or storage, Bhaumik says. “Rather, it distributes processes across a vast range of devices.”

As organizations adopt DevSecOps and take a “design for security” approach, edge is becoming a major consideration for the CSO to enable secure cloud-based solutions, says Pace Harmon’s Wright. “This is particularly important where cloud architectures alone may not deliver enough resiliency or inherent security to assure the continuity of services required by autonomous solutions, by virtual or augmented reality experiences, or big data transaction processing,” Wright says. “However, IT leaders should be aware of the rate of change and relative lack of maturity of edge management and monitoring systems; consequently, an edge-based security component or solution for today will likely need to be revisited in 18 to 24 months’ time.”

Originally posted here.

Read more…

IoT Sustainability, Data At The Edge.

Recently I've written quite a bit about IoT, and one thing you may have picked up on is that the Internet of Things is made up of a lot of very large numbers.

For starters, the number of connected things is measured in the tens of billions, heading towards hundreds of billions. Then, behind that very large number is an even bigger one: the amount of data these billions of devices are predicted to generate.

As FutureIoT pointed out, IDC forecasted that the amount of data generated by IoT devices by 2025 is expected to be in excess of 79.4 zettabytes (ZB).

How Much Is A Zettabyte?

A zettabyte is a very large number indeed, but how big? How can you get your head around it? Does this help...?

A zettabyte is 1,000,000,000,000,000,000,000 bytes. Hmm, that's still not very easy to visualise.

So let's think of it in terms of London buses. Let's imagine a byte is represented as a human on a bus. A London bus can take 80 people, so you'd need 993 quintillion buses to accommodate 79.4 zettahumans.

I tried to calculate how long 993 quintillion buses would be. Relating it to the distance to the Moon, Mars or the Sun wasn't doing it justice; the only comparable scale is the size of the Milky Way. Even then, our 79.4 zettahumans lined up in London buses would stretch across the entire Milky Way ... and a fair bit further!
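The bus arithmetic above is easy to check with a back-of-the-envelope sketch (the 80-passenger capacity is the figure used in this article):

```python
# Back-of-the-envelope check: one byte = one passenger,
# and a London bus holds 80 passengers.
ZETTA = 10**21
total_bytes = 79.4 * ZETTA        # IDC's 2025 forecast for IoT data
people_per_bus = 80

buses = total_bytes / people_per_bus
print(f"{buses:.4g} buses")       # about 9.925e+20, i.e. ~993 quintillion
```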

Sustainability Of Cloud Storage For 993 Quintillion Buses Of Data

Everything we do has an impact on the planet. Just by reading this article, you're generating 0.2 grams of Carbon Dioxide (CO2) emissions per second ... so I'll try to keep this short.

Data from the Stanford Magazine suggests that every 100 gigabytes of data stored in the Cloud could generate 0.2 tons of CO2 per year. By that measure, storing 79.4 zettabytes of data in the Cloud could be responsible for the production of 158.8 billion tons of greenhouse gases.
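The same estimate as a quick sketch (the 0.2 tons per 100 GB figure is the Stanford Magazine number quoted above):

```python
# CO2 estimate for storing 79.4 ZB in the Cloud,
# at 0.2 tons of CO2 per 100 GB per year.
GB = 10**9
ZB = 10**21

data_gb = 79.4 * ZB / GB             # total data, in gigabytes
tons_per_year = data_gb / 100 * 0.2  # 0.2 tons per 100 GB per year

print(f"{tons_per_year:.4g} tons of CO2 per year")  # about 1.588e+11, i.e. 158.8 billion
```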


Putting that number into context: using USA Today numbers, the combined emissions of China, the USA, India, Russia, Japan and Germany accounted for a little over 21 billion tons in 2019.

So if we just go ahead and let all the IoT devices stream data to the Cloud, those billions of little gadgets would indirectly generate more than seven times the air pollution of the six most industrialised countries, combined.

Save The Planet, Store Data At The Edge

As mentioned in a previous article, not all data generated by IoT devices needs to be stored in the Cloud.

Speaking with an expert in data storage, ObjectBox, I learned that their users on average cut their Cloud data storage by 60%. So how does that work, then?

First, what does The Edge mean?

The term "Edge" refers to the edge of the network, in other words the last piece of equipment or thing connected to the network closest to the point of usage.

Let me illustrate with a rather over-simplified diagram (included in the original post).


How Can Edge Data Storage Improve Sustainability?

In an article about computer vision and AI on the edge, I talked about how vast amounts of network data could be saved if the cameras themselves could detect what an important event was, and to just send that event over the network, not the entire video stream.

In that example, only the key events and meta data, like the identification marks of a vehicle crossing a stop light, needed to be transmitted across the network. However, it is important to keep the raw content at the edge, so it can be used for post processing, for further learning of the AI or even to be retrieved at a later date, e.g. by law-enforcement.

Another example could be sensors used to detect gas leaks, seismic activity, fires or broken glass. These sensors are capturing volumes of data each second, but they only want to alert someone when something happens - detection of abnormal gas levels, a tremor, fire or smashed window.

Those alerts are the primary purpose of those devices, but the data in between those events can also hold significant value. In this instance, keeping it locally at the edge, but having it as and when needed is an ideal way to reduce network traffic, reduce Cloud storage and save the planet (well, at least a little bit).
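In code, the pattern these sensors follow might look like this (a simplified Python sketch; the threshold value and class name are made up for illustration, not taken from any particular vendor):

```python
from collections import deque

GAS_ALERT_THRESHOLD = 400.0   # hypothetical ppm threshold, for illustration only

class EdgeGasSensor:
    """Keeps raw readings in a local ring buffer; emits only alert events."""

    def __init__(self, buffer_size=10_000):
        self.raw = deque(maxlen=buffer_size)  # raw data stays at the edge
        self.alerts = []

    def ingest(self, timestamp, ppm):
        self.raw.append((timestamp, ppm))     # always stored locally
        if ppm > GAS_ALERT_THRESHOLD:
            event = {"t": timestamp, "ppm": ppm, "type": "gas_alert"}
            self.alerts.append(event)         # only this would cross the network
            return event
        return None                           # normal reading: nothing transmitted
```

Only the alert events would be sent upstream; the raw buffer stays at the edge, available for post-processing or later retrieval.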

Accessible Data At The Edge

Keeping your data at the edge is a great way to save costs and increase performance, but you still want to be able to get access to it, when you need it.

ObjectBox have created not just one of the most efficient ways to store data at the edge, but they've also built a sophisticated and powerful method to synchronise data between edge devices, the Cloud and other edge devices.

Synchronise Data At The Edge - Fog Computing.

Fog Computing (which is computing that happens between the Cloud and the Edge) requires data to be exchanged with devices connected to the edge, but without going all the way to/from the servers in the Cloud. 

In the article on making smarter, safer cities, I talked about how by having AI-equipped cameras share data between themselves they could become smarter, more efficient. 

A solution like that could be using ObjectBox's synchronisation capabilities to efficiently discover and collect relevant video footage from various cameras to help either identify objects or even train the artificial intelligence algorithms running on the AI-equipped cameras at the edge.

Storing Data At The Edge Can Save A Bus Load Of CO2

Edge computing has a lot of benefits to offer; in this article I've just looked at one that is often overlooked – the cost of transferring data. I've also not really delved into the broader benefits of ObjectBox's technology. For example, from their open source benchmarks, ObjectBox seems to offer a ten times performance benefit compared to other solutions out there, and it is being used by more than 300,000 developers.

The team behind ObjectBox also built technologies currently used by internet heavy-weights like Twitter, Viber and Snapchat, so they seem to be doing something right. And if they can really cut down network traffic by 60%, they could be one of the sustainable technology companies to watch.

Originally posted here.

Read more…

Edge Impulse has joined 1% for the Planet, pledging to donate 1% of our revenue to support nonprofit organizations focused on the environment. To complement this effort, we launched the ElephantEdge competition, aiming to create the world’s best elephant tracking device to protect elephant populations that would otherwise be impacted by poaching. In a similar vein, this blog will detail how Lacuna Space, Edge Impulse, a microcontroller and LoRaWAN can promote the conservation of endangered species by monitoring bird calls in remote areas.

Over the past years, The Things Network (TTN) has worked on the democratization of the Internet of Things, building a global, crowdsourced LoRaWAN network carried by the thousands of users operating their own gateways worldwide. Thanks to Lacuna Space’s satellite constellation, the network coverage goes one step further. Lacuna Space uses LEO (Low-Earth Orbit) satellites to provide LoRaWAN coverage at any point around the globe. Messages received by the satellites are routed to ground stations and forwarded to LoRaWAN service providers such as TTN. This technology can benefit several industries and applications: tracking a vessel not only in harbors but across the oceans, or monitoring endangered species in remote areas. All that with only 25 mW of power (the ISM band limit) to send a message to the satellite. This is truly amazing!

Most of these devices are typically simple, just sending a single temperature value, or other sensor reading, to the satellite - but with machine learning we can track much more: what devices hear, see, or feel. In this blog post we'll take you through the process of deploying a bird sound classification project using an Arduino Nano 33 BLE Sense board and a Lacuna Space LS200 development kit. The inferencing results are then sent to a TTN application.

Note: Access to the Lacuna Space program and dev kit is limited to a closed group at the moment. Get in touch with Lacuna Space for hardware and software access. The technical details to configure your Arduino sketch and TTN application are available in our GitHub repository.

 

Our bird sound model classifies house sparrow and rose-ringed parakeet species with 92% accuracy. You can clone our public project or make your own classification model following our different tutorials, such as Recognize sounds from audio or Continuous Motion Recognition.


Once you have trained your model, head to the Deployment section, select the Arduino library and Build it.


Import the library within the Arduino IDE, and open the microphone continuous example sketch. We made a few modifications to this example sketch to interact with the LS200 dev kit: we added a new UART link and we transmit classification results only if the prediction score is above 0.8.
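The gating logic we added is straightforward: take the top-scoring class and transmit it only if its score clears 0.8. Here is the idea sketched in Python (the real implementation lives in the Arduino C++ sketch; this is only an illustration, with label names matching the model's classes):

```python
CONFIDENCE_THRESHOLD = 0.8   # same cutoff as in the modified sketch

def filter_classification(scores):
    """Return the top prediction only if it clears the confidence threshold."""
    label, score = max(scores.items(), key=lambda kv: kv[1])
    if score >= CONFIDENCE_THRESHOLD:
        return {"label": label, "score": round(score, 5)}
    return None   # below threshold: nothing is sent over the UART link

# Example: a confident house sparrow detection passes the filter
result = filter_classification(
    {"housesparrow": 0.91406, "redringedparakeet": 0.05078, "noise": 0.03125}
)
```

Filtering at the edge like this keeps low-confidence (and therefore low-value) results off the constrained satellite uplink.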

Connect with the Lacuna Space dashboard by following the instructions in our application’s GitHub ReadMe. Using a web tracker, you can determine the next good time a Lacuna Space satellite will be flying over your location; you can then receive the signal through your The Things Network application and view the inferencing results of the bird call classification:

    {
      "housesparrow": "0.91406",
      "redringedparakeet": "0.05078",
      "noise": "0.03125",
      "satellite": true
    }

No Lacuna Space development kit yet? No problem! You can already start building and verifying your ML models on the Arduino Nano 33 BLE Sense or one of our other development kits, test it out with your local LoRaWAN network (by pairing it with a LoRa radio or LoRa module) and switch over to the Lacuna satellites when you get your kit.

Originally posted on the Edge Impulse blog by Aurelien Lequertier - Lead User Success Engineer at Edge Impulse, Jenny Plunkett - User Success Engineer at Edge Impulse, & Raul James - Embedded Software Engineer at Edge Impulse

Read more…

Five IoT retail trends for 2021

In 2020 we saw retailers hard hit by the economic effects of the COVID-19 pandemic, with dozens of retailers—Neiman Marcus, J.C. Penney, and Brooks Brothers to name a few—declaring bankruptcy. During the unprecedented chaos of lockdowns and social distancing, consumers accelerated their shift to online shopping. Retailers like Target and Best Buy saw online sales double, while Amazon’s e-commerce sales grew 39 percent. Retailers navigated supply chain disruptions due to COVID-19, climate change events, trade tensions, and cybersecurity events.

After the last twelve tumultuous months, what will 2021 bring for the retail industry? I spoke with Microsoft Azure IoT partners to understand how they are planning for 2021 and compiled insights about five retail trends. One theme we’re seeing is a focus on efficiency. Retailers will look to pre-configured digital platforms that leverage cloud-based technologies including the Internet of Things (IoT), artificial intelligence (AI), and edge computing to meet their business goals. 


Empowering frontline workers with real-time data

In 2021, retailers will increase efficiency by empowering frontline workers with real-time data. Retail employees will be able to respond more quickly to customers and expand their roles to manage curbside pickups, returns, and frictionless kiosks.  

In H&M Mitte Garten in Berlin, H&M empowered employee ambassadors with fashionable bracelets connected to the Azure cloud. Ambassadors were able to receive real-time requests via their bracelets when customers needed help in fitting rooms or at a cash desk. The ambassadors also received visual merchandising instructions and promotional updates. 

Through the app built on Microsoft partner Turnpike’s wearable SaaS platform leveraging Azure IoT Hub, these frontline workers could also communicate with their peers or their management team during or after store hours. With the real-time data from the connected bracelets, H&M ambassadors were empowered to deliver best-in-class service.

Carl Norberg, Founder of Turnpike, explained, “We realized that by connecting store IoT sensors, POS systems, and AI cameras, store staff can be empowered to interact at the right place at the right time.”

Leveraging live stream video to innovate omnichannel

Livestreaming has been exploding in China as influencers sell through their social media channels. Forbes recently projected that nearly 40 percent of China’s population will have viewed livestreams during 2020. Retailers in the West are starting to leverage livestream technology to create innovative omnichannel solutions.

For example, Kjell & Company, one of Scandinavia’s leading consumer electronics retailers, is using a solution from Bambuser and Ombori called Omni-queue built on top of the Ombori Grid. Omni-queue enables store employees to handle a seamless combination of physical and online visitors within the same queue using one-to-one live stream video for online visitors.  

Kjell & Company ensures e-commerce customers receive the same level of technical expertise and personalized service they would receive in one of its physical locations. Omni-queue also uses advanced routing and knowledge matching to make highly efficient use of store employees.

Maryam Ghahremani, CEO of Bambuser, explains, “Live video shopping is the future, and we are so excited to see how Kjell & Company has found a use for our one-to-one solution.” Martin Knutson, CTO of Kjell & Company, added, “With physical store locations heavily affected due to the pandemic, offering a new and innovative way for customers to ask questions—especially about electronics—will be key to Kjell’s continued success in moving customers online.”


Augmenting omnichannel with dark stores and micro-fulfillment centers  

In 2021, retailers will continue experimenting with dark stores—traditional retail stores that have been converted to local fulfillment centers—and micro-fulfillment centers. These supply chain innovations will increase efficiency by bringing products closer to customers. 

Microsoft partner Attabotics, a 3D robotics supply chain company, works with an American luxury department store retailer to reduce costs and delivery time using a micro-fulfillment center. Attabotics’ unique use of both horizontal and vertical space reduces warehouse needs by 85 percent. Attabotics’ structure and robotic shuttles leveraged Microsoft Azure Edge Zones, Azure IoT Central, and Azure Sphere.

The luxury retailer leverages the micro-fulfillment center to package and ship multiple beauty products together. As a result, customers experience faster delivery times. The retailer also reduces costs related to packaging, delivery, and warehouse space.  

Scott Gravelle, Founder, CEO, and CTO of Attabotics explained, “Commerce is at a crossroads, and for retailers and brands to thrive, they need to adapt and take advantage of new technologies to effectively meet consumers’ growing demands. Supply chains have not traditionally been set up for e-commerce. We will see supply chain innovations in automation and modulation take off in 2021 as they bring a wider variety of products closer to the consumer and streamline the picking and shipping to support e-commerce.” 


Helping keep warehouse workers safe

What will this look like? Cognizant’s recent work with an athletic apparel retailer offers a blueprint. During the peak holiday season, the retailer needed to protect its expanding warehouse workforce while minimizing absenteeism. To implement physical distancing and other safety measures, the retailer leveraged Cognizant’s Safe Buildings solution built with Azure IoT Edge and IoT Hub services.

With this solution, employees maintained physical distancing using smart wristbands. When two smart wristbands were within a pre-defined distance of each other for more than a pre-defined time, the workers’ bands buzzed to reinforce safe behaviors. The result was nearly 98 percent distancing compliance in the initial pilot. As the retailer plans to scale up its workforce at other locations, additional safety modules are being considered:

  • Touchless temperature checks.
  • Occupancy sensors that communicate capacity information to the management team for compliance records.
  • Air quality sensors that provide environmental data so the facility team can help ensure optimal conditions for workers’ health.
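The buzz rule described above is simple to sketch. The distance and dwell thresholds below are illustrative assumptions, not values from Cognizant’s actual Safe Buildings solution:

```python
# Sketch of the distancing rule: buzz when two wristbands stay within a
# threshold distance for longer than a dwell time, without interruption.
DISTANCE_THRESHOLD_M = 2.0   # assumed stand-in for the "pre-defined distance"
DWELL_THRESHOLD_S = 10.0     # assumed stand-in for the "pre-defined time"

def should_buzz(samples):
    """samples: chronological (timestamp_s, distance_m) pairs for one pair
    of wristbands. True once the pair has been continuously closer than the
    distance threshold for at least the dwell threshold."""
    close_since = None
    for ts, dist in samples:
        if dist <= DISTANCE_THRESHOLD_M:
            if close_since is None:
                close_since = ts
            if ts - close_since >= DWELL_THRESHOLD_S:
                return True
        else:
            close_since = None   # contact broken; restart the clock
    return False

# Two workers stay 1.5 m apart for 12 seconds -> buzz
print(should_buzz([(t, 1.5) for t in range(13)]))   # True
print(should_buzz([(0, 3.0), (5, 2.5)]))            # False
```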

“For organizations to thrive during and post-pandemic, enterprise-grade workplace safety cannot be compromised. Real-time visibility of threats is providing essential businesses an edge in minimizing risks proactively while building employee trust and empowering productivity in a safer workplace,” said Rajiv Mukherjee, Cognizant’s IoT Practice Director for Retail and Consumer Goods.

Optimizing inventory management with real-time edge data

In 2021, retailers will ramp up the adoption of intelligent edge solutions to optimize inventory management with real-time data. Most retailers have complex inventory management systems. However, no matter how good the systems are, there can still be data gaps due to grocery pick-up services, theft, and sweethearting. The key to addressing these gaps is to combine real-time data from applications running on edge cameras and other edge devices in the physical store with backend enterprise resource planning (ERP) data.  
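Filling those gaps is essentially a reconciliation job. A minimal sketch, with made-up SKUs and counts, might compare ERP stock against shelf counts reported by edge cameras:

```python
erp_stock = {"SKU-001": 40, "SKU-002": 12, "SKU-003": 7}        # backend ERP view
edge_shelf_count = {"SKU-001": 33, "SKU-002": 12, "SKU-003": 2} # edge cameras

def inventory_gaps(erp, edge, tolerance=2):
    """SKUs whose physical count diverges from ERP by more than `tolerance`
    units: candidates for shrinkage, sweethearting or pickup-service gaps."""
    return {
        sku: erp[sku] - edge.get(sku, 0)
        for sku in erp
        if abs(erp[sku] - edge.get(sku, 0)) > tolerance
    }

print(inventory_gaps(erp_stock, edge_shelf_count))
# {'SKU-001': 7, 'SKU-003': 5}
```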

Seattle Goodwill worked with Avanade to implement a new Microsoft-based Dynamics platform across its 24 stores. The new system provided almost real-time visibility into the movement of goods from the warehouses to the stores. 

Rasmus Hyltegård, Director of Advanced Analytics at Avanade explained, “To ensure inventory moves quickly off the shelves, retailers can combine real-time inventory insights from Avanade’s smart inventory accelerator with other solutions across the customer journey to meet customer expectations.” Hyltegård continued, “Customers can check online to find the products they want, find the stores with product in stock, and gain insight into which stores have the shortest queues, which is important during the pandemic and beyond. Once a customer is in the store, digital signage allows for endless aisle support.” 


Summary

The new year 2021 holds a wealth of opportunities for retailers. We foresee retail leaders reimagining their businesses by investing in platforms that integrate IoT, AI, and edge computing technologies. Retailers will focus on increasing efficiencies to reduce costs. Modular platforms supported by an ecosystem of strong partner solutions will empower frontline workers with data, augment omnichannel fulfillment with dark stores and micro-fulfillment, leverage livestream video to enhance omnichannel, prioritize warehouse worker safety, and optimize inventory management with real-time data. 


Security has long been a worry for Internet of Things projects, and for many organizations with active or planned IoT deployments, security concerns have hampered digital ambitions. Fortunately, IoT security best practices can help organizations reduce the risks facing their deployments and broader digital transformation initiatives. These same best practices can also reduce legal liability and protect an organization’s reputation.

Technological fragmentation is not just one of the biggest barriers to IoT adoption, but it also complicates the goal of securing connected devices and related services. With IoT-related cyberattacks on the rise, organizations must become more adept at managing cyber-risk or face potential reputational and legal consequences. This article summarizes best practices for enterprise and industrial IoT projects.

Key takeaways from this article include the following:

  • Data security remains a central technology hurdle related to IoT deployments.
  • IoT security best practices also can help organizations curb the risk of broader digital transformation initiatives.
  • Securing IoT projects requires a comprehensive view that encompasses the entire life cycle of connected devices and relevant supply chains.

Fragmentation and security have long been two of the most significant barriers to Internet of Things adoption. The two challenges are also closely related.

Despite the Internet of Things (IoT) moniker, which implies a synthesis of connected devices, IoT technologies vary considerably based on their intended use. Organizations deploying IoT thus rely on an array of connectivity types, standards and hardware. As a result, even a simple IoT device can pose many security vulnerabilities, including weak authentication, insecure cloud integration, and outdated firmware and software.

For many organizations with active or planned IoT deployments, security concerns have hampered digital ambitions. An IoT World Today August 2020 survey revealed data security as the top technology hurdle for IoT deployments, selected by 46% of respondents.

Fortunately, IoT security best practices can help organizations reduce the risks facing their deployments and broader digital transformation initiatives. These same best practices can also reduce legal liability and protect an organization’s reputation.

But to be effective, an IoT-focused security strategy requires a broad view that encompasses the entire life cycle of an organization’s connected devices and projects in addition to relevant supply chains.

Know What You Have and What You Need

Asset management is a cornerstone of effective cyber defence. Organizations should identify which processes and systems need protection. They should also strive to assess the risk cyber attacks pose to assets and their broader operations.

In terms of enterprise and industrial IoT deployments, asset awareness is frequently spotty. It can be challenging given the array of industry verticals and the lack of comprehensive tools to track assets across those verticals. But asset awareness also demands a contextual understanding of the computing environment, including the interplay among devices, personnel, data and systems, as the National Institute of Standards and Technology (NIST) has observed.

There are two fundamental questions when creating an asset inventory: What is on my network? And what are these assets doing on my network?

Answering the latter requires tracking endpoints’ behaviours and their intended purpose from a business or operational perspective. From a networking perspective, asset management should involve more than counting networking nodes; it should focus on data protection and building intrinsic security into business processes.
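The two inventory questions above can be captured in a minimal asset record that pairs an endpoint’s intended purpose with its observed behaviour. Device names and protocols here are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    intended_purpose: str          # what the asset is for, in business terms
    allowed_protocols: set = field(default_factory=set)
    observed_protocols: set = field(default_factory=set)

    def unexpected_behaviour(self):
        """Protocols seen on the wire that the asset has no business using."""
        return self.observed_protocols - self.allowed_protocols

inventory = [
    Asset("hvac-sensor-12", "temperature telemetry", {"mqtt"}, {"mqtt"}),
    Asset("cam-lobby-3", "video monitoring", {"rtsp"}, {"rtsp", "smtp"}),
]

for asset in inventory:
    odd = asset.unexpected_behaviour()
    if odd:
        print(f"{asset.name}: unexpected protocols {sorted(odd)}")
# cam-lobby-3: unexpected protocols ['smtp']
```

A camera speaking SMTP answers the second question in the worst way: it is on the network, and it is doing something its purpose does not explain.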

Relevant considerations include the following:

  • Compliance with relevant security and privacy laws and standards.
  • Interval of security assessments.
  • Optimal access of personnel to facilities, information and technology, whether remote or in-person.
  • Data protection for sensitive information, including strong encryption for data at rest and data in transit.
  • Degree of security automation versus manual controls, as well as physical security controls to ensure worker safety.

IoT device makers and application developers also should implement a vulnerability disclosure program, including public contact information for security researchers and plans for responding to disclosed vulnerabilities. Bug bounty programs are another option.

Organizations that have accurately assessed current cybersecurity readiness need to set relevant goals and create a comprehensive governance program to manage and enforce operational and regulatory policies and requirements. Governance programs also ensure that appropriate security controls are in place. Organizations need to have a plan to implement controls and determine accountability for that enforcement. Another consideration is determining when security policies need to be revised.

An effective governance plan is vital for engineering security into architecture and processes, as well as for safeguarding legacy devices with relatively weak security controls. Devising an effective risk management strategy for enterprise and industrial IoT devices is a complex endeavour, potentially involving a series of stakeholders and entities. Organizations that find it difficult to assess the cybersecurity of their IoT project should consider third-party assessments.

Many tools are available to help organizations evaluate cyber-risk and defences. These include the National Vulnerability Database and the Security and Privacy Controls for Information Systems and Organizations document (SP 800-53), both from the National Institute of Standards and Technology. Another resource is the list of 20 Critical Security Controls for Effective Cyber Defense. In terms of studying the threat landscape, MITRE ATT&CK is one of the most popular frameworks for adversary tactics and techniques.

At this stage of the process, another vital consideration is the degree of cybersecurity savviness and support within your business. Three out of ten organizations deploying IoT cite lack of support for cybersecurity as a hurdle, according to August 2020 research from IoT World Today. Security awareness is also frequently a challenge. Many cyberattacks against organizations — including those with an IoT element — involve phishing, like the 2015 attack against Ukraine’s electric grid.

IoT Security Best Practices

Internet of Things projects demand a secure foundation. That starts with asset awareness and extends into responding to real and simulated cyberattacks.

Step 1: Know what you have.

Building an IoT security program starts with achieving a comprehensive understanding of which systems need to be protected.

Step 2: Deploy safeguards.

Shielding devices from cyber-risk requires a thorough approach. This step involves cyber-hygiene, effective asset control and the use of other security controls.

Step 3: Identify threats.

Spotting anomalies can help mitigate attacks. Defenders should hone their skills through wargaming.

Step 4: Respond effectively.

Cyberattacks are inevitable, but each incident should generate lessons that feed back into step 1.

Exploiting human gullibility is one of the most common cybercriminal strategies. While cybersecurity training can help individuals recognize suspected malicious activities, such programs tend not to be entirely effective. “It only takes one user and one-click to introduce an exploit into a network,” wrote Forrester analyst Chase Cunningham in the book “Cyber Warfare.” Recent studies have found that, even after receiving cybersecurity training, employees continue to click on phishing links about 3% of the time.

Security teams should work to earn the support of colleagues, while also factoring in the human element, according to David Coher, former head of reliability and cybersecurity for a major electric utility. “You can do what you can in terms of educating folks, whether it’s as a company IT department or as a consumer product manufacturer,” he said. But it is essential to put controls in place that can withstand user error and occasionally sloppy cybersecurity hygiene.

At the same time, organizations should also look to pool cybersecurity expertise inside and outside the business. “Designing the controls that are necessary to withstand user error requires understanding what users do and why they do it,” Coher said. “That means pulling together users from throughout your organization’s user chain — internal and external, vendors and customers, and counterparts.”

Those counterparts are easier to engage in some industries than others. Utilities, for example, have a strong track record in this regard, because of the limited market competition between them. Collaboration “can be more challenging in other industries, but no less necessary,” Coher added.

Deploy Appropriate Safeguards

Protecting an organization from cyberattacks demands a clear framework that is sensitive to business needs. While regulated industries are obligated to comply with specific cybersecurity-related requirements, consumer-facing organizations tend to have more generic requirements for privacy protections, data breach notifications and so forth. That said, all types of organizations deploying IoT have leeway in selecting a guiding philosophy for their cybersecurity efforts.

A basic security principle is to minimize networked or vulnerable systems’ attack surface — for instance, closing unused network ports and eliminating IoT device communication over the open internet. Generally speaking, building security into the architecture of IoT deployments and reducing attackers’ options to sabotage a system is more reliable than adding layers of defence to an unsecured architecture. Organizations deploying IoT projects should consider intrinsic security functionality such as embedded processors with cryptographic support.
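Minimizing the attack surface can start with something as simple as comparing the ports a device actually exposes against the ports its function requires. The port lists below are illustrative:

```python
REQUIRED_PORTS = {8883}                    # MQTT over TLS: the device's one job
observed_open_ports = {22, 23, 80, 8883}   # e.g. from an nmap scan

def surface_to_close(required, observed):
    """Open ports that serve no purpose for this device: candidates for
    closing in the device or firewall configuration."""
    return sorted(observed - required)

print(surface_to_close(REQUIRED_PORTS, observed_open_ports))  # [22, 23, 80]
```

An exposed Telnet port (23) on an IoT device is exactly the kind of needless surface this check is meant to catch.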

But it is not practical to remove all risk from an IT system. For that reason, one of the most popular options is defence-in-depth, a military-rooted concept espousing the use of multiple layers of security. The basic idea is that if one countermeasure fails, additional security layers are available.

While the core principle of implementing multiple layers of security remains popular, defence in depth is also tied to the concept of perimeter-based defence, which is increasingly falling out of favour. “The defence-in-depth approach to cyber defence was formulated on the basis that everything outside of an organization’s perimeter should be considered ‘untrusted’ while everything internal should be inherently ‘trusted,’” said Andrew Rafla, a Deloitte Risk & Financial Advisory principal. “Organizations would layer a set of boundary security controls such that anyone trying to access the trusted side from the untrusted side had to traverse a set of detection and prevention controls to gain access to the internal network.”

Several trends have chipped away at the perimeter-based model. As a result, “modern enterprises no longer have defined perimeters,” Rafla said. “Gone are the days of inherently trusting any connection based on where the source originates.” Trends ranging from the proliferation of IoT devices and mobile applications to the popularity of cloud computing have fueled interest in cybersecurity models such as zero trust. “At its core, zero trust commits to ‘never trusting, always verifying’ as it relates to access control,” Rafla said. “Within the context of zero trust, security boundaries are created at a lower level in the stack, and risk-based access control decisions are made based on contextual information of the user, device, workload or network attempting to gain access.”
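A risk-based access decision of the kind Rafla describes can be sketched as a contextual score. The signals and weights below are illustrative assumptions, not any vendor’s actual policy engine:

```python
def access_decision(ctx):
    """Score one request from contextual signals and return
    'allow', 'step-up' (require extra verification) or 'deny'."""
    risk = 0
    if not ctx.get("device_compliant", False):
        risk += 2    # unmanaged or unpatched device
    if ctx.get("geo") not in ctx.get("usual_geos", set()):
        risk += 1    # unusual location for this user
    if ctx.get("resource_sensitivity", "low") == "high":
        risk += 1    # sensitive workload raises the bar
    if risk == 0:
        return "allow"
    return "step-up" if risk <= 2 else "deny"

print(access_decision({"device_compliant": True, "geo": "DE",
                       "usual_geos": {"DE"}, "resource_sensitivity": "low"}))
# allow
print(access_decision({"device_compliant": False, "geo": "RU",
                       "usual_geos": {"DE"}, "resource_sensitivity": "high"}))
# deny
```

The point is that every request is evaluated on its own context; nothing is trusted merely because it originates inside the network.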

Zero trust’s roots stretch back to the 1970s when a handful of computer scientists theorized on the most effective access control methods for networks. “Every program and every privileged user of the system should operate using the least amount of privilege necessary to complete the job,” one of those researchers, Jerome Saltzer, concluded in 1974.

While the concept of least privilege sought to limit trust among internal computing network users, zero trust extends the principle to devices, networks, workloads and external users. The recent surge in remote working has accelerated interest in the zero-trust model. “Many businesses have changed their paradigm for security as a result of COVID-19,” said Jason Haward-Grau, a leader in KPMG’s cybersecurity practice. “Many organizations are experiencing a surge to the cloud because businesses have concluded they cannot rely on a physically domiciled system in a set location.”

Based on data from Deloitte, 37.4% of businesses accelerated their zero-trust adoption plans in response to the pandemic. In contrast, more than one-third, or 35.2%, of those embracing zero trust stated that the pandemic had not changed the speed of their organization’s zero-trust adoption.

“I suspect that many of the respondents that said their organization’s zero-trust adoption efforts were unchanged by the pandemic were already embracing zero trust and were continuing with efforts as planned,” Rafla said. “In many cases, the need to support a completely remote workforce in a secure and scalable way has provided a tangible use case to start pursuing zero-trust adoption.”

A growing number of organizations are beginning to blend aspects of zero trust and traditional perimeter-based controls through a model known as secure access service edge (SASE), according to Rafla. “In this model, traditional perimeter-based controls of the defence-in-depth approach are converged and delivered through a cloud-based subscription service,” he said. “This provides a more consistent, resilient, scalable and seamless user experience regardless of where the target application a user is trying to access may be hosted. User access can be tightly controlled, and all traffic passes through multiple layers of cloud-based detection and prevention controls.”

Regardless of the framework, organizations should have policies in place for access control and identity management, especially for passwords. As Forrester’s Cunningham noted in “Cyber Warfare,” the password is “the single most prolific means of authentication for enterprises, users, and almost any system on the planet” and the lynchpin of failed security in cyberspace: “Almost everything uses a password at some stage.” Numerous password repositories have been breached, and passwords are frequently recycled, making the password a common security weakness for user accounts as well as IoT devices.

A significant number of consumer-grade IoT devices have also had their default passwords posted online. Weak passwords used in IoT devices also fueled the growth of the Mirai botnet, which led to widespread internet outages in 2016. More recently, unsecured passwords on IoT devices in enterprise settings have reportedly attracted state-sponsored actors’ attention.
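Auditing a fleet for the default credentials that fueled Mirai is straightforward to sketch. The default list and device records below are illustrative; a real audit would draw on vendor-specific credential lists:

```python
# Known default (user, password) pairs of the kind botnets try first.
KNOWN_DEFAULTS = {("admin", "admin"), ("root", "root"), ("admin", "1234")}

devices = [
    {"id": "cam-01", "user": "admin", "password": "admin"},
    {"id": "plc-07", "user": "svc", "password": "Xk9#qLm2"},
]

def flag_default_credentials(fleet):
    """Return ids of devices still running a known default credential pair."""
    return [d["id"] for d in fleet
            if (d["user"], d["password"]) in KNOWN_DEFAULTS]

print(flag_default_credentials(devices))  # ['cam-01']
```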

IoT devices and related systems also need an effective mechanism for device management, including tasks such as patching, connectivity management, device logging, device configuration, software and firmware updates and device provisioning. Device management capabilities also extend to access control modifications and include remediation of compromised devices. It is vital to ensure that device management processes themselves are secure and that a system is in place for verifying the integrity of software updates, which should be regular and not interfere with device functionality.
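Verifying update integrity, as called for above, can be sketched with a digest comparison. Production OTA pipelines should verify a cryptographic signature from the vendor rather than a bare hash, which is shown here only for brevity:

```python
import hashlib

def verify_update(blob: bytes, expected_sha256: str) -> bool:
    """Accept the update only if its digest matches the value published
    through a trusted channel; reject corrupted or tampered blobs."""
    return hashlib.sha256(blob).hexdigest() == expected_sha256

firmware = b"firmware-image-v2.1"                   # stand-in payload
good_digest = hashlib.sha256(firmware).hexdigest()  # as published by the vendor

print(verify_update(firmware, good_digest))           # True
print(verify_update(b"tampered-image", good_digest))  # False
```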

Organizations must additionally address the life span of devices and the cadence of software updates. Many environments allow IT pros to identify a specific end-of-life period and remove or replace expired hardware. In such cases, there should be a plan for device disposal or transfer of ownership. In other contexts, such as in industrial environments, legacy workstations don’t have a defined expiration date and run out-of-date software. These systems should be segmented on the network. Often, such industrial systems cannot be easily patched like IT systems are, requiring security professionals to perform a comprehensive security audit on the system before taking additional steps.

Identify Threats and Anomalies

In recent years, attacks have become so common that the cybersecurity community has shifted its approach from preventing breaches to assuming a breach has already happened. The threat landscape has evolved to the point that cyberattacks against most organizations are inevitable.

“You hear it everywhere: It’s a matter of when, not if, something happens,” said Dan Frank, a principal at Deloitte specializing in privacy and data protection. Matters have only become more precarious in 2020. The FBI has reported a three- to four-fold increase in cybersecurity complaints after the advent of COVID-19.

Advanced defenders have taken a more aggressive stance known as threat hunting, which focuses on proactively identifying breaches. Another popular strategy is to study adversary behaviour and tactics to classify attack types. Models such as the MITRE ATT&CK framework and the Common Vulnerability Scoring System (CVSS) are popular for assessing adversary tactics and vulnerabilities.

While approaches to analyzing vulnerabilities and potential attacks vary according to an organization’s maturity, situational awareness is a prerequisite at any stage. The U.S. Army Field Manual defines the term like this: “Knowledge and understanding of the current situation which promotes timely, relevant and accurate assessment of friendly, enemy and other operations within the battlespace to facilitate decision making.”

In cybersecurity as in warfare, situational awareness requires a clear perception of the elements in an environment and their potential to cause future events. In some cases, the possibility of a future cyber attack can be averted by merely patching software with known vulnerabilities.

Intrusion detection systems can automate some degree of monitoring of networks and operating systems. Intrusion detection systems that are based on detecting malware signatures also can identify common attacks. They are, however, not effective at recognizing so-called zero-day malware, which has not yet been catalogued by security researchers. Intrusion detection based on malware signatures is also ineffective at detecting custom attacks (e.g., a disgruntled employee who knows just enough Python or PowerShell to be dangerous). Sophisticated threat actors who slip through defences to gain network access can become insiders, with permission to view sensitive networks and files. In such cases, situational awareness is a prerequisite to mitigate damage.

Another strategy for intrusion detection systems is to focus on context and anomalies rather than malware signatures. Such systems could use machine learning to learn legitimate commands, use of messaging protocols and so forth. While this strategy overcomes the reliance on malware signatures, it can potentially trigger false alarms. Such a system can also detect so-called slow-rate attacks, a type of denial of service attack that gradually robs networking bandwidth but is more difficult to detect than volumetric attacks.
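A context-based detector of this kind can be approximated with a simple statistical baseline standing in for the machine-learning models mentioned above. The traffic numbers are synthetic:

```python
def baseline(counts):
    """Mean and standard deviation of historical per-minute message counts."""
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    return mean, var ** 0.5

def is_anomalous(count, mean, std, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from baseline."""
    if std == 0:
        return count != mean
    return abs(count - mean) / std > threshold

normal_traffic = [60, 58, 62, 61, 59, 60, 63, 57]   # synthetic history
mean, std = baseline(normal_traffic)

print(is_anomalous(60, mean, std))   # False: an ordinary minute
print(is_anomalous(600, mean, std))  # True: possible exfiltration or DoS
```

The trade-off noted above applies here too: a tight threshold catches slow, subtle deviations but raises more false alarms.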

Respond Effectively to Cyber-Incidents

The foundation for successful cyber-incident response lies in having concrete security policies, architecture and processes. “Once you have a breach, it’s kind of too late,” said Deloitte’s Frank. “It’s what you do before that matters.”

That said, the goal of warding off all cyber-incidents, which range from violations of security policies and laws to data breaches, is not realistic. It is thus essential to implement short- and long-term plans for managing cybersecurity emergencies. Organizations should have contingency plans for addressing possible attacks, practise responding to them through wargaming exercises, and develop effective, coordinated escalation measures for successful breaches.

There are several aspects of the zero trust model that enhance organizations’ ability to respond and recover from cyber events. “Network and micro-segmentation, for example, is a concept by which trust zones are created by organizations around certain classes or types of assets, restricting the blast radius of potentially destructive cyberattacks and limiting the ability for an attacker to move laterally within the environment,” Rafla said. Also, efforts to automate and orchestrate zero trust principles can enhance the efficiency of security operations, speeding efforts to mitigate attacks. “Repetitive and manual tasks can now be automated and proactive actions to isolate and remediate security threats can be orchestrated through integrated controls,” Rafla added.

Response to cyber-incidents involves coordinating multiple stakeholders beyond the security team. “Every business function could be impacted — marketing, customer relations, legal compliance, information technology, etc.,” Frank said.

A six-tiered model for cyber incident response from the SANS Institute contains the following steps:

  • Preparation: Preparing the team to react to events ranging from cyberattacks to hardware failure and power outages.
  • Identification: Determining if an operational anomaly should be classified as a cybersecurity incident, and how to respond to it.
  • Containment: Segmenting compromised devices on the network long enough to limit damage in the event of a confirmed cybersecurity incident. Longer-term containment measures involve hardening affected systems so they can support normal operations while remediation proceeds.
  • Eradication: Removing or restoring compromised systems. If a security team detects malware on an IoT device, for instance, this phase could involve reimaging its hardware to prevent reinfection.
  • Recovery: Integrating previously compromised systems back into production and ensuring they operate normally after that. In addition to addressing the security event directly, recovery can involve crisis communications with external stakeholders such as customers or regulators.
  • Lessons Learned: Documenting and reviewing the factors that led to the cyber-incident and taking steps to avoid future problems. Feedback from this step should create a feedback loop providing insights that support future preparation, identification, etc.
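The six phases form a loop, with lessons learned feeding back into preparation. A minimal sketch makes the cycle explicit:

```python
SANS_PHASES = ["preparation", "identification", "containment",
               "eradication", "recovery", "lessons_learned"]

def next_phase(current):
    """Advance through the incident-response cycle; after lessons learned
    the process wraps back to preparation."""
    i = SANS_PHASES.index(current)
    return SANS_PHASES[(i + 1) % len(SANS_PHASES)]

print(next_phase("containment"))      # eradication
print(next_phase("lessons_learned"))  # preparation
```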

While the bulk of the SANS model focuses on cybersecurity operations, the last step should be a multidisciplinary process. Investing in cybersecurity liability insurance to offset risks identified after ongoing cyber-incident response requires support from upper management and the legal team. Ensuring compliance with the evolving regulatory landscape also demands feedback from the legal department.

A central practice that can prove helpful is documentation — not just for security incidents, but as part of ongoing cybersecurity assessment and strategy. Organizations with mature security documentation tend to be better positioned to deal with breaches.

“If you fully document your program — your policies, procedures, standards and training — that might put you in a more favourable position after a breach,” Frank explained. “If you have all that information summarized and ready, in the event of an investigation by a regulatory authority after an incident, it shows the organization has robust programs in place.”

Documenting security events and controls can help organizations become more proactive and more capable of embracing automation and machine learning tools. As they collect data, they should repeatedly ask how to make the most of it. KPMG’s Haward-Grau said cybersecurity teams should consider the following questions:

  • What data should we focus on?
  • What can we do to improve our operational decision making?
  • How do we reduce our time and costs efficiently and effectively, given the nature of the reality in which we’re operating?

Ultimately, answering those questions may involve using machine learning or artificial intelligence technology, Haward-Grau said. “If your business is using machine learning or AI, you have to digitally enable them so that they can do what they want to do,” he said.

Finally, documenting security events and practices as they relate to IoT devices and beyond can be useful in evaluating the effectiveness of cybersecurity spending and provide valuable feedback for digital transformation programs. “Security is a foundational requirement that needs to be ingrained holistically in architecture and processes and governed by policies,” said Chander Damodaran, chief architect at Brillio, a digital consultancy firm. “Security should be a common denominator.”

IoT Security

Recent legislation requires businesses to assume responsibility for protecting Internet of Things (IoT) devices. “Security by Design” approaches are essential, since successful applications deploy millions of units and analysts predict billions of devices deployed in the next five to ten years. The cost of fixing compromised devices later could overwhelm a business.

Security risks can never be eliminated: there is no single solution for all concerns, and the cost to counter every possible threat vector is prohibitively expensive. The best we can do is minimize the risk, and design devices and processes to be easily updatable.

It is best to assess damage potential and implement security methods accordingly. For example, for temperature and humidity sensors used in environmental monitoring, data protection needs are not as stringent as devices transmitting credit card information. The first may require anonymization for privacy, and the second may require encryption to prevent unauthorized access.
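That proportionality idea can be sketched as a routing rule: low-risk telemetry is pseudonymised, while payment data is handed to an encrypting pipeline. The salt handling and routing rules below are illustrative; real encryption should use a vetted library such as an AES-GCM implementation, which is not shown:

```python
import hashlib

SALT = b"rotate-me"   # illustrative; manage real salts as secrets

def pseudonymise(device_id: str) -> str:
    """Stable pseudonym: the same device always maps to the same token,
    but the raw identifier never leaves the edge."""
    return hashlib.sha256(SALT + device_id.encode()).hexdigest()[:16]

def protection_for(reading):
    """Match the control to the damage a leak of this reading could cause."""
    if reading["kind"] == "payment":
        return "encrypt"       # hand off to an AES-GCM/TLS pipeline (not shown)
    return "pseudonymise"

print(protection_for({"kind": "temperature"}))  # pseudonymise
print(protection_for({"kind": "payment"}))      # encrypt
print(pseudonymise("greenhouse-4") == pseudonymise("greenhouse-4"))  # True
```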

Overall Objectives

Senders and receivers must authenticate. IoT devices must transmit to the correct servers and ensure they receive messages from the correct servers.

Mission-critical applications, such as vehicle crash notification or medical alerts, may fail if the connection is not reliable. Lack of communication itself is a lack of security.

Connectivity errors can make good data unreliable, and actions on the content may be erroneous. It is best to select connectivity providers with strong security practices—e.g., whitelisting access and traffic segregation to prevent unauthorized communication.


IoT Security: 360-Degree Approach

Finally, only authorized recipients should access the information. In particular, privacy laws require extra care in accessing the information on individuals.

Data Chain

Developers should implement security best practices at all points in the chain. While traditional IT security protects servers with access controls, intrusion detection and the like, the farther from the servers that best practices are implemented, the less impact a breach of a remote IoT device has on the overall application.

For example, compromised sensors might send bad data, and servers might take incorrect actions despite data filtering. Gateways, which have the compute capacity for encryption and can receive over-the-air (OTA) security fixes, therefore offer an ideal location for security.

Servers often automate responses on data content. Simplistic and automated responses to bad data could cascade into much greater difficulty. If devices transmit excessively, servers could overload and fail to provide timely responses to transmissions—retry algorithms resulting from network unavailability often create data storms.
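Data storms typically arise when many devices retry in lockstep after an outage. Exponential backoff with jitter desynchronises them; the base delay and cap below are illustrative:

```python
import random

def backoff_delay(attempt, base=1.0, cap=300.0, rng=random.random):
    """Full-jitter exponential backoff: the retry delay grows with each
    attempt but is randomised so a fleet does not retry in unison."""
    ceiling = min(cap, base * (2 ** attempt))
    return rng() * ceiling

# Third retry lands somewhere in [0, 8) seconds; delays never exceed the cap
print(backoff_delay(3) < 8.0)      # True
print(backoff_delay(30) <= 300.0)  # True
```

Because each device draws a different random delay, retries spread out over the window instead of hammering the server at the same instant.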

IoT devices often use electrical power rather than batteries, and compromised units could continue to operate for years. Implementing over-the-air (OTA) functions for remotely disabling devices could be critical.

When a breach requires device firmware updates, OTA support is vital when devices are inaccessible or large numbers of units must be modified rapidly. All devices should support OTA, even if it increases costs—for example, adding memory for managing multiple “images” of firmware for updates.
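A common way to manage those multiple firmware images is a dual-slot (A/B) layout: the update is written to the spare slot and only activated if it validates, so a failed update never removes the known-good image. The sketch below simplifies slot management and validation:

```python
class DualSlotDevice:
    """Two firmware slots: one active and known-good, one spare for staging."""

    def __init__(self, current_image):
        self.slots = {"A": current_image, "B": None}
        self.active = "A"

    def spare(self):
        return "B" if self.active == "A" else "A"

    def apply_update(self, image, validate):
        """Write the update to the spare slot and switch only if it
        validates; otherwise keep booting the known-good image."""
        slot = self.spare()
        self.slots[slot] = image
        if validate(image):
            self.active = slot
            return True
        return False

dev = DualSlotDevice("v1.0")
print(dev.apply_update("v1.1", lambda img: img.startswith("v")))     # True
print(dev.slots[dev.active])                                         # v1.1
print(dev.apply_update("corrupt", lambda img: img.startswith("v")))  # False
print(dev.slots[dev.active])                                         # v1.1
```

The extra memory for the second image is exactly the cost trade-off mentioned above.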

In summary, IoT security best practices of authentication, encryption, remote device disable, and OTA for security fixes, along with traditional IT server protection, offers the best chance of minimizing risks of attacks on IoT applications.


The benefits of IoT data are widely touted. Enhanced operational visibility, reduced costs, improved efficiencies and increased productivity have driven organizations to take major strides towards digital transformation. With countless promising business opportunities, it’s no surprise that IoT is expanding rapidly and relentlessly. It is estimated that there will be 75.4 billion IoT devices by 2025. As IoT grows, so do the volumes of IoT data that need to be collected, analyzed and stored. Unfortunately, significant barriers exist that can limit or block access to this data altogether.

Successful IoT data acquisition starts and ends with reliable and scalable IoT connectivity. Selecting the right communications technology is paramount to the long-term success of your IoT project and various factors must be considered from the beginning to build a functional wireless infrastructure that can support and manage the influx of IoT data today and in the future.

Here are five IoT architecture must-haves for unlocking IoT data at scale.

1. Network Ownership

For many businesses, IoT data is one of their greatest assets, if not the most valuable one. This intensifies the demand to protect the flow of data at all costs. With maximum data authority and architecture control, privately managed networks are becoming prevalent across industrial verticals.

Beyond the undeniable benefits of data security and privacy, private networks give users more control over their deployment, with the flexibility to tailor coverage to the specific needs of their campus-style network. On a public network, users risk not having the reliable connectivity needed for indoor, underground and remote critical IoT applications. And since a private network is owned and operated by the user, they also avoid the monthly access fees, data plans and subscription costs imposed by public operators, lowering the overall total cost of ownership. Private networks also provide full control over network availability and uptime, ensuring users have reliable access to their data at all times.

2. Minimal Infrastructure Requirements

Since the number of end devices is largely fixed by your IoT use cases, choosing a wireless technology that requires minimal supporting infrastructure, such as base stations and repeaters, as well as minimal configuration and optimization, is crucial to cost-effectively scaling your IoT network.

Wireless solutions with long range and excellent penetration capability, such as next-gen low-power wide area networks, require fewer base stations to cover a vast, structurally dense industrial or commercial campus. Likewise, a robust radio link and large network capacity allow an individual base station to effectively support massive numbers of sensors without compromising performance, ensuring a continuous flow of IoT data today and in the future.
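To get a feel for why frequency matters here, the free-space path loss formula can be inverted to estimate maximum range from a link budget. The numbers below are purely illustrative (free space is a best-case lower bound; real industrial sites add wall and clutter losses on top), but they show why a sub-GHz LPWAN link reaches much farther than a 2.4 GHz link with the same budget.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB for distance in km and frequency in MHz
    (the textbook formula: 20*log10(d) + 20*log10(f) + 32.44)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def max_range_km(link_budget_db, freq_mhz):
    """Invert FSPL to estimate the best-case range for a given link budget."""
    return 10 ** ((link_budget_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)

# Illustrative comparison: the same ~140 dB link budget at two frequencies.
print(round(max_range_km(140, 868), 1), "km best-case at 868 MHz")
print(round(max_range_km(140, 2400), 1), "km best-case at 2400 MHz")
```

Every extra dB of link budget or every step down in frequency translates directly into fewer base stations needed per campus.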

3. Network and Device Management

As IoT initiatives move beyond proofs-of-concept, businesses need an effective and secure approach to operate, control and expand their IoT network with minimal costs and complexity.

As IoT deployments scale to hundreds or even thousands of geographically dispersed nodes, a manual approach to connecting, configuring and troubleshooting devices is inefficient and expensive. Likewise, by leaving devices completely unattended, users risk losing business-critical IoT data when it’s needed the most. A network and device management platform provides a single-pane, top-down view of all network traffic, registered nodes and their status for streamlined network monitoring and troubleshooting. Likewise, it acts as the bridge between the edge network and users’ downstream data servers and enterprise applications so users can streamline management of their entire IoT project from device to dashboard.

4. Legacy System Integration

Most traditional assets, machines, and facilities were not designed for IoT connectivity, creating huge data silos. This leaves companies with two choices: building entirely new, greenfield plants with native IoT technologies or updating brownfield facilities for IoT connectivity. Highly integrable, plug-and-play IoT connectivity is key to streamlining the costs and complexity of an IoT deployment. Businesses need a solution that can bridge the gap between legacy OT and IT systems to unlock new layers of data that were previously inaccessible. Wireless IoT connectivity must be able to easily retrofit existing assets and equipment without complex hardware modifications and production downtime. Likewise, it must enable straightforward data transfer to the existing IT infrastructure and business applications for data management, visualization and machine learning.

5. Interoperability

Each IoT system is a mashup of diverse components and technologies. This makes interoperability a prerequisite for IoT scalability, to avoid being saddled with an obsolete system that fails to keep pace with new innovation later on. By designing an interoperable architecture from the beginning, you can avoid fragmentation and reduce the integration costs of your IoT project in the long run. 

Today, technology standards exist to foster horizontal interoperability by fueling global cross-vendor support through robust, transparent and consistent technology specifications. For example, a standard-based wireless protocol allows you to benefit from a growing portfolio of off-the-shelf hardware across industry domains. When it comes to vertical interoperability, versatile APIs and open messaging protocols act as the glue to connect the edge network with a multitude of value-deriving backend applications. Leveraging these open interfaces, you can also scale your deployment across locations and seamlessly aggregate IoT data across premises.  
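As a toy illustration of that "glue", here is how an edge device might package a reading into a wildcard-friendly topic hierarchy and a JSON payload. The site/device/metric topic scheme is an invented convention for the example, not a standard; the point is that any consistent, documented scheme lets backend applications subscribe with wildcards (e.g. `plant-1/+/temperature`) without caring which vendor made the sensor.

```python
import json
import time

def make_message(site, device_id, metric, value, unit):
    """Assemble an MQTT-style topic string and a JSON payload for one
    sensor reading. Hierarchical topics plus self-describing JSON are
    what make the edge network easy to wire into backend applications."""
    topic = f"{site}/{device_id}/{metric}"
    payload = json.dumps(
        {"value": value, "unit": unit, "ts": int(time.time())},
        sort_keys=True,
    )
    return topic, payload

topic, payload = make_message("plant-1", "sensor-42", "temperature", 21.5, "C")
print(topic)
print(payload)
```

A backend subscribed to `plant-1/+/temperature` would receive this message regardless of which device published it.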

IoT data is the lifeblood of business intelligence and competitive differentiation and IoT connectivity is the crux to ensuring reliable and secure access to this data. When it comes to building a future-proof wireless architecture, it’s important to consider not only existing requirements, but also those that might pop up down the road. A wireless solution that offers data ownership, minimal infrastructure requirements, built-in network management and integration and interoperability will not only ensure access to IoT data today, but provide cost-effective support for the influx of data and devices in the future.

Originally posted here.

Read more…

by Ariane Elena Fuchs

Solar power, wind energy, micro cogeneration power plants: energy from renewable sources has become indispensable, but it makes power generation and distribution far more complex. How the Internet of Things is helping make energy management sustainable.

It feels like Groundhog Day yet again – in 2020 it happened on August 22. That was the point when the demand for raw materials exceeded the Earth's supply and capacity to reproduce these natural resources. All reserves consumed from that date on cannot be regenerated within the current year. In other words, humanity is living beyond its means, consuming around 50 percent more energy than the Earth provides naturally.

To conserve these precious resources and reduce climate-damaging CO2 emissions, the energy we need must come from renewable sources such as wind, sun and water. This is the only way to reduce both greenhouse gases and our fossil fuel use. Fortunately, a start has been made: in 2019, renewable energies – predominantly wind and solar – already covered almost 43 percent of Germany's energy requirements, and the trend is rising.

DECENTRALIZING ENERGY PRODUCTION

This also means, however, that the traditional energy management model – a few power plants supplying many consumers – is outdated. After all, the phasing out of large nuclear and coal-fired power plants doesn't just have consequences for Germany's CO2 balance. Shifting electricity production to wind, solar and smaller cogeneration plants reverses the previous pattern of energy generation and distribution, from a highly centralized to an increasingly decentralized organizational structure. Instead of just a few large power plants sending electricity to the grid, there are now many smaller energy sources such as solar panels and wind turbines, and it is up to the energy sector to wrangle this challenging transformation. As the country's energy becomes more sustainable, it also becomes harder to organize: energy generated from wind and sun cannot be planned in advance as easily as coal and nuclear power can, and with thousands of wind turbines and solar panels feeding electricity into the grid, managing the power network has become extremely difficult. In particular, there is a lack of real-time information about the amount of electricity being generated.

KEY TECHNOLOGY IOT: FROM ENERGY FLOW TO DATA STREAM

This is where the Internet of Things comes into play: IoT can supply exactly this data from every power generator and send it to a central location. Once there, it can be evaluated before ideally enabling the power grid to be controlled automatically. The result is an IoT ecosystem. In order to operate wind farms more efficiently and reliably, a project team is currently developing an IoT-supported system that bundles and processes all relevant parameters and readings at a wind farm. They can then reconstruct the current operating and maintenance status of individual turbines. This information can be used to detect whether certain components are about to wear out and replace them before a turbine fails.
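A deliberately simple sketch of that wear-detection idea is below: it compares a rolling mean of a wear indicator (say, bearing vibration) against a healthy baseline. The threshold factor and window size are invented for illustration; a production system would use proper statistical or ML models trained on historical turbine data.

```python
def wear_alert(readings, baseline, window=5, factor=1.5):
    """Flag a component when the rolling mean of its wear indicator
    exceeds `factor` times the healthy baseline. A stand-in for the
    richer condition-monitoring models a real wind-farm system uses."""
    if len(readings) < window:
        return False  # not enough data yet to judge
    recent = readings[-window:]
    return sum(recent) / window > factor * baseline

healthy = [1.0, 1.1, 0.9, 1.0, 1.05]   # vibration hovering near baseline
worn = [1.0, 1.2, 1.6, 1.9, 2.2]       # vibration trending upward
print(wear_alert(healthy, baseline=1.0))
print(wear_alert(worn, baseline=1.0))
```

The value of the IoT layer is that this comparison can run continuously, per component, across an entire wind farm, so the replacement happens before the turbine fails.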

POTENTIAL FOR NEW BUSINESS MODELS

According to a recent Gartner study, the Internet of Things (IoT) is becoming a key technology for monitoring and orchestrating the complex energy and water ecosystem. In addition, consumers want more control over energy prices and more environmentally friendly power products. With the introduction of smart metering, data from so-called prosumers is becoming increasingly important. These energy-producing consumers operate, for example, the photovoltaic systems on their own roofs. IoT sensors are used to collect the necessary power generation information. Although these sensors are only used locally and for specific purposes, they provide energy companies with a great deal of data. In order to use the potential of this information for the expansion of renewable energy, it must be combined and evaluated intelligently. According to Gartner, IoT has the potential to change the energy value chain in four key areas: it enables new business models, optimizes asset management, automates operations and digitalizes the entire value chain from energy source to kWh.

ENERGY TRANSITION REQUIRES TECHNOLOGICAL CHANGE

Installing smaller power-generating systems will soon no longer pose the greatest challenge for operators. In the near future, coherently linking, integrating and controlling them will be the order of the day. The energy transition is therefore spurring technological change on a grand scale. For example, smart grids will only function properly and increase overall capacity when data on generation, consumption and networks is available in real-time. The Internet of Things enables the necessary fast data processing, even from the smallest consumers and prosumers on the grid. With the help of the Internet of Things, more and more household appliances can communicate with the Internet. These devices are then in turn connected to a smart meter gateway, i.e. a hub for the intelligent management of consumers, producers and storage locations at private households and commercial enterprises. To be able to use the true potential of this information, however, all the data must flow together into a common data platform, so that it can be analyzed intelligently.
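As a simplified illustration of what such a common data platform computes, the sketch below aggregates smart-meter readings into the net load the grid must supply. The field names are invented for the example; the idea is simply consumption minus local prosumer generation, where a negative result means the neighbourhood is exporting power to the grid.

```python
def net_grid_load(meter_readings):
    """Aggregate smart-meter readings into net grid load in kW:
    total consumption minus total local (prosumer) generation.
    Negative values mean the area is feeding power into the grid."""
    consumption = sum(r["consumption_kw"] for r in meter_readings)
    generation = sum(r.get("generation_kw", 0.0) for r in meter_readings)
    return consumption - generation

readings = [
    {"meter": "house-1", "consumption_kw": 2.0, "generation_kw": 3.5},  # rooftop PV
    {"meter": "house-2", "consumption_kw": 1.2},                         # consumer only
    {"meter": "shop-1", "consumption_kw": 4.0, "generation_kw": 0.5},
]
print(net_grid_load(readings), "kW net load")
```

Run in real time across thousands of meters, this is exactly the generation-versus-consumption picture a smart grid needs to balance supply and demand.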

FROM INDIVIDUAL APPLICATIONS TO AN ECOSYSTEM

For the transmission of data from the Internet of Things, Germany has national fixed-line and mobile networks available. New technology such as the 5G mobile standard allows data to be securely and reliably transferred to the cloud, either directly via the public 5G network or via a 5G campus network. Software for data analytics and AI tailored to energy firms is now available, including monitoring, analysis, forecasting and optimization tools. Any analyzed data can be accessed via web browsers and in-house data centers. Taken together, it all provides the energy sector with a comprehensive IoT ecosystem for the future.

Originally posted here.

Read more…

Earlier, Artificial Intelligence was just a notion we experienced in Sci-Fi movies and documentaries, but now it's making a huge impact on society as well as the business world. 

Today, modern technologies like Artificial Intelligence and Machine Learning are no longer just trendy words. They have been around for more than a decade, but we have only been harnessing their power for the last couple of years. Due to its greater data processing speed and advanced prediction capabilities, Artificial Intelligence is making its way into our daily lives.

Artificial Intelligence is a vast term that refers to any software or application that studies human preferences during interactions. There are several real-life artificial intelligence applications, such as Netflix's movie recommendations, Apple's Siri, Amazon's personalized shopping emails, and Google's DeepMind. In this way, AI is changing business dynamics and allowing businesses to up their game in a positive way.

AI is now being implemented across businesses of all kinds. As per a report published by PwC, AI will contribute $15.7 trillion to the global economy by 2030. Businesses that once considered AI a futuristic technology are now investing in it because it will reap many benefits and improve their offerings.

From assisting customers and improving their overall experience, to automating certain business tasks, to crafting personalized solutions, AI has the potential to revamp your modern business.

5 Ways AI is Accelerating Modern Business World


According to Salesforce's report, usage of AI is no longer limited to large corporations; SMBs are also using AI technology to achieve higher business growth.

We live in an age of technology, and businesses are evolving in line with consumer demand. Small businesses are slowly automating their operations in order to capture market share, while large companies are revamping their existing strategies to establish their brands.

It means everyone in the business world can now enjoy their proportionate share of revenue. Thanks to technologies such as AI and ML, businesses are improving customer relationships and enabling owners to make informed decisions with accurate insights. From Siri to self-driving cars, AI is transforming our lives in ways that simulate human behavior on another level.

AI is nothing but technology that can solve problems without human interference and come up with rational solutions with the help of:

  • Knowledge
  • Reasoning
  • Perception
  • Learning
  • Planning 

Popular brands like Amazon, Apple, Tesla, etc., are using this technology to transform our current and future lives. The biggest benefit AI has given to businesses is that it has eliminated human intervention in tedious or mundane tasks. Here we have addressed five ways how AI is revolutionizing modern businesses and improving business operations. 

AI Ameliorates Customer Experience

These days, customers expect lightning-fast responses from brands, and AI-based advancements allow businesses to integrate voice search or chatbots into their strategy. AI-based chatbots improve the customer experience and resolve customer queries in seconds. Moreover, chatbots can also help customers find the most suitable products based on their preferences.

Built using smart technologies, chatbots are getting more attention, especially in eCommerce and on-demand businesses. The online food delivery business has experienced major growth, and entrepreneurs are now integrating chatbots into their restaurants' online ordering systems so that customers can ask questions related to their orders and get them resolved in minutes. Along with good customer support, chatbots and virtual assistants have several capabilities, such as:

  1. NLP that can interpret voice-based interactions with customers
  2. Resolve customer's queries through accurate insights
  3. Provide personalized offerings to customers

In short, chatbots interact with your customers, assist them round the clock, and provide a personalized experience without getting frustrated. 
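Production chatbots use trained NLP models, but the core routing idea behind them can be sketched with simple keyword-overlap intent matching. The intents, keywords, and responses below are invented for illustration; the unmatched case shows the escalate-to-a-human fallback every chatbot needs.

```python
RESPONSES = {
    "order_status": "Let me check on your order right away.",
    "refund": "I can start a refund for you.",
    "greeting": "Hi! How can I help you today?",
}

INTENTS = {
    "order_status": {"where", "order", "status", "track", "delivery"},
    "refund": {"refund", "cancel", "return"},
    "greeting": {"hi", "hello", "hey"},
}

def classify(utterance):
    """Pick the intent whose keyword set overlaps the utterance most;
    return None when nothing matches so the bot can escalate."""
    words = set(utterance.lower().split())
    best, best_score = None, 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best, best_score = intent, score
    return best

def respond(utterance):
    intent = classify(utterance)
    if intent is None:
        return "Let me connect you with a human agent."
    return RESPONSES[intent]

print(respond("where is my order"))
print(respond("I want a refund"))
```

A real deployment would replace `classify` with an NLP intent model, but the surrounding routing and fallback logic looks much the same.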

Cut Down on Recruiting and Onboarding Cost

According to Deloitte, more than 40% of companies are now using AI in their human resource operations in order to gain long-term benefits. More and more organizations are adopting AI-based technologies, while others are still on their way to this transformation.

A few of the ways in which AI can play a major role in HR operations are:

  • Onboarding
  • Talent acquisition 
  • HR management 
  • Sentiment analysis
  • Performance management 

Overall, implementing AI in HR speeds up the hiring process, cuts down administrative costs, scans thousands of candidates in just a couple of hours, and reduces bias against candidates. 


According to Oracle, HR departments are more likely to use AI to source the best talent, as it automates certain tasks and comes up with the best results without any bias. The usage of AI is not limited to delivering quality customer support; it has also significantly impacted organizations' recruitment and onboarding processes. Companies already using AI have admitted that it has resulted in noticeable benefits.

Generate Sales

According to Gartner, it is projected that by 2021, 30% of businesses will use AI in their existing sales strategies. Businesses that do not integrate AI into their existing CRM software will be left behind, since AI can do wonders for customer experience and sales alike.

When you leverage AI into your organization's CRM:

  • It studies customer data
  • Based on that data, your sales team can focus its efforts
  • It helps predict whether a customer is interested or not

Based on a customer's browsing history, personal details, and behavior, you can turn a visitor into a lead with personalized marketing tactics like promotional emails and offers, which ultimately boost sales and customer engagement. Moreover, you can also get started with paid advertising campaigns based on demographics and insights.
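A hedged sketch of how such visitor-to-lead scoring might look is below. The signals and weights are invented for illustration; a real AI-enabled CRM would learn them from historical conversion data rather than hard-coding them.

```python
def lead_score(visitor):
    """Toy lead-scoring rule: combine a few behavioural signals into a
    0-100 score the sales team can sort by. Signals and weights are
    illustrative placeholders, not a real model."""
    score = 0
    score += min(visitor.get("pages_viewed", 0), 10) * 4          # up to 40 points
    score += 30 if visitor.get("pricing_page_visited") else 0      # strong buying signal
    score += 20 if visitor.get("email_opened") else 0              # engaged with outreach
    score += 10 if visitor.get("returning") else 0                 # came back at least once
    return min(score, 100)

visitor = {
    "pages_viewed": 6,
    "pricing_page_visited": True,
    "email_opened": False,
    "returning": True,
}
print(lead_score(visitor))
```

Sorting visitors by a score like this is what lets the sales team focus its effort on the leads most likely to convert.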

Improve Recommendations for Customers

Leveraging AI, brands can analyze data more smartly to predict customer behavior and craft their marketing strategy around customer preferences and interests. This level of personalization delivers a seamless customer experience and makes customers feel valued. But for that, you need to understand customers' demands first.

For example, Starbucks uses "My Starbucks Barista," which utilizes AI to enable customers to place orders with voice technology or messaging. This level of personalization helps brands suggest better products and connect with customers.

Help You Re-target Online Ads

Running paid advertising campaigns on Google or social media is a cost-effective and powerful way to grab users' attention. Hence, targeting the right set of audiences using AI and ML algorithms helps you study user preferences for better conversions.

For instance, if you have an online delivery business, machine learning studies your audience's behavior and sentiments and helps you re-target them with your best offerings. Moreover, advanced AI algorithms help you target customers at the right time, encouraging them to make a decision.

Summing Up

The introduction of AI-powered technology in the modern business world has allowed enterprises to implement crafted, well-researched methods to achieve long-term business goals. Artificial Intelligence plays a crucial role in resolving customers' issues in real time with innovative solutions, and it increases businesses' productivity.

Therefore, we can conclude that AI's capabilities speed up decision-making and solve real problems with smart solutions. Indeed, AI is here to stay, and it will deliver multiple benefits to businesses for a long time to come.

Read more…

by Olivier Pauzet

Over the past year, we have seen the Industrial IoT (IIoT) take an important step forward, crossing the chasm that previously separated IIoT early adopters from the majority of companies.

New solutions like Octave, Sierra Wireless’ edge-to-cloud solution for connecting industrial assets, have greatly simplified the IIoT, making it possible now for practically any company to securely extract, transmit, and act on data from bio-waste collectors, liquid fertilizer tanks, water purifiers, hot water heaters and other industrial equipment.

So, what IIoT trends will these 2020 developments lead to in 2021? I expect that they will drive greater adoption of the IIoT next year, as manufacturing, utility, healthcare, and other organizations further realize that they can help their previously silent industrial assets speak using the APIs integrated in new IoT solutions. At the same time, I expect we will start to see the development of some revolutionary IIoT applications that use 5G’s Ultra-Reliable, Low-Latency Communications (URLLC) capabilities to change the way our factories, electric grid, and healthcare systems operate.

In 2021, Industrial Equipment APIs Will Give Quiet Equipment A Voice

Cloud APIs have transformed the tech industry, and with it, our digital economy. By enabling SaaS and other cloud-based applications to easily and securely talk to each other, cloud APIs have vastly expanded the value of these applications to users. These APIs have also spawned billion-dollar companies like Stripe, Tableau, and Twilio, whose API-focused business models have transformed the online payments, data visualization, and customer service markets.

2021 will be the year industrial companies begin seeing their markets transformed by APIs, as more of these companies begin using industrial equipment APIs built into new IIoT solutions to enable their industrial assets to talk to the cloud.

Using new edge-to-cloud solutions, like Octave, with built-in industrial equipment APIs for Modbus and other industrial communications protocols, these companies will be able to securely connect these assets to the cloud almost as easily as if this equipment were a cloud-based application.

In fact, by simply plugging a low-cost IoT gateway with these IIoT APIs into their industrial equipment, they will be able to deploy IIoT applications that allow them to remotely monitor, maintain, and control this equipment. Then, using these applications, they can lower equipment downtime, reduce maintenance costs, launch new Equipment-as-a-Service business models, and innovate faster.
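To make the Modbus side of this concrete, here is what one of those equipment polls looks like on the wire. The sketch builds a standard Modbus RTU "Read Holding Registers" (function 0x03) request frame, including the CRC-16 that every RTU frame carries; a gateway issues requests like this to sample registers from legacy equipment before relaying the values to the cloud.

```python
def crc16_modbus(data: bytes) -> bytes:
    """Modbus RTU CRC-16 (polynomial 0xA001, init 0xFFFF), appended
    little-endian to every frame so the receiver can detect corruption."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            lsb = crc & 1
            crc >>= 1
            if lsb:
                crc ^= 0xA001
    return crc.to_bytes(2, "little")

def read_holding_registers(unit_id, start_addr, count):
    """Build a Modbus RTU 'Read Holding Registers' (0x03) request:
    unit id, function code, big-endian start address and register
    count, followed by the CRC."""
    pdu = bytes([unit_id, 0x03]) + start_addr.to_bytes(2, "big") + count.to_bytes(2, "big")
    return pdu + crc16_modbus(pdu)

# Poll registers 0-1 on the device at unit address 1.
frame = read_holding_registers(unit_id=1, start_addr=0, count=2)
print(frame.hex())
```

In practice a gateway uses a Modbus library rather than hand-built frames, but this is the byte-level protocol that the "industrial equipment API" abstracts away.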

Industrial companies have been trying to connect their assets to the cloud for years, but have been stymied by the complexity, time, and expense involved in doing so. In 2021, industrial equipment APIs will provide these companies with a way to simply, quickly, and cheaply connect this equipment to the cloud. By giving a voice to billions of pieces of industrial equipment, these Industrial IoT APIs will help bring about the productivity, sustainability, and other benefits Industry 4.0 has long promised.

In 2021 Manufacturing, Utility and Healthcare Will Drive Growth of the Industrial IoT

Until recently, the consumer sector, and especially the smart home market, has led the way in adopting the IoT, as the success of the Google Nest smart thermostat, the Amazon Echo smart speaker and Ring smart doorbell, and the Phillips Hue smart lights demonstrate. However, in 2021 another IIoT trend we can expect to see is the industrial sector starting to catch up to the consumer market regarding the IoT, with the manufacturing, utility, and healthcare markets leading the way.

For example, new IIoT solutions now make it possible for Original Equipment Manufacturers (OEMs) and other manufacturing companies to simply plug their equipment into the IIoT and begin acting on data from this equipment almost immediately. This has lowered the time to value for IIoT applications to the point where companies can begin reaping financial benefits greater than the total cost for their IIoT application in a few short months.

At this point, manufacturers who don’t have a plan to integrate the IIoT into their assets are, to put it bluntly, leaving money on the table – money their competitors will happily snap up with their own new connected industrial equipment offerings if they do not.

Like manufacturing companies, utilities will ramp up their use of the IIoT in 2021, as they seek to improve their operational efficiency, customer engagement, reliability, and sustainability. For example, utilities will increasingly use the IIoT to perform remote diagnostics and predictive maintenance on their grid infrastructure, reducing this equipment’s downtime while also lowering maintenance costs. In addition, a growing number of utilities will use the IIoT to collect and analyze data on their wind, solar and other renewable energy generation portfolios, allowing them to reduce greenhouse gas emissions while still balancing energy supply and demand on the grid.

Along with manufacturing and utilities, healthcare is the third market sector I expect to lead the way in adopting the IIoT in 2021. The COVID-19 pandemic has demonstrated to healthcare providers how connectivity – such as Internet-based telemedicine solutions -- can improve patient outcomes while reducing their costs. In 2021 they will increase their use of the IIoT, as they work to extend this connectivity to patient monitors, scanners and other medical devices. With the Internet of Medical Things (IoMT), healthcare providers will be better able to prepare patient treatments, remotely monitor and respond to changes to their patients’ conditions, and generate health care treatment documents.

Revolutionary Ultra-Reliable, Low-Latency 5G Applications Will Begin to Be Developed

There is a lot of buzz regarding 5G New Radio (NR) in the IIoT market. However, having been designed to co-exist with 4G LTE, most of 5G NR’s impact in this market is still evolutionary, not revolutionary. Companies are beginning to adopt 5G to wring better performance out of their existing IIoT applications, or to future-proof their connectivity strategies. But they are doing this while continuing to use LTE, as well as Low Power Wide Area (LPWA) 5G technologies, like LTE-M and NB-IoT, for now.

In 2021, however, I think we will begin to see companies develop revolutionary new IIoT application proofs of concept designed to take advantage of 5G NR's Ultra-Reliable, Low-Latency Communications (URLLC) capabilities. These URLLC applications – including smart Automated Guided Vehicles (AGVs) for manufacturing, self-healing energy grids for utilities, and remote surgery for healthcare – are simply not possible with existing wireless technologies.

Thanks to its ability to deliver ultra-high reliability and latencies as low as one millisecond, 5G NR enables companies to finally build URLLC applications – especially when 5G NR is used in conjunction with new edge computing technologies.

It will be a long time before any of these URLLC application proofs of concept are commercialized. But as far as 5G Wave 5+ is concerned, next year is when we will first begin to see this wave forming out at sea. And when it eventually reaches shore, it will have a revolutionary impact on our connected economy.

Originally posted here.

Read more…

When analyzing whether a machine learning model works well, we rely on accuracy numbers, F1 scores and confusion matrices - but they don't give any insight into why a machine learning model misclassifies data. Is it because the data looks very similar, because the data is mislabeled, or because preprocessing parameters were chosen incorrectly? To answer these questions we have now added the feature explorer to all neural network blocks in Edge Impulse. The feature explorer shows your complete dataset in one 3D graph, and shows you whether data was classified correctly or incorrectly.


Showing exactly which data samples are misclassified in the feature explorer.

If you haven't used the feature explorer before: it's one of the most interesting options in Edge Impulse. The axes are the output of the signal processing process (we rely heavily on signal processing to extract interesting features beforehand, making for smaller and more reliable ML models), and they let you quickly validate whether your data separates nicely. In addition, the feature explorer is integrated into Live classification, where you can compare incoming test data directly with your training set.
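Edge Impulse's DSP blocks compute much richer features (spectral power, MFCCs, and so on), but the idea can be sketched with a few simple time-domain features; each processed window of raw sensor data becomes one point in a scatter plot like the feature explorer's. This is an illustrative stand-in, not Edge Impulse's actual processing code.

```python
import math

def extract_features(window):
    """Three simple time-domain features per window: RMS energy,
    peak amplitude, and zero-crossing rate. Windows from different
    classes should land in different regions of this feature space."""
    n = len(window)
    rms = math.sqrt(sum(x * x for x in window) / n)
    peak = max(abs(x) for x in window)
    zcr = sum(1 for a, b in zip(window, window[1:]) if a * b < 0) / (n - 1)
    return rms, peak, zcr

# A slow wave and a fast wave separate cleanly on the zero-crossing axis.
slow = [math.sin(2 * math.pi * 1 * t / 50) for t in range(50)]
fast = [math.sin(2 * math.pi * 8 * t / 50) for t in range(50)]
print(extract_features(slow))
print(extract_features(fast))
```

If two classes overlap in feature space like this, the neural network will struggle to separate them too, which is exactly what the feature explorer lets you spot before training.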


Redesign of the neural network pages.

This work has been part of a redesign of our neural network pages. These pages are now more compact, giving you full insight into both your neural network architecture and the training performance, while also giving you an easy way to compare models with different optimization options (like comparing an int8 quantized model against an unoptimized model) and showing accurate on-device performance metrics for a wide variety of targets.

Next steps

Currently the feature explorer shows the performance of your training set, but over the next weeks we'll also integrate the feature explorer and the new confusion matrix into the Model testing page in Edge Impulse. This will give you direct insight into the performance of your test set in the same way, so keep an eye out for that!

Want to try the new feature explorer? Just head to any neural network block in your Edge Impulse project and retrain. Don't have a project yet? Follow one of our tutorials on building embedded machine learning models on real sensor data; it takes 30 minutes and you can even use your phone as a sensor.

Article originally written by Jan Jongboom, the CTO and co-founder of Edge Impulse. He loves pretty pictures, colors, and insight in his ML models.

Read more…

When I think about the things that held the planet together in 2020, it was digital experiences delivered over wireless connectivity that made remote things local.

While heroes like doctors, nurses, first responders, teachers, and other essential personnel bore the brunt of the COVID-19 response, billions of people around the world found themselves cut off from society. In order to keep people safe, we were physically isolated from each other. Far beyond the six feet of social distancing, most of humanity weathered the storm from their homes.

And then little by little, old things we took for granted, combined with new things many had never heard of, pulled the world together. Let’s take a look at the technologies and trends that made the biggest impact in 2020 and where they’re headed in 2021:

The Internet

The global Internet infrastructure from which everything else is built is an undeniable hero of the pandemic. This highly-distributed network designed to withstand a nuclear attack performed admirably as usage by people, machines, critical infrastructure, hospitals, and businesses skyrocketed. Like the air we breathe, this primary facilitator of connected, digital experiences is indispensable to our modern society. Unfortunately, the Internet is also home to a growing cyberwar and security will be the biggest concern as we move into 2021 and beyond. It goes without saying that the Internet is one of the world’s most critical utilities along with water, electricity, and the farm-to-table supply chain of food.

Wireless Connectivity

People are mobile and they stay connected through their smartphones, tablets, in cars and airplanes, on laptops, and other devices. Just like the Internet, the cellular infrastructure has remained exceptionally resilient to enable communications and digital experiences delivered via native apps and the web. Indoor wireless connectivity continues to be dominated by WiFi at home and all those empty offices. Moving into 2021, the continued rollout of 5G around the world will give cellular endpoints dramatic increases in data capacity and WiFi-like speeds. Additionally, private 5G networks will challenge WiFi as a formidable indoor option, but WiFi 6E with increased capacity and speed won’t give up without a fight. All of these developments are good for consumers who need to stay connected from anywhere like never before.

Web Conferencing

With many people stuck at home in 2020, web conferencing technology took the place of traveling to other locations to meet people or receive an education. This technology isn’t new and includes familiar players like GoToMeeting, Skype, WebEx, Google Hangouts/Meet, BlueJeans, FaceTime, and others. Before COVID, these platforms enjoyed success, but most people preferred to fly on airplanes to meet customers and attend conferences while students hopped on the bus to go to school. In 2020, “necessity is the mother of invention” took hold and the use of Zoom and Teams skyrocketed as airplanes sat on the ground while business offices and schools remained empty. These two platforms further increased their stickiness by increasing the number of visible people and adding features like breakout rooms to meet the demands of businesses, virtual conference organizers, and school teachers. Despite the rollout of the vaccine, COVID won’t be extinguished overnight, and these platforms will remain strong through the first half of 2021 as organizations rethink where and when people work and learn. There are far too many players in this space, so look for some consolidation.

E-Commerce

“Stay at home” orders and closed businesses gave e-commerce platforms a dramatic boost in 2020 as they took the place of shopping at stores or going to malls. Amazon soared to even higher heights, Walmart upped their game, Etsy brought the artsy, and thousands of Shopify sites delivered the goods. Speaking of delivery, the empty city streets became home to fleets of FedEx, Amazon, UPS, and DHL trucks bringing packages to your front doorstep. Many retail employees traded in working at customer-facing stores for working in distribution centers, as long as they could outperform robots. Even though people are looking forward to hanging out at malls in 2021, the e-commerce, distribution center, delivery truck trinity is here to stay. This ball was already in motion and got a rocket boost from COVID. This market will stay hot in the first half of 2021 and then cool a bit in the second half.

Ghost Kitchens

The COVID pandemic really took a toll on restaurants in 2020, with many of them going out of business permanently. Those that survived had to pivot to digital and other ways of doing business. High-end steakhouses started making burgers on grills in the parking lot, while takeout pizzerias discovered they finally had the best business model. Having a drive-thru lane was definitely one of the keys to success in a world without waiters, busboys, and hosts. “Front of house” was shut down, but the “back of house” still had a pulse. Adding mobile web and native apps that allowed customers to easily order from operating “ghost kitchens” and pay with credit cards or Apple/Google/Samsung Pay enabled many restaurants to survive. A combination of curbside pickup and delivery from the likes of DoorDash, Uber Eats, Postmates, Instacart and Grubhub made this business model work. A surge in digital marketing also took place where many restaurants learned the importance of maintaining a relationship with their loyal customers via connected mobile devices. For the most part, 2021 has restaurateurs hoping for 100% in-person dining, but a new business model that looks a lot like catering + digital + physical delivery is something that has legs.

The Internet of Things

At its very essence, IoT is all about remotely knowing the state of a device or environmental system along with being able to remotely control some of those machines. COVID forced people to work, learn, and meet remotely and this same trend applied to the industrial world. The need to remotely operate industrial equipment or an entire “lights out” factory became an urgent imperative in order to keep workers safe. This is yet another case where the pandemic dramatically accelerated digital transformation. Connecting everything via APIs, modeling entities as digital twins, and having software bots bring everything to life with analytics has become an ROI game-changer for companies trying to survive in a free-falling economy. Despite massive employee layoffs and furloughs, jobs and tasks still have to be accomplished, and business leaders will look to IoT-fueled automation to keep their companies running and drive economic gains in 2021.

Streaming Entertainment

Closed movie theaters, football stadiums, bowling alleys, and other sources of entertainment left most people sitting at home watching TV in 2020. This turned into a dream come true for streaming entertainment companies like Netflix, Apple TV+, Disney+, HBO Max, Hulu, Amazon Prime Video, YouTube TV, and others. That said, Quibi and Facebook Watch didn’t make it. The idea of binge-watching shows during the weekend turned into binge-watching every season of every show almost every day. Delivering all these streams over the Internet via apps has made it easy to get hooked. Multiplayer video games fall in this category as well and represent an even larger market than the film industry. Gamers socially distanced as they played each other from their locked-down homes. The rise of cloud gaming combined with the rollout of low-latency 5G and Edge computing will give gamers true mobility in 2021. On the other hand, the video streaming market has too many players and looks ripe for consolidation in 2021 as people escape the living room once the vaccine is broadly deployed.

Healthcare

With doctors and nurses working around the clock as hospitals and clinics were stretched to the limit, it became increasingly difficult for non-COVID patients to receive the healthcare they needed. This unfortunate situation gave tele-medicine the shot in the arm (no pun intended) it needed. The combination of healthcare professionals delivering healthcare digitally over widespread connectivity helped those in need. This was especially important in rural areas that lacked the healthcare capacity of cities. Concurrently, the Internet of Things is making deeper inroads into delivering the health of a person to healthcare professionals via wearable technology. Connected healthcare has a bright future that will accelerate in 2021 as high-bandwidth 5G provides coverage to more of the population to facilitate virtual visits to the doctor from anywhere.

Working and Living

As companies and governments told their employees to work from home, it gave people time to rethink their living and working situation. Lots of people living in previously hip, urban, high-rise buildings found themselves residing in not-so-cool, hollowed-out ghost towns comprised of boarded-up windows and closed bars and cafés. Others began to question why they were living in areas with expensive real estate and high taxes when they no longer had to be close to the office. This led to a 2020 COVID exodus out of pricey apartments/condos downtown to cheaper homes in distant suburbs as well as the move from pricey areas like Silicon Valley to cheaper destinations like Texas. Since you were stuck in your home, having a larger house with a home office, fast broadband, and a back yard became the most important thing. Looking ahead to 2021, a hybrid model of work-from-home plus occasionally going into the office is here to stay as employees will no longer tolerate sitting in traffic two hours a day just to sit in a cubicle in a skyscraper. The digital transformation of how and where we work has truly accelerated.

Data and Advanced Analytics

Data has shown itself to be one of the world’s most important assets during the time of COVID. Petabytes of data have continuously streamed in from all over the world, letting us know the number of cases, the growth or decline of infections, hospitalizations, contact-tracing, free ICU beds, temperature checks, deaths, and hotspots of infection. Some of this data has been reported manually while lots of other sources are fully automated from machines. Capturing, storing, organizing, modeling and analyzing this big data has elevated the importance of cloud and edge computing, global-scale databases, advanced analytics software, and machine learning. This is a trend that was already taking place in business and now has a giant spotlight on it due to its global importance. There’s no stopping the data + advanced analytics juggernaut in 2021 and beyond.

Conclusion

2020 was one of the worst years in human history and the loss of life was just heartbreaking. People, businesses, and our education system had to become resourceful to survive. This resourcefulness amplified the importance of delivering connected, digital experiences to make previously remote things into local ones. Cheers to 2021 and the hope for a brighter day for all of humanity.

Read more…

By Michele Pelino

The COVID-19 pandemic drove businesses and employees to become more reliant on technology for both professional and personal purposes. In 2021, demand for new internet-of-things (IoT) applications, technologies, and solutions will be driven by connected healthcare, smart offices, remote asset monitoring, and location services, all powered by a growing diversity of networking technologies.

In 2021, we predict that:

  • Network connectivity chaos will reign. Technology leaders will be inundated by an array of wireless connectivity options. Forrester expects that implementation of 5G and Wi-Fi technologies will decline from 2020 levels as organizations sort through market chaos. For long-distance connectivity, low-earth-orbit satellites now provide a complementary option, with more than 400 Starlink satellites delivering satellite connectivity today. We expect interest in satellite and other lower-power networking technologies to increase by 20% in the coming year.
  • Connected device makers will double down on healthcare use cases. Many people stayed at home in 2020, leaving chronic conditions unmanaged, cancers undetected, and preventable conditions unnoticed. In 2021, proactive engagement using wearables and sensors to detect patients’ health at home will surge. Consumer interest in digital health devices will accelerate as individuals appreciate the convenience of at-home monitoring, insight into their health, and the reduced cost of connected health devices.
  • Smart office initiatives will drive employee-experience transformation. In 2021, some firms will ditch expensive corporate real estate driven by the COVID-19 crisis. However, we expect at least 80% of firms to develop comprehensive on-premises return-to-work office strategies that include IoT applications to enhance employee safety and improve resource efficiency such as smart lighting, energy and environmental monitoring, or sensor-enabled space utilization and activity monitoring in high traffic areas.*
  • The near ubiquity of connected machines will finally disrupt traditional business. Manufacturers, distributors, utilities, and pharma firms switched to remote operations in 2020 and began connecting previously disconnected assets. This connected-asset approach increased reliance on remote experts to address repairs without protracted downtime and expensive travel. In 2021, field service firms and industrial OEMs will rush to keep up with customer demand for more connected assets and machines.
  • Consumer and employee location data will be core to convenience. The COVID-19 pandemic elevated the importance location plays in delivering convenient customer and employee experiences. In 2021, brands must utilize location to generate convenience for consumers or employees with virtual queues, curbside pickup, and checking in for reservations. They will depend on technology partners to help use location data, as well as a third-party source of location trusted and controlled by consumers.

* Proactive firms, including Atea, have extended IoT investments to enhance employee experience and productivity by enabling employees to access a mobile app that uses data collected from light-fixture sensors to locate open desks and conference rooms. Employees can modify light and temperature settings according to personal preferences, and the system adjusts light color and intensity to better align with employees’ circadian rhythms to aid in concentration and energy levels. See the Forrester report “Rethink Your Smart Office Strategy.”

Originally posted HERE.

Read more…

Arm DevSummit 2020 debuted this week (October 6 – 8) as an online virtual conference focused on engineers and providing them with insights into the Arm ecosystem. The summit lasted three days over which Arm painted an interesting technology story about the current and future state of computing and where developers fit within that story. I’ve been attending Arm Techcon for more than half a decade now (which has become Arm DevSummit) and as I perused content, there were several take-a-ways I noticed for developers working on microcontroller based embedded systems. In this post, we will examine these key take-a-ways and I’ll point you to some of the sessions that I also think may pique your interest.

(For those of you who aren’t yet aware, you can register for free up until October 21st and still watch the conference materials until November 28th. Click here to register.)

Take-A-Way #1 – Expect Big Things from NVIDIA’s Acquisition of Arm

As many readers probably already know, NVIDIA is in the process of acquiring Arm. This acquisition has the potential to be one of the focal points that I think will lead to a technological revolution in computing technologies, particularly around artificial intelligence but that will also impact nearly every embedded system at the edge and beyond. While many of us have probably wondered what plans NVIDIA CEO Jensen Huang may have for Arm, the Keynotes for October 6th include a fireside chat between Jensen Huang and Arm CEO Simon Segars. Listening to this conversation is well worth the time and will help give developers some insights into the future but also assurances that the Arm business model will not be dramatically upended.

Take-A-Way #2 – Machine Learning for MCUs is Accelerating

It is sometimes difficult at a conference to get a feel for what is real and what is a little more smoke and mirrors. Sometimes, announcements are real, but they just take several years to filter their way into the market and affect how developers build systems. Machine learning is one of those technologies that I find there is a lot of interest around but that developers also aren’t quite sure what to do with yet, at least in the microcontroller space. When we hear machine learning, we think artificial intelligence, big datasets and more processing power than will fit on an MCU.

There were several interesting talks at DevSummit around machine learning such as:

Some of these were foundational, providing embedded developers with the fundamentals to get started while others provided hands-on explorations of machine learning with development boards. The take-a-way that I gather here is that the effort to bring machine learning capabilities to microcontrollers so that they can be leveraged in industry use cases is accelerating. Lots of effort is being placed in ML algorithms, tools, frameworks and even the hardware. There were several talks that mentioned Arm’s Cortex-M55 architecture that will include Helium technology to help accelerate machine learning and DSP processing capabilities.

Take-A-Way #3 – The Constant Need for Reinvention

In my last take-a-way, I alluded to the fact that things are accelerating. Acceleration is not just happening in the technologies that we use to build systems, though. The very application domains that we can apply these technologies to are dramatically expanding. Not only can we start to deploy security and ML technologies at the edge, but also in domains such as space and medical systems. There were several interesting talks about how technologies are being used around the world to solve interesting and unique problems such as protecting vulnerable ecosystems, mapping the sea floor, fighting against diseases and so much more.

By carefully watching and listening, you’ll notice that many speakers have been involved in many different types of products over their careers and that they are constantly having to reinvent their skill sets, capabilities and even their interests! This is what makes working in embedded systems so interesting! It is constantly changing and evolving and as engineers we don’t get to sit idly behind a desk. Just as Arm, NVIDIA and many of the other ecosystem partners and speakers show us, technology is rapidly changing but so are the problem domains that we can apply these technologies to.

Take-A-Way #4 – Mbed and Keil are Evolving

There are also interesting changes coming to the Arm toolchains and tools like Mbed and Keil MDK. In Reinhard Keil’s talk, “Introduction to an Open Approach for Low-Power IoT Development“, developers got an insight into the changes that are coming to Mbed and Keil with the core focus being on IoT development. The talk focused on the endpoint and discussed how Mbed and Keil MDK are being moved to an online platform designed to help developers move through the product development faster from prototyping to production. The Keil Studio Online is currently in early access and will be released early next year.

(If you are interested in endpoints and AI, you might also want to check-out this article on “How Do We Accelerate Endpoint AI Innovation? Put Developers First“)

Conclusions

Arm DevSummit had a lot to offer developers this year and without the need to travel to California to participate. (Although I greatly missed catching up with friends and colleagues in person). If you haven’t already, I would recommend checking out the DevSummit and watching a few of the talks I mentioned. There certainly were a lot more talks and I’m still in the process of sifting through everything. Hopefully there will be a few sessions that will inspire you and give you a feel for where the industry is headed and how you will need to pivot your own skills in the coming years.

Originally posted here

Read more…

Will We Ever Get Quantum Computers?

In a recent issue of IEEE Spectrum, Mikhail Dyakonov makes a pretty compelling argument that quantum computing (QC) isn't going to fly anytime soon. Now, I'm no expert on QC, and there sure is a lot of money being thrown at the problem by some very smart people, but having watched from the sidelines QC seems a lot like fusion research. Every year more claims are made, more venture capital gets burned, but we don't seem to get closer to useful systems.

Consider D-Wave Systems. They've been trying to build a QC for twenty years, and indeed do have products more or less on the market, including, it's claimed, one of 1024 q-bits. But there's a lot of controversy about whether their machines are either quantum computers at all, or if they offer any speedup over classical machines. One would think that if a 1K q-bit machine really did work the press would be all abuzz, and we'd be hearing constantly of new incredible results. Instead, the machines seem to disappear into research labs.

Mr. Dyakonov notes that optimistic people expect useful QCs in the next 5-10 years; those less sanguine expect 20-30 years, a prediction that hasn't changed in two decades. He thinks a window of many decades to never is more realistic. Experts think that a useful machine, one that can do the sort of calculations your laptop is capable of, will require between 1000 and 100,000 q-bits. To me, this level of uncertainty suggests that there is a profound lack of knowledge about how these machines will work and what they will be able to do.

According to the author, a 1000 q-bit machine can be in 2^1000 states (a classical machine with N transistors can be in only 2^N states), which is about 10^300, or more than the number of sub-atomic particles in the universe. At 100,000 q-bits we're talking 10^30,000, a mind-boggling number.
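These magnitudes are easy to sanity-check with arbitrary-precision integer arithmetic; the snippet below just reproduces the arithmetic from the paragraph above and makes no claim about any particular machine:

```python
def decimal_digits(n):
    """Number of decimal digits in a positive integer."""
    return len(str(n))

# 2^1000 has 302 decimal digits, i.e. it is about 10^301 ("about 10^300").
print(decimal_digits(2 ** 1000))    # 302

# 2^100,000 has 30,103 digits, i.e. roughly the 10^30,000 quoted above.
print(decimal_digits(2 ** 100000))  # 30103
```

Python's built-in integers are unbounded, so even 2^100,000 is computed exactly rather than as a floating-point approximation.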

Because of noise, expect errors. Some theorize that those errors can be eliminated by adding q-bits, on the order of 1000 to 100,000 additional per q-bit. So a useful machine will need at least millions, or perhaps many orders of magnitude more, of these squirrelly microdots that are tamed only by keeping them at 10 millikelvin.

A related article in Spectrum mentions a committee formed of prestigious researchers tasked with assessing the probability of success with QC concluded that:

"[I]t is highly unexpected" that anyone will be able to build a quantum computer that could compromise public-key cryptosystems (a task that quantum computers are, in theory, especially suitable for tackling) in the coming decade. And while less-capable "noisy intermediate-scale quantum computers" will be built within that time frame, "there are at present no known algorithms/applications that could make effective use of this class of machine," the committee says.

I don't have a dog in this fight, but am relieved that useful QC seems to be no closer than The Distant Shore (to quote Jan de Hartog, one of my favorite writers). If it were feasible to easily break encryption schemes, banking and other systems could collapse. I imagine Blockchain would fail as hash algorithms became reversible. The resulting disruption would not be healthy for our society.

On the other hand, Bruce Schneier's article in the March issue of IEEE Computing Edge suggests that QC won't break all forms of encryption, though he does think a lot of our current infrastructure will be vulnerable. The moral: if and when QC becomes practical, expect chaos.

I was once afraid of quantum computing, as it involves mechanisms that I'll never understand. But then I realized those machines will have an API. Just as one doesn't need to know how a computer works to program in Python, we'll be insulated from the quantum horrors by layers of abstraction.

Originally posted here

Read more…

A scientist from Russia has developed a new neural network architecture and tested its learning ability on the recognition of handwritten digits. The intelligence of the network was amplified by chaos, and the classification accuracy reached 96.3%. The network can be used in microcontrollers with a small amount of RAM and embedded in such household items as shoes or refrigerators, making them 'smart.' The study was published in Electronics.

Today, the search for new neural networks that can operate on microcontrollers with a small amount of random access memory (RAM) is of particular importance. For comparison, in ordinary modern computers, random access memory is measured in gigabytes. Although microcontrollers possess significantly less processing power than laptops and smartphones, they are smaller and can be interfaced with household items. Smart doors, refrigerators, shoes, glasses, kettles and coffee makers create the foundation for so-called ambient intelligence. The term denotes an environment of interconnected smart devices.

An example of ambient intelligence is a smart home. Devices with limited memory are not able to store a large number of keys for secure data transfer and arrays of neural network settings. This prevents the introduction of artificial intelligence into Internet of Things devices, as they lack the required computing power. However, artificial intelligence would allow smart devices to spend less time on analysis and decision-making, better understand a user and assist them in a friendly manner. Therefore, many new opportunities can arise in the creation of ambient intelligence, for example, in the field of health care.

Andrei Velichko from Petrozavodsk State University, Russia, has created a new neural network architecture that allows efficient use of small volumes of RAM and opens up opportunities for the introduction of low-power devices to the Internet of Things. The network, called LogNNet, is a feed-forward neural network in which the signals are directed exclusively from input to output. It uses deterministic chaotic filters for the incoming signals. The system randomly mixes the input information, but at the same time extracts valuable data that is initially invisible. A similar mechanism is used by reservoir neural networks. To generate chaos, a simple logistic map equation is applied, where the next value is calculated based on the previous one. The equation is commonly used in population biology and as an example of a simple equation for calculating a sequence of chaotic values. In this way, the simple equation stands in for an infinite set of random numbers calculated by the processor, and the network architecture uses them and consumes less RAM.
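The core trick is that a deterministic chaotic sequence can be regenerated on demand from a seed instead of being stored, which is what saves RAM. A minimal sketch of the logistic map idea follows; the growth rate r, seed, and sequence length are illustrative choices, not the actual LogNNet configuration:

```python
def logistic_sequence(x0, r=3.9, n=10):
    """Generate n values of the logistic map x_{k+1} = r * x_k * (1 - x_k).

    For r near 4 and x0 in (0, 1) the sequence is chaotic but fully
    deterministic: the same seed always reproduces the same values,
    so mixing weights derived from it never need to be stored.
    """
    values = []
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
        values.append(x)
    return values

seq = logistic_sequence(0.5)
# Deterministic: regenerating from the same seed gives identical values.
assert seq == logistic_sequence(0.5)
# Bounded: every value stays inside (0, 1) for r < 4.
assert all(0.0 < v < 1.0 for v in seq)
```

In a reservoir-style network, values like these would be used to pseudo-randomly mix the input features before the small trained output layer, trading a few multiplications for kilobytes of stored weights.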


The scientist tested his neural network on handwritten digit recognition from the MNIST database, which is considered the standard for training neural networks to recognize images. The database contains more than 70,000 handwritten digits. Sixty thousand of these digits are intended for training the neural network, and another 10,000 for network testing. The more neurons and chaos in the network, the better it recognized images. The maximum accuracy achieved by the network is 96.3%, while the developed architecture uses no more than 29 KB of RAM. In addition, LogNNet demonstrated promising results using very small RAM sizes, in the range of 1-2 KB. A miniature controller such as the ATmega328, which can be embedded into a smart door or even a smart insole, has approximately the same amount of memory.

"Thanks to this development, new opportunities for the Internet of Things are opening up, as any device equipped with a low-power miniature controller can be powered with artificial intelligence. In this way, a path is opened for intelligent processing of information on peripheral devices without sending data to cloud services, and it improves the operation of, for example, a smart home. This is an important contribution to the development of IoT technologies, which are actively researched by the scientists of Petrozavodsk State University. In addition, the research outlines an alternative way to investigate the influence of chaos on artificial intelligence," said Andrei Velichko.

Originally posted HERE.

by Russian Science Foundation

Image Credit: Andrei Velichko

Read more…

Can AI Replace Firmware?

Scott Rosenthal and I go back about a thousand years; we've worked together, helped midwife the embedded field into being, had some amazing sailing adventures, and recently took a jaunt to the Azores just for the heck of it. Our sons are both big data people; their physics PhDs were perfect entrees into that field, and both now work in the field of artificial intelligence.

At lunch recently we were talking about embedded systems and AI, and Scott posed a thought that has been rattling around in my head since. Could AI replace firmware?

Firmware is a huge problem for our industry. It's hideously expensive. Only highly-skilled people can create it, and there are too few of us.

What if an AI engine of some sort could be dumped into a microcontroller and the "software" then created by training that AI? If that were possible - and that's a big "if" - then it might be possible to achieve what was hoped for when COBOL was invented: programmers would no longer be needed as domain experts could do the work. That didn't pan out for COBOL; the industry learned that accountants couldn't code. Though the language was much more friendly than the assembly it replaced, it still required serious development skills.

But with AI, could a domain expert train an inference engine?

Consider a robot: a "home economics" major could create scenarios of stacking dishes from a dishwasher. Maybe these would be in the form of videos, which were then fed to the AI engine as it tuned the weighting coefficients to achieve what the home ec expert deems worthy goals.

My first objection to this idea was that these sorts of systems have physical constraints. With firmware I'd write code to sample limit switches so the motors would turn off if at an end-of-motion extreme. During training an AI-based system would try and drive the motors into all kinds of crazy positions, banging destructively into stops. But think how a child learns: a parent encourages experimentation but prevents the youngster from self-harm. Maybe that's the role of the future developer training an AI. Or perhaps the training will be done on a simulator of some sort where nothing can go horribly wrong.
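For contrast, the firmware-style safety guard mentioned above is simple to state explicitly. This is just an illustrative sketch (the function name and travel limits are hypothetical, not from any real controller):

```python
def clamp_to_limits(target, limit_min=0, limit_max=1000):
    """Firmware-style guard: never command a position past an
    end-of-motion limit, no matter what the higher-level logic
    (or a half-trained AI) asks for."""
    return max(limit_min, min(limit_max, target))

# A runaway command is clamped before the motor can bang into a stop.
print(clamp_to_limits(1500))  # 1000
print(clamp_to_limits(-50))   # 0
```

A guard like this is a few lines of code with behavior that can be verified by inspection; the open question is whether a trained model would respect such constraints, or whether they must stay outside the model as hand-written envelope protection.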

Taking this further, a domain expert could define the desired inputs and outputs, and then a poorly-paid person could do the actual training. CEOs will love that. With that model a strange parallel emerges to computation a century ago: before the computer age, "computers" were people doing simple math to create tables of logs, trig, ballistics, etc. A room full of them labored at a problem. They weren't particularly skilled, didn't make much, but did the rote work under the direction of one master. Maybe AI trainers will be somewhat like that.

Like we outsource clothing manufacturing to Bangladesh, I could see training, basically grunt work, being sent overseas as well.

I'm not wild about this idea as it means we'd have an IoT of idiots: billions of AI-powered machines where no one really knows how they work. They've been well-trained but what happens when there's a corner case?

And most of the AI literature I read suggests that inference successes of 97% or so are the norm. That might be fine for classifying faces, but a 3% failure rate of a safety-critical system is a disaster. And the same rate for less-critical systems like factory controllers would also be completely unacceptable.
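To put that failure rate in context, a bit of back-of-the-envelope arithmetic; the workload figure here is a made-up example, not from any real deployment:

```python
# Illustrative only: what a 97% inference success rate means at scale.
success_rate = 0.97
decisions_per_day = 10_000  # hypothetical workload for one factory controller

failures_per_day = (1 - success_rate) * decisions_per_day
print(round(failures_per_day))  # 300 bad decisions per day, per controller
```

A rate that sounds impressive as a benchmark score translates into hundreds of daily faults on a single busy controller, which is why safety-critical systems are engineered toward failure rates many orders of magnitude lower.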

But the idea is intriguing.

Original post can be viewed here

Feel free to email me with comments.


Read more…


CLICK HERE TO DOWNLOAD

This complete guide is a 212-page eBook and is a must read for business leaders, product managers and engineers who want to implement, scale and optimize their business with IoT communications.

Whether you want to attempt initial entry into the IoT-sphere, or expand existing deployments, this book can help with your goals, providing deep understanding into all aspects of IoT.

CLICK HERE TO DOWNLOAD

Read more…

Edge Products Are Now Managed At The Cloud

Now more than ever, there are billions of edge products in the world. But without proper cloud computing, making the most of electronic devices that run on Linux or any other OS would not be possible.

And so, a question most people keep asking is which is the best Software-as-a-Service platform for effectively managing edge devices through cloud computing. While edge device management is not a new idea, the fact that the cloud computing space is not yet fully exploited means there is still a lot to do there.

Remote product management is especially necessary for the 21st century and beyond. Because of the increasing number of devices connected to the internet of things (IoT), a reliable SaaS platform should help with catching and fixing software glitches from anywhere in the world. From smart homes, stereo speakers, and cars to personal computers, any product that is connected to the internet needs real-time protection from hacking threats such as unlawful access to business or personal data.

Data being the most vital asset is constantly at risk, especially if individuals using edge products do not connect to trusted, reliable, and secure edge device management platforms.

Bridges the Gap Between Complicated Software And End Users

Cloud computing is the new frontier through which SaaS platforms help manage edge devices in real time. But something even more noteworthy is the increasing amount of complicated software that now runs edge devices at homes and in workplaces.

Edge device management, therefore, ensures everything runs smoothly. From fixing bugs and running debugging commands to deploying software patches in real time, cloud management of edge products bridges the gap between end users and the complicated software that is becoming the norm these days.

Even more importantly, going beyond physical firewall barriers is a major necessity in the remote management of edge devices. A reliable Software-as-a-Service platform therefore ensures that data encryption for edge devices is not only hack-proof but also accessible only to the right people. Moreover, the deployment of secure routers and access tools is especially critical in cloud computing when managing edge devices. And so, the developers behind successful SaaS platforms conduct regular security checks over the cloud and design and implement solutions for edge products.

Reliable IT Infrastructure Is Necessary

Software-as-a-service platforms that manage edge devices focus on having a reliable IT infrastructure and centralized systems through which they can conduct cloud computing. It is all about remotely managing edge devices with the help of an IT infrastructure that eliminates challenges such as connectivity latency.

Originally posted here


Introducing Profiler, by Auptimizer: Select the best AI model for your target device — no deployment required.

Profiler is a simulator for profiling the performance of Machine Learning (ML) model scripts. Profiler can be used during both the training and inference stages of the development pipeline. It is particularly useful for evaluating script performance and resource requirements for models and scripts being deployed to edge devices. Profiler is part of Auptimizer. You can get Profiler from the Auptimizer GitHub page or via pip install auptimizer.

The cost of training machine learning models in the cloud has dropped dramatically over the past few years. While this drop has pushed model development to the cloud, there are still important reasons for training, adapting, and deploying models to devices. Performance and security are the big two, but cost savings is also an important consideration, as the cost of transferring and storing data, and of building models for millions of devices, tends to add up. Unsurprisingly, machine learning for edge devices, or Edge AI as it is more commonly known, continues to become mainstream even as cloud compute becomes cheaper.

Developing models for the edge opens up interesting problems for practitioners.

  1. Model selection now involves taking into consideration the resource requirements of these models.
  2. The training-testing cycle becomes longer due to having a device in the loop because the model now needs to be deployed on the device to test its performance. This problem is only magnified when there are multiple target devices.

Currently, there are three ways to shorten the model selection/deployment cycle:

  • The use of device-specific simulators that run on the development machine and preclude the need for deployment to the device. Caveat: Simulators are usually not generalizable across devices.
  • The use of profilers that are native to the target device. Caveat: They need the model to be deployed to the target device for measurement.
  • The use of measures like FLOPS or Multiply-Add (MAC) operations to give approximate measures of resource usage. Caveat: The model itself is only one (sometimes insignificant) part of the entire pipeline (which also includes data loading, augmentation, feature engineering, etc.)
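To make the last caveat concrete, operation counts such as MACs are easy to compute by hand, but they describe only the model. A minimal sketch, assuming a convolution layer shaped like the first layer of a typical ImageNet model (the function name and sizes here are illustrative, not part of Profiler):

```python
# Approximate cost of a single 2D convolution layer in Multiply-Add
# operations (MACs): every output activation is an inner product over
# c_in * k * k inputs, for c_out output channels.
def conv2d_macs(c_in, c_out, k, h_out, w_out):
    return c_in * c_out * k * k * h_out * w_out

# Example: 3 -> 64 channels, 7x7 kernel, 112x112 output feature map.
print(conv2d_macs(3, 64, 7, 112, 112))  # 118013952 (~118M MACs)
```

Even an exact sum of such per-layer counts says nothing about data loading, augmentation, or feature engineering, which is precisely why operation counts alone can be misleading.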

In practice, if you want to pick a model that will run efficiently on your target devices but do not have access to a dedicated simulator, you have to test each model by deploying on all of the target devices.

Profiler helps alleviate these issues. Profiler allows you to simulate, on your development machine, how your training or inference script will perform on a target device. With Profiler, you can understand CPU- and memory-usage as well as run-time for your model script on the target device.

How Profiler works

Profiler encapsulates the model script, its requirements, and corresponding data into a Docker container. It uses user-inputs on compute-, memory-, and framework-constraints to build a corresponding Docker image so the script can run independently and without external dependencies. This image can then easily be scaled and ported to ease future development and deployment. As the model script is executed within the container, Profiler tracks and records various resource utilization statistics including Average CPU Utilization, Memory Usage, Network I/O, and Block I/O. The logger also supports setting the Sample Time to control how frequently Profiler samples utilization statistics from the Docker container.
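As a rough illustration of the kind of statistic such a logger derives, the snippet below turns one Docker-style stats sample (which contains both the current and the previous reading) into a CPU-utilization percentage. It follows Docker's documented stats semantics; the function and the synthetic sample are illustrative sketches, not Profiler's actual code.

```python
def cpu_percent(stats):
    """CPU utilization (%) from one docker-stats-style sample."""
    cpu_delta = (stats["cpu_stats"]["cpu_usage"]["total_usage"]
                 - stats["precpu_stats"]["cpu_usage"]["total_usage"])
    system_delta = (stats["cpu_stats"]["system_cpu_usage"]
                    - stats["precpu_stats"]["system_cpu_usage"])
    online_cpus = stats["cpu_stats"].get("online_cpus", 1)
    if system_delta <= 0:
        return 0.0
    return (cpu_delta / system_delta) * online_cpus * 100.0

# Synthetic sample: the container used 0.2 s of CPU time while the host
# accumulated 1 s of system time across 4 cores.
sample = {
    "cpu_stats": {"cpu_usage": {"total_usage": 400_000_000},
                  "system_cpu_usage": 2_000_000_000,
                  "online_cpus": 4},
    "precpu_stats": {"cpu_usage": {"total_usage": 200_000_000},
                     "system_cpu_usage": 1_000_000_000},
}
print(cpu_percent(sample))  # 80.0
```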

Get Profiler: Click here

How Profiler helps

Our results show that Profiler can help users build a good estimate of model runtime and memory usage for many popular image/video recognition models. We conducted over 300 experiments across a variety of models (InceptionV3, SqueezeNet, Resnet18, MobileNetV2–0.25x, -0.5x, -0.75x, -1.0x, 3D-SqueezeNet, 3D-ShuffleNetV2–0.25x, -0.5x, -1.0x, -1.5x, -2.0x, 3D-MobileNetV2–0.25x, -0.5x, -0.75x, -1.0x, -2.0x) on three different devices — LG G6 and Samsung S8 phones, and NVIDIA Jetson Nano. You can find the full set of experimental results and more information on how to conduct similar experiments on your devices here.

The addition of Profiler brings Auptimizer closer to the vision of a tool that helps machine learning scientists and engineers build models for edge devices. The hyperparameter optimization (HPO) capabilities of Auptimizer help speed up model discovery. Profiler helps with choosing the right model for deployment. It is particularly useful in the following two scenarios:

  1. Deciding between models — The ranking of the run-times and memory usages of the model scripts measured using Profiler on the development machine is indicative of their ranking on the target device. For instance, if Model1 is faster than Model2 when measured using Profiler on the development machine, Model1 will be faster than Model2 on the device. This ranking is valid only when the CPUs are running at full utilization.
  2. Predicting model script performance on the device — A simple linear relationship relates the run-times and memory usage measured using Profiler on the development machine with the usage measured using a native profiling tool on the target device. In other words, if a model runs in time x when measured using Profiler, it will run approximately in time (a*x+b) on the target device (where a and b can be discovered by profiling a few models on the device with a native profiling tool). The strength of this relationship depends on the architectural similarity between the models but, in general, the models designed for the same task are architecturally similar as they are composed of the same set of layers. This makes Profiler a useful tool for selecting the best suited model.
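The calibration in point 2 amounts to an ordinary least-squares fit of device_time ≈ a * profiler_time + b over a handful of models profiled both ways. A self-contained sketch with invented timings (`fit_line` is a hypothetical helper, not part of Auptimizer):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y ~ a*x + b."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    cov = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    var = sum((x - xbar) ** 2 for x in xs)
    a = cov / var
    return a, ybar - a * xbar

# Invented run-times (seconds) for four models, measured with Profiler
# on the development machine and with a native tool on the device.
profiler_times = [1.0, 2.0, 4.0, 8.0]
device_times = [2.5, 4.5, 8.5, 16.5]

a, b = fit_line(profiler_times, device_times)
print(a, b)         # 2.0 0.5
print(a * 3.0 + b)  # predicted on-device time for a new model: 6.5
```

Once a and b are calibrated, any further model needs only a Profiler run on the development machine to estimate its on-device run-time.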

Looking forward

Profiler continues to evolve. So far, we have tested its efficacy on select mobile- and edge-platforms for running popular image and video recognition models for inference, but there is much more to explore. Profiler might have limitations for certain models or devices and can potentially result in inconsistencies between Profiler outputs and on-device measurements. Our experiment page provides more information on how to best set up your experiment using Profiler and how to interpret potential inconsistencies in results. The exact use case varies from user to user but we believe that Profiler is relevant to anyone deploying models on devices. We hope that Profiler’s estimation capability can enable leaner and faster model development for resource-constrained devices. We’d love to hear (via github) if you use Profiler during deployment.

Originally posted here


Authors: Samarth Tripathi, Junyao Guo, Vera Serdiukova, Unmesh Kurup, and Mohak Shah — Advanced AI, LG Electronics USA
