


By Jacqi Levy

The Internet of Things (IoT) is transforming every facet of our buildings: how we inhabit them, how we manage them, and even how we build them. There is a vast ecosystem around today’s buildings, and no part of it is untouched.

In this blog series, I plan to examine the trends being driven by IoT across the buildings ecosystem. Since the lifecycle of a building begins with design and construction, let’s start there. Here are four ways that the IoT is radically transforming building design and construction.

Building information modeling

Building information modeling (BIM) is a process that provides an intelligent, 3D model of a building. Typically, BIM is used to model a building’s structure and systems during design and construction, so that changes to one set of plans can be updated simultaneously in all other impacted plans. Taken a step further, however, BIM can also become a catalyst for smart buildings projects.

Once a building is up and running, data from IoT sensors can be pulled into the BIM. You can use that data to model things like energy usage patterns, temperature trends or people movement throughout a building. The output from these models can then be analyzed to improve future buildings projects. Beyond its impact on design and construction, BIM also has important implications for the management of building operations.
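As a concrete illustration, here is a minimal Python sketch (zone names and readings are made up) of how raw IoT sensor readings might be aggregated into the kind of per-zone energy-usage pattern that gets attached back to a BIM:

```python
from collections import defaultdict

def usage_by_zone(readings):
    """Aggregate raw IoT meter readings into per-zone totals.

    `readings` is an iterable of (zone_name, kwh) tuples, e.g. streamed from
    building sensors. Returns a dict mapping zone -> total kWh, ready to be
    attached to the corresponding BIM elements for analysis.
    """
    totals = defaultdict(float)
    for zone, kwh in readings:
        totals[zone] += kwh
    return dict(totals)

readings = [("lobby", 1.2), ("lab", 3.4), ("lobby", 0.8)]
print(usage_by_zone(readings))  # {'lobby': 2.0, 'lab': 3.4}
```

The same pattern applies to temperature trends or people-movement counts; only the reading source and the aggregation key change.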

Green building

The construction industry is a huge driver of landfill waste: up to 40% of all solid waste in the US comes from building projects. This unfortunate fact has ignited a wave of interest in sustainable architecture and construction. But the green building movement has become about much more than keeping building materials out of landfills. It is influencing the design and engineering of building systems themselves, allowing buildings to reduce their impact on the environment through energy management.

Today’s green buildings are being engineered to do things like shut down unnecessary systems automatically when the building is unoccupied, or open and close louvers automatically to let in optimal levels of natural light. In a previous post, I talk about 3 examples of the IoT in green buildings, but these are just some of the cool ways that the construction industry is learning to be more sustainable with help from the IoT.

Intelligent prefab

Using prefabricated building components can be faster and more cost effective than traditional building methods, and it has an added benefit of creating less construction waste. However, using prefab for large commercial buildings projects can be very complex to coordinate. The IoT is helping to solve this problem.

Using RFID sensors, individual prefab parts can be tracked throughout the supply chain. A recent example is the construction of the Leadenhall Building in London. Since the building occupies a relatively small footprint but required large prefabricated components, coordinating the installation was a logistically complex task. RFID data was used to help mitigate the effects of any downstream delays in construction. In addition, the data was then fed into the BIM once parts were installed, allowing for real-time rendering of the building in progress, as well as the establishment of project controls and KPIs.
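The tracking idea can be sketched as a simple state machine over RFID scan events; the stage names and tag IDs below are illustrative, not taken from the Leadenhall project:

```python
STAGES = ["fabricated", "shipped", "on_site", "installed"]

class PrefabPart:
    """Tracks one prefab component through supply-chain stages via RFID scans."""

    def __init__(self, tag_id):
        self.tag_id = tag_id
        self.stage = None  # no scans recorded yet

    def record_scan(self, stage):
        # Each RFID read should advance the part by exactly one stage;
        # out-of-order reads are rejected so delays surface immediately.
        expected = STAGES[0] if self.stage is None else \
                   STAGES[STAGES.index(self.stage) + 1]
        if stage != expected:
            raise ValueError(f"{self.tag_id}: expected {expected!r}, got {stage!r}")
        self.stage = stage

part = PrefabPart("RFID-0042")
for s in ["fabricated", "shipped", "on_site", "installed"]:
    part.record_scan(s)
print(part.stage)  # installed
```

In practice each `record_scan` event would also be forwarded to the BIM, which is what enables the real-time rendering described above.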

Construction management

Time is money, so any delays on a construction project can be costly. So how do you prevent your critical heavy equipment from going down and backing up all the other trades on site? With the IoT!

Heavy construction equipment is being outfitted with sensors, which can be remotely monitored for key indicators of potential maintenance issues like temperature fluctuations, excessive vibrations, etc. When abnormal patterns are detected, alerts can trigger maintenance workers to intervene early, before critical equipment fails. Performing predictive maintenance in this way can save time and money, as well as prevent unnecessary delays in construction projects.
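One common way to detect the "abnormal patterns" mentioned above is to compare each new reading against a rolling baseline. This is a minimal sketch of that idea, not a production predictive-maintenance model:

```python
import statistics

def detect_anomalies(samples, window=10, threshold=3.0):
    """Flag sensor samples that deviate strongly from the recent baseline.

    Compares each sample against the mean/stdev of the preceding `window`
    samples and returns the indices whose z-score exceeds `threshold`.
    """
    alerts = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1e-9  # avoid divide-by-zero
        if abs(samples[i] - mean) / stdev > threshold:
            alerts.append(i)
    return alerts

# Vibration readings from an excavator; the last one is a spike
vibration = [1.0, 1.1, 0.9, 1.0, 1.2, 1.0, 0.9, 1.1, 1.0, 1.1, 5.0]
print(detect_anomalies(vibration))  # [10]
```

An alert at index 10 would be the trigger for maintenance workers to intervene before the equipment fails.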

Originally posted here.


4 key questions to ask tech vendors

Posted by Terri Hiskey

Without mindful and strategic investments, a company’s supply chain could become wedged in its own proverbial Suez Canal, ground to a halt by outside forces and its inflexible, complex systems.

 

It’s a dramatic image, but one that became reality for many companies in the last year. Supply chain failures aren’t typically such high-profile events as the Suez Canal blockage, but rather death by a thousand inefficiencies, each slowing business operations and affecting the customer experience.

Delay by delay and spreadsheet by spreadsheet, companies are at risk of falling behind more nimble, cloud-enabled competitors. And as we emerge from the pandemic with a new understanding of how important adaptable, integrated supply chains are, company leaders have critical choices to make.

The Hannover Messe conference (held online from April 12-16) gives manufacturing and supply chain executives around the world a chance to hear perspectives from industry leaders and explore the latest manufacturing and supply chain technologies available.

Technology holds great promise. But if executives don’t ask key strategic questions to supply chain software vendors, they could unknowingly introduce a range of operational and strategic obstacles into their company’s future.

If you’re attending Hannover Messe, here are a few critical questions to ask:

Are advanced technologies like machine learning, IoT, and blockchain integrated into your supply chain applications and business processes, or are they addressed separately?

It’s important to go beyond the marketing. Is the vendor actually promoting pilots of advanced technologies that are simply customized use cases for small parts of an overall business process hosted on a separate platform? If so, it may be up to your company to figure out how to integrate it with the rest of that vendor’s applications and to maintain those integrations.

To avoid this situation, seek solutions that have been purpose-built to leverage advanced technologies across use cases that address the problems you hope to solve. It’s also critical that these solutions come with built-in connections to ensure easy integration across your enterprise and to third party applications.

Are your applications or solutions written specifically for the cloud?

If a vendor’s solution for a key process (like integrated business planning or plan to produce, for example) includes applications developed over time by a range of internal development teams, partners, and acquired companies, what you’re likely to end up with is a range of disjointed applications and processes with varying user interfaces and no common data model. Look for a cloud solution that helps connect and streamline your business processes seamlessly.

Update schedules for the various applications could also be disjointed and complicated, so customers can be tempted to skip updates. But some upgrades may be forced, causing disruption in key areas of your business at various times.

And if some of the applications in the solution were written for the on-premises world, business processes will likely need customization, making them hard-wired and inflexible. The convenience of cloud solutions is that they can take frequent updates more easily, resulting in greater value driven by the latest innovations.

Are your supply chain applications fully integrated—and can they be integrated with other key applications like ERP or CX?

A lack of integration between and among applications within the supply chain and beyond means that end users don’t have visibility into the company’s operations—and that directly affects the quality and speed of business decisions. When market disruptions or new opportunities occur, unintegrated systems make it harder to shift operations—or even come to an agreement on what shift should happen.

And because many key business processes span multiple areas—like manufacturing forecast to plan, order to cash, and procure to pay—integration also increases efficiency. If applications are not integrated across these entire processes, business users resort to pulling data from the various systems and then often spend time debating whose data is right.

Of course, all of these issues increase operational costs and make it harder for a company to adapt to change. They also keep the IT department busy with maintenance tasks rather than focusing on more strategic projects.

Do you rely heavily on partners to deliver functionality in your supply chain solutions?

Ask for clarity on which products within the solution belong to the vendor and which were developed by partners. Is there a single SLA for the entire solution? Will the two organizations’ development teams work together on a roadmap that aligns the technologies? Will their priority be on making a better solution together or on enhancements to their own technology? Will they focus on enabling data to flow easily across the supply chain solution, as well as to other systems like ERP? Will they be able to overcome technical issues that arise and streamline customer support?

It’s critical for supply chain decision-makers to gain insight into these crucial questions. If the vendor is unable to meet these foundational needs, the customer will face constant obstacles in their supply chain operations.

Originally posted here.


By Ricardo Buranello

What Is the Concept of a Virtual Factory?

For a decade, the first Friday in October has been designated as National Manufacturing Day. This day kicks off a month-long schedule of events at manufacturing companies nationwide to attract talent to modern manufacturing careers.

For a time, manufacturing went out of fashion: young tech talent preferred career opportunities in software and financial services. That preference has changed in recent years, as the advent of digital technologies and robotization brought some glamour back.

The connected factory is democratizing another innovation: the virtual factory. Without connecting critical assets at the IoT edge, the virtual factory could be realized only in brand-new factories and technology implementations.

There are technologies that enable decades-old assets to communicate. Such technologies allow us to join machine data with physical environment and operational conditions data. Benefits of virtual factory technologies like digital twin are within reach for greenfield and legacy implementations.

Digital twin technologies can be used for predictive maintenance and scenario planning analysis. At its core, the digital twin is about access to real-time operational data to predict and manage the asset’s life cycle. It leverages relevant life cycle management information inside and outside the factory. The possibilities of bringing various data types together for advanced analysis are promising.

I used to see a distinction between IoT-enabled greenfield technology in new factories and legacy technology in older ones. Data flowed seamlessly from IoT-enabled machines to enterprise systems or the cloud for advanced analytics in new factories’ connected assets. In older factories, while data wanted to move to the enterprise systems or the cloud, it hit countless walls. Innovative factories were creating IoT technologies in proof of concepts (POCs) on legacy equipment, but this wasn’t the norm.

No matter the age of the factory or equipment, the expectations look alike: when manufacturing companies invest in machines, they expect the asset to be used for a decade or more. We had to invent something inclusive of both new and legacy machines and systems.

We had to create something that allows decades-old equipment of diverse brands and types (PLCs, CNCs, robots, etc.) to communicate with one another, and to make legacy machines talk to legacy systems. Connecting was not enough: we had to make it accessible to experienced developers and technicians who are not specialized in systems integration.

If plant managers and leaders have clear and consumable data, they can use it for analysis and measurement. Surfacing and routing data has enabled innovative use cases in processes controlled by aged equipment. Prescriptive and predictive maintenance reduce downtime and allow access to data. This access enables remote operation and improved safety on the plant floor. Each line flows better, improving supply chain orchestration and worker productivity.

Open protocols aren’t optimized for connecting to each machine. You need tools and optimized drivers to connect to the machines, cut latency time and get the data to where it needs to be in the appropriate format to save costs. These tools include:

  • Machine data collection
  • Data transformation and visualization
  • Device management
  • Edge logic
  • Embedded security
  • Enterprise integration
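As a small illustration of the data transformation step in the list above, this sketch normalizes raw PLC register values into a unit-tagged JSON payload suitable for enterprise integration (the register map and field names are assumptions):

```python
import json

# Hypothetical scaling map: raw register -> (field name, scale, unit)
REGISTER_MAP = {
    "temp_raw":  ("temperature_c", 0.1, "C"),
    "speed_raw": ("spindle_rpm",   1.0, "rpm"),
}

def transform(device_id, raw):
    """Turn raw integer register reads into a normalized, unit-tagged payload."""
    fields = {}
    for reg, value in raw.items():
        name, scale, unit = REGISTER_MAP[reg]
        fields[name] = {"value": value * scale, "unit": unit}
    return json.dumps({"device": device_id, "fields": fields})

payload = transform("press-07", {"temp_raw": 654, "speed_raw": 1200})
print(payload)
```

Getting data into a consistent, self-describing format like this is what lets the enterprise side consume it without knowing each machine's register layout.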
Together, these tools produce a digital copy of the entire factory floor, which promises improved productivity, quality and throughput, reduced downtime, and access to more data and visibility. It enables factories to make small changes in the way machines and processes operate to achieve improvements.

Plants are trying to get and use data to improve overall equipment effectiveness. OEE applications can calculate how many good and bad parts were produced compared to the machine’s capacity. This analysis can go much deeper. Factories can visualize how the machine works down to sub-processes. They can synchronize each movement to the millisecond and change timing to increase operational efficiency.
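The OEE calculation described above is commonly factored as availability x performance x quality; a minimal sketch:

```python
def oee(planned_min, downtime_min, ideal_cycle_s, total_parts, good_parts):
    """Overall Equipment Effectiveness = availability x performance x quality."""
    run_min = planned_min - downtime_min
    availability = run_min / planned_min               # uptime vs planned time
    performance = (ideal_cycle_s * total_parts) / (run_min * 60)  # speed vs ideal
    quality = good_parts / total_parts                 # good parts vs all parts
    return availability * performance * quality

# 8-hour shift, 45 min downtime, 30 s ideal cycle, 800 parts made, 760 good
print(round(oee(480, 45, 30, 800, 760), 3))  # 0.792
```

The three factors also tell you where the loss is: in this example, availability (0.906) and performance (0.920) each cost more than quality (0.95) gains back.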

The technology is here. It is mature. It’s no longer a question of whether you want to use it; you need it to get to what’s next. I think this makes it a fascinating time for smart manufacturing.

Originally posted here.


Waste management is a global concern. According to The World Bank report, about 2.01 billion tonnes of solid waste is generated globally every year. 33% of that waste is not managed in an environmentally safe manner. Waste management in densely populated urban areas is a major problem. The lack of it leads to environmental contamination. It ends up spreading diseases in epidemic proportions. It is a challenge for both developed and developing countries.

By 2050, that figure is estimated to grow to 3.40 billion tonnes. But here is the catch: IoT waste management systems can help. Municipalities across the globe can employ IoT to manage waste better. IoT technologies are already being employed in modern supply chains, where they have become invaluable by optimizing and automating most of the industry's processes. IoT adoption, however, is far more significant on the supply chain side. While many IoT-based waste management systems are already in place, a lot of challenges hold them back.

A smart city collects data from personal vehicles, buildings, public transport, components of urban infrastructure such as power grids and waste management systems, and citizens. The insights derived from this real-time data help municipalities manage these systems. IoT waste management is a new frontier for local authorities aiming to reduce municipal waste. As per a recent survey by IoT Analytics, over 70% of cities have deployed IoT systems for security, traffic, and water-level monitoring. IoT is yet to be fully deployed for smart waste management.

With rapid population increase, sanitation-related issues concerning garbage management are on the rise. Unmanaged garbage creates unhygienic conditions for citizens in the surrounding areas, leading to the spread of diseases. IoT in waste management is a trending solution: by using IoT, waste management companies can increase operational efficiency and reduce costs.

The waste collection process in urban areas is complex and requires a significant amount of resources. Large cities can spend more than $300 million annually on collecting and managing waste. Most high-income cities charge their citizens to cover a fraction of this expense; the rest is compensated from tax revenue, which financially burdens the local government.

Municipalities and waste management companies have improved route efficiencies. But they haven't leveraged technological innovations for improving operational efficiency. Even with the route optimization process, the manual process wastes money and time. The use of smart devices, machine-to-machine connectivity, sensors, and IoT can reduce costs. A smart waste management system using IoT can reduce expenses in the trash collection process. But how? How does the use of IoT in waste management improve waste collection efficiencies?

 

How Does IoT in Waste Management Respond to Operational Inefficiencies?


A smart waste management system using IoT improves the efficiency of waste collection and recycling. Route optimization, which reduces fuel consumption, is the most common use case for IoT waste management solutions.

IoT-powered, smart waste management solutions comprise endpoints (sensors), IoT platforms, gateways, and web and mobile applications. Sensors are attached to dumpsters to check their fill level. Gateways bridge the gap between the IoT platform and the sensor, sending data to the cloud. IoT platforms then transform the raw data into information. 
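The endpoint-gateway-platform flow can be sketched as follows, with an in-memory queue standing in for the gateway's real uplink (MQTT, cellular, etc.); bin IDs and thresholds are illustrative:

```python
import json
import queue
import time

uplink = queue.Queue()  # stands in for the gateway's uplink to the cloud

def sensor_report(bin_id, fill_pct):
    """Endpoint: a dumpster sensor publishes its fill level to the gateway."""
    uplink.put(json.dumps({"bin": bin_id, "fill_pct": fill_pct, "ts": time.time()}))

def platform_poll(threshold=80):
    """Platform: transform raw readings into 'needs pickup' information."""
    full = []
    while not uplink.empty():
        msg = json.loads(uplink.get())
        if msg["fill_pct"] >= threshold:
            full.append(msg["bin"])
    return full

sensor_report("bin-12", 85)
sensor_report("bin-13", 40)
print(platform_poll())  # ['bin-12']
```

The "needs pickup" list produced here is exactly the input the route-optimization step consumes.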

 

Benefits of IoT Waste Management Solutions


There are several advantages of using IoT-powered waste management solutions. 

  • Reduced Waste Collection Costs:
    Dumpsters that employ IoT can transmit their real-time information on fill-level. The data is shared with the waste collectors. The use of data and selection of optimum routes leads the waste collection trucks to consider the dumpsters with high fill levels. This saves fuel, money, and effort. 
  • No Missed Pickups:
    The smart IoT waste management system eliminates the overflowing of trash bins. The authorities are immediately notified when the trash bins are about to fill up to their capacity. And the collection trucks are scheduled for pickup. 
  • Waste Generation Analysis:
    IoT waste management isn't about route optimization alone. The actual value of an IoT-powered process lies in data analysis. Most IoT solutions are coupled with data analytics capabilities. They help IoT waste management companies anticipate future waste generation.
  • Reduction In Carbon Dioxide Emission:
    Optimized routes cause less fuel consumption. They reduce the carbon footprint and make the waste management process eco-friendlier.
  • Efficient Recycling:
    Over the years, the appearance of consumer electronic devices in landfills has become a growing concern because of their harmful chemicals and valuable components. But this concern also presents an opportunity: IoT-enabled sanitation systems give businesses a way to recycle e-waste for its resources.
  • Automating IoT Management Systems:
    IoT waste management can also help with waste categorization. Digital bins can automate the sorting, segregation, and categorization of waste, saving many man-hours. The Polish company Bin-e combines AI-based object recognition, fill-level control and data processing; its 'Smart Waste Bins' identify and sort waste into four categories - paper, glass, plastic, and metal - making waste processing more efficient.


Future of IoT Waste Management


IoT waste management is a boon. The growing use of IoT linked with the management of everyday urban life improves the everyday experience of the citizens. Additionally, it reduces carbon footprint. But to do so in the waste management segment, more support is needed from the public sector through incentives and regulations. The private sector needs to contribute via innovation. Engagement from the various state agencies is required to implement the usage of IoT applications. This will help build a more sustainable future.


Conclusion


Those managing waste collection, sorting, segregation, and categorization can benefit from a smart waste management system using IoT. By employing IoT in waste management, companies can increase operational efficiency, reduce costs, and enhance citizen satisfaction by ensuring dumpsters don't overflow.


Many businesses are already taking advantage of IoT solutions to improve their efficiency and create new revenue streams. However, if you're considering launching a connected business, one of the most important factors to contemplate is the cost of IoT software implementation. This article will give you an overview of what goes into IoT software development and maintenance. 

Different factors feed into the cost, but the two most common concerns for companies getting into IoT are the cost of initial software development (or “integration”) and ongoing expenses after devices have been deployed. Unfortunately, as key stakeholders ponder over the ever-present build vs buy dilemma, the ones who lean towards building often tend to underestimate both significantly.

Let's take a look at a minimum set of software products you would need today to run a connected product, business, or service. First of all, firmware - software that is uploaded and then runs on the hardware. It provides a low-level control for the device's specific logic. Networks and connectivity – it's a part of firmware development, but I would move it into a separate domain, crucial for any IoT implementation.

Cloud is any service made available to users on demand via the Internet from a cloud computing provider's servers. The IoT servers have different purposes, like administration, monitoring, data gathering and analysis. Applications - once the device is connected, in today's reality you would need a user interface to interact with the device or service, configure it, control and monitor remotely, visualize processes, etc. It can be a touch control, a mobile app, a web app, a voice app (e.g. Amazon Alexa skill), etc.

Working with deployed connected products also usually requires two different types of apps: customer-facing applications (remote control, automation settings, maintenance alerts) and applications for internal company use (fleet management, analytics, device health tracking, performance tracking and maintenance alerts). And one thing is to offer an app, and a totally different thing is to build an app people will actually love to use. The latter requires a particularly strong UI/UX expertise in addition to the expected front-end, back-end and QA resources. 

As part of an IoT solution, you'll need additional storage capacity and processing power to perform analytics, run reports, and house the vast amounts of data that will be generated. Invoicing for these capabilities can vary—from a fixed monthly cost to metered billing—so make sure you understand the pricing model to anticipate cash flow better.
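A quick back-of-the-envelope model can help compare fixed and metered pricing when anticipating cash flow. The rates below are invented for illustration, not any vendor's actual pricing:

```python
def monthly_cost_fixed(flat_fee):
    """Fixed pricing: same bill regardless of usage."""
    return flat_fee

def monthly_cost_metered(messages, gb_stored, per_million_msgs=1.0, per_gb=0.25):
    """Metered pricing: hypothetical $1 per million messages, $0.25 per GB-month."""
    return (messages / 1_000_000) * per_million_msgs + gb_stored * per_gb

# 10,000 devices reporting once a minute for a 30-day month
msgs = 10_000 * 60 * 24 * 30  # 432,000,000 messages
print(monthly_cost_metered(msgs, gb_stored=200))  # 482.0
```

Running both models against your projected device count and reporting interval shows where the break-even point sits, which is exactly the question to settle before committing to a pricing model.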

Various IoT platforms offer parts of the solutions for the software needs mentioned above. However, it often takes at least 3-5 different vendors to get everything an IoT powered business needs. Not only is it challenging to manage so many vendors, but also the costs really start adding up, making IoT implementation and maintenance pricing prohibitive for many companies, especially the smaller ones.

Fortunately, there are now options like Blynk IoT platform that have launched solutions tailored specifically at small businesses and startups. As a result, engineers and entrepreneurs worldwide can build and commercialize connected products without the heavy investment typically required to start an IoT-enabled business. Anyone with an MCU, some coding skills, and a great product idea can create an IoT business. And their monthly software costs will be less than what they pay for a typical TV subscription in the US.

Out-of-the-box, Blynk is supposed to cover 90-100% of software needs a business typically faces in the first 2-3 years of IoT operations. The platform functionality includes device provisioning and management, data hosting in the cloud, mobile and web apps for customers and staff, firmware over-the-air updates, user and organization management, data analytics, all kinds of automations and much more.

 

IoT software - build or buy?

As you can see, building your own IoT software from scratch is not a cheap endeavor, especially with a team based in the USA. If you have all of the right people on board and have a bulletproof ROI model for your IoT investment - go for it, build in-house. But if you are an OEM whose main focus remains on their core products and you care about optimizing costs and your time to market - then you are probably better off leveraging a solid IoT platform. Those folks have already spent those years (and in most cases, millions) building out the software you need and testing it out with real clients, in real world conditions, with all of the priceless learnings that come with that.


Enterprise Resource Planning (ERP) systems, as the name suggests, enable a company, no matter the industry, to better plan the use and management of its resources and achieve seamless operations. Such solutions, when integrated with the Internet of Things (IoT), i.e. a network of connected devices that enables the exchange of data in real-time, can work wonders in the manufacturing sector. No, really. Now, if you are wondering why that is, allow us to demonstrate via some key benefits of this duo.

  1. Better management of assets: One of the best things about the evolution of technology is that it helps businesses tend to their assets, machines, and equipment much better than before, i.e. without waiting until an asset has broken down and been rendered unproductive, even temporarily. Thanks to IoT sensors embedded in such assets, it becomes much easier to identify wear and tear and other issues across their lifetime. These issues are flagged to the ERP software, which then informs the teams responsible for maintaining those assets. This allows companies to undertake preventive maintenance, prolonging the life of their assets. It also helps ensure that operations are not interrupted, since maintenance work can be scheduled in a manner that prevents or minimizes downtime.
  2. Access to real-time analytics: As the basic idea of IoT suggests, connected devices double up as a source of 24x7 information: they glean data from their sensors and channel it into the requisite systems. This ability to collect data at all times, from all connected devices, means manufacturing businesses can process it through ERP systems to gain highly valuable information such as market trends, processes that might need improvement, possible quality issues, and much more. Such information, in turn, drives better-informed strategic and marketing decisions, and in real time.
  3. Improved quality control: Quality is typically one of the top priorities of any business involved in manufacturing, no matter what one may be making. This is where long-established, manual quality checks would come in, but they are time-consuming and prone to high levels of human error. This problem, thankfully, is easily addressed with an IoT-integrated ERP solution, which empowers companies and their management to improve the quality of their offerings via round-the-clock monitoring of their production processes.
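To make the asset-management point concrete, here is a hypothetical sketch in which a sensor reading over a vibration limit queues a preventive-maintenance work order for the ERP to pick up. `WorkOrder` and the limit value are illustrative stand-ins, not a real ERP API:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class WorkOrder:
    asset_id: str
    reason: str

work_orders: List[WorkOrder] = []  # stands in for the ERP's maintenance queue

def on_sensor_reading(asset_id, vibration_mm_s, limit=7.1):
    """Queue a preventive-maintenance order when vibration exceeds the limit.

    7.1 mm/s is used here as an example alarm level (in the spirit of
    ISO 10816 vibration severity zones); tune per asset class.
    """
    if vibration_mm_s > limit:
        work_orders.append(WorkOrder(asset_id, f"vibration {vibration_mm_s} mm/s"))

on_sensor_reading("pump-3", 4.2)   # healthy, no order raised
on_sensor_reading("press-7", 9.8)  # over limit, order raised
print([w.asset_id for w in work_orders])  # ['press-7']
```

In a real integration the appended record would instead be posted to the ERP's maintenance module, where scheduling and technician assignment happen.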

As the world and the technologies around us continue to evolve at a dizzying pace, ERP solutions have emerged as a favorite among modern businesses. As evidenced by the discussion above, enterprise software fortified with advanced technologies such as IoT and artificial intelligence can bring a world of benefits to manufacturing companies, as well as those operating in other sectors across the globe, driving demand for such solutions further up.

With that being said, if you too wish to make use of all the aforementioned benefits, and countless others such as automation and better customer service, the integration of ERP with IoT is the right way forward.


The demand for Computer Numerical Control (CNC) equipment is steadily increasing and is expected to see strong growth over the coming years, at an annual rate of more than six percent. CNC machining plays a major role in modern manufacturing and helps create a diverse range of products across several industries, from agriculture, automotive, and aerospace to semiconductors and circuit boards.

Nowadays, machining has developed rapidly in terms of processing complexity, precision, machine scale, and automation level. CNC machine tools play a vital role in improving processing quality and efficiency, and IoT-enabled CNC machine monitoring solutions create machine-to-machine interaction, resulting in automated operations and less manual intervention.

IoT sensors embedded on CNC machines can measure various parameters and send them to a platform from which the state and operation of the machines can be fully supervised. Furthermore, CNC machines can act on the data collected from sensors to replace tools, change a degree of freedom, or perform any other action.

ADVANTAGES:

An enterprise can leverage the following advantages from the coalescence of Industry 4.0 and CNC.

Predictive Maintenance:

The Industrial IoT allows CNC machine operators and handlers to interconnect with their machines in many ways through smartphones or tablets. Operators can therefore monitor the condition of machines remotely at all times using Faststream’s IoT-based CNC machine monitoring.

This remote, real-time monitoring helps operators schedule a CNC machine for inspection or repair.

Operators can also configure their CNC machines to send alerts or notifications whenever the machines are due for tuning or maintenance. In other words, the machine will raise red flags about complications such as a rise in temperature, increased vibration, or tool damage.
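Threshold-based red flags like these can be sketched in a few lines; the limit values below are illustrative, not from any real machine's specification:

```python
# Example alarm limits for a CNC spindle; values are illustrative
LIMITS = {"temperature_c": 80.0, "vibration_mm_s": 7.0, "tool_wear_pct": 90.0}

def check_machine(machine_id, telemetry):
    """Return human-readable alerts for any telemetry value over its limit."""
    return [
        f"{machine_id}: {key} at {value} exceeds limit {LIMITS[key]}"
        for key, value in telemetry.items()
        if key in LIMITS and value > LIMITS[key]
    ]

alerts = check_machine("cnc-02", {"temperature_c": 92.5, "vibration_mm_s": 3.1})
print(alerts)  # ['cnc-02: temperature_c at 92.5 exceeds limit 80.0']
```

Each alert string would then be pushed to the operator's phone or tablet via the monitoring platform's notification channel.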

Reducing Downtime and Efficient Machine Monitoring:

Digital transformation in CNC machining has broad scope and is not restricted to remote control and programmed maintenance. Our IoT system's real-time alert features help reduce machine downtime and raise overall equipment effectiveness: alerts received from machines can drive predictive measures before the unexpected breakdown of tools or any other element of a CNC machine.

Faststream Technologies provided similar solutions to its clients by deploying its IoT energy management solution for their CNC machines. Before implementing these solutions, the client was struggling with unexpected machine breakdowns. Faststream's IoT solution gave them clear insight into the running hours of their CNC machines, which in turn gave them an accurate picture of how well they were maintaining their production run-time.

Machine downtime reduction solutions can be applied to a chain of CNC machines, not only to improve their processing but also to boost machine synchronization on the factory floor and achieve operational excellence.

Less manual effort and Worker Safety:

At a larger scale, Industrial IoT technology can also be implemented to reduce manual effort, or in other words, to mitigate the risk of worker injury in factory operations.

This is where machine-to-machine synchronization and interrelation come into the picture. The synergy between machines results in better coordination between various electromechanical devices, which leads to automated operations in a manufacturing unit.

Many companies are already developing smart robots and machines that can perform pre-programmed tasks and respond to the needs of CNC machines, taking the strain of repetitive quality work off the manual workforce. These robots can handle confined, precise jobs such as opening and closing the door of a CNC machine or changing a tool whenever sharpening is required.

Apart from lowering injuries in the workshop, our Industry 4.0 solution for CNC machines also helps reduce material wastage and improve machine efficiency, enabling the production of precise parts in a shorter time frame.

CONCLUSION

CNC machines are electromechanical devices that drive tools across multiple axes with high accuracy, producing parts to the exact commands of a computer program. They run faster than non-automated machines and can produce objects with high accuracy from virtually any design.

Read more…

With a lot of buzz in the industry, the Internet of Things, a.k.a. IoT, has successfully gained traction. Confused about what IoT is? Don't be, because you have been using it in your everyday life (and if not you, then definitely someone you know) through smartwatches, fitness devices, self-driving cars, smart microwaves, and more.

IoT is a network of connected devices whose data and information are interlinked in ways you might not even notice!

Now that the concept of IoT is briefly covered, let's see how it could become the fifth revolution in the dairy industry.

2018 saw the fourth industrial revolution, a new step in the production, automation, and computerization of processes using data provided by IoT devices. One might think this concept applies only to industries like health & fitness or electronics, but the revolution is no less present in agriculture.

As per a study, agro-tech companies received a massive $3.2 billion in investment in 2016. This is evidence enough of the growing need for digitalisation in every aspect of dairy farming.

 

Why the need for smart dairy farming?

 

In a fast-growing industry, staying up to date with essential technology is the need of the hour. To keep livestock healthy, it is essential to prevent illness by diagnosing it at an early stage.

For 97% of U.S. dairy farms, the farm is more than just a source of income: it is a family-owned business. Most of these families have been in livestock farming for generations, but the business is not the same as it was decades ago.

Smart dairy farming using IoT can be a revolutionary way to improve farm capacity, reduce animal mortality, and increase dairy output.

To meet the growing demand for dairy from an increasing population, especially in developed countries, better tools and specialized equipment are required. IoT-integrated smart collars serve that purpose.

 

How does the smart collar work?

 

The smart collar is a complete IoT-enabled cattle management system: a physical device linked to a digital dashboard.

The cattle-tracking device, with inbuilt GPS, gives the real-time location of the cattle and sends its position to the owner every quarter of an hour.

The collars connect to routers installed near the farming field, which relay their signals.

Vital-sign sensors are built into the collar strap, continuously reporting to the software dashboard. Once the belt is fitted, the data is transferred and stored in the form of graphs and charts.
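As a rough sketch, the 15-minute reporting schedule can be turned into an overdue check on the dashboard backend. The interval and grace period below are illustrative assumptions, not taken from any specific collar product:

```python
from datetime import datetime, timedelta, timezone

REPORT_INTERVAL = timedelta(minutes=15)  # the collar reports every quarter hour
GRACE = timedelta(minutes=5)             # allowance for network delay

def is_overdue(last_seen: datetime, now: datetime) -> bool:
    """Flag a collar that has missed its 15-minute reporting window."""
    return now - last_seen > REPORT_INTERVAL + GRACE

# A collar last heard from 30 minutes ago is overdue; 10 minutes ago is not.
now = datetime(2021, 6, 1, 12, 0, tzinfo=timezone.utc)
print(is_overdue(now - timedelta(minutes=30), now))  # True
print(is_overdue(now - timedelta(minutes=10), now))  # False
```

An overdue collar could mean a dead battery, a lost animal, or a router out of range, so the same check doubles as a basic security alert.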

 

What are the benefits of smart dairy farming using IoT?


 

Auto-Milking Process

 

Manual milking is a time-consuming process that also requires more staffing. IoT-embedded smart collar belts can solve the problem more efficiently, with less manpower, by enabling auto-milking.

On its own, auto-milking is just a robotic system: entirely automated, but unaware of temperature or of any diseases affecting the cattle. The machine milks all the cattle at the same time, in the same way.

When IoT is linked to the cattle, essential factors are monitored that can otherwise be missed when done manually.

Temperature monitoring, disease tracking, and nutritional requirements are a few of the things tracked by a smart belt, helping produce better-quality milk.

 

Tracking the heat cycle

 

Milking a cow that is not in its heat cycle leads to low fertility. To keep producing the best-quality milk, cattle must give birth to one calf a year to maintain the lactation period.

A lactating cow comes into heat every 21-28 days, but is it possible to track that manually, and accurately? It can be done, but it takes a lot of time.

Heat can stress the cattle, leading to lower milk production, and milking during this period can further reduce the fat, protein, casein, and lactose content of the milk.

To prevent such errors, the smart collar sends alarms to the owner's dashboard, notifying them of the right time to milk and resulting in better milk production.
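The 21-28 day cycle lends itself to a simple reminder window. The following sketch assumes only that the last observed heat date is recorded per animal:

```python
from datetime import date, timedelta

CYCLE_MIN = timedelta(days=21)  # earliest expected return to heat
CYCLE_MAX = timedelta(days=28)  # latest expected return to heat

def next_heat_window(last_heat: date):
    """Return the (earliest, latest) dates the next heat is expected."""
    return last_heat + CYCLE_MIN, last_heat + CYCLE_MAX

def in_heat_window(last_heat: date, today: date) -> bool:
    """True if today falls inside the expected heat window."""
    start, end = next_heat_window(last_heat)
    return start <= today <= end

# Example: last observed heat on 1 March 2021
start, end = next_heat_window(date(2021, 3, 1))
print(start, end)  # 2021-03-22 2021-03-29
```

A collar with activity sensors would refine this calendar estimate with behavioral data, but even the date window alone narrows down when to watch each cow closely.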

 

Tracking the movement with GPS

 

The tracking collars, fitted with GPS, provide real-time data so owners know the accurate location and status of each animal.

The smart collar works best in herds of around 5-10 cattle, as each collar acts as a personal tracker and frees up the owner's time to focus on individual animals.

Investing in manpower may seem less costly at the start, but over time IoT for cattle becomes the more sustainable option and can help your business grow in no time.

 

Health tracking

 

Healthy eating leads to a healthier life, and that holds for all living beings. Many studies and experts say that "rumination in cattle is an indicator of health and performance."

The traditional method of visually analysing rumination requires a workforce and can only be performed in the field. It is also limited to a portion of the herd, so the chance of errors increases.

How can one know every cow's health while sitting comfortably away from the field? An IoT-enabled software system tracks each cow's rumination data and helps producers see when one needs more attention.

Although visual observation can be trusted to assess rumination in a single cow, it may not give accurate results at the population level, which would compromise the herd's health standards.
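One plausible way to turn per-cow rumination data into an attention alert is to compare each day against the cow's own recent average. The 20% drop threshold below is a hypothetical choice for illustration, not a veterinary standard:

```python
from statistics import mean

def rumination_alert(history, today_minutes, drop_fraction=0.2):
    """Alert if today's rumination time is more than drop_fraction below
    the cow's own recent average (history = daily minutes, most recent last)."""
    if not history:
        return False  # no baseline yet for a newly collared animal
    baseline = mean(history)
    return today_minutes < baseline * (1 - drop_fraction)

# A cow ruminating ~480 min/day that suddenly drops to 350 min gets flagged.
history = [470, 485, 490, 475, 480]
print(rumination_alert(history, 350))  # True: more than 20% below baseline
print(rumination_alert(history, 460))  # False: within normal variation
```

Using each cow's own baseline, rather than a herd-wide constant, is what makes sensor data more precise than spot-checking a sample of the herd visually.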

 

Decrease mortality with security alerts

 

What if one needs to know how much a cow grazed on a particular day? Without sensors, that is only possible through manual observation. Furthermore, how does one analyse whether rumination is happening effectively?

Monitoring the changes and behaviors of the herd is one of the most significant and time-consuming tasks.

Using IoT devices such as smart neck belts, it gets easier to monitor unusual cattle movements. The belt sends an alarm any time it detects that something is off.

Sensors embedded in the neck strap help farmers supervise each cow's movement closely and respond accordingly.

Smart sensors will automatically gather and store the data and will help farmers prevent any growing health issues. 

 

Control Disease Outbreaks

 

These speechless animals are never going to deal with their health issues on their own. Even with suspicious changes in behavior, some diseases are very likely to go unnoticed.

Often the only way left to spot disease is to diagnose it yourself, by which point many other cattle may already be at risk.

Lameness, foot-and-mouth disease, mastitis, and milk fever are among the most common fatal diseases in cattle. All of them can be caught early, saving farmers from trouble and financial loss in the future.

With a smart vital-monitoring device embedded in the collar, the system alerts the farmer whenever an animal needs assistance.

 

In a nutshell

 

In the world of "connecting everything," IoT links not only devices but also information and data, which circulate within a span of milliseconds. So why not use the advantages of such devices to guard against unexpected outcomes?

Traditional methods of cattle farming work, but if not watched closely they can hurt milk quality and lead to a massive loss of cash flow. Cattle farming is not an easy job; it needs 24 hours of continuous monitoring and observation to earn a successful income.

IoT offers real-time data collection: not so much a replacement of manpower as a more refined version of it. By introducing the "smart cow" concept, time and labor are reduced and productivity increases.

Read more…

Have you ever imagined you would one day wake up to news that technology is influencing an industry as offbeat as fashion? Well, here we are in this phase of tech evolution, where we may soon wear apparel not from clothing lines like Gucci or Saint Laurent but from companies like Apple, Samsung, or Google.

Yes, smart clothing seems to be the future of wearable technology, which kickstarted with ambitious projects like Google Glass. Today, we have first-generation IoT wearable devices like smartwatches, health monitors, Fitbits and more, but soon we will have clothing with embedded sensors that connect to the internet.

Wearable fashion, or smart clothing, will be part of the Internet of Things revolution, soon giving us insights into our vitals, temperature, and hydration levels and bringing a range of predictive analytics into our everyday lives.

Excited?

We are, too, and that's why we decided to publish a post exploring what smart clothing is and how it is redefining conventions.

Let’s get started.

Smart Clothing Is The Future Of The Wearables Industry

Let’s start with some numbers. Statistics from the World Economic Forum suggest that around 10% of people worldwide will wear clothes connected to the internet by 2025.

This is a really good sign for the smart wearable technology industry. And if you’ve been wondering if this concept is something new or fresh, it’s not. Smart clothing has been a micro niche for a long time with several sports companies like Nike and Adidas rolling out very specific lines of smart clothes for sports purposes. What is new is the approach of mainstream commercialization of smart clothing. 

The idea is to embed IoT-specific peripherals like sensors and batteries into the fabric of clothes and connect the entire ecosystem to an app that visualizes diverse insights. With the app, consumers can also perform a few fancy actions, like changing the design, color, or hue of their apparel in real time.

To give you a quick idea of how remarkable the concept of smart clothing is, here are some pointers:

Smart clothing is highly beneficial for tracking people's vitals. Technology is also being developed to monitor the accumulation of brain fluids in real time and report consequences to doctors and other stakeholders.

  1. The predominant use of smart clothing lies in the sports industry, where coaches could monitor several metrics for individual players and the entire team to reach fitness and tournament goals.
  2. From a manufacturer’s perspective, fraudulent and inauthentic copies of labels and apparel can be eliminated from the market through codes and layered validation mechanisms.
  3. Patients in hospitals could wear smart clothing to track their recovery and reactions to medications, notify or call for nurses and doctors, and more.
  4. People suffering from dementia or Alzheimer’s could wear smart clothing to let their friends and families track them from a distance.
  5. Adventurers, spelunkers, high-altitude trekkers and more could also benefit from smart clothing, with details on oxygen levels, anticipated temperature, location tracking, hydration levels, humidity and more.

Though this looks futuristic and ambitious, the biggest challenge for smart clothing companies will be incorporating IoT components into their fabrics. The human body constantly generates sweat and heat, and moisture from sweat could damage the batteries or sensors embedded in the clothes. Once these concerns are fixed and optimized outfits delivered, smart clothes could very well be what we wear to work every day in the coming years.

Smart Fashion Products In The Market

As we mentioned, smart clothes are in development, and some products are already available in the market as prototypes. Tons of Kickstarter campaigns are exploring the limitless possibilities of smart fashion as well. For better clarity, here are some real-world examples.


Smart Jackets

The best-known product in development is the smart jacket. Since 2015, two market players, Google and Levi’s, have been collaborating on smart jackets with touch-sensitive fabric. Capacitive threads made of copper are woven into the jacket’s fabric, letting users operate their smartphones with simple hand gestures.

Minority Report vibes anyone?

Smart Socks

An inevitable accessory, socks have always been in our wardrobes. Smart socks are here to replace the conventional ones and give you a more immersive experience.

What could be immersive in socks you ask? 

Well, smart socks could sense the pressure you put on your foot when walking, calculate your walking speed, the distance you cover (and could cover) and offer a detailed visualization of insights from multiple data points. This could influence the way you walk as well.

Smart Shoes

If we’re making our socks smart, why leave behind shoes? Somebody out there had a similar thought and the result is a pair of shoes that tracks your fitness, speed, pressure and most importantly, lets you control your television using your feet. All you have to do is extend your foot, point at something on your television and press a button. We’re sure there would be more experiences added to the product as it evolves.

Smart Sleepwear

Sleep has always been a concern for most of us. While some of us oversleep, a few of us hardly get good sleep. There are tons of factors influencing our sleep including our sleeping positions, stress and anxiety levels and more. 

Smart sleepwear, however, is here to fix our broken and disconnected sleep patterns by giving us insights on breathing, heart rates, sleep positions and more. The best part is that you don’t have to wear an additional wristband for this. The fabric has embedded devices that take care of its intended purposes.

Is Smart Clothing The New-Age Market Wearable?

Wearable tech plays a crucial role in the tech space because it’s probably something that could be the most integral to people. Conventional wearable technology devices like wristbands, smartwatches, eyewear and more appear and function as extensions but that’s not the case with clothing.

It is us and who we are. From a consumer’s perspective, there’s no challenge whatsoever in maintaining smart clothes. It’s on companies to develop and launch products that can withstand regular human wear and tear, be washable, and more.

If these preliminary challenges are taken care of seamlessly by companies, smart clothing could easily become the new-age market wearables in the future. It’s similar to what electric vehicles are to the automotive industry. 

Wrapping Up

This is an exciting time to be alive, and innovations like these are proof of our collective wisdom taking us to a new level. Let’s wait and see if our favorite clothing brands join the smart clothing bandwagon and launch clothing personalized just the way each of us would like. And let’s see what further innovation is on the cards in this space.

Read more…

By Ashley Ferguson

Thanks to the introduction of connected products, digital services, and increased customer expectations, enterprise IoT spend has been consistently increasing. The global IoT market is projected to reach $1.4 trillion USD by 2027. The pressure to build IoT solutions and get a return on those investments has teams on a frantic search for IoT engineers to secure in-house IoT expertise. However, due to the complexity of IoT solutions, finding all of this in a single engineer is a difficult, if not impossible, proposition.

So how do you adjust your search for an IoT engineer? The first step is to acknowledge that IoT solution development requires the fusion of multiple disciplines. Even simple IoT applications require hardware and software engineering, knowledge of protocols and connectivity, web development skills, and analytics. Certainly, there are many engineers with IoT knowledge, but complete IoT solutions require a team of partners with diverse skills. This often requires utilizing external sources to supplement the expertise gaps.

THE ANATOMY OF AN IoT SOLUTION

IoT solutions provide enterprises with opportunities for innovation through new product offerings and cost savings through refined operations. An IoT solution is an integrated bundle of technologies that help users answer a question or solve a specific problem by receiving data from devices connected to the internet. One of the most common IoT use cases is asset tracking solutions for enterprises who want to monitor trucks, equipment, inventory, or other items with IoT. The anatomy of an asset tracking IoT solution includes the following:

(Diagram: the anatomy of an asset tracking IoT solution)

This is a simple asset tracking example. For more complex solutions including remote monitoring or predictive maintenance, enterprises must also consider installation, increased bandwidth, post-development support, and UX/UI for the design of the interface for customers or others who will use the solution. Enterprise IoT solutions require an ecosystem of partners, components, and tools to be brought to market successfully.

Consider the design of your desired connected solution. Do you know where you will need to augment skills and services?

If you are in the early stages of IoT concept development and at the center of a buy vs. build debate, it may be a worthwhile exercise to assess your existing team’s skills and how they correspond with the IoT solution you are trying to build.

IoT SKILLS ASSESSMENT

  • Hardware
  • Firmware
  • Connectivity
  • Programming
  • Cloud
  • Data Science
  • Presentation
  • Technical Support and Maintenance
  • Security
  • Organizational Alignment

MAKING TIME FOR IoT APPLICATION DEVELOPMENT

The time it will take your organization to build a solution is dependent on the complexity of the application. One way to estimate the time and cost of IoT application development is with Indeema’s IoT Cost Calculator. This tool can help roughly estimate the hours required and the cost associated with the IoT solution your team is interested in building. In MachNation’s independent comparison of the Losant Enterprise IoT Platform and Azure, it was determined that developers could build an IoT solution in 30 hours using Losant and in 74-94 hours using Microsoft Azure.

As you consider IoT application development, consider the makeup of your team. Is your team prepared to dedicate hours to the development of a new solution, or will it be a side project? Enterprise IT teams are often in place to maintain existing operating systems and to ensure networks are running smoothly. In the event that an IT team is tapped to even partially build an IoT solution, there is a great chance that the IT team will need to invite partners to build or provide part of the stack.

HOW THE IoT JOB GETS DONE

Successful enterprises recognize early on that some of these skills will need to be augmented through additional people, through an ecosystem, or with software. It will require more than one ‘IoT engineer’ for the job. According to the results of a McKinsey survey, “the preferences of IoT leaders suggest a greater willingness to draw capabilities from an ecosystem of technology partners, rather than rely on homegrown capabilities.”

IoT architecture alone is intricate. Losant, an IoT application enablement platform, is designed with many of the IoT-specific components already in place. Losant enables users to build applications in a low-to-no code environment and scale them up to millions of devices. Losant is one piece in the wider scope of an IoT solution. In order to build a complete solution, an enterprise needs hardware, software, connectivity, and integration. For those components, our team relies on additional partners from the IoT ecosystem.

The IoT ecosystem, also known as the IoT landscape, refers to the network of IoT suppliers (hardware, devices, software platforms, sensors, connectivity, software, systems integrators, data scientists, data analytics) whose combined services help enterprises create complete IoT solutions. At Losant, we’ve built an IoT ecosystem with reliable experienced partners. When IoT customers need custom hardware, connectivity, system integrators, dev shops, or other experts with proven IoT expertise, we can tap one of our partners to help in their areas of expertise.

SECURE, SCALABLE, SEAMLESS IoT

Creating secure, scalable, and seamless IoT solutions for your environment begins by starting small. Starting small gives your enterprise the ability to establish its ecosystem. Teams can begin with a small investment and apply learnings to subsequent projects. Many IoT success stories begin with enterprises setting out to solve one problem. The simple beginnings have enabled them to now reap the benefits of the data harvest in their environments.

Originally posted here.

Read more…

By Tony Pisani

For midstream oil and gas operators, data flow can be as important as product flow. The operator’s job is to safely move oil and natural gas from its extraction point (upstream), to where it’s converted to fuels (midstream), to customer delivery locations (downstream). During this process, pump stations, meter stations, storage sites, interconnection points, and block valves generate a substantial volume and variety of data that can lead to increased efficiency and safety.

“Just one pipeline pump station might have 6 Programmable Logic Controllers (PLCs), 12 flow computers, and 30 field instruments, and each one is a source of valuable operational information,” said Mike Walden, IT and SCADA Director for New Frontier Technologies, a Cisco IoT Design-In Partner that implements OT and IT systems for industrial applications. Until recently, data collection from pipelines was so expensive that most operators only collected the bare minimum data required to comply with industry regulations. That data included pump discharge pressure, for instance, but not pump bearing temperature, which helps predict future equipment failures.

A turnkey solution to modernize midstream operations

Now midstream operators are modernizing their pipelines with Industrial Internet of Things (IIoT) solutions. Cisco and New Frontier Technologies have teamed up to offer a solution combining the Cisco 1100 Series Industrial Integrated Services Router, Cisco Edge Intelligence, and New Frontier’s know-how. Deployed at edge locations like pump stations, the solution extracts data sent from pipeline equipment over legacy protocols and transforms it at the edge into a format that analytics and other enterprise applications understand. The transformation also minimizes bandwidth usage.
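To illustrate the kind of edge transformation described (this is not Cisco's actual API), here is a minimal sketch that maps raw Modbus-style register values to named, scaled JSON fields. The register map and scale factors are hypothetical:

```python
import json

# Hypothetical Modbus holding-register map for one pump station.
# Each register holds a raw 16-bit value; the scale converts it to units.
REGISTER_MAP = {
    0: ("discharge_pressure_psi", 0.1),
    1: ("bearing_temp_c", 0.01),
    2: ("flow_rate_bbl_h", 1.0),
}

def registers_to_json(raw: dict) -> str:
    """Convert raw register values into named, scaled JSON fields."""
    record = {}
    for reg, value in raw.items():
        if reg in REGISTER_MAP:
            name, scale = REGISTER_MAP[reg]
            record[name] = round(value * scale, 2)
    return json.dumps(record)

print(registers_to_json({0: 1250, 1: 6543, 2: 480}))
# {"discharge_pressure_psi": 125.0, "bearing_temp_c": 65.43, "flow_rate_bbl_h": 480.0}
```

Sending the named, scaled record northbound instead of polling raw registers from the cloud is also what keeps bandwidth usage low.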

Mike Walden views the Cisco IR1101 as a game-changer for midstream operators. He shared with me that “Before the Cisco IR1101, our customers needed four separate devices to transmit edge data to a cloud server—a router at the pump station, an edge device to do protocol conversion from the old to the new, a network switch, and maybe a firewall to encrypt messages…With the Cisco IR1101, we can meet all of those requirements with one physical device.”

Collect more data, at almost no extra cost

Using this IIoT solution, midstream operators can for the first time:

  • Collect all available field data instead of just the data on a polling list. If the maintenance team requests a new type of data, the operations team can meet the request using the built-in protocol translators in Edge Intelligence. “Collecting a new type of data takes almost no extra work,” Mike said. “It makes the operations team look like heroes.”
  • Collect data more frequently, helping to spot anomalies. Recording pump discharge pressure more frequently, for example, makes it easier to detect leaks. Interest in predicting (rather than responding to) equipment failure is also growing. The life of pump seals, for example, depends on both the pressure that seals experience over their lifetime and the peak pressures. “If you only collect pump pressure every 30 minutes, you probably missed the spike,” Mike explained. “If you do see the spike and replace the seal before it fails, you can prevent a very costly unexpected outage – saving far more than the cost of a new seal.”
  • Protect sensitive data with end-to-end security. Security is built into the IR1101, with secure boot, VPN, certificate-based authentication, and TLS encryption.
  • Give IT and OT their own interfaces so they don’t have to rely on the other team. The IT team has an interface to set up network templates to make sure device configuration is secure and consistent. Field engineers have their own interface to extract, transform, and deliver industrial data from Modbus, OPC-UA, EIP/CIP, or MQTT devices.
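Mike's point about sampling intervals is easy to demonstrate. The sketch below compares the maximum pressure observed when sampling synthetic per-minute data every 30 minutes versus every minute; the values are invented for illustration:

```python
# A day of per-minute pump pressures (psi) with one brief spike at minute 700.
pressures = [900.0] * 1440
pressures[700] = 1450.0  # a short, seal-damaging pressure spike

def max_seen(samples, every_n):
    """Maximum pressure observed when sampling every `every_n` minutes."""
    return max(samples[::every_n])

print(max_seen(pressures, 30))  # 900.0  -- the spike falls between samples
print(max_seen(pressures, 1))   # 1450.0 -- per-minute sampling catches it
```

The 30-minute poll reports a perfectly healthy pump; only the denser sampling reveals the peak pressure that shortens seal life.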

As Mike summed it up, “It’s finally simple to deploy a secure industrial network that makes all field data available to enterprise applications—in less time and using less bandwidth.”

Originally posted here.

Read more…

The head is surely the most complex group of organs in the human body, but also the most delicate. The assessment and prevention of risks in the workplace remains the first priority approach to avoid accidents or reduce the number of serious injuries to the head. This is why wearing a hard hat in an industrial working environment is often required by law and helps to avoid serious accidents.

This article gives an overview of how to detect whether hard-hat rules are being respected by all workers, using a machine learning object detection model.

For this project, we have been using:

  • Edge Impulse Studio to acquire custom data, visualize it, train the machine learning model, and validate inference results.
  • Part of this public dataset from Roboflow, with the images containing the smallest bounding boxes removed.
  • Part of the Flickr-Faces-HQ (FFHQ) dataset (under Creative Commons BY 2.0 license) to rebalance the classes in our dataset.
  • Google Colab to convert the YOLO v5 PyTorch format from the public dataset to the Edge Impulse ingestion format.
  • A Raspberry Pi, an NVIDIA Jetson Nano, or any Intel-based MacBook to deploy the inference model.

Before we get started, here are some insights into the benefits and drawbacks of using a public dataset versus collecting your own.

Using a public dataset is a convenient way to start developing your application quickly, validate your idea, and check first results. But the results are often disappointing when you test on your own data in real conditions. For very specific applications, you might spend more time tweaking an open dataset than collecting your own. Also, always make sure the license suits your needs when using a dataset found online.

On the other hand, collecting your own dataset can take a lot of time; it is repetitive and often tedious. But it lets you collect data as close as possible to your real-life application, with the same lighting conditions, camera, and angles. Your accuracy in real conditions will therefore be much higher.

Using only custom data can work well in your environment, but it might not give the same accuracy in another environment; generalization is therefore harder.

The dataset which has been used for this project is a mix of open data, supplemented by custom data.

First iteration, using only the public datasets

At first, we tried to train our model using only a small portion of this public dataset: 176 items in the training set and 57 in the test set, keeping only images containing a bounding box bigger than 130 pixels; we will see why later.
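As a sketch of that filtering step (the exact criterion in our conversion script may differ), YOLO v5 labels store boxes normalized to the image size, so a 130-pixel rule can be applied like this:

```python
MIN_BOX_PX = 130  # discard images whose boxes are all smaller than this

def keep_image(label_lines, img_w, img_h, min_px=MIN_BOX_PX):
    """YOLO labels: 'class cx cy w h' with coordinates normalized to [0, 1].
    Keep the image if at least one box is at least min_px on both sides."""
    for line in label_lines:
        _, _, _, w, h = map(float, line.split())
        if w * img_w >= min_px and h * img_h >= min_px:
            return True
    return False

# On a 640x640 image, a 0.25 x 0.3 box is 160 x 192 px: large enough to keep.
print(keep_image(["0 0.5 0.5 0.25 0.3"], 640, 640))  # True
print(keep_image(["0 0.5 0.5 0.1 0.1"], 640, 640))   # False (64 x 64 px)
```

Running this over the Roboflow export before uploading to the ingestion service drops the tiny boxes that the model struggles with.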


If you go through the public dataset, you can see that it is strongly lacking “head” samples. The dataset is therefore imbalanced.

Several techniques exist to rebalance a dataset; here, we add new images from Flickr-Faces-HQ (FFHQ). These images do not have bounding boxes, but drawing them is easy in the Edge Impulse Studio. You can import the images directly using the uploader portal. Once your data has been uploaded, just draw boxes around the heads and assign a label as below:

(Screenshot: drawing bounding-box labels in Edge Impulse Studio)

Now that the dataset is more balanced, with both images and bounding boxes of hard hats and heads, we can create an impulse, which is a mix of digital signal processing (DSP) blocks and training blocks:

(Screenshot: the impulse, with its DSP and training blocks)

In this particular object detection use case, the DSP block will resize an image to fit the 320x320 pixels needed for the training block and extract meaningful features for the Neural Network. Although the extracted features don’t show a clear separation between the classes, we can start distinguishing some clusters:

(Screenshot: the feature explorer)

To train the model, we selected the Object Detection training block, which fine-tunes a pre-trained object detection model on your data. It performs well even with relatively small image datasets. This object detection learning block relies on MobileNetV2 SSD FPN-Lite 320x320.

According to Daniel Situnayake, co-author of the TinyML book and founding TinyML engineer at Edge Impulse, this model “works much better for larger objects—if the object takes up more space in the frame it’s more likely to be correctly classified.” This is one of the reasons why we removed the images containing the smallest bounding boxes in our import script.

After training the model, we obtained 61.6% accuracy on the training set and 57% on the testing set. You might also note a large accuracy difference between the quantized and float32 versions. Since the default Linux deployment uses the unoptimized model, we focus on the float32 version in this article.


This accuracy is not satisfactory, and the model tends to have trouble detecting the right objects in real-world conditions.

Second iteration, adding custom data

On the second iteration of this project, we went through the process of collecting some of our own data. A very handy way to collect custom data is to use a mobile phone. You can also perform this step with the same camera you will be using in your factory or on your construction site; that will be even closer to real conditions and therefore work best for your use case. In our case, we used a white hard hat when collecting data. If your company uses yellow ones, for example, consider collecting your data with the same hard hats.

Once the data has been acquired, go through the labeling process again and retrain your model. 

Looking at the training performance, the new model is only slightly more accurate; in real conditions, however, it works far better than the previous one.

Finally, to deploy your model on your Raspberry Pi, NVIDIA Jetson Nano, or Intel-based MacBook, just follow the instructions provided in the links. The `edge-impulse-linux-runner` command-line interface will create a lightweight web interface where you can see the results.

Note that inference runs locally, so you do not need any internet connection to detect your objects. Last but not least, the trained models and the inference SDK are open source. You can use them, modify them, and integrate them into a broader application tailored to your needs, such as stopping a machine when a head is detected for more than 10 seconds.
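
As a sketch of that last idea, a small wrapper around the per-frame inference results might look like this (the class name, labels, and the 10-second limit are illustrative, not part of the Edge Impulse SDK):

```python
class HeadWatchdog:
    """Trigger a machine stop when 'head' detections persist beyond a time limit."""

    def __init__(self, limit_s=10.0):
        self.limit_s = limit_s
        self.head_since = None  # timestamp of the first consecutive 'head' detection

    def update(self, labels, now):
        """Feed the labels from one inference frame; return True to stop the machine."""
        if "head" in labels:
            if self.head_since is None:
                self.head_since = now
            return now - self.head_since >= self.limit_s
        self.head_since = None  # hard hat back on (or nobody in frame): reset timer
        return False

wd = HeadWatchdog(limit_s=10.0)
print(wd.update(["head"], now=0.0))     # False: timer just started
print(wd.update(["head"], now=5.0))     # False: only 5 s so far
print(wd.update(["hardhat"], now=6.0))  # False: timer reset
print(wd.update(["head"], now=7.0))     # False: timer restarted
print(wd.update(["head"], now=17.0))    # True: 10 s of continuous 'head'
```
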

This project has been publicly released; feel free to have a look at it in Edge Impulse Studio, clone it, and go through every step to get a better understanding: https://studio.edgeimpulse.com/public/34898/latest

The essence of this use case is that Edge Impulse allows you to develop industry-grade health and safety solutions with very little effort. These can be embedded in larger industrial control and automation systems with a consistent, stringent focus on machine operations linked to H&S compliance measures. Pre-trained models, which can later be easily retrained in the final industrial context as a “calibration” step, make this a customizable solution for your next project.

Originally posted on the Edge Impulse blog by Louis Moreau - User Success Engineer at Edge Impulse & Mihajlo Raljic - Sales EMEA at Edge Impulse

Read more…

By GE Digital

“The End of Cloud Computing.” “The Edge Will Eat The cloud.” “Edge Computing—The End of Cloud Computing as We Know It.”  

Such headlines grab attention, but don’t necessarily reflect reality—especially in Industrial Internet of Things (IoT) deployments. To be sure, edge computing is rapidly emerging as a powerful force in turning industrial machines into intelligent machines, but to paraphrase Mark Twain: “The reports of the death of cloud are greatly exaggerated.” 

The Tipping Point: Edge Computing Hits Mainstream

We’ve all heard the stats—billions and billions of IoT devices, generating inconceivable amounts of big data volumes, with trillions and trillions of U.S. dollars to be invested in IoT over the next several years. Why? Because industrials have squeezed every ounce of productivity and efficiency out of operations over the past couple of decades, and are now looking to digital strategies to improve production, performance, and profit. 

The Industrial Internet of Things (IIoT) represents a world where human intelligence and machine intelligence—what GE Digital calls minds and machines—connect to deliver new value for industrial companies. 

In this new landscape, organizations use data, advanced analytics, and machine learning to drive digital industrial transformation. This can lead to reduced maintenance costs, improved asset utilization, and new business model innovations that further monetize industrial machines and the data they create. 

Despite the “cloud is dead” headlines, GE believes the cloud is still very important in delivering on the promise of IIoT, powering compute-intense workloads to manage massive amounts of data generated by machines. However, there’s no question that edge computing is quickly becoming a critical factor in the total IIoT equation.

What is edge computing? 

The “edge” of a network generally refers to technology located adjacent to the machine which you are analyzing or actuating, such as a gas turbine, a jet engine, or magnetic resonance (MR) scanner. 

Until recently, edge computing has been limited to collecting, aggregating, and forwarding data to the cloud. But what if instead of collecting data for transmission to the cloud, industrial companies could turn massive amounts of data into actionable intelligence, available right at the edge? Now they can. 

This is not just valuable to industrial organizations, but absolutely essential.

Edge computing vs. Cloud computing 

Cloud and edge are not at war … it’s not an either/or scenario. Think of your two hands. You go about your day using one or the other or both depending on the task. The same is true in Industrial Internet workloads. If the left hand is edge computing and the right hand is cloud computing, there will be times when the left hand is dominant for a given task, instances where the right hand is dominant, and some cases where both hands are needed together. 

Scenarios in which edge computing will take a leading position include things such as low latency, bandwidth, real-time/near real-time actuation, intermittent or no connectivity, etc. Scenarios where cloud will play a more prominent role include compute-heavy tasks, machine learning, digital twins, cross-plant control, etc. 

The point is you need both options working in tandem to provide design choices across edge to cloud that best meet business and operational goals.

Edge Computing and Cloud Computing: Balance in Action 

Let’s look at a couple of illustrations. In an industrial context, examples of intelligent edge machines abound—pumps, motors, sensors, blowout preventers and more benefit from the growing capabilities of edge computing for real-time analytics and actuation. 

Take locomotives. These modern 200-ton digital machines carry more than 200 sensors and can process one billion instructions per second. Today, applications can not only collect data locally and respond to changes in that data, but they can also perform meaningful localized analytics. GE Transportation’s Evolution Series Tier 4 Locomotive uses on-board edge computing to analyze data and apply algorithms for running smarter and more efficiently. This improves operational costs, safety, and uptime.

Sending all that data created by the locomotive to the cloud for processing, analyzing, and actuation isn’t useful, practical, or cost-effective. 

Now let’s switch gears (pun intended) and talk about another mode of transportation—trucking. Here’s an example where edge plays an important yet minor role, while cloud assumes a more dominant position. In this example, the company has 1,000 trucks under management. There are sensors on each truck tracking performance of the vehicle such as engine, transmission, electrical, battery, and more. 

But in this case, instead of real-time analytics and actuation on the machine (like our locomotive example), the data is ingested, then stored and forwarded to the cloud, where time series data and analytics are used to track the performance of vehicle components. The fleet operator then leverages a fleet management solution for scheduled maintenance and cost analysis. This gives them insights such as the cost over time per part type or the median cost over time. The company can use this data to improve the uptime of its vehicles, lower repair costs, and improve the safe operation of the vehicle.
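
The kind of cost analysis described here can be sketched in a few lines of Python; the record fields and numbers are invented for illustration:

```python
from statistics import median
from collections import defaultdict

# Maintenance records forwarded from the trucks to the cloud (illustrative data)
records = [
    {"part": "battery", "cost": 120.0},
    {"part": "battery", "cost": 150.0},
    {"part": "battery", "cost": 900.0},   # outlier: the median is robust to it
    {"part": "transmission", "cost": 2200.0},
    {"part": "transmission", "cost": 1800.0},
]

# Group costs by part type, then take the median per part
costs_by_part = defaultdict(list)
for rec in records:
    costs_by_part[rec["part"]].append(rec["cost"])

median_cost = {part: median(costs) for part, costs in costs_by_part.items()}
print(median_cost)  # {'battery': 150.0, 'transmission': 2000.0}
```
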

What’s next in edge computing 

While edge computing isn’t a new concept, innovation is now beginning to deliver on the promise—unlocking untapped value from the data being created by machines. 

GE has been at the forefront of bridging minds and machines. Predix Platform supports a consistent execution environment across cloud and edge devices, helping industrials achieve new levels of performance, production, and profit.

Originally posted here.

Read more…

Computer vision is fundamental to capturing real-world data within the IoT. Arm technology provides a secure ecosystem for smart cameras in business, industrial and home applications

By Mohamed Awad, VP IoT & Embedded, Arm

Computer vision leverages artificial intelligence (AI) to enable devices such as smart cameras to interpret and understand what is happening in an image. Recreating a sensor as powerful as the human eye with technology opens up a wide and varied range of use cases for computers to perform tasks that previously required human sight – so it’s no wonder that computer vision is quickly becoming one of the most important ways to capture and act on real-world data within the Internet of Things (IoT).

Smart cameras now use computer vision in a range of business and industrial applications, from counting cars in parking lots to monitoring footfall in retail stores or spotting defects on a production line. And in the home, smart cameras can tell us when a package has been delivered, whether the dog escaped from the back yard or when our baby is awake.

Across the business and consumer worlds, the adoption of smart camera technology is growing exponentially. In its 2020 report “Cameras and Computing for Surveillance and Security”, market research and strategy consulting company Yole Développement estimates that for surveillance alone, there are approximately one billion cameras across the world. That number of installations is expected to double by 2024.

This technology features key advancements in security, heterogeneous computing, image processing and cloud services – enabling future computer vision products that are more capable than ever.

Smart camera security is top priority for computer vision

IoT security is a key priority and challenge for the technology industry. It’s important that all IoT devices are secure from exploitation by malicious actors, but it’s even more critical when that device captures and stores image data about people, places and high-value assets.

Unauthorized access to smart cameras tasked with watching over factories, hospitals, schools or homes would not only be a significant breach of privacy, it could also lead to untold harm—from plotting crimes to the leaking of confidential information. Compromising a smart camera could also provide a gateway, giving a malicious actor access to other devices within the network – from door, heating and lighting controls to control over an entire smart factory floor.

We need to be able to trust smart cameras to maintain security for us all, not open up new avenues for exploitation. Arm has embraced the importance of security in IoT devices for many years through its product portfolio offerings such as Arm TrustZone for both Cortex-A and Cortex-M.

In the future, smart camera chips based on the Armv9 architecture will add further security enhancements for computer vision products through the Arm Confidential Compute Architecture (CCA).

Further to this, Arm promotes common standards of security best practice such as PSA Certified and PARSEC. These are designed to ensure that all future smart camera deployments have built-in security, from the point the image sensor first records the scene to storage, whether that data is stored locally or in the cloud by using advanced security and data encryption techniques.

Endpoint AI powers computer vision in smart camera devices

The combination of image sensor technology and endpoint AI is enabling smart cameras to infer increasingly complex insights from the vast amounts of computer vision data they capture. New machine learning capabilities within smart camera devices meet a diverse range of use cases – such as detecting individual people or animals, recognizing specific objects and reading license plates. All of these applications for computer vision require ML algorithms running on the endpoint device itself, rather than sending data to the cloud for inference. It’s all about moving compute closer to data.

For example, a smart camera employed at a busy intersection could use computer vision to determine the number and type of vehicles waiting at a red signal at various hours throughout the day. By processing its own data and inferring meaning using ML, the smart camera could automatically adjust its timings in order to reduce congestion and limit build-up of emissions automatically without human involvement.
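
A toy version of that feedback loop might allocate green time in proportion to the vehicle counts the camera infers. This is a deliberately simplified sketch, not a real traffic-control algorithm:

```python
def green_times(counts, cycle_s=60, min_green_s=10):
    """Split a fixed signal cycle across approaches in proportion to observed
    vehicle counts, guaranteeing each approach a minimum green time."""
    approaches = list(counts)
    spare = cycle_s - min_green_s * len(approaches)  # time left to distribute
    total = sum(counts.values()) or 1                # avoid division by zero
    return {a: min_green_s + spare * counts[a] / total for a in approaches}

# Counts inferred by the smart camera for each approach at this moment
print(green_times({"north-south": 30, "east-west": 10}))
# {'north-south': 40.0, 'east-west': 20.0}
```
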

Arm’s investment in AI for applications in endpoints and beyond is demonstrated through its range of Ethos machine learning processors: highly scalable and efficient NPUs capable of supporting a range of 0.1 to 10 TOP/s through many-core technologies. Software also plays a vital role in ML and this is why Arm continues to support the open-source community through the Arm NN SDK and TensorFlow Lite for Microcontrollers (TFLM) open-source frameworks.

These machine learning workload frameworks are based on existing neural networks and power-efficient Arm Cortex-A CPUs, Mali GPUs and Ethos NPUs as well as Arm Compute library and CMSIS-NN – a collection of low-level machine learning functions optimized for Cortex-A CPU, Cortex-M CPU and Mali GPU architectures.

The Armv9 architecture supports enhanced AI capabilities, too, by providing accessible vector arithmetic (individual arrays of data that can be computed in parallel) via Scalable Vector Extension 2 (SVE2). This enables scaling of the hardware vector length without having to rewrite or recompile code. In the future, extensions for matrix multiplication (a key element in enhancing ML) will push the AI envelope further.

Smart cameras connected in the cloud

Cloud and edge computing is also helping to expedite the adoption of smart cameras. Traditional CCTV architectures saw camera data stored on-premises via a Network Video Recorder (NVR) or a Digital Video Recorder (DVR). This model had numerous limitations, from the vast amount of storage required to the limited number of physical connections on each NVR.

Moving to a cloud-native model simplifies the rollout of smart cameras enormously: any number of cameras can be provisioned and managed via a configuration file downloaded to the device. There’s also a virtuous cycle at play: data from smart cameras can now be used to train the models in the cloud for specific use cases so that cameras become even smarter. And the smarter they become, the less data they need to send upstream.
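
Such a downloaded configuration file might look something like the following. Every field here is hypothetical; real provisioning schemas vary by vendor:

```json
{
  "camera_id": "cam-0042",
  "stream": { "resolution": "1920x1080", "fps": 15, "codec": "h265" },
  "inference": { "model": "vehicle-count-v3", "confidence_threshold": 0.6 },
  "upload": { "endpoint": "https://example.com/ingest", "send_raw_video": false }
}
```
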

The use of cloud computing also enables automation of processes via AI sensor fusion by combining computer vision data from multiple smart cameras. Taking our earlier example of the smart camera placed at a road intersection, cloud AI algorithms could combine data from multiple cameras to constantly adjust traffic light timings holistically across an entire city, keeping traffic moving.

Arm enables the required processing continuum from cloud to endpoint. Cortex-M microcontrollers and Cortex-A processors power smart cameras, with Cortex-A processors also powering edge gateways. Cloud and edge servers harness the capabilities of the Neoverse platform.

New hardware and software demands on smart cameras

The compute needs for computer vision devices continue to grow year over year, with ultra-high resolution video capture (8K 60fps) and 64-bit (Armv8-A) processing marking the current standard for high-end smart camera products.

As a result, the system-on-chip (SoC) within next-generation smart cameras will need to embrace heterogenous architectures, combining CPUs, GPUs, NPUs alongside dedicated hardware for functions like computer vision, image processing, video encoding and decoding.

Storage, too, is a key concern: While endpoint AI can reduce storage requirements by processing images locally on the camera, many use cases will require that data be retained somewhere for safety and security – whether on the device, in edge servers or in the cloud.

To ensure proper storage of high-resolution computer vision data, newer video encoding and decoding standards such as H.265 and AV1 are becoming the de facto choice.

New use cases driving continuous innovation

Overall, the demands from the new use cases are driving the need for continuous improvement in computing and imaging technologies across the board.

When we think about image-capturing devices such as CCTV cameras today, we should no longer imagine grainy images of barely recognizable faces passing by a camera. Advancements in computer vision – more efficient and powerful compute coupled with the intelligence of AI and machine learning – are making smart cameras not just image sensors but image interpreters. This bridge between the analog and digital worlds is opening up new classes of applications and use cases that were unimaginable a few years ago.

Originally posted here.

Read more…

TinyML focuses on optimizing machine learning (ML) workloads so that they can be processed on microcontrollers no bigger than a grain of rice and consuming only milliwatts of power.

By Arm Blueprint staff
 

TinyML gives tiny devices intelligence. We mean tiny in every sense of the word: as tiny as a grain of rice and consuming tiny amounts of power. Supported by Arm, Google, Qualcomm and others, tinyML has the potential to transform the Internet of Things (IoT), where billions of tiny devices, based on Arm chips, are already being used to provide greater insight and efficiency in sectors including consumer, medical, automotive and industrial.

Why target microcontrollers with tinyML?

Microcontrollers such as the Arm Cortex-M family are an ideal platform for ML because they’re already used everywhere. They perform real-time calculations quickly and efficiently, so they’re reliable and responsive, and because they use very little power, can be deployed in places where replacing the battery is difficult or inconvenient. Perhaps even more importantly, they’re cheap enough to be used just about anywhere. The market analyst IDC reports that 28.1 billion microcontrollers were sold in 2018, and forecasts that annual shipment volume will grow to 38.2 billion by 2023.

TinyML on microcontrollers gives us new techniques for analyzing and making sense of the massive amount of data generated by the IoT. In particular, deep learning methods can be used to process information and make sense of the data from sensors that do things like detect sounds, capture images, and track motion.

Advanced pattern recognition in a very compact format

Looking at the math involved in machine learning, data scientists found they could reduce complexity by making certain changes, such as replacing floating-point calculations with simple 8-bit operations. These changes created machine learning models that work much more efficiently and require far fewer processing and memory resources.
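
A minimal sketch of this idea: mapping float weights onto signed 8-bit integers with an affine scale and zero point, roughly as frameworks such as TensorFlow Lite do per tensor. The code below is a simplified illustration, not a framework’s actual implementation:

```python
def quantize(values, num_bits=8):
    """Map floats onto signed 8-bit integers using an affine scale/zero-point."""
    lo, hi = min(values), max(values)
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1  # -128..127
    scale = (hi - lo) / (qmax - qmin)
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the quantized integers."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, -0.25, 0.0, 0.5, 1.0]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
print(max(abs(w - r) for w, r in zip(weights, restored)))  # small rounding error
```

The model then stores and computes with the small integers `q`, trading a little precision (bounded by `scale`) for a large reduction in memory and compute.
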

TinyML technology is evolving rapidly thanks to new technology and an engaged base of committed developers. Only a few years ago, we were celebrating our ability to run a speech-recognition model capable of waking the system if it detects certain words on a constrained Arm Cortex-M3 microcontroller using just 15 kilobytes (KB) of code and 22KB of data.

Since then, Arm has launched new machine learning (ML) processors, called the Ethos-U55 and Ethos-U65, a microNPU specifically designed to accelerate ML inference in embedded and IoT devices.

The Ethos-U55, combined with the AI-capable Cortex-M55 processor, will provide a significant uplift in ML performance and improvement in energy efficiency over the already impressive examples we are seeing today.

TinyML takes endpoint devices to the next level

The potential use cases of tinyML are almost unlimited. Developers are already working with tinyML to explore all sorts of new ideas: responsive traffic lights that change signaling to reduce congestion, industrial machines that can predict when they’ll need service, sensors that can monitor crops for the presence of damaging insects, in-store shelves that can request restocking when inventory gets low, healthcare monitors that track vitals while maintaining privacy. The list goes on.

TinyML can make endpoint devices more consistent and reliable, since there’s less need to rely on busy, crowded internet connections to send data back and forth to the cloud. Reducing or even eliminating interactions with the cloud has major benefits including reduced energy use, significantly reduced latency in processing data and security benefits, since data that doesn’t travel is far less exposed to attack. 

It’s worth noting that these tinyML models, which perform inference on the microcontroller, aren’t intended to replace the more sophisticated inference that currently happens in the cloud. What they do instead is bring specific capabilities down from the cloud to the endpoint device. That way, developers can save cloud interactions for if and when they’re needed.
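
One common pattern for saving those cloud interactions is to accept confident on-device predictions and escalate only the uncertain ones. A sketch, with an arbitrary 0.8 threshold and a stubbed cloud call:

```python
def classify(scores, threshold=0.8, cloud_fallback=None):
    """Accept the on-device prediction when it is confident enough;
    otherwise escalate to a (more capable, more costly) cloud model."""
    label = max(scores, key=scores.get)
    if scores[label] >= threshold:
        return label, "device"
    if cloud_fallback is not None:
        return cloud_fallback(scores), "cloud"
    return label, "device (low confidence)"

# Confident on-device result: no cloud round-trip needed
print(classify({"wake_word": 0.95, "noise": 0.05}))        # ('wake_word', 'device')

# Ambiguous result: escalate (here the "cloud" is just a stub)
print(classify({"wake_word": 0.55, "noise": 0.45},
               cloud_fallback=lambda s: "wake_word"))      # ('wake_word', 'cloud')
```
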

TinyML also gives developers a powerful new set of tools for solving problems. ML makes it possible to detect complex events that rule-based systems struggle to identify, so endpoint AI devices can start contributing in new ways. Also, since ML makes it possible to control devices with words or gestures, instead of buttons or a smartphone, endpoint devices can be built more rugged and deployable in more challenging operating environments. 

TinyML gaining momentum with an expanding ecosystem

Industry players have been quick to recognize the value of tinyML and have moved rapidly to create a supportive ecosystem. Developers at every level, from enthusiastic hobbyists to experienced professionals, can now access tools that make it easy to get started. All that’s needed is a laptop, an open-source software library and a USB cable to connect the laptop to one of several inexpensive development boards priced as low as a few dollars.

In fact, at the start of 2021, Raspberry Pi released its very first microcontroller board, one of the most affordable development boards on the market at just $4. Named Raspberry Pi Pico, it’s powered by the RP2040 SoC, a surprisingly powerful dual-core Arm Cortex-M0+ processor. The RP2040 MCU is able to run TensorFlow Lite Micro, and we’re expecting to see a wide range of ML use cases for this board over the coming months.

Arm is a strong proponent of tinyML because our microcontroller architectures are so central to the IoT, and because we see the potential of on-device inference. Arm’s collaboration with Google is making it even easier for developers to deploy endpoint machine learning in power-conscious environments.

The combination of Arm CMSIS-NN libraries with Google’s TensorFlow Lite Micro (TFLu) framework, allows data scientists and software developers to take advantage of Arm’s hardware optimizations without needing to become experts in embedded programming.

On top of this, Arm is investing in new tools derived from Keil MDK to help developers get from prototype to production when deploying ML applications.

TinyML would not be possible without a number of early influencers: Pete Warden, a “founding father” of tinyML and technical lead of TensorFlow Lite Micro at Google; Arm Innovator Kwabena Agyeman, who developed OpenMV, a project dedicated to low-cost, extensible, Python-powered machine-vision modules that support machine learning algorithms; and Arm Innovator Daniel Situnayake, a founding tinyML engineer and developer at Edge Impulse, a company that offers a full tinyML pipeline covering data collection, model training, and model optimization. Arm partners such as Cartesiam.ai, a company that offers NanoEdge AI, a tool that creates software models on the endpoint based on sensor behavior observed in real conditions, have also been pushing the possibilities of tinyML to another level.

Arm is also a partner of the TinyML Foundation, an open community that coordinates meet-ups to help people connect, share ideas, and get involved. There are many localised tinyML meet-ups, covering the UK, Israel, and Seattle, to name a few, as well as a global series of tinyML Summits. For more information, visit the tinyML Foundation website.

Originally posted here.

Read more…

Once again, I’m jumping up and down in excitement because I’m going to be hosting a panel discussion as part of a webinar series — Fast and Fearless: The Future of IoT Software Development — being held under the august auspices of IotCentral.io

At this event, the second of a four-part series, we will be focusing on “AI and IoT Innovation” (see also What the FAQ are AI, ANNs, ML, DL, and DNNs? and What the FAQ are the IoT, IIoT, IoHT, and AIoT?).

As we all know, the IoT is transforming the software landscape. What used to be a relatively straightforward embedded software stack has been revolutionized by the IoT, with developers now having to juggle specialized workloads, security, artificial intelligence (AI) and machine learning (ML), real-time connectivity, managing devices that have been deployed into the field… the list goes on.

In this webinar — which will be held on Tuesday 29 June 2021 from 10:00 a.m. to 11:00 a.m. CDT — I will be joined by four industry luminaries to discuss how to juggle the additional complexities that machine learning adds to IoT development, why on-device machine learning is more important now than ever, and what the combination of AI and IoT looks like for developers in the future.

The luminaries in question (and whom I will be questioning) are Karl Fezer (AI Ecosystem Evangelist at Arm), Wei Xiao (Principal Engineer, Sr. Strategic Alliances Manager at Nvidia), Nikhil Bhaskaran (Founder of Shunya OS), and Tina Shyuan (Director of Product Marketing at Qeexo).

So, what say you? Dare I hope that we will have the pleasure of your company and that you will be able to join us to (a) tease your auditory input systems with our discussions and (b) join our question-and-answer free-for-all frenzy at the end? If so, may I suggest that you Register Now before all of the good virtual seats are taken, metaphorically speaking, of course.

>> Click here to register

Read more…

What is 5G NR (New Radio)?

by Gus Vos

Unless you have been living under a rock, you have been seeing and hearing a lot about 5G these days. In addition, if you are at all involved in Internet of Things (IoT) or other initiatives at your organization that use cellular networking technologies, you have also likely heard about 5G New Radio, otherwise known as 5G NR, the new 5G radio access technology specification.

However, all the jargon, hype, and sometimes contradictory statements made by solution providers, the media, and analysts regarding 5G and 5G NR can make it difficult to understand what 5G NR actually is, how it works, what its advantages are, to what extent it is different than other cellular radio access technologies, and perhaps most importantly, how your organization can use this new radio access technology.

In this blog, we will provide you with an overview on 5G NR, offering you answers to these and other basic 5G NR questions – with a particular focus on what these answers mean for those in the IoT industry. 

We can’t promise to make you a 5G NR expert with this blog – but we can say that if you are confused about 5G NR before reading it, you will come away afterward with a better understanding of what 5G NR is, how it works, and how it might transform your industry.

What is the NR in 5G NR?

As its name implies, 5G New Radio or 5G NR is the new radio access technology specification found in the 5G standard. 

Set by the 3rd Generation Partnership Project (3GPP) telecommunications standards group, the 5G NR specification defines how 5G NR edge devices (smart phones, embedded modules, routers, and gateways) and 5G NR network infrastructure (base stations, small cells, and other Radio Access Network equipment) wirelessly transmit data. To put it another way, 5G NR describes how 5G NR edge devices and 5G NR network infrastructure use radio waves to talk to each other. 

5G NR is a very important part of 5G. After all, it describes how 5G solutions will use radio waves to wirelessly transmit data faster and with less latency than previous radio access technology specifications. However, while 5G NR is a very important part of the new 5G standard, it does not encompass everything related to 5G. 

For example, 5G includes a new core network architecture standard (appropriately named 5G Core Network or 5GCN) that specifies the architecture of the network that collects, processes, and routes data from edge devices and then sends this data to the cloud, other edge devices, or elsewhere. The 5GCN will improve 5G networks’ operational capacity, efficiency, and performance.

However, 5GCN is not a radio access technology like 5G NR, but rather a core network technology. In fact, networks using the 5GCN core network will be able to work with previous types of radio access technologies – like LTE. 

Is 5G NR one of 5G’s most important new technological advancements? Yes. But it is not the only technological advancement to be introduced by 5G.  

How does 5G NR work?

Like all radio access communications technology specifications, the 5G NR specification describes how edge devices and network infrastructure transmit data to each other using electromagnetic radio waves. Depending on their frequency (how many times the wave oscillates per second, which in turn determines the wavelength), these waves occupy different parts of the wireless spectrum.

Some of the waves that 5G NR uses have frequencies between 400 MHz and 6 GHz. These waves occupy what is called sub-6 spectrum, since their frequencies all fall below 6 GHz.

This sub-6 spectrum is also used by other cellular radio access technologies, like LTE. In the past, running different cellular radio access technologies over the same spectrum would lead to unmanageable interference problems, with the different technologies’ radio waves interfering with each other. 

One of 5G NR’s many advantages is that it solves this problem using a technology called Dynamic Spectrum Sharing (DSS). DSS allows 5G NR signals to use the same band of spectrum as LTE and other cellular technologies, like LTE-M and NB-IoT. This allows 5G NR networks to be rolled out without shutting down the LTE or other networks that support existing LTE smartphones or IoT devices. You can learn more about DSS, and how it speeds the rollout of 5G NR while also extending the life of IoT devices, here.

One of 5G NR’s other major advancements is that it does not just use waves in the sub-6 spectrum to transmit data. The 5G NR specification also specifies how edge devices and network infrastructure can use radio waves in bands between 24 GHz and 52 GHz to transmit data.

These millimeter wave (mmWave) bands greatly expand the amount of spectrum available for wireless data communications. Lack of spectrum capacity has been a problem in the past: there is a limited number of sub-6 spectrum bands available for organizations to use for cellular communications, and many of these bands are narrow. This scarcity leads to network congestion, limiting the amount of data that can be transmitted over networks that use sub-6 spectrum. 
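The "millimeter wave" name follows directly from the physics: a wave's length is the speed of light divided by its frequency (λ = c / f), which for the 24–52 GHz range works out to just a few millimeters. A quick sketch:

```python
# Wavelength of a radio wave: lambda = c / f
C = 299_792_458  # speed of light in m/s

def wavelength_mm(freq_hz: float) -> float:
    """Return the wavelength in millimeters for a given frequency in Hz."""
    return C / freq_hz * 1000

# mmWave band edges: wavelengths fall in the millimeter range
print(round(wavelength_mm(24e9), 1))   # 24 GHz -> ~12.5 mm
print(round(wavelength_mm(52e9), 1))   # 52 GHz -> ~5.8 mm
# A sub-6 frequency, by contrast, measures several centimeters
print(round(wavelength_mm(3.5e9), 1))  # 3.5 GHz -> ~85.7 mm
```

Shorter wavelengths are also part of why mmWave signals struggle with range and obstacles, as discussed below.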

mmWave opens up a massive amount of new wireless spectrum, as well as much broader bands of wireless spectrum for cellular data transmission. This additional spectrum and these broader spectrum bands increase the capacity (amount of data) that can be transmitted over these bands, enabling 5G NR mmWave devices to achieve data speeds that are four or more times faster than devices that use just sub-6 spectrum. 

The additional wireless capacity provided by mmWave also reduces latency (the time between when a device sends a signal and when it receives a response). By reducing latency from around 10 milliseconds with sub-6 devices to 3-4 milliseconds or lower with 5G NR mmWave devices, 5G enables new industrial automation, autonomous vehicle, and immersive gaming use cases, as well as Virtual Reality (VR), Augmented Reality (AR), and similar Extended Reality (XR) use cases, all of which require very low latency. 

On the other hand, these new mmWave devices and network infrastructure come with new technical requirements, as well as drawbacks associated with their use of mmWave spectrum. For example, mmWave devices use more power and generate more heat than sub-6 devices. In addition, mmWave signals have less range and do not penetrate walls and other physical objects as easily as sub-6 waves. 5G NR includes technologies, such as beamforming and massive Multiple Input Multiple Output (MIMO), that lessen some of these range and obstacle penetration limitations – but they do not eliminate them. 

To learn more about the implications of 5G NR mmWave on the design of IoT and other products, read our blog, Seven Tips For Designing 5G NR mmWave Products.

In addition, there has been a lot written on these two different “flavors” (sub-6 and mmWave) of 5G NR. If you are interested in learning more about the differences between sub-6 5G NR and mmWave 5G NR, and how together they enable both evolutionary and revolutionary changes for Fixed Wireless Access (FWA), mobile broadband, IoT and other wireless applications, read our previous blog A Closer Look at the Five Waves of 5G.

What is the difference between 5G NR and LTE?

Though sub-6 and mmWave are very different, both types of 5G NR provide data transfer speed, latency, and other performance improvements compared to LTE, the previous radio access technology specification used for cellular communications. 

For example, outside of its use of mmWave, 5G NR features other technical advancements designed to improve network performance, including:

• Flexible numerology, which enables 5G NR network infrastructure to set the spacing between subcarriers in a band of wireless spectrum at 15, 30, 60, 120, or 240 kHz, rather than only the 15 kHz spacing used by LTE. This flexible numerology is what allows 5G NR to use mmWave spectrum in the first place. It also improves the performance of 5G NR devices that use higher sub-6 spectrum, such as 3.5 GHz C-Band spectrum, since the network can adjust the subcarrier spacing to meet the particular spectrum and use case requirements of the data it is transmitting. For example, when low latency is required, the network can use wider subcarrier spacing, which shortens symbols and reduces transmission latency.
• Beamforming, in which massive MIMO (multiple-input and multiple-output) antenna technologies are used to focus wireless signals into directed beams and sweep them across an area until they make a strong connection. Beamforming helps extend the range of networks that use mmWave and higher sub-6 spectrum.  
• Selective Hybrid Automatic Repeat Request (HARQ), which allows 5G NR to break large data blocks into smaller blocks, so that when there is an error, the retransmission is smaller and results in higher data transfer speeds than LTE, which transfers data in larger blocks. 
• Faster Time Division Duplexing (TDD), which enables 5G NR networks to switch between uplink and downlink faster, reducing latency. 
• Pre-emptive scheduling, which lowers latency by allowing higher-priority data to overwrite or pre-empt lower-priority data, even if the lower-priority data is already being transmitted. 
• Shorter scheduling units that trim the minimum scheduling unit to just two symbols, improving latency.
• A new inactive state for devices. LTE devices had two states – idle and connected. 5G NR includes a new state – inactive – that reduces the time needed for an edge device to move in and out of its connected state (the state used for transmission), making the device more responsive. 
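The flexible-numerology point above can be made concrete: in OFDM, the useful symbol duration is the reciprocal of the subcarrier spacing, so each doubling of the spacing halves the symbol time. A minimal sketch (cyclic prefix overhead ignored for simplicity):

```python
# 5G NR flexible numerology: subcarrier spacing (SCS) = 15 kHz * 2**mu,
# and the useful OFDM symbol duration is the reciprocal of the SCS
# (cyclic prefix overhead ignored for simplicity).

def symbol_duration_us(mu: int) -> float:
    """Useful OFDM symbol duration in microseconds for numerology mu."""
    scs_khz = 15 * 2 ** mu
    return 1e3 / scs_khz

for mu in range(5):  # 15, 30, 60, 120, 240 kHz
    scs = 15 * 2 ** mu
    print(f"{scs:3d} kHz SCS -> {symbol_duration_us(mu):5.2f} us per symbol")
```

At 15 kHz spacing a symbol lasts about 66.7 microseconds, while at 120 kHz it shrinks to about 8.3 microseconds – which is also why the two-symbol scheduling units mentioned above translate into such short air-time slots.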

These and the other technical advancements made to 5G NR are complicated, but the result of these advancements is pretty simple – faster data speeds, lower latency, more spectrum agility, and otherwise better performance than LTE. 

Are LPWA radio access technology specifications, like NB-IoT and LTE-M, supported by 5G?

Though 5G features a new radio access technology, 5G NR, 5G supports other radio access technologies as well. These include the Low Power Wide Area (LPWA) technologies Narrowband IoT (NB-IoT) and Long Term Evolution for Machines (LTE-M). In fact, these LPWA standards are the ones 5G uses to address one of its three main use cases – Massive Machine-Type Communications (mMTC). 

Improvements have been and continue to be made to these 5G LPWA standards to address these mMTC use cases – improvements that further lower the cost of LPWA devices, reduce these devices’ power usage, and enable an even larger number of LPWA devices to connect to the network in a given area.

What are the use cases for 5G NR and 5G LPWA Radio Access Technologies?

Today, LTE supports three basic use cases:

• Voice: People today can use LTE to talk to each other using mobile devices. 
• Mobile broadband (MBB): People can use smartphones, tablets, and other mobile edge devices to view videos, play games, and use other applications that require broadband data speeds.
• IoT: People can use cellular modules, routers, and gateways embedded in practically anything – a smart speaker, a dog collar, a commercial washing machine, a safety shoe, an industrial air purifier, a liquid fertilizer storage tank – to transmit data from the thing to the cloud or a private data center and back via the internet.  

5G NR, as well as 5G’s LPWA radio access technologies (NB-IoT and LTE-M) will continue to support these existing IoT and voice use cases. 

However, 5G also expands on the MBB use case with a new Enhanced Mobile Broadband (eMBB) use case. These eMBB use cases leverage 5G NR’s higher peak and average speeds and lower latency to enable smartphones and other devices to support high-definition cloud-based immersive video games, high-quality video calls, and new VR, AR, and other XR applications.

In addition, 5G NR also supports a new use case, called Ultra-Reliable, Low-Latency Communications (URLLC). 5G NR enables devices to create connections that are ultra-reliable with very low latency. With these new 5G NR capabilities, as well as 5G NR’s support for very fast handoffs and high mobility, organizations can now deploy new factory automation, smart city 2.0 and other next generation Industrial IoT (IIoT) applications, as well as Vehicle-to-everything (V2X) applications, such as autonomous vehicles. 

As we mentioned above, 5G will also support the new mMTC use case, which represents an enhancement of the existing IoT use case. However, in the case of mMTC, new use cases will be enabled by improvements to LTE-M and NB-IoT radio access technology standards, not 5G NR. Examples of these types of new mMTC use cases include large-scale deployments of small, low cost edge devices (like sensors) for smart city, smart logistics, smart grid, and similar applications.

But this is not all. 3GPP is looking at additional new use cases (and new technologies for these use cases), as discussed in this recent blog on Release 17 of the 5G standard. One of these new technologies is a new Reduced Capability (RedCap) device – sometimes referred to as NR Light – for IoT or MTC use cases that require faster data speeds than LPWA devices can provide, but also need devices that are less expensive than the 5G NR devices being deployed today.

3GPP is also examining standard changes to NR, LTE-M, and NB-IoT in 5G Release 17 that would make it possible for satellites to use these technologies for Non-Terrestrial Network (NTN) communications. This new NTN feature would help enable the deployment of satellites able to provide NR, LTE-M, and NB-IoT coverage in very remote areas, far away from cellular base stations.

What should you look for in a 5G NR module, router or gateway solution?

While all 5G NR edge devices use the 5G NR technology specification, they are not all created equal. In fact, the flexibility, performance, quality, security, and other capabilities of a 5G NR edge device can make the difference between a successful 5G NR application rollout and a failed one. 

As they evaluate 5G NR edge devices for their application, organizations should ask themselves the following questions:

• Is the edge device multi-mode? 
While Mobile Network Operators (MNOs) are rapidly expanding their 5G NR networks, there are still many areas where 5G NR coverage is not available. Multi-mode edge devices that can also support LTE, or even 3G, help ensure that wherever the edge device is deployed, it will be able to connect to an MNO’s network – even if this connection does not provide the data speed, latency, or other performance needed to maximize the value of the 5G NR application. 

In addition, many MNOs are initially rolling out non-standalone (NSA) 5G NR networks. These NSA 5G NR networks need an LTE connection in addition to a 5G NR connection to transmit data to and from 5G NR devices. If your edge device does not support LTE, it will not be able to use 5G NR on these NSA networks. 

• How secure are the edge devices? 
Data is valuable and sensitive – and the data transmitted by 5G NR devices is no different. To limit the risk that this data is exposed, altered, or destroyed, organizations need to adopt a Defense in Depth approach to 5G NR cybersecurity, with layers of security implemented at the cloud, network, and edge device levels. 

At the edge device level, organizations should ensure their devices have security built-in with features such as HTTPS, secure socket, secure boot, and free unlimited firmware over-the-air (FOTA) updates. 

Organizations will also want to use edge devices from trustworthy companies that are headquartered in countries that have strict laws in place to protect customer data. In doing so you will ensure these companies are committed to working with you to prevent state or other malicious actors from gaining access to your 5G NR data.

• Are the 5G NR devices future-proof? 
Over time, organizations are likely to want to upgrade their applications. In addition, the 5G NR specification is not set in stone; updates are made to it periodically. Organizations will therefore want to ensure their 5G NR edge devices are future-proof – for example, able to receive new firmware over the air – so they can upgrade their applications and take advantage of new 5G NR capabilities in the future. 

• Can the 5G NR device do edge processing? 
While 5G NR increases the amount of data that can be transmitted over cellular wireless networks, in many cases organizations will want to filter, prioritize, or otherwise process some of their 5G NR application’s data at the edge. Edge devices that make this easy allow organizations to lower their data transmission costs, improve application performance, and extend their devices’ battery lives. 
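As a purely illustrative sketch of what such edge processing can look like (the function and threshold here are hypothetical, not drawn from any particular device SDK), a device might transmit a sensor reading only when it differs meaningfully from the last value sent:

```python
# Hypothetical edge-filtering sketch: only forward sensor readings that
# differ from the last transmitted value by more than a threshold,
# cutting airtime (and therefore cost and power) on the cellular link.

def filter_readings(readings, threshold=0.5):
    """Yield only readings that moved more than `threshold` since the
    last value sent upstream."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > threshold:
            last_sent = value
            yield value

# A slowly drifting temperature with two real changes
samples = [20.0, 20.1, 20.2, 21.0, 21.1, 25.0, 25.2]
print(list(filter_readings(samples)))  # [20.0, 21.0, 25.0]
```

Even this trivial dead-band filter can cut the transmitted volume dramatically for slowly changing signals, while still reporting every meaningful change promptly.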

Originally posted here.
