
The advent of the Internet of Things (IoT) in the Metaverse is expected to change the market's overall outlook. The IoT includes a plethora of features which, in turn, will greatly benefit the Metaverse market in the upcoming years. Growing at a CAGR of 38.25 per cent, the metaverse market was estimated to be worth USD 124.04 billion in 2022 and is projected to reach USD 1,655.29 billion by 2030.
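
As a quick sanity check on those figures, the 2030 projection follows from compounding the 2022 base at the stated CAGR. A minimal sketch in Python (the figures are the article's; the check is ours):

    # Compound the 2022 market-size estimate at the stated CAGR for 8 years
    base_2022 = 124.04          # USD billion (2022 estimate)
    cagr = 0.3825               # 38.25 per cent per year
    projected_2030 = base_2022 * (1 + cagr) ** 8
    print(f"USD {projected_2030:.2f} billion")   # ~1655, matching the projection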

The IoT, a term first coined in 1999, links hundreds of kinds of devices, including thermostats, voice-activated speakers, and medical equipment, to a variety of data. IoT is now poised to revolutionize the Metaverse as it effortlessly connects the 3D virtual environment to a wide range of physical objects. One of the largest and most renowned private software firms in the UK, IRIS Software Group, offers software solutions and services that significantly improve operational compliance, efficiency, and accuracy.

The identity environment will expand enormously as the Metaverse gains traction and new applications and access points emerge alongside it, creating additional entry points for potential bad actors. Already, 84% of corporate executives concur that their company now manages significantly more digital identities than it did ten years ago (up to 10x). Additionally, 95% of firms say they have trouble keeping track of all the identities, human and machine, that are currently part of their organization. Add in the Metaverse and the rise in IoT usage that will accompany it, and we have a perfect storm of rising complexity and expanding threat vectors that may be exploited, leading to breaches, business disruption, and material expenses.

Top features of IoT:

a.) 360-degree enhanced and real-world training:

Using the IoT, we can develop and test training methods for situations we cannot safely reproduce in the real world. Extreme scenarios, such as severe weather or cyber events, can be rehearsed at scale and with authenticity through virtual simulations using digital twins in the Metaverse. As virtual metaverse environments evolve to more closely resemble reality, such training simulations will help people and AI/software cooperate to better recognize issues and lessen their impact in real life.

b.) Smarter long-term planning along with near-term response:

The metaverse system will come to resemble our real world ever more closely as it fills up with digital duplicates of real-world objects (such as cars, buildings, factories, and people). Thanks to this complex system-of-systems virtual simulation, we will be able to run different long-term planning scenarios, identify the most optimal designs for our energy, transportation, and healthcare systems, and dynamically operate them as the real world evolves (e.g., more renewable sources, new diseases, population migrations or demographic changes). These simulations will also assist teams of humans in responding to current events and solving issues using monthly, weekly, or day-ahead planning, in addition to long-term planning. AI will then be used to learn from the outcome and enhance the response during the next event.

Conclusion

Brands are utilizing a variety of cutting-edge technologies to fuel the Metaverse with the aim of making the virtual as real-time and authentic as possible. These technologies include AR, VR, Blockchain, AI, and IoT. Sensors, cameras, and wearables are already implemented and in use thanks to present IoT development. When connected to the Metaverse, these gadgets are the engines that make it possible for it to reflect the real world in real time. A metaverse representation of a physical site, such as Samsung's 837x recreation of its 837 Washington St. experience centre in New York City's Meatpacking District, might, for instance, be updated continuously and in real time as objects enter and exit the physical location.

Read more…

Against the backdrop of digital technology and the industrial revolution, the Internet of Things has become the most influential and disruptive of all the latest technologies. As an advanced technology, IoT is showing a palpable difference in how businesses operate. 

Although the Fourth Industrial Revolution is still in its infancy, early adopters of this advanced technology are edging out the competition with their competitive advantage. 

Businesses eager to become a part of this disruptive technology are jostling against each other to implement IoT solutions. Yet, they are unaware of the steps in effective implementation and the challenges they might face during the process. 

This is a complete guide – the only one you'll need – that focuses on delivering effective and uncomplicated IoT implementation.

 

Key Elements of IoT

There are three main elements of IoT technology:

  • Connectivity:

IoT devices are connected to the internet and have a URI (Uniform Resource Identifier) through which they can relay data to the connected network. The devices can be connected to one another, to a centralized server, to the cloud, or to a network of servers.

  • Data Communication:

IoT devices continuously share data with other devices in the network or the server. 

  • Interaction:

IoT devices do not simply gather data; they transmit it to their endpoints or server. There is no point in collecting data if it is not put to good use. The collected data is used to deliver smart IoT solutions in automation, make real-time business decisions, formulate strategies, or monitor processes. A minimal device-side sketch follows.
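
To make the connectivity and data-communication elements concrete, here is a minimal device-side sketch using MQTT, a protocol commonly used for IoT messaging. The broker address and topic are illustrative assumptions, and the paho-mqtt 1.x client API is assumed:

    import json, time
    import paho.mqtt.client as mqtt    # pip install paho-mqtt (1.x API assumed)

    client = mqtt.Client(client_id="thermostat-42")
    client.connect("broker.example.com", 1883)    # hypothetical broker address
    client.loop_start()                           # network loop in the background

    for _ in range(3):                            # a few sample readings
        reading = {"device": "thermostat-42", "temp_c": 21.5, "ts": time.time()}
        # Publish the reading to a topic the platform or server subscribes to
        client.publish("plant/sensors/thermostat-42", json.dumps(reading))
        time.sleep(1)

    client.loop_stop()
    client.disconnect()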

How Does IoT work?

IoT devices have URIs and come with embedded sensors. With these sensors, the devices sense their environment and gather information; the devices could be air conditioners, smartwatches, cars, etc. All the devices then feed their collected data into an IoT platform or gateway.

The IoT platform then performs analytics on the data from the various sources and derives useful information as required.

What are the Layers in IoT Architecture?

Although there isn’t a standard IoT structure that’s universally accepted, the 4-layer architecture is considered to be the basic form. The four layers include perception, network, middleware, and application.

  • Perception:

Perception is the first or the physical layer of IoT architecture. All the sensors, edge devices, and actuators gather useful information based on the project needs in this layer. The purpose of this layer is to gather data and transfer it to the next layer. 

  • Network:

It is the connecting layer between perception and application. This layer gathers information from the perception layer and transmits the data to other devices or servers.

  • Middleware:

The middleware layer offers storage and processing capabilities. It stores the incoming data and applies appropriate analytics based on requirements. 

  • Application:

The user interacts with the application layer, which is responsible for delivering specific services to the end user; the sketch below pictures the overall flow.
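
The four layers can be pictured as a simple pipeline. The sketch below models each layer as a plain Python function; the names and the moving-average analytic are illustrative assumptions, not part of any standard:

    from statistics import mean

    def perception():                  # perception: sense the environment
        return [{"sensor": "temp-1", "value": 21.8}]

    def network(readings):             # network: carry data towards the platform
        return list(readings)          # e.g. over Wi-Fi, cellular or mesh

    def middleware(readings, store):   # middleware: store and process
        store.extend(r["value"] for r in readings)
        return mean(store)

    def application(avg):              # application: deliver a service to the user
        print(f"Average temperature: {avg:.1f} C")

    store = []
    application(middleware(network(perception()), store))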

Implementation Requirements

Effective and seamless implementation of IoT depends on specific tools, such as:

  • High-Level Security 

Security is one of the fundamental IoT implementation requirements. Since the IoT devices gather real-time sensitive data about the environment, it is critical to put in place high-level security measures that ensure that sensitive information stays protected and confidential.  

  • Asset Management

Asset management includes the software, hardware, and processes that ensure that the devices are registered, upgraded, secured, and well-managed. 

  • Cloud Computing

Since massive amounts of structured and unstructured data are gathered and processed, the data is stored in the cloud. The cloud acts as a centralized repository of resources that allows the data to be accessed easily. Cloud computing ensures seamless communication between the various IoT devices.

  • Data Analytics

With advanced algorithms, large amounts of data are processed and analyzed on the cloud platform. As a result, you can derive trends from the analytics and take corrective action, as the toy example below illustrates.
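
As a toy illustration of turning gathered data into corrective action, the sketch below flags a reading that drifts more than three standard deviations from recent history; the thresholding rule is an illustrative choice, not a prescription:

    from statistics import mean, stdev

    readings = [21.1, 21.4, 20.9, 21.3, 21.2, 27.8]   # last value is suspect
    history, latest = readings[:-1], readings[-1]
    mu, sigma = mean(history), stdev(history)

    if abs(latest - mu) > 3 * sigma:
        # A real system would raise an alert or trigger corrective action here
        print(f"Anomaly: {latest} vs mean {mu:.2f} (sigma {sigma:.2f})")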

What are the IoT Implementation Steps?

Knowing the appropriate IoT implementation steps will help your business align its goals and expectations with the solution. You can also ensure the entire process is time-bound, cost-efficient, and satisfies all your business needs.


Set Business Objectives 

IoT implementation should serve your business goals and objectives. Unfortunately, not every entrepreneur is an accomplished technician or computer-savvy. You can hire experts if you lack the practical know-how regarding IoT, the components needed, and specialist knowledge. 

Think of what you will accomplish with IoT, such as improving customer experience, eliminating operational inconsistencies, reducing costs, etc. With a clear understanding of IoT technology, you should be able to align your business needs to IoT applications. 

Hardware Components and Tools

Selecting the necessary tools, components, hardware, and software systems needed for the implementation is the next critical step. First, you must choose the tools and technology, keeping in mind connectivity and interoperability. 

You should also select the right IoT platform that acts as a centralized repository for collecting and controlling all aspects of the network and devices. You can choose to have a custom-made platform or get one from suppliers. 

Some of the major components you require for implementation include:

  • Sensors
  • Gateways
  • Communication protocols
  • IoT platforms
  • Analytics and data management software

Implementation

Before initiating the implementation process, it is recommended that you put together a team of IoT experts and professionals with relevant use-case experience and knowledge. Make sure that the team comprises experts from operations and IT with IoT-specific skill sets.

A typical team comprises experts with skills in mechanical engineering, embedded system design, electrical and industrial design, and front/back-end development.

Prototyping

Before giving the go-ahead, the team must develop an Internet of Things implementation prototype. 

A prototype will help you experiment and identify fault lines, connectivity, and compatibility issues. After testing the prototype, you can include modified design ideas. 

Integrate with Advanced technologies

After the sensors gather useful data, you can add layers of other technologies such as analytics, edge computing, and machine learning. 

The amount of unstructured data collected by the sensors far exceeds the structured data. However, both structured and unstructured data can be fed into machine learning, deep-learning neural systems, and cognitive computing technologies to drive improvement.

Take Security Measures

Security is one of the top concerns of most businesses. Since IoT depends predominantly on the internet to function, it is prone to security attacks. However, secure communication protocols, endpoint security, encryption, and access control management can minimize security breaches.
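
As one concrete measure from that list, the sketch below encrypts a sensor payload before it leaves the device, using the symmetric Fernet scheme from the Python cryptography package. The library choice is an illustrative assumption, and key provisioning and distribution are out of scope here:

    from cryptography.fernet import Fernet   # pip install cryptography

    # In practice the key is provisioned securely onto the device,
    # not generated on every run as done here for demonstration.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    payload = b'{"device": "pump-7", "pressure_bar": 4.2}'
    token = cipher.encrypt(payload)    # this ciphertext travels over the network
    print(cipher.decrypt(token))       # the platform side recovers the reading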

Although there are no standardized IoT implementation steps, most projects follow these processes. But the exact sequence of IoT implementation depends on your project’s specific needs.

Challenges in IoT Implementation

Every new technology comes with its own set of implementation challenges. 


When you keep these challenges of IoT implementation in mind, you’ll be better equipped to handle them. 

  • Lack of Network Security

When your entire system depends on network connectivity to function, you add another layer of security concerns to deal with.

Unless you have a robust network security system, you are bound to face issues such as servers or devices being hacked. Unfortunately, IoT hacking statistics are rising, with over 1.5 million security breaches reported in 2021 alone.

  • Data Retention and Storage 

IoT devices continually gather data, and over time the data becomes unwieldy to handle. Such massive amounts of data need high-capacity storage units and advanced IoT analytics technologies. 

  • Lack of Compatibility 

IoT implementation involves several sensors, devices, and tools, and a successful implementation largely depends on the seamless integration between these systems. In addition, since there are no standards for devices or protocols, there could be major compatibility issues during implementation. 

IoT is the latest technology that is delivering promising results. Yet, as with any technology, without proper implementation your business can't hope to leverage its immense benefits.

Taking chances with IoT implementation is not a smart business move, as your productivity, security, customer experience, and future depend on proper and effective implementation. The only way to harness this technology would be to seek a reliable IoT app development company that can take your initiatives towards success.

Read more…

How Does IoT Help in Retail? Continuous and seamless communication is now a reality between people, processes and things. IoT has been enabling retailers to connect with people and businesses and gain useful insight about product performance and people's engagement with those products.

Importance of IoT in Retail

  • It helps improve customer experience in new ways and helps brick and mortar shops compete with their online counterparts by engaging customers in different ways.
  • IoT can track customer preferences, analyze their habits, and share relevant information with marketing teams, helping to improve product or brand features and design and keeping the customer updated on new products, delivery status, etc.
  • Using IoT, retailers can increase efficiency and profitability in various ways.
  • IoT can significantly improve the overall customer experience, like automated checkouts and integration with messaging platforms and order systems.
  • It helps increase efficiency in transportation and logistics by reducing the time to deliver goods to market or store. It helps in vehicle management, and tracking deliveries. This helps in reducing costs, improving the bottom line and increasing customer satisfaction.
  • Inventory management becomes easier with IoT. Tracking inventory is much easier and simpler from the stocking of goods to initiating a purchase.
  • It helps increase operational efficiency in warehouses, by optimizing temperature controls, improving maintenance, and managing the warehouse. 

Use Cases of IoT in Retail

  1. IoT is used in Facility management to ensure day-to-day areas are clean and can be used to monitor consumable supplies levels. It can be used to monitor store environments like temperature, lighting, ventilation and refrigeration. IoT can identify key areas that can provide a complete 360 degrees view of facility management.
  2. It can help in tracking the number of persons entering a facility. This is especially useful because of the pandemic situation, to ensure that no overcrowding takes place.
    Occupancy sensors provide vital data on store traffic patterns and also on the time spent in any particular area. This helps retailers with better planning and product placement strategies. This helps in guided selling with more effective display setups, layouts, and space management.
  3. IoT helps in a big way for Supply chain and logistics, by providing information on the stock levels. 
  4. IoT helps in asset tracking in items like shopping carts and baskets. Sensors can ensure that location data is available for all carts making retrieval easy. It can help lock carts if they are taken out of location.
  5. IoT devices can and are being used to personalize the user experience. Bluetooth beacons are used to send personalized real-time alerts to phones when the customer is near an aisle or a store. This can prompt a customer to enter the store or look at the aisle area and take advantage of offers. IoT-based beacons help retailers such as Target collect user data and send hyper-personalized content to customers.
  6. Smart shelves are another example of innovative IoT ideas. Maintaining shelves to refill products or ensure correct items are placed on the right shelves is a time-consuming task. Smart shelves automate these tasks easily, saving time and reducing manual errors; a minimal sketch follows this list.
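
A minimal smart-shelf sketch: estimate remaining units from a shelf weight sensor and raise a restock alert below a threshold. The item weight and threshold are illustrative assumptions:

    ITEM_WEIGHT_G = 450     # assumed weight of one unit on this shelf
    MIN_UNITS = 5           # assumed restock threshold

    def check_shelf(shelf_id: str, measured_weight_g: float) -> int:
        units_left = int(measured_weight_g // ITEM_WEIGHT_G)
        if units_left < MIN_UNITS:
            print(f"Shelf {shelf_id}: only {units_left} units left - restock")
        return units_left

    check_shelf("A3", 1400)   # -> 3 units left, triggers a restock alert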

Businesses should utilize new technologies to revolutionize the retail sector. The digitalization, or digital transformation, of brick-and-mortar stores is not a new concept. With every industry wanting to improve its services and facilities and trying to stay ahead of the competition, digitalization in the retail industry is playing a big role in this transformation. To summarize, digitalization enhances data collection, enables data-driven customer insights, gives a better customer experience, and increases profits and productivity. It encourages a digital culture.

Read more…

Arm DevSummit 2020 debuted this week (October 6 – 8) as an online virtual conference focused on engineers and providing them with insights into the Arm ecosystem. The summit lasted three days, over which Arm painted an interesting technology story about the current and future state of computing and where developers fit within that story. I've been attending Arm Techcon (which has become Arm DevSummit) for more than half a decade now, and as I perused the content, there were several take-a-ways I noticed for developers working on microcontroller-based embedded systems. In this post, we will examine these key take-a-ways and I'll point you to some of the sessions that I think may also pique your interest.

(For those of you that aren't yet aware, you can register up until October 21st (for free) and still watch the conference materials up until November 28th. Click here to register.)

Take-A-Way #1 – Expect Big Things from NVIDIAs Acquisition of Arm

As many readers probably already know, NVIDIA is in the process of acquiring Arm. This acquisition has the potential to be one of the focal points that I think will lead to a technological revolution in computing technologies, particularly around artificial intelligence but that will also impact nearly every embedded system at the edge and beyond. While many of us have probably wondered what plans NVIDIA CEO Jensen Huang may have for Arm, the Keynotes for October 6th include a fireside chat between Jensen Huang and Arm CEO Simon Segars. Listening to this conversation is well worth the time and will help give developers some insights into the future but also assurances that the Arm business model will not be dramatically upended.

Take-A-Way #2 – Machine Learning for MCUs is Accelerating

It is sometimes difficult at a conference to get a feel for what is real and what is a little more smoke and mirrors. Sometimes, announcements are real, but they just take several years to filter their way into the market and affect how developers build systems. Machine learning is one of those technologies that I find there is a lot of interest around but that developers also aren’t quite sure what to do with yet, at least in the microcontroller space. When we hear machine learning, we think artificial intelligence, big datasets and more processing power than will fit on an MCU.

There were several interesting talks at DevSummit around machine learning.

Some of these were foundational, providing embedded developers with the fundamentals to get started, while others provided hands-on explorations of machine learning with development boards. The take-a-way that I gather here is that the effort to bring machine learning capabilities to microcontrollers, so that they can be leveraged in industry use cases, is accelerating. Lots of effort is being put into ML algorithms, tools, frameworks and even the hardware. Several talks mentioned Arm's Cortex-M55 processor, which will include Helium technology to help accelerate machine learning and DSP processing capabilities.

Take-A-Way #3 – The Constant Need for Reinvention

In my last take-a-way, I alluded to the fact that things are accelerating. Acceleration is not just happening in the technologies that we use to build systems, though. The very application domains that we can apply these technologies to are dramatically expanding. Not only can we start to deploy security and ML technologies at the edge, but also in domains such as space and medical systems. There were several interesting talks about how technologies are being used around the world to solve interesting and unique problems such as protecting vulnerable ecosystems, mapping the sea floor, fighting against diseases and so much more.

By carefully watching and listening, you’ll notice that many speakers have been involved in many different types of products over their careers and that they are constantly having to reinvent their skill sets, capabilities and even their interests! This is what makes working in embedded systems so interesting! It is constantly changing and evolving and as engineers we don’t get to sit idly behind a desk. Just as Arm, NVIDIA and many of the other ecosystem partners and speakers show us, technology is rapidly changing but so are the problem domains that we can apply these technologies to.

Take-A-Way #4 – Mbed and Keil are Evolving

There are also interesting changes coming to the Arm toolchains and tools like Mbed and Keil MDK. In Reinhard Keil's talk, “Introduction to an Open Approach for Low-Power IoT Development”, developers got an insight into the changes that are coming to Mbed and Keil, with the core focus being on IoT development. The talk focused on the endpoint and discussed how Mbed and Keil MDK are being moved to an online platform designed to help developers move through product development faster, from prototyping to production. Keil Studio Online is currently in early access and will be released early next year.

(If you are interested in endpoints and AI, you might also want to check out this article on “How Do We Accelerate Endpoint AI Innovation? Put Developers First”)

Conclusions

Arm DevSummit had a lot to offer developers this year and without the need to travel to California to participate. (Although I greatly missed catching up with friends and colleagues in person). If you haven’t already, I would recommend checking out the DevSummit and watching a few of the talks I mentioned. There certainly were a lot more talks and I’m still in the process of sifting through everything. Hopefully there will be a few sessions that will inspire you and give you a feel for where the industry is headed and how you will need to pivot your own skills in the coming years.

Originally posted here

Read more…

Will We Ever Get Quantum Computers?

In a recent issue of IEEE Spectrum, Mikhail Dyakonov makes a pretty compelling argument that quantum computing (QC) isn't going to fly anytime soon. Now, I'm no expert on QC, and there sure is a lot of money being thrown at the problem by some very smart people, but having watched from the sidelines, QC seems a lot like fusion research. Every year more claims are made, more venture capital gets burned, but we don't seem to get closer to useful systems.

Consider D-Wave Systems. They've been trying to build a QC for twenty years, and indeed do have products more or less on the market, including, it's claimed, one of 1024 q-bits. But there's a lot of controversy about whether their machines are either quantum computers at all, or if they offer any speedup over classical machines. One would think that if a 1K q-bit machine really did work the press would be all abuzz, and we'd be hearing constantly of new incredible results. Instead, the machines seem to disappear into research labs.

Mr. Dyakonov notes that optimistic people expect useful QCs in the next 5-10 years; those less sanguine expect 20-30 years, a prediction that hasn't changed in two decades. He thinks a window of many decades to never is more realistic. Experts think that a useful machine, one that can do the sort of calculations your laptop is capable of, will require between 1,000 and 100,000 q-bits. To me, this level of uncertainty suggests that there is a profound lack of knowledge about how these machines will work and what they will be able to do.

According to the author, a 1000 q-bit machine can be in 2^1000 states (a classical machine with N transistors can be in only 2^N states), which is about 10^300, or more than the number of sub-atomic particles in the universe. At 100,000 q-bits we're talking 10^30,000, a mind-boggling number.
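
A quick check of those magnitudes:

    from math import log10

    print(1000 * log10(2))      # ~301   -> 2**1000 is about 10**301
    print(100_000 * log10(2))   # ~30103 -> 2**100000 is about 10**30103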

Because of noise, expect errors. Some theorize that those errors can be eliminated by adding q-bits, on the order of 1000 to 100,000 additional per q-bit. So a useful machine will need at least millions, or perhaps many orders of magnitude more, of these squirrelly microdots that are tamed only by keeping them at 10 millikelvin.

A related article in Spectrum mentions that a committee of prestigious researchers, tasked with assessing the probability of success with QC, concluded that:

"[I]t is highly unexpected" that anyone will be able to build a quantum computer that could compromise public-key cryptosystems (a task that quantum computers are, in theory, especially suitable for tackling) in the coming decade. And while less-capable "noisy intermediate-scale quantum computers" will be built within that time frame, "there are at present no known algorithms/applications that could make effective use of this class of machine," the committee says."

I don't have a dog in this fight, but am relieved that useful QC seems to be no closer than The Distant Shore (to quote Jan de Hartog, one of my favorite writers). If it were feasible to easily break encryption schemes, banking and other systems could collapse. I imagine Blockchain would fail as hash algorithms became reversible. The resulting disruption would not be healthy for our society.

On the other hand, Bruce Schneier's article in the March issue of IEEE Computing Edge suggests that QC won't break all forms of encryption, though he does think a lot of our current infrastructure will be vulnerable. The moral: if and when QC becomes practical, expect chaos.

I was once afraid of quantum computing, as it involves mechanisms that I'll never understand. But then I realized those machines will have an API. Just as one doesn't need to know how a computer works to program in Python, we'll be insulated from the quantum horrors by layers of abstraction.

Originally posted here

Read more…

SSE Airtricity employees Derek Conty, left, Francie Byrne, middle, and Ryan Doran, right, install solar panels on the roof of Kinsale Community School in Kinsale, Ireland. The installation is part of a project with Microsoft to demonstrate the feasibility of distributed power purchase agreements. Credit: Naoise Culhane

by John Roach

Solar panels being installed on the roofs of dozens of schools throughout Dublin, Ireland, reflect a novel front in the fight against global climate change, according to a senior software engineer and a sustainability lead at Microsoft.

The technology company partnered with SSE Airtricity, Ireland's largest provider of 100% green energy and a part of the FTSE-listed SSE Group, to install and manage the internet-connected solar panels, which are connected via Azure IoT to Microsoft Azure, a cloud computing platform.

The software tools aggregate and analyze real-time data on energy generated by the solar panels, demonstrating a mechanism for Microsoft and other corporations to achieve sustainability goals and reduce the carbon footprint of the electric power grid.

"We need to decarbonize the global economy to avoid catastrophic climate change," said Conor Kelly, the software engineer who is leading the distributed solar energy project for Microsoft Azure IoT. "The first thing we can do, and the easiest thing we can do, is focus on electricity."

Microsoft's $1.1 million contribution to the project builds on the company's ongoing investment in renewable energy technologies to offset carbon emissions from the operation of its datacenters.

A typical approach to powering datacenters with renewable energy is for companies such as Microsoft to sign so-called power purchase agreements with energy companies. The agreements provide the financial guarantees needed to build industrial-scale wind and solar farms and connections to the power grid.

The new project demonstrates the feasibility of agreements to install solar panels on rooftops distributed across towns with existing grid connections and use internet of things, or IoT, technologies to aggregate the accumulated energy production for carbon offset accounting.

"It utilizes existing assets that are sitting there unmonetized, which are roofs of buildings that absorb sunlight all day," Kelly said.

New Business Model

The project is also a proof-of-concept, or blueprint, for how energy providers can adapt as the falling price of solar panels enables distributed electric power generation throughout the existing electric power grid.

Traditionally, suppliers purchase power from central power plants and industrial-scale wind and solar farms and sell it to consumers on the distribution grid. Now, energy providers like SSE Airtricity provide renewable energy solutions that allow end consumers to generate power, from sustainable sources, using the existing grid connection on their premises.

"The more forward-thinking energy providers that we are working with, like SSE Airtricity, identify this as an opportunity and industry changing shift in how energy will be generated and consumed," Kelly noted.

The opportunity comes in the ability to finance the installation of solar panels and batteries at homes, schools, businesses and other buildings throughout a community and leverage IoT technology to efficiently perform a range of services from energy trading to carbon offset accounting.

Kelly and his team with Azure IoT are working with SSE Airtricity to develop the tools and machine learning models necessary to unlock this opportunity.

"Instead of having utility scale solar farms located outside of cities, you could have a solar farm at the distribution level, spread across a number of locations," said Fergal Ahern, a business energy solutions manager and renewable energy expert with SSE Airtricity.

For the distributed power purchase agreement, SSE Airtricity uses Azure IoT to aggregate the generation of all the solar panels installed across 27 schools around the provinces of Leinster, Munster and Connacht and run it through a machine learning model to determine the carbon emissions that the solar panels avoid.

The schools use the electricity generated by the solar panels, which reduces their utility bills; Microsoft receives the renewable energy credits for the generated electricity, which the company applies to its carbon neutrality commitments.

The panels are expected to produce enough energy annually to power the equivalent of 68 Irish homes for a year and abate more than 2.1 million kilograms, which is equivalent to 4.6 million pounds, of carbon dioxide emissions over the 15 years of the agreement, according to Kelly.

"This is additional renewable energy that wouldn't have otherwise happened," he said. "Every little bit counts when it comes to meeting our sustainability targets and combatting climate change."

Every little bit counts

Victory Luke, a 16-year-old student at Collinstown Park Community College in Dublin, has lived by the "every little bit counts" mantra since she participated in a "Generation Green" sustainability workshop in 2019 organized by the Sustainable Energy Authority of Ireland, SSE Airtricity and Microsoft.

The workshop was part of an education program surrounding the installation of solar panels and batteries at her school along with a retrofit of the lighting system with LEDs. Digital screens show the school's energy use in real time, allowing students to see the impact of the energy efficiency upgrades.

Luke said the workshop captured her interest on climate change issues. She started reading more about sustainability and environmental conservation and agreed to share her newfound knowledge with the younger students at her school.

"I was going around and talking to them about energy efficiency, sharing tips and tricks like if you are going to boil a kettle, only boil as much water as you need, not too much," she explained.

That June, the Sustainable Energy Authority of Ireland invited her to give a speech at the Global Conference on Energy Efficiency in Dublin, which was organized by the International Energy Agency, an organization that works with governments and industry to shape sustainable energy policy.

"It kind of felt surreal because I honestly felt like I wasn't adequate enough to be speaking about these things," she said, noting that the conference attendees included government ministers, CEOs and energy experts from around the world.

At the time, she added, the global climate strike movement and its youth leaders were making international headlines, which made her advocacy at school feel even smaller. "Then I kind of realized that it is those smaller things that make the big difference," she said.

SSE Airtricity and Microsoft plan to replicate the educational program that inspired Luke and her classmates at dozens of the schools around Ireland that are participating in the project.

"When you've got solar at a school and you can physically point at the installation and a screen that monitors the power being generated, it brings sustainability into daily school life," Ahern said.

Proof of concept for policymakers

The project's education campaign extends to renewable energy policymakers, Kelly noted. He explained that renewable energy credits—a market incentive for corporations to support renewable energy projects—are currently unavailable for distributed power purchase agreements.

For this project, Microsoft will receive genuine renewable energy credits from a wind farm that SSE Airtricity also operates, he added.

"And," he said, "we are hoping to use this project as an example of what regulation should look like, to say, 'You need to award renewable energy credits to distributed generation because they would allow corporates to scale-up this type of project.'"

For her part, Luke supports steps by multinational corporations such as Microsoft to invest in renewable energy projects that address global climate change.

"It is a good thing to see," she said. "Once one person does something, other people are going to follow.

Originally posted HERE

Read more…

An edge device is the network component that is responsible for connecting a local area network to an external or wide area network, which can be accessed from anywhere. Edge devices offer several new services and improved outcomes for IoT deployments across all markets. Smart services that rely on high volumes of data and local analysis can be deployed in a wide range of environments.

An edge device provides local data to an external network. If the local and external networks use different protocols, it also translates between them and makes the connection across both network boundaries. Edge devices handle diagnostics and automatic data population; however, a secure connection between the field network and cloud computing is necessary. In the event of a lost internet connection or a cloud crash, the edge device will store data until the connection is re-established, so no process information is lost. Local data storage is optional and not all edge devices offer it; it depends on the application and the service to be implemented on the plant.

How does an edge device work?

An edge device has a very straightforward working principle: it communicates between two different networks and translates one protocol into another. Furthermore, it creates a secure connection with the cloud.

An edge device can be configured via local access or over the internet/cloud. In general, an edge device is plug-and-play: its setup is simple and does not require much time to configure.
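
The working principle can be sketched in a few lines: read from the field side, translate, and buffer locally whenever the cloud link is down. This is a minimal sketch; read_field_value and publish_to_cloud are illustrative stand-ins for a fieldbus read and a secure MQTT/HTTPS publish:

    import json, random, time
    from collections import deque

    buffer = deque()                       # local store-and-forward queue

    def read_field_value():                # stand-in for a fieldbus/serial read
        return {"tag": "FT-101", "flow": round(random.uniform(4.0, 6.0), 2)}

    def publish_to_cloud(msg) -> bool:     # stand-in for a secure cloud publish
        print("->", json.dumps(msg))
        return True                        # return False to simulate a lost link

    for _ in range(3):
        buffer.append(read_field_value())          # always buffer first
        while buffer and publish_to_cloud(buffer[0]):
            buffer.popleft()                       # drop only after delivery
        time.sleep(1)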

Why should I use an edge device?

Depending on the service required in the plant, the edge devices will be a crucial point to collect the information and create an automatic digital twin of your device in the cloud. 

Edge devices are an essential part of IoT solutions since they connect the information from a network to a cloud solution. They do not affect the network but only collect data from it, and never cause a problem with the communication between the control system and the field devices. By using an edge device to collect information, the user won't need to touch the control system. Edge communication is one-way: nothing is written into the network, and data is acquired with the highest possible security.

Edge device requirements

Edge devices must meet certain requirements under all conditions in order to perform in different situations. These may include storage, network, latency, and so on.

Low latency

Sensor data is collected in near real-time by an edge server. For services like image recognition and visual monitoring, edge servers are located in very close proximity to the device, meeting low-latency requirements. Edge deployment needs to ensure that these services are not lost through poor development practice or inadequate processing resources at the edge. Maintaining data quality and security at the edge whilst enabling low latency is a challenge that needs to be addressed.

Network independence

IoT services do not care about the data communication topology. The user requires the data through the most effective means possible, which in many cases will be mobile networks; in some scenarios, Wi-Fi or local mesh networking may be the most effective mechanism for collecting data and ensuring latency requirements are met.


Data security

Users require data at the edge to be kept as secure as when it is stored and used elsewhere. These challenges must be met despite the larger vector and scope for attacks at the edge. Data authentication and user access control are as important at the edge as they are on the device or at the core. Additionally, the physical security of edge infrastructure needs to be considered, as it is likely to be housed in less secure environments than dedicated data centers.

Data Quality

Data quality at the edge is a key requirement for guaranteed operation in demanding environments. To maintain data quality at the edge, applications must ensure that data is authenticated, replicated as required, and assigned to the correct classes and types of data category.

Flexibility in future enhancements

Additional sensors can be added and managed at the edge as requirements change. Sensors such as accelerometers, cameras, and GPS, can be added to equipment, with seamless integration and control at the edge.

Local storage

Local storage matters in the event of a lost internet connection or a cloud crash: the edge device will store data until the connection is re-established, so no process information is lost. That said, local data storage is optional and not all edge devices offer it; it depends on the application and the service to be implemented on the plant.

Originally posted here

Read more…

by Singapore University of Technology and Design

Internet of Things (IoT) devices such as smart home locks and medical devices depend largely on Bluetooth Low Energy (BLE) technology to function and connect to other devices with reduced energy consumption. As these devices become more prevalent, with increasing levels of connectivity, the need for strengthened security in IoT has also become vital.

A research team led by Assistant Professor Sudipta Chattopadhyay from the Singapore University of Technology and Design (SUTD), with team members from SUTD and the Institute for Infocomm Research (I2R), designed and implemented the Greyhound framework, a tool used to discover SweynTooth, a critical set of 11 cyber vulnerabilities.

Their study was presented at the USENIX Annual Technical Conference (USENIX ATC) on 15 to 17 July 2020 and they have been invited to present at the upcoming Singapore International Cyber Week (SICW) in October 2020.

These security lapses were found to affect devices by causing them to crash, reboot or bypass security features. At least 12 BLE-based devices from eight vendors were affected, spanning a few hundred types of IoT products, including pacemakers, wearable fitness trackers and home security locks.

The SweynTooth code has since been made available to the public and several IoT product manufacturers have used it to find security issues in their products. In Singapore alone, 32 medical devices were reported to be affected by SweynTooth, and 90% of these device manufacturers have since implemented preventive measures against this set of cyber vulnerabilities.

Regulatory agencies including the Cyber Security Agency and the Health Sciences Authority in Singapore as well as the Department of Homeland Security and the Food and Drug Administration in the United States have reached out to the research team to further understand the impact of these vulnerabilities.

These agencies have also raised public alerts to inform medical device manufacturers, healthcare institutions and end users on the potential security breach and disruptions. The research team continues to keep them updated on their research findings and assessments.

Beyond Bluetooth technology, the research team designed the Greyhound framework using a modular approach so that it could easily be adapted for new wireless protocols. This allowed the team to test it across the diverse set of protocols that IoT devices frequently employ. This automated framework also paves new avenues for testing the security of more complex protocols and IoT devices in next-generation wireless protocol implementations such as 5G and NarrowBand-IoT, which require rigorous and systematic security testing.

"As we are transitioning towards a smart nation, more of such vulnerabilities could appear in the future. We need to start rethinking the device manufacturing design process so that there is limited reliance on communication modules such as Bluetooth to ensure a better and more secure smart nation by design," explained principal investigator Assistant Professor Sudipta from SUTD.

Originally posted HERE.

Read more…

When you’re in technology, you have to expect change. Yet, there’s something to the phrase “the more things change, the more they stay the same.” For instance, I see in the industrial internet of things (IIoT) a realm that’ll dramatically shape the future - how we manufacture, the way we run our factories, workforce needs – but the underlying business goals are the same as always.

Simply put, while industrial enterprise initiatives may change, financial objectives don’t – and they’re still what matter most. That’s why IIoT is so appealing. While the possibilities of smart and connected operations, sites and products certainly appeal to the dreamer and innovator, the clear payoff ensures that it’s a road even the most pragmatic decision-maker will eagerly follow.

The big three
When it comes to industrial enterprises, IIoT addresses the “big three” financial objectives head on. The technology maximizes revenue growth, reduces operating expense and increases asset efficiency.

IIoT does this in numerous ways. It yields invaluable operational intelligence, like real-time performance management data, to reduce manufacturing costs, increase flexibility and enable agility. When it comes to productivity, connected digital assets can empower a workforce with actionable insights to improve productivity and quality, even prevent safety and compliance issues.

For example, recognizing defects in a product early on can save time, materials, staff hours and possibly even a company’s reputation.

Whether on or off the factory floor, IIoT can be used to optimize asset efficiency. With real-time monitoring, diagnostics and analytics, downtime can be reduced or avoided. Asset utilization can also be evaluated and maximized. Think applications like equipment health monitoring, predictive maintenance, the ability to provide augmented 3D instructions for complex repairs. And, you can also scale production more precisely via better control over processes and inventory.

All of this accelerates time to market; another key benefit of IIoT and long held business goal.

Why is 5G important for IIoT and augmented reality (AR)?
As we look at the growing need to connect more devices and more sensors and to install things like real-time cameras for doing analytics, growing stress and strain is brought into industrial settings. The need to increase connectivity while gaining greater scalability, performance, accessibility, reliability, and broader reach with a lower cost of ownership has become much more important. This is where 5G can make a real difference.

Many of our customers have seen what we are doing with augmented reality and the way that PTC can help operators service equipment. But in the not so distant future, the way that people interact with robotics, for example, will change. There will be real-time video to do spatial analytics on the way that people are working with man and machines and we’ll be able to unlock a new level of intelligence with a new layer of connectivity that helps drive better business outcomes.

Partner up
It sounds nice, but the truth is, a lot of heavy lifting is required to do IIoT right. The last thing you want to do is venture into a pilot, run into problems, and leave the C-suite less than enthused with the outcome. And make no mistake, there are a lot of potential pitfalls to be aware of.

For instance, lengthy proof of concept periods, cumbersome processes and integrations can slow time to market. Multiple, local integrations can be required when connectivity and device management gets siloed. If not done right, you may only gain limited visibility into devices and the experience will fall short. And, naturally, global initiatives can be hindered by high roaming costs and deployment obstacles.

That said, you want to harness best-of-breed providers, not only to realize the full benefits of Industry 4.0, but to set yourself up with a foundation that'll be able to harness 5G developments. You need a trusted IoT partner, and because of the sophistication and complexity, it takes an ecosystem of proven innovators working collaboratively.

That’s why PTC and Ericsson are partners.

Doing what’s best
Ericsson unlocks the full value of global cellular IoT connectivity and provides on-premise solutions. PTC offers an industrial IoT platform that’s ready to configure and deploy, with flexible connectivity and capabilities to build solutions without manual coding.

Drilling down a bit further, Ericsson’s IoT Accelerator can connect and manage billions of devices and millions of applications easily, seamlessly and globally. PTC’s IoT solutions digitalize processes and products, combining the physical and digital worlds seamlessly.

And with wireless connectivity, we can deploy a lot of new technology – from augmented reality to artificial intelligence applications – without having to think about the time and cost of creating fixed infrastructures, running wires, adding network capacity and more.

According to ABI Research, organizations that embrace Industry 4.0 and private cellular have the potential to improve gross margins by 5-13% in factory and warehouse operations. Manufacturers can expect a 10x return on their investment. And with 4.3 billion wireless connections in smart factories anticipated by 2030, it's clear where things are headed.

By focusing on what we each do best, PTC and Ericsson are able to do what's best for our customers. We can help them build and scale global cellular IoT deployments faster and gain a competitive advantage. They can reap the advantages of Industry 4.0 and create that path to 5G, future-proofing their operations and enjoying such differentiators as network slicing, edge computing and high-reliability, low-latency communications.

Further, with our histories of innovation, customers are assured they'll be supported in the future, remaining out front with the ability to adapt to change, grow and deliver on financial objectives.

Editor's Note: This post was originally published by Steve Dertien, Chief Technology Officer for PTC, on Ericsson's website, and is part of a joint content effort with Kiva Allgood, head of IoT for Ericsson. To view Steve's original, please click here. To read Kiva's complementary post, please click here.

Read more…

A scientist from Russia has developed a new neural network architecture and tested its learning ability on the recognition of handwritten digits. The intelligence of the network was amplified by chaos, and the classification accuracy reached 96.3%. The network can be used in microcontrollers with a small amount of RAM and embedded in such household items as shoes or refrigerators, making them 'smart.' The study was published in Electronics.

Today, the search for new neural networks that can operate on microcontrollers with a small amount of random-access memory (RAM) is of particular importance. For comparison, in ordinary modern computers, random-access memory is measured in gigabytes. Although microcontrollers possess significantly less processing power than laptops and smartphones, they are smaller and can be interfaced with household items. Smart doors, refrigerators, shoes, glasses, kettles and coffee makers create the foundation for so-called ambient intelligence, a term that denotes an environment of interconnected smart devices.

An example of ambient intelligence is a smart home. Devices with limited memory are not able to store a large number of keys for secure data transfer or arrays of neural network settings. This prevents the introduction of artificial intelligence into Internet of Things devices, as they lack the required computing power. However, artificial intelligence would allow smart devices to spend less time on analysis and decision-making, better understand a user, and assist them in a friendly manner. Therefore, many new opportunities can arise in the creation of ambient intelligence, for example, in the field of health care.

Andrei Velichko from Petrozavodsk State University, Russia, has created a new neural network architecture that allows efficient use of small volumes of RAM and opens opportunities for introducing low-power devices to the Internet of Things. The network, called LogNNet, is a feed-forward neural network in which the signals are directed exclusively from input to output. It uses deterministic chaotic filters for the incoming signals. The system randomly mixes the input information but at the same time extracts valuable features that are initially invisible in the data. A similar mechanism is used by reservoir neural networks. To generate chaos, a simple logistic mapping equation is applied, in which the next value is calculated from the previous one. The equation is commonly used in population biology and is a textbook example of a simple equation for generating a sequence of chaotic values. In this way, the simple equation yields an effectively unlimited supply of pseudo-random numbers computed on the fly by the processor, and the network architecture uses them while consuming less RAM.
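
The logistic map itself is one line: x_{n+1} = r * x_n * (1 - x_n), which behaves chaotically for r close to 4. The sketch below regenerates a chaotic coefficient sequence on demand from just two stored numbers, r and x0, instead of keeping an array of weights in RAM; the mixing step is a simplified illustration of the idea, not the actual LogNNet architecture:

    def logistic_sequence(r=3.9, x0=0.25, n=10):
        """Regenerate n chaotic values from the seed (r, x0)."""
        x, out = x0, []
        for _ in range(n):
            x = r * x * (1 - x)    # logistic map: x_{n+1} = r*x_n*(1-x_n)
            out.append(x)
        return out

    # Mix an input vector with chaotic coefficients computed on the fly,
    # so only (r, x0) must be kept in memory, not the coefficients.
    inputs = [0.1, 0.5, 0.9]
    coeffs = logistic_sequence(n=len(inputs))
    print(sum(c * v for c, v in zip(coeffs, inputs)))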


The scientist tested his neural network on handwritten digit recognition using the MNIST database, which is considered the standard for training neural networks to recognize images. The database contains more than 70,000 handwritten digits; 60,000 of these are intended for training the neural network, and another 10,000 for network testing. The more neurons and chaos in the network, the better it recognized images. The maximum accuracy achieved by the network is 96.3%, while the developed architecture uses no more than 29 kB of RAM. In addition, LogNNet demonstrated promising results at very small RAM sizes, in the range of 1-2 kB. A miniature controller such as the ATmega328, which can be embedded into a smart door or even a smart insole, has approximately this amount of memory.

"Thanks to this development, new opportunities for the Internet of Things are opening up, as any device equipped with a low-power miniature controller can be powered with artificial intelligence. In this way, a path is opened for intelligent processing of information on peripheral devices without sending data to cloud services, and it improves the operation of, for example, a smart home. This is an important contribution to the development of IoT technologies, which are actively researched by the scientists of Petrozavodsk State University. In addition, the research outlines an alternative way to investigate the influence of chaos on artificial intelligence," said Andrei Velichko.

Originally posted HERE.

by Russian Science Foundation

Image Credit: Andrei Velichko

Read more…

Impact of IoT in Inventory

The Internet of Things (IoT) has revolutionized many industries, including inventory management. IoT is a concept in which devices are interconnected via the internet. It was expected that by 2020 there would be 26 billion connected devices worldwide. These connections are important because they allow data sharing, which can then drive actions that make life and business more efficient. Since inventory is a significant portion of a company's assets, inventory data is vital for the accounting department's asset management and the company's annual report.

In inventory solutions based on IoT and RFID, each individual inventory item receives an RFID tag. Each tag has a unique identification number (ID) that contains information about the inventory item, e.g. a model, a batch number, etc. These tags are scanned by an RFID reader. Upon scanning, the reader extracts the tag IDs and transmits them to the cloud for processing. Along with the tag's ID, the cloud receives the location and the time of reading. This data is used for updates about inventory items, allowing users to monitor the inventory from anywhere, in real time.

Industrial IoT

The role of IoT in inventory management is to receive data and turn it into meaningful insights about inventory items' location and status, and to give users a corresponding output. For example, based on the data and the inventory management solution's architecture, we can forecast the number of raw materials needed for the upcoming production cycle. The output of the system can also send an alert if any individual inventory item is lost.

Moreover, IoT based inventory management solutions can be integrated with other systems, i.e. ERP and share data with other departments.

RFID in Industrial IoT

RFID consists of three main components: a tag, an antenna, and a reader.

Tags: An RFID tag carries information about a specific object. It can be attached to any surface, including raw materials, finished goods, packages, etc.

RFID antennas: An RFID antenna receives signals to supply power and data for the tags' operation.

RFID readers: An RFID reader uses radio signals to read from and write to the tags. The reader receives the data stored in the tag and transmits it to the cloud; a small processing sketch follows.
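
Putting the three components together, each read arrives in the cloud as a small event record. Below is a minimal sketch of updating an inventory view from such events; the field names are illustrative assumptions:

    from datetime import datetime, timezone

    inventory = {}   # tag ID -> last known location and time

    def on_tag_read(tag_id: str, reader_location: str):
        """Record where and when a tag was last seen by a reader."""
        inventory[tag_id] = {
            "location": reader_location,
            "seen_at": datetime.now(timezone.utc).isoformat(),
        }

    on_tag_read("EPC-00042", "warehouse-dock-3")
    on_tag_read("EPC-00042", "assembly-line-1")   # the item moved
    print(inventory["EPC-00042"])                 # the latest location wins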

Benefits of IoT in inventory management

The benefits of IoT on the supply chain are the most exciting physical manifestations we can observe. IoT in the supply chain creates unparalleled transparency that increases efficiencies.

Inventory tracking

The major benefit of IoT in inventory management is asset tracking: instead of using barcodes to scan and record data, items carry RFID tags, which can be registered wirelessly. It is possible to accurately obtain data and track items from any point in the supply chain.

With RFID and IoT, managers don’t have to spend time on manual tracking and reporting on spreadsheets. Each item is tracked and the data about it is recorded automatically. Automated asset tracking and reporting save time and reduce the probability of human error.

Inventory optimization

With real-time data about the quantity and location of inventory, manufacturers can reduce the amount of inventory on hand while meeting the needs of the customers at the end of the supply chain.

Combining data about the amount of available inventory with machine learning makes it possible to forecast the required inventory, which allows manufacturers to reduce lead time, as the sketch below illustrates.
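
A minimal sketch of that idea, using a moving-average demand forecast and a simple reorder point; the forecasting rule and the numbers are illustrative assumptions, not a recommendation:

    from statistics import mean

    weekly_demand = [120, 135, 128, 140, 150, 145]   # units consumed per week
    on_hand = 260                                    # current stock
    lead_time_weeks = 2

    forecast = mean(weekly_demand[-4:])              # moving-average forecast
    reorder_point = forecast * lead_time_weeks       # demand during lead time

    if on_hand < reorder_point:
        print(f"Reorder: {on_hand} on hand < reorder point {reorder_point:.0f}")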

Remote tracking

Remote product tracking makes it easy to keep an eye on production and the business. Knowing production and transit times allows you to better tweak orders to suit lead times and respond to fluctuating demand. It shows which suppliers are meeting production and shipping criteria and which need closer monitoring to get the required outcome.

It gives visibility into the flow of raw materials, work-in-progress, and finished goods by providing updates about the status and location of items, so that inventory managers can see when an individual item enters or leaves a specific location.

Bottlenecks in the operations

With real-time data about location and quantity, manufacturers can reveal bottlenecks in the process and pinpoint machines with lower utilization rates. For instance, if part of the inventory tends to pile up in front of a machine, a manufacturer can infer that the machine is a bottleneck and needs attention.

The Outcomes

The data collected through IoT-based inventory management is more accurate and up to date. By reducing the delay between physical events and their recording, manufacturers can enhance accuracy and reduce wastage. An IoT-based inventory management solution offers complete visibility into inventory by providing real-time information fetched from RFID tags. It helps track the exact location of raw materials, work-in-progress, and finished goods. As a result, manufacturers can balance the amount of on-hand inventory, increase the utilization of machines, reduce lead time, and thus avoid the costs of less effective methods. It is all about optimizing inventory and ensuring that anything ordered can be sold through whatever channel is necessary.

Originally posted here

Read more…

After so many years evangelizing the Internet of Things (IoT) or developing IoT products or selling IoT services or using IoT technologies, it is hard to believe that today there are as many defenders as detractors of these technologies. Why does the doubt still assail us: "Believe or Not Believe in the IoT"? What's the reason we keep saying every year that the time for IoT is finally now?

Doesn’t it seem strange that, having already experienced the power of change that connected devices bring to ourselves (wearables), our homes, our cities, transportation, and business, we still have so many non-believers? Maybe it is because expectations in 2013 were so great that now, in 2020, we need more tangible and realistic data and facts to continue believing.

In recent months I have had more time to review my articles and some white papers and I think I have found some reasons to continue believing, but also reasons not to believe.

Below are some of these reasons, so you can decide where to position yourself.

Top reasons to believe

  • McKinsey continues to present new opportunities with IoT
    • In its 2015 report “The Internet of Things: Mapping the value beyond the hype,” the firm estimated a potential economic impact of as much as USD 11.1 trillion per year by 2025 for IoT applications in 9 settings.
    • In 2019’s “Growing opportunities in the Internet of Things,” they said that “The number of businesses that use the IoT technologies has increased from 13 percent in 2014 to about 25 percent today. And the worldwide number of IoT connected devices is projected to increase to 43 billion by 2023, an almost threefold increase from 2018.”
  • Gartner in 2019 predicted that by 2021 there would be over 25 billion live IoT endpoints, enabling an almost unlimited number of IoT use cases.
  • Harbor Research considers that the market opportunity for industrial internet of things (IIoT) and industry 4.0 is still emergent.
    • Solutions are not completely new but are evolving from the convergence of existing technologies; creative combinations of these technologies will drive many new growth opportunities;
    • As integration and interoperability across the industrial technology “stack” relies on classic IT principles like open architectures, many leading IT players are entering the industrial arena;
  • IoT regulation is coming - The lack of regulation is one of the biggest issues associated with IoT devices, but things are starting to change in that regard as well. The U.S. government was among the first to take the threat posed by unsecured IoT devices seriously, introducing several IoT-related bills in Congress over the last couple of years. It all began with the IoT Cybersecurity Improvement Act of 2017, which set minimum security standards for connected devices obtained by the government. This legislation was followed by the SMART IoT Act, which tasked the Department of Commerce with conducting a study of the current IoT industry in the United States.
  • Synergy of IoT and AI - IoT supported by artificial intelligence considerably enhances success across a large repertoire of everyday applications, dominant among them enterprise, transportation, robotics, industrial, and automation systems.
  • Believe in superpowers again, thanks to IoT - Today, IoT sensors are everywhere: in your car, in electronic appliances, in traffic lights, probably even on the pigeon outside your window (it’s true, it happened in London!). IoT sensors will help cities map air quality, identify high-pollution pockets, and trigger alerts if pollution levels rise dangerously, while tracking changes over time and taking preventive measures to correct the situation. Thanks to IoT, connected cars will communicate seamlessly with sensors and find empty parking spots easily; sensors in your car will also communicate with your GPS and the manufacturer’s system, making maintenance and driving a breeze. City sensors will identify high-traffic areas and regulate traffic flows by updating your GPS with alternate routes, and they can even identify and report broken street lamps. IoT will be our knight in shining, super-strong metallic armor, preventing accidents like floods, fires, and even road accidents simply by monitoring the fatigue levels of truck drivers. Washing machines, refrigerators, and air conditioners will self-monitor their usage, performance, and servicing requirements, triggering alerts before potential breakdowns and optimizing performance with automatic software updates. IoT sensors will help medical professionals monitor pulse rates, blood pressure, and other vitals more efficiently, triggering alerts in case of emergencies. Soon, nano sensors in smart pills will make healthcare super-personalized and 10x more efficient!

Top reasons not to believe

  1. Three-fourths of IoT projects fail globally - Governments and enterprises across the globe are rolling out Internet of Things (IoT) projects, but almost three-fourths of them fail, impacted by factors like culture and leadership, according to US tech giant Cisco (2017). Businesses are spending $745 billion worldwide on IoT hardware and software in 2019 alone, yet three out of every four IoT implementations are failing.
  2. Few IoT projects survive proof-of-concept stage - About 60% of IoT initiatives get stalled at the Proof of Concept (PoC) stage. If the right steps aren’t taken in the beginning, say you don’t think far enough beyond the IT infrastructure, you end up in limbo: caught between the dream of what IoT could do for your business and the reality of today’s ROI. That spot is called proof-of-concept (POC) purgatory.
  3. IoT Security still a big concern - The 2019 annual report of SonicWall Capture Labs threat researchers, analyzing data from over 200,000 malicious events, indicated a 217.5 percent increase in IoT attacks in 2018.
  4. There are several obstacles companies face both in calculating and realizing ROI from IoT. Very few companies can quantify their current, pre-IoT costs. The instinct is often to stop after calculating the cost impact on the layer of operations immediately adjacent to the potential IoT project. For example, when quantifying the baseline cost of reactive (versus predictive or prescriptive) maintenance, too many companies only include downtime for unexpected outages, but may not consider reduced life of the machine, maintenance overtime, lost sales due to long lead times, supply chain volatility risk for spare parts, and the list goes on.
  5. Privacy, and no, that’s not the same as security - The big corporations don’t expect to make a big profit on the devices themselves; the big money in IoT is in big data. And neither enterprises nor consumers want to expose everything sensors are learning about them.
  6. No Killer Application – I suggest reading my article “Worth it waste your time searching the Killer IoT Application?"
  7. No Interoperable Technology ecosystems - We have a plethora of IoT vendors, both large and small, jumping into the fray and trying to establish a foothold, in hopes of either creating their own ecosystem (for the startups) or extending their existing one (for the behemoths).
  8. Digital Fatigue – As if explaining IoT weren’t challenge enough, more technologies such as Artificial Intelligence, Blockchain, 5G, and AR/VR are now joining the party, and of course companies say “enough.”

You have the last word

We can go on forever looking for reasons to believe or not believe in IoT, but we cannot deny the evidence: millions of connected devices are already out there, and millions more will soon be waiting for us to exploit their full potential.

I still believe. But you have the last word.

Thanks in advance for your Likes and Shares

Read more…

By: Tom Jeltes, Eindhoven University of Technology

The Internet of Things (IoT) consists of billions of sensors and other devices connected to each other via the internet, all of which need to be protected against hackers with malicious purposes. A low-cost and energy-efficient solution for the security of IoT devices uses the unique characteristics of their built-in memory chips. Ph.D. candidate Lieneke Kusters investigated how to make optimal use of the chip's digital fingerprint to generate a security key.

The higher the number of devices connected to each other via the Internet of Things, the greater the risk that malicious hackers might gain access to important information, or even take over entire systems. Quite apart from all kinds of privacy issues, it's not hard to imagine that someone who, for example, has control over temperature sensors in a chemical or nuclear plant could cause serious damage.

To prevent problems like these from occurring, each IoT device needs to be able, as it were, to show an identity document ("authentication," in professional terms). Normally speaking, this is done with a kind of password, which is sent in encrypted form to the person who is communicating with the device. The security key needed for that has to be stored in the IoT device one way or another, Lieneke Kusters explains. "But these are often small and cheap devices that aren't supposed to use much energy. To safely store a key in these devices, you need extra hardware with constant power supply. That's not very practical."

Digital fingerprint

There is a different way: namely, by deriving the security key from a unique physical characteristic of the memory chip (Static Random-Access Memory, or SRAM) that can be found in practically every IoT device. Depending on the random circumstances during the chip's manufacturing process, each memory location has a random default value of 0 or 1.

"That binary code which you can read out when activating the chip, constitutes a kind of digital fingerprint of the device," says Kusters, who gained her doctorate at the Information and Communication Theory Laboratory at the TU/e department of Electrical Engineering. This fingerprint is known as a Physical Unclonable Function (PUF). "The Eindhoven-based company Intrinsic ID sells digital security based on SRAM-PUFs. I collaborated with them for my doctoral research, during which I focused on how to generate, in a reliable way, a key from that digital fingerprint that is as long as possible. The longer, the safer."

The major advantage of security keys based on SRAM-PUFs is that the key exists only at the moment when authentication is required. "The device restarts itself to read out the SRAM-PUF and in doing so creates the key, which subsequently gets erased immediately after use. That makes it all but impossible for an attacker to steal the key."
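
That lifecycle (read the fingerprint, derive the key, use it, erase it) might look roughly like the C sketch below. The "SRAM" contents here are faked, and the FNV-1a hash merely stands in for a proper key-derivation function; this is an illustration of the idea, not Intrinsic ID's implementation:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define SRAM_PUF_BYTES 256

/* Placeholder for a real key-derivation function (FNV-1a hash). */
static uint64_t derive_key(const uint8_t *puf, size_t len)
{
    uint64_t h = 1469598103934665603ULL;   /* FNV-1a offset basis */
    for (size_t i = 0; i < len; i++) {
        h ^= puf[i];
        h *= 1099511628211ULL;             /* FNV-1a prime */
    }
    return h;
}

int main(void)
{
    uint8_t sram_startup[SRAM_PUF_BYTES];

    /* On a real device these are the power-up values of the SRAM
     * cells; here they are faked deterministically for illustration. */
    for (int i = 0; i < SRAM_PUF_BYTES; i++)
        sram_startup[i] = (uint8_t)(i * 37 + 11);

    uint64_t key = derive_key(sram_startup, SRAM_PUF_BYTES);
    printf("session key: %016llx\n", (unsigned long long)key);

    /* Erase the raw fingerprint and the key immediately after use. */
    memset(sram_startup, 0, sizeof sram_startup);
    key = 0;
    (void)key;
    return 0;
}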

Noise and reliability

But that's not the entire story, because some bits of the SRAM do not always have the same value during activation, Kusters explains. Ten to fifteen percent of the bits turn out not to be determined, which makes the digital fingerprint a bit fuzzy. How do you use that fuzzy fingerprint to make a key of the highest possible complexity that nevertheless still fits into the receiving lock—practically—each time?

"What you want to prevent is that the generated key won't be recognized by the receiving party as a consequence of the 'noise' in the SRAM-PUF," Kusters explains. "It's alright if that happens one in a million times perhaps, preferably less often." The probability of error is smaller with a shorter key, but such a key is also easier to guess for people with bad intentions. "I've searched for the longest reliable key, given a certain amount of noise in the measurement. It helps if you store extra information about the SRAM-PUF, but that must not be of use to a potential attacker. My thesis is an analysis of how you can reach the optimal result in different situations with that extra information."

Originally posted here.

Read more…

Can AI Replace Firmware?

Scott Rosenthal and I go back about a thousand years; we've worked together, helped midwife the embedded field into being, had some amazing sailing adventures, and recently took a jaunt to the Azores just for the heck of it. Our sons are both big data people; their physics PhDs were perfect entrees into that field, and both now work in the field of artificial intelligence.

At lunch recently we were talking about embedded systems and AI, and Scott posed a thought that has been rattling around in my head since. Could AI replace firmware?

Firmware is a huge problem for our industry. It's hideously expensive. Only highly-skilled people can create it, and there are too few of us.

What if an AI engine of some sort could be dumped into a microcontroller and the "software" then created by training that AI? If that were possible - and that's a big "if" - then it might be possible to achieve what was hoped for when COBOL was invented: programmers would no longer be needed as domain experts could do the work. That didn't pan out for COBOL; the industry learned that accountants couldn't code. Though the language was much more friendly than the assembly it replaced, it still required serious development skills.

But with AI, could a domain expert train an inference engine?

Consider a robot: a "home economics" major could create scenarios of stacking dishes from a dishwasher. Maybe these would be in the form of videos, which were then fed to the AI engine as it tuned the weighting coefficients to achieve what the home ec expert deems worthy goals.

My first objection to this idea was that these sorts of systems have physical constraints. With firmware I'd write code to sample limit switches so the motors would turn off if at an end-of-motion extreme. During training an AI-based system would try and drive the motors into all kinds of crazy positions, banging destructively into stops. But think how a child learns: a parent encourages experimentation but prevents the youngster from self-harm. Maybe that's the role of the future developer training an AI. Or perhaps the training will be done on a simulator of some sort where nothing can go horribly wrong.
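
For reference, the kind of limit-switch guard described above is only a few lines of firmware. In this C sketch the register addresses and bit layout are invented purely for illustration:

#include <stdint.h>

/* Hypothetical memory-mapped I/O for one motor axis; the addresses
 * and bit assignments are made up for this example. */
#define LIMIT_SWITCH_REG (*(volatile uint32_t *)0x40020000u)
#define MOTOR_ENABLE_REG (*(volatile uint32_t *)0x40020004u)
#define LIMIT_MIN_HIT    (1u << 0)
#define LIMIT_MAX_HIT    (1u << 1)

/* Called from the control loop: cut motor drive the instant either
 * end-of-travel switch closes. */
static inline void enforce_travel_limits(void)
{
    if (LIMIT_SWITCH_REG & (LIMIT_MIN_HIT | LIMIT_MAX_HIT))
        MOTOR_ENABLE_REG = 0;   /* hard stop at end of motion */
}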

Taking this further, a domain expert could define the desired inputs and outputs, and then a poorly-paid person could do the actual training. CEOs will love that. With that model a strange parallel emerges to computation a century ago: before the computer age, "computers" were people doing simple math to create tables of logs, trig, ballistics, etc. A room full of them labored at a problem. They weren't particularly skilled, didn't make much, but did the rote work under the direction of one master. Maybe AI trainers will be somewhat like that.

Like we outsource clothing manufacturing to Bangladesh, I could see training, basically grunt work, being sent overseas as well.

I'm not wild about this idea as it means we'd have an IoT of idiots: billions of AI-powered machines where no one really knows how they work. They've been well-trained but what happens when there's a corner case?

And most of the AI literature I read suggests that inference successes of 97% or so are the norm. That might be fine for classifying faces, but a 3% failure rate of a safety-critical system is a disaster. And the same rate for less-critical systems like factory controllers would also be completely unacceptable.

But the idea is intriguing.

Original post can be viewed here

Feel free to email me with comments.

Back to Jack's blog index page.

Read more…

Theoretical Embedded Linux requirements

Hardware

SoC

A System on Chip (SoC) is essentially an integrated circuit that integrates an entire computer system onto a single platform. It combines the CPU with the other components it needs to perform and execute its functions. It is in charge of driving the other hardware and running your software. The main advantages of an SoC include lower latency and power savings.

It is made of various building blocks:

  • Core + Caches + MMU – An SoC has a processor at its core which defines its functions, and this is the main thing to keep in mind while choosing an SoC. Normally an SoC has multiple cores of a “real” application processor, e.g. ARM Cortex-A9, possibly assisted by e.g. a SIMD co-processor like NEON.
  • Internal RAM – IRAM is composed of very high-speed SRAM located alongside the CPU. It acts similarly to a CPU cache and is generally very small. It is used in the first phase of the boot sequence.
  • Peripherals – These can be a simple ADC, a DSP, or a Graphics Processing Unit, connected to the core via some bus. A low-power/real-time co-processor helps the main core with real-time tasks or handles low-power states. Examples of such IP cores are USB, PCI-E, SGX, etc.

External RAM

An SoC uses RAM to store temporary data during and after bootstrap. It is the memory an embedded system uses during regular operation.

Non-Volatile Memory

In an embedded system or single-board computer, this is typically an SD card; in other cases it can be NAND, NOR, or SPI data flash memory. It is where the SoC reads from and stores all the software components needed for the system to work.

External Peripherals

An SoC must have external interfaces for standard communication protocols such as USB, Ethernet, and HDMI, as well as wireless protocols like Wi-Fi and Bluetooth.

Software


First of all, we introduce the boot chain: the series of actions that happens when an SoC is powered up.

Boot ROM: This is a piece of code stored in ROM which is executed by the booting core when it is powered on. This code contains instructions for configuring the SoC so that it can execute applications. The configuration performed by the Boot ROM includes initializing the core’s registers and stack pointer, enabling caches and line buffers, programming the interrupt service routines, and configuring the clocks.

Boot ROM also implements a Boot Assist Module (BAM) for downloading an application image from external memories using interfaces like Ethernet, SD/MMC, USB, CAN, UART, etc.

1st stage bootloader

The first-stage bootloader performs the following:

  • Setup the memory segments and stack used by the bootloader code
  • Reset the disk system
  • Display a string “Loading OS…”
  • Find the 2nd stage boot loader in the FAT directory
  • Read the 2nd stage boot loader image into memory at 1000:0000
  • Transfer control to the second-stage bootloader

On an embedded SoC, the Boot ROM copies the first-stage bootloader into the SoC’s internal RAM, so it must be tiny enough to fit that memory, usually well under 100 kB. The first stage initializes the external RAM and the SoC’s external memory interface, as well as other peripherals that may be of interest (e.g. it disables watchdog timers). Once done, it executes the next stage which, depending on the context, could be called MLO, SPL, or something else.
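
Schematically, that first stage might look like the C sketch below, where every helper function is a hypothetical stand-in for SoC-specific code and the load address is illustrative:

#include <stdint.h>

/* Hypothetical SoC-specific helpers; real implementations depend
 * on the chip's memory controller and boot medium. */
extern void disable_watchdog(void);
extern void init_dram_controller(void);
extern void load_next_stage(void *dest);   /* from SD, NAND, eMMC... */

typedef void (*entry_fn)(void);

void spl_main(void)
{
    disable_watchdog();                     /* peripherals of interest */
    init_dram_controller();                 /* bring up external RAM */

    void *load_addr = (void *)0x80000000u;  /* illustrative DRAM base */
    load_next_stage(load_addr);             /* e.g. the main bootloader */

    ((entry_fn)load_addr)();                /* jump to the 2nd stage */
}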

2nd stage bootloader

This is the main bootloader; it can be ten times bigger than the 1st stage, and it completes the initialization of the relevant peripherals.

  • Copy the boot sector to a local memory area
  • Find kernel image in the FAT directory
  • Read kernel image in memory at 2000:0000
  • Reset the disk system
  • Enable the A20 line
  • Setup interrupt descriptor table at 0000:0000
  • Setup the global descriptor table at 0000:0800
  • Load the descriptor tables into the CPU
  • Switch to protected mode
  • Clear the prefetch queue
  • Setup protected mode memory segments and stack for use by the kernel code
  • Transfer control to the kernel code using a long jump

Linux Kernel

The Linux kernel is the main component of a Linux OS and is the core interface between hardware and processes, managing resources as efficiently as possible. The kernel performs the following jobs:

  • Memory management: Keep track of memory, how much is used to store what, and where
  • Process management: Determine which processes can use the processor, when, and for how long
  • Device drivers: Act as an interpreter between the hardware and the processes
  • System calls and security: Receive requests for the service from processes

To put the kernel in context, a Linux machine can be thought of as having 3 layers:

  • The hardware: The physical machine—the base of the system, made up of memory (RAM) and the processor (CPU), as well as input/output (I/O) devices such as storage, networking, and graphics.
  • The Linux kernel: The core of the OS. It is a software residing in memory that tells the CPU what to do.
  • User processes: These are the running programs that the kernel manages. User processes are what collectively make up user space. The kernel allows processes and servers to communicate with each other.

Init and rootfs – init is the first non-kernel task to run, and it has PID 1. It initializes everything needed to use the system. In production embedded systems it also starts the main application; in such systems it is either BusyBox or a custom-crafted application.
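
A custom-crafted init can be remarkably small. Here is a minimal C sketch, with the application path purely illustrative, that mounts the virtual filesystems, launches the main application, and then reaps orphaned children as PID 1 must:

#include <sys/mount.h>
#include <sys/wait.h>
#include <unistd.h>

/* Minimal PID-1 sketch for an embedded system. The application
 * path /usr/bin/main-app is a hypothetical example. */
int main(void)
{
    mount("proc", "/proc", "proc", 0, NULL);
    mount("sysfs", "/sys", "sysfs", 0, NULL);

    if (fork() == 0) {
        execl("/usr/bin/main-app", "main-app", (char *)NULL);
        _exit(1);   /* exec failed */
    }

    for (;;) {
        if (wait(NULL) < 0)   /* adopt and reap all orphans */
            sleep(1);         /* no children to reap right now */
    }
}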

View original post here

Read more…
